Silicon Valley-based Groq has secured a massive $750 million in funding, positioning itself as a serious contender to Nvidia in the race to power the next generation of AI.
Quick Summary – TLDR:
- Groq raised $750 million, pushing its valuation to $6.9 billion, more than double last year's figure.
- Over 2 million developers and Fortune 500 companies now rely on Groq for fast, affordable AI inference.
- Backed by major investors including Disruptive, BlackRock, and Samsung, Groq is expanding globally.
- The company is building American AI infrastructure, in line with US policy to export AI technology abroad.
What Happened?
Groq, the AI chip startup known for developing its own language processing units (LPUs) instead of GPUs, has announced a fresh $750 million funding round. This new capital injection values the company at $6.9 billion, up from $2.8 billion just a year ago. The round was led by Disruptive, a Dallas-based investment firm, with participation from BlackRock, Deutsche Telekom Capital Partners, Neuberger Berman, and others.
This growth comes at a time when demand for AI inference computing is surging and the tech world is looking for viable alternatives to Nvidia’s dominance.
"Demand for inference is insatiable. Groq just raised $750M to deliver more of it at the speed and cost devs need." — Groq Inc (@GroqInc), September 17, 2025
Groq’s Unique Approach to AI Chips
Unlike traditional AI hardware that relies on GPUs, Groq is betting big on its own proprietary architecture. Its LPUs are purpose-built to optimize inference – the phase in which trained AI models are used to make predictions or generate outputs.
Groq’s inference engine, available both in the cloud and as on-premises hardware, supports open versions of leading AI models from Meta, Google, OpenAI, Mistral, DeepSeek, and others. This gives developers more flexibility while dramatically cutting costs and boosting performance.
- Cloud and On-Prem Options: Groq offers its technology both as a cloud service and as a physical server rack outfitted with its hardware/software stack.
- Compatibility with Major AI Models: It supports models from major companies like Meta and Google, ensuring broad applicability.
- Faster and Cheaper AI: Groq claims its solution delivers equal or better performance at significantly lower cost than current industry standards.
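For developers, the practical upshot of this compatibility is that Groq's cloud service can be addressed with ordinary OpenAI-style chat-completion requests. The sketch below shows what such a request body looks like; the endpoint URL and model name are illustrative assumptions for this example, not details confirmed in the article.

```python
import json

# Assumed OpenAI-compatible endpoint path; verify against Groq's current docs.
API_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_chat_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,  # illustrative model identifier
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

# Construct and serialize the payload; in a real call this JSON body would be
# POSTed to API_URL with an Authorization: Bearer <api-key> header.
payload = build_chat_request(
    "llama-3.1-8b-instant", "Explain AI inference in one sentence."
)
body = json.dumps(payload)
```

Because the request shape mirrors the OpenAI API, existing client code can often be pointed at a different base URL and model name with no other changes, which is the flexibility the bullets above describe.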
Major Backers and Global Expansion
The latest raise includes significant investments from tech giants like Samsung and Cisco, along with continued support from firms such as D1, Altimeter, 1789 Capital, and Infinitum. Notably, Disruptive has invested nearly $350 million into Groq to date.
Disruptive CEO Alex Davis emphasized Groq's strategic importance.
Groq is already running inference workloads for over two million developers and has a growing customer base that includes Fortune 500 companies. It has established data centers in North America, Europe, and the Middle East, including a newly launched facility in Helsinki, Finland, to meet growing demand in Europe.
Boost from Global Partnerships
Earlier this year, Groq received a significant endorsement when Saudi Arabia’s AI firm HUMAIN, backed by the country’s sovereign wealth fund and chaired by Crown Prince Mohammed bin Salman, chose Groq to power its inference operations. This partnership cements Groq’s role in expanding AI capabilities beyond the United States.
Meanwhile, the White House has issued an executive order promoting the global deployment of US-origin AI infrastructure. Groq fits squarely into this vision, building American-made systems to lead the charge.
Groq’s Founder Brings Deep Expertise
Groq founder Jonathan Ross previously worked at Google, where he helped create the Tensor Processing Unit (TPU). That experience led to the founding of Groq in 2016, the same year the TPU was announced. Today, Ross remains laser-focused on building a better, faster, and more cost-efficient way to deploy AI globally.
He remarked, “Inference is defining this era of AI, and we’re building the American infrastructure that delivers it with high speed and low cost.”
SQ Magazine Takeaway
I love seeing a company like Groq make such bold moves. They are not just another startup with an AI chip. They are reinventing the entire backbone of AI deployment. Inference is where the real-world applications of AI happen, and Groq is proving that speed and affordability can coexist. With serious funding, powerhouse investors, and a strategy that aligns with global AI trends, Groq looks like the first real David to Nvidia’s Goliath. If you’re watching the AI hardware space, this is the one to keep your eyes on.