AI startup Groq has successfully raised $640 million in a Series D funding round to accelerate the deployment of its cutting-edge AI inference computing technology. This latest investment, led by BlackRock Private Equity Partners with participation from Cisco Investments, Samsung Catalyst Fund, and Neuberger Berman, brings Groq's valuation to $2.8 billion.
The funding will be used to significantly expand Groq's Language Processing Unit (LPU) infrastructure, a key component of its GroqCloud platform. GroqCloud is a cloud-based service where developers can test and develop AI models at high speeds, and the startup aims to make it the largest AI inference compute deployment outside the hyperscalers. As part of this expansion, Groq plans to add over 100,000 of its custom LPUs, manufactured by GlobalFoundries, to power GroqCloud; the deployment is expected to be complete by the end of Q1 2025.
Groq will also use the funds to scale its tokens-as-a-service (TaaS) offering, introduce new models and features to GroqCloud, and significantly expand its team, including roles in silicon engineering, supply chain operations, and sales management.
Jonathan Ross, Groq’s founder and CEO, highlighted the importance of inference computing in AI deployment, stating, "You can’t power AI without inference compute. We intend to make the resources available so that anyone can create cutting-edge AI products, not just the largest tech companies. Training AI models is solved, now it’s time to deploy these models so the world can use them."
Founded in 2016 by Ross, who previously led Google's custom hardware efforts, Groq has developed its own hardware and software stack optimized for running AI workloads. The startup claims its LPUs can run large models like Meta's Llama 2 70B at over 300 tokens per second per user, while also supporting a wide range of models, including Meta's new Llama 3.1 and smaller systems like Google's Gemma, all while maintaining high speeds.
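To put the 300 tokens-per-second figure in perspective, here is a quick back-of-the-envelope sketch; the throughput number is the article's, while the response lengths are illustrative assumptions, not Groq benchmarks.

```python
# Rough streaming latency implied by the cited per-user decode rate.
# 300 tokens/second/user comes from the article; the response lengths
# below are illustrative assumptions.
TOKENS_PER_SECOND = 300

def generation_time(num_tokens: int, tokens_per_second: float = TOKENS_PER_SECOND) -> float:
    """Seconds to stream a response of num_tokens at a fixed decode rate."""
    return num_tokens / tokens_per_second

for tokens in (100, 500, 1000):
    print(f"{tokens:>5} tokens -> {generation_time(tokens):.2f}s")
```

At that rate, even a 1,000-token response streams in a little over three seconds per user, which is the kind of latency profile the article's "near-instant" framing refers to.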
Currently, more than 360,000 developers use GroqCloud to power their AI applications, a number that is expected to grow as the platform continues to evolve.
Samsung Catalyst Fund's Marco Chisari, head of Samsung’s Semiconductor Innovation Center and EVP of Samsung Electronics, expressed enthusiasm for Groq’s advancements, stating, "We are highly impressed by Groq’s disruptive compute architecture and their software-first approach. Groq’s record-breaking speed and near-instant generative AI inference performance leads the market."
To ensure Groq's platform remains at the forefront of next-generation AI compute, the company has brought on Meta's chief AI scientist and Turing Award winner, Yann LeCun, as a technical advisor. LeCun will retain his role at Meta, where he is involved in the development of custom hardware, though his work at Meta primarily focuses on deep learning recommendation models rather than generative workloads.
In addition, Stuart Pann, a former senior executive from HP and Intel, has joined Groq as chief operating officer, further strengthening the company’s leadership as it scales its operations and technological offerings.