The AI chip wars just got their first real plot twist in years.
On Wednesday, Cerebras Systems — the company that builds processors the size of dinner plates — didn’t just go public. It detonated onto the Nasdaq. Shares priced at $185, opened at $350, peaked near $386, and settled at $311. That’s a 68% first-day pop, a $95 billion valuation, and the biggest U.S. tech IPO since Uber went public in 2019.
The message couldn’t be clearer: investors believe the AI hardware story is bigger than Nvidia alone.
The Numbers That Made Wall Street Choke on Its Coffee
Cerebras originally targeted $115–$125 per share. Demand was so intense that the range got bumped to $150–$160. Then the deal priced at $185. Still not enough. The stock opened at nearly double the IPO price.
The haul: 30 million shares sold, raising $5.55 billion. With underwriter options, total proceeds could hit $6.38 billion. CEO Andrew Feldman’s stake touched $2 billion at the IPO price. CTO Sean Lie cleared $1 billion.
And the company is actually making money now. Revenue hit $510 million in 2025, up 76% year-over-year. More importantly, Cerebras swung from a $481.6 million loss to $237.8 million in net income. That’s the kind of profitability pivot that turns skeptics into believers.
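For anyone who wants to check the math, the figures hang together. A quick back-of-envelope sketch in Python (the 15% overallotment and the implied prior-year revenue are derived assumptions, not numbers from the prospectus):

    # Back-of-envelope check on the IPO math; the 15% overallotment is a
    # standard-size assumption, not a figure from the prospectus.
    ipo_price, close = 185, 311
    print(f"First-day pop: {close / ipo_price - 1:.0%}")                # ~68%

    shares_sold = 30_000_000
    print(f"Base proceeds: ${shares_sold * ipo_price / 1e9:.2f}B")      # $5.55B
    with_greenshoe = int(shares_sold * 1.15)                            # assumed 15% overallotment
    print(f"Max proceeds:  ${with_greenshoe * ipo_price / 1e9:.2f}B")   # ~$6.38B

    revenue_2025, yoy_growth = 510e6, 0.76
    print(f"Implied 2024 revenue: ${revenue_2025 / (1 + yoy_growth) / 1e6:.0f}M")  # ~$290M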
Why a Chip the Size of a Wafer Actually Works
Cerebras’ entire thesis sounds like it shouldn’t work. Every other chipmaker slices a silicon wafer into dozens of small chips. Cerebras uses the entire wafer as one massive processor.
The WSE-3 (Wafer-Scale Engine 3) packs 4 trillion transistors across 900,000 AI-optimized cores, delivering 125 petaflops of compute. That’s roughly 19x the transistor count and 28x the raw compute of Nvidia’s B200. The chip is 8.5 inches across, literally the maximum size current semiconductor manufacturing allows.
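Those multiples hold up to a quick sanity check against commonly cited B200 specs, roughly 208 billion transistors and about 4.5 dense FP8 petaflops (the B200 figures are assumptions pulled in for comparison, not from the article):

    # Rough ratio check of the WSE-3 against Nvidia's B200.
    # The B200 figures below are commonly cited specs and are assumptions here.
    wse3_transistors, b200_transistors = 4e12, 208e9
    wse3_petaflops, b200_petaflops = 125, 4.5       # B200: approx. dense FP8
    print(f"Transistor ratio: {wse3_transistors / b200_transistors:.0f}x")   # ~19x
    print(f"Compute ratio:    {wse3_petaflops / b200_petaflops:.0f}x")       # ~28x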
The advantage is architectural: no bottleneck from shuttling data between dozens of networked GPUs. For inference workloads, where trained AI models serve real-time responses, this translates to meaningful speed and cost gains. And inference is where the growth is. As AI agents multiply and enterprises deploy models in production, demand for inference compute is dwarfing demand for training.
From Dead IPO to Blockbuster: The Comeback Story
A year ago, this IPO was roadkill. Cerebras first filed in September 2024 but got stuck in a CFIUS review over investments from Abu Dhabi’s G42, which also happened to account for 85% of revenue. Customer concentration that extreme is a death sentence in a public-company prospectus.
Cerebras pulled back, diversified, and returned with a healthier book. G42’s revenue share dropped to 24%. Mohamed bin Zayed University picked up 62% (still concentrated in the UAE, but across different entities). More critically, new customers appeared: OpenAI signed a $10 billion deal, and AWS came on board.
The semiconductor market cooperated too. The VanEck Semiconductor ETF has jumped 58% in 2026. Intel, AMD, and Micron have all posted triple-digit gains. AI hardware appetite is insatiable.
Nvidia’s Chess Move — And Why It Matters
Every AI chip startup gets the “Nvidia killer” label. Cerebras is smarter than that. It isn’t trying to displace Nvidia in training; that ecosystem, anchored by CUDA, is essentially a moat with alligators. Instead, Cerebras is targeting inference, where architectural advantages matter more and the CUDA lock-in is weaker.
Nvidia saw this coming. Its $20 billion Groq acquisition in late 2025, followed by Groq-based inference products announced at GTC 2026, was a direct response to challengers like Cerebras. The inference market is now a real battleground.
Cerebras is also shifting from hardware sales to cloud services, putting it in competition with AWS, Google Cloud, and Azure. That’s a harder fight, but the margins are better and the recurring-revenue story is what public markets crave.
What This Means for the AI Ecosystem
This IPO is bigger than one company. It’s a market signal.
After only 31 tech IPOs in 2025 — down from 121 four years earlier — the floodgates are creaking open. SpaceX (merged with xAI) is preparing a share sale. OpenAI and Anthropic could go public later this year. Cerebras just proved there’s massive appetite for AI infrastructure plays.
A $95 billion valuation on $510 million in revenue is aggressive — roughly 186x sales. But investors aren’t pricing today’s revenue. They’re pricing a world where inference compute demand explodes, where AI agents need dedicated silicon, and where Nvidia doesn’t get to charge monopoly rents forever.
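The bet is easier to see in the arithmetic. A rough sketch (the 30x comparison multiple is a hypothetical benchmark, not a number from any filing):

    # Price-to-sales arithmetic behind the 186x figure. The 30x comparison
    # multiple is a hypothetical benchmark, not from any filing.
    market_cap, revenue = 95e9, 510e6
    print(f"Price-to-sales: {market_cap / revenue:.0f}x")                    # ~186x
    hypothetical_multiple = 30
    print(f"Revenue implied at {hypothetical_multiple}x: "
          f"${market_cap / hypothetical_multiple / 1e9:.1f}B")               # ~$3.2B, roughly a sixfold ramp from 2025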
That’s a shift from the “Nvidia or nothing” mindset of 2023–2024. The market is growing up.
The Risks Are Real
Customer concentration remains the elephant in the room. UAE entities still dominate the revenue base. The OpenAI relationship is described as “complicated” and “circular” in filings — never reassuring language. And competing with Nvidia’s ecosystem at scale is a generational challenge.
There’s also the macro question. Barclays expects AI capex to peak around 2028. If spending cools before Cerebras builds a durable, diversified customer base, that 186x revenue multiple evaporates fast.
The Bottom Line
Cerebras’ IPO doesn’t just validate one company’s architecture. It validates the thesis that AI hardware is entering a multi-player era. The training-dominated, Nvidia-monopoly phase is giving way to an inference-heavy, architecturally diverse future.
For anyone building AI infrastructure, investing in AI companies, or just trying to understand where this industry is heading: the chip wars are no longer a one-horse race.
And if you got IPO allocation at $185, congratulations. You nearly doubled your money before lunch.