The AI chip market just got a lot more interesting.
Cerebras Systems — the company that builds AI chips the size of dinner plates — filed updated paperwork raising its IPO price range to $150–$160 per share. That’s up from an already ambitious $115–$125 range set just last week, putting the company on track to raise up to $4.8 billion in what’s shaping up to be the biggest IPO of 2026.
With demand reportedly exceeding supply by more than 20 times, a $49 billion fully diluted valuation, and an OpenAI mega-deal providing a tailwind, Cerebras is making a statement: Nvidia doesn’t get to own the AI chip future unchallenged.
Trading begins on Nasdaq under ticker CBRS on May 14.
The Chip That Defied Convention
Most semiconductor companies cut tiny chips from silicon wafers. Cerebras looked at that process and said, “What if we just… didn’t cut?”
The Wafer-Scale Engine 3 (WSE-3) uses an entire silicon wafer — roughly 57 times larger than Nvidia’s biggest GPU die. It packs 900,000 cores and 44 gigabytes of ultra-fast SRAM directly on-chip. Data doesn’t travel across cables and connectors. Cerebras claims its cores can access that onboard memory pool in a single clock cycle.
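Those size claims are easy to sanity-check. A quick back-of-envelope sketch, using approximate public die areas (the ~46,225 mm² wafer and ~814 mm² H100 die figures are outside assumptions, not from this article):

```python
# Back-of-envelope check of the wafer-scale numbers above.
# Die areas are approximate public figures, treated here as assumptions:
# WSE-3 is roughly 46,225 mm^2; Nvidia's H100 die is roughly 814 mm^2.
WSE3_AREA_MM2 = 46_225
H100_AREA_MM2 = 814
CORES = 900_000
SRAM_BYTES = 44 * 10**9  # 44 GB of on-chip SRAM

area_ratio = WSE3_AREA_MM2 / H100_AREA_MM2
sram_per_core_kb = SRAM_BYTES / CORES / 1_000

print(f"Die area ratio: ~{area_ratio:.0f}x")          # ~57x, matching the claim
print(f"SRAM per core:  ~{sram_per_core_kb:.0f} KB")  # ~49 KB local to each core
```

The per-core number is the point of the design: each core sits next to its own slice of memory instead of reaching across a PCB to shared DRAM.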
The WSE-3 ships inside a 1.8-ton appliance called the CS-3. It’s not subtle hardware — it’s industrial-scale AI infrastructure, and it’s finding an audience among organizations that need raw inference speed more than Nvidia’s ecosystem flexibility.
From Half-Billion-Dollar Losses to 47% Margins — In One Year
Here’s the number that makes investors salivate: Cerebras posted $510 million in revenue in 2025, up 76% year-over-year, with net income of $238 million. That’s a 47% net margin at an IPO-stage company.
For context, CoreWeave went public in March 2026 at a $23 billion valuation while still unprofitable.
In 2024, Cerebras lost $485 million. The swing from deep losses to strong profitability in a single year speaks to the operating leverage that kicks in once you land hyperscaler-sized contracts.
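The swing is straightforward to verify from the figures quoted above; a quick sketch:

```python
# Sanity-checking the financials quoted above (figures from the article).
revenue_2025 = 510e6
net_income_2025 = 238e6
growth_yoy = 0.76

net_margin = net_income_2025 / revenue_2025             # ~46.7%, rounds to 47%
implied_revenue_2024 = revenue_2025 / (1 + growth_yoy)  # ~$290M

print(f"Net margin:           {net_margin:.1%}")
print(f"Implied 2024 revenue: ${implied_revenue_2024/1e6:.0f}M")
```

Roughly $290 million in 2024 revenue against a $485 million loss, versus $510 million and a $238 million profit a year later: that is the operating-leverage story in two data points.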
The company has also evolved beyond selling hardware. Cerebras now fills its own data centers with WSE-3 chips and offers cloud compute services — putting it in competition not just with chip vendors, but with AWS, Azure, and Google Cloud.
The OpenAI Connection (And the Trial Drama)
The single biggest catalyst for this IPO is OpenAI. The two companies signed a multi-year agreement covering 750 megawatts of AI compute capacity — one of the largest AI infrastructure contracts ever disclosed, worth over $20 billion.
OpenAI uses Cerebras hardware specifically for code generation models, suggesting the WSE-3’s inference speed advantage is particularly well-suited to low-latency coding workloads.
But it gets juicier. During the ongoing Musk v. OpenAI trial, Greg Brockman testified that OpenAI had discussed merging with Cerebras. Brockman and Sam Altman were both personal investors in Cerebras — a fact neither disclosed to Musk, which his legal team is now using to argue unjust enrichment.
Whatever the legal implications, the testimony served as high-profile advertising for Cerebras right before its IPO. You couldn’t buy that kind of publicity.
Why 20x Oversubscription? The Inference Thesis
The timing isn’t accidental. AI is shifting from training (building models) to inference (running them billions of times daily for users). Training happens once. Inference happens constantly and is growing exponentially.
Cerebras’s architecture is built for this moment. The massive on-chip SRAM pool eliminates the memory bandwidth bottleneck that limits GPU-based inference. The single-chip design avoids inter-node communication overhead that plagues GPU clusters serving large models.
Investors are betting that inference-era winners won’t necessarily be the training-era champions. Nvidia’s CUDA ecosystem creates enormous switching costs for training, but inference is a more open market where raw performance per dollar wins.
Add the AWS partnership announced in March — bringing Cerebras chips into Amazon’s cloud data centers — and institutional investors have two hyperscaler endorsements backing the thesis.
Twenty-times oversubscription is the market saying: we believe.
The Risks Nobody Wants to Talk About
A $49 billion valuation demands scrutiny.
Customer concentration is severe. OpenAI is estimated to account for the vast majority of revenue. If that relationship sours, or if OpenAI stumbles, Cerebras’s financials could deteriorate fast. The AWS deal helps diversify, but it’s early.
Nvidia’s moat keeps getting deeper. Nvidia controls 70–80% of the AI accelerator market, and its CUDA software ecosystem represents years of engineering investment customers can’t easily abandon. Every generation, Nvidia chips get better at the exact workloads Cerebras targets.
The valuation is steep. At roughly 96x trailing revenue, Cerebras needs to sustain 76%+ growth for years to justify the price. Any deceleration will be punished harshly by public markets.
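How long would that growth have to run? A quick sketch from the stated $49 billion valuation and $510 million in trailing revenue:

```python
# Checking the valuation math from the stated figures.
valuation = 49e9
trailing_revenue = 510e6
growth = 0.76

multiple = valuation / trailing_revenue  # ~96x trailing revenue
print(f"Trailing multiple: ~{multiple:.0f}x")

# If 76% growth holds for three more years, the same price implies:
future_revenue = trailing_revenue * (1 + growth) ** 3
print(f"3-yr revenue at 76% growth: ${future_revenue/1e9:.1f}B")    # ~$2.8B
print(f"Multiple on that revenue:   ~{valuation/future_revenue:.0f}x")  # ~18x
```

Even three flawless years of 76% growth only compresses the multiple to around 18x revenue, which is still a premium price. That is the shape of the bet buyers are making.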
G42 controversy lingers. Cerebras’s first IPO attempt in 2024 was derailed by concerns over UAE-based G42, which accounted for over 80% of revenue at the time. CFIUS ultimately cleared both companies, but the episode is a reminder that Cerebras’s customer base has geopolitical dimensions.
What This Means for the AI Hardware Landscape
This IPO validates something important: the AI chip market is big enough for multiple winners.
For too long, the narrative has been “Nvidia and everyone else.” AMD’s Instinct chips, Google’s TPUs, and various startups have chipped away at the edges, but none has achieved the kind of financial credibility that a successful IPO at this scale represents.
If Cerebras trades well, expect accelerated hiring, expanded data center capacity, and aggressive pursuit of additional hyperscaler deals. Up to $4.8 billion in IPO proceeds would give it a war chest to build out infrastructure at a pace that was previously impossible.
It also raises stakes for other AI chip companies eyeing public listings. Groq, SambaNova, and others are watching. A strong debut could trigger a wave of AI hardware IPOs in the second half of 2026.
The Bottom Line
Cerebras has real revenue ($510M), real profits ($238M), and a real anchor in OpenAI. The wafer-scale chip architecture isn’t a science project: it’s shipping, it’s performing, and hyperscalers are buying it.
But at $49 billion, investors are pricing in a future where Cerebras becomes a foundational layer of AI infrastructure, not just a niche supplier. That’s a big bet on a company with significant customer concentration and a dominant competitor breathing down its neck.
The IPO prices May 13, trading starts May 14. Whether it’s a generational entry point or peak AI hype depends on the next four quarters.
Either way, the AI chip wars just got a serious new combatant.