
Cerebras IPO: $48.8 Billion Valuation—Bubble or New King Challenging NVIDIA?
The CBRS IPO is the most noteworthy AI hardware capital event of 2026.
By Xiaohei, TechFlow
Priced on May 13; began trading on May 14 on the Nasdaq under the ticker CBRS.
This is the largest global IPO so far in 2026. The underwriting syndicate—Morgan Stanley, Citigroup, Barclays, and UBS—secured 20x oversubscription during the roadshow, pushing the offering price up from an initial range of $115–$125 to $150–$160. The deal is expected to raise $4.8 billion, implying a valuation of $48.8 billion.
Just three months ago, Cerebras’ secondary-market valuation stood at $23 billion—meaning its paper valuation more than doubled in the final stretch before the IPO.
The “pitch” has been repeated ad nauseam: challenger to NVIDIA, wafer-scale chips, inference speed 21x faster than the B200, and a $1 billion minimum (up to $20 billion) compute contract with OpenAI. It’s a textbook “AI challenger” narrative—technical story, geopolitical angle, star client, massive order—every component precisely aligned with the 2026 AI infrastructure theme.
Yet reading the S-1 filing page by page reveals something odd: all public reports tell one story, while the prospectus tells another.
The Triple Paradox
Breaking down the prospectus reveals a company defined by a “triple paradox.”
First paradox: Technically a true alpha; financially, accounting magic.
The prospectus discloses: $510 million in 2025 revenue, up 76% year-on-year; GAAP net income of $237.8 million. That sounds impressive—a fast-growing, profitable AI hardware company, practically “mythical” in today’s valuation environment. CoreWeave was still unprofitable when it went public in March; Cerebras delivered a 47% net margin outright.
Yet that $237.8 million “net profit” includes a one-time, non-cash accounting adjustment of $363.3 million—arising from extinguishment of a forward contract liability tied to G42. Removing this item and adding back $49.8 million in stock-based compensation yields a real non-GAAP net loss of $75.7 million for 2025—worsening by 247% from the $21.8 million loss in 2024.
In other words, the market sees a “profitable + 76% growth” IPO golden child; the prospectus reveals a “rapidly growing company whose losses are widening.” Neither version is technically wrong—the difference lies in which one the market chooses to believe.
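The adjustment described above can be reproduced directly from the figures the prospectus discloses. A minimal sketch, using only the numbers quoted in this section (USD millions):

```python
# Reconstructing the non-GAAP figure from the prospectus numbers (USD millions).
gaap_net_income = 237.8   # 2025 GAAP net income
one_time_gain = 363.3     # non-cash gain from extinguishing the G42 forward contract liability
stock_based_comp = 49.8   # stock-based compensation added back as a non-cash expense

# Strip the one-time gain, then exclude SBC.
non_gaap_2025 = gaap_net_income - one_time_gain + stock_based_comp
print(round(non_gaap_2025, 1))   # -75.7 (a $75.7M non-GAAP loss)

# Year-on-year deterioration versus the $21.8M non-GAAP loss in 2024.
loss_2024 = 21.8
pct_worse = (abs(non_gaap_2025) - loss_2024) / loss_2024 * 100
print(round(pct_worse))          # 247 (the "247%" widening)
```

Both headline numbers in this paradox, the $75.7 million loss and the 247% deterioration, fall out of the same three disclosed line items.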
Second paradox: Superficially freed from G42, but actually locked into a recursive loop with OpenAI.
Cerebras’ failed 2024 IPO attempt had a simple backstory: G42—a UAE-based customer—accounted for 85% of first-half revenue; CFIUS launched a national security review; the company withdrew its application.
Now, 18 months later, the customer roster appears diversified, with heavyweight names like OpenAI and AWS added. Yet the May 2026 S-1 shows the 2025 customer mix as follows:
- MBZUAI (Mohamed bin Zayed University of Artificial Intelligence): 62%
- G42: 24%
- Combined: 86%
G42 merely ceded “weight” to MBZUAI—a UAE-based entity and affiliate of G42. MBZUAI alone accounts for 77.9% of accounts receivable.
And OpenAI—the so-called “redemption arc”—is itself a nested arrangement. This contract exceeds $20 billion in value, with OpenAI committing to purchase 750 megawatts of compute. Yet the same filing also discloses several critical facts: OpenAI extended Cerebras a $1 billion loan; OpenAI received nearly free warrants for 33 million shares; and OpenAI’s Master Relationship Agreement contains exclusivity clauses restricting Cerebras from selling to certain “named competitors.”
In short, OpenAI is simultaneously Cerebras’ customer, lender, soon-to-be shareholder, and, to some extent, strategic controller. As one anonymous analyst bluntly put it in a Medium analysis: “When revenue is circular, valuation is circular, and the IPO exists solely to let those generating that revenue cash out—that isn’t a market; it’s financial engineering.”
The phrasing may be sharp—but factually, it’s hard to refute.
Third paradox: Surface-level “NVIDIA challenger”; in reality, a “narrow-band complement” to NVIDIA.
This point is easiest for the market to overlook.
Cerebras’ technology is genuinely strong. The WSE-3 packs 4 trillion transistors, 900,000 AI cores, and 44 GB of on-chip SRAM—turning an entire silicon wafer into a single chip and bypassing all inter-chip communication bottlenecks inherent in GPU clusters. Independent benchmarks from Artificial Analysis show that on Llama 4 Maverick (400B parameters), the CS-3 delivers over 2,500 tokens per second per user—versus ~1,000 for NVIDIA’s flagship DGX B200, and 549 and 794 for Groq and SambaNova respectively.
The numbers don’t lie: Cerebras holds a generational advantage over GPUs—specifically in inference.
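The relative gaps implied by those benchmark figures are easy to tabulate. A sketch over the Artificial Analysis numbers quoted above (tokens per second per user on Llama 4 Maverick), normalized to the DGX B200 baseline:

```python
# Relative inference throughput vs. the DGX B200 baseline (tokens/sec/user,
# figures as quoted in the text above).
throughput = {
    "Cerebras CS-3": 2500,
    "NVIDIA DGX B200": 1000,
    "SambaNova": 794,
    "Groq": 549,
}

baseline = throughput["NVIDIA DGX B200"]
for name, tps in sorted(throughput.items(), key=lambda kv: -kv[1]):
    print(f"{name:16s} {tps:5d} tok/s  ({tps / baseline:.2f}x B200)")
```

On these numbers the CS-3 leads the B200 by 2.5x, with Groq and SambaNova trailing the GPU baseline, which is the gap the text calls a generational advantage in inference.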
Key word: “inference.” Cerebras’ own prospectus makes this clear: it excels at latency-sensitive inference workloads. It has neither the capability nor the stated ambition to challenge NVIDIA in large-model training or general-purpose computing. CUDA’s ecosystem, built steadily since 2007, spans nearly two decades: training toolchains, developer communities, and third-party libraries all remain firmly within NVIDIA’s moat.
More critically, the market isn’t standing still. At GTC 2026, NVIDIA unveiled the Vera Rubin architecture—336 billion transistors, with claimed performance 5x beyond Blackwell. AMD’s MI400 has already reached 320 billion transistors. Google’s TPU v6, Amazon’s Trainium 3, and Microsoft’s Maia 2—all hyperscalers are building custom chips. NVIDIA spent over $18 billion on R&D in FY2025; in December last year, it acquired AI inference startup Groq’s assets for $2 billion; and in March, it invested $4 billion across two photonics startups.
So the more accurate description is: Cerebras doesn’t aim to replace NVIDIA—it’s seizing a differentiated niche within NVIDIA’s narrow “inference” band. It’s a real business—but a $48.8 billion valuation against $510 million in revenue implies a price-to-sales (P/S) ratio of 95x.
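The multiple quoted here follows directly from the two headline numbers. A quick sanity check, using the valuation and revenue figures cited above:

```python
# Price-to-sales check on the figures cited above (USD).
market_cap = 48.8e9    # implied valuation at the $150-$160 offering range
revenue_2025 = 510e6   # 2025 revenue per the S-1

ps_ratio = market_cap / revenue_2025
print(round(ps_ratio, 1))   # 95.7, i.e. the "95x" cited in the text
```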
Andrew Feldman’s Third “Product Sale”
Beyond the numbers, we must consider the company’s central figure.
Andrew Feldman is an underappreciated “serial entrepreneur” in Silicon Valley. He is neither a technical prodigy nor an academic founder—he earned his MBA from Stanford, served as VP of Marketing at Riverstone Networks (which went public in 2001), and later as VP of Product at Force10 Networks (acquired by Dell for $800 million in 2011).
In 2007, he co-founded SeaMicro with Gary Lauterbach, building “energy-efficient servers” by clustering many low-power, small-core processors to compete against mainstream high-power, large-core servers. The idea was visionary—but the market wasn’t ready. AMD acquired SeaMicro in 2012 for $334 million; Feldman served as VP at AMD for two years before departing.
Then he founded Cerebras.
Viewing Feldman’s trajectory holistically reveals something interesting: he is not a “chip designer,” but a “compute infrastructure contrarian bettor.” SeaMicro bet on “small cores beating large cores”—a half-right bet. AMD bought SeaMicro primarily for its Freedom Fabric interconnect technology, intending to build its own server CPU platform—but that path ultimately failed, and the SeaMicro brand quietly faded. Cerebras bets on “large chips beating small chips”—a proposition diametrically opposed to SeaMicro’s thesis.
In a sense, Feldman does the same thing repeatedly: identifying overlooked, seemingly “impossible” paths in computing architecture, placing heavy bets, and then leveraging exceptional sales acumen to push them into the market. At SeaMicro, he commanded Force10’s sales team; AMD valued him precisely for his sales network. With Cerebras, his most crucial achievement was securing G42—transforming a hardware company whose 2024 revenue was still 80% dependent on a single Middle Eastern client into one capable of signing a $20 billion contract with OpenAI.
A telling footnote: Feldman is a product-sales CEO—not a technology-visionary CEO. His edge lies in selling “seemingly crazy” products to clients willing to pay a premium for differentiation.
Understanding this is critical—it directly shapes how we assess Cerebras’ investment value.
So, Is CBRS Worth Investing In?
Overlaying the three paradoxes, the answer is far more nuanced than a simple “buy” or “don’t buy.”
If your goal is to capture the IPO’s first-day pop—20x oversubscription, AI hardware as the hottest sector, and a dearth of pure-play NVIDIA-alternative public companies—CBRS is highly likely to surge on Day One. This is event-driven, short-term trading requiring minimal deep analysis.
But if you’re evaluating a long-term hold, three questions demand careful consideration:
First: Does Cerebras deserve a 95x P/S multiple?
CoreWeave priced near a 15x P/S multiple in its March IPO. NVIDIA trades at roughly 25x P/S today. A company with $510 million in 2025 revenue, 86% customer concentration, and ongoing operational losses priced at 95x P/S implies the market expects it to grow revenue to $3–4 billion annually within three to four years—and sustain profitability.
Can it deliver? Success hinges on whether the $20 billion OpenAI contract materializes as planned. Per the prospectus, ~15% of remaining performance obligations—roughly $3.5 billion—are expected to be recognized in 2026 and 2027. If that pace holds, Cerebras could reach $2+ billion in revenue by 2027, bringing its P/S ratio into a more reasonable range. But any delay, any strategic shift by OpenAI, or any additional customer attrition would instantly undermine this valuation.
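The revenue path sketched here can be cross-checked against the prospectus figures. A back-of-envelope sketch; note that the $2 billion 2027 revenue level is this article’s scenario, not a disclosed forecast:

```python
# Back-of-envelope check on the revenue-recognition scenario (USD).
rpo_near_term = 3.5e9                      # ~15% of remaining performance obligations, per the S-1
implied_total_rpo = rpo_near_term / 0.15
print(round(implied_total_rpo / 1e9, 1))   # ~23.3B total RPO, consistent with a $20B+ contract

# If 2027 revenue reaches $2B as the scenario assumes, the multiple compresses:
market_cap = 48.8e9
ps_2027 = market_cap / 2e9
print(round(ps_2027, 1))                   # ~24.4x, close to NVIDIA's ~25x P/S cited above
```

In other words, the 95x multiple only normalizes if essentially the entire OpenAI recognition schedule lands on time.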
Second: How wide is Cerebras’ moat?
The architectural advantage of the WSE-3 is real—but how long will it last? NVIDIA’s Vera Rubin, AMD’s MI400, and Google’s TPU v6 are all advancing rapidly. Chip-generation cycles run 18–24 months. A single misstep by Cerebras risks rapid catch-up by competitors. Its R&D spend as a share of revenue is already substantial—but in absolute terms, it remains orders of magnitude smaller than the giants’ investments.
A deeper question: Is the wafer-scale chip approach destined to become a mainstream architecture—or will it forever remain a “special forces” solution confined to niche applications? There’s no definitive answer. Optimists argue: as inference’s share of total AI compute workload rises from ~30% today to >70% in the future, Cerebras’ niche becomes the main battlefield. Pessimists counter: NVIDIA need only lift Rubin’s inference performance, and the niche stays niche.
Third: Governance structure and geopolitical risk
The prospectus highlights two easily overlooked but critical points:
First, Cerebras employs a dual-class share structure (Class A/Class B). Post-IPO, insiders retain 99.2% of voting power. Even if the founding team holds just 5% of outstanding shares, they retain full control—leaving external minority shareholders with virtually no voice in corporate governance.
Second, the company discloses two “material weaknesses in internal control over financial reporting.” As an emerging growth company, it qualifies for a five-year exemption from SOX 404(b) auditor attestation. Not a flashing red light—but a yellow flag worth noting.
Geopolitically, CFIUS cleared the G42 voting rights issue this time—but export controls (licenses for shipping CS-2, CS-3, and CS-4 to the UAE) remain an enduring variable. The Trump administration’s policy direction on AI chip exports to the Middle East remains unsettled; any policy shift could reignite tail risk for CBRS.
Conclusion
As an event, the CBRS IPO is the most closely watched AI hardware listing of 2026. It sets the valuation anchor for AI infrastructure in the public markets, and its performance will ripple across pricing for all related securities.
As a long-term holding, it represents a classic “high-upside, high-uncertainty” bet: on the macro narrative of “inference supremacy,” the micro-execution of “Cerebras leveraging OpenAI to dominate a narrow-band monopoly,” and the valuation assumption that “the market will continue paying a 95x P/S premium for AI hardware.” All three conditions must hold simultaneously for outsized returns—and failure of any one triggers severe drawdowns.
For institutional investors, typical positioning involves avoiding the first-day pop, waiting instead for Q3 earnings, key customer updates, and valuation digestion. For retail investors, allocating a small portion to CBRS as a “tail-risk asset” within an AI hardware portfolio is reasonable—but treating it as an all-in “faith-based” position warrants re-reading the triple paradox above.
More consequential than whether CBRS surges at tomorrow’s open is what this IPO signals at a deeper level: When a company deriving 86% of its revenue from two UAE-related entities—and still operating at a real loss—can command a $48.8 billion market cap, it tells everyone exactly how frenzied capital has become in the AI infrastructure race.
Join the TechFlow official community to stay updated:
Telegram: https://t.me/TechFlowDaily
X (Twitter): https://x.com/TechFlowPost
X (Twitter) EN: https://x.com/BlockFlow_News