Original author: TechFlow
Pricing was set on May 13, and trading began on May 14. The Nasdaq ticker symbol is CBRS.
This is the largest IPO globally so far in 2026. The underwriting syndicate consists of Morgan Stanley, Citigroup, Barclays, and UBS. The roadshow drew 20x oversubscription, prompting the price range to be raised from the initial $115-$125 to $150-$160. The offering is expected to raise $4.8 billion, implying a valuation of $48.8 billion.
Just three months ago, Cerebras' secondary-market valuation was $23 billion. In other words, in the final stretch before its IPO, the company's paper valuation more than doubled.
The story's selling points have been repeated a thousand times: Nvidia's challenger, wafer-scale chips, inference 21 times faster than the B200, and an OpenAI compute contract starting at $1 billion and scaling to as much as $20 billion. It is a perfect "AI challenger" script, with a technology narrative, a geopolitical narrative, star customers, and huge orders; every component aligns precisely with the main theme of AI infrastructure in 2026.
But if you read through the S-1 document page by page, you'll find something strange: all the public reports tell the same story, while the prospectus tells a different one.
Triple Paradox
Dissecting the prospectus item by item, Cerebras presents itself as a company built on a "triple paradox."
The first layer: Technically, it is genuine alpha; financially, it is accounting magic.
The prospectus reveals that Cerebras' revenue in 2025 reached $510 million, up 76% year over year, with GAAP net profit of $237.8 million. This sounds incredibly impressive: a fast-growing, profitable AI hardware company is practically a "legendary" stock in the current valuation environment. While CoreWeave was still operating at a loss when it went public in March of last year, Cerebras boasts a net profit margin of 47%.
However, of this $237.8 million "net profit," $363.3 million came from a one-time, non-cash accounting adjustment: a paper gain from extinguishing a forward-contract liability related to G42. Excluding that gain and adding back $49.8 million in stock-based compensation, the actual non-GAAP result for 2025 is a net loss of $75.7 million, a 247% deterioration from the $21.8 million loss in 2024.
In other words, the market sees a "profitable IPO with 76% growth," while the prospectus discloses a "rapidly growing company with continuously expanding losses." Neither version is wrong; the difference lies in which one the market chooses to believe.
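The reconciliation above is a few lines of arithmetic. A minimal sketch using only the figures quoted in this section (all in $ millions; the variable names are mine, not the prospectus's):

```python
# Non-GAAP reconciliation from the figures quoted above (all $ millions).
gaap_net_income = 237.8    # reported 2025 GAAP net profit
one_time_gain = 363.3      # non-cash paper gain on the G42 forward-contract extinguishment
stock_comp_addback = 49.8  # stock-based compensation added back

non_gaap_2025 = gaap_net_income - one_time_gain + stock_comp_addback
loss_2024 = -21.8

# Year-over-year widening of the non-GAAP loss.
worsening_pct = (abs(non_gaap_2025) - abs(loss_2024)) / abs(loss_2024) * 100

print(f"non-GAAP 2025 result: {non_gaap_2025:.1f}M")  # -75.7
print(f"loss widened by {worsening_pct:.0f}%")        # 247
```

Both published numbers check out; the only judgment call is which line, GAAP or non-GAAP, you treat as the "real" one.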
The second layer: On the surface, it has shed G42; in reality, that dependency has been swapped for a nested loop with OpenAI.
The story of Cerebras' first, failed IPO attempt in 2024 is not complicated: G42, a UAE-backed client, contributed 85% of revenue in the first half of that year; CFIUS opened an investigation; and the company was forced to withdraw its application.
A year and a half later, the client list appears more diversified, with the addition of heavyweight clients like OpenAI and AWS. However, looking at the S-1 report from May 2026, the client structure in 2025 looked like this:
- MBZUAI (Mohamed bin Zayed University of Artificial Intelligence): 62%
- G42: 24%
- Combined: 86%
G42 simply shifted its "weight" to MBZUAI, which is also based in the UAE and is a related party of G42. MBZUAI alone accounts for 77.9% of accounts receivable.
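A quick sketch of the concentration math, using only the revenue shares listed above (the dictionary labels are mine):

```python
# 2025 revenue shares for the two UAE-linked customers, per the S-1 figures above.
revenue_share = {"MBZUAI": 0.62, "G42": 0.24}

uae_related_share = sum(revenue_share.values())
print(f"combined UAE-related revenue share: {uae_related_share:.0%}")  # 86%
```

Whatever the customer labels say, 86 cents of every revenue dollar still traces back to related entities in one country.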
OpenAI's so-called "redemption arc" is itself a nested structure. Under the contract, valued at over $20 billion, OpenAI committed to purchasing 750 megawatts of computing power. However, the same document discloses several other things: OpenAI extended Cerebras a $1 billion loan; OpenAI received nearly free warrants for 33 million shares of Cerebras stock; and OpenAI's Master Relationship Agreement includes exclusivity clauses restricting Cerebras from selling to certain "named competitors."
In other words, OpenAI is simultaneously Cerebras's customer, lender, upcoming shareholder, and, to some extent, strategic controller. An anonymous analyst made a very blunt statement in an analysis on Medium: "When revenue is cyclical, valuations are cyclical, and IPOs are for those who generate that revenue to cash out, that's not a market, that's financial engineering."
The wording may be harsh, but on the facts it is hard to refute.
The third layer: On the surface, it is Nvidia's "challenger," but in essence, it is Nvidia's "narrowband replacement."
This is the point that is most easily overlooked by the market.
Cerebras' technology is indeed robust. The WSE-3 packs 4 trillion transistors, 900,000 AI cores, and 44GB of on-chip SRAM, integrating an entire wafer into a single chip and bypassing the cross-chip communication bottleneck that every GPU cluster must overcome. Independent benchmarks from Artificial Analysis show that, running Llama 4 Maverick (400 billion parameters), the CS-3 outputs over 2,500 tokens per user per second, versus roughly 1,000 for NVIDIA's flagship DGX B200 and 549 and 794 for Groq and SambaNova, respectively.
The numbers don't lie; Cerebras has a generational advantage over GPUs in the specific scenario of inference.
The key word is "inference." Cerebras' own prospectus states plainly that its strength lies in latency-sensitive inference workloads; it has neither the intention nor the capacity to challenge NVIDIA in large-model training and general-purpose computing. The CUDA ecosystem has been built up over nearly 20 years since 2007, and its training toolchain, developer community, and third-party libraries all remain firmly inside NVIDIA's moat.
More importantly, the market hasn't stood still. Nvidia's Vera Rubin architecture, unveiled at GTC 2026, packs 336 billion transistors and claims a 5x performance leap over Blackwell; AMD's MI400 has reached 320 billion transistors; and with Google's TPU v6, Amazon's Trainium 3, and Microsoft's Maia 2, the hyperscalers are all building their own chips. Nvidia plans to invest over $18 billion in R&D in fiscal year 2025; last December it spent $20 billion to acquire the assets of AI inference startup Groq; and in March it put another $4 billion into two photonics companies.
Therefore, a more accurate statement is: Cerebras is not trying to replace Nvidia; it's trying to carve out a differentiated niche within Nvidia's narrow "inference" band. This is a genuine business, but a valuation of $48.8 billion corresponds to $510 million in revenue, meaning a price-to-sales ratio of 95.
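The price-to-sales arithmetic behind that claim is simple division; a sketch with the two numbers quoted above:

```python
# Figures quoted in the text above.
valuation = 48.8e9    # $48.8B implied IPO valuation
revenue_2025 = 510e6  # $510M reported 2025 revenue

ps_ratio = valuation / revenue_2025
print(f"P/S at IPO: {ps_ratio:.0f}x")  # prints 96; the text rounds down to 95
```

Either rounding makes the same point: this is several times the multiple of any listed comparable discussed later in the piece.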
Andrew Feldman's third time selling a product
Beyond the numbers, we need to talk about the key figure behind this company.
Andrew Feldman is an underrated serial entrepreneur in Silicon Valley. He is not a technical-genius founder, nor did he come from an ivory tower. He graduated from Stanford Business School, served as Vice President of Marketing at Riverstone Networks (which went public in 2001), and as Vice President of Product at Force10 Networks (sold to Dell for $800 million in 2011).
In 2007, he co-founded SeaMicro with Gary Lauterbach to build "energy-efficient servers," clustering many small-core, low-power processors to compete with the then-mainstream large-core, high-power machines. The idea was forward-thinking, but it was too early for the market. In 2012, AMD acquired SeaMicro for $334 million; Feldman left after two years as an AMD VP.
Then came Cerebras.
Viewed as a whole, Feldman's path reveals something interesting: he was never a "chip designer" but an "alternative bettor on compute infrastructure." SeaMicro bet that "small cores beat big cores" and half-lost. AMD acquired SeaMicro intending to use its Freedom Fabric interconnect for its own server-CPU platform, but that path failed and the SeaMicro brand quietly disappeared. Cerebras, by contrast, bets that "big chips beat small chips," the exact opposite proposition.
In a sense, Feldman keeps doing the same thing: find the seemingly "impossible" computing-architecture paths overlooked by the mainstream, bet heavily on them, and then use his formidable sales ability to bring them to market. Back then, SeaMicro drew on the Force10 sales team, and AMD was attracted by his sales network. The most important thing Cerebras did right this time was to land G42, which enabled a hardware company still deriving 80% of its revenue from a single Middle Eastern customer in 2024 to ultimately sign a $20 billion contract with OpenAI.
The footnote to this story: Feldman is a product-selling CEO, not a technology-visionary CEO. He excels at selling a "crazy-sounding" product to customers willing to pay a premium for differentiation; that is his alpha.
Understanding this is important because it directly determines the assessment of Cerebras' investment value.
So, is CBRS worth investing in?
Taking the three paradoxes above together, the answer is considerably more complex than a simple "buy" or "don't buy."
If the goal is to capitalize on the initial surge of an IPO, given the 20-fold oversubscription, the hottest sector of AI hardware, and the lack of pure Nvidia alternative listings, CBRS will likely see a sharp rise on its first day. This is an event-driven short-term trade that doesn't require much in-depth analysis.
However, if you want to make an investment decision to "hold for the long term," there are three things you must think about first:
First, is Cerebras worth a price-to-sales ratio of 95?
CoreWeave went public in March of last year at a price-to-sales ratio of around 15; Nvidia currently trades at about 25. Pricing a company with $510 million of 2025 revenue, 86% customer concentration, and real operating losses at 95 times sales means the market expects it to reach $3-4 billion in revenue within three to four years while turning sustainably profitable.
Whether that bet pays off depends on whether OpenAI's $20 billion contract lands on schedule. According to the prospectus, approximately 15% of the remaining performance obligations, about $3.5 billion, will be recognized in 2026 and 2027. At that pace, Cerebras' revenue could exceed $2 billion in 2027, potentially bringing its price-to-sales ratio down to a reasonable range. But any delay, any strategic shift by OpenAI, or the loss of any new customer could instantly shatter this valuation.
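To see how fast growth would have to "digest" the multiple, here is a hypothetical scenario table. The revenue figures are the ones discussed above; the multiples are simple division against a static valuation, not forecasts:

```python
valuation_b = 48.8  # $B, the IPO valuation quoted above, held constant

# Revenue scenarios ($B) drawn from the figures in the text; labels are mine.
scenarios = {
    "2025 actual": 0.51,               # $510M reported
    "2027 if OpenAI pays on schedule": 2.0,   # ~$2B at the disclosed RPO pace
    "3-4yr market expectation (mid)": 3.5,    # midpoint of the $3-4B range
}

for label, revenue_b in scenarios.items():
    print(f"{label}: P/S = {valuation_b / revenue_b:.0f}x")
```

Even on the friendliest scenario, the implied multiple only falls back to roughly where comparables like CoreWeave already trade, which is the sense in which the valuation "pre-pays" for flawless execution.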
Second, how wide is Cerebras' moat?
The architectural advantages of the WSE-3 are real, but how long will they last? Nvidia's Vera Rubin, AMD's MI400, and Google's TPU v6 are all pushing their own roadmaps. The generational replacement cycle in the chip industry is 18-24 months; if Cerebras falls even one cycle behind, its technological edge evaporates. And while its R&D spending as a share of revenue is considerable, the absolute amount remains orders of magnitude smaller than the giants'.
A deeper question: will the wafer-scale approach become a mainstream, widely adopted path, or remain forever a "special forces" play confined to niche scenarios? There is no definitive answer. The optimistic view: as inference workloads grow to 70%+ of total AI compute, Cerebras' niche becomes the main battleground. The pessimistic view: as long as Nvidia keeps improving Rubin's inference performance, the niche stays a niche.
Third, governance structure and geopolitical risks
The prospectus disclosed two easily overlooked but very important things:
First, Cerebras employs a Class A/Class B dual-class share structure, with insiders holding 99.2% of the voting rights after the IPO. Even if the founding team only holds 5% of the outstanding shares in the future, they will still control the company. This means that external minority shareholders have virtually no say in corporate governance.
Second, the company disclosed two "material weaknesses in internal control over financial reporting." As an emerging growth company, it can be exempt from SOX 404(b) auditor attestation for up to five years after its IPO. This is a red flag; not a major one, but worth noting.
Geopolitically, CFIUS has resolved the G42 voting-rights issue this time, but export controls (licenses to ship CS-2, CS-3, and CS-4 systems to the UAE) remain a long-term variable. The Trump administration's policy on AI chip exports to the Middle East has not been entirely stable to date, and any shift could reignite CBRS's tail risks.
In conclusion
The CBRS IPO, as an event, is the most noteworthy AI hardware capital event of 2026. It defines the valuation anchor for the AI infrastructure sector in the secondary market, and its performance will be reflected in the pricing of all related targets.
As a long-term holding, it is a typical "high-odds, high-uncertainty" bet: on the macro narrative that "inference is king," on the micro execution that "Cerebras can achieve a narrowband monopoly through OpenAI," and on the valuation assumption that "the market will keep paying a 95x price-to-sales premium for AI hardware." All three must hold simultaneously for the returns to be enormous; if any one fails, the drawdown will be severe.
For institutional investors, the typical playbook is to avoid chasing the first-day pop and instead wait for the Q3 earnings report, progress with key clients, and valuation digestion before building a position. For individual investors, treating it as a small tail position in an AI hardware portfolio is acceptable; but if you intend to go all in and hold without question, please reread the triple paradox above.
More noteworthy than whether CBRS will surge at the opening tomorrow is another layer of significance: when a company that derives 86% of its revenue from two related entities in the UAE, and whose actual operations are still losing money, can be valued at $48.8 billion, this in itself tells everyone just how crazy the capital frenzy in the AI infrastructure sector has become.




