Chainfeeds Summary:
The ultimate winner of this AI arms race may not be the one with the best model, nor the one with the most funding.
Article source:
https://www.techflowpost.com/zh-CN/article/31383
Article Author:
TechFlow
Opinion:
TechFlow: Let's start with some figures. OpenAI's disclosed infrastructure commitments total: $250 billion with Microsoft Azure, $300 billion with Oracle's Stargate project, and $138 billion with Amazon AWS (the original $38 billion plus an additional $100 billion, over an eight-year term). That adds up to more than $680 billion, against annualized revenue of only about $25 billion. A company earning $25 billion a year has signed computing power bills exceeding $680 billion. OpenAI has effectively sold itself to compute providers; it is now the anchor customer of all three major cloud vendors.

Anthropic is in a similar position. Just last week it signed an expanded partnership with Amazon, committing to spend over $100 billion on AWS over the next ten years in exchange for 5 gigawatts of computing power. Four days later it signed a 3.5-gigawatt TPU capacity agreement with Google and Broadcom, expected to come online in 2027. Add Google's announcement last week of a potential $40 billion investment, and Anthropic is now locked in by two cloud giants. The two leading AI labs are trading their futures for computing power.

Look back at Microsoft's $1 billion investment in OpenAI in 2019: what did that money buy? Exclusive model distribution rights. Azure alone carried the GPT series; if a customer on another cloud wanted OpenAI's models, the answer was: sorry, move to Azure. That was the era of model scarcity. GPT was the only large language model that could be called competitive, and whoever owned it had pricing power.

But the reality of 2026 is that models are no longer scarce. Anthropic's Claude, Google's Gemini, and Meta's open-source Llama all run on multiple cloud platforms. Ramp's enterprise spending data shows that 79% of companies paying for Anthropic also pay for OpenAI; enterprise customers simply refuse to be locked into a single platform. OpenAI itself understands this.
Chief Revenue Officer Denise Dresser put it plainly in a March internal memo: "Our partnership with Microsoft laid the foundation for us, but it also limited our ability to meet the actual needs of our enterprise customers." In other words, the exclusive binding that was once an advantage has become a shackle. The model layer is commoditizing fast, and when every mainstream model runs on every mainstream cloud, the value of exclusive distribution rights approaches zero. Electricity, not models, is the new oil.

Return to the revised agreement between Microsoft and OpenAI. On the surface, OpenAI won the freedom to sell its models on AWS and Google Cloud, while Microsoft lost exclusivity but kept a 27% stake and non-exclusive IP licenses through 2032. The shift from exclusive to non-exclusive sounds like a win for OpenAI, but the $250 billion Azure procurement commitment remains, and OpenAI products will still be deployed on Azure first unless Microsoft declines to support them. That part hasn't changed.

This isn't untying the knot; it's swapping a chain for a pipe. The binding used to be contractual; now it runs through infrastructure. OpenAI's predicament is that it has signed multi-year computing power contracts with Azure ($250 billion), AWS ($138 billion), and Oracle ($300 billion) simultaneously, each tied to specific chip architectures and deployment plans. It has won multi-cloud freedom technically while being financially bound to three cloud providers at once. That looks less like escaping one landlord and more like acquiring three.
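The commitment arithmetic above can be checked with a quick sketch. The figures are the ones cited in the article, in billions of USD; this is purely illustrative back-of-the-envelope math, not an official accounting:

```python
# Tally OpenAI's disclosed infrastructure commitments as reported in the article
# (all figures in billions of USD, taken directly from the text).
openai_commitments = {
    "Microsoft Azure": 250,
    "Oracle (Stargate)": 300,
    "Amazon AWS": 38 + 100,  # original deal plus the reported expansion
}

total = sum(openai_commitments.values())
annualized_revenue = 25  # OpenAI's annualized revenue per the article

print(f"Total commitments: ${total}B")  # $688B, i.e. "over $680 billion"
print(f"Ratio to annualized revenue: {total / annualized_revenue:.1f}x")  # 27.5x
```

The ratio is the striking part: the committed spend is roughly 27 times current annualized revenue, which is the article's point about trading the future for computing power.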