This article is the latest commentary from @farzyness, an independent analyst with 360,000 followers. He began investing in Tesla in 2012 and led a team at Tesla from 2017 to 2021. Compiled and translated by PANews.
One person owns a battery company, an AI company, and a rocket company, and they all support each other. I've been thinking about this for months, and honestly, I really don't see how Musk could lose.
This isn't a fanatic's take; it's a structural one. The Tesla-xAI-SpaceX triangle is evolving into something unprecedented: an industrial-scale, mutually reinforcing, cash-generating flywheel. That description sounds convoluted, but it is accurate.
Let me break down what's going on here, because most people look at these companies in isolation, when the real story is the connections between them.
The starting point of the flywheel: energy
Tesla manufactures batteries, and in massive quantities. They deployed 46.7 gigawatt-hours (GWh) of energy storage systems in 2025, a 48.7% year-over-year increase. Their 50 GWh factory in Houston will begin production this year, bringing total planned capacity to 133 GWh per year. The gross margin for this business is 31.4%, compared to only 16.1% for the automotive business. This seemingly "boring" energy storage business generates almost twice the gross profit per dollar of revenue that the automotive business does.
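The "almost twice the profit per dollar" claim is easy to verify from the two margin figures cited above; a quick sanity check:

```python
# Gross margins cited in the article.
storage_margin = 0.314   # energy storage business
auto_margin = 0.161      # automotive business

# Gross profit per dollar of revenue is just the gross margin,
# so the ratio between the two businesses is:
ratio = storage_margin / auto_margin
print(f"Storage earns {ratio:.2f}x the gross profit per revenue dollar")
# 0.314 / 0.161 ≈ 1.95, i.e. "almost twice"
```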
Why is this important? Because xAI just purchased $375 million worth of Tesla Megapacks (utility-scale energy storage) to power Colossus, the world's largest AI training facility. 336 Megapacks have already been deployed.
These batteries provide backup power and demand response capabilities for this system, which has 555,000 GPUs and consumes more than 1 gigawatt of power (enough to power 750,000 homes).
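The "1 gigawatt ≈ 750,000 homes" comparison checks out against typical household consumption; a rough calculation, assuming the roughly 1.2 kW average draw of a US household (about 10,700 kWh per year, a figure not from the article):

```python
facility_power_w = 1e9   # >1 GW cited for the Colossus facility
homes = 750_000          # homes figure from the article

# Implied average draw per home if 1 GW were split across 750,000 homes.
avg_kw_per_home = facility_power_w / homes / 1000
print(f"Implied average draw: {avg_kw_per_home:.2f} kW per home")
# ≈1.33 kW, in line with the ~1.2 kW average draw of a US household,
# so the comparison is plausible.
```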
Breaking free from Nvidia: Chip self-sufficiency
Tesla not only sells batteries, but is also developing its own AI chips.
Currently, Nvidia monopolizes AI training hardware, controlling approximately 80% of the market. All major AI labs (OpenAI, Google, Anthropic, Meta) are vying for Nvidia's quotas. The H100 and now the Blackwell chip are the bottlenecks for the entire industry. Jensen Huang's pricing power is something most monopolists dream of.
If you were Elon Musk and wanted to build the world's largest AI system, what would you do? You can't rely on Nvidia forever. That's your Achilles' heel, a lever someone else holds in their hands, especially when you plan to power hundreds of millions of robots over the next 10 to 20 years.
Incidentally, Musk's plan for Tesla is to build roughly as many robots as there are humans.
Tesla's AI5 chip is slated for release between the end of this year and 2027. Musk claims it will be the world's most powerful inference chip, especially in terms of cost per unit of computing power. In other words, it will be extremely efficient.
A $16.5 billion foundry contract has been signed with Samsung for the AI6 chip. The key point is that Musk said the AI6 is designed for both Optimus robots and data centers. This means Tesla products and xAI products will share the same chip.
Nvidia currently wins in "training," but "inference" is the long-term profit driver. Training only happens once, but every time someone uses the model, inference is generated. If you're running millions of Tesla cars, millions of Optimus bots, and billions of Grok queries, inference is where the real computing power demand lies.
By building its own inference chips, Tesla and xAI have "decoupled" themselves from Nvidia, which is focused on training. This is like bypassing a fortified front and flanking the enemy.
Space-based AI computing
Musk mentioned "space-based AI computing" in Tesla's Dojo 3 roadmap. Their reboot of the Dojo 3 project is precisely for this vision. And when you do the math, this seemingly crazy idea makes perfect sense.
If you wanted to deploy 1 terawatt of AI computing power in space annually (on the scale of global AI infrastructure), according to Musk, at current chip costs, you would need more money than the total amount of currency in existence. The Nvidia H100, priced between $25,000 and $40,000, is simply not economically feasible.
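A rough illustration of why current chip economics rule this out. The price range comes from the article; the ~700 W board power per H100 is an assumption I'm adding, and launch, solar, and cooling costs are ignored entirely:

```python
target_power_w = 1e12          # 1 terawatt of AI compute
h100_power_w = 700             # assumed board power per H100 (~700 W)
h100_price = (25_000, 40_000)  # price range cited in the article

chips_needed = target_power_w / h100_power_w
low = chips_needed * h100_price[0]
high = chips_needed * h100_price[1]
print(f"{chips_needed:.2e} chips, ${low/1e12:.0f}T to ${high/1e12:.0f}T")
# Roughly 1.4 billion chips and $36T to $57T in silicon alone,
# before launch or infrastructure costs.
```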
But if you have chips that are extremely low-cost, specifically designed for inference, mass-producible, and highly energy-efficient, the mathematical model changes. Tesla's goal is to manufacture AI chips using "the lowest-cost silicon chips." This is key to enabling large-scale space computing.
Without affordable chips, space AI is just a fantasy; with affordable chips, it becomes inevitable.
StarCloud, a competitor backed by Nvidia, trained its first AI model in space last December, proving the concept is feasible. The question now is not whether it works, but who can deploy it at scale.
Imagine this: SpaceX sends orbital data centers into low Earth orbit via Starship, each rocket carrying 100 to 150 tons. These data centers run models developed by xAI, using Tesla-designed chips, and are powered by solar energy and Tesla batteries. Free solar energy, zero-cost cooling. Inference results are transmitted directly to Tesla cars and Optimus robots on Earth via Starlink.
Data and connection closed loop
SpaceX already has nearly 10,000 Starlink satellites in orbit and has been authorized to launch another 7,500. They have 6 million direct-connect mobile phone customers. The V3 satellite launched this year has a downlink capacity of 1 terabit per second (1Tbps), 10 times that of current models.
The flywheel spins wildly here:
- xAI builds the models (Grok 3 has 3 trillion parameters, Grok 4 topped global benchmarks, and Grok 5, with 6 trillion parameters, is slated for Q1 2026).
- These models are integrated into Tesla vehicles. Grok has been available in-car since July 2025 for conversation and navigation, running on the same Tesla chip that handles the car's Autopilot functions.
- Grok will become the "brain" of the Optimus robot. Tesla plans to produce 50,000 to 100,000 Optimus units this year and reach 1 million units by 2027.
Put it together: xAI builds the models, Tesla builds the chips, Tesla builds the robots that do the work, Tesla builds the batteries that power them, SpaceX provides global connectivity and access to space, xAI trains on data from across Tesla and SpaceX, and inference is served from solar-powered satellites overhead.
An insurmountable moat
A moat like this is self-reinforcing.
- Tesla has 7.1 billion miles of FSD driving data, more than 50 times that of Waymo. Real-world data trains better models, better models improve vehicle performance, and better vehicles collect even more data.
- X (formerly Twitter): xAI has exclusive access to real-time human data generated by approximately 600 million monthly active users. This differs from YouTube or search data; it is raw, unstructured, and real-time human thought. When Grok experiences hallucinations, they can correct them against real-time consensus faster than anyone else.
What could competitors use to catch up?
- Google has vertical integration (TPU chips, Gemini, YouTube), but Waymo is too small and lacks a launch vehicle and real-time social media feed.
- Microsoft has Copilot and Azure, but relies on OpenAI and lacks physical hardware, space infrastructure, and autonomous driving data.
- Amazon has AWS, custom chips, and logistics robots, but lacks consumer AI products with large-scale adoption, a fleet of cars, and launch capabilities.
- Nvidia dominates training, but it has no "physical layer": no data-collecting cars, no robots in factories, no global satellite network. It sells the chips but doesn't control the application endpoints.
To compete with Musk, you would need to simultaneously found or acquire five different top companies, and he consolidates his advantage every day.
Conclusion
Most analysts treat Tesla, xAI, and SpaceX as separate investments, but that's a complete misconception. The value lies not in any single part, but in how they complement each other.
xAI is valued at $250 billion, SpaceX at roughly $800 billion (reportedly seeking a $1.5 trillion valuation at IPO), and Tesla at $1.2 trillion. The combined enterprise value exceeds $2 trillion, and that doesn't even include a premium for synergies.
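The combined figure follows directly from the three valuations cited:

```python
# Valuations cited in the article (current, not IPO targets).
valuations = {
    "xAI": 250e9,
    "SpaceX": 800e9,   # reportedly seeking $1.5T at IPO
    "Tesla": 1.2e12,
}
total = sum(valuations.values())
print(f"Combined: ${total / 1e12:.2f} trillion")
# 0.25 + 0.8 + 1.2 = $2.25 trillion, matching the ">$2 trillion" figure
```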
Each link enhances the other:
- If Tesla succeeds, xAI gets more training data.
- If xAI succeeds, Tesla's cars and robots get smarter.
- If SpaceX succeeds, the whole system gets global coverage.
- If the energy business succeeds, electricity costs fall across every facility.
- If the chip strategy succeeds, they are free of dependence on Nvidia.
- If Optimus succeeds, the total addressable market (TAM) for labor exceeds $40 trillion annually.
Am I missing something? If you can spot any flaws I haven't seen, I'd love to hear them. Because after observing for so many years, I really can't find a single one.




