After reading @xmaquina and $DEUS's latest articles, I have many thoughts.
To put it simply, the robotics industry is in the middle of a "standards war," much like the one Intel and Microsoft won in the PC era, or Android won in smartphones.
Everyone wants to become the indispensable underlying supplier.
But who will, and how, only time will tell.
1️⃣ The first test: who can handle the massive on-board compute?
Moving a several-hundred-pound hunk of metal with no perceptible lag demands extremely capable hardware.
NVIDIA is the "big brother": its Jetson Thor chip is currently the industry benchmark. Its advantage is a mature ecosystem; almost everyone in robotics uses its toolchain.
But there are challengers.
Etched takes an "extreme route," building chips custom-tailored to Transformer models. They run that one kind of AI blazingly fast, but can't do anything else.
Hailo takes the "power-saving" route, focusing on low power consumption. Robots run on batteries, after all; if the brain draws too much power, the battery can't keep up.
🤔 Competition at the hardware level ultimately comes down to a trade-off between energy efficiency and versatility. NVIDIA is strong, but if one specific architecture (say, the Transformer) ends up completely dominating robotics, dedicated chips like Etched's could leapfrog it.
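The battery argument above can be made concrete with some back-of-the-envelope arithmetic. A minimal sketch, where every number (battery capacity, motor draw, chip wattage) is a hypothetical assumption rather than any vendor's specification:

```python
# Why compute power draw matters on a battery-powered robot.
# All figures below are illustrative assumptions, not vendor specs.

BATTERY_WH = 1000.0   # assumed onboard battery capacity (watt-hours)
ACTUATION_W = 400.0   # assumed average draw of motors and sensors (watts)

def runtime_hours(compute_w: float) -> float:
    """Hours of operation given the compute module's average draw."""
    return BATTERY_WH / (ACTUATION_W + compute_w)

for label, watts in [("high-power GPU module", 100.0),
                     ("low-power accelerator", 10.0)]:
    print(f"{label:>22}: {runtime_hours(watts):.2f} h")
# A 90 W difference in "brain" power buys roughly 20% more runtime here.
```

The point is not the exact numbers but the shape of the trade-off: on a machine where actuation already dominates the power budget, shaving compute watts buys runtime, which is exactly the niche a low-power chip vendor targets.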
2️⃣ How to teach robots to think like humans?
Robots used to be "dead": you had to write code telling them how to move, step by step. The trend now is to give them a large model and let them figure it out themselves.
Physical Intelligence (π) and Skild AI are both chasing the same thing: a universal model. The same "soul" can run in a robot dog and do housework in a humanoid robot, without retraining.
Covariant has already proven itself in warehouses, teaching robots to handle messy goods like humans.
OpenAI, while not building robots, provides "language and reasoning plugins" to various robotics companies. If robots can understand human speech and reason logically, they are not far from true intelligence.
🤔 The real competitive edge at the software level is data. Whoever lets robots accumulate more experience (robot-hours) through "watching videos" or simulation training ends up with the smarter model.
This is no longer a contest of writing code; it's a contest of accumulating data.
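The "robot-hours" framing above invites a quick comparison of how experience accrues for a physical fleet versus parallel simulation. A minimal sketch; the fleet size, duty cycle, simulator count, and speedup are all hypothetical assumptions:

```python
# Illustrative: how "robot-hours" scale for a real fleet vs. simulation.
# Every figure here is a made-up assumption for the sake of the comparison.

def fleet_hours(robots: int, hours_per_day: float, days: int) -> float:
    """Experience accumulated by a physical fleet operating in the real world."""
    return robots * hours_per_day * days

def sim_hours(instances: int, speedup: float, days: int) -> float:
    """Experience from parallel simulators running faster than real time."""
    return instances * speedup * 24 * days

real = fleet_hours(robots=1_000, hours_per_day=8, days=365)
sim = sim_hours(instances=10_000, speedup=100.0, days=365)
print(f"physical fleet: {real:,.0f} robot-hours/year")
print(f"simulation:     {sim:,.0f} robot-hours/year")
```

Under these assumptions simulation out-accumulates the fleet by three orders of magnitude, which is why "simulation training" sits alongside real deployment as a data strategy; real-world hours remain scarcer but closer to ground truth.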
3️⃣ How to move from the lab to real-world scenarios
Robots can't just pace around labs; they need to work on construction sites, inspect power lines, even help with disaster relief.
Sanctuary AI, for its part, puts particular emphasis on dexterous "hands": if the hands aren't dexterous, even the smartest brain is useless.
📖In summary
In physical AI today, the competition is no longer just about whose robot looks most human, but about whose chip is more energy-efficient and faster, and whose model is more general and intelligent.
The situation feels a bit like the eve of the personal computer boom in the 1990s: everyone saw a lucrative prize and fought for control of the underlying protocols and core hardware. In the end, the winner may not be whoever builds the fastest robot, but whoever becomes the "operating system" and "central processor" that every robot runs on.

From Twitter