Besides AI agents, embodied robots are another key vertical for real-world AI deployment. Morgan Stanley has predicted in a report that the global humanoid robot market could exceed $5 trillion by 2050.
As AI develops, robots will gradually evolve from mechanical arms in factories into companions in our daily lives, relying on AI for perception, understanding, and even independent decision-making. The problem is that today's robots are effectively "mute" toward one another: each manufacturer uses its own language and logic, so software is incompatible and intelligence cannot be shared. It is as if you bought a Xiaomi car and a Tesla, but they could not even assess road conditions together, let alone collaborate to complete a task.
What OpenMind wants to change is precisely this "every man for himself" situation. It does not make robots; it wants to build a collaborative system in which robots "speak the same language, follow the same rules, and get things done together". By analogy: iOS and Android triggered the explosion of smartphone applications, Ethereum provided a common foundation for the crypto world, and OpenMind wants to create a unified "operating system" and "collaboration network" for robots worldwide.
In one sentence, OpenMind is building a universal operating system for robots, enabling them not only to perceive and act, but also to collaborate safely and at scale through decentralized coordination in any environment.
Who Is Supporting This Open Foundation
OpenMind has raised $20 million across a seed round and Series A, led by Pantera Capital. More important than the amount is the breadth and complementarity of the capital, which brings together almost all the key pieces of this track. On one side are long-term forces from Western tech and financial ecosystems (Ribbit, Coinbase Ventures, DCG, Lightspeed Faction, Anagram, Pi Network Ventures, Topology, Primitive Ventures), investors familiar with the paradigm shifts of crypto and AI infrastructure who can supply models, networks, and compliance experience for an "agent economy plus machine internet". On the other side is Eastern industrial momentum, represented by Sequoia China's supply-chain and manufacturing network, which deeply understands the technical and cost thresholds involved in turning a prototype into a scalable product. Together, these two forces give OpenMind not just funding but a path and resources "from laboratory to production line, from software to underlying manufacturing".

This path is also aligning with traditional capital markets. In June 2025, when KraneShares launched KOID, its global humanoid and embodied intelligence index ETF, the humanoid robot Iris, jointly customized by OpenMind and RoboStore, rang the opening bell at Nasdaq, becoming the first "robot guest" in exchange history to perform the ceremony. This both synchronizes the technological and financial narratives and sends an open signal about how machine assets can be priced and settled.
As Nihal Maunder, partner at Pantera Capital, said:
"If we want intelligent machines to run in open environments, we need an open intelligent network. What OpenMind is doing for robots is like Linux for software, Ethereum for blockchain."
A Team from Laboratory to Production Line
OpenMind's founder Jan Liphardt is an associate professor at Stanford University and a former Berkeley professor who has long researched data and distributed systems, with deep roots on both the academic and engineering sides. He advocates open-source reuse, replacing black boxes with auditable, traceable mechanisms, and integrating AI, robotics, and cryptography through interdisciplinary methods.
OpenMind's core team comes from OKX Ventures, the Oxford Robotics Institute, Palantir, Databricks, Perplexity, and other institutions, covering key areas such as robot control, perception and navigation, multimodal and LLM scheduling, distributed systems, and on-chain protocols. Meanwhile, an advisory board of academic and industry experts (such as Stanford Robotics Director Steve Cousins, the Oxford Blockchain Centre's Bill Roscoe, and Imperial College safe-AI professor Alessio Lomuscio) helps ensure the robots' "safety, compliance, and reliability".
OpenMind's Solution: A Two-Layer Architecture, One Shared Order
OpenMind has built a reusable infrastructure that allows robots to collaborate and exchange information across devices, manufacturers, and even countries:
Device Side: Provides an AI-native operating system OM1 for physical robots, connecting the entire chain from perception to execution, allowing robots of different forms to understand the environment and complete tasks;
Network Side: Builds a decentralized collaborative network FABRIC, providing identity, task allocation, and communication mechanisms to ensure robots can identify each other, assign tasks, and share states when collaborating.
This combination of "operating system + network layer" allows robots not only to act individually but also to cooperate, align processes, and complete complex tasks together in a unified collaborative network.
OM1: An AI-Native Operating System for the Physical World
Just as phones need iOS or Android to run applications, robots also need an operating system to run AI models, process sensor data, make reasoning decisions, and execute actions.
OM1 was built for exactly this purpose: an AI-native operating system for real-world robots that enables them to perceive, understand, plan, and complete tasks in varied environments. Unlike traditional closed robot control systems, OM1 is open-source, modular, and hardware-agnostic, capable of running on humanoid, quadruped, wheeled, and robotic-arm platforms.
Four Core Links: From Perception to Execution
OM1 breaks down robot intelligence into four universal steps: Perception → Memory → Planning → Action. OM1 fully modularizes this process and connects it through a unified data language, making its intelligence capabilities composable, replaceable, and verifiable.

OM1's Architecture
Specifically, OM1's seven-layer pipeline works as follows:
Sensor Layer collects information: multimodal perception inputs from cameras, LiDAR, microphones, battery status, GPS, and more.
AI + World Captioning Layer translates information: multimodal models convert visual, voice, and status inputs into natural-language descriptions (e.g., "You see someone waving").
Natural Language Data Bus transmits information: all perceptions become time-stamped language fragments passed between modules.
Data Fuser combines information: integrates multi-source inputs into a complete context (prompt) for decision-making.
Multi-AI Planning/Decision Layer generates decisions: multiple LLMs read the context and, combined with on-chain rules, generate action plans.
NLDB downstream channel relays decisions: passes decision results through the language intermediate layer to the hardware execution system.
Hardware Abstraction Layer takes action: converts language instructions into low-level control commands that drive hardware (movement, speech, transactions, etc.).
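The layered loop described above can be sketched in a few dozen lines. Everything below is illustrative: the class names, the bus API, and the trivial keyword-based planner are assumptions for clarity, not OM1's actual interfaces. It shows the shape of the loop: captions become time-stamped language fragments, the fuser merges them into one prompt, and the planner emits an action.

```python
# A minimal sketch of OM1's perception-to-action loop (hypothetical API).
import time
from dataclasses import dataclass, field

@dataclass
class Fragment:
    """A time-stamped natural-language fragment on the data bus."""
    source: str   # which module produced it, e.g. "camera"
    text: str     # the natural-language description
    ts: float = field(default_factory=time.time)

class NaturalLanguageBus:
    """Carries language fragments between modules (illustrative)."""
    def __init__(self):
        self.fragments = []
    def publish(self, source, text):
        self.fragments.append(Fragment(source, text))
    def drain(self):
        out, self.fragments = self.fragments, []
        return out

def caption_layer(raw_inputs, bus):
    # In OM1 a multimodal model would caption sensor data; stubbed here.
    for source, reading in raw_inputs.items():
        bus.publish(source, f"{source} reports: {reading}")

def fuse(fragments):
    # Data Fuser: merge fragments into one decision context (prompt).
    ordered = sorted(fragments, key=lambda f: f.ts)
    return "\n".join(f"[{f.ts:.0f}] {f.source}: {f.text}" for f in ordered)

def plan(context):
    # Planning layer: LLMs would read the context and emit an action
    # plan; replaced by a trivial rule for this sketch.
    return "wave back" if "waving" in context else "idle"

bus = NaturalLanguageBus()
caption_layer({"camera": "a person is waving", "battery": "82% charge"}, bus)
context = fuse(bus.drain())
action = plan(context)
print(action)  # → wave back
```

The key design idea the sketch preserves is that every module speaks the same medium (time-stamped natural language), which is what makes the layers composable and replaceable.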
Quick Start, Broad Implementation
To quickly turn "an idea" into "a robot-executable task", OM1 offers a streamlined development path:
Natural-language development: developers define goals and constraints in natural language with large models, generating reusable skill packages within hours rather than months of hand-coding;
Built-in multimodal pipelines: LiDAR, vision, and audio connect natively, eliminating complex manual sensor fusion;
Pre-integrated models: GPT-4o, DeepSeek, and mainstream VLMs are pre-connected, with voice input and output available out of the box;
System compatibility: full support for ROS2 and Cyclone DDS, adapting seamlessly to the Unitree G1, Go2, Turtlebot, and various robotic arms through the HAL adaptation layer;
Native FABRIC hooks: identity, task orchestration, and on-chain settlement interfaces are built in, so robots can execute alone or join global collaborative networks with metered billing and auditing.
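The "skill package" idea can be pictured as a small, portable data structure: a natural-language goal plus constraints and a hardware compatibility list. The fields and names below are hypothetical, not OM1's real package format.

```python
# Hypothetical sketch of a reusable skill package as described above.
from dataclasses import dataclass, field

@dataclass
class SkillPackage:
    name: str
    goal: str                                     # natural-language objective
    constraints: list = field(default_factory=list)
    compatible_hardware: list = field(default_factory=list)

    def applies_to(self, robot_model):
        """Check whether this skill can run on a given robot model."""
        return robot_model in self.compatible_hardware

patrol = SkillPackage(
    name="warehouse-patrol",
    goal="Patrol aisles A through D and report any obstacle you see.",
    constraints=["stay under 1 m/s", "keep 2 m from humans"],
    compatible_hardware=["Unitree Go2", "Unitree G1"],
)
print(patrol.applies_to("Unitree Go2"))  # → True
```

Packaging a skill this way, rather than as hard-coded control logic, is what would let the same capability be reused, shared, or later priced on a network.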
In the real world, OM1 has been verified in multiple scenarios: the quadruped platform Frenchie (Unitree Go2) completed complex site tasks at the 2024 USS Hornet defense technology exhibition; the humanoid platform Iris (Unitree G1) handled live human-machine interaction at the Coinbase booth at ETHDenver 2025; and through RoboStore's education program, OM1 has entered American university courses, extending the same development paradigm to frontline teaching and research.
FABRIC: A Decentralized Human-Machine Collaboration Network
Even with sufficiently strong single-machine intelligence, robots will still fight their own battles if they cannot collaborate on a trusted basis. Today's fragmentation stems from three fundamental problems: identity and location cannot be standardized and verified, so outsiders find it hard to trust "who I am, where I am, and what I am doing"; skills and data lack controlled authorization paths, preventing safe sharing and invocation among multiple parties; and control rights and responsibility boundaries are unclear, with frequency, scope, and feedback conditions hard to predetermine or trace afterward. FABRIC addresses these pain points at the system level: it establishes verifiable on-chain identities for robots and operators through a decentralized protocol, then builds an integrated infrastructure around that identity for task publishing and matching, end-to-end encrypted communication, execution recording, and automatic settlement, turning collaboration from "ad-hoc integration" into a "well-documented system".
Operationally, FABRIC can be understood as a network plane that combines positioning, connection, and scheduling. Identities and locations are continuously signed and verified, so nodes naturally maintain a mutually visible, trustworthy proximity relationship. Point-to-point channels act like on-demand encrypted tunnels, enabling remote control and monitoring without public IPs or complex network configuration. The entire flow from task publishing to acceptance, execution, and sign-off is standardized, so settlement can automatically split revenue and return deposits, and compliance or insurance scenarios can retrospectively verify "who completed what, when, and where". On this basis, typical applications emerge naturally: enterprises can remotely maintain equipment across regions, cities can turn cleaning, inspection, and delivery into a callable Robot-as-a-Service, and vehicle fleets can report real-time road conditions and obstacles to build shared maps, dispatching nearby robots for 3D scanning, architectural surveying, or insurance evidence collection when needed.
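The task lifecycle just described, from publishing through acceptance, execution proof, and automatic settlement, could be modeled roughly as follows. The states, the proof format, and the 80/20 revenue split are illustrative assumptions, not FABRIC's actual protocol.

```python
# Illustrative sketch (not FABRIC's real protocol) of the task lifecycle:
# publish → accept → execute (with proofs) → settle.
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    PUBLISHED = "published"
    ACCEPTED = "accepted"
    EXECUTED = "executed"
    SETTLED = "settled"

@dataclass
class Task:
    task_id: str
    description: str
    reward: float
    deposit: float
    status: Status = Status.PUBLISHED
    proofs: list = field(default_factory=list)  # (robot, when, where)

    def accept(self, robot_id):
        assert self.status is Status.PUBLISHED
        self.robot_id = robot_id
        self.status = Status.ACCEPTED

    def record_proof(self, when, where):
        # In FABRIC this would be an encrypted, on-chain execution proof.
        assert self.status is Status.ACCEPTED
        self.proofs.append((self.robot_id, when, where))
        self.status = Status.EXECUTED

    def settle(self, operator_share=0.8):
        # Automatic split between operator and network; deposit returned.
        assert self.status is Status.EXECUTED and self.proofs
        operator_cut = self.reward * operator_share
        self.status = Status.SETTLED
        return {"operator": operator_cut,
                "network": self.reward - operator_cut,
                "deposit_returned": self.deposit}

t = Task("task-42", "3D scan of a building facade", reward=100.0, deposit=10.0)
t.accept("robot:iris-01")
t.record_proof("2025-06-01T10:00Z", "40.71,-74.00")
payout = t.settle()
print(payout)
```

The point of the state machine is that every transition leaves a record, which is what makes "who completed what, when, and where" verifiable afterward.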
Because identity, tasks, and settlement are hosted by the same network, collaboration boundaries are defined clearly in advance, execution facts are verifiable afterward, and skill invocation carries measurable costs and benefits. In the long run, FABRIC will evolve into an "application distribution layer" for machine intelligence: skills will circulate globally under programmable authorization terms, and the data generated by invocations will feed back into models and strategies, letting the whole collaboration network continuously upgrade itself under trusted constraints.
Web3 is Writing "Openness" into Machine Society
The robotics industry is rapidly concentrating into a few platforms, with hardware, algorithms, and networks locked in closed stacks. The value of decentralization lies in letting robots of any brand or region collaborate, exchange skills, and settle transactions in an open network without depending on a single platform. OpenMind encodes this order in on-chain infrastructure: each robot and operator holds a unique on-chain identity (ERC-7777) with a verifiable hardware fingerprint and permission set; tasks are published, bid on, and matched under public rules, with execution generating encrypted proofs of time and location stored on-chain; contracts automatically settle revenue shares, insurance, and deposits after task completion, with instantly verifiable results; and new skills can set invocation counts and compatible devices by contract, protecting intellectual property while circulating globally. The robot economy is thus born with anti-monopoly, composable, and auditable genes, with "openness" written into the underlying protocol of machine society.
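The article cites ERC-7777 for robot and operator identity. The sketch below only illustrates the general idea of a verifiable identity record with a hardware fingerprint and a permission set; the field layout is an assumption for illustration, not the actual standard.

```python
# Illustrative identity record in the spirit of the ERC-7777 idea above;
# the fields and fingerprint scheme are assumptions, not the standard.
import hashlib
from dataclasses import dataclass

@dataclass(frozen=True)
class RobotIdentity:
    owner: str          # operator's address (hypothetical format)
    serial: str         # hardware serial number
    permissions: tuple  # e.g. ("patrol", "scan")

    @property
    def fingerprint(self):
        # Verifiable hardware fingerprint: a hash of the serial, so any
        # party can re-derive and check it without trusting the claimant.
        return hashlib.sha256(self.serial.encode()).hexdigest()

    def allowed(self, action):
        """Check an action against the identity's permission set."""
        return action in self.permissions

iris = RobotIdentity(owner="0xOperatorAddress", serial="G1-2025-0001",
                     permissions=("patrol", "scan"))
print(iris.allowed("scan"))  # → True
```

Anchoring permissions to an identity record like this is what lets task matching and settlement check, before dispatch, whether a given robot is entitled to perform a given job.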
Enabling Embodied Intelligence to Escape Isolation
Robots are moving from exhibition halls into daily life: patrolling hospital wards, learning new skills on campuses, conducting inspections and modeling in cities. The real challenge is not stronger motors but enabling machines from different sources to trust each other, exchange information, and collaborate; for scale, distribution and supply matter more than technology.
OpenMind's go-to-market therefore starts with channels rather than parameter stacking. Partnering with RoboStore (one of Unitree's largest US distributors), it is turning OM1 into standardized textbooks and lab kits while promoting integrated hardware-software supply across thousands of US colleges. The education system is the most stable demand side, embedding OM1 directly into the developer and application growth of the coming years.
For broader social distribution, OpenMind leverages its investor ecosystem to create platform-based "software export channels". Large crypto ecosystems such as Pi add further upside to this model, gradually forming a flywheel of "people write, people use, people pay". With educational channels providing stable supply and platform distribution bringing demand at scale, OM1 and its upper-layer applications gain a replicable expansion path.
In the Web2 era, robots were locked into single-vendor closed stacks, and functions and data could not flow across platforms. By connecting textbook standards with distribution platforms, OpenMind makes openness the default: the same system enters campuses, moves into industry, and keeps spreading through platform networks, turning openness into the starting point for large-scale deployment.