
Understanding OpenMind: Building a Global Operating System and Trust Network for Embodied Intelligence
TechFlow Selected

OpenMind is building a general-purpose operating system for robots, enabling them not only to perceive and act but also to safely and scalably collaborate in any environment through decentralized coordination.
Why We Need an Open Robotics Era
In the next 5–8 years, the number of robots on Earth will surpass one billion units—a turning point as they evolve from "standalone demos" to participants in "social division of labor." Robots will no longer be just robotic arms on assembly lines, but rather "colleagues, teachers, and partners" capable of perception, understanding, decision-making, and collaboration with humans.
In recent years, robot hardware has grown rapidly—like muscles—equipped with more dexterous hands, stable gaits, and richer sensors. But the real bottleneck isn't metal or motors; it lies in how to give them a shared and collaborative mind:
- Software across manufacturers is incompatible, preventing robots from sharing skills and intelligence;
- Decision logic is locked within closed systems, making external verification or optimization impossible;
- Centralized control architectures limit innovation speed and raise trust costs.
This fragmentation prevents the robotics industry from transforming AI model advances into replicable productivity: standalone robot demos keep emerging, yet lack cross-device transferability, verifiable decisions, and standardized collaboration—making scalability difficult. OpenMind aims to solve this "last-mile" problem. Our goal isn’t to build a robot that dances better, but to provide a unified software foundation and collaboration standard for the world’s vast heterogeneous robots:
- Enable robots to understand context and learn from each other;
- Empower developers to quickly build applications on an open-source, modular architecture;
- Allow humans and machines to securely collaborate and settle transactions under decentralized rules.
In short, OpenMind is building a general-purpose operating system for robots—one that enables them not only to perceive and act, but also to cooperate safely and at scale through decentralized coordination in any environment.
Who Is Betting on This Path: $20M Funding and Global Lineup
OpenMind has now raised $20 million (Seed + Series A), led by Pantera Capital, with participation from top-tier global tech and capital players:
- Western tech and capital ecosystem: Ribbit, Coinbase Ventures, DCG, Lightspeed Faction, Anagram, Pi Network Ventures, Topology, Primitive Ventures, and Amber Group—longtime builders of crypto and AI infrastructure, betting on the foundational paradigm of "agent economies and the machine internet";
- Eastern industrial strength: Sequoia China, deeply involved in robotics supply chains and manufacturing systems and well aware of the full difficulty of building a machine and scaling its delivery.
Meanwhile, OpenMind maintains close engagement with traditional capital market participants like KraneShares, jointly exploring pathways to incorporate the long-term value of "robots + agents" into structured financial products, enabling bidirectional integration between crypto and equity markets. In June 2025, when KraneShares launched the world’s first humanoid and embodied intelligence index ETF (KOID), it selected the humanoid robot "Iris," co-customized by OpenMind and RoboStore, to ring the opening bell at Nasdaq—the first time in exchange history that a humanoid robot performed this ceremony.
As Pantera Capital partner Nihal Meaunder put it:
"If we want intelligent machines to operate in open environments, we need an open intelligent network. What OpenMind is doing for robots is what Linux did for software and Ethereum for blockchain."
Team and Advisors: From Lab to Production Line
OpenMind's founder Jan Liphardt, an associate professor at Stanford University and former Berkeley faculty, has long researched data and distributed systems, with deep roots in both academia and engineering. He advocates advancing open-source reuse, replacing black boxes with auditable and traceable mechanisms, and integrating AI, robotics, and cryptography through interdisciplinary approaches.
The core team comes from institutions such as OKX Ventures, Oxford Robotics Institute, Palantir, Databricks, Perplexity, covering key areas including robot control, perception and navigation, multimodal and LLM orchestration, distributed systems, and on-chain protocols. Additionally, an advisory board composed of academic and industry experts—including Steve Cousins (Stanford robotics lead), Bill Roscoe (Oxford Blockchain Centre), and Alessio Lomuscio (Imperial College Professor of Safe AI)—ensures the safety, compliance, and reliability of robotic systems.
OpenMind's Solution: Two-Layer Architecture, One Unified Order
OpenMind has built a reusable infrastructure enabling robots to collaborate and exchange information across devices, manufacturers, and even national borders:
- Device Layer: Provides OM1, an AI-native operating system for physical robots, closing the loop from perception to execution so robots of different forms can understand environments and complete tasks;
- Network Layer: Builds FABRIC, a decentralized collaboration network offering identity, task allocation, and communication mechanisms to ensure robots can recognize each other, assign tasks, and share status during collaboration.
This combination of "operating system + network layer" allows robots not only to act independently but also to coordinate, align processes, and jointly complete complex tasks within a unified collaboration network.
OM1: AI-Native Operating System for the Physical World
Just as smartphones require iOS or Android to run apps, robots need an operating system to run AI models, process sensor data, make decisions, and execute actions.
OM1 was built exactly for this purpose—an AI-native operating system designed for real-world robots, enabling them to perceive, understand, plan, and perform tasks across diverse environments. Unlike traditional, closed robot control systems, OM1 is open-source, modular, and hardware-agnostic, running on various robot types including humanoids, quadrupeds, wheeled robots, and robotic arms.
Four Core Stages: From Perception to Action
OM1 breaks down robotic intelligence into four universal steps: Perception → Memory → Planning → Action. This entire workflow is fully modularized within OM1 and connected via a unified data language, enabling composable, swappable, and verifiable construction of intelligent capabilities.
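The four stages above can be sketched as a minimal composable loop. This is a toy illustration under assumed names (`Percept`, `Pipeline`, `step`), not OM1's actual API; it only shows how perception, memory, planning, and action can be swapped independently when connected by a shared data format.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Percept:
    timestamp: float
    text: str  # an OM1-style natural-language caption, e.g. "you see a person waving"

class Pipeline:
    """Toy four-stage loop: Perception -> Memory -> Planning -> Action.
    Class and method names are hypothetical, not OM1's real interface."""

    def __init__(self, perceive: Callable, plan: Callable, act: Callable):
        self.memory: List[Percept] = []
        self.perceive, self.plan, self.act = perceive, plan, act

    def step(self, raw_input):
        percept = self.perceive(raw_input)   # Perception: raw sensor data -> caption
        self.memory.append(percept)          # Memory: accumulate timestamped context
        action = self.plan(self.memory)      # Planning: decide from the full context
        return self.act(action)              # Action: hand off to the hardware layer
```

Because each stage is just a callable exchanging plain data, any one of them can be replaced—say, swapping the planner's LLM—without touching the others, which is the composability the text describes.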

OM1 Architecture
At the architectural level, OM1 consists of seven layers:
- Sensor Layer: Collects inputs from cameras, LIDAR, microphones, battery status, GPS, and other multimodal sensors.
- AI + World Captioning Layer: Translates visual, auditory, and state inputs into natural language descriptions (e.g., "you see a person waving").
- Natural Language Data Bus: Transmits all perceptions as timestamped text snippets across modules.
- Data Fuser: Integrates multi-source inputs to generate a complete contextual prompt for decision-making.
- Multi-AI Planning/Decision Layer: Multiple LLMs read the context and generate action plans based on on-chain rules.
- NLDB Downstream Channel: Passes decision results via a language middleware layer to hardware execution systems.
- Hardware Abstraction Layer: Converts language instructions into low-level control commands to drive hardware actions (movement, speech output, transactions, etc.).
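The middle of this stack—captioning, the natural-language data bus, and the fuser—can be sketched in a few lines. The snippet format and function names below are assumptions for illustration; the point is simply that every modality is reduced to timestamped text that a fuser can merge into one prompt.

```python
import time

def caption(modality: str, description: str) -> dict:
    # World-captioning output: a timestamped natural-language snippet,
    # the kind of message the data bus carries between modules (illustrative format).
    return {"ts": time.time(), "modality": modality, "text": description}

def fuse(snippets: list) -> str:
    # Data fuser: order snippets by timestamp and merge them into a single
    # contextual prompt for the planning/decision LLMs.
    ordered = sorted(snippets, key=lambda s: s["ts"])
    return "\n".join(f"[{s['modality']}] {s['text']}" for s in ordered)
```

Representing all perceptions as text is what makes the layers swappable: any model that reads and writes natural language can plug into the bus without bespoke sensor-fusion code.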
Rapid Development and Broad Deployment
To accelerate the deployment of ideas into robot-executable tasks, OM1 includes these tools:
- Rapid Skill Addition: Add new robot behaviors in hours using natural language and large models, instead of months of hard coding.
- Multimodal Integration: Easily combine LiDAR, vision, and audio sensing without developers needing to write complex sensor fusion logic.
- Pre-configured LLM Interfaces: Built-in support for GPT-4o, DeepSeek, VLMs, and other language/vision models, enabling voice interaction.
- Broad Hardware/Software Compatibility: Supports mainstream protocols like ROS2 and Cyclone DDS, seamlessly integrating with existing robotics middleware. Works directly with Unitree G1 humanoids, Go2 quadrupeds, Turtlebot, robotic arms, and more.
- FABRIC Integration: Native support for identity, task coordination, and on-chain payments, enabling robots to participate in a global collaboration network beyond standalone operation.
OM1 is already deployed in multiple real-world scenarios:
- Frenchie (Unitree Go2 quadruped): Completed complex field missions at the USS Hornet Defense Tech Showcase 2024.
- Iris (Unitree G1 humanoid): Conducted live human-robot interaction demos at the Coinbase booth at EthDenver 2025, with plans to enter U.S. university curricula via RoboStore's education program.
FABRIC: Decentralized Human-Machine Collaboration Network
Even with powerful brains, robots remain isolated if they cannot collaborate securely and reliably. In reality, robots from different manufacturers often operate on proprietary systems, unable to share skills or data. Cross-brand or international collaboration lacks trusted identities and standardized rules. This leads to several challenges:
- Identity and Location Proof: How can a robot prove who it is, where it is, and what it's doing?
- Skill and Data Sharing: How can robots be authorized to share data or invoke skills?
- Control Rights Definition: How can conditions be set for skill usage frequency, scope, and data feedback?
FABRIC is designed to solve these issues. It is OpenMind’s decentralized human-machine collaboration network, providing unified infrastructure for identity, tasks, communication, and settlement. Think of it as:
- Like GPS: lets robots know where others are, how close they are, and whether collaboration is feasible;
- Like a VPN: enables secure peer-to-peer connections without public IPs or complex networking;
- Like a task scheduler: automatically publishes, receives, and logs the entire task execution process.
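The task-scheduler analogy can be made concrete with a minimal in-memory sketch: a task is published, claimed by one robot, and its execution logged. All names here are hypothetical, and FABRIC itself does this over a decentralized network rather than a single Python object.

```python
import uuid

class TaskBoard:
    """Toy sketch of publish/claim/log task flow. Purely illustrative;
    not FABRIC's actual protocol or API."""

    def __init__(self):
        self.tasks = {}

    def publish(self, description: str, location: str) -> str:
        # Publish a task under public rules; returns a unique task id.
        task_id = str(uuid.uuid4())
        self.tasks[task_id] = {"desc": description, "loc": location,
                               "status": "open", "log": []}
        return task_id

    def claim(self, task_id: str, robot_id: str) -> None:
        # Exactly one robot may claim an open task.
        task = self.tasks[task_id]
        if task["status"] != "open":
            raise ValueError("task already claimed")
        task["status"], task["robot"] = "claimed", robot_id

    def log(self, task_id: str, event: str) -> None:
        # Append-only record of the execution process.
        self.tasks[task_id]["log"].append(event)
```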
Core Application Scenarios
FABRIC supports a wide range of practical use cases, including but not limited to:
- Remote Operation and Monitoring: Securely control robots from anywhere without dedicated networks.
- Robot-as-a-Service Marketplace: Summon robots like a ride-hailing service to perform cleaning, inspection, or delivery tasks.
- Crowdsourced Mapping and Data Collection: Robot fleets upload real-time road conditions, obstacles, and environmental changes to build shareable high-definition maps.
- On-Demand Scanning/Surveying: Temporarily deploy nearby robots for 3D modeling, construction surveying, or evidence collection in insurance claims.
FABRIC makes it possible to verify and trace "who did what, where, and when," establishing clear boundaries for skill invocation and task execution.
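The "who did what, where, and when" idea can be illustrated with a tamper-evident record. This toy uses a keyed SHA-256 hash with a shared secret purely to show the principle; FABRIC's actual proofs are on-chain cryptographic attestations, and the function names here are assumptions.

```python
import hashlib
import json
import time

def task_proof(robot_id: str, task: str, location: str, secret: bytes) -> dict:
    # Build a "who / what / where / when" record and seal it with a keyed hash.
    record = {"robot": robot_id, "task": task,
              "where": location, "when": time.time()}
    payload = json.dumps(record, sort_keys=True).encode()
    record["proof"] = hashlib.sha256(payload + secret).hexdigest()
    return record

def verify(record: dict, secret: bytes) -> bool:
    # Recompute the hash over everything except the proof itself;
    # any change to the record invalidates it.
    payload = json.dumps({k: v for k, v in record.items() if k != "proof"},
                         sort_keys=True).encode()
    return record["proof"] == hashlib.sha256(payload + secret).hexdigest()
```

Once such a record is anchored publicly, any party can check after the fact that a claimed task was actually performed by that robot at that time and place.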
In the long term, FABRIC will become the App Store for machine intelligence—skills can be globally licensed, and generated data will feed back into models, driving continuous evolution of the collaboration network.
Web3 Is Writing "Openness" Into Machine Society
In reality, the robotics industry is rapidly centralizing, with a few platforms controlling hardware, algorithms, and networks, shutting out external innovation. The value of decentralization is that regardless of who builds the robot or where it operates, it can collaborate, exchange skills, and settle payments on an open network—without depending on a single platform.
OpenMind uses on-chain infrastructure to codify collaboration rules, skill access permissions, and payment distribution into a public, verifiable, and improvable "network order":
- Verifiable Identity: Each robot and operator registers a unique on-chain identity (ERC-7777 standard), with hardware traits, responsibilities, and permission levels transparent and auditable.
- Public Task Assignment: Tasks are not assigned in closed black boxes but published, bid on, and matched under public rules. All collaboration generates time- and location-stamped cryptographic proofs stored on-chain.
- Automated Settlement and Revenue Sharing: After task completion, profit sharing, insurance payouts, or deposit deductions execute automatically, and all parties can verify outcomes in real time.
- Free Flow of Skills: New skills can be governed by on-chain contracts specifying usage counts and compatible devices—protecting IP while enabling global circulation.
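A skill license with a usage quota and a device allowlist, as described above, reduces to a small amount of state. The class below is a hypothetical off-chain mock of those contract terms, not OpenMind's actual contract interface:

```python
class SkillLicense:
    """Toy license mirroring the on-chain contract terms described above:
    a usage quota and a set of compatible device types. Illustrative only;
    a real FABRIC license would be enforced by an on-chain contract."""

    def __init__(self, skill: str, max_uses: int, allowed_devices: list):
        self.skill = skill
        self.remaining = max_uses
        self.allowed = set(allowed_devices)

    def invoke(self, device_type: str) -> str:
        # Enforce the two license conditions before allowing the skill to run.
        if device_type not in self.allowed:
            raise PermissionError(f"{device_type} is not licensed for {self.skill}")
        if self.remaining <= 0:
            raise PermissionError("usage quota exhausted")
        self.remaining -= 1
        return f"{self.skill} executed on {device_type}"
```

Encoding these terms in a contract rather than in vendor firmware is what lets a skill circulate globally while its creator still controls how, and how often, it is used.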
This is a collaborative order that anyone can use, monitor, and improve. For Web3 users, this means the robotics economy is born with anti-monopoly, composable, and verifiable DNA—not just an investment opportunity, but a chance to embed "openness" into the very foundation of machine society.
Bringing Embodied Intelligence Out of Isolation
Whether patrolling hospital wards, learning new skills in schools, or inspecting and mapping city blocks, robots are gradually stepping out of the "demo booth" to become stable contributors to the social division of labor. They operate 24/7, follow rules, have memory and skills, and can collaborate naturally with people and other machines.
To scale these scenarios, we need not only smarter machines but also a foundational order enabling mutual trust, interoperability, and collaboration. OpenMind has laid the first "roadbeds" on this path with OM1 and FABRIC: the former enables robots to truly understand the world and act autonomously; the latter allows these capabilities to circulate globally. The next step is extending this road into more cities and networks, making machines reliable long-term partners in the social fabric.
OpenMind’s roadmap is clear:
- Short-term: Complete the OM1 core prototype and FABRIC MVP; launch on-chain identity and basic collaboration features;
- Mid-term: Deploy OM1 and FABRIC in education, home, and enterprise settings; connect early nodes; grow developer communities;
- Long-term: Establish OM1 and FABRIC as global standards, enabling any machine to join this open collaboration network as easily as connecting to the internet, forming a sustainable global machine economy.
In the Web2 era, robots were often locked in closed systems of single vendors, unable to move functions and data across platforms. In OpenMind’s envisioned world, they are equal nodes in an open network—free to join, learn, collaborate, and settle, co-creating a trustworthy, interconnected global machine society with humans. OpenMind provides the scalable capability that makes this transformation possible.
Join TechFlow official community to stay tuned
Telegram: https://t.me/TechFlowDaily
X (Twitter): https://x.com/TechFlowPost
X (Twitter) EN: https://x.com/BlockFlow_News