
This week we did a live demo we’re pretty proud of: large-scale autonomous humanoid navigation using the Auki network.
Terri McKenna, our Unitree G1 humanoid resources intern, started the session like this:
We walked him to one of our floor markers. As soon as he saw the QR code, he:
From there, Phil simply told Terri: go to Nils’s desk. And he did it, autonomously.
We showed the LiDAR view on screen, the planned path, and the remote lying untouched on the table. Apart from one comic moment when a door confused his obstacle avoidance (classic robotics…), he walked the route himself.
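If you're curious what's going on under the hood, here's a deliberately simplified sketch of that flow. The names and data structures below are illustrative only, not our actual SDK; they just show the shape of "see a marker, know where you are in the venue's map, walk to a named destination":

```python
# A deliberately simplified sketch, not the real Auki SDK: every name below
# is hypothetical. It only shows the shape of the flow: see a known marker,
# know your pose in the venue's map, walk a planned path to a named place.
from dataclasses import dataclass
import math


@dataclass
class Pose:
    x: float  # metres, in the venue's own coordinate frame
    y: float


# Spatial data the venue itself serves: marker poses and named destinations.
MARKERS = {"marker-entrance-01": Pose(0.0, 0.0)}
DESTINATIONS = {"nils-desk": Pose(12.5, 4.0)}


def localize_from_marker(marker_id: str) -> Pose:
    """Seeing a known floor marker tells the robot where it is in the venue frame."""
    return MARKERS[marker_id]


def plan_path(start: Pose, goal: Pose, step: float = 1.0) -> list[Pose]:
    """Toy straight-line planner; the real robot also folds in LiDAR for obstacles."""
    dist = math.hypot(goal.x - start.x, goal.y - start.y)
    n = max(1, int(dist // step))
    return [
        Pose(start.x + (goal.x - start.x) * i / n,
             start.y + (goal.y - start.y) * i / n)
        for i in range(1, n + 1)
    ]


# "Go to Nils's desk": localize off the marker, then follow the planned path.
here = localize_from_marker("marker-entrance-01")
for waypoint in plan_path(here, DESTINATIONS["nils-desk"]):
    print(f"walking to ({waypoint.x:.1f}, {waypoint.y:.1f})")
```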
Next Tuesday we’ll do the same thing at the WOW Summit in Hong Kong, in a venue Terri has definitely never seen before.
Before the demo, we zoomed out a bit.
In January, Jensen Huang said we’re moving from generative AI to agentic AI, and that the final frontier is physical AI – systems that understand space and can act in the physical world.
The reason we care so much about that: 70% of the world’s GDP is still tied to physical locations and labor.
Going from agentic AI (purely digital) to physical AI represents at least a 3× increase in TAM for the whole AI industry.
That’s why we exist. Auki makes the physical world accessible to AI and robots.
We do that with what we call the real world web – our network that lets digital devices browse physical locations the way browsers visit websites, making them:
Each venue can self-host its own digital representation. Robots and other devices connect to local edge nodes instead of one giant central map in someone’s cloud.
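To make the "browse locations like websites" analogy concrete, here's a toy, hypothetical sketch of the device-side flow. None of these names are our real protocol or API; the point is just that a device resolves a specific venue, talks to that venue's own edge node, and gets that venue's map, with no central map server in the loop:

```python
# Hypothetical sketch of the device-side flow; these names are not our real
# protocol or API. The idea: each venue self-hosts its spatial data on a local
# edge node, and devices resolve and connect per venue, like a browser
# visiting a website. No central map server involved.
from dataclasses import dataclass


@dataclass
class EdgeNode:
    url: str         # where this venue self-hosts its spatial data (illustrative)
    venue_map: dict  # that venue's own digital representation


# In practice discovery could come from a QR code, a beacon, or a directory.
VENUE_DIRECTORY = {
    "auki-hq-office": EdgeNode(
        url="https://edge.auki-hq.example",  # made-up URL
        venue_map={"nils-desk": (12.5, 4.0), "kitchen": (3.0, 9.5)},
    ),
}


def browse_venue(venue_id: str) -> dict:
    """'Visit' a physical location: connect to its edge node and get its map."""
    node = VENUE_DIRECTORY[venue_id]
    print(f"connecting to edge node at {node.url}")
    return node.venue_map  # served by the venue itself, not a central cloud


venue_map = browse_venue("auki-hq-office")
print("named places in this venue:", sorted(venue_map))
```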
We also revisited the six layers of tech general-purpose robots need before they’ll be truly useful in everyday life:
Today, most of the robotics hype is about layers 1 and 2. Even with the best demos (and some impressive VLAs), you still can’t reliably tell a robot, “Go empty all the trash cans in this office.”
Because:
As we like to put it: “A robot has about as much use for GPS as you have for a fax machine.”
GPS doesn’t work indoors and can’t tell you where your desk or kitchen is. So we focus on the middle layers. That’s what the real world web provides: an external sense of space that robots can read from and write to.
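“Read from and write to” is the important bit: the map is shared state that devices both consume and improve. In toy form (hypothetical names, in-memory only), the idea looks something like this:

```python
# Toy, in-memory stand-in for a venue's shared spatial state. Hypothetical
# names; in practice this state would live on the venue's edge node and be
# shared by every device in the space.
from dataclasses import dataclass, field


@dataclass
class SharedSpatialMap:
    landmarks: dict[str, tuple[float, float]] = field(default_factory=dict)

    def read(self, name: str) -> tuple[float, float]:
        # A robot asks: where is this thing in the building?
        return self.landmarks[name]

    def write(self, name: str, position: tuple[float, float]) -> None:
        # A robot, phone, or pair of glasses reports what it observed, so the
        # next device to arrive doesn't have to rediscover it.
        self.landmarks[name] = position


office = SharedSpatialMap()
office.write("trash-can-3", (7.2, 1.8))   # one device records an observation
print("another device reads:", office.read("trash-can-3"))
```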
Our view is that you don’t need perfect humanoids to start deploying physical AI. Modern smartphones are already capable of spatial computing, and they provide an early and viable form factor for AI copilots.
Just like white-collar workers now have AI copilots in tools like ChatGPT, we think every physical job will eventually get a copilot too.
Our first one is Cactus, the spatial AI for retail. It runs on phones today and will run on smart glasses by the end of the year.
Cactus lets stores:
We’ve been able to reduce the walking distance for click-and-collect staff by up to 40%.
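To see why shared spatial data turns into numbers like that: once every item in an order has a position in the store map, pick routes can be ordered by geometry rather than by whatever order the items arrived in. The sketch below uses a deliberately naive nearest-neighbour ordering with made-up coordinates; it is not how Cactus actually computes routes, just an illustration of the effect:

```python
# Illustration only, with made-up coordinates and a deliberately naive
# nearest-neighbour ordering. This is not how Cactus computes routes; it just
# shows why knowing item positions in the store map shortens picking walks.
import math

# One click-and-collect order, with item positions in the store's map frame.
order = {
    "oat milk":   (2.0, 18.0),
    "toothpaste": (14.0, 3.0),
    "coffee":     (3.0, 16.0),
    "dish tabs":  (13.0, 4.0),
}


def route_length(start, stops):
    total, here = 0.0, start
    for stop in stops:
        total += math.dist(here, stop)
        here = stop
    return total


def nearest_neighbour(start, stops):
    """Greedy ordering by distance: naive, but enough to show the idea."""
    remaining, here, ordered = dict(stops), start, []
    while remaining:
        name = min(remaining, key=lambda n: math.dist(here, remaining[n]))
        here = remaining.pop(name)
        ordered.append(here)
    return ordered


entrance = (0.0, 0.0)
as_listed = list(order.values())       # pick in whatever order items came in
by_geometry = nearest_neighbour(entrance, order)
print(f"as listed:    {route_length(entrance, as_listed):.1f} m walked")
print(f"by geometry:  {route_length(entrance, by_geometry):.1f} m walked")
```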
We’re already:
All this is happening now, with phones.
From phones, we’re moving into smart glasses and robots, all using the same underlying network.
On glasses, we’ve partnered with Mentra, who are building open, programmable camera glasses. By giving these smart glasses spatial awareness, we open up powerful new ways to interact with AI that can see what you see and provide guidance for physical work.
But the same external sense of space is already helping robots too.
After showing off Terri, we brought out a simpler wheeled robot in our fake grocery store and asked it: “Where is the Finish Powerball Ultimate All-in-One?”
Over the Auki network, it queried the same map Terri uses, planned a path, drove to the correct bay, and pointed to the product.
Then we did the same thing again with… an iPhone.
Same shared map. Three very different “bodies”:
All browsing the real world web.
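The point of that sequence is that the knowledge of the space lives in the network, not in any one body. In rough, hypothetical code (again, illustrative names rather than our SDK), the pattern is one shared venue map plus devices that only need to know how they move or display:

```python
# Hypothetical sketch of "one map, many bodies"; illustrative names only,
# not our SDK. The shared venue map answers "where is X?"; each device only
# implements how it moves (or displays) once it knows the answer.
from dataclasses import dataclass


@dataclass
class VenueMap:
    products: dict[str, tuple[float, float]]

    def locate(self, query: str) -> tuple[float, float]:
        return self.products[query]


class HumanoidBody:
    def go_to(self, pos):
        print(f"humanoid walking to {pos}")


class WheeledBody:
    def go_to(self, pos):
        print(f"wheeled robot driving to {pos}")


class PhoneBody:
    def go_to(self, pos):
        print(f"phone drawing an AR arrow towards {pos}")


store = VenueMap(products={"dish tabs": (13.0, 4.0)})
target = store.locate("dish tabs")

# Three very different bodies, one shared sense of space.
for body in (HumanoidBody(), WheeledBody(), PhoneBody()):
    body.go_to(target)
```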
To recap what we’re aiming for:
This week was a big milestone on that path: a humanoid navigating a “new” venue via our network, a small robot and a phone sharing the same map, and a clear story for how we go from copilots to robots at scale.
If you want to see Terri do it live, come find us at WOW Summit in Hong Kong. And as always, if you want the off-record bits, join us in Discord.
Auki is making the physical world accessible to AI by building the real world web: a way for robots and digital devices like smart glasses and phones to browse, navigate, and search physical locations.
70% of the world economy is still tied to physical locations and labor, so making the physical world accessible to AI represents a 3× increase in the TAM of AI in general. Auki's goal is to become the decentralized nervous system of AI in the physical world, providing collaborative spatial reasoning for the next 100bn devices on Earth and beyond.
X | Discord | LinkedIn | YouTube | Whitepaper | auki.com