
We opened with a quick recap of why we exist at all. Earlier this year, Nvidia’s Jensen Huang described the trajectory from generative AI to agentic AI, and then to physical AI—AI that understands space and physics and can act in the real world.
Roughly 70% of global GDP is still tied to physical locations and labor, so going from agentic to physical AI effectively triples AI’s addressable market. If you want to build something bigger than OpenAI, you have to play in physical AI.
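As a back-of-the-envelope check on that "triples" claim (our illustration, not a figure from the talk), assume the digitally addressable share of GDP is the remaining ~30%:

```python
# Back-of-the-envelope TAM math: if ~30% of global GDP is digitally
# addressable today and physical AI unlocks the remaining ~70%, the
# addressable share grows from 0.30 to 1.00 of GDP.
digital_share = 0.30
physical_share = 0.70

current_tam = digital_share
expanded_tam = digital_share + physical_share
multiplier = expanded_tam / current_tam

print(f"TAM multiplier: {multiplier:.1f}x")  # prints "TAM multiplier: 3.3x"
```

So "triples" is, if anything, slightly conservative under these assumed shares.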
That’s why, back in 2021, we started building what we now call the real world web: a way for digital devices to browse physical locations the way humans browse websites.
Imagine if the internet were just everyone uploading their information to a single website. It isn't: the internet is many different websites on many different web servers, with discovery and linkage between them. That's exactly how the real world web works.
We revisited our view of the six necessary layers for robotics.
Our focus is squarely on collaborative perception, mapping, and positioning, plus a growing ecosystem of applications.
Phones and glasses are, in Nils's words, "really just like a robot but with no arms and legs."
By starting with handhelds and AR smart glasses, we’ve been able to:
That groundwork is exactly why we can now do what we showed next.
We showed a demo of setting up a fresh-out-of-the-box utility robot with almost no prep work.
Here’s the flow:
Less than two minutes of setup, and this robot is good to go. No field engineer crawling around with a laptop. No tedious one-off mapping. Just “plug in, show QR, done.”
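The "plug in, show QR, done" flow above can be sketched roughly as follows. Every name here (`Domain`, `decode_qr`, the `rww://` scheme, the registry) is illustrative; this is not Auki's actual SDK or protocol, just a minimal sketch of the idea that a QR code resolves to a shared spatial domain the robot can localize against.

```python
# Hypothetical sketch of QR-based robot onboarding into a real world
# web domain. All identifiers are invented for illustration.
from dataclasses import dataclass


@dataclass
class Domain:
    """A real world web domain: an ID plus a shared spatial map."""
    domain_id: str
    map_url: str


# A toy in-memory registry standing in for decentralized domain discovery.
DOMAIN_REGISTRY = {
    "rww://lab-kowloon-bay": Domain(
        "rww://lab-kowloon-bay", "https://example.com/maps/lab.bin"
    ),
}


def decode_qr(qr_payload: str) -> str:
    """Pretend QR decoding: the code simply encodes the domain ID."""
    return qr_payload.strip()


def onboard_robot(qr_payload: str) -> Domain:
    """Resolve a scanned QR code to a domain the robot can join."""
    domain_id = decode_qr(qr_payload)
    domain = DOMAIN_REGISTRY[domain_id]
    # In a real deployment the robot would now download the map and
    # begin collaborative positioning against it.
    return domain


domain = onboard_robot("rww://lab-kowloon-bay")
print(f"Robot joined {domain.domain_id}, map at {domain.map_url}")
```

The point of the sketch is the shape of the flow, not the details: the robot never needs a field engineer to map the site, because the domain (and its map) already exists and the QR code is just a pointer into it.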
This is one of the core reasons we’re building the real world web: so robots can arrive in new places and instantly know where they are and what to do.
Then we handed the mic to Victor and Aaron from Rice AI, our new neighbors in the Level 10 research center in Kowloon Bay, Hong Kong.
Victor summed up the reality of running a robotics business: “You have a perfect hardware, you have a perfect product, but then you also need to work with a ton of third-party partners to make the whole scenario work… You need a lot of field engineers to go on-site and deploy robots.”
This is exactly the pain we’re trying to alleviate: reduce the time, cost, and expert labor needed to deploy robots in the real world by letting them tap into existing domains and maps.
Rice has now moved into our shared robot lab. As Nils put it, "Now we're literally neighbors. There's a lot of Rice robots all over now."
Over the coming days and weeks, we’ll be:
Nils: “We will make sure that the people in the Rice community can set up real world web domains, scan their place, connect it to the Rice robot so that the Rice robot knows its way around.”
Hong Kong robotics OGs plus our spatial network is a very natural fit.
Rice isn’t new to Web3 either. Victor shared how they launched an AI companion robot with Floki:
This bot acts as an AI companion, performs data collection tasks, and rewards users with $RICE tokens.
Aaron added more detail on the upcoming protocol:
From our side: “Physical AI needs to get trained on data about the real world, and Rice is giving the community an opportunity to contribute data and get rewarded.”
We also stepped back to talk about where all this is happening.
We pulled up a map of the Greater Bay Area (GBA), which includes:
Some stats we shared:
It’s also one of the four global hubs for hardware manufacturing and robotics.
As Victor said: “If you’re working in hardware, you want to do something great in robotics, you want to be there. It’s the best place.”
For robotics and physical AI, it’s about as good a spawn point as you can get.
We also touched on Rice’s collaboration with peaq:
From our angle, Auki + peaq + Rice is a strong interoperability triangle:
We also hinted at another well-known robotics/Web3 project that might be joining our lab soon, but kept that part for off-record discussion.
The main takeaways from this update:
As always, if you want the unfiltered version, join us live in the Discord. We do the official part on X, then shut off the stream and talk off the record with the people actually in the room.
Auki is making the physical world accessible to AI by building the real world web: a way for robots and digital devices like smart glasses and phones to browse, navigate, and search physical locations.
70% of the world economy is still tied to physical locations and labor, so making the physical world accessible to AI represents a 3X increase in the TAM of AI in general. Auki's goal is to become the decentralized nervous system of AI in the physical world, providing collaborative spatial reasoning for the next 100bn devices on Earth and beyond.
X | Discord | LinkedIn | YouTube | Whitepaper | auki.com