
Coming to you fresh from an intense four days at CES in Las Vegas, this week’s update is packed with massive commercial milestones, industry-shifting conversations, and the realization of our "perception-first" robotics thesis.
Here is what went down at CES and the major wins happening behind the scenes.
One of the standout moments from CES was a two-hour breakfast with Neil Trevett, President of the Metaverse Standards Forum and the Khronos Group (the body behind OpenXR). We discussed the future of augmented reality, and he offered the best definition of the metaverse we’ve heard yet:
"The metaverse is when spatial computers talk to each other."
Our vision for the real world web resonated deeply with these standards bodies. In fact, we were told that Auki was the coolest conversation they had all week. We will be working much more closely with these organizations to help shape the future of the spatial internet.
This week was massive for our robotics rollout. We officially signed an LOI and agreed on terms with one of Europe's biggest retailers.
This deal proves a crucial point about the unit economics of spatial AI. Retailers are willing to pay us $500 a month per location just for the Cactus AR software on employee phones. To put that in perspective: our perception software is worth as much per month as leasing a 1X Neo humanoid robot.
The industry is finally catching on to the narrative we’ve been pushing all year: highly valuable perception tasks are ready today, and they are often worth more than the complex manipulation tasks everyone else is chasing.
Back in the Hong Kong lab, our internal demo days are showing exponential progress. We are actively refining our store simulator, which uses genetic algorithms to reverse-engineer shopping personas from historical sales data.
The goal is to simulate stores forward in time. This will allow Cactus to become a proactive AI manager that can tell retailers, "If you move this product to this shelf, you will increase sales." By automating the data capture using robots and smart glasses, we eliminate the manual labor of building these 3D maps, giving retailers unprecedented insight into their physical spaces.
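To make the genetic-algorithm idea concrete, here is a minimal toy sketch of how personas could be evolved to match historical sales and then run forward in time. The persona fields, fitness function, and numbers are illustrative assumptions for this newsletter, not our actual simulator code.

```python
# Toy sketch: evolve a mix of shopper "personas" whose simulated purchases
# best reproduce historical sales, then run them forward to predict demand.
# All fields and figures here are illustrative assumptions.
import random

PRODUCTS = ["milk", "bread", "snacks", "coffee"]
HISTORICAL_SALES = {"milk": 120, "bread": 90, "snacks": 60, "coffee": 30}  # units/week

def random_persona():
    # In this toy model a persona is just a purchase probability per product.
    return {p: random.random() for p in PRODUCTS}

def simulate(personas, shoppers_per_week=300):
    """Run the personas 'forward in time' and return expected weekly sales."""
    sales = {p: 0.0 for p in PRODUCTS}
    share = shoppers_per_week / len(personas)
    for persona in personas:
        for p in PRODUCTS:
            sales[p] += share * persona[p]
    return sales

def fitness(personas):
    # Smaller gap between simulated and historical sales = fitter candidate.
    simulated = simulate(personas)
    return -sum((simulated[p] - HISTORICAL_SALES[p]) ** 2 for p in PRODUCTS)

def mutate(personas, rate=0.1):
    # Nudge purchase probabilities, keeping them in [0, 1].
    return [{p: min(1.0, max(0.0, v + random.uniform(-rate, rate)))
             for p, v in persona.items()} for persona in personas]

# Each candidate is a small set of personas; keep the fittest and mutate them.
population = [[random_persona() for _ in range(3)] for _ in range(50)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(40)]

best = max(population, key=fitness)
print("Simulated weekly sales for best persona mix:", simulate(best))
```

The same loop, run on a real store layout instead of a flat product list, is what lets the simulator test "what if we moved this product to that shelf" before any fixture is touched.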
While at CES, Nils attended a panel with a VC investing in physical AI, where the topic of privacy came up. If robots need to learn our homes and offices, how do we protect our data?
The real world web is the only viable answer. Instead of sending sensitive home data to a centralized corporate server, you can host your home's spatial domain on a local Auki compute node. When you buy a robot, you grant it permission to train on your local domain. The robot learns your space, and the manufacturer never sees your data.
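In code terms, the permission model looks roughly like the sketch below: the spatial domain lives on a node you run at home, and a robot only gets an explicit, locally enforced training grant. The class and method names are hypothetical, chosen for illustration rather than taken from Auki's actual API.

```python
# Hypothetical sketch of a locally hosted spatial domain with explicit
# training grants. Names are illustrative, not Auki's real API.
from dataclasses import dataclass, field

@dataclass
class SpatialDomain:
    """A home's spatial map, hosted on a local compute node (never uploaded)."""
    domain_id: str
    map_data: bytes                      # stays on the local node
    grants: set = field(default_factory=set)

    def grant_training_access(self, robot_id: str):
        # The owner explicitly allows this robot to train against the local map.
        self.grants.add(robot_id)

    def read_for_training(self, robot_id: str) -> bytes:
        if robot_id not in self.grants:
            raise PermissionError(f"{robot_id} has no training grant for {self.domain_id}")
        # Data is served locally; the manufacturer's servers never receive it.
        return self.map_data

# Usage: buy a robot, grant it access, and it trains on-device against your domain.
home = SpatialDomain(domain_id="home-kitchen", map_data=b"<local spatial map>")
home.grant_training_access("neo-unit-42")
local_map = home.read_for_training("neo-unit-42")   # allowed
# home.read_for_training("cloud-crawler")           # would raise PermissionError
```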
We are already building the infrastructure to integrate the real world web into the world's most important robotics simulation tools to make this a reality.
2026 is moving incredibly fast, and we are just getting started. If you want to join the off-the-record AMA sessions that happen after every stream, make sure to come hang out with us on Discord.
Auki is making the physical world accessible to AI by building the real world web: a way for robots and digital devices like smart glasses and phones to browse, navigate, and search physical locations.
70% of the world economy is still tied to physical locations and labor, so making the physical world accessible to AI represents a 3X increase in the TAM of AI in general. Auki's goal is to become the decentralized nervous system of AI in the physical world, providing collaborative spatial reasoning for the next 100bn devices on Earth and beyond.
X | Discord | LinkedIn | YouTube | Whitepaper | auki.com