Physical AI might be the truly transformative technology

Applying AI to human creativity is one thing; applying it to the physical world is something quite different. Are you ready for Physical AI?

Interesting things happen when the digital and the physical intersect. For all their later troubles, 2010s startups like Airbnb and Uber reshaped how we use the space around us, by allowing us to book rooms or cabs from our phones. Apple is betting that spatial computing — mixed reality — will beat out the metaverse as the next paradigm of user interface.

So what happens when we take the latest disruptive technology — AI — and apply it to the physical world? That’s what one of our NEXT24 speakers, Dr Ivan Poupyrev, co-founder of Archetype AI, aims to find out. Can we build models that analyse behaviours in the physical world, and provide real-time reactions to them? He thinks so.

AI is already surprisingly good at recognising the physical world. Consumer-level devices have been using machine learning to identify creatures, plants, objects, and places in photos for years, but the new wave of AI is supercharging that. One blogging service I use now provides AI-generated alt text for images — and the suggestions are scarily good descriptions of what’s in the image.
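
That service’s pipeline isn’t public, but generating caption-style alt text is now only a few lines of code. Here’s a minimal sketch using the open BLIP captioning model via Hugging Face’s transformers library; the model choice and output handling are illustrative assumptions, not the service’s actual implementation.

```python
# A sketch of AI-generated alt text. Assumes the open BLIP captioning
# model, fetched via Hugging Face's transformers library; the blogging
# service's real pipeline is unknown.
from transformers import pipeline

# Load an image-captioning pipeline (downloads model weights on first use).
captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

def suggest_alt_text(image_path: str) -> str:
    """Return a one-sentence caption to offer as suggested alt text."""
    results = captioner(image_path)  # e.g. [{"generated_text": "a dog on a beach"}]
    return results[0]["generated_text"].strip()

print(suggest_alt_text("holiday-photo.jpg"))  # hypothetical image file
```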

Sensors everywhere — but what to do with the data?

Poupyrev came from Google, where he worked on projects that integrated sensors into everyday objects, like a jacket. Indeed, most of us are surrounded by an ever-growing cloud of sensors. Our phones and smartwatches are full of them. But that data is difficult to analyse and use meaningfully with traditional methods. And that’s where AI comes into play.

As Steven Levy explained in a recent WIRED article:

When LLMs appeared, Poupyrev and colleagues realized that with modifications they could make sensor data more powerful by providing a way for humans to easily explore and monitor data collected across vast swaths of time and space. Instead of a large language model it would be a large behavior model.

As Brandon Barbello, Archetype’s COO, told Levy:

“We put sensors in all kinds of things to help us, but sensor data is too difficult to interpret. There’s a potential to use AI to understand that sensor data—then we can finally understand these problems and solve them.”

Physical AI in use

They give two obvious examples. The first: tracking a parcel and flagging up when it might have been dropped and damaged in transit.
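
Archetype’s behaviour models aim to learn such patterns directly from data, but a hand-rolled version of drop detection shows what the raw signal looks like: during free fall, an accelerometer’s reading drops from roughly 1 g towards zero. The thresholds and sample format below are illustrative assumptions, not Archetype’s method.

```python
import math

# Toy drop detector: in free fall, the magnitude of the measured
# acceleration vector falls towards 0 (at rest it reads ~9.81 m/s^2,
# i.e. 1 g, due to gravity). Thresholds and sample format are
# illustrative assumptions.
FREE_FALL_THRESHOLD = 3.0  # m/s^2, well below 1 g
MIN_SAMPLES = 5            # consecutive low-g samples before flagging a drop

def detect_drop(samples):
    """samples: iterable of (ax, ay, az) readings in m/s^2.
    Returns True if a sustained free-fall window is seen."""
    low_g_run = 0
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude < FREE_FALL_THRESHOLD:
            low_g_run += 1
            if low_g_run >= MIN_SAMPLES:
                return True
        else:
            low_g_run = 0
    return False
```

The appeal of a large behaviour model is that nobody has to hand-tune rules like these for every sensor and every event type; the model learns such patterns across many data streams at once.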

The second: imagine a car sensing when its owner is returning laden with shopping, and automatically popping the trunk for them. That’s one focus of their collaboration with Volkswagen.

But these are baby steps: imagine applying that process to traffic flows in a city. We’ve already filled our streets with sensors and CCTV cameras. What if that data could feed into smart, AI-driven traffic management systems that could react to drivers’ behaviour and reroute them before traffic builds up into a jam?

AI itself is one of a growing number of technologies that put a strain on our energy infrastructure. Again, as energy-using devices gain both sensors and connectivity, smart management of the energy supply grid becomes ever more feasible. Not so much new energy as better use of existing energy.

Behavioural AI

But these examples are just scratching the surface. Just as existing LLMs can consume and learn from visual and written information, so too can these behaviour models extract insight from any form of sensor data. As the introduction to Physical AI on the Archetype website puts it:

Our foundation model learns about the physical world directly from sensor data – any and all sensor data. Not just cameras and microphones, but temperature sensors, inertial sensors, radar, LIDAR, infrared, pressure sensors, chemical and environmental sensors, and more. We are surrounded by sensors that capture rich spatial, temporal, and material properties of the world.

In other words, they can use any sensor-based source to determine the behaviour and intent of objects moving through space. Which sounds great, until you realise that one of those objects moving through space might be you. Is this going to impact privacy? Levy put that question to the team:

“The customers we are working with are focusing on solving their specific problems with a broad variety of sensors without affecting privacy,” says Poupyrev.

But still, one can’t help but wonder what will happen if someone ties behavioural AI to identity-spotting AI…

Reshaping the world with Physical AI

But let’s not allow potential difficulties to deter us. We’re taking a nuanced approach to risk and reward ’round these parts. AI has the potential to do more than manage our existing physical environment better — it could completely reshape it, starting with our energy needs. As Ray Kurzweil wrote in The Economist:

In all of history until November 2023, humans had discovered about 20,000 stable inorganic compounds for use across all technologies. Then, Google’s GNoME AI discovered far more, increasing that figure overnight to 421,000. Yet this barely scratches the surface of materials-science applications. Once vastly smarter AGI finds fully optimal materials, photovoltaic megaprojects will become viable and solar energy can be so abundant as to be almost free.

Now, that’s what I call new energy

So much of our discussion about AI is trapped in thinking about how it might enhance — or replace — the creative processes of our brains. But humanity’s power has always been to reshape the world around us, for our convenience. That power has grown to such an extent that our mistakes — our reliance on technologies that produce atmospheric carbon — have become a threat to us.

But if we can harness AI to better understand, react to and change the physical environments around us, we might be well on the way to correcting our mistakes, and building a better future for all.

As we might have said last year: let’s get Physical with AI…

Picture by Possessed Photography | Unsplash.