Apple didn’t launch a headset: it released a spatial computing platform

Apple Vision Pro, the first headset from the company, isn’t a Metaverse device. It’s Apple’s first true spatial computing product.

Like many people, I was glued to my screen during the keynote that opened Apple’s annual developer conference. I’d heard the same headset rumours as everyone else, and they turned out to be true: the Apple Vision Pro will be available in early 2024. And it looks like an incredible device, if also an incredibly expensive one.

But there was one moment that transformed my understanding of what was happening. One presenter was talking about the real-time R1 chip in the headset that processes the information from the huge array of sensors in the device. And suddenly, it hit me: the two big “secret” Apple projects of the past decade — the car and the headset — aren’t separate projects at all. They’re both product manifestations of what’s really going on.

Apple is building the hardware and software for spatial computing.

The path to spatial computing

Oh, sure, the clues were there. Count how often the presentation said “spatial computing” (often) compared with “Metaverse” (never). But the real giveaway is the R1 chip that powers the headset. While working alongside the M2 chip that powers current Macs and iPads, it does something distinctly different: it processes data from the sensors in near-real time. This matters for things like preventing motion sickness when you use the headset, because there’s no lag between what you see on screen and what you feel around you.

But do you know where else real-time chips are incredibly useful? No, more than that, essential? Self-driving cars.

Spatial computing is, in essence, computing that is aware of the physical space around your device and able to interact with it. Its use in autonomous vehicles should be obvious: the car needs to process data from its sensors in real time to adjust to changes in the physical landscape. In the Vision Pro, the device needs to know where the objects around you are, so it can place the screen you’re using, or notice that someone is nearby and let that person break through the digital image. This is Apple getting physical.
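To make that idea concrete, here is a minimal, illustrative sketch in Swift of what “knowing where the objects around you are” looks like in code today. It uses the existing ARKit and RealityKit frameworks on iPhone and iPad rather than the visionOS APIs, and the function name and panel dimensions are my own placeholders: detect a horizontal surface, then anchor a virtual panel to it so it stays fixed in physical space.

```swift
import RealityKit
import ARKit
import UIKit

// Illustrative sketch only: ARKit/RealityKit on iOS, not the Vision Pro's own APIs.
// The device builds a live map of nearby surfaces from its cameras and motion
// sensors; we then pin a virtual "screen" to one of those surfaces.
func makeSpatialView() -> ARView {
    let arView = ARView(frame: .zero)

    // Ask ARKit to detect horizontal planes (tables, floors) in real time.
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    arView.session.run(config)

    // Anchor content to the first sufficiently large horizontal plane found.
    let anchor = AnchorEntity(.plane(.horizontal,
                                     classification: .any,
                                     minimumBounds: [0.2, 0.2]))

    // A stand-in for a floating screen: a thin white slab just above the surface.
    let panel = ModelEntity(
        mesh: .generateBox(width: 0.3, height: 0.002, depth: 0.2),
        materials: [SimpleMaterial(color: .white, isMetallic: false)]
    )
    panel.position.y = 0.01
    anchor.addChild(panel)

    arView.scene.addAnchor(anchor)
    return arView
}
```

The specifics don’t matter; the point is that the heavy lifting of understanding the room, which surfaces exist and where they sit relative to you, is done by the platform rather than the app. That’s the layer Apple has been quietly building.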

Spatial computing: the next major digital platform

This is the obvious next stage of the major journey of the 21st century: the digitisation of everything, as we predicted back in 2019. To truly achieve that goal, and start harnessing computing power to better manage our physical environment, computers have to understand physical space at a profound level.

Little clues to this work have been lurking in older Apple products: the way the big HomePods use sound to map the physical space they’re operating in, for example, and now they sense temperature and humidity, too. Or the AirPods Pro and their ability to mix music with real-world sounds while damping down the loudest external noises. Or the Apple Watch Ultra, whose dual-frequency GPS makes it acutely aware of exactly where it is in space. Face ID depends on the ability to read and learn a physical object: your face.

These are all little gifts to the rest of the Apple ecosystem from the spatial computing work. The HomePod benefited from the spatial audio work, and the AirPods Pro from the idea of blended reality: real-world and digital sounds mixing. The Watch pushes the idea of knowing precisely where you are to an extreme. All of them are foundational technologies for devices that are aware of the world around them.

When digital gets physical

Many of the most exciting uses of technology heading our way depend on these sorts of ideas. To create a truly smart city, you need computing systems that can both understand the physical environment of the urban world and adapt in real time as it changes. That requires software and hardware alike: real-time chips and sensor arrays. Apple is building a platform with all of that.

Any autonomous device, from delivery robots and drones to self-driving cars and better robot vacuum cleaners, could benefit from this technology. Smart homes get even smarter when their devices understand the physical reality of the space on multiple levels.

And that means the companies that should be worried about what Apple announced in early June might not be the ones you expect.

Sidelining the Metaverse

Many people have drawn a line between the Apple headset and Meta’s Quest range, casting them as natural competitors for the next wave of computing. That’s the wrong way of looking at it. They’re travelling in opposite directions, but using similar devices to do so.

Meta’s Quest is fundamentally a VR product. It takes you away from the constraints of physical reality, and transports you to a digital space. This is the idea of the Metaverse, which Zuckerberg bet big on a couple of years ago. And you access it through a headset.

Apple is very much leaning into digitally augmented reality. While the Apple Vision Pro is capable of becoming a VR device and tuning out reality entirely — a blessing on long-haul flights — its main use cases are around putting digital objects into the real world. You stay aware of your office or living room, and the other people in it, even while you interact with a digital object only you can see.

Apple is adding elements to existing reality. Meta is allowing you to absent yourself from physical reality. In the post-pandemic age, when we’re all feeling the urge to get physical again, I certainly wouldn’t bet against Apple.

The first spatial computing product

If I were Tesla, I would be looking at what Apple did here with a little concern. If I made smart home or smart city kit, I might wonder where Apple is going with this.

The Apple Vision Pro, as eye-wateringly expensive as it is, is the spatial computing version of the little original Mac back in 1984. If you looked at that machine, you wouldn’t have seen the future iPhone, would you? And yet there’s a direct line from one to the other.

What devices will spatial computing bring into our lives in 10 years? In 20? How about in 23 years, the time it took between the first Mac and the first iPhone? I don’t know. Apple doesn’t know — but it knows more than me right now. And it’s taught me a valuable lesson: it’s time to think less about escaping into virtual worlds, and think more about what happens when physical gets digital.

Photo by Apple on apple.com