Google’s Project Glass: innovation or intrusion?
For the last week, the talk of the web has been Google Glass: a project to provide augmented reality glasses as, essentially, the next evolution of the mobile phone.
However, for better or for worse, Project Glass is setting us off on a road to an even more intimate relationship with the Internet.
Some reactions have been enthusiastic. Others, well, less so:
The tricky question is whether anyone actually wants a head-mounted display. If you were a soldier, for example, you would want to know if there were snipers hiding behind a wall, and whether you could bring in a drone (pilotless aircraft) to fire at them. As a pedestrian crossing the street, the sudden appearance of a special offer from a nearby carpet retailer might have less happy consequences (since this is Google, you can bet ads are part of the master plan).
The core concept is lovely, and the video almost exciting – but it's just a touch too intrusive for me. And here's why:
I've needed glasses since I was born. I spent the first three years of my life with my parents thinking I was slightly stupid – before they realised my lack of reaction to many visual stimuli was simply because I couldn't see them. One visit to the optician later, and the issue was resolved. I've worn glasses since before I started school. However familiar you get with them, putting glasses on your face is an intrusion, a burden in the way that carrying a phone isn't. The palpable sense of relief I felt on my first day of wearing contact lenses has never quite left me. I don't wear contacts every day – in fact, I've spent most of the Easter Bank Holiday in glasses – but I look forward to my next day in contacts.
There's something in what Google is doing here – socially and geographically aware personal assistants with voice interaction are great – but I'm not sure I want them floating in front of my eyes all the time. Transmitted to me via headphones? Sure. Much of the time when I'm walking around I have my headphones in and connected to my phone. In fact, I can access the phone in several ways – via headphones, via speech, via the screen. Glasses need to be on, and are focused on visual interaction. And with the number of screens and connections around us becoming so pervasive that they are almost invisible – is this level of intrusion really needed? Wouldn't a location-aware device that can simply talk to screens nearby – or directly to you – be better?
I'm open to being convinced – and I love the fact that Google are exploring these ideas – but I'm starting one step back from most folks. AR contact lenses? Now we're talking…