Scott Smith: the internet knows us better than we do

We've only just started thinking about the implications of the tech we've developed, says Scott Smith. How intelligent will monitoring tech get – and how comfortable are we with that?

Scott Smith is a critical futurist

WARNING: Liveblogging – prone to error and inaccuracy. Will be updated/improved over the next 48 hours.

The New Normal lasts for about 15 seconds, and then something new comes along. Kickstarter and IndieGoGo are pushing new things forward all the time.

Joseph Woodland was thinking about the future of technology as long ago as 1948. He worked on the Manhattan Project – and he also invented the barcode. That started propelling us towards the world of Big Data: it turned physical objects into things that were digitally countable and trackable. In the early 80s, someone invented loyalty cards, which made identity and data linkable. They made us what we buy – and then we buy what we are. The companies that figured this out got a decades-long head start. Some people now hold parties to swap loyalty cards and confuse the databases underlying them…

Amazon is using data to create a mini-future of what you might do next – and all sorts of other companies are doing something similar now. We model ourselves through Facebook and the like, constructing digital personas that follow us out onto the web and onto our mobile phones.

Data brokers buy all this information, assemble it and sell it on. Many of us have huge profiles sitting with these data brokers.

Passive monitoring of our actions

It used to be that if we didn’t actively use loyalty cards and the like, we weren’t part of this. But now the internet has made all of us part of this data capture process. iBeacons and other low-energy tech are starting to bring this into the physical environment. There’s now facial recognition tech that can identify you – and track your eyeline around the store. Cameras are becoming cheap enough that products are starting to watch you…

Ad screens analyse your face and flash up demographically based adverts appropriate to you. These cameras could also check whether employees appear happy enough.

Wearable tech is the next big boom. The Quantified Self is the idea that we’ll be tracking ourselves all the time. If you move more, could you be charged less for insurance? Or will your employer use this tech to track you and your work, as part of your contract?

The technology that’s always listening

The Moto X is the first phone that listens to you all the time, waiting for you to say “OK Google”. This will spread to other phones – and to browsers on your laptop.

Google Now predicts the information you need – and remembers where you park your car, without you having to do that manually. It’s an example of forward-leaning tech built on assumptions that can backfire. If you go to the grocery store at 5pm every evening, it’ll assume you live at the grocery store. Could this be problematic? Phones get locations wrong sometimes. That data could create problems if you appear to be somewhere you’re not.
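To make that failure mode concrete, here's a minimal, hypothetical sketch – not Google Now's actual logic – of a naive "home" heuristic that simply picks the place the phone reports most often in the evening. A routine 5pm grocery run, plus the odd bad location fix, is enough to fool it.

```python
# Hypothetical sketch of a naive "home location" heuristic - NOT Google Now's
# actual algorithm. It labels as "home" whichever place the phone reports most
# often in the evening, so a routine 5pm stop can beat your real home.
from collections import Counter

def guess_home(evening_fixes):
    """Return the most frequently reported evening location."""
    return Counter(evening_fixes).most_common(1)[0][0]

# A week of evening location fixes: a daily grocery-store stop, plus one
# inaccurate reading that places the phone at a neighbour's address.
week = ["grocery store", "home", "grocery store", "home",
        "grocery store", "neighbour's house", "grocery store"]

print(guess_home(week))  # -> "grocery store", not "home"
```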

Transport for London left some personal identifiers in the dataset for the Boris bikes – this allowed patterns of behaviour to be tracked, and good guesses made as to who the people were.

LIDAR is moving from military use to civil use. It will capture consumer data by “watching” you. The Kinect for Xbox does something similar with a simpler version of the tech. Nest can learn about you and your home, and that data can be sold to companies to optimise the power grid – but you can identify a lot about a person from that data. You can identify what people are cooking based on common patterns of electricity use.

Predictive tech

Could robot vacuum cleaners share object identification data to get better at cleaning – and could that data be useful to other companies? Amazon now has enough data that it can anticipate what you might want next, and proactively ship it to a warehouse near you, to wait for your order…

Both the EU and US governments are starting to think about these uses – but they’re barely scratching the surface of the discussions we need to have.

Five things to think about

  • What do these models look like to people with more power than you?
  • What does it look like to live in a post-choice world?
  • To what extent will we perform for the data?
  • What are the implications of long data?
  • How will we exercise the right to be forgotten – or not to be seen at all?