Are we using our digital tools — or are they using us?
We've welcomed digital devices into our homes, our lives and our pockets. But have we really thought about whose agenda lies embedded in the apps we use?
As many sites have noted, the iPhone has been in people’s hands for almost exactly 10 years now. Never in the history of mankind has a technology spread so fast among so much of humanity. The smartphone (a misnomer for what is really a pocket computer) has become one of the most widespread pieces of technology ever, in less than a decade. Be it Apple, Android or another flavour, these devices are part of our everyday existence.
And yet, have we really thought through their impact? So many of the apps on our devices are driven by a combination of data collection and algorithmic analysis, and this combination is already having a profound effect on our societies:
[…] in the coming 10 to 20 years around half of today’s jobs will be threatened by algorithms. 40% of today’s top 500 companies will have vanished in a decade.
That’s a big enough challenge in its own right, as the Scientific American piece suggests; but what about when those little digital nudges come from our most personal, our most intimate devices?
Some software platforms are moving towards “persuasive computing.” In the future, using sophisticated manipulation technologies, these platforms will be able to steer us through entire courses of action, be it for the execution of complex work processes or to generate free content for Internet platforms, from which corporations earn billions. The trend goes from programming computers to programming people.
Programming people…? There’s a terrifying thought. These trusted little devices in our pockets or bags could actually act as prompts, steering us unconsciously towards actions determined by algorithms, themselves written by others. Have we really considered the power we’ve given up, and the possible consequences of that?
It’s easy to fall into the language of the digital evangelist and think of disruption as an unquestionable good. But life is ever so much more complicated than that. Tools are, in and of themselves, neutral. It’s the use they’re put to that defines their good or evil nature. And if we don’t actively consider who we’re letting influence us through our devices, we run some very serious risks:
Nonetheless, experiments with manipulative technologies, such as nudging, are performed with millions of people, without informing them, without transparency and without ethical constraints. Even large social networks like Facebook or online dating platforms such as OkCupid have already publicly admitted to undertaking these kinds of social experiments. If we want to avoid irresponsible research on humans and society (just think of the involvement of psychologists in the torture scandals of the recent past), then we urgently need to impose high standards, especially scientific quality criteria and a code of conduct similar to the Hippocratic Oath. Has our thinking, our freedom, our democracy been hacked?
Perhaps not yet. But it could be.
Digital sucks — because we don’t pay enough attention to who is using its tools to influence us. The more we take control of our devices, and the software we let run on them, the better the chance that we use this new, pervasive level of technology to make the world better, not worse.