2018: The Year of Tech Consequences
You’re never going to hit the jackpot when writing about the future, because you can never quite predict the huge changes that are coming. The unexpected is usually what changes everything and, by its very definition, it’s not expected — and thus, unpredictable.
But sometimes you can look at how everything is lining up in the closing weeks of the year, and you can see a clear direction ahead. This is one of those years. For once, it has nothing to do with a new technology, product or brand. It's about everything that surrounds technology. It's about politics, legislation and trust. It's about choices. And it doesn't look fun or easy.
In short, 2018 looks set to be the year of consequences. This is the year when the tech world has to grow up, and the rest of the world has to start taking it really seriously. And that's going to mean some serious growing pains, as the Silicon Valley dreamers clash with pragmatic legislators for control of our information ecosystems.
Only two years ago an article like this would have seemed unlikely:
Outside the bubble, things are different. We're not egging on startups that willingly flout regulations. We're wary of artificial intelligence and its potential to eliminate jobs. We're dubious of tech leaders' promises to make their products safe for their kids to use. We are all sick of the jokes that no longer feel funny: lines about the lack of women in tech, about obscenely rich 20-somethings, about awkward coders with bad people skills, about "hustling" and growth at any cost. It all feels inappropriate.
That's from WIRED, once seen as the bible of technology. And when the bible is calling you out, maybe you need to rethink your moral centre a little. If the article is correct, and there's plenty of evidence it is, there's fertile soil for politicians to start campaigning on reining in the power of the tech behemoths.
That future wasn’t birthed this year, but back in 2016, when suddenly the world became aware of what a few reporters had already been picking up: our new tools were being used to manipulate us.
In particular, Russia has emerged as a new powerhouse in information warfare, inserting its propaganda seamlessly into discussions worldwide, changing the tenor of debates - and possibly even shifting the results of elections.
These global impacts are being felt on a personal level, as we’re becoming aware of the effects of our new tools on our politics, our opinions and even our mental health.
But let’s start at the beginning.
The “Fake News” revelation
2017 has seen us picking our way through the new dynamics of our politics. It was a world that already existed, but which was slipping under people's radar. As early as 2015, the New York Times was reporting on Russia's trolling operations, but the story didn't get anywhere near the notice it deserved, highlighted only in passing by the occasional blogger.
In the year since US President Donald Trump’s election, the world has become much more aware of this — and the danger it poses to their own governments. Step by step, Russia’s involvement in political contests in many countries has been unpicked, and with increasing sophistication. In the first instance, people went down the obvious route of looking at the adverts bought within social media. That was always going to miss the point — the truth is inevitably more complicated than that. Social media is facilitated by code, but it is driven by us.
Social platforms make it cheap and easy for propagandists to create fake profiles and spread their messages, but those messages still spread by playing on our own psychological weaknesses, especially confirmation bias. And we're all prone to that, even those of us who make part of our living working on verification skills.
The world is now grappling with how to safeguard our societies in light of this ability to manipulate individual citizens en masse, and that will inevitably find its way into both education and legislation, which is likely to be a big trend of 2018. The tech world, long seen as an engine of economic growth and thus given light-touch regulation, is likely to come under ever-greater legal scrutiny. And politicians will make their names on anti-tech platforms.
The Tech Backlash hits legislation
There are good reasons for this. Smartwatches have become a vector for covert surveillance, and smartphones have become complicit in mental health problems. While there's evidence that three or more hours of phone use a day can have a negative effect on your mental health, more moderate daily use seems to be absolutely fine.
The move to ban technology is a rather predictable response. It's a crowd-pleasing action beloved of politicians, but it does very little to address the core problems: a ban will only serve to make the devices more attractive, and to deprive us of the chance to teach the next generation to use them responsibly. Rather than banning our tech, we should be training our children how to use it better, and being more mindful of how we use it ourselves.
Even Facebook has finally admitted that maybe, just maybe, social media can be bad for you. The site's solution is pretty predictable: use Facebook more, but also more actively. This response has not gone down well with the author of the original research:
“You have to think about the way that most people are using the app or using the site, especially the people who are spending the most time on Facebook or other social media sites, who are the most at risk for these mental health issues,” she says. “How are they using their time on the site? Unless we know that, it’s not particularly relevant to ask, well, what are people doing?”
The battle over how to use the tech we've invented safely and productively has only just got going this year. In 2018, I'd expect it to turn into a major battle between techno-utopians and neo-Luddites. The more this battle gets expressed in legislation, the more limited the scope we'll have to play in.
Just this week, France is cracking down on Facebook mining data from WhatsApp. We’ll read more stories like that in the coming year.
And so, we’ll all need to engage in this process, particularly those of us who have a deep understanding of the technology.
Perhaps we should all be asking ourselves two questions throughout 2018:
- How does this product make people's lives better?
- How could this product be used to make people's lives worse, and how can I design to prevent that from happening?
That seems like the best way to fight off an ever-tightening legal regime.
It’s not just about photos of your lunch anymore
However, there is one area of technology that will see serious investment: cyberwar will get serious. Russia has shown how both intelligent social manipulation and large-scale technological intervention can turn online attacks into powerful tools of propaganda and data-gathering. North Korea is playing in this space, too, with malware that has crippled organisations for days or weeks. How well prepared are you for that?
Other countries will up their game during 2018 — or, at the very least, will be granted funds to do so. On the other end of the scale, we need to think much more actively about cyber-security both at a personal level, and at a corporate level. What harm could your products do if they were compromised? What are you doing to prevent — or mitigate — that?
On a personal level, whom are you giving your data to, and how much do you trust them to keep it safe? Trust may become a bigger part of online decision-making than ever before.
A decade ago, many people were mocking social networks as trivial places — where people “poked” each other and shared photos of their lunch. Now, they’re reshaping our democracies. Everyone needs to start taking tech seriously, and weighing its upsides and downsides carefully. We’ve already seen that starting in 2017, but the trend will only accelerate into 2018.
Let’s look at some examples.
Smart emerging technology
I think voice interfaces will continue to develop through 2018. Apple and Sonos adding voice to their offerings, with the Apple HomePod due in the early part of the year, will help bring this technology further into the mainstream. Both will use vocal command of music as a gateway to speech-based ecosystems, and will start to build the missing part of the connected home equation: voice commands.
Anyone who has experimented with smart lighting systems, for example, will have been frustrated by the intersection of traditional switches and the new phone-controlled lights. If you turn them off at the switch, the home automation breaks down. However, if you turn them off via your mobile, turning them back on from the switch becomes a hassle. In-room voice solves those problems.
However, at some point, people are going to start wondering if having always-listening devices in their home is a good idea. It's like inviting surveillance in, and paying for the privilege. The two major players in the space, Google and Amazon, make pretty good money off learning as much about you as they possibly can. That might well come back to bite them, and open the door for Apple, a late entrant into the space, to sell its expensive speakers off the back of a privacy story.
However, there is another solution: just speaking to your wrist, Dick Tracy-style. Many people have forgotten about wearables, but they seem to be growing steadily. It's a function of our current obsession with tech that if something isn't a massive success on day one, it must be a failure. That idea forgets that devices like the iPod and the iPhone took years to become the breakout hits they were, or are, at their peak.
Much like AI is starting to deal with some of the issues we face from big data, wearables will begin to address the distraction problem we’ve created for ourselves. They’re also starting to tell compelling stories around health tracking — and health alerts. But, again, that opens up a can of worms. Who has that data? And why? And how safe is it?
We live in an age where smart sex toys are getting hacked. Unless exhibitionism is your thing, I'm pretty sure most of us want that kind of data kept really secure.
So, here’s the challenge for 2018:
- How can government keep us safer in the digital age, without crushing innovation?
- How can companies build better, more secure products, with the risks and downsides addressed and designed around?
- How can we, as consumers, make smarter choices about the devices and services we buy or invest time in?
Actually, to me at least, that sounds like a more exciting year than one with a few new bits of tech in it, because it means doing the foundational work of building a truly digital society: one that enables us while also protecting us.
That sounds worth fighting for, doesn’t it?