Jamie Susskind: the power and politics of digital

We can't afford to just pay attention to the consumer impact of digital. It's creating rules that will define how we live, suggests Jamie Susskind at NEXT19.

Jamie Susskind is the author of Future Politics: Living Together in a World Transformed by Tech, and is a former Fellow of Harvard’s Berkman Klein Center. He spoke during the morning session of NEXT19 on 19th September 2019. These are our liveblogged notes.

We tend to think that tomorrow will be a bit better and faster than today. But there’s another vision that says that technology could make the world completely different.

And Susskind thinks that we’re on the cusp of a transition towards that right now. Here’s why:

1. Increasingly capable systems

Tools are becoming as competent as humans at tasks like translation and classification – and at beating us at games. For years it was assumed Go was too complex for an AI to beat a human. But in the last few years it has been done. AlphaGo Zero beat the version of AlphaGo that had defeated Go masters, winning 100 games in a row. We went from “this is impossible” to “we will never beat these machines” in 18 months.

2. Increasingly integrated technology

We’ve moved from room-sized computers, to keyboard-driven ones, to today’s glass screens. In the future, devices will be embedded everywhere. The idea of the physical world and the digital world being separate will be strange to our children.

3. Increasingly quantified society

We generate as much data every few hours as we did in the entire period from the dawn of time to 2003. This is measured, stored and analysed. Our morals and mores need to catch up with technology and what it is recording – and learning.

The Impacts

Jamie Susskind talking at NEXT19


When then-PM Gordon Brown first visited President Obama, he took £20,000 worth of gifts for the US president and his wife. In return he got 20 DVDs of classic movies. He was mocked for it in the press, and when he got home, he found he couldn’t play them, because they were region-locked. It’s hard to get around the rules coded into a system. Imagine rushing to the hospital – but not being allowed to break the speed limit, because your car is coded not to allow it. This is a form of power.

The more you know about people, the easier it is to influence them. That’s why the Cambridge Analytica revelations were so important. We learned they were producing tailored views of candidates, targeted at individuals.

The people who choose what comes top of your searches or your social media feeds have power. They are determining true and false, wrong and right. They control our perception of the world.


Have you ever gone to a restaurant with a buffet, and taken more than your fair share? Have you dodged a fare, or paid someone cash-in-hand knowing that tax would not be paid on it? We’re all allowed to be a little naughty at the edge of the laws in a free society. It’s very hard to have that flexibility if the laws are coded in. In some parts of China, your toilet paper use in public facilities is regulated by face recognition…

There’s a big difference between a door with a sign saying “do not enter” and a locked door.


The internet has changed politics – in particular how people organise. But now it’s starting to influence how people deliberate. People tend to choose people they agree with to follow, and news sources that reinforce that. That creates fragmentation, which then creates polarisation.

We’re seeing false information spreading. In the 90s we thought that the internet would make lying impossible, because of how quickly we could fact check. Look how that turned out.

Direct democracy has been a dream since the fall of Athens. Now, it could come to us. But do we want it? Do we want to be rid of politicians entirely? Are there areas of politics that we might entrust to non-human systems?

They are no longer impossibilities. So someone will suggest them – and we need to decide how we feel about them.


Many CVs go unread by human eyes, because the first sort is done by computers. Algorithmic systems are deciding who gets what – that’s a matter of justice. There are facial recognition systems that won’t recognise people of colour. There are voice recognition systems that can’t hear people with strong regional accents.

Software engineers are becoming social engineers. The rules they write are the ones we have to follow. Too many of them have no sense of the moral dimension of their work, and their lack of gender and racial diversity is a limitation.

But equally, we have regulators who just don’t understand the technology. And we, as the people, need to move beyond thinking about the consumer consequences of digital to the political ones. How much of our lives do we want controlled by technology?