NEXT17 Summary #1: Where digital went wrong
For two days in September, the digital world met in Hamburg and a single question was on our lips: where did we go wrong? Here are the answers we found…
It's fair to say that there were more than a few raised eyebrows when we announced the theme of NEXT17: digital sucks.
It goes against almost everything that tech conferences have said and done over the last decade. We've been singing from one big hymn sheet, raising the praises of startups, platforms, innovation and disruption to the rafters.
OK, we did sound a word of warning five years ago, but that was a quiet voice of concern in a sea of jubilation.
And yet, the attendees and the speakers warmed to the theme quickly. In this first part of a two-part summary of NEXT17, we'll explore what they had to say about the source of the problem: where did we go wrong? In the second part, we'll address how to make it better.
The empty heart of disruption culture
Dave Mattin had some early answers for us. He suggested that the early ideology of the web derived from the hippie era, which was still conscious of the terrible global effects of the division and war that characterised much of the 20th century. As tech has moved into the hands of people who can barely remember the 20th century, that imperative - that sense of consequence - has lessened, and decisions are being made with little regard for their consequences.
Or, as illustrated in a quote from Blogger, Twitter and Medium founder Ev Williams, we live in a tech ecosystem built from naiveté:
I thought once everybody could speak freely and exchange information and ideas, the world is automatically going to be a better place. I was wrong about that.
This has manifested itself in a technocratic creed that is, at its heart, utterly flawed:
- All human beings are essentially good people.
- That means you don’t really need politics, society and social conventions. They’re holding you back.
- If you look at the tech world’s leaders, they’re a very interesting mix of left and right. They’re suspicious of government - they want small government.
- All people need is each other connected by us to do amazing things.
There are so many false, simplistic assumptions built into that first statement, and they have contributed to the problems we find ourselves in now. We haven't designed for humanity's flaws and weaknesses - we've designed for a world where those shortcomings don't exist. But they do exist. And we're paying the price of that flawed assumption.
In that sense, we're creating digital products that suck because we lack a sense of history - and history is the story of consequences.
We need to implement innovation
This was echoed by Tricia Wang's compelling talk. Innovation isn't actually the problem, she argued - implementing it within a company is. Why? Because we don't understand the company structures we use. They were designed for centralised decision-making in the railroad era. That's a fantastic organisational model in a period of stability, where known data can be passed up a hierarchy and decisions made remotely. In periods of profound change, that sort of distance between the innovators and the decision-makers is catastrophic.
(It's almost a cliché to quote Steve Jobs in these conversations, but I was reminded of his deeply hands-on approach to product development by Tricia's talk. Many have noted that he functioned less like a CEO and more like a Chief Product Officer, with Tim Cook picking up the more traditional CEO tasks, even before he actually took on the title. People tend to take the wrong lessons from Jobs - they behave like arseholes, rather than becoming hands-on product people.)
Let's build a network of trust and privacy
We need to start learning these lessons now - because if we don't, we'll miss opportunities to make the most of emergent technologies. Shermin Voshmgir made a compelling case that we're in that position with blockchain right now - it's an incredibly powerful technology that distributes trust in transactions between parties who don't necessarily trust one another, in a technically elegant way. This tech, if shepherded correctly over the next few years, could create a decentralised way of owning your own data, and sharing it in a secure way. If we miss this opportunity, then we run the risk of creating one of the most frightening centralised control machines we've yet generated.
We messed up during the Web 2.0 era, allowing the emergence of dynamic web applications to create platforms that have, in some senses, supplanted the web. If we allow that centralisation of power to continue into the Web 3.0 era, the impact on our societies could make the shockwaves of the last 18 months seem like minor perturbations.
If there was one running theme through these talks, it's that we need to start seeing technology as a neutral tool again, and start shaping it towards the good of humanity, rather than falling for the unquestioning assumption that technology is good, ergo more technology is better.
The evolution of conversation – and what we can learn from it
One pair of speakers delved deep into history to give us some perspective on the sorts of societal shifts we're passing through now. The history of conversation is deeply intertwined with the history of technology, suggested Mark Curtis and Thomas Müller of Fjord. Language was the first tool of conversation; then we facilitated it through drugs - like wine and coffee - with accompanying etiquette. And then we developed new forms of etiquette around letter writing and e-mail.
We need new etiquettes and behaviours around instant communications, and around interactions via bots. We can't go backwards, but we can learn from what our predecessors did, and figure out protocols that allow the machines to adapt to how humans operate, rather than making us adapt to the machines.
Let's start asking moral questions of the technology decisions we make
Is there a moral component to this? James Williams certainly thought so. The information age has brought us a scarcity of attention - we have more information available to us than we can reasonably expect to consume, so we have to make hard decisions about how to spend our time. And services which are designed to consume our attention for the benefit of the service provider rather than our own are making, at heart, an immoral decision. (See also the idea of the ludic loop.)
We need to align our design and analytics goals with people's own life goals; that way we'll build healthier, more sustainable services for people. We need to avoid the distraction triad:
- Functional distraction - interferes with what we want to do
- Existential distraction - interferes with our being goals; stops us being who we want to be
- Epistemic distraction - frustrates our underlying capabilities, like reason, willpower and reflection
If we enable these processes, we are, in effect, dehumanising people, not connecting them or enabling them to live better lives. We are putting corporate value over human value.
Another technology in its nascent stage is mixed reality, in its various forms - augmented reality and virtual reality. Our panel discussion expressed some concerns about the rapid consumerisation of a technology which has a profound effect on how people actually see the world, with a consequent impact on their neurological structures. If we change our perception of reality, we change how we think about reality. While there's incredible potential to allow people to experience - and understand - things they couldn't otherwise, do we need to start actively designing these systems for empathy, understanding and connection, rather than division and conflict?
Design for the user, not for the company
Ame Elliott suggested that we need this kind of user-protection thinking built into the very first design phase of any new product. She highlighted the risk of cheap internet of things devices being easily compromised, allowing digital intrusion into our homes - into our children's bedrooms - as an example of enthusiasm for tech mixed with a disregard for its consequences. And, with no pathway to update and patch the security holes in these devices, the consequences of these mistakes could be very long-lasting.
Lily Kollé, a senior designer at Raft Collective, elaborated on this point, suggesting that we need to ruthlessly edit our designs to facilitate the ideal path to what the user actually wants (rather than what we could offer the user). When we deviate from the bare minimum of what they need to get the job done, we should be conscious of why we're deviating - and of the costs of doing so.
In essence, this was an expansion of the theme of user-centric design to think not just about the job to be done by our product, but how that job fits into the overall direction of our users' lives. Are we making a net positive or negative contribution to what they are doing with their limited span on this planet?
The fault, as ever, is in ourselves
Could it be that the problems we're seeing manifesting in technology are just an expression of deeper underlying problems in the way we structure our world?
In one of the final talks of NEXT17, Sebastian Deterding, a senior research fellow at Digital Creativity Labs and principal designer at coding conduct, posed a disturbing question: why, with the abundance of labour-saving technology we're all surrounded by, do we have so little free time? What went wrong?
In an echo of Tricia Wang's perspective from earlier in the event, he suggested that our approach to corporate life is the problem. As many of the traditional sources of community have broken down - including the church - we've replaced duty to God or to our community with duty to our businesses. And the Calvinist work ethic has created a situation where working long hours is a badge of virtue - despite extensive research showing that long hours are destructive to creativity and productivity. We're destroying our value to our companies in order to prove our value to our companies - and our employers are praising us for it. Silicon Valley is the peak of this, with total commitment to the job expected, and an ethos of "move fast and break things" that can be great for the company in the short term, but horribly corrosive to the world around it.
Our lives support our corporations, when our corporations should support our lives.
The solution? Take back digital for humanity
How do we make tech suck less? Recenter it on humanity, and remember the lessons of the past as we push towards the future.
If nothing else, #NEXT17 was a wake-up call to us all not to sleepwalk into a bleaker future, blinded and distracted by our devices, and exhausted by our own over-work and attention overload. Step back, be mindful, and take control of these tools for a better future.
And, happily, there were many lessons on how we can do that — but let's save those for the concluding part of this review…
This is the first part of a two part summary of NEXT17, which was held in Hamburg in September 2017. You can read part two here.