Digital needs to grow up: and that means learning ethics

Right now, the digital world is behaving like a spoilt toddler, doing what it wants without a thought for the consequences. It's up to us to teach it better, and to build new tools that are both ethical and responsible.

Tech needs to grow up. Right now, much of the digital technology world could fairly be described as stuck in the toddler stage. We have new toys and we bash them around experimentally, not caring what we break in the process. For all the talk of “disruption” as a positive force (and it can be), anyone who has had children will know that not all disruption is positive.

Like a toddler, the tech world has to learn a sense of consequences and of ethics. And like a toddler, it will need help to develop that. We’re the help it needs, and by “we” I mean us in our roles both as consumers and as the people designing and building new digital products. Through the decisions we make, the services we choose to use and support, and the points where we say “enough”, we will shape that future.

The negative should be a gateway to the positive

Right now, it feels like the conversation is stuck dwelling on the negative. That’s why, last week, we looked at possible solutions emerging to fix, or replace, Facebook. Becoming aware of the problem is a good thing.

Euan Semple put this eloquently:

I have often said that the internet is a mirror, showing us a reflection of our nature, warts and all. Sometimes that mirror is going to be deliberately distorted, like those old mirrors at funfairs.

When it is, so long as we know it is happening, we don’t need to get literally “bent out of shape” about it. We can adjust our behaviour, expose the manipulation, and move on.

But we can do none of this if we’ve got our eyes shut!

The current state of play of digital is not something that was done to us; it was done by us. Now that we’re aware of where we’ve ended up (which was the whole point of our “Digital Sucks” theme last year), we can take steps to change that.

Some people have been making tentative explorations down the path we need to travel. We’ve alluded before to the Indie Web movement, which has been working hard to put in place the technological underpinnings for a version of the web with less centralisation of data. The movement describes its purpose as:

Our online content and identities are becoming more important and sometimes even critical to our lives. Neither are secure in the hands of random ephemeral startups or big silos. We should be the holders of our online presence.

Indeed, the website also suggests:

The IndieWeb is a people-focused alternative to the “corporate web”.

If you’re in business, you might be getting a little scratchy at this. Isn’t there an anti-corporate theme here? Not exactly. The movement is not anti-corporate; it is against corporate control. Think of it this way: if your business has ever chafed at the restrictions imposed on it by Google’s and Facebook’s central position in controlling information flows, you will find plenty to value in the Indie Web movement.

The corporate web supports the big two incumbents. An indie web supports new and existing businesses in doing more.
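To give a flavour of those technological underpinnings, here is a minimal sketch of sending a Webmention, one of the open standards the IndieWeb community has developed so that one site can notify another that it has linked to it. It is illustrative only: the URLs are hypothetical, and the endpoint discovery is simplified to the HTTP Link header, where a full implementation would also parse the target page’s HTML.

```python
# Minimal Webmention sender (sketch). Assumes the `requests` library is installed.
from typing import Optional
from urllib.parse import urljoin

import requests


def discover_webmention_endpoint(target: str) -> Optional[str]:
    """Return the Webmention endpoint advertised by the target page, if any.

    Simplified: only the HTTP Link header is checked; real senders also
    look for <link> and <a> elements with rel="webmention" in the HTML.
    """
    resp = requests.get(target, timeout=10)
    link = resp.links.get("webmention")  # requests parses Link headers by rel
    if link:
        return urljoin(target, link["url"])
    return None


def send_webmention(source: str, target: str) -> bool:
    """Notify `target` that `source` links to it. Returns True if accepted."""
    endpoint = discover_webmention_endpoint(target)
    if endpoint is None:
        return False
    resp = requests.post(
        endpoint,
        data={"source": source, "target": target},  # form-encoded, per the spec
        timeout=10,
    )
    return resp.status_code in (200, 201, 202)


if __name__ == "__main__":
    # Hypothetical URLs, for illustration only.
    accepted = send_webmention(
        source="https://example.com/my-post",
        target="https://example.org/their-post",
    )
    print("Webmention accepted" if accepted else "No endpoint found, or rejected")
```

The detail matters less than the shape of it: two sites talking directly to each other over an open standard, with no central platform sitting in between.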

Right now, though, the Indie Web is a set of tools and technology, built on the same basic principles as the web or e-mail: open, interconnected standards. We’ll need more than a technological grounding to move this discussion forwards. We need an ethical framework, too.

Fortunately, there’s some work in progress there, too.

Engaging with digital ethics

For example, the Oxford Internet Institute, part of the University of Oxford in the UK, launched a Digital Ethics Lab in 2017. The Lab aims to

“help design a better information society: open, pluralistic, tolerant, equitable, and just. Our goal is to identify the benefits and enhance the positive opportunities of digital innovation as a force for good, and avoid or mitigate its risks and shortcomings.”

One example of that is the work they’ve been doing on integrating human rights into new technology products. They’ve identified three key considerations in both the legal and the technical fields:

Legal:

  1. Further study of the application of instruments of the existing human rights framework to Internet actors is needed.
  2. More research is needed to analyse and rebuild the theories underpinning human rights, given the premises and assumptions grounding them may have been affected by the transition to a digitally mediated society.
  3. Human rights frameworks can best be approached as a legal minimum baseline.

Technical:

  1. Taking into account a wider range of international human rights would benefit the development of human rights oriented Internet technology.
  2. Internet technologies, in general, must be developed with an eye towards their potential negative impact and human rights impact assessments undertaken to understand that impact.
  3. Technology designers, funders, and implementers need to be aware of the context and culture within which a technology will be used, by involving the target end-users in the design process.

This feels like valuable work in an era when people are becoming aware of what they’ve already given up by surrendering their digital egos to a limited number of mega-companies. If privacy and ethics are to become competitive advantages in the next wave of digital, this sort of work might become very useful.

Researching the problems – and the solutions

Over in the United States, the Center for Digital Ethics is doing similar work, as well as some useful research into the problems today’s technology might be storing up for our future, and that of our children:

A review of 36 social media studies, published in JAMA Pediatrics, found that 23 percent of kids are victims of cyberbullying. The review also found that cyberbullying results in low self-esteem, depression, self-harm and behavioral problems — in both the victims and the bullies. In addition, cyberbullying was more likely to produce suicidal thoughts than traditional bullying.

This is not just a compilation of problems, but also of suggested solutions:

“Mobile devices belong to the parent and the teen is being allowed to use it,” Shea said. “A contract can be a useful tool before putting a device in the hands of your teen which would allow parents to have access to the phone.”

Clearly we’re at such an early stage in these journeys that monitoring the results over time will be critical. It’s reassuring that work is now happening at an academic level to bring this together.

Private sector digital ethics

But those in the private sector have also been getting involved. Most notably, the Center for Humane Technology is in the process of building out its resources around the idea that digital time can be Time Well Spent, if we take control of that time and make it work for us. In particular, they’ve built a set of basic design principles to help you consider how to make sure your product enhances people’s lives.

These are the beginnings of the foundations and scaffolding we’ll need to build the next wave of technology. With them, we can build technology that enhances our lives, rather than supplanting them. And maybe, with enough people in enough organisations turning their minds to this problem, and sharing what they’ve learnt, we won’t be quite so obsessed with the panda that’s dancing…


Photo by Jelleke Vanooteghem on Unsplash