We need to take up our responsibilities

What makes us human is that we take our fate into our own hands, that we take responsibility for our society and for the greater good. That we don't surrender to the supposedly superior forces of technology.

In July 1535, Thomas More was convicted of treason and executed. His famous book Utopia had been published almost two decades earlier, in 1516. These seemingly unrelated events are linked by the word responsibility. In Utopia, More defines what it should mean to be a responsible human being; with his execution, he died for following this principle. More’s Law, as Andrew Keen dubs it in his new book How to Fix the Future, states our duty to make the world a better place:

In today’s age of acceleration, five hundred years after the publication of Utopia, many of us once again feel powerless as seemingly inevitable technological change reshapes our society. As More reminds us, fixing our affairs — by becoming steersmen or pilots of society — is our civic duty. It’s what made us human in the sixteenth century, and it’s what makes us human today.

Let’s reiterate: What makes us human is that we take our fate into our own hands, that we take responsibility for our society and for the greater good. That we don’t surrender to the supposedly superior forces of technology. That we don’t become slaves to the internet, Facebook, or AI. Responsibility has many facets, three of which I’d like to take a closer look at:

  • individual responsibility
  • social responsibility
  • corporate social responsibility

Individual Responsibility

Do we have free will, and if not, are we still morally responsible for our actions or omissions? We can leave that question to philosophical debate and neuroscience. For practical purposes, it makes sense to operate under the assumption that we are (morally) responsible for our affairs, including our digital ones.

Our responsibility starts with our basic behaviour in the digital sphere. Since we know about the traps designed into digital media, we are obliged to monitor and adjust our habits. This might include drastic measures: Your Facebook account doesn’t give you any value, but instead shortens your attention span to that of a goldfish? Delete it! (Though the goldfish meme is probably a myth.)

But that’s not enough. We need to ask ourselves what we can do to make the digital world a better place. To paraphrase JFK: ask not what your digital world can do for you — ask what you can do for your digital world. The internet was built on principles like liberty, and we must defend them.

And for those who have acquired great wealth in the digital realm, there is an obligation to do good with it. This is a very American stance. In Europe, we are pretty much used to leaving that to the (welfare) state, while minimising our own contribution through taxes and public dues.

Andrew Keen reminds us of the philanthropic Carnegies, Stanfords, Rockefellers, and Fords of the early twentieth century. In our day, tech tycoons like Bill Gates or Mark Zuckerberg are becoming their counterparts. Which brings me to my second point.

Social Responsibility

Social media is not really social, if we stick to the traditional meaning of the word. The social is the fabric of our society, and it is our responsibility to cultivate it. Instead, the digital industry is often accused of disrupting it, and by now that has become the dominant narrative.

This view is, of course, by no means new. In his 1942 book Capitalism, Socialism and Democracy, Joseph A. Schumpeter notes:

In breaking down the pre-capitalist framework of society, capitalism thus broke not only barriers that impeded its progress but also flying buttresses that prevented its collapse. That process, impressive in its relentless necessity, was not merely a matter of removing institutional deadwood, but of removing partners of the capitalist stratum, symbiosis with whom was an essential element of the capitalist schema. [… T]he capitalist process in much the same way in which it destroyed the institutional framework of feudal society also undermines its own.

That very same movie is now showing again in a cinema near you. Yes, society will indeed change profoundly through the digital revolution, and it already has. That’s inevitable, and not necessarily a bad thing. But no, the direction of change is not predetermined by some inherent force of technology itself.

We need to be at the helm of technological innovation, to understand what’s happening and to drive the change. That’s our social responsibility. And it is social in a second sense: it can’t be done alone. We need to build and foster civic associations that take care of the social fabric of our digital society.

Corporate Social Responsibility

The third pillar of responsibility sounds quite dull and makes everyone yawn, except perhaps some CSR eggheads. But wait a minute. If we follow Wikipedia’s definition, two aspects are interesting:

  • self-regulation
  • integration into a business model

Self-regulation, if done well, reduces the amount of external regulation needed. If, for example, Google stuck to its don’t be evil mantra, the debate with external regulators could focus on the definition of good and evil. The Google Code of Conduct serves as a good reference point for everyone involved.

Integration into the business model would be even more powerful. If the business model itself fulfils the criteria of (corporate) social responsibility, we almost live in the best of all possible worlds. At least in theory.

So how can we be good digital citizens? We need to take up our responsibilities. With great power comes great responsibility. But I think the converse is also true: with great responsibility comes great power. That’s what we call Digital Humanism.

You must choose. But choose wisely.
