Will AI eat software?
Software used to be expensive, but thanks to AI, it may soon be cheap. Will software developers lose their jobs?
For years, if not decades, the idea that software is coming for our jobs has been commonplace. Software is eating the world, as Marc Andreessen famously put it. Now, it looks like AI will do the same to software.
In 2014, Bill Gates warned that within 20 years, a lot of jobs would be replaced by software and automation:
Software substitution, whether it’s for drivers or waiters or nurses … it’s progressing. … Technology over time will reduce demand for jobs, particularly at the lower end of skill set. … 20 years from now, labor demand for lots of skill sets will be substantially lower. I don’t think people have that in their mental model.
This warning was in line with the long-term trajectory of automation since the industrial revolution. We automate repetitive and structured tasks with predictable outcomes as soon as (a) we can and (b) the cost falls below that of human labour. Over the course of the digital revolution, automation increasingly meant digitisation, i.e. writing software.
This process inevitably gets recursive at some point: we start automating automation itself. In the case of software, this means we are about to automate coding, the process of producing software. Sean Everett predicted back in 2017 that AI would eat software. He looked at the basics (like energy, computation, and data) and concluded that it would take a few years to lay the foundation.
Five years later, OpenAI released ChatGPT.
Will developers lose their jobs?
Back in 2019, we saw early signs of AI making inroads. Now, we are witnessing a sheer explosion of events.
- Developers are getting new and better tools, fueled by machine learning and generative AI.
- A whole universe of low code and no code is emerging.
- Generative AI can write code.
- New interfaces like ChatGPT make code and coding increasingly obsolete.
Does this mean that developers will lose their jobs? Not so fast. Software is a layer (or rather, multiple layers) between users and hardware. We have long abstracted away the lower layers to simplify the lives of developers and users alike. Automation is nothing new.
If and when software development costs approach zero, we'll see a Cambrian explosion of new software. For years, the high cost of development has limited our ability to create and deploy software, and these costs still keep us well short of the field's full potential.
Read this post by Paul Kedrosky and Eric Norlin to get an idea of what's going to happen. It will be nothing less than the digital revolution, which has so far taken decades, suddenly going into overdrive. This is an exponential moment. It's everything everywhere all at once, far more than just pretty AI-generated pictures. We can rewrite how the world works, at least in principle.
Where are the limits?
Of course, there are still limits:
- We don’t have unlimited energy, although the solar revolution is working on that.
- Computation isn’t free of charge and we need hardware for all this software, whether AI-generated or not, to run.
- Data is plentiful, but not infinite.
- And usage is critical as well, because it’s closely related to human attention – a scarce resource.
What this means for the job market is hard to predict, beyond knowing that it will bring tremendous change. Increasing productivity in the software industry should lead to a net gain, at the very least. Higher productivity can't be bad, can it? We have two buckets to pay attention to:
- Productivity (doing more with less)
- Creativity (doing things that couldn’t be done before)
The former was the yardstick of the industrial age and is still relevant wherever we’re dealing with scarce resources, like energy, physical hardware, and software developers. The latter is what we traditionally call innovation. Both are closely related. We need innovation (and thus, creativity) to increase productivity. In turn, increased productivity enables us to do things we couldn’t do before.
To figure out what these things are, we again need creativity, which then leads to innovation. And so on. You see the loop here.
Moore’s Law vs Wright’s Law
In general, software already has shorter innovation cycles than hardware. We update the software on our devices much more often than the devices themselves. That’s why software is more exciting than hardware. Digitisation means applying this principle to new devices, like our cars or the infamous smart fridge. The Cambrian explosion of software will lead to two effects:
- Proliferation (software on more devices)
- Acceleration (shorter innovation cycles)
Again, it’s hard to predict which effect will dominate. We’ll continue to add software to devices wherever it makes sense and we can afford it (i.e. proliferation). The new world of cheap software means we can afford far more than we could before. This will also accelerate innovation wherever possible.
Let’s again look at the limits.
Energy: Smart players have already established their server farms in areas with plenty of cheap renewable energy. Others will follow suit.
Computation: The cloud still needs hardware. Remember the old joke: there is no cloud, just someone else’s computer. The competition between Moore’s Law (for hardware) and Wright’s Law (aka the learning curve) will be interesting to watch. Machine learning could have a steeper learning curve than chips, which in turn could make hardware a bottleneck.
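To make the comparison concrete, here is a minimal sketch of the two laws. The parameter values (a two-year halving period, a 20% learning rate) are illustrative assumptions of mine, not empirical figures; the point is only the structural difference: Moore's Law ties cost to elapsed time, while Wright's Law ties it to cumulative production.

```python
import math

def moore_cost(years, halving_period=2.0):
    """Moore's Law: cost per unit of compute halves on a fixed *time* schedule."""
    return 0.5 ** (years / halving_period)

def wright_cost(cumulative_units, learning_rate=0.20):
    """Wright's Law: each *doubling of cumulative production* cuts cost
    by a fixed fraction (here an assumed 20%), regardless of elapsed time."""
    b = -math.log2(1 - learning_rate)  # progress exponent
    return cumulative_units ** (-b)

# If demand explodes, cumulative units can double much faster than every
# two years, so the production-driven curve outruns the time-driven one.
for doublings in range(5):
    print(doublings, round(wright_cost(2 ** doublings), 3))
```

The takeaway for the bottleneck question: demand for machine-learning compute drives production doublings faster than the calendar drives chip generations, so whichever curve doubles its denominator faster wins.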
Data: AI has a tremendous appetite for data. How much data can we afford to generate?
Usage: Ultimately, it’s the users who determine value. Even autonomous AI systems must generate value for real users to continue existing.
These are all serious limiting factors, and the list isn’t exhaustive.
But what about software developers and their jobs? If I were a developer, my first priority would be productivity. Can I use new and better tools? Automation should free up capacities that could be used creatively. Can we liberate developers from the burden of repetitive, boring tasks?
In theory at least (I’m not a developer), the job should become more creative than it has been in the recent past. It would become more of an art form than traditional engineering (which, of course, also has a creative side). Think of design.
AI may eat software, but developers will evolve and survive.