Fatal Tesla Model S crash – are we ready for computer-driven death?

Two months ago, a Tesla Model S driver died when the car's Autopilot failed to notice a truck. And people are freaking out.

It’s odd for a fatal car crash from nearly two months back to suddenly start making headline news. But that’s exactly what’s happened – and all because the person who died was driving a Tesla Model S. Or rather, the vehicle’s Autopilot system was.

News outlets are starting to characterise it as the first “driverless” car death – hence the interest. That’s not a fair description: Tesla’s Autopilot makes no claim to be a fully autonomous driving system. It’s what the company calls “driver assist”, which you might liken to something halfway between cruise control and a fully autonomous vehicle. We’re a long way from truly autonomous systems being available to drivers.

But the idea of a computer driving a man to his death has a chilling aspect that has caught people’s attention. From Tesla’s own response to the incident:

What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.

So, yes, the driver assist system failed to notice the vehicle – and so did the driver.

Making driver assist into driver-free

Questions have also been raised about how safely the deceased was driving his car:

The truck driver, Frank Baressi, 62, told the Associated Press that the Tesla driver Joshua Brown, 40, was “playing Harry Potter on the TV screen” during the collision and was driving so fast that “he went so fast through my trailer I didn’t see him”.

There’s no doubt that a subset of Tesla drivers are pushing the Autopilot system harder than the company recommends. Here’s Tesla founder Elon Musk’s wife, the actress Talulah Riley, “driving” with her hands nowhere near the wheel:

And there’s plenty more evidence of others doing the same.

But despite all these – distinctly hazardous – behaviours, this was still the first death. And it’s easy to forget that people die on the roads every day: around 90 per day in the US alone.

The Tesla post on the issue makes a significant statistical point:

This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.

Even with people abusing the system, the fatality rate per mile is noticeably lower than average: one death in 130 million miles works out to roughly 0.77 fatalities per 100 million miles, against about 1.06 for US driving as a whole.

Learning to live with computer-driven death

As cars grow increasingly autonomous, we have to accept that people will die – especially in circumstances where you mix autonomous or semi-autonomous vehicles with traditionally driven ones. People make mistakes – and sometimes the systems can’t react in time. Are we, as a culture, more willing to accept human-caused death than machine-caused? And what if the machines do drive more safely – reducing overall road deaths?

Is your loved one being killed by a computer system error worse than them dying through their own bad driving – or that of others? These are the sorts of questions we’ll have to face as a society – and that some unlucky individuals and families will have to face personally.

If nothing else, the reaction to this tragic death shows us that the path to autonomous vehicles is not just a technological and legal challenge – it’s a social one, too.