Autonomous cars can fix road deaths – if we don’t rush them onto the roads too fast
In recent weeks, autonomous cars have been responsible for the deaths of a pedestrian and a driver. Are we moving too fast with this innovation?
Some companies regret the things they said in their earlier days. The motto “don’t be evil” has haunted Google for a decade, and people at Facebook are almost certainly regretting the fact that they used to have “move fast and break things” posters around the office. Breaking little things like user privacy, elections and possibly democracy is probably not what they had in mind.
There are some good examples of why sometimes you should take your time with innovation, and only launch when you’ve got it right. Microsoft launched tablet PCs back in 2001. For all the head start it got, it was Apple’s iPad that came to define and dominate the market nine years later, in 2010. First-mover advantage is not always a good thing, and, as the old adage goes, pioneers often end up face down in the sand with arrows in their backs.
Rushing into innovative disruption gets even more serious when you’re talking about technology that can kill people in an instant. Nowhere is that clearer right now than self-driving cars. Multiple parties are pushing forwards with autonomous car technology, including Uber, Tesla and, of course, Alphabet. But the headlines in the last few weeks have not been kind to those efforts.
Days after a self-driving Uber SUV struck a 49-year-old pedestrian while she was crossing the street with her bicycle in Tempe, Arizona, footage released by police revealed that the vehicle was moving in autonomous mode and did not appear to slow down or detect the woman even though she was visible in front of the car prior to the collision.
While more people are killed on the world’s roads every day by human drivers than have been killed by autonomous vehicles in the whole of history, this crash was particularly troubling because the video shows clearly that the car’s LIDAR detection system should have been able to prevent the collision — and that the backup human driver wasn’t paying attention. This, coupled with Uber’s long-held reputation for playing fast and loose with the rules, made for a rapid backlash.
Don’t move fast and break things with a car
Moving fast and breaking things is fine for a website, or even a mobile phone — but for a fast-moving metal object that can kill people, it’s a disaster.
As WIRED reported, Martyn Thomas CBE, professor of information technology at Gresham College and fellow of the Royal Academy of Engineering, has some serious worries about how these experiments are being conducted:
The process is then made additionally difficult by frequent incremental changes to hardware and the learning process of the software itself. “If your system is learning all the time, then it’s changing all the time, and any change could make things worse as well as better. So you could completely undermine the work you are doing,” Thomas explains. All these factors leave the testers with too many variables to effectively work through any problems buried deep in a car’s systems.
The Silicon Valley disruption culture could learn much from industries like aerospace, which build incredibly complex systems slowly and carefully, with a remarkable safety record per hour flown. In the long term, one of the great possibilities that autonomous driving heralds is massively reduced road deaths. To squander that possibility in a rush to get there too quickly would be a terrible waste of human life.
Pre-minimum viable products
That includes putting systems into cars that aren’t quite ready for prime time — something some are accusing Tesla of doing after another fatal crash, this time one that killed the driver:
Tesla had also revealed that logs from the vehicle’s computer had indicated that the 38-year-old’s hands were not on the steering wheel in the six seconds prior to the collision on California’s Route 101. It said the car had provided several “hands-on” warnings earlier in the journey.
There’s a known problem with semi-autonomous systems like Tesla’s Autopilot: they lead to diminished driver attention, even though the driver’s intervention is still needed. There’s a danger point where semi-automation provides enough autonomy that the driver’s attention wanders, but not enough to keep them safe.
There is incredible potential for autonomous vehicles to reshape the way we live. We’ve been writing about this for years. But there’s a real danger that Silicon Valley’s culture of rushing out a minimum viable product and iterating on it is not only going to kill people — it’s going to kill the potential of this technology.
A few more stories like this, and the mobility backlash will truly be underway.
Or, to use another old adage: more haste, less speed.
Autonomous vehicles are coming. That’s inevitable. But let’s take the time to do them right.