The latest footage released by the Tempe Police Department shows that the Uber car didn’t slow down before hitting the pedestrian walking with her bicycle. History will remember this as the first casualty of a failed AI self-driving system.
This is a much bigger problem than people realize, as it will make many question the safety of self-driving cars. Many are already skeptical, and the fatality will simply create more opponents of the AI self-driving revolution.
The way self-driving works, a situation where the car doesn’t brake at all before impact should not be possible even on Level 2 cars. There is a lot of redundancy in a self-driving system, and the fact that the AI-powered Volvo didn’t detect the pedestrian, try to avoid her, or apply the emergency brake is a failure of the technology. Nvidia’s CEO confirmed that it was not Nvidia hardware in that particular car.
The Uber/Volvo car platform uses multiple cameras, radar, high-precision GPS, and LiDAR (light detection and ranging), and a computer processes all of that data to keep the car on the road and to keep pedestrians and other vehicles on the street safe.
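To illustrate the kind of redundancy we are talking about, here is a minimal, purely hypothetical sketch of how detections from independent sensors could be combined to trigger an emergency brake. The names, thresholds, and logic are our own assumptions for illustration, not anything from Uber’s or Volvo’s actual software stack.

```python
# Purely hypothetical illustration of sensor redundancy; this is not Uber's
# or Volvo's code. The idea: if any sufficiently confident sensor reports an
# obstacle close enough to matter, the system should brake, no matter what
# the other sensors say.

from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "camera", "radar" or "lidar"
    distance_m: float  # distance to the detected object along the car's path
    confidence: float  # 0.0 .. 1.0

def should_emergency_brake(detections, stopping_distance_m,
                           safety_margin_m=10.0, min_confidence=0.5):
    """Brake if any confident detection is closer than the stopping
    distance plus a safety margin."""
    threshold = stopping_distance_m + safety_margin_m
    return any(d.confidence >= min_confidence and d.distance_m <= threshold
               for d in detections)

# A dark scene: the camera barely sees the pedestrian, the LiDAR sees her clearly.
detections = [
    Detection("camera", 30.0, 0.2),   # low confidence, should not be decisive
    Detection("lidar", 30.0, 0.9),    # darkness is irrelevant to LiDAR
]
print(should_emergency_brake(detections, stopping_distance_m=25.5))  # True
```

The whole point of such a design is that a single confident LiDAR return inside the stopping envelope should be enough to trigger braking, regardless of what the camera can make out in the dark.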
Somehow, the police footage shows that this entire system failed, as the car hit a lady without any sign of braking. It is understandable that you cannot beat physics, and fatalities cannot be totally avoided: an SUV of this type needs 36-37 meters (118-121 feet) to brake from 100 km/h (62 mph) to zero. These are just rough figures for the 2016 model of the Volvo XC90. The Uber Volvo prototype was driving at 64 km/h (40 mph), with an approximate braking path of some 25 meters (around 80 feet). These are rough estimates that might help you form a mental picture.
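As a rough sanity check on those figures, the textbook stopping-distance formula d = v² / (2a), plus the distance covered during a short reaction delay, lands in the same ballpark. The deceleration and reaction time below are our own assumptions chosen to roughly match the numbers above, not measured Volvo data.

```python
# Back-of-the-envelope stopping distance: d = v^2 / (2a) for the braking part,
# plus the distance covered during the reaction delay. The deceleration
# (10.5 m/s^2) and reaction time (0.5 s) are assumptions picked to roughly
# reproduce the figures quoted above, not official Volvo data.

def stopping_distance_m(speed_kmh, decel_ms2=10.5, reaction_s=0.5):
    v = speed_kmh / 3.6                    # km/h -> m/s
    braking = v ** 2 / (2 * decel_ms2)     # distance covered while braking
    reaction = v * reaction_s              # distance covered before braking starts
    return braking, braking + reaction

for kmh in (100, 64):
    braking, total = stopping_distance_m(kmh)
    print(f"{kmh} km/h: ~{braking:.0f} m braking, ~{total:.0f} m total")
# 100 km/h: ~37 m braking, ~51 m total
# 64 km/h: ~15 m braking, ~24 m total
```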
If someone jumps in front of the car, even if the car reacts in milliseconds, it will still travel quite some distance due to the laws of physics. The real question is what happened on the AI inferencing platform that failed to combine the LiDAR and radar data, as the cameras only spotted the pedestrian when it was too late. This is a big question that might bother Uber, Volvo, the chip and software providers, and the other companies involved in the self-driving platform. In 2017 there were 40,100 fatalities on US streets, a one percent drop compared to the previous year. This doesn’t mean that lives are not precious, but history will remember the 49-year-old lady as the first casualty of an AI and technology failure. Uber is still beta testing, which is why an operator was in the driver’s seat.
The operator also noticed the pedestrian, crossing the street on an unmarked part of the road, too late. The scapegoat part of the story is that the technology companies now want to throw her under the bus because she was not looking at the road. It is very convenient that the poor lady behind the wheel is a convicted felon, too.
Realistically, one can assume that, with or without the technology, this accident was very hard to avoid due to the simple laws of physics.
Who is responsible?
This incident will force Volvo, Nvidia, Uber, and others involved in self-driving to dissect the data and get to the bottom of what happened. To the best of our knowledge, the LiDAR simply maps the surroundings of the street, and it ought to have noticed the pedestrian at the side of the road and raised an alert.
The National Transportation Safety Board (NTSB) is already investigating the accident involving the death of Elaine Herzberg. This looks like just the beginning of the saga, and we will hear more about it from the authorities and the technology companies involved.
What part of the algorithm and software that drove the car failed? Was the data collected by sensors such as the radar and LiDAR misinterpreted? The short video shows the technology failing, but again, you have to realize that this is the first accident of its kind.
Unfortunately for the technology providers and, above all, the pedestrian involved, the first major accident had to be a fatal one. Ultimately, the technology companies will learn a lot from this mistake, which of course comes too late for its first casualty.