There has been plenty of speculation about how it was possible for a high-tech autonomous vehicle to plow into, and kill, a woman pushing a bicycle. Much of the early coverage has been framed by some early comments from the Tempe Police that the woman had apparently jumped out in front of the car. The Department has now released dashcam video from the car, and it is disturbing on a number of levels. The video appears to answer at least some of the questions we posed in our earlier coverage.
The Pedestrian Didn’t Emerge Suddenly Out Of Nowhere
Initial statements from the Tempe Police painted a fairly clear, if ultimately problematic, picture of the pedestrian having suddenly appeared “out of the shadows” — as if she had been invisible behind foliage on the median until the last second — and suggested that the resulting collision might have been unavoidable. However, the dashcam video tells a somewhat different story. While the collision itself might have been unavoidable, there should have been enough time to at least mitigate the impact by braking or swerving. The woman who was killed is already well out into the roadway as the Uber vehicle approaches. She has already slowly walked across the left lane of the two lanes and is in the middle of the right lane even before the vehicle’s headlamps are close enough to illuminate her.
This fact alone should ring alarm bells for anyone expecting driverless cars from Uber any time soon. A woman, pushing a bicycle loaded with her belongings, walks slowly across a four-lane road (two lanes in each direction) and is plowed into without any apparent slowing, swerving, or other reaction from the car’s self-driving systems. You can get a sense of the situation from these frame grabs from a few seconds before the crash. The first shows the woman’s shoes partway across the lane:
You can see the pedestrian’s white shoes at the far edge of the area lit by the car’s low-beam headlights. She is clearly already in the middle of the traffic lane at this point.
A second frame grab shows her coming into full view:
The woman and bicycle were certainly obvious enough that Uber’s lidar and self-driving software should have detected them and reacted.
As some of the readers of our earlier story have noted, the car doesn’t appear to have its high beams on. Whether that is typical for Uber’s cars, or a response to a car up ahead, isn’t yet known. While not having them on might not have affected the car’s own self-driving systems, which rely heavily on lidar, it certainly made it much harder for the safety driver to see the pedestrian.
The Safety Driver Was Barely Paying Attention
Companies testing autonomous vehicles made a deal with regulators to get permission: There will be a safety driver ready to take control when the car’s systems are unable to cope with the situation. The jargon for those situations is “disengagements.” Unfortunately, there aren’t very good regulatory systems for ensuring those drivers are properly trained, instructed in how to act, and monitored for compliance.
Some companies, like Waymo, have been very serious about training and monitoring drivers. But based on this incident and the video, Uber appears not nearly as careful. For much of the interval before the crash, the Uber safety driver is clearly looking down, and not at the road. Whether she is simply staring at her hands, using some type of electronic device, or twiddling her thumbs is unclear from the video. Likewise, it’s hard to tell whether her hands are on the wheel ready to take control, but it certainly doesn’t seem like they are.
Uber’s safety driver was looking down for much of the time immediately preceding the crash, and certainly doesn’t appear to have her hands on the wheel ready to take over.
The Car’s Safety Systems Didn’t Intervene
We don’t know yet what went on in either the Volvo’s advanced safety systems or Uber’s own autonomous-driving software. But there is no evidence the car (or the driver) attempted to slow down or take evasive action before hitting the woman. The crash does illustrate some of the limitations of current automated safety systems, though. For example, the documentation we’ve found states that Volvo’s impressive City Safety system — that can detect both pedestrians and cyclists — is only certified to operate at speeds up to 30mph. The Uber vehicle was traveling at about 40mph. More traditional vehicle collision avoidance systems might still have been operational at that speed, but they are designed primarily to detect and avoid moving vehicles, not stationary people or objects.
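To make the speed difference concrete, here is a rough back-of-the-envelope sketch of how far a hard-braking car travels before stopping from 30mph versus 40mph. The deceleration figure (about 7 m/s², typical for hard braking on dry asphalt) and the optional reaction delay are assumptions for illustration, not measurements from this crash.

```python
def stopping_distance_m(speed_mph: float,
                        decel_mps2: float = 7.0,
                        reaction_s: float = 0.0) -> float:
    """Approximate distance (meters) to stop from speed_mph.

    Assumes constant deceleration once braking begins; decel_mps2 of
    ~7 m/s^2 is a common rule-of-thumb for hard braking on dry asphalt.
    reaction_s models any delay before the brakes are applied.
    """
    v = speed_mph * 0.44704  # convert mph to m/s
    # distance covered during the reaction delay, plus braking distance v^2 / (2a)
    return v * reaction_s + v * v / (2.0 * decel_mps2)

# Compare the two speeds discussed above:
for mph in (30, 40):
    print(f"{mph} mph: about {stopping_distance_m(mph):.0f} m to stop")
```

Under these assumptions, stopping from 40mph takes roughly 23 meters of braking versus about 13 meters from 30mph, and every second of delayed reaction adds nearly 18 meters at 40mph. The point is not the exact numbers but the margin: even braking that starts too late to avoid impact sheds a great deal of speed.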
Autonomous-vehicle industry maven, publisher of Smart Driving Cars, and Princeton professor Alain Kornhauser put the issue fairly succinctly for me: “While a human may not have been able to avoid this crash, a well-designed well-working collision avoidance system should have at least begun to apply the brakes.” Kornhauser went on to make the important point that if the industry is going to learn from tragedies like this, it’s imperative that all the vehicle’s telematics be made public — including data from its lidar, radar, and any other cameras or sensors.
Uber’s Self-Driving Tech, Policies Appear to Have Failed
Overall, the video appears to be fairly damning evidence that Uber’s cars and drivers are not ready for the scale of testing on public roads the company has undertaken. Fortunately, Uber has pulled its test vehicles off the road in response to the crash, but the incident raises a lot of new questions about the company’s commitment to safety and the maturity of its self-driving effort. For example, inward-facing dash cams are used at least in part to monitor safety driver performance. A more responsible company would have fired any driver who exhibited the lack of attention we see in this video. Uber doesn’t seem to have been concerned.
The Case For Intelligent Regulation Is Even Clearer
It’s been clear for a while that the regulation around self-driving vehicle programs has been haphazard, and doesn’t always match the risks and rewards. It’s often driven by a desire to attract exciting, high-tech research labs, or simply for cities to be viewed as leading edge. In particular, current regulations have general requirements for how those testing programs should work, but little or no detail on specific policies around safety driver training, monitoring, and auditing.
Hopefully, the silver lining of a tragedy like this crash will be momentum to improve the regulations about how self-driving technology will be tested and deployed. At a minimum, more controls over how safety drivers are trained and expected to behave are needed. Beyond that, there’s no reason that companies shouldn’t have to prove that their cars can handle a certain level of complexity on their own before they are allowed on the roads — with or without a human safety driver. That is not unlike the way we already test vehicles for crash safety before allowing them on the roads.