Bloomberg News
Spate of Self-Driving Probes Points to Higher Safety Bar

Companies offering driver-assistance systems and developing autonomous vehicles are entering a new phase of more exacting oversight, with the top U.S. auto safety regulator investigating four of the industry's most prominent companies in rapid-fire fashion.
The National Highway Traffic Safety Administration initiated two probes just this week into Waymo and Zoox, the driverless technology subsidiaries of Alphabet Inc. and Amazon.com Inc. While the circumstances are different for Tesla Inc. and Ford Motor Co., whose vehicles offer driver-support systems that require constant supervision, what the companies have in common are car crashes that drew the attention of an agency tasked with rooting out safety defects.
The investigations point to a more hands-on approach by a regulator that has been relatively forbearing of automated-driving systems until recently. While executives and bureaucrats alike have touted the technology's potential to make roadways safer, NHTSA has steadily accumulated information on how the systems are actually faring on streets and highways. It's now setting a high bar: in Zoox's case, for example, the agency initiated a defect probe after just two crashes resulted in minor injuries.
"What's changing is the understanding on the ground of what's happening, in terms of the performance of these vehicles, and the willingness to act," said Bryant Walker Smith, a University of South Carolina law professor who contributed to the driving automation classification system that NHTSA and the broader industry use. "We just know more now."

A Waymo self-driving vehicle sits curbside at the Sky Harbor International Airport Sky Train facility in Phoenix. (Matt York/Associated Press)
Waymo became the latest company to come under the microscope on May 14, when NHTSA announced a probe of its automated driving system, citing 22 incidents in which vehicles "exhibited unexpected behavior."
The agency pointed to reports of Waymo vehicles crashing into gates, chains and parked cars, and potentially violating traffic laws. In some cases, the company's cars drove in opposing lanes with oncoming traffic nearby or entered construction zones.
A day earlier, NHTSA disclosed a probe of Zoox after two of its sport utility vehicles operating in autonomous mode suddenly braked and were rear-ended by motorcyclists.
The agency said it would evaluate how Zoox鈥檚 system behaves in crosswalks and in other similar scenarios where rear-end collisions can happen.
Ford similarly came under investigation after NHTSA received notice of two crashes, though those incidents resulted in three fatalities. In both instances, Mustang Mach-E electric SUVs collided with stationary vehicles at night on controlled-access highways. The manufacturer alerted the agency under a standing order issued in 2021, which requires carmakers to report crashes involving vehicles that had automated driving systems activated.
Tesla has reported the vast majority of crashes under the order, which has contributed to ever-escalating scrutiny of its driver-assistance system, Autopilot. On April 25, the agency opened a probe into whether the EV maker's recall of more than 2 million cars months earlier adequately addressed safety risks that NHTSA identified in the course of a yearslong investigation.

Zoox autonomous vehicles at the company's manufacturing facility in Fremont, Calif. (David Paul Morris/Bloomberg News)
The carmaker led by Elon Musk agreed to the recall after the agency determined that Autopilot didn't sufficiently ensure drivers stayed engaged in the task of driving, and that Autopilot invited drivers to be overconfident in the system's capabilities. Those factors led to foreseeable misuse and avoidable crashes, at least 13 of which involved one or more fatalities, NHTSA said in a filing.
Mark Rosekind, who led NHTSA during the Obama administration, said the agency's review "shows they are not just taking the recall at face value, but looking for data to show it is effective."
Information gleaned so far suggests it hasn鈥檛 been, Rosekind said in an interview. In announcing its recall query, NHTSA cited 20 crashes involving Tesla vehicles that had received over-the-air software updates aimed at plugging Autopilot鈥檚 safety gaps.
Philip Koopman, a professor of computer engineering at Carnegie Mellon University and co-founder of the autonomous vehicle consultancy Edge Case Research, said it's "plausible" that NHTSA could conclude Autopilot can't be operated safely in some or all of Tesla's cars and will need to be deactivated without an effective fix.
Tesla has taken heat for years from safety advocates over how it monitors whether drivers are paying attention while using Autopilot. Critics have pointed out that rivals including General Motors and Ford employ infrared cameras and eye-tracking software seen as more robust than Tesla's approach, which for a time relied solely on detecting whether torque was being applied to the steering wheel.
In summarizing its defect investigation, NHTSA called Tesla's driver-engagement system "weak" and "not appropriate for Autopilot's permissive operating capabilities." The company intended its interior cameras to monitor multiple occupants of vehicles Musk believed would eventually become robotaxis, so they're positioned above the rear-view mirror. A better placement for specifically tracking drivers would be behind the steering wheel.
As NHTSA conducts its latest Autopilot probe, one key question is whether Tesla will need to make more costly hardware fixes to its cars to address the agency鈥檚 concerns.
"Tesla dug this hole themselves," Koopman said, "and now they have to deal with it."