
Speeches

Opening Statement - Crash involving a pedestrian and an Uber test vehicle
Robert L. Sumwalt
Washington, DC
11/19/2019

Good afternoon and welcome to the Boardroom of the National Transportation Safety Board.

I am Robert Sumwalt, and I’m honored to serve as the Chairman of the NTSB. Joining us are my colleagues on the Board, Vice Chairman Bruce Landsberg and Member Jennifer Homendy.

Today, we meet in open session, as required by the Government in the Sunshine Act, to consider the collision between a vehicle controlled by a developmental automated driving system and a pedestrian in Tempe, Arizona, on March 18, 2018. The pedestrian was pushing a bicycle across a street mid-block when the vehicle struck her at 39 miles per hour. Tragically, she died in the crash.

On behalf of my colleagues on the Board and the entire NTSB, I offer our sincerest condolences to the family and friends of the woman who lost her life. Please understand that the NTSB’s purpose today is to prevent similar crashes in the future.

Last year, the NTSB published a pedestrian safety report. Crossing mid-block, this pedestrian was at particularly high risk of being struck, regardless of the striking vehicle.

But today is also about the vehicle that struck her, the safety culture of the company that was testing it, and the industry, state, and federal safety risk management requirements for testing automated driving systems on public roads.

The crash vehicle was operated by Uber’s Advanced Technologies Group (or, Uber ATG). It was a Volvo XC90 that Uber ATG had modified with a proprietary developmental automated driving system.

It was not a self-driving car. We are not there yet.

The crash vehicle could drive on premapped routes with an attentive human operator. But as has been widely reported, the vehicle’s operator was not paying attention at the time of the crash.

Furthermore, although the vehicle’s sensors first detected the pedestrian 5.6 seconds before impact, the system alternated between classifying her as a vehicle, a bicycle, and “other.”

We will discuss Uber ATG’s decision to deactivate Volvo’s factory-installed forward collision warning and automatic emergency braking systems while the vehicle operated in the automated driving system test mode.

We will also consider Uber ATG’s post-crash actions regarding these systems. But the lessons of this crash do not apply only to Uber ATG, and they are not limited to “something went wrong and now it’s fixed.”

Rather, “something went wrong and something else might go wrong, unless it is prevented.”

After the crash, as specific technical limitations and safety risks were understood, Uber ATG learned from them and improved its system. But at the time of the crash, the company was not continually monitoring its own operations to avoid or mitigate potential risks.

The inappropriate actions of both the automated driving system, as implemented, and the vehicle’s human operator were symptoms of a deeper problem: an ineffective safety culture that existed at the time of the crash.

In organizational safety, a safety management system (SMS) helps to build an effective safety culture. An SMS includes safety policy, safety risk management, safety assurance, and safety promotion. I am glad we’ll have an opportunity to discuss SMS today.

But this crash was not only about an Uber ATG test drive in Arizona. This crash was about the testing of developmental automated driving systems on public roads. Its lessons should be studied by any company testing in any state.

If your company tests automated driving systems on public roads, this crash was about you. If you use roads where automated driving systems are being tested, this crash was about you. If your work touches on automated driving systems at the federal or state level of government, this crash was about you.

For anybody in the automated driving systems space, let me be blunt. Uber ATG is now working on an SMS. Are you? You can. You do not have to have the crash first.

In today’s Board meeting, the staff will lay out the pertinent facts and analysis found in the draft report. They will present findings, a probable cause, and recommendations to the Board. We on the Board will then question staff to ensure that the report, as adopted, truly provides the best opportunity to enhance safety.

The public docket for the report contains 439 pages of additional information, including factual reports of investigation, submissions by Uber ATG and Volvo, and other relevant material, and is available at www.ntsb.gov. Once finalized, the accident report will also be available at www.ntsb.gov.

Now Deputy Managing Director Paul Sledzik, if you would kindly introduce the staff.