NTSB Entry Raises the Stakes of Tesla Probe
by Roger C. Lanctot on 07-10-2016 at 12:00 pm

The National Transportation Safety Board’s entry into the investigation of the first fatal crash of a Tesla Model S is a monumental turning point in the autonomous driving movement. While long-time observers of the NTSB note that it only gets involved in investigations with broader implications, the agency’s interest also reflects the fact that the National Highway Traffic Safety Administration lacks the technical ability to properly investigate a crash cause that is likely tied to a software failure.

As in the case of Toyota’s unintended acceleration fatalities, recalls and penalties, software is chiefly implicated in the fatal Tesla crash in Florida. In the Toyota case, NHTSA turned to the National Aeronautics and Space Administration for help, and NASA ultimately turned to outside experts who criticized what they described as Toyota’s “spaghetti code.”

The source of the unintended acceleration in the Toyota Prius remains unresolved, but the primary lesson from the experience was the realization of the investigative limitations of the automotive industry’s primary regulatory agency. Those limitations, a legacy of the agency’s reduction in size going back to the Reagan Administration, remain uncorrected.

As a result, NHTSA lacks the fundamental expertise necessary not only to investigate the crashes of autonomous vehicles but also to evaluate the performance of these vehicles or even to properly set guidelines. This is a big problem for the industry and for the motoring public, and it leaves individual state authorities in the awkward position of blindly cobbling together their own rules and guidelines for the operation of self-driving vehicles on local and interstate highways.

Tesla and Google have more or less been left in the position of regulating and policing themselves. This is less of a problem for Google since its vehicles are not made available to the general public and generally operate at low speeds on local roads. It’s a different matter for Tesla Motors.

Ironically, Google lobbied the California Department of Motor Vehicles to leave autonomous vehicle regulatory oversight and guidance to Federal authorities in the form of NHTSA. During last year’s California DMV hearings on the subject, Google lobbyists and executives (including at least one former NHTSA executive, Ron Medford) implored the California DMV to relinquish its authority over Google’s local on-road testing activities.

Google’s argument before the California DMV was that the state agency was incapable of comprehending, let alone evaluating, the self-driving software Google was developing and deploying. It is clear that Google’s pleas were a cynical play to shift control to an agency over which it felt it had greater influence – knowing all along that NHTSA, too, lacked the necessary expertise and resources.

But the cynicism of Tesla’s CEO, Elon Musk, puts Google’s cynicism to shame. Musk has opened his own traffic court, serving as judge, jury, witness and prosecution. With each new Model S crash, Musk is quick to provide his assessment of fault – which nearly always lies with the driver – absolving himself and his company of responsibility.

The fatal crash in Florida is the first instance of Tesla acknowledging a potential flaw in its software and sensing architecture. Still, Musk fell back on the various caveats for use of the Autopilot system, including the requirement that drivers keep their hands on the wheel, intended to release Tesla from any responsibility.

States such as California have begun insisting on full disclosure of self-driving car crash data – especially in the case of Google. Tesla is technically not offering a self-driving car, but state and Federal authorities may soon begin insisting on the same kind of crash data sharing.

Transportation network companies (TNCs) such as Lyft and Uber, which like Google and Tesla operate outside the normal regulatory bounds, are also being asked to disclose data about their drivers, crashes and other incidents. It seems that the battle for the next generation of transportation technology is evolving into a battle for data.

The manner in which the fatal Tesla crash has exposed NHTSA’s software blind spot has wider implications for the government’s role in redefining transportation safety. Safety in transportation is increasingly determined by software systems. NTSB’s decision to enter NHTSA’s investigation suggests that NHTSA itself may not be up to the very task it has set itself: promoting collision avoidance and autonomous driving.

Ultimately, this calls into question NHTSA’s plans to mandate the implementation of vehicle-to-vehicle wireless communications for the purpose of crash avoidance, as well as its ability to comprehend, provide guidelines for and regulate the process of self-driving software development and sensor fusion. The arrival of the NTSB on the Tesla crash scene is an acknowledgement within the regulatory community that NHTSA is out of its depth, unequal to the task.

Until such time as the NTSB, NHTSA or NASA can sort out which agency has the scope and expertise to oversee autonomous vehicle development and deployment, we are likely to see an ongoing and expanding free-for-all on U.S. highways. Such a free-for-all may lead to more fatalities, technological advancement or simple chaos – maybe all of the above.

Someone in Washington needs to sort out the government’s role and properly fund the relevant agencies so that progress is achieved successfully and safely. The alternative will be a widespread call from safety advocates that all autonomous driving testing, development and deployment cease. Since autonomous driving technology is intended to save us all from ourselves, that can’t be the outcome we want to see.

Roger C. Lanctot is Associate Director in the Global Automotive Practice at Strategy Analytics. More details about Strategy Analytics can be found here: https://www.strategyanalytics.com/access-services/automotive
