
NHTSA Investigates a Fatal Crash Involving a Vehicle Driving Itself on Autopilot

Published on Jul 14, 2016 at 2:52 pm in Auto Accident, Auto Product Liability.

A fatal accident that occurred in Williston, Florida, on May 7 is now under investigation by the National Highway Traffic Safety Administration (NHTSA). Joshua Brown, a 40-year-old man from Canton, Ohio, died when his Tesla Model S electric sedan crashed into the side of a tractor-trailer. The car was in self-driving mode and failed to apply the brakes when the tractor-trailer made a left turn in front of it. Federal regulators, who are in the early stages of setting guidelines for autonomous vehicles, have opened a formal investigation into the incident. This is the first known fatal accident involving a vehicle controlling itself by means of computer software, sensors, cameras, and radar.

Tesla is a unique car manufacturer, specializing in expensive, technologically sophisticated electric vehicles. The company has been looking to expand its product lineup to include more mainstream models; however, this fatal accident has put a hold on any forward progress. The crash casts doubt on whether the technology that allows vehicles to drive themselves is advanced enough to be called safe. In a news release regarding the accident, Tesla said, “Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.”

Several other automakers have made recent investments to study and advance the technology needed for autonomous vehicles. Google recently announced plans to equip 100 Chrysler minivans with self-driving features. And earlier this year, General Motors acquired the software firm Cruise Automation to accelerate its own self-driving applications. These companies are currently conducting numerous safety tests at private facilities and on public highways. The federal traffic safety agency is nearing the release of a new set of guidelines and regulations for testing self-driving vehicles on public roads. However, many are now skeptical that the government will approve cars that drive themselves entirely anytime soon. This accident casts doubt on whether the vehicles can consistently make split-second, life-or-death driving decisions on the highway.

Tesla said in a news release that it notified the federal traffic safety agency about the accident “immediately after it occurred,” but the company did not report it publicly until last Thursday, when it learned the agency had begun to investigate. In a statement on Thursday, the company cautioned that the self-driving mode was still only a test feature that “requires explicit acknowledgment that the system is new technology.” When a driver activates the autopilot system, an acknowledgment box pops up, explaining that autopilot mode “is an assist feature that requires you to keep your hands on the steering wheel at all times.”

At a recent technology conference in Novi, Michigan, Mark Rosekind, the administrator of the NHTSA, said that self-driving cars would need to be at least twice as safe as human drivers to make any meaningful impact on roadway safety. Karl Brauer, an analyst with Kelley Blue Book, said the accident might be an indicator that the technology is not as advanced as some automakers claim it to be. Brauer said, “This is a bit of a wake-up call. People who were maybe too aggressive in taking the position that we’re almost there, this technology is going to be on the market very soon, maybe need to reassess that.”

If you wish to learn more about this accident, Tesla vehicles, or the safety regulations surrounding vehicles with autopilot features, contact Louisville, KY auto product liability lawyer Tad Thomas of Thomas Law Offices.