
Another Fatality in Autopilot Vehicle Accident Raises Concerns


Unfortunately, self-driving vehicles are once again on the hook for serious injuries and fatalities in a series of auto accidents. A mere week after an Uber vehicle in self-driving mode killed a pedestrian in Arizona, a Tesla Model X was involved in a fatal crash at the end of March, raising serious questions about these semi-autonomous systems.

According to Tesla, although the Autopilot system was activated in the most recent accident, the driver failed to take action right before the crash, even though the system had delivered warnings. The National Highway Traffic Safety Administration and National Transportation Safety Board have now launched official investigations into the crash, in addition to investigating another Tesla semi-autonomous vehicle accident from January.

Is It Reasonable to Expect a Driver to Take Over if Autopilot Steers Them Towards an Accident? 

Autopilot allows drivers to take their hands off the wheel for extended periods under specific conditions, although Tesla reportedly requires that “users” keep their hands on the wheel at all times.

Yet how reasonable is it to expect a driver to entirely prevent an accident that the system is steering them towards, with only five seconds' warning before the crash occurs? In this case, the Autopilot system completely failed to detect a concrete divider. If Autopilot steers you towards a concrete divider on a highway, is it even possible to avoid injuries or property damage as you try to steer away during those last few seconds after receiving the warning?

Tesla Previously Found Culpable

It is also important to note that the National Transportation Safety Board has faulted Tesla before in a fatal Autopilot crash involving a Tesla Model S, which killed a driver who was using Autopilot in May 2016. The Board indicated that Tesla could have done more to prevent the system from being misused by drivers who fail to pay attention and instead over-rely on vehicle automation.

The Future For Self-Driving Cars: Are They Safe?

Meanwhile, the industry itself is seeking federal legislation that would ease the deployment of self-driving cars. Tesla insists that Autopilot makes accidents much less likely, citing one fatality for every 320 million miles driven in Autopilot-equipped vehicles versus one fatality every 86 million miles across all vehicles. Tesla also indicated that it was recalling Model S sedans built prior to April 2016 in order to replace bolts in the power steering component.

Speak With Our Florida Auto Accident Attorneys

If you have been the victim of a self-driving (or conventional) auto accident, you need to speak with an experienced accident and injury attorney right away to ensure that your rights are protected and that you preserve your right to compensation for injuries and other costs resulting from an accident that was not your fault. Contact the office of Friedland & Associates today to find out how we can help.

Resource:

cnbc.com/2018/03/30/tesla-says-crashed-vehicle-had-been-on-autopilot-prior-to-accident.html


© 2016 - 2018 Friedland & Associates. All rights reserved.