14 September, 2021
As autonomous vehicle technology continues to evolve, more states are considering enacting legislation regarding self-driving vehicles. The predicted benefits of autonomous cars include a 90% reduction in traffic deaths, a 60% decrease in harmful emissions, and a 10% improvement in fuel economy, among others.
Autonomous vehicles come in different levels of automation: some are fully automated, while others are only partially self-driving. Unlike partially autonomous cars, which require human assistance, fully self-driving vehicles need none. The problem arises when these cars get into accidents and liability must be proven. This article explains who may be liable when an autonomous vehicle causes an accident.
The manufacturers
In a typical car crash, it's easy to determine liability because the drivers involved are human. When an autonomous car causes an accident, however, the burden of proof falls on certified car accident lawyers in Florida. As a lawyer, you may argue that if a defect or design decision in the vehicle causes accidents in particular circumstances, the manufacturer likely knew (or ought to have known) about it, sold the defective car anyway, and should be held responsible for any resulting liabilities.
The manufacturer may also be found liable for failing to provide instructions on how an autonomous car should be used, or for failing to give adequate risk warnings to vehicle users.
The drivers
Human error is the leading cause of self-driving car accidents. Since some self-driving cars are only partially automated and require human assistance, car accident attorneys may argue that an accident happened due to negligence on the driver's part. The driver, as the supervising operator, is expected to pay attention to the road and traffic and to intervene immediately when the need arises to avoid an accident.
The driver's liability would then lie in the failure to pay attention and step in to avoid the accident. Although it can be challenging to foresee danger, the driver can only be held liable where they were expected to anticipate an accident and act swiftly to avoid it. Additionally, the driver may be held responsible for accidents resulting from their failure to update the vehicle's software.
The driver may be held responsible under strict liability
Strict liability means that a driver may be held responsible even if they had no duty or ability to intervene when the vehicle was about to cause an accident. This may apply where the car is fully automated. The reasoning would be that the driver chose to use the vehicle knowing that doing so could pose a danger to themselves and other road users, and must therefore answer for the consequences.
The software developer
Autonomous vehicles are designed using sophisticated technologies, including autonomous emergency braking, blind-spot detection, lane-keeping assist, traffic jam assist, vehicle-to-vehicle communication, and adaptive cruise control, among others. These technologies are meant to detect danger and, for example, apply the brakes to avoid a collision. If the car fails to detect a risk or breaks traffic rules and causes an accident, the company that developed the software may be liable for the resulting damages.
Cybercriminals
Since self-driving vehicles are operated by software and connected systems, they're at risk of malicious attacks. When cybercriminals hack the car's system, its operation may be compromised, resulting in road accidents. It's up to the car accident lawyers to prove that the vehicle's system was broken into and that the breach caused the accident.
The oversight authority
In a case concerning an autonomous car accident, the federal government may be found culpable for failing to provide the oversight required during the testing and eventual adoption of self-driving vehicles.
While the state laws and federal guidelines currently in place have adopted a pro-technology approach, they don't adequately protect other road users. The National Transportation Safety Board (NTSB) has faulted the government for relying on companies to voluntarily hand over safety information to federal regulators. The oversight authority may be held liable for not putting regulations in place to address the risks associated with autonomous vehicle accidents.
The state
Each state has the responsibility of ensuring that there are laws and regulations governing autonomous car accident liability. Failure to do so may leave the state responsible for damages arising from a self-driving car accident.
States that have such laws in place provide guidelines on how liability should be treated. For example, Michigan has enacted legislation that limits the manufacturer's liability for damages arising from modifications made by a third party to an automated vehicle. A state can be held liable for not providing guidelines on how liability should be allocated among highly automated vehicle (HAV) owners, passengers, operators, manufacturers, and others when an accident occurs.
Endnote
Although it's difficult to determine liability for an autonomous car accident, someone must be held responsible for the resulting damages. These tips will help you prove liability in a self-driving car accident lawsuit.