While some drivers prefer having full control over their vehicle on the road, many others like partially autonomous vehicles to ease the stresses of driving. Today’s Level 2 self-driving features include lane keeping and adaptive cruise control, the latter keeping the EV safely spaced in traffic. Several companies offer hands-free driving on mapped, compatible roads, including Ford with its BlueCruise and General Motors with its Super Cruise.
Drivers still need to stay alert and watch the road, ready to take over at a moment’s notice. Engineers will eventually design fully autonomous cars in which none of the human occupants will serve as a driver.
Passengers will choose a destination and then watch movies, read, sightsee, or nap while their robotic car takes them there. While a few cars like the BMW i7 achieve near Level 3 capability, drivers currently need to remain ready to take over when needed.
Whether or not the car is actually hands-free during some of its operations, like a Ford or GM vehicle on compatible roads, today’s self-driving systems greatly cut down on direct driver inputs to the controls. Drivers experience less fatigue and stress when the car handles some of driving’s “drudge work,” even if they need to keep their hands on the wheel.
People remain skeptical of self-driving EVs’ safety and their potential liability issues, however. Here’s the score on who gets the blame in collisions involving self-driving EVs.
Operating the Car as Normal
A driver who switches off Level 2 driver assistance and maintains full control of the car has the same responsibility in an accident as one operating a vehicle that lacks those features. An EV owner might choose to drive this way out of personal preference.
It could also be necessary on very poor roads where the self-driving programs can’t get enough data to operate successfully. People who operate their EVs with self-driving features fall under the same liability rules that have applied to drivers for decades.
If the driver collides with a pedestrian, another vehicle, or an object, they will likely be found at fault. If another vehicle crashes into them, the other driver is more likely to be held liable, though the circumstances can modify that.
Investigators can usually determine if a self-driving system was engaged at the time of an accident. Attention monitoring systems and the self-driving system’s memory both provide pertinent records. But if the impact destroys either one of these systems, or if a self-driving feature malfunctions, determining their liability could be trickier.
Do Self-Driving Cars Get Into Accidents?
Human error causes the great majority of accidents, which is why engineers design self-driving driver assistance to improve safety. Advocates of self-driving want to take the human element out of the driving equation to make roads safer.
Despite this, cars with self-driving features still get into accidents regularly. Drivers cause the vast majority of them, since Level 2 self-driving involves plenty of human input. Advocates argue that fully automated systems would nearly eliminate traffic accidents.
Self-driving EVs still get into fatal collisions with other cars and even pedestrians. In 2018, a self-driving Uber test vehicle hit and killed Elaine Herzberg, a homeless Arizona resident, as she was crossing the road. Other accidents fortunately involve no injuries or deaths.
An Autopilot-driven Tesla Model S rear-ended a California fire engine on a sunny January day in 2018. The Tesla was moving at 65 mph when it smacked into the fire brigade’s Engine 42. The EV was dramatically totaled, but every human escaped unscathed.
These incidents happened in 2018, but EVs equipped with self-driving features continue to get into accidents to this day. An NHTSA study cataloged 392 crashes and six fatalities involving self-driving systems. The agency’s research spanned roughly 10.5 months, from July 1st, 2021, to May 15th, 2022. Tesla Autopilot was engaged during 273, or roughly 70%, of the crashes.
The NHTSA allowed EV manufacturers to redact a lot of vehicle-specific information from the study before its public release. Drivers and insurers alike are left with many questions even after the publication of the agency’s research.
Self-driving systems may make vehicles safer and reduce the severity of accidents when they do happen. Automation may also prevent accidents that would otherwise occur. The NHTSA study doesn’t address either question, however, so more independent research is needed.
Who Is Liable in a Crash?
Even experts find determining liability in an ordinary car crash a difficult, complicated process in many cases. People may determine fault easily in some straightforward crashes, while others may need detailed investigation.
State lawmakers also create a patchwork of different rules across the country, each with its own method of assigning responsibility. Many state legislatures have decreed that liability should be shared between the involved parties.
Self-driving vehicles add another layer of complexity to figuring out who’s at fault. The Tempe, Arizona accident mentioned above shows just how tangled assigning liability gets when self-driving technology is involved.
The Uber EV was operating autonomously when Elaine Herzberg stepped out into the road, pushing her bicycle. Herzberg attempted to cross the road at night at an intersection notorious for vehicles hitting pedestrians. The Uber vehicle struck her at 39 mph and inflicted fatal injuries.
Rafaela Vasquez, the backup driver in the car, was allegedly streaming “The Voice” on her smartphone when the collision happened. The state charged Vasquez with negligent homicide two years later, in September 2020. The court didn’t start the trial until spring 2022, delayed by complex evidence-gathering about the autonomous driving system.
The NTSB Ruling and Responsibility
The National Transportation Safety Board (NTSB) also issued findings about liability in the case. According to Science Daily, it split responsibility five ways: between Rafaela Vasquez, the pedestrian Elaine Herzberg, Uber, the vehicle itself, and the state of Arizona. Despite the four years since the incident, the courts have yet to settle financial and criminal liability with a verdict.
Legal authorities and insurance providers will almost certainly hold drivers at least partly at fault in a self-driving EV collision. Drivers can find themselves exposed to criminal charges and monetary responsibility.
However, many states also allow at-fault drivers to receive some compensation for an accident. The compensation may be reduced by the percent of fault officially assigned to the driver.
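As a rough sketch of that comparative-fault arithmetic: the exact rules vary by state, and the 50% recovery bar used below is an illustrative assumption (some states use 51%, others allow recovery at any fault level, and a few bar it entirely), not a universal rule.

```python
def comparative_compensation(damages, fault_pct, bar_pct=50):
    """Estimate recoverable damages under a modified comparative-fault rule.

    damages:   total damages suffered by the driver, in dollars
    fault_pct: percent of fault officially assigned to the driver (0-100)
    bar_pct:   fault threshold at or above which recovery is barred
               (illustrative; the real threshold depends on the state)
    """
    if fault_pct >= bar_pct:
        return 0.0  # too much at fault to recover anything
    # Compensation is reduced by the driver's share of the fault.
    return damages * (100 - fault_pct) / 100

# A driver found 30% at fault for a $20,000 loss recovers $14,000:
print(comparative_compensation(20_000, 30))  # 14000.0
# At 60% fault, a 50%-bar state would award nothing:
print(comparative_compensation(20_000, 60))  # 0.0
```

The takeaway is simply that the official fault percentage directly scales, and can entirely eliminate, what an at-fault driver recovers.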
A driver will likely be exempt from responsibility if the self-driving tech catastrophically malfunctions. So, if the autonomous driving feature goes haywire and causes a collision, the courts may assign liability to the manufacturer.
Drivers of non-self-driving vehicles would benefit from a similar exception if a major component was defective and failed, causing an accident. Examples would include total brake failure, a stuck accelerator pedal, or a suddenly detaching steering wheel.
Self-Driving Accidents and Insurance
Insurers may charge higher premiums for vehicles with Level 2 self-driving. EV insurance premiums already run roughly 15% higher than those for comparable ICE passenger vehicles. Individuals can expect higher rates following an accident, too, just as with ordinary driving. If the claims assessor finds the driver even partially at fault, the provider may raise premiums even higher.
Critical Hit Technology reports that collision claims could be more expensive for several reasons. Specialized garages able to repair damaged self-driving systems’ sophisticated electronics charge steep prices. An insurance claims assessor may experience difficulty figuring out accident liability, racking up higher processing costs.
Most insurers probably already include the extra costs of self-driving in their standard EV insurance premiums. However, the complications of self-driving liability may keep EV insurance premiums elevated even as electric vehicles get more common.
Bottom Line: Who Gets the Liability with Self-Driving?
At this point, EV self-driving and legal liability are still evolving. Solid information is hard to obtain, and lawmakers’ responses to the technology vary greatly between states.
Right now, however, it appears that the driver will bear part of the liability for accidents even when their EV is operating hands-free. Factors like driver attention, as recorded by attention monitoring systems, will play a role in assigning liability, too.
The NTSB’s ruling in the Herzberg case shows that practically everyone involved in a self-driving EV accident will likely be held partially responsible. This could even include the vehicle’s manufacturer.
Of course, if self-driving systems are successful in reducing human error, this could cut back on accidents and the risk of liability overall, even if the few remaining cases are very complex. For now, though, self-driving liability applies to everyone in an accident. It’s also a legally and technically complicated issue that’s unlikely to get simpler anytime soon.