Vehicular manslaughter charges filed in Los Angeles earlier this year mark the first felony prosecution in the US of a fatal car crash involving a driver-assist system.
In late 2019, Kevin George Aziz Riad’s car exited a California freeway at high speed, ran a red light, and crashed into another car, killing the two people inside. Riad’s car, a Tesla Model S, was on autopilot.
Los Angeles County prosecutors filed two counts of vehicular manslaughter against Riad, now 27. The case is also the first criminal prosecution of a crash involving Tesla’s autopilot function, which is found on more than 750,000 cars in the US. Meanwhile, the crash victims’ family is pursuing civil suits against both Riad and Tesla.
Tesla is careful to distinguish between its autopilot function and a driverless car, comparing its driver-assist system to the technology airplane pilots use when conditions are clear.
“Tesla autopilot relieves drivers of the most tedious and potentially dangerous aspects of road travel,” Tesla states on its website. “We’re building autopilot to give you more confidence behind the wheel, increase your safety on the road, and make highway driving more enjoyable.… The driver is still responsible for, and ultimately in control of, the car.”
The electric vehicle manufacturer clearly places the onus of safety on the driver, but research suggests that humans are susceptible to automation bias, an over-reliance on automated aids and decision support systems.
Now it’s up to the courts to decide who is culpable when the use of those systems results in fatal errors. Riad is currently out on bail and has pleaded not guilty to the manslaughter charges.
Here, Mark Geistfeld, professor of civil litigation at New York University and author of a new paper in the California Law Review, discusses the significance of the criminal charges and what they might mean for the future of consumer trust in new technology: