Rafaela Vasquez, the Uber safety driver behind the wheel in the first fatal crash in which an autonomous vehicle struck a pedestrian, has pleaded guilty and been sentenced on an endangerment charge. Vasquez received three years of probation for her role in the 2018 collision in Tempe, Arizona that killed Elaine Herzberg as she crossed the street illegally at night. The sentence is harsher than the six months the defense requested, but in line with what prosecutors sought.
The prosecution argued that Vasquez bore ultimate responsibility for the crash. While an autonomous vehicle was involved, Vasquez was tasked with watching the road and intervening when necessary. The modified Volvo XC90 in the incident had Level 3 autonomy, which permitted hands-free operation under certain conditions but required the driver to be ready to take control at a moment's notice. The vehicle detected Herzberg, but it did not act in response to her presence.
The defense argued that Uber shared the blame. According to leaked conversations cited by the defense, company executives allegedly expected an accident to happen at some point. The National Transportation Safety Board (NTSB), in its findings from the collision investigation, noted that Uber had deactivated the XC90's emergency braking system, leaving the vehicle incapable of stopping on its own.
According to the Tempe police, Vasquez was not focused on the road but was watching a show on Hulu at the time of the crash. However, the defense maintains that Vasquez was attentive and her distraction was only momentary.
The guilty plea and sentencing could shape how similar incidents are handled in court going forward. There has been ongoing debate over who holds liability in accidents involving semi-autonomous vehicles: the human driver or the car's manufacturer. This case indicates that humans can still be held accountable and penalized if they have the ability to intervene, though the penalties may be less severe than in conventional cases.
Fatal accidents involving autonomous technology are not unprecedented. Tesla has faced partial blame for collisions that occurred while its Full Self Driving feature was engaged. A case involving a pedestrian is exceptional, however, and it hangs over the more recent Level 4 (fully driverless in specific situations) tests and services from GM's Cruise and Waymo. Despite technological advances since 2018, some voices are calling for a halt to autonomous taxi launches over safety concerns.
Frequently Asked Questions (FAQs) about Autonomous Car Crash Liability
Who was the safety driver involved in the 2018 fatal autonomous car crash?
The safety driver was Rafaela Vasquez, who was working with Uber at the time of the incident.
What was the sentence for the Uber safety driver involved in the fatal crash?
Rafaela Vasquez was sentenced to three years of probation after pleading guilty to an endangerment charge.
What was the defense’s argument in this case?
The defense argued that blame should be partially placed on Uber because the company had allegedly anticipated an accident and had deactivated the vehicle’s emergency braking system. The defense also insisted that Vasquez was generally attentive and her distraction was only momentary.
How could this case influence future autonomous driving liability?
This case could set a precedent that humans can still be held accountable and penalized in accidents involving semi-autonomous vehicles if they have the ability to intervene, although the penalties might be less severe than in traditional situations.
Were there any other entities at fault according to the National Transportation Safety Board (NTSB)?
According to the NTSB’s collision investigation findings, Uber had disabled the Volvo XC90’s emergency braking system, which made it incapable of an immediate stop.
Has this case affected the roll-out of autonomous vehicles?
While the technology has evolved since the 2018 incident, the case looms over the more recent Level 4 tests and offerings from companies like GM’s Cruise and Waymo. Some voices are calling for a halt in the deployment of autonomous taxis due to potential safety risks.
More about Autonomous Car Crash Liability
- Uber’s autonomous vehicle safety protocol
- NTSB’s report on the 2018 Uber self-driving car crash
- Legal challenges of autonomous vehicles
- The state of autonomous vehicle technology in 2023
- Tesla’s Full Self Driving feature
5 comments
Honestly this is so scary. Fully driverless cars still need a lot more testing, we can't just trust the tech blindly yet!
tragic incident. but let’s not forget the potential of autonomous cars to reduce overall accidents in future. Hope they learn and improve the tech.
i’ve said it before and I'll say it again, nothing beats human intuition when it comes to driving, these AI driven cars are a nightmare waitin to happen…
3 years of probation, that’s it? I mean she was watchin hulu while drivin, someone lost their life ‘coz of that!
So, they disabled the emergency brakes and still blame the driver. Sounds like the company shud take some responsibility too.