Grieving family blames Elon Musk for son’s death after his Tesla crashed in ‘autopilot’ mode

The family of a 31-year-old man who died when his Tesla crashed while using the company’s self-driving “Autopilot” technology has filed a lawsuit, bringing Elon Musk and the company under renewed scrutiny.

The Genesis

On February 18, 2023, Giovanni Mendoza Martinez died when his Tesla, which was in Autopilot mode, collided at high speed with a firetruck.

According to the suit, Mendoza trusted the vehicle’s self-driving capability after being swayed by Tesla’s advertising.

His parents, Eduardo and Maria, and his brother Caleb, who was injured in the collision, are now holding Tesla and its CEO, Musk, responsible.

Attorney and Family Speak Out

The Mendoza family’s lawyer, Brett Schreiber, called the tragedy “completely preventable.” He accused Tesla of using public roads as test sites for its autonomous driving technology and criticized the company’s autopilot feature as “ill-equipped to perform.”

“This is just another instance of Tesla conducting research and development on our public roads for its autonomous driving technology,” Schreiber told The Independent. “Even worse, Tesla is aware that a large number of its older models are still on the road today with the same flaw, endangering the public and first responders.”

According to the lawsuit, Tesla’s heavy promotion of the self-driving capability, which was positioned as safer than a human driver, convinced Mendoza and “many members of the public” to purchase the vehicle. Schreiber says Mendoza believed Musk’s boasts and trusted the car to drive itself on highways.

“Not only was he aware that the technology itself was called ‘Autopilot,’ he saw, heard, and/or read many of Tesla or Musk’s deceptive claims on Twitter [now X], Tesla’s official blog, or in the news media,” according to the complaint.

“Giovanni believed those claims were true, and thus believed the ‘Autopilot’ feature with the ‘full self driving’ upgrade was safer than a human driver, and could be trusted to safely navigate public highways autonomously.”

Tesla Defends Its Technology

Tesla, however, has rejected liability, claiming its cars have a “reasonably safe design” in accordance with relevant state law.

The company argues that Mendoza’s “own negligent acts and/or omissions” may have contributed to the crash, and claimed in a court filing that “no additional warnings would have, or could have, prevented the alleged incident.”

Information about the Crash

The lawsuit claims that the Tesla had been in Autopilot mode for 12 minutes prior to the collision, traveling at an average speed of 71 mph. In addition to Mendoza’s death, four firefighters suffered minor injuries in the crash with the firetruck.

This is not an isolated incident. According to the LA Times, between 2015 and 2022, Tesla drivers filed over 1,500 complaints of unexpected, abrupt braking and reported 1,000 crashes while using the Autopilot technology.

What Does Tesla’s ‘Autopilot’ Feature Do?

On the company’s official website, the Autopilot feature is described as “an advanced driver assistance system that enhances safety and convenience behind the wheel,” and the company states that, used correctly, it reduces the driver’s overall workload.

The statement goes on to say: “Autopilot, Enhanced Autopilot, and Full Self-Driving capability are designed to be used with a driver who is completely focused, their hands on the wheel, and ready to take control at any time.” While these features are intended to become more capable over time, Tesla notes that they do not currently make the vehicle autonomous.

Federal Criticism of Tesla

Government officials have also reacted negatively to the incident, among them Transportation Secretary Pete Buttigieg, who has repeatedly voiced his disapproval of Tesla’s Autopilot technology.

“I don’t think that something should be called, for example, an Autopilot, when the fine print says you need to have your hands on the wheel and eyes on the road at all times,” Buttigieg told the AP last year.

In fact, less than two weeks before Mendoza’s collision, Buttigieg tweeted: “Reminder—ALL advanced driver assistance systems available today require the human driver to be in control and fully engaged in the driving task at all times.”
