TL;DR
- Florida jury rules Tesla one-third liable in 2019 Autopilot crash that caused a death and serious injury.
- Driver George McGee blamed for two-thirds of the fault after activating Autopilot and taking his eyes off the road.
- Jury awards $43 million in compensatory damages; punitive damages may follow as scrutiny mounts on Tesla’s marketing.
- Crash echoes other Tesla Autopilot failures involving trucks; NHTSA highlights the risk of driver inattention.
Tesla is once again under the spotlight after a Florida jury ruled the company partially liable for a deadly 2019 crash involving its Autopilot feature.
The verdict awarded $43 million in compensatory damages to the victims’ families, with additional punitive damages still to be determined.
The crash occurred when driver George McGee engaged Autopilot and took his eyes off the road, striking pedestrians Naibel Benavides Leon and Dillon Angulo. Benavides Leon died at the scene, while Angulo sustained severe injuries.
While McGee was assigned two-thirds of the blame, the jury held Tesla accountable for the remaining third, finding that the company’s representation of Autopilot contributed to the crash.
Autopilot’s Blind Spot With Trucks
This isn’t an isolated case. Tesla’s Autopilot has faced increasing scrutiny due to its repeated failures to detect large, slow-moving, or stationary objects, particularly trucks.
Investigations reveal a troubling pattern in which Tesla vehicles, traveling at highway speeds, collide with semi-trailers while Autopilot is engaged.
In the 2019 Florida crash, McGee activated Autopilot just 10 seconds before impact. According to the National Transportation Safety Board (NTSB), the Tesla was traveling at 68 mph and McGee hadn’t touched the wheel for eight seconds prior to the collision.
Experts argue that Tesla’s camera-only vision system lacks the capacity to reliably detect large cross-path vehicles, unlike systems from competitors like Waymo, which use lidar and radar for redundancy.
Legal Precedent Emerges
The Florida verdict marks a shift in how courts are treating semi-autonomous systems. Previously, Tesla had largely avoided liability by attributing crashes entirely to driver misuse. But this case turned on a different question: Did Tesla’s marketing give users a false sense of security?
The plaintiffs successfully argued that Tesla’s branding misled McGee into overestimating Autopilot’s capabilities.
“My concept was that it would assist me should I have a failure… and in that case I feel like it failed me,” McGee stated in court.
This argument appears to have swayed the jury, signaling a legal environment where automakers could increasingly be held accountable not just for the tech, but for the way they promote it.
What’s Next for Tesla and Industry Players?
The $43 million ruling could ripple through the automotive industry, particularly among manufacturers developing advanced driver-assistance systems. Tesla has already updated its Autopilot software multiple times, but critics argue that deeper systemic changes and clearer marketing are necessary.
As NHTSA continues to probe similar incidents, future cases may hinge on how companies define and communicate the limitations of their self-driving features.