The U.S. National Highway Traffic Safety Administration is currently investigating two fatal July 2022 crashes in which Teslas plowed into motorcyclists from behind. Both crashes occurred at night, and both riders were killed. In both cases, NHTSA suspects that the Teslas were operating with the partially automated Autopilot feature engaged. 

To be clear, despite its name, neither Tesla’s Autopilot nor any other semi-automated driving technology available in August 2022 is capable of driving a vehicle without assistance or input from a human. Although multiple technology companies are actively working toward such technology, we simply aren’t at that point yet. 

As for the two motorcycle deaths that NHTSA is investigating, the first one occurred on July 7, 2022, in Riverside, California. A rider on a green Yamaha V-Star was traveling in the high-occupancy vehicle lane of State Route 91, according to the California Highway Patrol. At some point, a white Tesla Model Y struck the bike from behind, also in the HOV lane. The rider was ejected from the bike and died at the scene. CHP says it is still investigating whether Tesla Autopilot was engaged at the time. 

The second rider death under investigation by NHTSA happened on July 24, 2022, near Draper, Utah. Here, a Harley-Davidson motorcycle was traveling in the HOV lane on I-15. In this case, a Tesla Model 3 approached the Harley from behind and struck it, also ejecting the rider from the bike. He, too, was pronounced dead at the scene. In this case, the driver of the Tesla told Utah Department of Public Safety officers that he had been using Autopilot at the time. 

NHTSA continues to investigate crashes involving so-called automated driving systems, whether or not they are fatal. Since 2016, it has investigated a total of 19 fatal crashes involving Teslas suspected of using Autopilot at the time.  

It’s worth noting that Tesla describes its Autopilot and Full Self-Driving features differently. Autopilot is said to keep the car in its lane and at a safe following distance from vehicles ahead of it; if the system fails to detect certain vehicles, however, that clearly undermines the concept. By contrast, the Full Self-Driving function supposedly completes a predetermined route on behalf of the driver, but with the driver’s supervision.  

Given those definitions, it’s not exactly surprising to learn that the California Department of Motor Vehicles accused Tesla of false advertising with regard to both features on August 5, 2022. The vast majority of English speakers who are not deeply steeped in the Tesla universe would likely expect a higher level of automation from features called “Autopilot” and “Full Self-Driving.”  
