bikers beware —

Tesla faces new probes into motorbike deaths, false advertising

NHTSA is investigating bike deaths as California says Tesla statements are "untrue."

Elon Musk said in June (https://www.businessinsider.com/elon-musk-tesla-worth-basically-zero-without-self-driving-2022-6) that without autonomous driving technology, Tesla is "worth basically nothing."

Tesla went into the weekend with a fresh pair of headaches. On Friday, the Associated Press reported that the federal government is investigating whether the company's Autopilot system can safely recognize motorcyclists after a pair of fatal crashes in July. And the Los Angeles Times reported that California is unhappy with the way the automaker has advertised its Autopilot and Full Self-Driving driver-assist technologies.

Can Autopilot see motorbikes at night?

The first fatal crash occurred in the early hours of July 7 in Riverside, California, when a Tesla Model Y on State Route 91 hit a motorcycle from behind, killing its rider. The second fatal motorcycle crash occurred on July 24, again at night, this time on I-15 outside Draper, Utah. In that case, a Tesla Model 3 was driving behind a motorcycle and hit it, killing the rider.

The AP reports that the California Highway Patrol is still investigating whether Autopilot was active in the first crash, but the driver in Utah admitted he was using the driver assist at the time of his crash.

Investigators from the National Highway Traffic Safety Administration traveled to both crash sites; according to the AP, NHTSA "suspects that Tesla's partially automated driver-assist system was in use in each case."

NHTSA's Office of Defects Investigation is already looking into Autopilot following at least 11 crashes in which Tesla cars, operating under Autopilot, hit emergency vehicles after failing to recognize them. A second NHTSA investigation is also underway to determine whether the removal of the forward-looking radar sensor on newer Teslas is the cause of a "phantom braking" problem that has resulted in hundreds of complaints to the regulator.

The lack of a forward-looking radar and the sole reliance on cameras may well be a factor in both of these fatal crashes, although even less controversial adaptive cruise control systems have been shown to have trouble detecting motorcycles that ride near the edge of a lane.

Misleading marketing

Meanwhile, California's Department of Motor Vehicles filed a pair of complaints with the state's Office of Administrative Hearings. The complaints say that Tesla's statements describing Autopilot and the more controversial Full Self-Driving feature have been "untrue or misleading, and not based on facts."

As an example, the complaints cite a statement on Tesla's Autopilot webpage claiming that with Full Self-Driving:

[a]ll you will need to do is get in and tell your car where to go. If you don’t say anything, your car will look at your calendar and take you there as the assumed destination. Your Tesla will figure out the optimal route, navigating urban streets, complex intersections and freeways.

As the DMV points out, no Tesla (or any other car on sale today) can operate autonomously.

This isn't the first time that Tesla's marketing has been called out as misleading. In 2016, the German transport ministry told the company to stop using the term "Autopilot" in its advertising, and in 2019, NHTSA told Tesla to stop using misleading or incorrect statements about the safety of its cars.
