Elon Musk swears Tesla’s self-driving cars are safe. So why is the government recalling them?

This week, Tesla announced two massive recalls related to software issues with its vehicles. One of the recalls required Tesla to remove a driver-assistance feature that caused the company’s cars to break the law. While Elon Musk said there were “no safety issues,” federal regulators disagree. But it’s also becoming increasingly clear that government pressure is unlikely to change Tesla’s brash approach.

In fact, these newly recalled features are just the latest examples of the company shipping potentially dangerous software. The trend is particularly concerning because the National Highway Traffic Safety Administration (NHTSA), the agency that oversees recalls, generally can only act after Tesla software has already been released to drivers. Tesla typically sends these updates to its vehicles over the internet whenever they’re connected, so the fixes don’t come at a significant cost to the company. This leaves Tesla free to release and test new features with drivers on the road — until the government catches up and intervenes, or until Tesla identifies problems on its own.

Tesla’s first recall this week involved a feature that allowed users to program their Teslas to slow down, but not come to a complete stop, at intersections with stop signs under very specific conditions. The company first introduced this so-called “rolling stop” feature back in October 2020 as part of a Full Self-Driving beta software update. This expanded version of Tesla’s Autopilot driver-assistance technology is available only to a limited number of Tesla owners who paid $12,000 for the upgrade, and it is famously controversial. Despite the name, the Full Self-Driving software doesn’t actually turn Teslas into fully autonomous vehicles, but it does give drivers access to semi-autonomous features. The rolling stop recall affected all 53,822 Tesla vehicles equipped with Full Self-Driving.

It is unclear how dangerous Tesla’s rolling stop feature was in practice. No warranty claims or injuries were reported to Tesla, and the feature only worked when the car was traveling at very low speeds. Elon Musk claims that speed was 2 miles per hour, while official NHTSA documents say it was 5.6 mph. Regardless, rolling through a stop sign is illegal in most places, no matter how slowly you do it. As NHTSA noted in a letter to Tesla, failing to stop at a stop sign can increase the risk of a crash. Tesla has released a software update to address the recall.

Tesla’s second recall came Thursday, after NHTSA said more than 817,000 Tesla vehicles may have a faulty seat belt reminder. The agency said these vehicles will, in some cases, fail to chime when people forget to fasten their seat belts. Tesla says this, again, will be fixed with a software update.

One of the software updates released by Tesla allowed drivers to play video games while their cars were in motion.
Andy Cross/The Denver Post via Getty Images

These are just two of Tesla’s latest safety-related software updates. The automaker has initiated nine vehicle recalls within the last three months, and federal safety regulators opening investigations into the company has become a regular occurrence. At the same time, some of the problems these software updates are designed to address persist or, in some cases, get worse.

For example, Tesla issued a recall — and a software update — after reports last fall that cars using Autopilot would brake without warning when their computers detected dangers that didn’t actually exist. But this “phantom braking” problem actually worsened in the months following the recall, possibly because Tesla decided to phase out radar sensors in favor of cameras on some vehicle models, according to a recent report from the Washington Post. Federal regulators are once again in talks with the company about how to address the issue.

“NHTSA has been playing cat and mouse with Tesla for a few years now,” explained Michael Brooks, acting executive director of the Center for Auto Safety. “Right now, NHTSA seems to be playing a slightly more active version of the cat.”

Still, Tesla shows no sign of stopping. Then again, car recalls don’t work the way they used to. Previously, car owners would take a recalled vehicle to a dealer or body shop for repairs, or else receive a replacement or a refund. But in recent years, automakers have been able to fix some flaws with an over-the-air software update, as Tesla did with all of the aforementioned recalls. Because these recalls are akin to installing an app update, they’re easier and faster for car owners to deal with. But they’re also relatively cheap for automakers to carry out. So some might argue that when a recall amounts to nothing more than a software update, it’s little more than a slap on the wrist from regulators.

Tesla also appears to be using software updates to head off potential recalls. This past summer, the company released a feature called Passenger Play, which allowed people to play video games on Tesla touchscreens while the cars were in motion. After NHTSA announced an investigation into the feature, Tesla released a software update in December that disabled Passenger Play while the car is moving. Last August, the agency similarly announced a formal investigation into 11 crashes in which Tesla vehicles with Autopilot or Traffic-Aware Cruise Control engaged had struck stopped emergency vehicles with their lights flashing; Tesla pushed a related software update a few weeks later. That prompted NHTSA to ask the company in October why it had not filed a formal recall notice, implying that Tesla may have used the update as a “stealth recall.”

Beyond official recalls, regulators currently don’t have many other tools at their disposal. NHTSA can update the Federal Motor Vehicle Safety Standards, but that is a multiyear process, which may help explain why regulations haven’t kept up with new automotive technology. The National Transportation Safety Board also deals with motor vehicle safety, but its work is mostly limited to conducting crash investigations and making recommendations. And while states are slowly developing their own rules, there is still no nationwide consensus on how to approach vehicles like Teslas.

Transportation Secretary Pete Buttigieg looks under the open hood of a Tesla car.

The Department of Transportation is still pondering how to approach increasingly autonomous vehicles.
Drew Angerer/Getty Images

That’s why some think it’s time for the government to do more. Last May, Representatives Bobby Rush (D-IL) and Larry Bucshon (R-IN) proposed the Crash Avoidance System Evaluation Act, which would require the Department of Transportation to study how well crash-avoidance technologies like Tesla’s Autopilot and Full Self-Driving actually work. Meanwhile, Senators Ed Markey (D-MA) and Richard Blumenthal (D-CT) in August called on the Federal Trade Commission to investigate Tesla for false and misleading advertising related to these semi-autonomous features. Along with Senator Amy Klobuchar (D-MN), they also proposed the Stay Aware For Everyone Act, which would require the DOT to review driver-assistance systems and require automakers to install driver-monitoring tools.

“These recent developments with Tesla’s Autopilot and Full Self-Driving systems are precisely why we have long expressed our concerns about this flawed technology,” Blumenthal and Markey told Recode. “While automated driving and driver-assistance systems can improve safety, they must be deployed with strong safeguards that ensure our vehicles follow traffic laws and drivers remain fully engaged.”

But part of the reason there are no strong safeguards for semi-autonomous vehicles is that there is no consensus on how these vehicles should be regulated. Is the driver or the automaker liable when a car running this software does something dangerous? Should autonomous and semi-autonomous cars be programmed to follow the letter of the law, or to drive the way most people actually drive? Regulators have not fully answered these questions, and Tesla is taking advantage of the ambiguity.

All of this is happening at what should be a moment of opportunity for Tesla and its professed commitment to safety. In the first nine months of last year, deaths on US roads grew at the fastest rate since the Department of Transportation began tracking them in 1975. Musk has repeatedly argued that AI-powered Tesla vehicles save lives and could be the solution to worsening road safety problems. But the company’s insistence on releasing features that play fast and loose with safety, rather than promoting it, seems to undermine that larger goal.
