tesla autopilot


Elon Musk makes bold claims about Tesla robotaxi in Hollywood backlot

“It’s going to be a glorious future,” Musk said, albeit not one that applies to families or groups of three or more.

Musk claims that Tesla “expects to start” offering fully unsupervised FSD on public roads in California and Texas next year. A recent analysis by an independent testing firm found that the current build requires human intervention about once every 13 miles, often on roads it had driven before.

A rendering of the two-seat interior of the Tesla Cybercab

Only being able to carry two occupants is pretty inefficient when a city bus can carry more than 80 passengers. Credit: Tesla

The Cybercab should arrive “before 2027,” and Musk claims it will be built in “very high volume.” Tesla-watchers will no doubt remember similar claims about the Model X, Model 3, Model Y, and most recently the Cybertruck, all of which faced lengthy delays as the car maker struggled to build them at scale. Later, Musk treated the audience to a video of an articulated robotic arm with a vacuum cleaner attachment cleaning the two-seat interior of the Cybercab. Whether this will be sold as an aftermarket accessory to Cybercab owners, or whether they’re supposed to clean out their robotaxis by hand between trips, remains unclear at this time.

Musk also debuted another autonomous concept, the Robovan. It’s a small bus with no visible wheels but a brightly lit interior with room for up to 20 occupants. Musk said little about the Robovan or how it figures into Tesla’s future. In 2017, he revealed his dislike for public transport, saying “it’s a pain in the ass” and that other passengers could be serial killers.

After promising that “unsupervised FSD” is coming to all five of Tesla’s models—“now’s not the time for nuance,” Musk told a fan—he showed off a driverless minibus and then a horde of humanoid robots, which apparently leverage the same technology that Tesla says will be ready for unsupervised autonomous driving. These robots—“your own personal R2-D2,” he said—will apparently cost less than $30,000 “long-term,” Musk claimed. He added that they would be the biggest product of all time, predicting that all 8 billion people on Earth would want one, then two.



Tesla’s 2 million car Autopilot recall is now under federal scrutiny

maybe ban it instead —

NHTSA has tested the updated system and still has questions.

A 2014 Tesla Model S driving on Autopilot rear-ended a Culver City fire truck that was parked in the high-occupancy vehicle lane on Interstate 405.


Tesla’s lousy week continues. On Tuesday, the electric car maker posted its quarterly results showing precipitous falls in sales and profitability. Today, we’ve learned that the National Highway Traffic Safety Administration is concerned that Tesla’s massive recall to fix its Autopilot driver assist—which was pushed out to more than 2 million cars last December—has not actually made the system that much safer.

NHTSA’s Office of Defects Investigation has been scrutinizing Tesla Autopilot since August 2021, when it opened a preliminary investigation in response to a spate of Teslas crashing into parked emergency responder vehicles while operating under Autopilot.

In June 2022, the ODI upgraded that investigation into an engineering analysis, and in December 2023, Tesla was forced to recall more than 2 million cars after the analysis found that the car company had inadequate driver-monitoring systems and had designed a system with the potential for “foreseeable misuse.”

NHTSA has now closed that engineering analysis, which examined 956 crashes. After excluding crashes where the other car was at fault, where Autopilot wasn’t operating, or where there was insufficient data to make a determination, it found 467 Autopilot crashes that fell into three distinct categories.

First, 211 were frontal crashes in which the Tesla hit a car or obstacle despite “adequate time for an attentive driver to respond to avoid or mitigate the crash.” Another 111 Autopilot crashes occurred when the system was inadvertently disengaged by the driver, and the remaining 145 Autopilot crashes happened under low-grip conditions, such as on a wet road.

As Ars has noted time and again, Tesla’s Autopilot system has a more permissive operational design domain than any comparable driver-assistance system that still requires the driver to keep their hands on the wheel and their eyes on the road, and NHTSA’s report adds that “Autopilot invited greater driver confidence via its higher control authority and ease of engagement.”

The result has been disengaged drivers who crash, and those crashes “are often severe because neither the system nor the driver reacts appropriately, resulting in high-speed differential and high energy crash outcomes,” NHTSA says. Tragically, at least 13 people have been killed as a result.

NHTSA also found that Tesla’s telematics system has plenty of gaps in it, despite the closely held belief among many fans of the brand that the Autopilot system is constantly recording and uploading to Tesla’s servers to improve itself. Instead, it only records an accident if the airbags deploy, which NHTSA data shows only happens in 18 percent of police-reported crashes.

The agency also criticized Tesla’s marketing. “Notably, the term ‘Autopilot’ does not imply an L2 assistance feature but rather elicits the idea of drivers not being in control. This terminology may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation,” it says.

But now, NHTSA’s ODI has opened a recall query to assess whether the December fix actually made the system any safer. From the sounds of it, the agency is not convinced it did, based on additional Autopilot crashes that have happened since the recall and after testing the updated system itself.

Worryingly, the agency writes that “Tesla has stated that a portion of the remedy both requires the owner to opt in and allows a driver to readily reverse it” and wants to know why subsequent updates have addressed problems that should have been fixed with the December recall.
