Tesla FSD


Feds give Tesla another five weeks to respond to FSD probe

The original request was sent to Tesla on December 3 with a deadline of January 19—next Monday—with penalties of up to $27,874 per day (to a maximum of $139.4 million) for not complying.

However, the winter holiday period ate up two weeks of that six-and-a-bit-week window, and the company has had to simultaneously prepare responses to three other information requests for other ongoing NHTSA probes: one due today, another on January 23, and a third on February 4, the company told NHTSA. Identifying all the relevant complaints and reports will take more time, Tesla said; its search for traffic violations turned up 8,313 items, and it can only process 300 a day to determine which ones are relevant.

Answering the remaining questions on NHTSA’s list would require the above to be completed first, so Tesla asked for and was granted an extension until February 23.

Meanwhile, Tesla has changed how its driver assist cash cow contributes to the bottom line. Until now, Tesla owners had the option of buying the system outright for (currently) $8,000. Now, CEO Elon Musk says that option will go away on February 14. From then on, if a Tesla owner wants FSD, they’ll have to pay a $99 monthly fee to use it.



Tesla FSD crashes in fog, sun glare—Feds open new safety investigation

Today, federal safety investigators opened a new investigation aimed at Tesla’s electric vehicles. This is now the 14th investigation by the National Highway Traffic Safety Administration and one of several currently open. This time, it’s the automaker’s highly controversial “full self-driving” feature that’s in the crosshairs—NHTSA says it now has four reports of Teslas using FSD and then crashing after the camera-only system encountered fog, sun glare, or airborne dust.

Of the four crashes that sparked this investigation, one caused the death of a pedestrian when a Model Y crashed into them in Rimrock, Arizona, in November 2023.

NHTSA has a standing general order that requires it to be told if a car crashes while operating under partial or full automation. Fully automated or autonomous cars are the ones that might be termed "actually self-driving," such as the Waymos and Zooxes that clutter up the streets of San Francisco. Festooned with dozens of exterior sensors, these four-wheel testbeds drive around—mostly empty of passengers—gathering data to train themselves with later, with no human supervision. (This is also known as SAE level 4 automation.)

But the systems that come in cars that you or I could buy are far less sophisticated. Sometimes called “level 2+,” these systems (which include Tesla Autopilot, Tesla FSD, GM’s Super Cruise, BMW Highway Assistant, and Ford BlueCruise, among others) are partially automated, not autonomous. They will steer, accelerate, and brake for the driver, and they may even change lanes without explicit instruction, but the human behind the wheel is always meant to be in charge, even if the car is operating in a hands-free mode.
