mmcartalk
Long overdue, IMO. After numerous accidents and injuries/deaths, we are finally starting to see some Federal attention being paid to Tesla's controversial self-driving claims. Two Senators have petitioned the FTC to start an investigation into deceptive claims, and DOT/NHTSA has opened an active safety investigation.
Yes, some of these incidents are probably due to morons either abusing the system by trying to drive (or monitor) from the back seat, or otherwise deliberately putting the vehicle in a dangerous or questionable situation to test the system...but others point to system malfunction when the driver was not trying to interfere with it or abuse it in any way.
https://www.theverge.com/2021/8/18/2...f-driving-musk
Senators ask FTC to investigate Tesla’s ‘Full Self-Driving’ promises
As the agency turns up the heat on tech companies
By Sean O'Kane (@sokane1) | Aug 18, 2021, 2:47pm EDT
Senators Ed Markey (D-MA) and Richard Blumenthal (D-CT) have asked new Federal Trade Commission Chair Lina Khan to investigate Tesla’s marketing of its advanced driver assistance system, Autopilot. The Senators are particularly concerned with how Tesla has been charging customers thousands of dollars for what it refers to as “Full Self-Driving capability,” despite the fact that buying that package does not make the company’s cars fully autonomous.
“Tesla’s marketing has repeatedly overstated the capabilities of its vehicles, and these statements increasingly pose a threat to motorists and other users of the road,” the senators wrote in a letter published Wednesday. “Accordingly, we urge you to open an investigation into potentially deceptive and unfair practices in Tesla’s advertising and marketing of its driving automation systems and take appropriate enforcement action to ensure the safety of all drivers on the road.”
The letter comes as the Biden administration has been steadily increasing the scrutiny of tech companies. Much of that focus to date has been through the lens of antitrust policy — especially at the FTC under Khan — but Tesla has drawn some heat as well. Earlier this week, the National Highway Traffic Safety Administration announced an investigation into Autopilot’s tendency to crash into parked emergency vehicles.
THE REQUEST COMES DAYS AFTER SAFETY REGULATORS ANNOUNCED A NEW PROBE INTO AUTOPILOT
Tesla CEO Elon Musk has spent years claiming that his company’s cars were on the verge of being able to drive themselves without any human intervention, though that functionality has never arrived. In 2015, he said fully autonomous Teslas were just two years away. In 2016, Tesla announced that all its new cars had the hardware required to accomplish this, and that the company just needed a little more time to dial in the software. That turned out not to be true, as Tesla has since created a new computer that these older cars would need.
At the same time, Tesla started offering customers a “Full Self-Driving” option when they bought their cars, essentially asking them to prepay for one day having a fully autonomous car. In exchange, these owners would get more advanced capabilities than what is offered in the standard Autopilot suite as Tesla developed them.
In late 2018, Tesla pulled this option from its website, with Musk admitting that it caused “too much confusion.” But just a few months later, it was back, and Musk again promised that full autonomy would be available by the end of 2019.
Since then, Tesla has continued to charge money for the “Full Self-Driving” option — the price currently stands at $10,000 if you buy it when you buy one of the company’s cars, though Tesla also recently started selling “subscriptions” to the Full Self-Driving package if owners want it after purchase. The company is explicit on its website that the option does not make its cars fully autonomous, though as of writing it still promises the functionality will be available by year’s end.
Even then, Musk now says a “feature-complete” version of Tesla’s Full Self-Driving software is defined as the car being able to drive someone from home to work “most likely without interventions,” which does not describe a fully autonomous vehicle.
TESLA ONCE PULLED THE “FULL SELF-DRIVING” OPTION FROM ITS WEBSITE BECAUSE OF “CONFUSION”
Tesla has been beta testing this more “feature-complete” version of the Full Self-Driving software for months now, with a few thousand owners in the program running the software on roads across the country. Many of these owners have filmed their experiences with the software, which gets updated every few weeks or months. The results are a mixed bag; for every intervention-free video of a Tesla gliding through a cityscape, there’s another where the car refuses to take a left turn or dives toward oncoming traffic.
Musk’s many claims, as well as the beta, and a video released in 2019 showing an early version of the Full Self-Driving software navigating roads in the San Francisco Bay Area make up the bulk of what the Senators cite in their letter. They also say Tesla has been misleading in its advertising of Autopilot and Full Self-Driving — though the company doesn’t engage in traditional advertising.
“We fear that Tesla’s Autopilot and FSD features are not as mature and reliable as the company pitches to the public,” the Senators wrote. “Tesla drivers listen to these claims and believe their vehicles are equipped to drive themselves – with potentially deadly consequences.”
Safety advocates and other regulators have long pushed for more scrutiny over the way Tesla treats its driver assistance technology. In early 2020, the National Transportation Safety Board found that the design of Autopilot and overconfidence in its abilities were what led to a fatal crash in Mountain View, California. Musk has even admitted that drivers can get too complacent when using Tesla’s driver assistance features. But for years, he’s only allowed a passive form of driver monitoring when Autopilot is engaged.
Charging thousands of dollars for a feature that doesn’t wholly exist while also beta testing it in the real world, after years of moving the target for creating an autonomous car, is certainly still causing the kind of confusion that once made Musk step back from the Full Self-Driving option. In Wednesday’s letter, Markey and Blumenthal mistakenly say that the Full Self-Driving beta software is now available to “all Tesla owners” through a subscription. That promise, like many others Musk has made, has not yet been fulfilled.
The Government Is Finally Catching Up With Tesla's Wild Autopilot Claims
After years of looking the other way, regulators might finally be getting around to caring about Tesla's deceptive self-driving claims.

By Aaron Gordon
August 18, 2021, 11:43am

Moveable explores the future of transportation, infrastructure, energy, and cities.
Tesla, the world's most frustrating company, simultaneously makes what are widely regarded as the best electric vehicles and runs the most functional and comprehensive charging network, while also selling the world's most dangerous and most widely abused driver-assist features. Thanks to years of the company's misleading marketing of the "Autopilot" and "Full Self-Driving" packages—as well as frequent wild claims by its extremely online CEO Elon Musk, such as the 2019 prediction that there would be one million Tesla robotaxis by 2020—owners perceive the technology to be far more capable than it is.
After years of looking the other way, it's possible that maybe, just maybe, the government is finally going to do something about Tesla's massive beta test in which we are all experiment subjects.
On Monday, the National Highway Traffic Safety Administration (NHTSA) opened an investigation into 11 cases where a Tesla on Autopilot crashed into emergency vehicles. NHTSA has previously disclosed it is also investigating 30 other Tesla crashes in which 10 people died, most involving Autopilot or FSD.
NHTSA's investigations alone indicate a new degree of seriousness from the agency under the Biden administration, but Tesla faces criticism from elsewhere in the government, too. On Wednesday, Senators Richard Blumenthal and Ed Markey sent a letter to Federal Trade Commission chief Lina Khan asking the agency to open its own investigation into Tesla's deceptive marketing practices around Autopilot and FSD. The letter cites a video Tesla posted to YouTube in 2019, now with 18 million views, showing someone "driving" the car without touching the wheel for more than a minute, in violation of Tesla's own stated safety policies.
Even taking Tesla's policies at relatively face value—and not counting the highly publicized ways Teslas have for years been easily tricked into driving on their own for extended periods, bugs Tesla could fix with a software update—Tesla has always tried to have it both ways. It promotes these driver assist features as if they basically drive the car themselves—the names are "Autopilot" and "Full Self-Driving," after all—and you can pay $10,000 for the privilege of using them, a premium price for what’s being sold as a premium experience. But, in the fine legal print, the company says these features are no more reliable than the Level 2 driver assist systems available from virtually every other manufacturer, and that the driver must still pay close attention at all times. Some drivers tragically find this out the hard way, like George McGee, a man in Florida who reached down to pick up his phone, thinking Autopilot was in control, just before his car slammed into another vehicle, killing a woman. When police arrived, he referred to the car's capabilities as "stupid cruise control."
Whether anything will come of these investigations remains to be seen—or, in the FTC's case, if an investigation will be made at all. But if the last five years or so have taught us anything, it's that Tesla won't stop until someone makes them.