September 28, 2022

EV maker Tesla has found itself in a bit of hot water again, as its Autopilot and FSD (Full Self-Driving) features were deemed false advertising by the California state Department of Motor Vehicles (sort of like our JPJ) – not that the company was free of controversy before.

We generally don’t give two hoots about what goes on in California when it comes to driving, but the issue here could be emblematic of other automakers as they promote, advertise and/or make claims about their cars’ autonomous (or semi-autonomous) driving capabilities.

Tesla Autopilot/FSD not safe?

The company’s Autopilot suite has seen its fair share of detractors and criticisms that it may give some drivers an exaggerated sense of security when operating the vehicle, and we’ve seen many instances of Tesla drivers flat-out letting the car ‘drive itself’ as they nap or are inattentive in the front seat or, worse, in the back seat.

We’ve also seen more than a few cases of Teslas careening off roads, making erratic manoeuvres, and generally crashing. Dangerous, to say the least, when compounded with a driver literally asleep at the wheel.

Videos shared online show its cars involved in near head-on collisions with trucks or trains, avoided only by immediate and sudden driver intervention, and, in another instance, the sensor system mistaking the moon for a traffic light stuck on yellow.

As its name suggests, FSD – which purports to automatically pilot the car on highways and city streets as well as find a parking space independently of the driver – is a more advanced collection of technologies built upon Autopilot. The marketing of these features prompted the California DMV to file a pair of complaints with the state Office of Administrative Hearings in late July 2022.

According to the Los Angeles Times, the complaints allege that Tesla “made or disseminated statements that are untrue or misleading, and not based on facts,” pointing to the actual names of these features as they are marketed to customers.

California DMV lodges complaint against Tesla Autopilot

The complaints also pointed to the language used on Tesla’s website pertaining to the Autopilot suite that reads: “All you will need to do is get in and tell your car where to go. If you don’t say anything, your car will look at your calendar and take you there as the assumed destination. Your Tesla will figure out the optimal route, navigating urban streets, complex intersections, and freeways.”

However, Tesla’s website also states that “the currently enabled features require active driver supervision and do not make the vehicle autonomous,” though the complaint counters that it “contradicts the original untrue or misleading labels and claims, which is misleading, and does not cure the violation.”

The DMV asserts that Tesla cars never could, “and cannot now, operate as autonomous vehicles.”

Worryingly, should the complaints be pursued to the fullest, they could lead to the revocation of the automaker’s licence to sell or produce cars in California. A spokesperson told the newspaper:

“The DMV will ask that Tesla will be required to advertise to consumers and better educate Tesla drivers about the capabilities of its ‘Autopilot’ and ‘Full Self-Driving’ features, including cautionary warnings regarding the limitations of the features, and for other actions as appropriate given the violations.”

It is unknown if FSD, now a whopping US$12,000 optional extra, will eventually be merged with Autopilot and offered as standard on most or all new Tesla models. CEO Elon Musk has claimed that FSD has never been a factor in any crash involving the company’s cars, though at least eight crash reports submitted by owners to federal safety regulators suggest otherwise.
