Elon Musk acknowledges that he underestimated the difficulty of improving Autopilot. The Autopilot system is the flagship feature of electric car manufacturer Tesla, and its reliability and safety are proving increasingly difficult to achieve.
Full Self-Driving software continues to be a tremendous challenge for Musk’s company, as the billionaire himself recently pointed out.
Haha, FSD 9 beta is shipping soon, I swear!
Generalized self-driving is a hard problem, as it requires solving a large part of real-world AI. Didn’t expect it to be so hard, but the difficulty is obvious in retrospect.
Nothing has more degrees of freedom than reality.
— Elon Musk (@elonmusk) July 3, 2021
This weekend, Musk noted: “FSD 9 beta is shipping soon, I swear!” He had already made the same promise in 2018 and again in 2019. Now, in 2021, he is making it once more.
“Generalized self-driving is a hard problem, as it requires solving a large part of real-world AI,” explained Tesla’s CEO. “Didn’t expect it to be so hard, but the difficulty is obvious in retrospect.”
“Nothing has more degrees of freedom than reality,” Musk reflected.
Elon Musk and criticism of Autopilot
The Verge recalls that Tesla defends the safety of Autopilot, but only as long as the driver remains attentive behind the wheel. The company recently added a camera to its detection system that monitors the user in one way or another.
Find an (empty!) road with a sharp curve and point autopilot at it. For most safety ensure there's an extra space outside the curve so you can catch the car before it crashes into stuff and boom!
(this is 2021.4.18.3 btw) Remember to always pay attention when on AP! pic.twitter.com/g4ukeo82X5
— green (@greentheonly) July 4, 2021
It happened in China: a vehicle running Tesla’s Autopilot crashed when it was unable to take a curve safely. The video, shared by user @greentheonly, shows the car issuing several alerts before eventually giving up and handing control back to the driver. This happens in 50% of the tests, according to the user quoted by The Verge.
Meanwhile, in April, a group of engineers demonstrated in a video how to fool the Autopilot system: the aim was to activate it without anyone sitting behind the steering wheel.
Consumer Reports’ official YouTube channel posted the controversial video showing how they were able to trick a Model Y’s systems so that Autopilot would run without anyone behind the wheel.
It was enough to buckle the driver’s seat belt and use a chain with several basic counterweights to make Autopilot run smoothly.
“In our evaluation,” notes Consumer Reports, “the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all.”
“Safety advocates and Consumer Reports researchers say it shows that driver monitoring systems need to work harder to prevent drivers from using the systems in predictably dangerous ways.”
Increased safety, that’s all
According to The Verge, “the number of open car crash investigations involving Tesla Autopilot appears to be growing in inverse relation to customer expectations about Musk’s ability to deliver on his promises.”
For now, all that’s left is to keep working, and to brag a little less. The road to ensuring the safety of Tesla drivers is still a long one.