Tesla’s decision to abandon LIDAR and radar sensors and rely on camera vision alone is looking like a risky bet.
Popular YouTuber and former NASA engineer Mark Rober recently showed how easily Tesla’s Autopilot system can be fooled in real-world situations, calling into question the safety of the company’s driver-assistance technology.
Dependence on Visual Data Alone Is a Risky Step for Tesla
Elon Musk’s aggressive opposition to LIDAR technology has been widely reported. Musk famously referred to LIDAR as “fricking stupid, expensive, and unnecessary,” opting instead to concentrate on camera-based object detection. But Rober’s new video shows the inherent risks of this strategy.
Unlike LIDAR and radar systems, which measure distances to objects directly using laser pulses and radio waves and can do so reliably across varied conditions, Tesla’s camera-only system falters when visibility degrades.
Mark Rober’s Eye-Opening Tests Highlight Critical Failures
To show the shortcomings of Tesla’s Autopilot system, Rober compared a Tesla with a Luminar-enabled Lexus SUV that relies on LIDAR sensors. The test results were stunning:
Child Mannequin Test: The Tesla detected the object but couldn’t stop in time. Its emergency braking system failed to kick in, and the car crashed into the mannequin.
Fog and Rain Simulation: Tesla’s cameras struggled to see through thick fog and heavy rain, rendering the system effectively useless in poor weather.
Painted Wall Illusion: In a dramatic test, Rober put a wall-sized painting of a road in front of the Tesla. Bizarrely, the car drove straight through it, unable to tell the difference between the actual road and the painted illusion.
By contrast, the Luminar-equipped Lexus SUV performed flawlessly in every test, demonstrating the safety advantages of LIDAR technology.
Tesla’s ‘Full Self-Driving’ Presents Even More Risks
Rober’s revelations are timely as Tesla prepares to release an “unsupervised” form of its much-debated Full Self-Driving (FSD) software later this year.
According to Musk, this version will allow cars to drive without human supervision, raising concerns that it could give drivers a false sense of security.
Considering Tesla’s Autopilot system has already been associated with hundreds of injuries and dozens of fatalities, the idea of eliminating supervision altogether is frightening, according to Futurism.
Possible Impact on Tesla’s Robotaxi Plans
Musk’s plans to roll out a fully autonomous robotaxi service could magnify these shortcomings. Without the redundancy of LIDAR or radar, Tesla’s system may not be ready to cope with the variability of real-world driving conditions.
Rober’s experiments underscore that Tesla’s current camera-based system cannot yet provide the level of safety autonomous cars require.
Safety Issues for Tesla Owners and Regulators
The findings of Rober’s experiments add to the growing worries of regulators and safety groups. Tesla’s camera-based system has been linked to numerous accidents and has drawn investigations and questions from federal authorities.
As Rober’s experiment illustrates, the dangers of Tesla’s methodology are not merely theoretical: they are real and potentially deadly.
Can Safety Still Be Guaranteed With Autopilot?
Even as Tesla pushes the limits of autonomous driving technology, the glaring flaws in its camera-based system cannot be overlooked. Rober’s experiments are a wake-up call for the EV giant to rethink its strategy before proceeding with unsupervised FSD and robotaxi services.
Until Tesla closes these safety loopholes, the vision of fully autonomous cars remains just that — a fantasy, not a reality.