Optics.org
daily coverage of the optics & photonics industry and the markets that it serves

Tesla touts lidar-free autonomy

23 Apr 2019

Elon Musk dismisses lidar technology as 'lame' at electric car maker’s autonomy event specially arranged for investors.

Tesla chief Elon Musk has told an investor event dedicated to the company’s latest advances in autonomous driving that lidar technology will be of no use in the future.

The notoriously outspoken CEO variously described lidar as “lame”, “a fool’s errand”, and “doomed” – and predicted that other car makers who have invested in the technology would eventually dump the approach.

At the Tesla event, which instead touted the capability of neural network intelligence and machine learning to replicate human image processing, Musk and his development team detailed a new “full self-driving (FSD)” chip that is already being deployed in all new Tesla cars.

“I don’t super-hate lidar, as much as it may sound,” Musk told investors, pointing out that another of his companies, SpaceX, had developed its own lidar solution for Dragon supply ships to navigate to, and dock with, the International Space Station (ISS).

“In that scenario, lidar made sense,” said Musk. “[But] in cars it’s freakin’ stupid – it’s expensive, it’s unnecessary, and once you’ve solved vision it’s worthless.”

‘Fool’s errand’
When asked whether the new FSD chip might be able to process lidar data as well as images, plus data from the ultrasonic and radar sensors used on board Tesla vehicles, Musk continued: “Lidar is a fool’s errand. Anyone relying on lidar is doomed.”

To nervous audience laughter, he went on to liken the role of lidar on autonomous cars to that of a useless part of the human anatomy, describing the use of multiple sensors as “like having a whole bunch of expensive appendices”.

Instead, the thinking at Tesla is that imitating the human brain’s interpretation of imagery is the way to go, and that the combination of high-definition cameras, neural networking, and machine learning yields the level of depth perception and object recognition required to drive safely.

Tesla's FSD chip is said to be capable of interpreting 2300 frames per second from the images relayed by the eight high-definition video cameras on board each vehicle, an approach the company claims is far superior to rival technologies.
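
As a rough sense-check of those figures, splitting the claimed throughput evenly across the eight cameras (an assumption, since the company has not said how the load is divided) works out at roughly 290 frames per second per camera:

    # Back-of-the-envelope check of the quoted throughput figures.
    # The even split across cameras is an assumption, not a confirmed detail.
    total_fps = 2300      # frames per second the FSD chip is said to process
    num_cameras = 8       # high-definition cameras on each vehicle

    per_camera_fps = total_fps / num_cameras
    print(f"~{per_camera_fps:.0f} frames per second per camera")  # ~288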

Musk and colleagues explained that aside from the advantage of designing a chip specifically for self-driving cars, Tesla was already using masses of images and data captured by its vehicles to enable “fleet learning” of all kinds of different scenarios encountered on the road – however unlikely.

That approach was likened to the way that Google has been able to optimize its search algorithms with real feedback from people using the search engine to find what they are looking for. With its current fleet of around half a million vehicles, the Tesla team says that it has a “massive data advantage” over its rivals.

Depth perception
Lidar's proponents – and there are many – tout depth perception as its key advantage: by firing out laser pulses and detecting the reflected photons, a lidar sensor generates a fast-moving three-dimensional profile of the scene encountered while driving. Scores of lidar development startups have been backed heavily by major auto companies and venture capital alike.
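
For context on how lidar derives those distances, the underlying measurement is time-of-flight: the sensor clocks how long a light pulse takes to return and halves the round trip. A minimal sketch, with a purely illustrative timing value:

    # Time-of-flight ranging: distance = (speed of light * round-trip time) / 2.
    # The sample round-trip time is illustrative, not a real sensor reading.
    SPEED_OF_LIGHT = 299_792_458.0  # metres per second

    def range_from_round_trip(round_trip_seconds: float) -> float:
        """Return target distance in metres for a measured round-trip time."""
        return SPEED_OF_LIGHT * round_trip_seconds / 2.0

    # A pulse returning after about 200 nanoseconds puts the target ~30 m away.
    print(f"{range_from_round_trip(200e-9):.1f} m")  # ~30.0 m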

When asked whether it might make sense to use lidar sensors as a “backup”, especially when most of the rest of the industry appears to be looking to deploy the technology, Musk was unequivocal:

“Lidar is lame,” said the Tesla CEO. “They’re all going to dump lidar. That’s my prediction…mark my words.”

Musk and his developers say that all the data required for depth perception can be generated using machine vision alone, without the need for lidar – although the on-board computer and neural network element of the self-driving chip needs to be "trained" with thousands of images for each scenario encountered.
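
Tesla has not published how its networks infer distance, but the classic geometry behind camera-based depth illustrates why it is possible in principle: with two views a known baseline apart, depth follows from the pixel disparity between matched points. A minimal sketch with assumed numbers:

    # Stereo depth geometry: Z = focal_length * baseline / disparity.
    # A generic illustration of camera-based depth recovery; it is not a
    # description of Tesla's neural-network method, and all values are assumed.
    def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
        """Depth in metres from focal length (pixels), baseline (m) and disparity (pixels)."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return focal_px * baseline_m / disparity_px

    # e.g. a 1000 px focal length, 0.5 m camera separation and 25 px disparity
    print(f"{depth_from_disparity(1000.0, 0.5, 25.0):.1f} m")  # 20.0 m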

The CEO's development team stressed during the autonomy event that neural network intelligence has proved very competent at deriving depth perception from camera images, and that the point-cloud of information generated by lidar equipment represented a "short-cut" and ultimately a technological "crutch" compared with the human-like potential of machine vision and learning.

Tesla and Musk are also extremely bullish on the likely timing of the self-driving revolution. While most analysts have predicted that full “Level 5” autonomy – where cars drive themselves with no human intervention – is at least a decade away, Musk told investors that “no-brain” driving would be technologically feasible a year from now, once software upgrades have been rolled out by the company.

He also predicted that initial regulatory approval for fully autonomous vehicles would be in place by the end of 2020, most likely for “platoons” of trucks.
