Optics.org

Photonics West 2021: Making sense of sensors for self-drive

12 Mar 2021

Panel of industry experts discuss how different technologies meet the photonics needs of autonomous vehicles.

From Ford Burkhart

The autonomous vehicle (AV) industry – at companies like Lyft, Waymo and Argo – is not actually looking for a super-sensor. Leading experts from these companies agreed in a Photonics West Digital Forum panel on Thursday that the key is how an auto’s many cameras, lidar and radar systems can cooperate to fill in each other’s blind spots.

What’s vital, said Michel Laverne, of Argo AI, is “fusing together their distinct views of the world” to move to at least Level 4 – known as high driving automation (mostly with no human intervention). Little was said about the futuristic Level 5: smart cars potentially without even steering wheels.

But Level 4, or L4, stirred enough discussion for the Photonics West “Autonomous Vehicles” industry session to run overtime.

Laverne, who heads the Hardware Special Projects Group working on sensors and computing in autos at Argo, said, “No single sensor gets to the level of safety we are seeking.”

He spelled out the upsides and downsides of each of the so-called big three sensors, as follows:

  • Lidar accurately measures distance and angle, but struggles to see dark or highly reflective targets.
  • Cameras see colors and handle spatially dense data, Laverne said, but require complex interfaces and can fail with high contrasts or low light conditions “in pretty significant ways.”
  • Radar sees objects at long range and measures velocity directly, but has low resolution and difficulty detecting people.

Then, add the IMU (inertial measurement unit), GPS, gyroscope, wheel speed sensors and the rest, all feeding in data that help the auto computer answer the question: “What should I do?” The system must separate trees from stop signs and recognize vehicles, pedestrians and cyclists, always asking, “Will someone run a stop sign?”

Said the moderator, Paul McManamon of Ohio State University: “The toughest problem is sensor fusion. How to allocate what task to what sensors?”
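The fusion problem McManamon describes can be sketched with a toy example: combine each sensor’s independent distance estimate, weighting it by its precision. This is a generic inverse-variance average, not Argo’s or anyone’s actual pipeline, and the sensor noise figures and readings below are made up for illustration.

```python
# Illustrative sketch only: fusing independent range estimates from
# lidar, camera, and radar into one estimate. The variances and
# readings are hypothetical, not values cited by the panel.

def fuse_estimates(readings):
    """Inverse-variance weighted average: trust each sensor in
    proportion to its precision (1 / variance)."""
    weights = {name: 1.0 / var for name, (_, var) in readings.items()}
    total = sum(weights.values())
    return sum(weights[n] * readings[n][0] for n in readings) / total

# (distance in metres, variance in m^2) per sensor
readings = {
    "lidar":  (49.8, 0.01),  # precise range, may miss dark targets
    "camera": (52.0, 4.0),   # dense data, weak depth accuracy
    "radar":  (50.5, 1.0),   # long range, low resolution
}

print(round(fuse_estimates(readings), 2))  # → 49.81
```

The fused value sits close to the lidar reading because lidar’s variance is smallest, which mirrors the panel’s point: each modality contributes where it is strong, and no single sensor is trusted alone.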

Lyft on campus

Christy Fernandez-Cull, Head of Sensors at Lyft Level 5 in Sunnyvale, California, said the campus there is now on its fourth generation of AVs, which already provide ride sharing for campus employees.

For her, the key concept is “iteration,” and Lyft has a dedicated facility with a suite of lidars and radars where that happens. “We keep improving all along the way. We want to know what competitive technologies are on the cusp.” Those devices do three things, she said: sense, think and act. “We iterate, prototype, build, and improve.”

Fernandez-Cull noted that lidar and radar are more costly than any cameras to date, posing challenges in building “a minimum viable product.” But she added, “The future is bright for lidar and radar. In the not too distant future, payloads will cost significantly less than five years ago.”

She added that other “more exotic” chip-level enhanced imagers have been used for defense systems, but so far they do not meet the market’s cost, size, weight and power (C-SWaP) requirements.

Mark Shand, a software engineer on the Lidar Team at Waymo, said his company’s goal is “building the world’s most experienced driver.”

Noting the millions of deaths and injuries on highways each year, Shand said “a fully automated system would be the safest system. That’s our focus in everything we do: Vehicles that are fully autonomous.”

Autonomous vehicles are already available for some Waymo taxi rides, he said, and “the vehicle comes with no one in it. They are not everywhere, but they are present and they are growing.”

20 billion miles – of simulation rides

Waymo, he said, has driven 20 billion miles in simulation rides “and we’ve learned a lot about driving.” It has learned how to listen for emergency vehicle sirens, for example, and to anticipate what they will do next. Shand showed a video of people at an intersection in dinosaur costumes as an AV approached. “We don’t want to be overly indexed on what humans normally look like,” he said.

AVs can recognize even a few chickens crossing the road, as they did in his video. In another clip, the system spotted just a pair of legs below a large box crossing a road and, he said, identified it as a human.

Waymo’s fifth generation hardware, he said, is producing high resolution lidar images enabling greater capability and scale, and displayed an auto with lidar, cameras, and radar located at all corners of the vehicle. All have backup units to step in if one fails, he said, making it “reliable in a full range of driving conditions.”

Shand recalled a night when he spotted wild boars in his car’s path. He braked and stopped just before hitting them. But the driver behind didn’t stop, and knocked his car into a ditch, injuring him.

“It could have been much worse,” he said. “With more forward vision, the car behind could have stopped sooner. We should try to improve. It’s not acceptable as things are.”
