01 Sep 2021
Updated 'Fusion' hardware for autonomous vehicles harnesses Doppler Effect to determine position and velocity of hazards.
Aurora Innovation, the autonomous driving systems company that is looking to raise $2 billion in a special purpose acquisition company (SPAC)-powered Nasdaq listing, has outlined plans for its future commercial roll-out.
Based around 360-degree cameras, radar, and frequency-modulated continuous-wave (FMCW) lidar, the “Fusion” version of the Aurora Driver is scheduled to appear in pilot projects later this year, with incorporation into self-driving trucks and robotaxis slated for 2023 onwards.
“With more powerful sensors condensed into a sleek, modular, automotive-grade rack and a new powerhouse computer, Aurora’s hardware is feature-complete and primed to deliver a safer, more reliable Aurora Driver at a commercial scale,” boasted the Silicon Valley firm, whose backers include Amazon, Uber, and Volvo, among others.
FMCW lidar
Optical technology is at the heart of the system, most notably in the form of Aurora’s “FirstLight Lidar”. The FMCW technology, initially developed by Blackmore Sensors and Analytics before it was acquired by Aurora in 2019, uses the Doppler Effect to determine the location, speed, and direction of other vehicles on the road.
That capability requires a much more complex optical arrangement than the conventional pulsed lidar systems that have been deployed in the vast majority of automotive lidar applications thus far. A handful of other companies, including a group at Intel, are also working on the FMCW approach.
“By leveraging FirstLight’s data, the Aurora Driver can track the velocity and compute the acceleration of vehicles over 400 meters away faster than ever before, creating more time for braking and responding safely,” states Aurora.
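The Doppler-based measurement described above can be sketched numerically. In an FMCW lidar the laser frequency is swept ("chirped") up and down; a target's range shifts the return's beat frequency by the same amount on both chirp slopes, while its radial motion adds a Doppler shift of opposite sign on each slope, so averaging and differencing the two beat frequencies separates range from velocity. The sketch below uses representative, assumed parameters (1550 nm wavelength, 1 GHz bandwidth, 10 µs chirp); Aurora's actual sensor parameters are proprietary.

```python
# Illustrative FMCW lidar range/velocity recovery - not Aurora's implementation.
# All parameters are assumed, representative values for a coherent lidar.

C = 3.0e8             # speed of light, m/s
WAVELENGTH = 1.55e-6  # telecom-band laser wavelength, m (assumed)
BANDWIDTH = 1.0e9     # chirp bandwidth, Hz (assumed)
CHIRP_TIME = 10e-6    # duration of one chirp slope, s (assumed)

def range_and_velocity(f_up: float, f_down: float) -> tuple[float, float]:
    """Recover target range (m) and radial velocity (m/s) from the beat
    frequencies (Hz) measured on the up-chirp and down-chirp.

    The range contribution is common to both slopes; the Doppler shift
    enters with opposite sign, so the mean gives range and the half
    difference gives the Doppler frequency.
    """
    f_range = (f_up + f_down) / 2    # range-only beat frequency
    f_doppler = (f_down - f_up) / 2  # Doppler shift (positive = approaching)
    rng = C * f_range * CHIRP_TIME / (2 * BANDWIDTH)
    vel = WAVELENGTH * f_doppler / 2
    return rng, vel

# Forward model for a target 400 m away, closing at 30 m/s (~108 km/h):
f_r = 2 * 400.0 * BANDWIDTH / (C * CHIRP_TIME)  # range-induced beat frequency
f_d = 2 * 30.0 / WAVELENGTH                     # Doppler shift
rng, vel = range_and_velocity(f_r - f_d, f_r + f_d)  # recovers ~(400.0, 30.0)
```

The same principle is why a coherent FMCW sensor reads a vehicle's velocity directly from a single measurement, whereas a pulsed time-of-flight lidar must difference successive position estimates to infer it.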
The FMCW lidar also utilizes what the company calls “OURS” silicon photonics, which the company says will enable mass production of the lidar sensors at low cost once the Aurora Driver product is deployed in commercial volumes.
OURS is an abbreviation of Optical Universal RISC Systems Technology, an early-stage silicon photonics startup spun out of the University of California, Berkeley, that Aurora acquired earlier this year.
Custom camera lenses
Other key elements of the Fusion hardware are cameras with a 360-degree field of view, which Aurora says combine the most advanced automotive-grade sensor technology with custom-designed lenses.
“The cameras allow the Aurora Driver to detect objects even in challenging lighting situations like facing headlight glare and sun glare, and entering and exiting tunnels,” it claims.
Rounding off the sensor suite is an imaging radar system said to be more precise than traditional approaches, providing Aurora Driver with high resolution and broad coverage that complements the optical data received from its cameras and lidar.
“The improved imaging radar sensors on Aurora’s hardware produce true and precise 3D images despite challenging weather conditions like rain, dense fog, and snow,” states the firm.
Aurora adds that the Fusion hardware is intended to operate across all of Aurora’s vehicle platforms, ranging from trucks and light-duty vans to passenger cars.
Its senior VP of hardware, Sandor Barna, said in a company announcement: “Aurora’s hardware fuses the best of many generations of hardware development from Aurora and Uber ATG into a single, optimized, deeply integrated system, setting us up for the successful deployment of the Aurora Driver.”
The system will be able to handle typical challenges including sun glare, bright emergency vehicle lights, dust, pedestrian detection at night, small object detection, and fast-moving objects such as speeding motorcyclists, added Aurora.
Before the autonomous trucking and ride-hailing businesses are launched, the system will first appear on board Aurora’s “Class 8” [i.e. heavy-duty] trucks in pilot trials, as well as a test fleet of Toyota Sienna minivans that are expected on public roads by the end of this year.
Hyundai robotaxis; Waymo lidar re-think
Meanwhile, major Korean car maker Hyundai has revealed that its all-electric “IONIQ-5” cars will be used as fully driverless robotaxis, starting in a couple of years.
For partner Motional, which uses surround-view lidar sensors from Velodyne, alongside cameras and radar, in its self-driving setup, the Hyundai car will represent its first commercial vehicle.
The Boston-headquartered company plans to begin transporting public passengers from 2023, in collaboration with taxi firm Lyft.
• According to a Reuters report, autonomous vehicle pioneer Waymo will no longer sell its lidar sensors to third parties working on non-automotive applications, although it will still build lidar units for internal use.
In 2019, the Alphabet-affiliated business said it planned to sell one of its three different in-house lidars to customers in applications such as robotics and farming, to help deliver the economies of scale needed to reduce the cost of the technology for self-driving vehicles.
The Reuters report also states that Tim Willis, previously general manager of Waymo’s short-range “Laser Bear” lidar offering, left the company earlier this year to join rival developer Aeva.
© 2024 SPIE Europe