20 Nov 2014
A vision system that enables a vehicle to locate itself on the road was demonstrated last week at the Michelin Challenge Bibendum in Chengdu, China.
The technology, which is based on the use of simple video cameras, was developed by researchers at Institut Pascal, a partnership of CNRS, Université Blaise Pascal de Clermont-Ferrand and IFMA in France.
The vision system is at the heart of the EZ-10 autonomous shuttle vehicle developed by Ligier Group, which was unveiled last week at the Michelin Challenge Bibendum event in Chengdu, China. The EZ-10 is marketed by EasyMile, a joint venture between Ligier and Robosoft Technology.

Over the past decade, researchers at Institut Pascal have been working on the automated control of urban electric vehicles using simple video cameras. The technology operates in two phases:

1. Learning: all significant points in the immediate environment of the shuttle's path are identified from a video recorded during a manually controlled drive.

2. Automatic mode: the vehicle continuously monitors its path, ensuring that the live images correspond to the recorded sequence.

The initial video thus creates a "virtual track", which the vehicle follows in autonomous mode.
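The two-phase "teach then repeat" scheme described above can be sketched in a few lines. This is purely illustrative: the real system matches feature points extracted from camera images, whereas the toy below summarises each keyframe as a short descriptor list and localizes by nearest-neighbour matching.

```python
# Minimal sketch of the "virtual track" idea: record keyframes during a
# manual drive, then localize a live view against them. All names and
# descriptors here are illustrative stand-ins, not the real pipeline.
import math

def teach(frames):
    """Learning phase: store one descriptor per keyframe along the route."""
    return [list(map(float, f)) for f in frames]

def localize(track, live):
    """Automatic phase: find the keyframe on the virtual track that best
    matches what the vehicle currently sees (nearest neighbour)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    dists = [dist(live, kf) for kf in track]
    best = min(range(len(dists)), key=dists.__getitem__)
    return best, dists[best]

# Toy route of three keyframes, each summarised by a 4-D descriptor.
track = teach([[0, 0, 1, 1], [1, 1, 0, 0], [2, 2, 2, 2]])
idx, err = localize(track, [1.1, 0.9, 0.1, 0.0])
# idx == 1: the live view is nearest the second keyframe
```

In the real vehicle the "descriptor" role is played by bundles of image feature points, but the control logic is the same: the best-matching keyframe tells the shuttle where it is along the taught route.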
Since 2006, the researchers at Institut Pascal, in collaboration with Ligier Group, have been developing driverless shuttle vehicles that can transport up to 10 people along short routes (in the region of one kilometer), rather like a horizontal elevator.
'Accuracy of 3-5cm'
Michel Dhome, head of the R&D lab at Institut Pascal and principal researcher on the EZ-10 development project, told optics.org, “The positional accuracy of the video-led driving system controls the shuttle with a margin of just 3-5 cm. The video system is based on a conventional video camera fitted with a fisheye lens that offers a 160-degree field of view. There are two cameras per vehicle: one at the front and one at the back.
"The software manages the data points from the surrounding view in bundles, such that the observed 3D view in auto-driving mode can be precisely matched to the view collected in learning mode with a human driver. There are other potential applications in robot arm control and in travel systems for people with restricted vision. The first two EZ-10 vehicles are currently being deployed in real-world applications in Clermont-Ferrand, France and at EPFL, Lausanne, Switzerland, with further commercial deployments expected.”
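The matching step Dhome describes can be pictured as measuring how far the observed view has drifted from the taught one. A hedged toy version, assuming matched 2D image points stand in for the real bundles of 3D points, and with a made-up pixel-to-centimetre scale:

```python
# Illustrative only: the real system matches bundles of observed 3D
# points against those collected in learning mode. PX_TO_CM is a
# hypothetical scale factor, not a figure from the project.
PX_TO_CM = 0.5  # assumed pixels-to-centimetres conversion

def drift_cm(taught_pts, observed_pts):
    """Mean horizontal shift between matched points, converted to cm."""
    shifts = [o[0] - t[0] for t, o in zip(taught_pts, observed_pts)]
    return (sum(shifts) / len(shifts)) * PX_TO_CM

def on_track(taught_pts, observed_pts, tol_cm=5.0):
    """True if the estimated drift sits within the reported 3-5 cm margin."""
    return abs(drift_cm(taught_pts, observed_pts)) <= tol_cm
```

A drift estimate like this is what a controller would feed back as a steering correction to keep the shuttle on its virtual track.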
The researchers now intend to turn their attention to running a fleet of five vehicles at the Michelin Europe Technology Center at Ladoux, France. The aim is to deal with multiple and potentially simultaneous requests from call points or smartphones, in real time and on a large industrial site, rather like an automatic taxi service.
About the Author
Matthew Peach is a contributing editor to optics.org.