Sensors that mimic the human retina promise improved machine vision

03 May 2017

UK project investigates neuromorphic vision sensing for cameras and robots.

A three-year project funded by the UK Engineering and Physical Sciences Research Council (EPSRC) aims to investigate machine-to-machine communication of neuromorphic vision sensing data, in which sensors mimic the action of biological vision systems.

Titled the Internet of Silicon Retinas, or IOSIRE, the project divides its £1.3 million of funding among three research streams at Kingston University, King's College London, and UCL, alongside industry partners.

The goal is to examine how data from artificial vision systems inspired by the human eye can be captured, compressed and transmitted between machines at a fraction of the current energy cost. The project will commence in June 2017.

At Kingston University, a group led by Maria Martini will research innovative ways to process and transmit information acquired by neuromorphic sensors.

"Conventional camera technology captures video in a series of separate frames, which can be a waste of resources if there is more motion in some areas than in others," commented Martini, whose group will receive £280,000 of the ESPRC funding. "Where you have a really dynamic scene, you end up with fast-moving sections not being captured accurately due to frame-rate and processing power restrictions, and too much data being used to represent areas that remain static."

Neuromorphic sensors aim to deal with this problem by sampling different parts of a scene at different rates. The principle behind silicon retinas - also termed dynamic vision sensors (DVS) - involves transmitting only the local pixel-level changes caused by movement in a scene, and transmitting them at the time they occur.

The result is a stream of events at microsecond resolution, said to be equivalent to or better than conventional high-speed vision sensors running at thousands of frames per second, but with significant reductions in the power required.
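The event-driven principle can be illustrated with a short sketch. The Python snippet below is a minimal illustration, not anything specified by the IOSIRE project: it compares two intensity readings per pixel and emits an (x, y, timestamp, polarity) event only where the log-intensity change exceeds a threshold. The threshold value, the function name and the NumPy-based frame simulation are all illustrative assumptions.

```python
# Minimal sketch of the dynamic vision sensor (DVS) principle described above:
# instead of sending full frames, emit an event only for pixels whose
# (log-)intensity changes by more than a threshold. All names and values here
# are illustrative assumptions, not part of the IOSIRE project.
import numpy as np

THRESHOLD = 0.15  # assumed log-intensity contrast threshold

def dvs_events(prev_frame, new_frame, timestamp_us):
    """Return (x, y, t, polarity) events for pixels that changed enough."""
    log_prev = np.log1p(prev_frame.astype(np.float64))
    log_new = np.log1p(new_frame.astype(np.float64))
    diff = log_new - log_prev
    ys, xs = np.nonzero(np.abs(diff) > THRESHOLD)
    return [(int(x), int(y), timestamp_us, 1 if diff[y, x] > 0 else -1)
            for y, x in zip(ys, xs)]

# A mostly static scene with one moving bright spot: only the pixels that
# changed generate events; the unchanged background produces no data at all.
prev = np.zeros((4, 4)); prev[1, 1] = 255.0
new = np.zeros((4, 4)); new[1, 2] = 255.0
print(dvs_events(prev, new, timestamp_us=42))
# -> [(1, 1, 42, -1), (2, 1, 42, 1)]
```

In this toy case two events describe the whole change between readings, which is the behaviour behind the power savings the researchers describe.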

Cloud-based analytics

"This energy saving opens up a world of new possibilities for surveillance and other uses, from robots and drones to the next generation of retinal implants," Martini said. "They could be implemented in small devices where people cannot go, and where it is not possible to recharge the batteries required."

At King's College London, a group led by Mohammad Shikh-Bahaei in the Centre for Telecommunications Research has been awarded £560,000 to explore a novel and advanced technology for layered representation and transmission of silicon retina data over the Internet of Things (IoT) for cloud-based analytics.
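The article does not describe the layered scheme itself, but the idea of a layered representation can be sketched as splitting an event stream into a coarse base layer plus an enhancement layer, so that a bandwidth-limited link or receiver can work from the base layer alone. The block-based split below is an illustrative assumption, not the technique the King's College group will develop.

```python
# Hedged sketch of a layered representation for DVS events: a spatially
# down-sampled "base layer" plus a full-resolution "enhancement layer".
# The scheme is an illustrative assumption, not the IOSIRE design.

def layered_split(events, block=4):
    """events: iterable of (x, y, t, polarity) tuples.
    Base layer: one aggregated event per spatial block and polarity.
    Enhancement layer: the full-resolution events needed to refine it."""
    base, enhancement = {}, []
    for x, y, t, pol in events:
        key = (x // block, y // block, pol)
        if key not in base:
            base[key] = t          # first event time stands in for the block
        enhancement.append((x, y, t, pol))
    base_layer = [(bx, by, t, pol) for (bx, by, pol), t in base.items()]
    return base_layer, enhancement

base, enh = layered_split([(1, 1, 42, 1), (2, 1, 42, 1), (9, 9, 43, 1)])
print(len(base), len(enh))  # -> 2 3: the base layer is smaller than the full stream
```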

The third leg of the IOSIRE project will see researchers at UCL's department of electronic and electrical engineering under Yiannis Andreopoulos receive £550,000 to address the challenge of representing and compacting the captured neuromorphic vision data streams, allowing the most efficient transmission to, and processing by, a cloud-based back-end system.
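One simple way such a stream can be compacted before uplink is to delta-encode the (typically closely spaced) timestamps and pack each event into a fixed-size binary record. The format below is an illustrative assumption, loosely in the spirit of the compression challenge described above, not the representation UCL will define.

```python
# Hedged sketch of compacting an event stream for transmission: sort by time,
# delta-encode timestamps, and pack each event into a 9-byte record.
# The record layout is an illustrative assumption.
import struct

def pack_events(events):
    """events: list of (x, y, t_us, polarity); returns compact bytes."""
    out = bytearray()
    prev_t = 0
    for x, y, t, pol in sorted(events, key=lambda e: e[2]):
        dt = t - prev_t            # small delta instead of absolute microseconds
        prev_t = t
        # 2 bytes x, 2 bytes y, 4 bytes time delta, 1 signed byte polarity
        out += struct.pack("<HHIb", x, y, dt, pol)
    return bytes(out)

packed = pack_events([(1, 1, 42, -1), (2, 1, 42, 1)])
print(len(packed), "bytes for 2 events")  # -> 18 bytes for 2 events
```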

Industrial partners in the project include Samsung, Ericsson and Thales, which are interested in exploring how such sensors could be incorporated into the next generation of smart devices and used in future machine-to-machine communications. iniLabs, a spin-off of the University of Zurich and ETH Zurich that promotes neuromorphic engineering for a range of applications, is also an IOSIRE partner.
