
MIT Media Lab develops faster single-pixel camera

26 Apr 2017

New design principles and optimized algorithms enhance the potential uses of lensless imaging systems.

Traditional imaging and microscopy techniques employ high-magnification objective lenses to map light from an object onto a suitable sensor plane, but methods to obtain high-quality images without the use of a lens are making steady progress.

One reason for this progress has been the advance of modern signal-processing techniques, and the opportunity they provide to shift the emphasis of an imaging operation away from the optical hardware and onto computation instead.

A project in the Camera Culture Group at MIT Media Lab has now developed a method for lensless imaging that leverages both compressive sensing (CS) - one of the foundational numerical methods of computational imaging - and recent breakthroughs in time-resolved optical sensing, a technology that is already key to several of the Group's research projects.

The results show that efficient lensless imaging is possible with ultrafast measurement and CS, and point towards ways that novel imaging architectures could be put to use in situations where imaging with a lens is impossible.

"To the best of our knowledge this is the first combination of time-resolved sensing with a single-pixel camera used for detecting reflectivity - effectively, a photography application," commented Guy Satat of MIT.

"Single-pixel systems have been known and investigated for some years, while time-resolved sensing has been used for measuring reflectivity without CS, as well as in LIDAR to recover scene geometry. But we have now developed the missing piece of the puzzle, combining these approaches to recover the reflectivity and albedo of a scene."

As reported in a paper for IEEE Transactions on Computational Imaging, the new approach ultimately made image acquisition using CS more efficient by a factor of 50. Lensless single-pixel camera systems rely on multiple measurements by the same sensor pixel under different illumination patterns, each shaped by a spatial light modulator so that every measurement encodes different information about the scene; the findings could help to reduce the number of exposures typically involved from thousands down to dozens.
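As a rough illustration of that single-pixel measurement model, the sketch below simulates a scene measured through a series of random on/off modulation patterns, with one scalar reading per pattern, and then recovered with a standard sparse solver (iterative soft-thresholding). The pattern design, sparsity assumption and solver are generic choices made for illustration, not details of the MIT system.

import numpy as np

# Minimal single-pixel compressive-imaging sketch (illustrative assumptions only)
rng = np.random.default_rng(0)

n = 32 * 32          # scene flattened to n "pixels"
m = 200              # number of modulation patterns (measurements), m << n

# Synthetic sparse scene: a few bright points on a dark background
x_true = np.zeros(n)
x_true[rng.choice(n, size=15, replace=False)] = rng.uniform(0.5, 1.0, 15)

# Each row is one SLM pattern; the single pixel records one number per pattern
Phi = rng.choice([0.0, 1.0], size=(m, n))       # binary on/off pattern
Phi = (Phi - Phi.mean()) / np.sqrt(m)           # zero-mean, roughly normalized
y = Phi @ x_true                                # m scalar measurements

# Sparse recovery by iterative soft-thresholding (ISTA)
def ista(Phi, y, lam=0.01, iters=500):
    x = np.zeros(Phi.shape[1])
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2    # 1 / Lipschitz constant
    for _ in range(iters):
        x = x + step * Phi.T @ (y - Phi @ x)    # gradient step on the data term
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # shrinkage
    return x

x_hat = ista(Phi, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

Because the scene is sparse, far fewer measurements than pixels can suffice; the efficiency gains reported by the MIT team come from making each of those measurements carry more information.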

Smarter modulations

The MIT project investigated three distinct, but closely connected, fundamental issues in lensless imaging, starting with how to guide designers towards the best system architectures for lensless applications in general. A new design framework created by the Group provides a set of guidelines and decision tools to suggest how the available resources in a given scenario can be deployed to recover the best image using CS, and define when the CS approach is likely to be of most benefit.

This need not necessarily involve single-pixel sensing - indeed, one of the goals of the framework is to define when a single-pixel system offers the most value and when it does not - but it is intended to answer practical design questions, such as where particular sensors are best positioned, or under what conditions improved time resolution would be more beneficial than additional detectors.

A second project thread was to examine the role of time-resolved signals, and clarify how an improved temporal resolution can reduce the number of individual modulated signals needed to build up a high quality image. The third area of investigation related to the optimization of those individual patterns of modulated light, and ways to squeeze more information out of each one.

"Compressive sensing allows you to modulate the light in a smarter way," said Satat. "Without it, you have to do a large number of measurements to obtain a high-resolution image, which takes time. The geometry of the system will significantly affect the time-resolved measurements, since points closer to the detector are going to be measured first while points further away will be measured later. Modulating the light allows you to accommodate this effect and obtain more information per measurement, and so potentially need fewer modulation patterns to obtain a full result."

Optics in challenging environments

Time-resolved sensing and femto-photography have already played a part in several projects in the Camera Culture Group under its leader Ramesh Raskar, an indication of the diverse practical uses that can arise from precise optical time-of-flight measurements.

Notable breakthroughs have included the use of time-resolved sensing to effectively image through scattering media by collecting and assessing all of the photons emerging after scattering events, rather than attempting to sift the most useful ones from the group. Another line of research studied ways to measure the bidirectional reflectance distribution function (BRDF), a complex parameter related to the reflection of light from an object that is a vital element in the creation of realistic computer graphics and VR animations.

"To measure the BRDF completely is a challenge, and involves a moving illumination source," said Satat. "A Group project avoided these difficulties by using a streak camera to measure the BRDF without a moving light source, recognizing instead that time-of-flight and the angle of reflection are related, and allowing us to recover more information from the time-resolved signals."

Other Group projects have included terahertz time-gated spectral imaging for analysis of layered structures, likely to be valuable in industrial inspection but already deployed in an eye-catching proof-of-concept to read the pages of a closed book; and using an open-ended bundle of optical fibers to image locations that would be difficult for a conventional monolithic camera instrument to reach, effectively turning the fibers into a group of scattered individual pixels whose locations in space are not precisely defined.

And outside MIT, a recent indication of the potential value of lensless single-pixel camera systems has come from M Squared and the University of Glasgow, with the development of an instrument that directly images methane gas leaking from a ruptured pipeline in real time.

"Some of the interesting uses for single-pixel systems are in exactly these kinds of places, in challenging environments where it is hard to build and maintain conventional optical instruments," commented Satat. "Our work in the Camera Culture Group aims to help this process by showing ways to augment any lensless or single-pixel system, and suggesting combinations of hardware and computational techniques that can not only enhance existing applications but also enable entirely new ones."

MIT Media Lab video



About the Author

Tim Hayes is a contributor to Optics.org.
