18 Apr 2017
Researchers in Finland, Germany and Israel help 'democratize' the broadband sensor technology.
by Andrew Williams
In recent months, a number of organisations around the world have made great strides in the development and application of miniaturized hyperspectral imaging and sensing hardware.
One of the stand-out developments comes from the VTT Technical Research Centre of Finland, whose team claimed late last year that they had created the world's first hyperspectral mobile device by converting an iPhone camera into a new kind of optical sensor.
Over recent years, the VTT group has developed various types of novel hyperspectral imagers for applications ranging from skin-cancer detection to drone-based environmental monitoring and lightweight imagers for CubeSat space applications - all based around tunable optical filters built from Fabry-Pérot interferometers (FPIs).
Food quality sensing
According to research team leader Anna Rissanen, staff at VTT have also recently developed a new breed of MEMS-based FPIs that can be produced at high volume and low cost - making them cheap enough for use in smartphones.
"We wanted to show that with MEMS FPI technology it's possible to turn a regular iPhone into a hyperspectral imager," says Rissanen.
Based on their past work with more conventional hyperspectral imaging hardware, the VTT team believes that the new technology could be suitable for a variety of different applications depending on the wavelength range - but has already singled out food quality sensing and health monitoring as the likely key sectors.
"The whole potential of the technology can only come out once the hyperspectral data becomes available for application algorithm developers, which is the key to finding novel ways to interpret our surroundings," Rissanen told optics.org.
Although confidentiality agreements dictate that she is not currently able to share detailed information about the ongoing commercialization of the MEMS-based hyperspectral imagers, Rissanen does reveal that VTT is aiming to engage with different types of companies, supply chains and end-user needs to find different potential paths for commercialization.
The Technical Research Centre has certainly made solid progress in establishing partnerships with a wide range of established companies and startups in recent years - including Senop Oy (formerly Rikola), which worked with VTT to commercialize hyperspectral imagers for drone applications, and Revenio, which is currently commercializing a hyperspectral camera for skin cancer diagnostics.
Reaktor Space Labs is also actively involved in joint efforts to bring hyperspectral imagers to the CubeSat market.
"As a research institute, VTT does not sell products, so our aim is to work with company partners for the commercial development of new camera prototypes," added Rissanen. "The markets and applications will be defined by those partners."
Another initiative focusing on smartphone-integrated hyperspectral imaging for food monitoring and safety is the new HawkSpex app under development at Fraunhofer IFF in Germany, which can be used to scan apples for pesticide residues.
An initial laboratory-based version of the app has already been successfully trialled at IFF - with a commercial launch currently slated for late 2017.
Elsewhere, the Israeli start-up Unispectral - established in March 2016 by co-founders Ariel Raz and David Mendlovic, following a research project at Tel Aviv University aimed at developing the technology - is also working on a novel hyperspectral digital camera.
The company believes that, as well as being built into smartphones, the new technology could be used for a number of applications in the burgeoning field of computer and machine vision.
"The majority of the captured images today are stored, shared and presented to humans - and the imaging technology is aimed at mimicking our eyes," explains Amir Lehr, chief business officer at Unispectral. "Looking forward, we believe that a hefty proportion of captured images will instead serve a computer for analysis and decision-making."
Most cameras today mimic the human eye, which registers light between roughly 400 and 700 nm, by splitting light into red, green and blue pixels arranged in a Bayer pattern - while a typical CMOS image sensor is sensitive into the near-infrared, up to around 1000 nm.
According to Lehr, more flexibility in the wavelengths captured within and beyond the visible spectrum holds out the potential to enhance human-visible images in low-light and other situations.
Unispectral achieves this flexibility by building a tunable optical filter as a MEMS device implementing a Fabry-Pérot interferometer: displacing the device's two parallel optical surfaces precisely selects the wavelength of the transmitted light.
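The tuning relation behind such a filter is straightforward: an ideal Fabry-Pérot cavity transmits wavelengths for which an integer number of half-waves fits the round-trip optical path, so m·λ = 2·n·d·cos(θ) for mirror gap d. The sketch below illustrates this relation only; the gap values and band limits are illustrative assumptions, not Unispectral's actual design parameters.

```python
import numpy as np

def fpi_peak_wavelengths(gap_nm, n=1.0, theta_rad=0.0, band=(400.0, 1000.0)):
    """Transmission peaks of an ideal Fabry-Perot cavity.

    Constructive interference occurs when m * wavelength = 2 * n * d * cos(theta),
    so each integer order m yields one transmitted wavelength; only peaks
    falling inside `band` (nm) are returned.
    """
    lo, hi = band
    path = 2.0 * n * gap_nm * np.cos(theta_rad)   # round-trip optical path, nm
    m_max = int(path // lo)                       # highest order inside the band
    m_min = max(1, int(np.ceil(path / hi)))       # lowest order inside the band
    return [path / m for m in range(m_min, m_max + 1) if lo <= path / m <= hi]

# Displacing the mirrors retunes the passband: a 350 nm gap transmits 700 nm
# light in first order; shrinking the gap to 300 nm shifts the peak to 600 nm.
print(fpi_peak_wavelengths(350.0))  # -> [700.0]
print(fpi_peak_wavelengths(300.0))  # -> [600.0]
```

Note that a larger gap passes several interference orders at once (e.g. a 700 nm gap transmits both 700 nm and ~467 nm), which is why real FPI cameras pair the cavity with order-sorting filters or sensor-side filtering.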
"The filter design and the optical surface coatings set the filter transmission curves," says Lehr. "The filter is integrated into the camera lens structure and displaces - or is added to - the Bayer RGB filter pattern."
A driver and a set of algorithms manage the camera and the filter according to the particular application requirements; combining this solution with a conventional CMOS image sensor delivers hyperspectral imagery across 400-1000 nm. "A combination with different image sensors would yield a different wavelength range," he points out.
The development appears to be catching some attention. Following a $7.5 million Series A funding round led by Jerusalem Venture Partners (JVP) and involving Robert Bosch Venture Capital, the Samsung Catalyst Fund and the Tel Aviv University Technology Innovation Momentum Fund just over a year ago, Unispectral has already identified several potential commercial applications.
Sectors expected to benefit from what Lehr describes as the 'democratization' of hyperspectral sensing include pharmaceuticals, precision agriculture, and food intelligence.
"We believe that wellbeing and food-related applications, in business-to-business and business-to-consumer markets, would also be early adopters of this emerging capacity," Lehr says.
Although coy about sharing any more specific details at this stage, he claims that the company is currently engaged with several "leading global companies" to drive the new capacities inherent in the technology to the market.
"We are targeting multiple hosting platforms and aim to deploy the solution starting in 2018 and expand thereafter," he told optics.org.
About the Author
Andrew Williams is a freelance writer based in Cardiff, UK.