Better digitization of transparent objects builds more faithful 3D scenes

26 Sep 2017

Technical University of Denmark technique could assist both VR and analysis of material properties.

Techniques for optical acquisition of physical objects and their surroundings lie at the heart of virtual reality (VR) systems and material design applications, but the digitization process becomes more challenging when those objects are translucent or transparent.

In particular, the acquisition modalities needed for clear materials and transparent objects differ from those used for items with more diffuse reflectance properties, and reassembling a convincingly accurate scene from the combined results of those different modalities is a challenge in its own right.

A project at the Technical University of Denmark (DTU) has now developed a multimodal digitization pipeline that captures transparent objects more accurately, through measurement of bidirectional reflectance distribution functions (BRDFs) and high dynamic range (HDR) imaging of the lighting environment. The work was published in Applied Optics.
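
The paper itself does not include code, but merging a bracketed exposure sequence of the lighting environment into a single HDR radiance map is a standard step in this kind of pipeline. A minimal sketch using OpenCV's Debevec calibration and merge, with hypothetical file names and exposure times that are not taken from the study, might look like this:

```python
import cv2
import numpy as np

# Hypothetical bracketed exposure sequence of the lighting environment (light probe).
exposure_files = ["probe_1_250s.jpg", "probe_1_60s.jpg", "probe_1_15s.jpg", "probe_1_4s.jpg"]
exposure_times = np.array([1 / 250.0, 1 / 60.0, 1 / 15.0, 1 / 4.0], dtype=np.float32)

images = [cv2.imread(f) for f in exposure_files]

# Recover the camera response curve from the bracketed shots ...
calibrate = cv2.createCalibrateDebevec()
response = calibrate.process(images, exposure_times)

# ... then merge the exposures into a single floating-point HDR radiance map.
merge = cv2.createMergeDebevec()
hdr = merge.process(images, exposure_times, response)

# Store in Radiance format so a renderer can use it as an environment map.
cv2.imwrite("environment.hdr", hdr)
```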

"By more accurately digitizing transparent objects, our method helps move us closer to eliminating the barrier between the digital and physical world," said Jonathan Stets, co-leader of the research team. "For example, it could allow a designer to place a physical object into a digital reality and test how changes to the object would look."

Computed tomography (CT) scans are one modality for digitizing transparent objects, while structured light scanning is an established technique for scanning non-translucent items. The initial acquisition stage of the pipeline developed at DTU combined both methods, and applied them to scenes containing a range of transparent glassware and opaque material.

A key development was the use of a robotic arm to control the precise locations of two cameras recording the scene. The detailed spatial information that this provided meant that researchers could take photographs of a scene, remove the transparent object and scan it in a CT scanner, and then place it back into both the real scene and the final rendered digital recreation, to accurately compare the final images pixel by pixel.

Faithful recreation
This kind of quantitative comparison between a photograph and a rendered image has not been possible before, according to the project team, since such judgments have previously had to be made by eye.
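
As an illustration only, and not the authors' code, a pixel-by-pixel comparison of a reference photograph against the rendered recreation can be expressed as a difference image plus a single error figure. The file names below are hypothetical:

```python
import cv2
import numpy as np

# Hypothetical file names: the reference photograph and the rendered recreation,
# produced with the same calibrated camera parameters so the two are pixel-aligned.
photo = cv2.imread("reference_photo.png").astype(np.float64) / 255.0
render = cv2.imread("rendered_scene.png").astype(np.float64) / 255.0

assert photo.shape == render.shape, "images must be the same size and pixel-aligned"

# Per-pixel absolute difference shows where the digital scene deviates from reality.
diff = np.abs(photo - render)

# A single-number summary over all pixels and colour channels.
rmse = np.sqrt(np.mean((photo - render) ** 2))
print(f"RMSE between photograph and rendering: {rmse:.4f}")

# Save the difference image for visual inspection.
cv2.imwrite("difference.png", (diff * 255).astype(np.uint8))
```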

"Transparent objects take most of their appearance from the surroundings, which complicates digitization of scenes with transparent and opaque objects," Jeppe Frisvad of DTU commented to Optics.org.

"We take reference photographs of the scene and then pick it apart, to have the different objects scanned using different imaging modalities. This includes a full digitization of both surface geometry, material reflectance, and lighting environment. We then developed techniques for digital reassembly of these scans, to obtain a digital version of the physical scene that can be rendered and compared directly with the reference images."

Judging the accuracy of photorealistic rendering usually relies on perceptual experience and on whether the end result looks as expected, but the use of a calibrated camera controlled by a robotic arm provides precise camera parameters for the digital version of the scene.
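
As a generic illustration of why those camera parameters matter, and not a reproduction of the paper's calibration pipeline, a pinhole projection matrix can be assembled from calibrated intrinsics and the pose reported by the robot arm, then reused for the virtual camera. All values below are hypothetical:

```python
import numpy as np

# Hypothetical calibrated intrinsics: focal lengths and principal point in pixels.
K = np.array([[2400.0,    0.0, 960.0],
              [   0.0, 2400.0, 540.0],
              [   0.0,    0.0,   1.0]])

# Hypothetical pose for one robot-arm position: rotation R and translation t
# taking world coordinates into the camera frame.
R = np.eye(3)
t = np.array([[0.0], [0.0], [1.5]])

# Standard pinhole projection matrix P = K [R | t]; feeding the same parameters to the
# virtual camera makes the rendering directly comparable to the photograph.
P = K @ np.hstack([R, t])

def project(point_world):
    """Project a 3D world point to pixel coordinates."""
    homogeneous = np.append(point_world, 1.0)
    u, v, w = P @ homogeneous
    return np.array([u / w, v / w])

print(project(np.array([0.1, -0.05, 2.0])))
```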

"In this way, our work enables better assessment of the accuracy of digitized physical objects, so that we know if they faithfully represent their physical counterparts when used in a virtual reality," said Frisvad.

The project has also produced a dataset of photographed glass objects and their corresponding CT scans, which could help in testing future techniques for the reconstruction of transparent objects. The photographed reference images would serve as input, while the CT scans would provide reference surface geometry of the glass objects, Frisvad noted, making the dataset a testbed for developing vision techniques able to handle notoriously difficult transparent objects.

"What we find particularly interesting is that our digitization pipeline enables us to consider photographs as empirical evidence," Frisvad said. "This means that we can modify optical properties of the materials or different steps in the pipeline, and check the validity of such changes by quantitative comparison between the rendered images and the empirical evidence of the photographs."

About the Author

Tim Hayes is a contributor to Optics.org.
