
Illumination simulator snapped up by Hollywood

14 Aug 2014

Car industry also interested - as lighting and design innovations debut at this week's Siggraph expo.

Creating a computer simulation of how light illuminates a room is crucial to achieving a realistic impression in computer-animated movies such as “Toy Story” or “Cars”. Such computing methods require great effort and processing power.

Now, computer scientists from Saarbrücken, Germany, have developed a novel approach that has been adopted by film companies in record time - including Pixar, well-known in the movie industry for its computer animation, and now a subsidiary of the Walt Disney Company.

The realistic depiction of light transport in a room is important in the production of computer-generated movies. If it does not work properly, the three-dimensional impression is quickly lost. Hence, the movie industry’s digital lighting experts use special computing methods, which require enormous computational power and therefore raise production costs.

Carmakers' interest

Not only in the film industry but also in the automobile industry, companies are investing significantly to make the lighting of computer-generated designs and images as realistic as possible. During the development process, entire computing centers are used to compute and display realistic pictures of complex car models in real time. Only by this means can designers and engineers evaluate the design and optimise product features at an early stage.

“Nowadays, they build hardly any real solid prototypes. Hence, the designers want to make sure that the car body on the screen looks exactly the way the real vehicle will appear later,” said Philipp Slusallek, professor of computer graphics at Saarland University, Scientific Director at the German Center for Artificial Intelligence and Director of Research at the Intel Visual Computing Institute at Saarland University.

Monte Carlo or bust

Slusallek said that, with current computing methods, it has not been possible to compute all illumination effects efficiently. So-called Monte Carlo path tracing can accurately depict direct light incidence on surfaces and the indirect illumination produced by light reflecting off surfaces in a room. But this approach does not work well for illumination around transparent objects, such as semi-transparent shadows cast by glass, or for the patterns of light focused by specular surfaces, so-called caustics. These cases are the strength of so-called photon mapping, but that method in turn gives disappointing results for the direct lighting of surfaces.

His international team reformulated photon mapping as a Monte Carlo-compatible process, so that it could be integrated directly into the Monte Carlo path-tracing method. For every pixel of the image, the new algorithm decides automatically, via so-called multiple importance sampling, which of the two strategies is best suited to compute the illumination at that spot.
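The weighting idea behind multiple importance sampling can be illustrated in a few lines. The sketch below is not the VCM algorithm itself; it is a minimal one-dimensional example of the balance heuristic, with two sampling strategies standing in for the path-tracing and photon-mapping estimators. All function names are illustrative assumptions, not from the published method.

```python
import math
import random

# Integrand: sharply peaked near 0, so strategy A (exponential-shaped
# density) matches it well, while strategy B (uniform) covers the domain.
def f(x):
    return math.exp(-4.0 * x)

def pdf_a(x):  # strategy A: truncated exponential density on [0, 1]
    return 4.0 * math.exp(-4.0 * x) / (1.0 - math.exp(-4.0))

def sample_a():  # inverse-CDF sampling for strategy A
    u = random.random()
    return -math.log(1.0 - u * (1.0 - math.exp(-4.0))) / 4.0

def pdf_b(x):  # strategy B: uniform density on [0, 1]
    return 1.0

def sample_b():
    return random.random()

def balance_weight(p_this, p_other):
    # Balance heuristic: weight each sample by its own strategy's density
    # relative to the sum of all strategies' densities at that point.
    return p_this / (p_this + p_other)

def mis_estimate(n):
    # Draw one sample from each strategy per iteration; the weighted
    # contributions sum to an unbiased estimate of the integral of f.
    total = 0.0
    for _ in range(n):
        xa = sample_a()
        total += balance_weight(pdf_a(xa), pdf_b(xa)) * f(xa) / pdf_a(xa)
        xb = sample_b()
        total += balance_weight(pdf_b(xb), pdf_a(xb)) * f(xb) / pdf_b(xb)
    return total / n

random.seed(1)
print(mis_estimate(20000))  # exact value of the integral is (1 - e^-4)/4
```

Because the weights of the two strategies sum to one at every point, regions where one density fits the integrand poorly are automatically dominated by the other strategy, which is the same mechanism VCM uses to pick between path-traced and photon-mapped contributions per pixel.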

The researchers from Saarbrücken also supplied a mathematical proof that the results of the new computing method agree with those of the two earlier methods.

He explained, “Our new method vastly simplifies and speeds up the whole calculating process.”

The method, called Vertex Connection and Merging and abbreviated as VCM, was not only accepted at SIGGRAPH 2012 - one of the most important conferences in the computer graphics research field - but was also well received by industry.

“We know of four different companies that partially integrated VCM into their commercial products only a few months after the scientific publication. The most recent example is the new version of the RenderMan software developed by Pixar. For decades this has been the most important tool in the movie industry. We are very proud of this achievement,” added Slusallek. Pixar is notable for producing movies such as “Toy Story,” “Up,” “Finding Nemo,” and “Monsters, Inc.” Now part of the Walt Disney Company, Pixar has so far received twelve Oscars for its movies.

Slusallek and his research group are presenting a new scientific paper at the SIGGRAPH 2014 conference, in Vancouver, this week (August 10-14). They are demonstrating that the new VCM method can be implemented efficiently on highly parallel graphics processing units. As this research has been funded by the American semiconductor producer Intel, among others, the researchers will be presenting their results at Intel’s SIGGRAPH booth.

‘Spectacular’ 3D sketching system revolutionizes design collaboration

In a related development, researchers from Canada’s Université de Montréal have been showcasing their Hyve-3D system at the SIGGRAPH 2014 Conference. Their big boast is that “collaborative three-dimensional sketching is now possible with our Hyve-3D system”. Hyve-3D stands for Hybrid Virtual Environment 3D.

Lead researcher Professor Tomás Dorta, of the university's School of Design, explained, “Hyve-3D is an interface for 3D content creation via embodied and collaborative 3D sketching. This is a full-scale immersive 3D environment in which users create drawings on hand-held tablets. They can then use the tablets to manipulate the sketches to create a 3D design within the space”.

For example, designers immersed in their work could design the outside of a car and then actually get inside it to work on the interior detailing. Univalor, the university's technology commercialization unit, is supporting the market launch of the system.

The 3D images are the result of an optical illusion created by a widescreen high-resolution projector, a specially designed 5m-diameter spherically concave fabric screen and a 16in dome mirror projecting the image onto the screen. The system is driven by a MacBook Pro laptop, a tracking system with two 3D sensors, and two iPad mini tablets. Each iPad is attached to a tracker.

Dorta added, “The software takes care of all the networking, scene management, 3D graphics and projection, and also couples the sensors’ inputs and iPads. The iPads run a satellite application, which serves as the user interaction front-end of the system. Specialized techniques render the 3D scene onto a spherical projection in real-time. The Hyve-3D software also works on conventional 2D displays.”

About the Author

Matthew Peach is a contributing editor to optics.org.
