3D scanning is not a new technique in visual effects, but the push toward more sophisticated effects with heightened realism, coupled with shrinking budgets and shortened production schedules, has prompted renewed interest as well as inventive R&D in this area. That work has made scanning a viable option for everything from detailed models of talent and full locations to previsualization.
“The driving force in using scanning for commercials is [that one can create] realistic models in a reasonable length of time; this also gives the agency more latitude in creativity,” explains Nick Tesi, VP of 3D scanning firm Eyetronics. “[Scanning coupled with additional visual effects techniques] gives the commercial visual effects team the ability to do movie magic by having an object or person do or have something done to them that would not be practical or possible. It also helps the production speed by offloading [the modeling task] to a more efficient system that allows the commercial house to put their CGI talent on other aspects of the project.”
Eyetronics, headquartered in Leuven, Belgium, with a U.S. base in Torrance, Calif., has developed and operates a portable scanning system and service; the team travels with the system to the set to scan people, objects, cars and the like.
“In most cases we will go to the set to do the scanning with the actors in wardrobe,” Tesi says. “We have also scanned cars in commercials, which can be at our location or close to our studio location.
“Our scan takes a half-hour or less,” he continues. “So for commercials we go on set and scan at the availability of the director. The key for us working on commercials is also flexibility and the ability to turn models over quickly.”
A recent example is the use of Eyetronics’ portable scanning system by West Hollywood-based visual effects house Ring of Fire for “Basketball,” a commercial in the Nike LeBron campaign out of Wieden+Kennedy, Portland, Ore. The spots feature basketball star LeBron James playing various roles. “Basketball” was helmed by Stacy Wall of bicoastal Epoch Films.
Ring of Fire creative director Jerry Spivack explains that during production the Eyetronics team brought its scanning device to the location shoot and did the scanning between setups on the day of the production.
“We did a full 360 degrees of LeBron’s head and four or five facial expressions for each,” he relates. Spivack reports that it took five to seven minutes per “character,” a plus as they had a very short window with the athlete. “I like Eyetronics because they have a flexible and fast system,” Spivack says. “The flexibility to bring it on set and shoot there was really great. … A few years ago I would not have thought of [scanning on Nike ‘Basketball’] as an approach because of time. For most commercial schedules, it would not have worked.”
Eyetronics supplied the wireframe models, textures and all the other needed information to Ring of Fire, which used the data in its 3D environment to create the LeBron characters, which were then composited into the final scenes of the commercial.
Additional spots produced using Eyetronics scanning include Santa Monica-based The Syndicate’s British Telecom “Network” ad, a 2005 nominee for a Visual Effects Society (VES) Award for outstanding visual effects in a commercial. Santa Monica-based effects/animation house Method Studios used the scanning technique for Pepsi’s “Dancetron,” out of Spike DDB, New York.
And Venice-based 2D/3D animation house Motion Theory tapped Eyetronics for visual effects work on Electronic Arts’ “Mechanical Warriors” for Wieden+Kennedy.
LIDAR R&D is also advancing the use of scanning. One example involves a new process that starts with traditional LIDAR (Light Detection and Ranging) scanning: collecting potentially millions of points from a location, each measured by distance, and connecting them into a polygon mesh, an early stage of a 3D model. Traditionally the mesh needs to be cleaned up or remodeled before it is usable, a time-consuming process.
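To make the underlying data concrete, the rough sketch below (hypothetical, and not code from Eyetronics or Gradient) shows how angle-and-distance LIDAR measurements become 3D points and then a raw polygon mesh, here using Python with the open-source Open3D library; the synthetic measurements and the reconstruction settings are illustrative assumptions.

```python
import numpy as np
import open3d as o3d  # open-source 3D library, standing in for any proprietary toolset

# --- 1. A LIDAR unit records sweep angles plus a measured distance per sample. ---
# Synthetic stand-in data: 50,000 samples of a roughly spherical "set piece".
n = 50_000
azimuth = np.random.uniform(0.0, 2.0 * np.pi, n)         # horizontal sweep angle
elevation = np.random.uniform(-np.pi / 4, np.pi / 4, n)  # vertical angle
distance = 5.0 + 0.02 * np.random.randn(n)               # measured range in meters

# Convert each (azimuth, elevation, distance) measurement to an x, y, z point.
x = distance * np.cos(elevation) * np.cos(azimuth)
y = distance * np.cos(elevation) * np.sin(azimuth)
z = distance * np.sin(elevation)
points = np.column_stack([x, y, z])

# --- 2. Connect the point cloud into a polygon mesh (the "polymesh"). ---
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(points)
pcd.estimate_normals()  # surface normals are required by the reconstruction step

mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)
o3d.io.write_triangle_mesh("set_scan.obj", mesh)  # e.g. for import into a 3D package
```

The result is exactly the kind of raw polymesh that, in a traditional pipeline, would still need manual cleanup before it is production-ready.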
LIDAR scanning is predominantly used in engineering. Venice-based Gradient Effects has developed a LIDAR-based technique designed from a filmmaker’s point of view to streamline, simplify and reduce the processing time of the sampled data while creating highly detailed 3D scenes from a set or location. (Gradient was founded last May by Thomas Tannenberger, co-founder of Germany’s Das Werk, and Olcun Tan, who previously held R&D positions at companies including London’s Mill Film and Los Angeles’ DreamWorks Animation.)
“[With Gradient’s technique] it is possible to scan and immediately utilize the data on location, where traditionally the data had to be processed before it could be used at all,” explains Tan. “Our tools allow us to utilize the data right after the scanning process in our favorite 3D applications [such as Maya]. This reduces the cost of utilizing a scanner on location dramatically. Since Gradient Effects is foremost a visual effects house, we stripped away everything in the process that we feel is not necessary.”
Tan emphasizes that the system was not developed with the idea of selling it as a service; rather, it was conceived to streamline Gradient’s own visual effects pipeline and to enhance the quality of the work. “The whole thing started out of necessity,” Tan says. “Gradient Effects works very closely with directors and producers, and to be able to meet the creative and budgetary requirements, we had to streamline the process of visual effects both on location and back in the visual effects facility.”
The process involves the use of LIDAR scanning coupled with proprietary software written at Gradient. “In-house we use it for previz, modeling, animation, simulations and lighting,” Tan explains. “Our tools can inversely reconstruct camera information [metadata]; this means that we really don’t need any camera notes. Another benefit is that we don’t need to have as many tracking markers on set as usual, which results in less interference with the filming process.”
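As a general illustration of that idea (a sketch of the standard computer-vision approach, not Gradient’s proprietary code), camera position and orientation can be recovered by matching a few recognizable features from the scanned set to where they appear in a filmed frame; the example below uses OpenCV’s solvePnP routine with made-up point correspondences and assumed lens intrinsics.

```python
import numpy as np
import cv2  # OpenCV; its standard PnP solver stands in for an in-house tool

# 3D positions of recognizable set features, taken from the LIDAR scan (meters).
# All values here are illustrative placeholders.
object_points = np.array([
    [0.0, 0.0, 0.0],   # e.g. the corner of a doorway
    [1.2, 0.0, 0.0],
    [1.2, 2.1, 0.0],
    [0.0, 2.1, 0.0],
    [0.6, 1.0, 0.8],
    [1.8, 0.5, 1.5],
], dtype=np.float64)

# Pixel positions of those same features in one frame of the footage.
image_points = np.array([
    [410.0, 620.0],
    [780.0, 615.0],
    [775.0, 120.0],
    [405.0, 125.0],
    [595.0, 370.0],
    [890.0, 260.0],
], dtype=np.float64)

# Assumed camera intrinsics: focal length and principal point in pixels.
K = np.array([[1500.0,    0.0, 960.0],
              [   0.0, 1500.0, 540.0],
              [   0.0,    0.0,   1.0]])

# Solve for the camera's rotation and translation relative to the scanned set.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
if ok:
    R, _ = cv2.Rodrigues(rvec)       # rotation vector -> rotation matrix
    camera_position = -R.T @ tvec    # camera location in set coordinates
    print("recovered camera position (m):", camera_position.ravel())
```

With enough such correspondences per frame, the same data that builds the set model can also reconstruct where the camera was, which is why fewer physical tracking markers are needed on set.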
The Gradient system is still relatively new and is available for commercial and feature productions; it has already been used on several films, including the soon-to-be-released Richard Kelly-directed Southland Tales and the Mennan Yapo-directed Premonition. (Tannenberger was visual effects supervisor on both films.)
“On Premonition and Southland Tales, we pushed our scanning process further into production,” Tan relates, explaining that it was tapped for previz by using the actual location data. “Since we have a digital fingerprint of the location we can easily determine how many cameras will be required and where to place them. We worked with all departments together to help them communicate better and to save resources and [prevent] surprises.”
Gradient’s scanning techniques were also used as part of the special effects process. “We were able to determine all safety distances from the explosives and we could still make sure that the director was able to achieve his artistic goals,” Tan says. “And the best was, we were able to brief the whole team without moving out of the production offices. That saves a lot of money for the production and the visual effects. You can ramp your resources as you really need them.”