The road to the red carpet isn't carpeted. Instead, it's often an uncharted path–or sometimes there's no path at all, requiring filmmakers to construct one to meet assorted creative and technical challenges.
So SHOOT looked to gain further insight into the paths taken, tapping VFX and animation artisans who were behind several of the notable projects to emerge this awards season, such as:
• Harry Potter and the Deathly Hallows Part 2, nominated for the Visual Effects Oscar.
• Hugo, an Academy Award nominee in the Visual Effects category, and earlier this month the winner of two VES Awards® for Outstanding Supporting Visual Effects in a Feature Motion Picture, and for Outstanding Virtual Cinematography in a Live Action Feature.
• Real Steel, which earned a Best Visual Effects Oscar nomination.
• Rise of the Planet of the Apes, also nominated for a Visual Effects Oscar as well as recipient of two VES Awards–for Outstanding Visual Effects in a Visual Effects-Driven Motion Picture, and for Outstanding Animated Character (Caesar) in a Live Action Feature Motion Picture.
• Transformers: Dark of the Moon, Academy Award-nominated for Visual Effects, and a two-time VES Award winner–for Outstanding Created Environment in a Live Action Feature Motion Picture, and for Outstanding Models (Driller) in a Feature.
• DirecTV’s “Hot House,” one of the spots that helped director Noam Murro of Biscuit Filmworks earn the DGA Award as Best Commercial Director of 2011. Out of Grey New York, “Hot House” was also nominated in the VES category for Outstanding Compositing in a Broadcast Program or Commercial.
• And Volkswagen’s “Black Beetle,” one of the commercials that helped director Dante Ariola of MJZ earn his sixth career DGA Award nomination. The spot came out of Deutsch LA.
SHOOT posed the following question to VFX artisans involved in these projects:
What was (were) the greatest creative and/or technical challenge(s) from a visual effects standpoint posed by your Oscar, VES or DGA-nominated work?
Here’s a sampling of the feedback we received:
Matthew Butler, Digital Domain, VFX supervisor, Transformers: Dark of the Moon (ILM was the lead VFX house on this film; DD's Butler is one of the movie's Oscar-nominated artists.)

Transformers: Dark of the Moon brought Digital Domain back together with director Michael Bay and ILM. It also brought the franchise to audiences in stereo 3D for the first time. Our primary tasks were to create and animate a number of new and returning characters, a sequence where protoforms rise out of the Moon's surface and escape through a space portal, and one featuring live-action skydiving soldiers bailing out of flaming helicopters and flying over a destroyed Chicago.

That "Birdmen" sequence had several challenges. Michael knew that he wanted to shoot it with an overcranked camera–at 120 fps. Creatively, that hyper-slow motion is beautiful. Technically, it's incredibly complex. The plate was shot monoscopically because of the high-speed camera, which meant we had to dimensionalize it later. There's a completely CG aircraft, on fire, falling through the frame right in front of the audience's eyes. Creating fire and fluid simulations that were shown in slow motion meant being completely exposed. Because of the detail in that CG Osprey and its associated fire and smoke, we wanted to avoid dimensionalizing the CG work. So, instead of turning over the whole shot for conversion, we designed a CG camera and gave our Stereo Group the specs to match, so they could dimensionalize the plate elements only. We then integrated our CG elements into it.

The "Moon Portal" sequence was challenging in that the Moon's environment has no atmosphere, with dynamics and visual attenuation that are different from Earth's. It's quite beautiful but smacks of unreality. To achieve a more believable look we used real elevation map data to generate accurate topography, and studied the Moon in hundreds of reference photos. For animation, we had to consider that gravity on the Moon is 1/6th of Earth's, so the protoforms had to move according to the laws of physics without appearing to run in slow motion.

Also, the movie was shot with many different cameras–stereo digital F35s, anamorphic 35mm film mono, spherical 35mm film mono from the Spacecam work and stereo SI-2K, as well as 16mm color and black-and-white. Because Michael wanted the Decepticon vision to feel real and handheld, we also had video. Each format has its own idiosyncrasies, and we developed approaches to accommodate their differences and to integrate all of this footage as seamlessly as possible.
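An illustrative aside on Butler's 1/6th-gravity point (a back-of-the-envelope sketch, not Digital Domain's tooling): free-fall time scales as 1/√g, so the same drop takes √6, roughly 2.45 times, longer on the Moon than on Earth. That longer timescale is exactly what the eye misreads as slow motion, which is why the protoforms' animation had to be physically correct without feeling overcranked.

```python
import math

G_EARTH = 9.81          # m/s^2
G_MOON = G_EARTH / 6.0  # the "1/6th of Earth's" figure cited above

def fall_time(height_m: float, g: float) -> float:
    """Free-fall time from rest over a given height: t = sqrt(2h / g)."""
    return math.sqrt(2.0 * height_m / g)

h = 2.0  # hypothetical drop height in meters, purely for illustration
t_earth = fall_time(h, G_EARTH)
t_moon = fall_time(h, G_MOON)
print(f"Earth: {t_earth:.2f} s, Moon: {t_moon:.2f} s, "
      f"ratio: {t_moon / t_earth:.2f}x")  # ratio is sqrt(6), about 2.45x
```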
Ben Grossmann, Pixomondo, VFX supervisor, Hugo

The greatest technical challenge on Hugo was also a creative challenge, as we had to adapt our techniques to reflect Marty's [Martin Scorsese's] vision for early filmmaking. Hugo revolves around the pioneer of our craft, Georges Méliès, and as such, we wanted to pay homage to him as much as possible. It gave us a great challenge to do things differently and create some of the visual effects the way he would have.

Méliès was a genius. We didn't fully appreciate his work until we really started studying his films during pre-production. Once we had absorbed his body of work more fully, we came away with the inspiration and techniques to tackle many of the film's visual challenges. Rather than immediately jump to the latest technology for every challenge Marty gave us, we started with the simplest and oldest methods first, and worked our way to modern-day approaches as a "last resort."

For example, we had to create a shot where Sacha Baron Cohen is dragged by a train, but we had a train that couldn't move. In his films, Méliès would create the illusion of something moving or growing by actually moving the opposite thing. So instead of moving the train, we built the set around it on wheels, positioned actors and props on it, and moved the platform. It gave us a convincing, nearly in-camera solution to a 100-year-old problem.

Another scene required a wind-up mouse to give a directed performance. Rather than make a computer-generated mouse, we opted for another old Méliès trick: stop-motion animation. The mouse prop was animated and photographed one frame at a time to create the illusion of movement, and then composited with the live-action performances of Sir Ben Kingsley and Asa Butterfield. We also used a lot of fun old tricks–like miniatures and time-lapse photography–that swept us into the spirit of Méliès' work and helped create the essence of the film.

Although cutting-edge technology was employed to create the stereoscopic VFX, we approached everything the way we thought Méliès the Magician might have if he were here today, and that is a big reason Hugo has resonated with audiences.
Dan Lemmon, Weta Digital, VFX supervisor, Rise of the Planet of the Apes

We had a number of significant challenges in making Rise of the Planet of the Apes. On the technical side, we had to adapt our performance capture system to work in a live-action shooting environment, which meant making it more portable, more flexible, and reconfiguring our hardware so that it could work outdoors in broad daylight. We switched from a passive reflective marker system to an active LED marker system that flashed intense bursts of infrared light directly from the performers' bodies. That allowed the markers to punch through the bright ambient light levels of the outdoor sets. We also revamped our systems so that it took far less time to set up and calibrate a performance volume, and we used new techniques to mask out problematic objects in the scene, like shiny cars or bright lights. These advances enabled us to work in cramped and cluttered sets as well as large open spaces, and helped us quickly build new performance volumes each time the motion-picture camera moved to a new setup.

Our biggest creative challenge was to preserve the actors' performances while making them look as much like real, present-day apes as possible. One thing we did was use arm extensions anytime the performers were moving quadrupedally. These extensions lengthened the performers' arms by about 10 inches, bringing their proportions and the orientation of their bodies closer to those of real apes. We also made a number of small modifications to Caesar's facial design so that he could more closely match Andy Serkis' facial expressions. There are obviously big differences between the facial anatomy of a chimpanzee and a human, so the challenge was to sneak in bits of Andy wherever we could without making Caesar look too human. We adapted Caesar's eyes to include folds above the upper eyelids that more closely matched Andy's, and we added complexity to Caesar's eyebrows, giving them a slight crease that we saw in many of Andy's facial expressions. That allowed us to translate Andy's performance more directly across to Caesar while still preserving Caesar's "chimpness."
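To make Lemmon's active-marker idea concrete, here is a minimal, hypothetical sketch of one generic way strobed markers can be pulled out of bright ambient light: difference a strobe-on frame against a strobe-off frame so static daylight cancels, then threshold what remains. This illustrates the principle only; it is not Weta Digital's actual pipeline.

```python
import numpy as np

def isolate_markers(frame_on: np.ndarray, frame_off: np.ndarray,
                    threshold: float = 50.0) -> np.ndarray:
    """Subtract a strobe-off frame from a strobe-on frame so that static
    ambient light cancels out, then keep only pixels that brightened by
    more than the threshold: candidate active-marker locations."""
    diff = frame_on.astype(np.float32) - frame_off.astype(np.float32)
    return diff > threshold

# Toy frames: uniform daylight everywhere, one pulsed IR marker at (10, 10).
ambient = np.full((64, 64), 180.0, dtype=np.float32)
strobed = ambient.copy()
strobed[10, 10] += 75.0  # the LED burst outshines the ambient level
mask = isolate_markers(strobed, ambient)
print(int(mask.sum()), "marker pixel(s) at", np.argwhere(mask))
```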
Erik Nash, Digital Domain, VFX supervisor, Real Steel

Real Steel is set in the very near future; it just happens to take place in the world of robot boxing. As such, the visual effects effort was geared toward creating a seamless and unquestioned reality to serve as the setting for our father-and-son story of redemption. The boxing robot conceit had to be realized so as to be absolutely plausible and integrated into the live photography, leaving no doubt in the audience's mind that these robots are real.

Legacy Effects built three animatronic robots that were essential to the realism director Shawn Levy wanted for interactions between the father, the son and their robots. They also served as invaluable reference for the CG counterparts we developed at Digital Domain. Making the practical and CG robots indistinguishable–making the steel real–was a big challenge. In addition to the three CG robots created as counterparts to the animatronic robots, Digital Domain modeled and animated nine additional CG robots.

Making the boxing matches between these eight-foot-tall CG robots feel authentic was a major undertaking, and we designed a virtual production workflow to help us achieve that goal. Working with Giant Studios, we first motion-captured the fights (using keyframe animation to remove the rehearsed feel of fight choreography), shot them virtually, and edited the sequences, creating extremely detailed pre-vis to guide principal photography. We brought our motion capture team and hardware with us on location to Detroit. This enabled Shawn and the camera department to shoot the boxing matches using Simul-Cam, which made the pre-captured fight animation visible through the shooting camera, in the practical environment, with complete spatial and temporal accuracy. This gave us the potential to imbue the fight cinematography with a visceral immediacy impossible to achieve shooting empty plates in a traditional manner.

Another key creative challenge was to enable the audience to connect with the lead robot, Atom, without the benefit of dialogue or facial expression. Atom's character arc had to be communicated through subtle body movements and personal interactions outside the ring. All three parties involved in creating his performance–puppeteer Jason Mathews of Legacy Effects, mocap performers Garrett Warren and Edie Davenport, and the Digital Domain animation team–had to make Atom an appealing and sympathetic character through subtle and nuanced body language alone.
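Simul-Cam itself is proprietary, but the core idea Nash describes, making pre-captured animation visible through the tracked shooting camera, reduces to re-projecting captured 3D points through the live camera's pose every frame. A toy pinhole-projection sketch follows; all numbers (focal length, principal point, the joint position) are hypothetical.

```python
import numpy as np

def project(point_world, cam_pos, cam_rot, focal_px, cx, cy):
    """Project a world-space point through a tracked camera using a simple
    pinhole model. cam_rot is a 3x3 world-to-camera rotation matrix."""
    p_cam = cam_rot @ (np.asarray(point_world, float) - np.asarray(cam_pos, float))
    if p_cam[2] <= 0:
        return None  # point is behind the camera this frame
    u = focal_px * p_cam[0] / p_cam[2] + cx
    v = focal_px * p_cam[1] / p_cam[2] + cy
    return (u, v)

# Toy setup: one pre-captured robot joint five meters in front of a camera
# sitting at the origin and looking straight down +Z (identity rotation).
joint = [0.5, 1.8, 5.0]
print(project(joint, cam_pos=[0, 0, 0], cam_rot=np.eye(3),
              focal_px=1500, cx=960, cy=540))  # -> (1110.0, 1080.0)
```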
David Vickery, Double Negative Visual Effects, VFX supervisor, Harry Potter and the Deathly Hallows Part 2

It was obvious from the outset that the hugely varied creative and technical hurdles posed to visual effects by The Deathly Hallows Part 2 were going to be some of the most challenging of the series so far. Creative industries like ours are driven by rapid technological developments. It's inevitable that at some point after any project, whether it's six months or two years later, you will look back at your hard work and know you can improve on it. Part of the beauty of Harry Potter is that it's kept letting us go back to it. Deathly Hallows Part 2 was the eighth movie in the series, and we worked hard to make each one more spectacular than the last. Directors and moviegoers alike quite rightly expect everything to be "better" than last time, and being asked to continually reinvent ideas does not get any easier with experience.

One of the brand-new visual effects challenges on Deathly Hallows Part 2 was Hogwarts itself. For each of the previous seven films, Hogwarts had primarily been realized through a combination of practical locations, art department-fabricated sets and scale miniatures, much of which no longer existed. Shooting the Battle of Hogwarts using the existing 1/24th-scale miniature would have required many hundreds of model shots. Set pieces would have needed to be rebuilt at multiple scales and in many states of repair. A shoot of this magnitude would have cost the production many months of painstaking motion control work.

The solution for Deathly Hallows Part 2 was to replace the Hogwarts miniature with a fully digital counterpart. Vital collaboration with the art department produced over 1,400 architectural blueprints, the result of more than 10 years' worth of set construction and miniature builds. VFX had the daunting task of sorting and cataloguing these drawings. The 3D build required more than two years' worth of modelling and texturing from a team of over 30 artists. The school itself was made up of over 74 individual 3D buildings, each modelled to three levels of detail. All of this combined to create an asset built from more than 7 million polygons, covering a virtual landscape that stretched for over seven miles.

In order to realize this digital set in enough detail to serve as the arena for the series' grand finale, nothing could be left out. The team referenced heritage castles and cathedrals around the UK in order to model the digital school as it would have been physically built. In the computer we constructed roofs from individual slate tiles, all supported by 3D battens, joists and rafters. Towers contained winding stone staircases with beautifully detailed handrails. As we destroyed the school, this wealth of detail we had built in really helped bring it to life.
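Vickery's "three levels of detail" is a standard level-of-detail (LOD) scheme: each building carries several meshes, and the renderer selects one based on distance from the camera. A minimal sketch of the selection logic, with hypothetical distance cutoffs (the article gives no actual thresholds):

```python
from dataclasses import dataclass

@dataclass
class Building:
    name: str
    lods: tuple  # meshes ordered high -> low detail (three levels, per Vickery)

def pick_lod(building: Building, distance_m: float,
             cutoffs=(200.0, 1000.0)):
    """Pick a mesh by camera distance. The cutoff distances here are
    hypothetical; Double Negative's actual values are not given."""
    if distance_m < cutoffs[0]:
        return building.lods[0]  # hero detail: slates, battens, handrails
    if distance_m < cutoffs[1]:
        return building.lods[1]  # mid detail for the middle ground
    return building.lods[2]      # coarse stand-in for distant silhouettes

tower = Building("tower_01", ("tower_lod0", "tower_lod1", "tower_lod2"))
for d in (50.0, 500.0, 5000.0):
    print(f"{d:>6} m -> {pick_lod(tower, d)}")
```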
Tom Bussell, head of 3D, The Mill; Volkswagen's "Black Beetle"

When a project is predominantly based on animation, the director has to take a leap of faith; our strong relationship with Dante Ariola meant he trusted us creatively and technically. The reality of such a quick turnaround and so much CGI in a commercial like "Black Beetle" is that it only comes together in the final few days, and with the spot debuting during the 2011 Super Bowl, the pressure was on.

The biggest challenge Dante presented us with was getting the hero beetle's design just right. In a car commercial with no actual car, we needed our beetle to subtly reference the VW Bug's design without the insect feeling too engineered. Although our brief was to create an insect that behaved like a car, it was important to stay anatomically correct in order for the animation to be believable. If you look closely, you can make out subtle shapes in the beetle's shell that act as wheel arches, the eyes are headlamps, and the silhouette in profile is very similar to the VW's design. To achieve this, we studied nature documentaries on insects and gathered slow-motion footage, building our digital insects with this in mind. We also referenced iconic movie car chases, including Starsky & Hutch, The Fast and the Furious, The Matrix and Bullitt.

Another big creative challenge was ensuring all of the FX around the action looked realistic. We created eight main insects and, in cases like the mantis and the ants, tweaked each to be unique. For all the other insects, we matched them to how nature intended them to be. That was the easier part… We wanted to go that extra step to ensure the texturing was as close to nature as possible, so we approached London's Natural History Museum. They helped us find the specific creatures we needed so we could take high-res photographs that would then be used to texture our CGI insects, combining these stills with hand-painted textures in Photoshop to get the final result. We then took the model back into ZBrush to add the final details before rendering in XSI and Mental Ray.

All of the background FX were done in Maya, and we added particle atmosphere like pollen and small flying insects. Again we used a range of references, from radio-controlled cars skidding through dust to cars driving through the desert, although we had to use some artistic license here to give the drama Dante's script needed.
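As a rough illustration of the "particle atmosphere" Bussell mentions (The Mill did this work in Maya; the sketch below is a language-neutral stand-in, not their setup), the core of such a system is particles that inherit a steady wind drift plus a little per-frame random turbulence:

```python
import random

class Mote:
    """One pollen-like particle with a position and velocity in meters."""
    def __init__(self):
        self.pos = [random.uniform(0.0, 10.0) for _ in range(3)]
        self.vel = [random.uniform(-0.1, 0.1) for _ in range(3)]

def step(motes, dt=1.0 / 24.0, wind=(0.05, 0.01, 0.0)):
    """Advance every mote one frame: a steady wind drift plus a random
    turbulence kick, so the cloud drifts coherently but never uniformly."""
    for m in motes:
        for axis in range(3):
            m.vel[axis] += random.gauss(0.0, 0.02) * dt  # turbulence kick
            m.pos[axis] += (m.vel[axis] + wind[axis]) * dt

motes = [Mote() for _ in range(100)]
for _ in range(24):  # simulate one second at 24 fps
    step(motes)
print("sample mote position:", [round(c, 3) for c in motes[0].pos])
```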
Michael Wynd, visual effects supervisor, MPC LA; DirecTV's "Hot House"

DirecTV's "Hot House" was the fourth of the "frozen moment"-themed commercials we'd produced at MPC LA, but the first we'd completed with Noam Murro directing. Our first challenge was to give Noam the ability to tell the story as he wished without getting too bogged down in the technical issues demanded by postproduction. We wanted to keep the camera fluid without requiring shots to be locked off or motion controlled, and we needed to develop the means to place the observer and child talent into the inferno without them wearing any fire protection. This basically meant creating fire where there was none.

Creatively, the issue was how to get in and out of the frozen moments, a question common to all the commercials in the series. Noam's collaborative approach meant that we all had a clear understanding of what was to take place on the shoot and what expectations were being placed on the visual effects. Noam's team–DP Simon Duggan and production designer Bruce McCloskey–along with the special effects provided by Full Scale Effects, gave us the foundation upon which to build a great commercial.

Obviously the frozen fire was an enormous technical challenge. We used Maya's fluid tools to create many of the fire elements, and it was interesting to see in our initial tests just how fluid-like frozen fire actually looks. Some very fine tweaking was required to ensure that the frozen fire actually looked like fire and not a liquid.

The shoot itself was spectacular. Buildings on a sound stage set on fire…repeatedly! Between takes we were literally waiting for the smoke to clear. From that point forward our task was to enhance the elements that were provided and, where necessary, create a blazing inferno in a room that wasn't set alight. MPC's toolset of Flame, Nuke and Maya all came together on this project. Fire, smoke, debris, embers, shadows, highlights, reflections and breaking structures were all digitally created or enhanced, then tracked into the live-action footage to produce the final product.
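Wynd's "how to get in and out of the frozen moments" problem can be pictured as a retime curve: camera time keeps advancing while simulation time ramps to a hold and then resumes. A small illustrative sketch, with hypothetical frame numbers rather than anything from MPC's actual workflow:

```python
def sim_frame_for(camera_frame: int, freeze_start: int,
                  freeze_end: int) -> int:
    """Map camera time to simulation time: the fire sim plays normally,
    holds on one frame for the frozen moment, then resumes where it left
    off (offset by the length of the hold)."""
    if camera_frame < freeze_start:
        return camera_frame                            # normal playback in
    if camera_frame <= freeze_end:
        return freeze_start                            # held: fire is frozen
    return camera_frame - (freeze_end - freeze_start)  # playback out

# Hypothetical numbers: freeze the sim from camera frames 40 through 90.
for f in (30, 40, 65, 90, 91, 100):
    print(f"camera {f:>3} -> sim {sim_frame_for(f, 40, 90)}")
```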