Visual effects supervisor Rob Legato used a wide variety of Blackmagic Design products to create the virtual production environment for Disney’s “The Lion King.” The film, which was directed by Jon Favreau and features the voices of Donald Glover and Beyoncé Knowles-Carter, has earned nearly $1.5 billion worldwide since its July 19, 2019 opening.
With the technology available today, producing a 3D animated feature film doesn’t have to be a process of waiting for test animations from an animation team. Visual effects supervisor Rob Legato, an Academy Award winner for films such as “Hugo” and “The Jungle Book,” wanted to take the technology to a new level and create a space where traditional filmmakers could work in a digital environment, using the familiar tools found on live-action sets. “The goal wasn’t to generate each shot in the computer,” said Legato, “but to photograph the digital environment as if it were a real set.”
Bringing beloved characters back to the big screen in a whole new way, the story journeys to the African savanna, where a future king must overcome betrayal and tragedy to assume his rightful place on Pride Rock. Like the original 1994 movie from Disney Animation, an amazing accomplishment in 2D animation for its time, the 2019 version pushed the abilities of modern technology once more, this time using advanced computer graphics to create a never-before-seen photorealistic style. But beyond the final look, the project embraced new technology throughout, including a cutting-edge virtual environment during production.
The production stage where “The Lion King” was shot might look strange, with unusual devices filling the main floor and an array of technicians behind computers around the perimeter, but these were just the bones of the process. To begin shooting, director Favreau and cinematographer Caleb Deschanel wore headsets that placed them in the virtual world of Mufasa and Simba.
Rather than forcing the filmmakers to adapt to digital tools, Legato modified physical filmmaking devices to work within the virtual world. A crane unit was fitted with tracking devices so the computers could recreate its motion precisely in the digital environment. Even a Steadicam was brought in, allowing Deschanel to move the camera virtually with the same tools as on a live-action shoot. The goal was to let production create in a traditional way, using standard tools that existed not just on the stage but in the computer. “In traditional previs you would move the camera entirely within the computer,” said Legato. “But in our virtual environment, we literally laid down dolly track on the stage, and it was represented accurately on the digital set.”
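The core idea is that a tracked physical rig drives the virtual camera one-for-one. The following is a minimal illustrative sketch of that mapping, not the production’s actual pipeline code: the data layout, units and yaw/pitch/roll convention are assumptions made for the example.

```python
# Minimal sketch (illustrative only): turning one tracked rig sample into a
# virtual camera transform. Field names, units and rotation order are assumed.
import numpy as np

def pose_to_matrix(position, rotation_deg):
    """Build a 4x4 camera-to-world matrix from a tracked position (metres)
    and yaw/pitch/roll angles (degrees), applied in Z-Y-X order."""
    yaw, pitch, roll = np.radians(rotation_deg)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    m = np.eye(4)
    m[:3, :3] = rz @ ry @ rx
    m[:3, 3] = position
    return m

# A single tracked dolly/crane sample (placeholder values) drives the virtual
# camera directly, so a physical move on stage is reproduced on the digital set.
tracked_sample = {"position": [1.2, 0.0, 1.6], "rotation_deg": [15.0, -3.0, 0.0]}
virtual_camera_matrix = pose_to_matrix(tracked_sample["position"],
                                       tracked_sample["rotation_deg"])
print(virtual_camera_matrix)
```

In practice the real-time engine consumes a continuous stream of such samples every frame, which is what allows dolly track laid on the stage to show up accurately in the digital set.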
Blackmagic Design was not simply a part of the system, but the backbone for the process, providing the infrastructure for the virtual world as well as the studio as a whole. “We used Blackmagic products first as video routing for the entire building,” said visual effects producer Matt Rubin, “and at every stage of handling video, from capturing footage shot by the team using DeckLink cards, through Micro Studio Camera 4Ks as witness cameras, Teranex standards converters and various ATEM video switchers such as the ATEM Production Studio 4K and ATEM Television Studio HD.”
Editorial and visual effects were networked together via Smart Videohub routers, giving both departments access to the screening room and allowing either to serve as a source of shots for screenings. During virtual production, as the computers generated the virtual environment, DeckLink capture and playback cards captured the footage and played it out over the video network, feeding a control station and recording to HyperDeck Studio Minis.
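Smart Videohub routers can be driven remotely over Blackmagic’s text-based Videohub Ethernet Protocol, which is one way such a routing setup can be automated. The sketch below shows switching a single output to a new source; the IP address and the input/output numbers are placeholders, and this is not the production’s actual control tooling.

```python
# Minimal sketch: routing one Videohub output to a different source over the
# text-based Videohub Ethernet Protocol (TCP port 9990). Address and
# input/output numbers below are placeholders, not the production's setup.
import socket

VIDEOHUB_IP = "192.168.1.50"   # placeholder address for the Smart Videohub
SCREENING_ROOM_OUTPUT = 3      # placeholder output number (zero-based)
EDITORIAL_INPUT = 7            # placeholder input number (zero-based)

with socket.create_connection((VIDEOHUB_IP, 9990), timeout=5) as conn:
    conn.recv(65536)  # the router announces its current state on connect
    command = f"VIDEO OUTPUT ROUTING:\n{SCREENING_ROOM_OUTPUT} {EDITORIAL_INPUT}\n\n"
    conn.sendall(command.encode("ascii"))
    reply = conn.recv(4096).decode("ascii")
    print("Routed" if reply.startswith("ACK") else f"Router replied: {reply}")
```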
Once footage was shot and captured onto computers, the setup was turned over to visual effects company MPC to create the photorealistic imagery. Throughout the process of reviewing footage and maintaining an up-to-date edit, Legato and his team used DaVinci Resolve Studio and DaVinci Resolve Advanced Panels in two suites, with Legato applying color to shots as a guide for the final colorists. The DaVinci Resolve project was often updated several times a day with new footage from MPC. Because Legato only screened for Favreau in the context of the cut, rather than showing individual shots, it was important to be able to balance shots to provide a smooth screening experience. The team shared a DaVinci Resolve database so members around the facility could view the same timeline without tying up a screening room.
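DaVinci Resolve Studio also exposes a Python scripting API, which is the kind of hook a team could use to keep a frequently updated project fed with new vendor renders. The sketch below is a hypothetical example of that workflow: the project name and delivery path are invented, and nothing here is the production’s actual tooling.

```python
# Minimal sketch using DaVinci Resolve Studio's Python scripting API to pull a
# batch of new renders into a shared project. The project name and delivery
# folder are hypothetical placeholders.
import glob
import DaVinciResolveScript as dvr_script

resolve = dvr_script.scriptapp("Resolve")            # attach to the running Resolve
project_manager = resolve.GetProjectManager()
project = project_manager.LoadProject("Shared Cut")  # hypothetical shared project

media_storage = resolve.GetMediaStorage()
media_pool = project.GetMediaPool()

# Hypothetical delivery folder holding the latest renders from the VFX vendor.
new_shots = sorted(glob.glob("/mnt/deliveries/latest/*.mov"))
if new_shots:
    clips = media_storage.AddItemListToMediaPool(new_shots)
    print(f"Imported {len(clips)} new shots into '{project.GetName()}'")

# Jump to the color page so grades can be balanced in context of the cut.
resolve.OpenPage("color")
```

Because the project lives in a shared database, any suite that loads it sees the same updated timeline, which is what lets reviews happen without tying up the screening room.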
Despite the cutting-edge systems used to shoot the film virtually, the final product reflects the true art form of filmmaking, simply because real cinematography tools and a creative workflow were available throughout. “The virtual environment created a truly flexible world to shoot in,” said Legato. “From Caleb being able to move the sun to achieve the right time of day, to the art director placing trees or set pieces during production, the virtual world gave us an amazing platform to shoot the movie. It was definitely a new type of filmmaking, one with all the trappings of standard production, but even more flexibility to be creative.”