It’s an exciting time for our industry. With access to more affordable, more powerful and more flexible tools, we now have the ability to create seamless effects in less time and with higher quality.
A whole new world of tools, including match moving, image modeling and element removal utilities, is allowing us to integrate synthetic elements into live action sequences with totally transparent results. At a recent Visual Effects Society Festival, held in San Rafael, Calif., we were shown an amazing array of set extension, 3-D augmentation and camera tracking tricks used for movies such as Black Hawk Down, Spider-Man and Star Wars Episode II: Attack of the Clones.
Imagine removing a roller coaster car from its ride and producing a seamless clean plate for further manipulation. Imagine rebuilding a whole city using only digital stills. The use of digital set extensions has already changed the way films are shot, enabling directors to plan elaborate environments while using basic elements for the actors to interact with on set.
I’m excited about the transparency of the results.
A recent American Airlines commercial comes to mind. Every scene in this spot had an effect or was touched in some way by an artist, yet the enhancements are hard to spot. A street sign is moved, a billboard altered, a scene’s color improved. There are no obvious signs of modification; the commercial just feels nicer, more integrated.
When the shot calls for large numbers of people, new digital tools come to the rescue, creating upwards of 70,000 extras for a Lord of the Rings battle scene. These are intelligent tools that can create flocks of fish that automatically escape from a shark—or soldiers that are attracted to the leader of the "bad guys." They approach, attack, battle and die in combat based on general "rules of the game."
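To make "rules of the game" concrete, here is a minimal sketch in Python of how such rule-based agents can work. It illustrates the general principle only; the proprietary crowd tools used on these films are far more sophisticated, and every name and number below is invented for the example. Each fish blends two simple rules: drift toward the flock and flee the shark when it gets close.

```python
# A minimal, hypothetical sketch of rule-based crowd agents: each "fish"
# steers by blending simple rules (stay with the flock, flee the shark).
import math
import random

class Agent:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

    def step(self, flock, shark, dt=0.1):
        # Rule 1: cohesion -- drift toward the flock's center of mass.
        cx = sum(a.x for a in flock) / len(flock)
        cy = sum(a.y for a in flock) / len(flock)
        ax, ay = (cx - self.x) * 0.05, (cy - self.y) * 0.05

        # Rule 2: flight -- accelerate away from the shark when it is close.
        dx, dy = self.x - shark[0], self.y - shark[1]
        dist = math.hypot(dx, dy)
        if 0 < dist < 5.0:
            ax += (dx / dist) * 2.0
            ay += (dy / dist) * 2.0

        # Integrate velocity and position.
        self.vx += ax * dt
        self.vy += ay * dt
        self.x += self.vx * dt
        self.y += self.vy * dt

# Usage: a small school of fish avoiding a stationary shark.
flock = [Agent(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(50)]
shark = (5.0, 5.0)
for _ in range(100):
    for agent in flock:
        agent.step(flock, shark)
```

Production systems layer many more rules on top, along with terrain awareness and libraries of motion clips, but the steering-by-blended-rules principle is the same.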
In the opening scene of the movie Black Hawk Down, where thousands of people approach a Red Cross food distribution center, a huge portion of the actors are digital. They look totally real, with no sliding and no obvious patterns. The job of surveying the huge area in which the scene unfolds was monumental in itself, as the creators worked mostly from digital stills for reference.
The digital actors are not limited to extras alone. Many of the shots in Spider-Man used a digital stuntman derived from cyber scans of Tobey Maguire. Shots that would have been impossible or extremely dangerous became possible, including a tremendous sweeping shot of Spider-Man flying over a city street. Using the digital tools allowed the movie to keep a very human feel. As the animators became better acquainted with the character and got used to animating the actor and the camera, their movements became smoother and more precise, just as they would have in real life.
By using new photogrammetry software, large-scale environments are being reproduced in the digital world, generating very accurate data that allows our artists to do whatever they want in these environments without worrying about huge production budgets or dangerous situations. Want to add two floors to that building? No problem. Need more cars? No problem. The cars have to fly? No problem. All it takes is time and the talent of the artist. "Money is no object," remember?
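The math underneath photogrammetry is classic triangulation: the same point, photographed from two cameras whose positions are known, pins down one 3-D location. Here is a toy sketch, assuming idealized pinhole cameras; the matrices and the point below are invented for illustration.

```python
# A toy sketch of the triangulation at the heart of photogrammetry:
# recovering a 3-D point from its projections in two calibrated views.
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation from two 3x4 projection matrices
    and the image coordinates of the same point in each view."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize to get (x, y, z)

# Two simple pinhole cameras one unit apart along the x axis.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# A known 3-D point and its projections into each camera.
point = np.array([0.5, 0.2, 4.0, 1.0])
x1 = (P1 @ point)[:2] / (P1 @ point)[2]
x2 = (P2 @ point)[:2] / (P2 @ point)[2]

print(triangulate(P1, P2, x1, x2))  # recovers ~[0.5, 0.2, 4.0]
```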
Illumination has also been revolutionized with ambient lighting simulations and image files that contain all possible exposures of a particular still image. These new tools will allow us to re-light a set, lighten or darken areas of an environment and possibly modify the original lighting of a scene after it’s been shot. In the short term we might even be able to computer-control the lighting in our blue-screen shoots to match that of the background plate, producing a more convincing element.
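Those image files that "contain all possible exposures" are high-dynamic-range radiance maps, assembled from a series of bracketed photographs of the same scene. The sketch below shows a deliberately simplified version of the merging step; real tools also recover the camera's response curve (as in Debevec and Malik's 1997 technique), which this example skips.

```python
# A minimal sketch of merging bracketed exposures into one
# high-dynamic-range radiance estimate per pixel.
import numpy as np

def merge_exposures(images, exposure_times):
    """images: list of float arrays scaled to [0, 1];
    exposure_times: shutter times in seconds.
    Returns an estimate of scene radiance per pixel."""
    numer = np.zeros_like(images[0])
    denom = np.zeros_like(images[0])
    for img, t in zip(images, exposure_times):
        # Trust mid-range pixels most; under/over-exposed ones least.
        w = 1.0 - np.abs(img - 0.5) * 2.0
        numer += w * (img / t)  # each shot estimates radiance as value/time
        denom += w
    return numer / np.maximum(denom, 1e-6)

# Usage: three synthetic exposures of a uniformly lit patch.
radiance_true = 0.8
times = [0.01, 0.1, 1.0]
shots = [np.clip(np.full((4, 4), radiance_true) * t, 0, 1) for t in times]
print(merge_exposures(shots, times))  # ~0.8 everywhere
```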
The use of miniatures has also come of age in the new digital world. Many set extensions and even whole sets were shot as miniatures in HD for Star Wars Episode II: Attack of the Clones.
Motion control data and match moving tools were used to match the live action photography to the miniatures. In some cases, the motion control camera was actually driven by 3-D animation data from software used for previsualization.
I’m excited about the possibilities.
Imagine being able to create totally realistic water, clouds and fire. Imagine being able to animate liquids that transition from normal behavior to fantastic action and can take on particular shapes. Imagine clouds that organize themselves to mimic an actor’s actions. New tools are allowing us to create the impossible with less trouble, more control and no staff programmers required.
The ability to give characters the human touch is the next Holy Grail. The small details that make an animated character look real, the little twitch of an eye and the slight jiggle of skin are all needed for an extra dose of reality.
Even puppets and animatronics are joining in. New technologies in puppet control allow traditional puppeteers to apply their skills to animatronics. They can imbue them with a human touch, while being able to repeat or even edit the "acting" of their characters. Virtual sets that follow the camera, such as Douglas Trumbull’s "Magicam," create virtual environments for real puppets as well.
I’m excited about all these and more.
Being able to collaborate with colleagues from all over the world, sharing ideas and insight. Instant software updates. Even same-day FedEx!
We are experiencing the best that technology has to offer and it keeps getting better. The future holds advancements in animation, integration and collaboration. These advanced systems are the new tools of the trade and I like my trade.