How Animators at ILM Helped Call the Shots
Adding New Scenes After the Shoot Wraps
Nelson cites the “suit-up” sequence as an example of how much the VFX team was encouraged to collaborate in a creative way. The scene takes place when Stark has perfected the third and final version of his Iron Man suit, the red and gold Mark III. Ben Snow, visual effects supervisor at ILM, explains: “This was a sequence we pretty much created out of thin air. Jon Favreau got very excited about the shot with all the bits moving on the silver suit and felt like people would like a glimpse of what it was like underneath.”
The “silver suit” was the Mark II, the suit Stark builds at home after his escape from the terrorists. To create it, ILM had to match the brushed metal practical suit built by Winston Studio, which was a major technical challenge. The minuscule grooves and imperfections in the brushed metal surface reflected light in such varying ways that even the shaders created for Transformers could not duplicate the look of the practical suit. ILM’s Doug Smythe and Pat Myers developed a texture-map-controlled shader that, using thousands of lines of RenderMan shader code, modulated localized parameters of basic shader functions to imitate the complex reflections caused by the brushed metal of the silver suit and, later, the red and gold Mark III.
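To make the idea concrete, here is a minimal sketch, written in Python rather than RenderMan, of what “texture-map-controlled” means: values sampled from control maps locally modulate the parameters of a basic specular model so the highlight varies across the surface. The function names and numbers are illustrative assumptions, not ILM’s shader.

```python
# Hypothetical sketch (not ILM's RenderMan code) of a texture-map-controlled shader:
# control-map samples locally vary the parameters of a basic specular model,
# imitating the uneven micro-grooves of brushed metal.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade_brushed_metal(n, v, l, rough_ctrl, spec_ctrl):
    """Blinn-Phong stand-in: rough_ctrl and spec_ctrl are 0..1 samples from
    control texture maps that locally vary roughness and specular strength."""
    n, v, l = normalize(n), normalize(v), normalize(l)
    h = normalize(tuple(a + b for a, b in zip(v, l)))   # half vector
    exponent = 8.0 + (1.0 - rough_ctrl) * 120.0         # rougher groove -> broader highlight
    spec_gain = 0.2 + spec_ctrl * 0.8
    diffuse = max(dot(n, l), 0.0) * 0.15                # metals are mostly specular
    specular = spec_gain * max(dot(n, h), 0.0) ** exponent
    return diffuse + specular

# Example: the same geometry and light, two different control-map samples.
print(shade_brushed_metal((0, 0, 1), (0, 0, 1), (0.3, 0.2, 0.93), 0.8, 0.4))
print(shade_brushed_metal((0, 0, 1), (0, 0, 1), (0.3, 0.2, 0.93), 0.1, 0.9))
```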
Once ILM perfected the match, the resulting digital suits had several advantages over the practical suit beyond allowing Stark to fly. The practical suit, although beautiful, didn’t have moving parts: the gadgets and weapons hidden inside slick, sliding surfaces. It couldn’t have the trim waist of a superhero and still fit an actor inside. And it was uncomfortable for the actors to wear.
“We started by designing our suit like the practical suit because we knew we’d have to intercut, but Jon [Favreau] asked us to take it further,” says Hal Hickel, ILM’s animation supervisor. “Within reason, we were not bound by the compromises of fitting a practical suit on a human. So, our design process happened after Stan’s [Winston] design process. Once that was approved, we added all the gadgets, the flying surfaces and the weapons.”
And that led to Favreau wanting to show how it all worked. “He wanted to add the suit-up sequence after principal photography,” Nelson says. “It was a classic example of ‘how are we going to make this?’”
“What Can You Come Up With?”
Nelson found a 360-degree Steadicam plate shot in Stark’s workshop that he and Kent Seki at PLF used to pick camera angles, and he worked with storyboard artist Philip Keller on designs. “We gave those to ILM and they ran with it,” Nelson says. “It was probably half to two-thirds designed.”
At ILM, modeling supervisor Bruce Holcomb and modeler Russell Paul brainstormed with Snow and art director Aaron McBride to design the machines and the sequence. “Jon liked part of the animatic, but he wanted something different,” says Snow. “He asked, ‘What can you come up with?'” McBride created concept art, Holcomb and Paul created the under-layer of armor for Stark and the complicated robotic arms, and Hickel worked with creature technical director Keiji Yamaguchi, who rigged and animated the machines. “He’s our resident creature genius guy with a great affinity for mechanical motion,” says Hickel.
In the sequence, we see Downey Jr. looking around his workshop as the huge robotic arms apply the heavy-metal parts piece by piece to his body. To put the shot together, ILM’s digimatte artist Richard Bluff created a virtual background from high-resolution stills the studio had taken in the workshop during a sequence when Downey Jr. as Stark learns how to fly. To have something onto which they could place the suit parts, the studio shot a stand-in in a wetsuit on a blue-screen stage.
“We basically created the surrounds out of stills projected onto geometry,” says Snow. “Then we created all the machines, the suit, everything else in CG, and used some bluescreen shots of Robert [Downey Jr.]. It’s now possible, when a filmmaker decides to add more shots, to accommodate that, and to cut from the real set to the virtual set. And, it was tremendous fun to do that.”
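A rough sketch of the projection idea Snow describes, under the assumption of a simple pinhole camera: each point on the proxy set geometry is mapped back through the camera that took the still to find the pixel that should texture it. The names and values below are hypothetical, not ILM’s pipeline.

```python
# Rough sketch of camera projection: a high-resolution still "textures" proxy set
# geometry by mapping each surface point back through the still camera to a pixel.
import numpy as np

def project_point_to_still(point_world, cam_pos, cam_rot, focal_px, image_size):
    """Return the (u, v) pixel in the still that textures this surface point."""
    # Transform the world-space point into the still camera's space.
    p_cam = cam_rot.T @ (np.asarray(point_world, float) - np.asarray(cam_pos, float))
    if p_cam[2] <= 0:
        return None  # behind the camera that took the still
    # Pinhole projection onto the image plane (ignoring lens distortion and y-flip).
    u = focal_px * p_cam[0] / p_cam[2] + image_size[0] / 2
    v = focal_px * p_cam[1] / p_cam[2] + image_size[1] / 2
    if 0 <= u < image_size[0] and 0 <= v < image_size[1]:
        return (u, v)
    return None  # point falls outside this still; use another projection

# Example: a point on the proxy workshop wall, seen by a still camera at the origin.
identity = np.eye(3)
print(project_point_to_still((1.0, 0.5, 4.0), (0, 0, 0), identity, 2000.0, (4096, 2732)))
```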
For this shot, ILM could use the stand-in. For shots with Downey Jr. fully enclosed in the suit, the studio often relied on motion capture data of a performer brought into ILM’s mocap stage. And once it became obvious that no one could tell the real from the digital, Downey Jr. often shed some of the cumbersome practical shell for shots in which Stark tests parts of the suit, and frequently for shots of the suited Iron Man without his helmet, knowing ILM could add digital armor later. For those shots, Downey Jr. wore red iMocap “pajamas.”
ILM had developed iMocap, a process that captures data from actors on set during principal photography, for Pirates of the Caribbean: Dead Man’s Chest. For Iron Man, the studio tweaked the algorithms to provide enough accuracy for precise digital costuming. “We had to have a really tight track of his body movement so the suit looked like it was on him,” says Hickel. The studio points proudly, for example, to the shot in which Stark is first learning to fly. In that shot, one of his gauntlets is real and the other is CG.
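As a toy illustration of why the track has to be so tight, consider rigidly attaching an armor piece to a tracked body part: any error in the tracked transform shows up one-for-one on the digital suit. This is an assumption-laden sketch, not ILM’s iMocap code.

```python
# Toy illustration of digital costuming: each CG armor piece is rigidly offset from
# a tracked body part, so tracking error appears directly on the suit.
import numpy as np

def attach_armor(joint_rotation, joint_position, piece_offset_rot, piece_offset_pos):
    """Compose a tracked joint transform with a fixed armor-piece offset,
    returning the piece's world rotation and position for this frame."""
    world_rot = joint_rotation @ piece_offset_rot
    world_pos = joint_rotation @ piece_offset_pos + joint_position
    return world_rot, world_pos

# Example: a forearm joint tracked from set footage, with a gauntlet offset.
forearm_rot = np.eye(3)                      # per-frame rotation from the track
forearm_pos = np.array([0.0, 1.2, 0.3])      # per-frame position from the track
gauntlet_rot, gauntlet_pos = attach_armor(forearm_rot, forearm_pos,
                                          np.eye(3), np.array([0.0, -0.1, 0.0]))
print(gauntlet_pos)  # any error in forearm_pos shifts the gauntlet by the same amount
```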
When Stark actually flies, though, the suit is fully CG and animated entirely with keyframe animation, based on reference material provided by Nelson, who had Giant Studios motion-capture skydivers in a wind tunnel. “We tethered them because Iron Man’s thrust has to come from his feet,” Nelson says.
Taking to the Sky
For shots in which Iron Man has an aerial dogfight with two F-22s, Nelson also provided ILM with reference, this time shooting a MiG and an F-85 from a Learjet to provide previs choreography. Sometimes the F-85 and MiG represented the F-22s, sometimes one of the jets represented Iron Man.
“The jets not only gave us something to photograph, we had real air-to-air photography done by a cameraman with realism and lag, dust flying by the lens, clouds going by, condensation,” Nelson says. “And the specular component of how the light reacted to the metal was almost like lighting-ball reference.” In addition, Snow helicoptered up to the top of a mountain in a desert to shoot HDRI tile sets that ILM used as environments to light the suit.
The previs group then cut footage of the aerial choreography together and slotted in rough animations of Iron Man to create an animatic for ILM. “Other clients might have said, ‘Here it is. This is the sequence. We might tinker with the cut, but we like most of the shots, so go for it. Put your guy in,’” Hickel says. “Jon was like, ‘Well, actually there’s a lot of this that I don’t think is really working yet and we’ve got a lot to do on our end, so you should just run with it. If you’ve got a better idea for a shot, show it to me. And, in fact, if you guys want to re-cut some of this and move the shots around …’”
So Hickel and the animators redesigned many of the shots for that sequence, and for others, as well.
“The creative partnership we had with the client was above and beyond what we usually have,” Hickel says. “I would throw shot design over to the animators saying, ‘Jon invited us to help fix some problem they had with this sequence or that sequence.’ It was very exciting.”
Better Hero Moves Through Virtual Reality
During the fight at the end of the film, for example, when Iron Man battles Iron Monger, ILM amped up many of the shots using virtual backgrounds to duplicate the filmed footage. “We started departing from the plate photography and used the virtual background process to recreate the environments,” Snow says. “With our tool set, we’re now pretty free to do that as long as we document the set really thoroughly.” Being able to reframe the cameras allowed the animators to create more heroic comic book moves.
“There’s a gag where Stark starts to take his suit off and Monger shows up,” Hickel says. “He pops his hand up and there’s no RT [repulsor technology]. That was our idea. And then, he sort of hops up and punches Monger. We used our high-res stills to spread out the set with a virtual reality background so he could jump through the air and get a big Marvel Comics pose before he comes down.”
Breaking Out the “Tweak Cam”
Similarly, for the sky fight between Iron Man and Iron Monger, the digimatte team at ILM led by Chris Stosky provided proxy backgrounds that they refined after the animators completed their work. For these shots, because there were no plates, the animators performed the characters and the camera. But even when the camera move was in the plate, the animators often refined the move using a tool called a “tweak cam” developed at ILM for Transformers.
Hickel describes how it works: “In the 3D scene you have a match-move camera and your character standing in front of a background plate. We give the animators a new camera in exactly the same spot with the same match-move animation on it that they could mess with.”
For example, animators might shake the camera when they knock over a heavy character. Or they might zoom in and frame down on the character. Before, to change the camera in a shot, they needed to kick the shot back to the match-move or layout department. On this film, the animators’ camera changes were handed to the match-movers as corner pins: the match-movers would re-register the frame in 2D, but the change was driven by what the animators had done with the 3D camera.
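One way to picture that hand-off, purely as a hypothetical sketch: project reference corners through both the original match-move camera and the animator’s tweaked camera, and use the pixel differences as the 2D corner pin that re-registers the plate. The helper below assumes a simple translational tweak and made-up camera values.

```python
# Hypothetical illustration of converting a "tweak cam" offset into a 2D corner pin:
# project four reference corners through the tweaked camera and pin the plate to
# the resulting pixel positions.
import numpy as np

def pinhole(point_cam, focal_px, size):
    return np.array([focal_px * point_cam[0] / point_cam[2] + size[0] / 2,
                     focal_px * point_cam[1] / point_cam[2] + size[1] / 2])

def corner_pin_from_tweak(focal_px, size, tweak_offset, depth=10.0):
    """For a simple translational tweak, return where the plate's four corners
    should be pinned so the background follows the animator's new camera."""
    half_w = (size[0] / 2) * depth / focal_px
    half_h = (size[1] / 2) * depth / focal_px
    corners_world = [(-half_w, -half_h, depth), (half_w, -half_h, depth),
                     (half_w,  half_h, depth), (-half_w,  half_h, depth)]
    pins = []
    for c in corners_world:
        shifted = np.asarray(c, float) - np.asarray(tweak_offset, float)  # camera moved, scene fixed
        pins.append(pinhole(shifted, focal_px, size))
    return pins

# Example: the animator nudged the camera 0.2 units right; the corners shift left in 2D.
for pin in corner_pin_from_tweak(2000.0, (1920, 1080), (0.2, 0.0, 0.0)):
    print(pin)
```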
Of course, that could have opened the pipeline to chaos had ILM not anticipated just such a possibility. “It seemed like we needed to do something all the time, but we put good processes in place to communicate that the camera changed to all the other departments,” Hickel says. “You don’t want the background rendered with one camera and the creature with another. And, everyone on this crew had been here quite a while, so I didn’t worry.”
For Nelson, techniques such as these pleased his perfectionist nature. “We never stopped changing until we couldn’t change things any more,” he says, noting that he had asked for tests from eight studios before settling on ILM, The Orphanage, and The Embassy. “We didn’t just cast the actors, we cast the vendors. Animation is acting. A lot of people would have just done the previs, but we didn’t hire them to do the previs. We hired them to do the value added. We wanted minds, not fingers. And, everyone contributed.”