How the Cool Tools Devised by Blue Sky Studios Created Twice the Toon in Half the Time
In the feature Ice Age: The Meltdown, the prehistoric
characters spend most of the film escaping from an oncoming flood.
That presented its own challenge for the filmmakers, because
animating water is one of the trickiest tasks in CG.
The schedule was a challenge, too. “We had to do the movie in half the time,” says director Carlos
Saldanha, who quickly warmed to the task. “It wasn’t an easy decision,
but when the studio asked if we wanted to face the challenge, everybody
said, ‘Let’s do it.’ The team was motivated — they loved the story and
they loved the characters.”
Saldanha, who had co-directed Ice Age and Robots, the studio’s first two features, and
directed “Gone Nutty,” an Oscar-nominated short, was ready. “The good
news was that we had a solid script up on reels,” Saldanha says. “Every
sequence had to work. We pounded on the story every day. We didn’t have
room for mistakes.” At the end, the script was so tight that there were
no outtakes for the DVD. “Sequences went from the edit room to the
animators. We didn’t have time to rethink the dialog.”
The compressed schedule affected every department, from modeling through animation and rendering. Last year’s
Robots was a testing ground for new technology that
made Ice Age: The Meltdown possible, but when the
studio turned up the heat on the schedule, the crew devised new
technology and techniques especially for this film. “We had an
adrenalin rush for a year,” says Saldanha.
The work began with character development. All the lead characters from the original
Ice Age returned: Manny the mammoth (Ray Romano),
Sid the sloth (John Leguizamo), Diego the saber-toothed tiger (Denis Leary), and Scrat
the acorn-loving squirrel (Chris Wedge). However, the modelers created
around 35 new characters with multiple variations for migrating herds
and 40 environmental sets that included more than 150 different types
of trees. In addition, the modelers had to convert the original
characters from NURBS models into subdivision surfaces.
For modeling and animation, the studio uses Autodesk’s Maya. For rendering, though, the studio relies on its own
CGI Studio, sophisticated ray tracing software that incorporates an
object-oriented graphics programming language. The renderer and other
software tools have been in development since the studio’s founding in
1987.
“The founders, and [CTO] Carl Ludwig in particular, have developed
robust solutions that are physically accurate, look correct, and work
correctly for the user with very little intervention.”
CGI Studio can ray trace curved surfaces directly rather than, as with many ray tracers, only polygonal faces. “We have
our own subdivision representation — hierarchical subdivisions that
are all parametric patches, continuous curves,” explains Ludwig. Thus,
rather than create characters in NURBS and then convert the patches
into smooth surfaces made with ever-tinier polygons, the modelers now
work with subdivision surfaces.
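The general idea behind subdivision can be seen in one dimension with a classic corner-cutting scheme. The sketch below uses Chaikin's algorithm, a textbook example and not Blue Sky's proprietary representation: a coarse control polygon is repeatedly refined and converges to a smooth quadratic B-spline curve.

```python
def chaikin(points, iterations=3):
    """Chaikin corner-cutting: each refinement replaces every edge
    with two points at its 1/4 and 3/4 marks, so the polygon
    converges to a smooth quadratic B-spline limit curve."""
    for _ in range(iterations):
        refined = []
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            refined.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
            refined.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
        points = refined
    return points

# Three coarse control points refine into a dense, smooth polyline.
coarse = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]
smooth = chaikin(coarse, iterations=4)
```

Subdivision surfaces apply the same refine-toward-a-smooth-limit idea to 2D meshes, which is what lets the modelers abandon seamed NURBS patches.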
The change paid off, according to rigger Steve Unterfrantz. “We no longer have to worry about seams,
pinching, and tearing where the NURBS surfaces touch each other. It was
a challenge for the animators to create the same performances for the
[returning] characters using new models and rigs, but there were so
many savings in other ways, it was the logical thing to do.”
Although intersecting rays with curved patches is a more complex problem for a ray tracer to solve than bouncing rays off
thousands of flat surfaces, rendering the patch-based surfaces takes
less memory, especially when a ray tracer has to calculate secondary
reflections and refractions.
To meet the schedule, the departments devised methods to speed their workflow. For new lead
characters, modelers used templates from a Polhemus scanner. To create
multitudes of characters, riggers created variations from generic
models. Character riggers also developed modular rigs and Mel scripts
that assembled the modules into characters. The modular approach not
only sped the rigging process, it meant that animators could more
easily move from character to character.
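The modular approach can be sketched generically. Blue Sky's actual tools were Mel scripts inside Maya; the Python below is only an illustration, and every module and joint name in it is hypothetical. The point is that a character becomes a named list of reusable modules plus per-character parameters.

```python
# Hypothetical sketch of modular rig assembly: each module builds a
# reusable chunk of a rig, and a character is assembled from a list
# of (module, parameters) pairs.

def spine(joints=5):
    return [f"spine_{i}" for i in range(joints)]

def leg(side, toes=3):
    return [f"{side}_hip", f"{side}_knee", f"{side}_ankle"] + \
           [f"{side}_toe_{i}" for i in range(toes)]

def assemble(modules):
    """Concatenate module joint lists into one rig skeleton."""
    rig = []
    for build, kwargs in modules:
        rig.extend(build(**kwargs))
    return rig

# The same leg module serves both sides; only parameters change.
mammoth = assemble([
    (spine, {"joints": 7}),
    (leg, {"side": "front_left"}),
    (leg, {"side": "front_right"}),
])
```

Because every quadruped shares the same leg module, an animator who learns one character's leg controls already knows the next character's.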
The animation crew grew to some 60 animators — nearly double the number on Ice Age — and only
eight of them had animated the original characters. “Four or five of us
set up libraries for the new animators that showed how a character
moves and why it moves that way, to make sure the other 55 or 60 people
always stayed on model,” says Aaron Hartline, an animation lead.
Animators worked on all the characters in a shot, rather than following individual characters.
And they worked fast. “We were so focused, so
direct, it was almost like we stuck with the first gut instinct that
came out,” says Saldanha. “I’d say, ‘OK, this is the scene and this is
what we need. Go, go, go, go.’ We didn’t overthink or second-guess. I
think that rawness of the idea is there and that’s what I love the
most.”
The animators usually stayed with their first approach, he says. “We’d just tweak that raw performance to get
the best out of it. There was something good about that, something
fresh.”
With the studio’s fur system, groomers set parameters for guide hairs that control the
hair’s style; the system then procedurally generates millions of hairs
on the creatures’ coats using these parameters. It’s a typical
fur-grooming system. However, three things about the fur system stand
out: procedural motion is rigged into the guide hairs, no texture maps
are used, and a voxel-based system renders the fur.
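The guide-hair idea can be illustrated with a toy interpolation. This is a minimal sketch, not Blue Sky's code: each generated hair blends the style parameters of the guides with a distance-based weight, so a handful of groomed guides controls millions of hairs.

```python
import math

# Each guide hair carries style parameters; here, (skin position,
# length, curl). All values are invented for illustration.
guides = [
    ((0.0, 0.0), 4.0, 0.1),
    ((1.0, 0.0), 2.0, 0.8),
    ((0.5, 1.0), 3.0, 0.4),
]

def grow_hair(pos):
    """Blend guide parameters with inverse-square distance weights."""
    weights, length, curl = 0.0, 0.0, 0.0
    for gpos, glen, gcurl in guides:
        d = math.dist(pos, gpos)
        w = 1.0 / (d * d + 1e-6)     # nearby guides dominate
        weights += w
        length += w * glen
        curl += w * gcurl
    return length / weights, curl / weights

# A hair grown exactly on a guide inherits that guide's style.
length, curl = grow_hair((0.0, 0.0))
```

A production system would evaluate something like this procedurally at render time rather than storing every hair, which is why no per-hair data needs to be authored.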
A procedural tool called “follow-through,” which was developed by Adam Burr, calculates the
fur’s movement by predicting a result based on previous motion.
Animators control stiffness, the amount of follow-through, and
something Maurer calls “settle time.”
“[The tool] isn’t solving the problem from a dynamics point of view. It’s solving
it almost from a probability point of view. You need a team of people
to finesse a simulation engine. This tool is automatic.”
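A simple lag filter gives the flavor of such a tool. This is a toy model, not Burr's algorithm: the fur tip chases the animated root, with a stiffness control and a damping factor standing in for settle time, and no dynamics solver is involved.

```python
def follow_through(root_positions, stiffness=0.3, settle=0.8):
    """Filter a 1D root motion into a lagging, overshooting tip motion.
    stiffness: how hard the tip chases the root each frame.
    settle: per-frame damping; lower values settle the tip faster."""
    tip, velocity = root_positions[0], 0.0
    out = []
    for root in root_positions:
        velocity = settle * (velocity + stiffness * (root - tip))
        tip += velocity
        out.append(tip)
    return out

# A sudden stop: the root halts at 1.0, but the tip drifts past it
# before settling back -- the overshoot animators read as follow-through.
tips = follow_through([0.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0])
```

Because the result is a cheap per-frame formula rather than a simulation, it runs automatically on every hair with no per-shot finessing, which is the property the quote emphasizes.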
The materials team relied on procedural shaders rather than painted texture maps, an approach that was tried successfully on
Robots and implemented throughout Ice Age:
The Meltdown. “You could probably count on two hands the
total number of texture maps in the whole movie,” says Brian Hill,
materials technical director (TD). The procedural shading system uses a
Maya interface over a custom-node based network and compiles the
shaders in Maya. Plug-in splines describe areas for color, noise
creates textures, and TDs use nodes to create other material
properties such as specular highlights and roughness.
Maps have other drawbacks, says Hill: “they have to be repainted, and they’re more difficult to work with.”
Unlike maps, the procedural shaders can move from character to character. “I
created a procedural material for a condor’s foot — layer after layer
of procedural elements that gave the nasty-looking claw foot fine,
cellular detail,” says Hill. “Then, I used that same network for the
other foot, for the condor chicks and with a little variation for all
the other birds. It takes more time at the beginning, but it’s much
more modular than maps.”
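The reuse Hill describes is natural once a material is a parameterized function rather than a painted image. Below is a minimal sketch with invented names, and a toy sine-based noise standing in for a real procedural noise such as Perlin; it is not the CGI Studio node network.

```python
import math

def noise(u, v, scale):
    """Cheap deterministic pseudo-noise in [0, 1], for illustration
    only; a production shader would use Perlin or similar."""
    return 0.5 + 0.5 * math.sin(12.9898 * u * scale + 78.233 * v * scale)

def claw_material(base, cell_scale, roughness):
    """Build a shading function from parameters: the 'network' is
    reusable, the parameters vary per character."""
    def shade(u, v):
        cellular = noise(u, v, cell_scale)            # fine cellular detail
        color = tuple(c * (0.7 + 0.3 * cellular) for c in base)
        spec = roughness * cellular
        return color, spec
    return shade

condor_foot = claw_material(base=(0.6, 0.55, 0.4), cell_scale=40.0, roughness=0.2)
chick_foot = claw_material(base=(0.8, 0.7, 0.5), cell_scale=25.0, roughness=0.1)

color, spec = condor_foot(0.3, 0.7)
```

Because the function is evaluated at render time at any point on the surface, there is no map resolution to outgrow in a close-up.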
The voxel-based fur rendering accomplished two goals: the fur looks
three-dimensional and the lighters’ job became easier. “The lighters
might use separate lights for the characters to help them stand out
from the background, but they didn’t need to do any special grooming or
use separate lights for the fur,” says Ludwig. “The volume-based
lighting model was integrated within the lighting system, so it acts
properly with lights in the environment and with shadows and global
illumination. When you get something that works right all the time, you
need to do less special stuff to make it look right.”
Ludwig gives another example from the materials side: “Most 3D packages have specular and diffuse sliders,”
he says, “but in the real world, specular and diffuse have an inverse
relationship. So our materials usually need only specular. Diffuse is
calculated from specular.”
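Under a simple energy-conservation assumption, Ludwig's inverse relationship can be expressed in one line: whatever fraction of incoming light reflects specularly is unavailable for diffuse scattering. A minimal sketch of that idea, not Blue Sky's actual shading math:

```python
def diffuse_from_specular(specular):
    """Derive diffuse from specular under a crude energy-conservation
    assumption: reflected light cannot also scatter diffusely."""
    if not 0.0 <= specular <= 1.0:
        raise ValueError("specular must be in [0, 1]")
    return 1.0 - specular

# A mirror-like material keeps little diffuse; a chalky one keeps most.
shiny_diffuse = diffuse_from_specular(0.9)
matte_diffuse = diffuse_from_specular(0.1)
```

One physically coupled slider, instead of two independent ones, is what keeps artists from dialing in impossible materials.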
The effects crew had to create gallons of CG water, from a water park showcased at the
beginning of the film to a threatening flood. “We know water is hard to
animate and control in CG and expensive to render,” says Rob Cavaleri,
effects supervisor. “Our challenge was to blend physical reality with a
directable caricatured style for our cartoon world. We managed to pull
it off in a year.”
The crew built the water with particles to create splashes, with procedurally animated surfaces based
on physical models of waveforms, and with simulations created with Next
Limit’s RealFlow software. Special rigs attached to characters emitted
bubbles as characters swam through the water. To create froth, during
rendering a system written in the CGI Studio language automatically
generated particles around characters and objects such as rocks, and
created a mesh around the particles to have a renderable surface. “We
could tell it how long the froth lived and how swiftly it moved,” says
Cavaleri. “The animation was all procedural; the particles moved along
wave surfaces automatically. We would never have managed without these
clever techniques.”
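The froth system's two controls, lifespan and speed, can be mimicked in a toy particle loop. The sketch below is illustrative only, not CGI Studio code, and the waveform and numbers are invented: particles spawn near an obstacle, drift along a wave surface, and die when their lifespan expires.

```python
import math
import random

def wave_height(x, t):
    """Toy procedural waveform standing in for the animated surface."""
    return 0.2 * math.sin(2.0 * x - 3.0 * t)

def simulate_froth(obstacle_x, frames, lifespan=20, speed=0.05, seed=1):
    """Spawn one froth particle per frame near the obstacle, advect it
    along the surface, and cull it once it outlives `lifespan`."""
    rng = random.Random(seed)
    particles = []                                # [x position, age]
    for t in range(frames):
        particles.append([obstacle_x + rng.uniform(-0.1, 0.1), 0])
        for p in particles:
            p[0] += speed                         # drift with the flow
            p[1] += 1
        particles = [p for p in particles if p[1] < lifespan]
    return [(x, wave_height(x, frames)) for x, _ in particles]

froth = simulate_froth(obstacle_x=1.0, frames=100)
```

The culling step is why the population stabilizes: however long the shot runs, only the last `lifespan` frames' worth of froth survives around each rock.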
The environments ranged from icebergs to grasslands to forests. Few matte paintings were used —
modelers created most of the sets. For Ice Age, the
modeling department built the sets on a per-shot basis from the camera
view as described in the layout. Despite the short timeframe, for
Ice Age: The Meltdown, modelers created complete
environments that had some details even in areas where the camera
wasn’t expected to go. “When we were working on the first Ice
Age, sometimes the director wanted to move the camera where
there was no set,” says Shaun Cusick, co-head of modeling. “There were
lots of charge-backs to other departments and it slowed things down. So
for this one we tried to build the set as an entire environment.”
Procedural shaders textured the environments as well. Thus, rendering the
environments became an interesting technical challenge that was solved
by staging the geometry from complex in the foreground to less complex
in the distance. It was only one of many challenges as the team raced
to the finish line.
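Staging geometry from complex in the foreground to simple in the distance is essentially level-of-detail selection. A minimal sketch of one common scheme, with hypothetical distance thresholds and representation names:

```python
# Distance-based level-of-detail table: each entry pairs a maximum
# camera distance with the representation used inside that range.
# All thresholds and names here are invented for illustration.
LODS = [
    (10.0, "full_geometry"),        # foreground: full complexity
    (50.0, "reduced_mesh"),         # midground: decimated mesh
    (float("inf"), "card_proxy"),   # background: cheap stand-in
]

def stage(distance):
    """Pick the first representation whose range covers the distance."""
    for max_dist, representation in LODS:
        if distance <= max_dist:
            return representation

# A row of trees at increasing distances gets progressively cheaper.
forest = [stage(d) for d in (2.0, 30.0, 400.0)]
```

The payoff for a ray tracer is memory as much as speed: distant trees contribute few pixels but would otherwise hold full-detail geometry resident for secondary rays.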
“We were going 150 percent the whole time,” says a technical lead who fed
shots from the animators into the rendering system. “There was no
ramp-down. But luckily, everyone was great and it was fun. I already
miss working with Carlos.”
And what will Saldanha do after wrapping a third film in little more than five years? “I think when everything is done,
I’ll be standing alone in a room saying, ‘Hello — anyone? What shall I
do?’ It’s a miracle. It’s one of those amazing things. Those guys …
it’s extremely emotional for me.”