10 Takeaways from the Production Notes
Disney’s new remake of The Lion King, opening today in North America, features the kind of creature animations that have led at least one film critic to conclude that at least some of the film’s lions and hyenas are real animals rather than digital characters. (Without naming names, that critic is wrong. Everything seen in the film outside a single photographed shot is computer generated.) That is to say, they’re photoreal, and they’re captured in a way that could lead you to think there was an actual human being behind an actual camera on an actual location.
What makes them so convincing? Part of it is a scrupulous attention to the details of the real animals that inspired the film’s characterizations and the real locations that inspired its settings. And part of it is a dogged attempt by director Jon Favreau to put his crew at ease by letting them use traditional filmmaking techniques to move cameras around a virtual world. The film’s production notes, published by Walt Disney Pictures, help explain how this was achieved. Here, we excerpt the most interesting takeaways — including quotes from the filmmakers themselves — from that document.
The film’s human actors performed in a black box. Because they wouldn’t be involved in the film’s principal photography, The Lion King’s human actors were often asked to perform with each other on the stage of a plain “black box” theater rather than simply reading script pages from a stationary position at a mic. VFX supervisor Rob Legato said the stage environment helped the actors deliver real, physical performances as references for the animators.
“We photographed with multiple cameras so the animators could see the intent of the actor even though it’s not a direct translation because they’re not an animal,” said Legato. “But when they pause and they look and you see them thinking, you know that that’s what drives the performance. You make the translation — what does a lion do to do the same thing? It’s much more informed than just voices only. And voices disembodied — reading off of a piece of paper is way different than interacting in a scene and bouncing off your idea. If you make a mistake and I cover it, maybe that’s more interesting.”
The crew went on safari. Thirteen members of the filmmaking team — including Legato, MPC Film VFX supervisor Adam Valdez, animation supervisor Andrew Jones and DP Caleb Deschanel — took a two-week safari in Kenya in early 2017 to take notes on the film’s locations and study the native animals. They brought along 2,200 pounds of camera equipment and shot more than 12 TB of photographs, made reference videos of animal movement, and even scanned actual rocks that would become part of the film’s landscape. “It’s amazing how extreme the temperatures can be and how dry it is at times,” said Jones. “The animals learn to cope with all of it and survive.”
Digitally built environments were inspired by real locations. For example, Pride Rock was inspired by rock formations in the Chyulu Hills mountain range of southeast Kenya. The film’s wildebeest stampede takes place in a narrow canyon similar to Namibia’s Sesriem Canyon. And reference for the elephant graveyard came from tufas (porous limestone formations made by deposits from springs or streams) in Mono Lake, California, and geothermal areas in Yellowstone National Park, Wyoming.
“Much of the new technology is really procedural, where you use a tool in order to populate the savanna with the assets or to create textures that will repeat, and then you can apply them everywhere,” explained MPC set supervisor Audrey Ferrara. “You still need the human eye to keep it in order, because it can become really messy really quickly. Then, sometimes, it just appears in front of your eyes and you think, ‘Is this real or animation? I can’t really tell the difference right now.’”
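MPC’s actual tools are proprietary, but the idea Ferrara describes, scattering library assets across a region procedurally and then letting a human eye correct the result, can be sketched in a few lines of Python. Every name and constant below is a hypothetical stand-in, not anything from the production:

```python
import random
from dataclasses import dataclass

@dataclass
class Instance:
    asset: str        # which library asset to place
    x: float          # world-space position
    y: float
    scale: float
    rotation: float   # degrees around the up axis

def scatter(region, assets, density, seed=42):
    """Procedurally populate a rectangular region with asset instances.

    A fixed seed makes the scatter repeatable, so an artist can re-run
    the tool and hand-edit only the placements that look wrong.
    """
    rng = random.Random(seed)
    (x0, y0), (x1, y1) = region
    count = int((x1 - x0) * (y1 - y0) * density)
    return [
        Instance(
            asset=rng.choice(assets),
            x=rng.uniform(x0, x1),
            y=rng.uniform(y0, y1),
            scale=rng.uniform(0.8, 1.3),       # size variation hides repetition
            rotation=rng.uniform(0.0, 360.0),  # random facing, same reason
        )
        for _ in range(count)
    ]

# Populate a 100 m x 100 m patch of savanna at ~0.5 instances per square meter.
patch = scatter(((0, 0), (100, 100)),
                ["grass_clump", "acacia_sapling", "rock_small"],
                density=0.5)
print(len(patch), "instances placed")
```

The seeded generator is the point Ferrara is making: the tool does the bulk placement deterministically, and the artist’s job shifts to spotting and fixing the places where the procedure “becomes really messy.”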
Favreau was inspired by nature documentaries. “We set out to create something using these mythic archetypes that also feels naturalistic and beautiful and real,” he said. “We looked at a lot of nature documentaries to see how beautiful it could all look and how lyrical it is, in nature when photographed and painstakingly edited with good music to create stories out of documentary footage.”
Production took place in an unmarked facility in Playa Vista, CA. The new facility included the black-box theater, which later became a VR volume for production, and two screening rooms, the Simba and Nala theaters, that facilitated real-time collaboration between the filmmakers in L.A. and the MPC Film VFX crew in London. “We had different VR systems and a dozen different VR stations around the bullpen,” Favreau said. “We wanted to make it feel more like a tech company than a movie studio, so we created a campus environment. We had food trucks pull up for the crew out front, or I’d be cooking upstairs.”
The film’s virtual production extended techniques developed on The Jungle Book. The film may be animated, but it employed a live-action film crew that could work inside a VR volume, using traditional camera equipment to set up and execute shots in the animated world exactly the same way they would be achieved in a real-world environment. Where Avatar broke ground by giving the filmmakers a window on the VFX world — they could see the CG environment in real time during production as if they were looking at it through the camera’s viewfinder — The Lion King inverts that idea by putting the filmmakers and their gear inside a game engine that renders the world of the film.
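The notes don’t describe the software side, but the core loop is easy to picture: sample the tracked pose of the physical camera rig, move the engine’s virtual camera to match, render, and repeat at interactive rates. The sketch below fakes both the tracker and the renderer with stand-ins; nothing here is the production’s actual system:

```python
import math
import time
from dataclasses import dataclass

@dataclass
class Pose:
    x: float     # position in the volume, meters
    y: float
    z: float
    pan: float   # orientation, degrees
    tilt: float

def read_tracked_pose(t: float) -> Pose:
    """Stand-in for the rig's optical tracker: here, a slow circular move."""
    return Pose(x=3 * math.cos(0.2 * t), y=1.6, z=3 * math.sin(0.2 * t),
                pan=math.degrees(0.2 * t), tilt=-5.0)

def render_frame(pose: Pose) -> None:
    """Stand-in for the engine's render call; a real engine would draw the
    CG savanna from this camera and show it in the operator's eyepiece."""
    print(f"cam @ ({pose.x:+.2f}, {pose.y:.2f}, {pose.z:+.2f})  pan {pose.pan:6.1f}")

# The loop that makes the volume feel like a set: sample the physical rig,
# move the virtual camera to match, render, repeat at interactive rates.
start = time.monotonic()
for _ in range(5):                  # a handful of frames for the demo
    pose = read_tracked_pose(time.monotonic() - start)
    render_frame(pose)
    time.sleep(1 / 24)              # film frame rate; real engines run faster
```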
Oscar-winning VFX supervisor and Magnopus co-founder Ben Grossman explains: “Physical devices are custom built, and traditional cinema gear was modified to allow filmmakers to ‘touch’ their equipment — cameras, cranes, dollies — while in VR to let them use the skills they’ve built up for decades on live-action sets. They don’t have to point at a computer monitor over an operator’s shoulder anymore — the most sophisticated next-gen technology is approachable to any filmmaker who’s ever been on a traditional set.”
Making the shots feel real was all about emulation. The production created physical representations of traditional gear, even when it seemed like overkill, because Favreau believed it would help the film feel like it was photographed, rather than made with a computer. There was an emulated Steadicam rig and an emulated handheld camera rig. There were cranes and dollies. There was even a virtual helicopter, operated by Favreau himself.
“Instead of designing a camera move as you would in previs on a computer, we lay dolly track down in the virtual environment,” Favreau explained. “Even though the sensor is the size of a hockey puck, we built it onto a real dolly and a real dolly track. And we have a real dolly grip pushing it that is then interacting with Caleb [Deschanel], our cinematographer, who is working real wheels that encode that data and move the camera in virtual space. There are a lot of little idiosyncrasies that occur that you would never have the wherewithal to include in a digital shot.” The animation team used the data from virtual production to export video files to editorial and data files to VFX.
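Read literally, Favreau’s dolly setup implies a simple data path: wheel encoders on the physical rig report tick counts, and software converts those counts into the virtual camera’s position and angles every frame. A minimal sketch of that conversion follows; the tick resolutions and the track layout are illustrative guesses, not production values:

```python
# Minimal sketch: convert encoder counts from the physical rig into a
# virtual camera transform. All constants are illustrative guesses.

TICKS_PER_METER = 4000    # dolly wheel encoder resolution along the track
TICKS_PER_DEGREE = 100    # resolution of the pan/tilt head wheels

def rig_to_camera(dolly_ticks: int, pan_ticks: int, tilt_ticks: int,
                  track_origin=(0.0, 0.0, 0.0), track_dir=(1.0, 0.0, 0.0)):
    """Map raw encoder counts to a camera position along the track plus
    pan/tilt angles, ready to hand to the engine each frame."""
    s = dolly_ticks / TICKS_PER_METER               # meters traveled on track
    pos = tuple(o + s * d for o, d in zip(track_origin, track_dir))
    pan = pan_ticks / TICKS_PER_DEGREE
    tilt = tilt_ticks / TICKS_PER_DEGREE
    return {"position": pos, "pan": pan, "tilt": tilt}

# One sampled frame: the grip has pushed 1.5 m, the operator panned ~12 degrees.
print(rig_to_camera(dolly_ticks=6000, pan_ticks=1200, tilt_ticks=-300))
```

The “idiosyncrasies” Favreau values live upstream of this function: a human grip never pushes at a perfectly constant speed, and that noise survives the encoding into the virtual move.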
Character development was a nine-month process. Character designs were based on creative guidance from Favreau and his team. “Translating an animated character into a photorealistic creature required a full rethink,” said production designer James Chinlund. “Digging deep into research and our experiences scouting [in Africa] was always the kickoff. Jon [Favreau] and the team would land on a group of key images that captured the feeling we were pursuing, and that would launch our character illustrators. They would produce both paintings and 3D sculpts of our characters, which went through rounds of reviews with Jon and the team. Then, when we got close to final, we would output a 3D print of the character for last looks using our in-house 3D printer.” The final designs were sent to artists at MPC, who built them using new proprietary tools for improved simulation of muscles, skin and fur.
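MPC hasn’t published those tools, but the textbook building block behind muscle and skin simulation is a damped spring: flesh chases the skeleton each frame, overshooting and settling instead of moving rigidly with it. A one-dimensional illustration, purely hypothetical:

```python
def simulate_jiggle(bone_positions, stiffness=60.0, damping=8.0, dt=1 / 24):
    """Damped-spring follow-through: flesh lags behind the bone, then
    settles. (A textbook illustration, not MPC's proprietary solver.)"""
    flesh, velocity = bone_positions[0], 0.0
    out = []
    for bone in bone_positions:
        accel = stiffness * (bone - flesh) - damping * velocity
        velocity += accel * dt     # semi-implicit Euler integration
        flesh += velocity * dt
        out.append(flesh)
    return out

# A bone that snaps from 0 to 1: the flesh overshoots, then settles.
bone = [0.0] * 4 + [1.0] * 20
for frame, x in enumerate(simulate_jiggle(bone)):
    print(f"frame {frame:2d}  flesh {x:.3f}")
```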
Will you believe an animal can talk? And sing? The animators hope so. Animation supervisor Andrew Jones recalls the effort that went into making it all plausible: “We tried to tilt their heads down so we are not staring directly into the mouth. At the same time, we did our best to make sure that we were not adding attributes in terms of how each animal can physically move their mouths. So, every kind of muscle control we have around the mouth makes them move in the ways they can really move their mouths. We found lip-sync through that approach — moving mouths into shapes that, for instance, a cat can really do, and trying to have the right kinds of sounds coming out to match those shapes. We had the belly muscles and diaphragm tighten so that you feel like the animal is forcing air out his mouth as he is talking, timed with particular syllables. With female lions, whose necks we can actually see because they do not have manes, we added particular esophagus and neck movements to help sell the fact that they are talking, with tongue and larynx moving.”
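In pipeline terms, Jones is describing a constrained lip-sync mapping: rather than giving the animals a full human viseme set, every phoneme is collapsed onto the few mouth shapes the real animal’s anatomy allows. Schematically (the shape names and the mapping below are invented for illustration):

```python
# Schematic lip-sync constraint: phonemes are collapsed onto the few mouth
# shapes a big cat can anatomically make, rather than full human visemes.
# The shape names and mapping below are invented for illustration.

FELINE_SHAPES = {"closed", "slight_open", "open", "wide_open", "lip_curl"}

PHONEME_TO_SHAPE = {
    "M": "closed", "B": "closed", "P": "closed",   # lips can press together
    "AA": "wide_open", "AE": "open", "AH": "open",
    "EE": "slight_open", "IH": "slight_open",
    "F": "lip_curl", "V": "lip_curl",              # no true labiodental; approximate
}

def lip_sync(phonemes):
    """Return one anatomically plausible mouth shape per phoneme, falling
    back to 'slight_open' for sounds a cat mouth can't really form."""
    shapes = []
    for p in phonemes:
        shape = PHONEME_TO_SHAPE.get(p, "slight_open")
        assert shape in FELINE_SHAPES   # never emit an impossible shape
        shapes.append(shape)
    return shapes

# "Hakuna matata", collapsed to cat-legal mouth shapes.
print(lip_sync(["HH", "AH", "K", "UW", "N", "AH", "M", "AH", "T", "AA", "T", "AH"]))
```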
Anyway, it’s gonna make a ton of money. This isn’t in the production notes but, at this writing, The Lion King has earned a roaring $23 million in the U.S. based on Thursday-night screenings alone, not to mention the $100 million it has already racked up internationally. Whether or not this style of virtual production is the future of animated movies, this particular animated movie is going to make bank.