Building a Library of Spaceship Parts, Avoiding Storyboarded Looks, and Deciphering 3D Data Encoded in Old Movies
Chief Creative Officer and Senior Visual Effects Supervisor John Knoll joined Industrial Light & Magic in 1986. His first film credits were as a motion-control camera operator for Captain EO and later for Willow. He moved into digital effects with The Abyss and became an associate effects supervisor for The Hunt for Red October. He was a visual effects supervisor on the 1994 Star Trek Generations and then on Mission: Impossible, Star Wars Episodes I, II, and III, Mission to Mars, three Pirates of the Caribbean films, Avatar, and many other films. In addition to his visual effects work at ILM on feature films, Knoll is known for having created Photoshop with his brother, Thomas Knoll.
John Knoll’s nomination for Rogue One: A Star Wars Story is his sixth visual effects Oscar nomination. He won an Oscar for best achievement in visual effects for Pirates of the Caribbean: Dead Man’s Chest. He has also received six BAFTA nominations, 11 VES nominations including a recent nomination for Rogue One, and two VES awards.
Rogue One was his idea. He is credited as writer (story by), executive producer, and visual effects supervisor on the film. Rogue One received seven VES nominations, rose to number one at the US box office in 2016, and has earned more than one billion dollars worldwide. Gareth Edwards directed the Lucasfilm production, which was released by Walt Disney Pictures.
Also receiving Oscar and BAFTA nominations for Rogue One are Mohen Leo, who led ILM’s team in London; Hal Hickel, who was ILM’s overall animation supervisor; and special effects supervisor Neil Corbould.
Studio Daily: Why do you think your colleagues voted for Rogue One: A Star Wars Story?
John Knoll: Well, I value consistency, so we try to make sure everything we do has a high level of polish. I’ve had the experience of being in a theater and being thrown out of a film by a shot or two that weren’t up to the same standards as the rest of the work. It sticks in your mind. We try to make sure that nothing stands out in that way.
But I think some of the positive reaction comes from the emotional content. A big part of the appeal for a lot of us is that [Rogue One] directly connects with A New Hope, which, for a lot of us, was important to our careers. That first Star Wars film was largely responsible for my deciding to go into entertainment in the first place, and I think others feel the same way. So there’s an emotional sense of proximity to something that’s cherished. "Oh, look at that X-Wing. The Death Star looks good. Look at the detail on the Star Destroyer."
I’ve read that the modeling team actually made the digital ships by starting with old model kits.
Yeah, that’s a fun anecdote. An important part of the aesthetic of the ships in A New Hope was a byproduct of the process used to build them. They built miniatures from plastic model kits. We had the opportunity to create a Star Wars-themed parts library, and I knew that if we didn’t guide the process, it would happen in an uncontrolled fashion. I had visions of the first model getting all the detail, and then someone cannibalizing pieces from it for the second in an ad hoc way.
So we planned it as a process. We decided to pick 300 of the most important, recognizable bits from model kits that were used back in the day. We bought them on eBay. We had our model shop veterans, Paul Huston and John Goodson, pick the parts and build a [digital] library of the 300 pieces that would be most useful. They built highly optimized versions, lightweight and well-made. Since we’d see them a lot, we kept them as light as we could.
There was another benefit I hoped we would get. If modelers have a digital analog of what was built back in the day and can go to a virtual model bucket, it enables a digital workflow that mirrors [practical model-making]. It gets things to fit together nicely. I felt it was a pretty successful part of what people responded to in the aesthetic of the film.
Will these parts be useful for future Star Wars films?
Part of the justification for starting this library is that we have a slate of other Star Wars films that could use this. We didn’t just build the library and that was it. Episode VIII has added some. Han Solo is adding more. We have a good, solid set of pieces going forward.
Did you use any new techniques for the digital materials?
We have been using plausible, energy-conserving materials for a number of years, based on the BRDF paper Disney published a few years back. But this is my first RIS show, the path-tracer version of RenderMan with plausible shading. It generates very nice-looking results. They feel real because the shading model is measured from real-world materials. You see that result most clearly on the models.
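For readers curious what “energy conserving” means in practice, here is a minimal sketch, in Python rather than a production shading language, of a diffuse-plus-specular BRDF in which the Fresnel term takes energy away from the diffuse lobe, so the surface never reflects more light than arrives. The function names and the simplified visibility term are illustrative assumptions, not ILM’s or RenderMan’s code.

```python
# Minimal sketch of an energy-conserving diffuse + specular BRDF,
# in the spirit of the Disney/physically based shading model the
# interview references. All names and constants are illustrative.
import numpy as np

def ggx_ndf(n_dot_h, roughness):
    """GGX/Trowbridge-Reitz normal distribution function."""
    a2 = roughness ** 4  # Disney remapping: alpha = roughness^2
    d = n_dot_h ** 2 * (a2 - 1.0) + 1.0
    return a2 / (np.pi * d * d)

def fresnel_schlick(v_dot_h, f0):
    """Schlick approximation to the Fresnel reflectance."""
    return f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

def brdf(n_dot_l, n_dot_v, n_dot_h, v_dot_h, albedo, roughness, f0=0.04):
    """The specular lobe 'steals' energy from the diffuse lobe via the
    Fresnel term, so total reflectance never exceeds incoming light."""
    f = fresnel_schlick(v_dot_h, f0)
    d = ggx_ndf(n_dot_h, roughness)
    # Crude visibility approximation; real shaders use a Smith G term.
    g = 0.25 / (n_dot_l * n_dot_v + 1e-6)
    specular = d * f * g
    diffuse = (1.0 - f) * albedo / np.pi  # normalized Lambert, minus Fresnel
    return diffuse + specular

# Example: a mid-roughness dielectric, light and view at ~45 degrees.
print(brdf(n_dot_l=0.7, n_dot_v=0.7, n_dot_h=0.95, v_dot_h=0.8,
           albedo=0.85, roughness=0.5))
```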
Did that affect how you lit the models?
The Star Destroyer has light, almost white colors, and when you have an object painted white or nearly white, it shows indirect bounce dramatically. So we were shameless about how we lit the shots. I think that indirect bounce is beautiful, so I went three-quarter backlit on the shots. The two Star Destroyers have shadow on one side, with the sunlit side of one bouncing light back into the shadowed face of the other. It looks great. Those indirect lighting interactions are somewhat present [in A New Hope], but they generally shot the ships one at a time, so they didn’t get the ship-to-ship indirect light bouncing like we did. But you don’t remember that. We got the benefit of new technology in a way that matches how you remember it.
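To make the point about white hulls concrete, here is a toy calculation (my own illustration, not production code): single-bounce diffuse interreflection scales with the product of the two surfaces’ albedos, so two near-white hulls exchange roughly an order of magnitude more bounce light than two dark ones would. The form-factor value is an arbitrary assumption.

```python
# Toy single-bounce interreflection between two diffuse patches.
import numpy as np

def one_bounce(direct_sun, albedo_a, albedo_b, form_factor):
    """Light that hits surface A, bounces diffusely, and lands on B.
    form_factor is the geometric coupling between the patches (0..1)."""
    return direct_sun * albedo_a * form_factor * albedo_b

sun = 1.0
print(one_bounce(sun, 0.9, 0.9, 0.2))  # near-white hulls: ~0.16 of sun
print(one_bounce(sun, 0.3, 0.3, 0.2))  # dark hulls:       ~0.018 of sun
```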
The space battle in the third act is complex. How did you choreograph it?
We had [L.A. previs studio] The Third Floor do a first pass on a lot of the space battle, then as the edit developed, we clarified this point and that. Everyone got pretty busy and parts had to be re-thought and re-envisioned, and editorial and postvis were super busy with the live-action portion. Much of the third act is virtual, so they left that until late in the game. Eventually they called Hal [Hickel] and me and said they didn’t know if they could get to the level of detail they wanted. They had the story beats of what needed to happen — cut away, space battle, this happens — but they didn’t have a lot of specifics.
One beat was, “The Rebel Fleet takes out two Star Destroyers in an interesting way.” They left it to us to come up with the interesting way. That was a really fun opportunity. I didn’t want to do another explosion. I wanted something that felt different. So I liked the idea of making it about mechanical damage, with a disabled Star Destroyer pushed like a tug into another. A Star Destroyer represents so much mass that if you get that thing moving, a collision would destroy both ships. We had a lot of fun designing that shot and turning it into such a memorable part of the film.
Did you do anything new on set to help film the space battles?
Greig Fraser [the DP] talked about something that is always a challenge in conventional films: cockpit footage and car driving scenes with tricky, complex lighting environments. We’ve all seen the gags they do on soundstages with conventional lighting instruments, like having the grips put propellers in front of lights. We wanted to do something better. In the space battle, the ships are next to things that cast shadows; there is sunlight bouncing off the ship into the cockpit, explosions going by, lasers, and other ships bouncing light.
So we thought about using LEDs to solve those problems. I’m always keen on every project to try some experiments and learn something new, and we had talked for a while about using LED screens as image-based lighting. We had used them in Mission: Impossible 4 [Ghost Protocol] and for the oracle time machine in Tomorrowland. And they were famously used on Gravity for lighting actors.
So using large LED panels was something Greig and I were excited about. We constructed a horseshoe of LED screens. Mohen Leo and I worked with The Third Floor to create the CG animation that would be on the LED panels — the environment around the sets and actors. The CG had previs-level fidelity, but we did it with high dynamic range and photographically true contrast ratios. Because the LED panels would be lighting the actors, we worked to make sure the brightness was done right.
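As a rough sketch of what “photographically true contrast ratios” entails, assume a linear-light HDR frame and a panel with a known peak brightness: the content must be scaled in absolute terms rather than normalized per shot, or the lighting falling on the actors loses its contrast. Every name and number below is an illustrative assumption, not the production pipeline.

```python
# Hypothetical mapping from linear HDR radiance to LED panel drive values.
import numpy as np

def hdr_to_panel(hdr_linear, panel_peak_nits=1500.0, scene_white_nits=100.0,
                 panel_gamma=2.2):
    """Map linear scene-referred values (1.0 = scene white) to [0, 1].
    The absolute scale preserves the scene's contrast ratios up to the
    panel's peak output, then clips; gamma encodes for the panel."""
    nits = hdr_linear * scene_white_nits           # absolute brightness
    drive = np.clip(nits / panel_peak_nits, 0, 1)  # normalize to panel peak
    return drive ** (1.0 / panel_gamma)            # encode for display

frame = np.array([0.05, 1.0, 8.0, 30.0])  # shadows .. a bright explosion
print(hdr_to_panel(frame))
```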
Were you pleased with how the LED panels worked out?
My primary motivation was the lighting on the characters. That drove the decision. But it had additional benefits that were obvious in hindsight. I was pleased to see that the actors liked it. Instead of staring into a blue screen and having someone say that a Star Destroyer was in the corner over there, they had something to look at. They didn’t have to imagine it.
Because the level of fidelity wasn’t something we’d put in a film, we replaced those images with better versions in post, similar in tone and with about the same imagery. We didn’t have a blue screen, so it wasn’t a simple pull key. It all had to be roto’d. But the improved lighting was a bigger win than the negative of doing roto.
Even though they shot into the screen and we always replaced the images, it was good for Greig, as well. When you can see an approximate version, you light it differently than a blue-screen element. You get higher quality lighting. I’d love to keep doing this.
How did director Gareth Edwards’ shooting style affect the post-production?
When I watched the way he was working during the pre-production test shoots, I saw what was in store. When we started shooting, I saw his style. It was something really wonderful — the idea of being in the Star Wars universe, but with a camera that had this vérité style. The less the objects — a spaceship in the background or some building or robot — were deliberately composed in a way that felt storyboarded, the more it felt like you just happened to be there. You were in that world. I felt that could be a fantastic experience in the Star Wars universe. But it meant it would be harder to know what was coming. Often, we didn’t know what was in a shot until it was shot. I’d look at the monitor and think, oh, yeah, I wish we had a blue screen up there. But I did not want to interfere, because it would have changed the style. It was harder, but it would look better. And we had a massive amount of roto.
How did you involve the director in the cinematography for the all-CG shots?
I started thinking about scenes that wouldn’t have a live-action component, like the space battle. How do we inject his style and energy into the space battle and get that unplanned feel? I decided we should do this as a big virtual shoot: Animate the scenes in advance, put Gareth [Edwards] on a motion-capture stage, have him look through a viewfinder, and find what he feels is a good composition. I ran that idea past him and he sounded excited, so we did a test shoot. He loved it. He could dive into a scene and look around for interesting angles.
There has been much talk about the digital humans. Why did you decide ILM could handle creating digital humans that would appear in close-ups — the digital Grand Moff Tarkin (Peter Cushing) and Princess Leia (Carrie Fisher)?
It’s a really hard problem. A nontrivial challenge. But when I saw the work on Warcraft going through the building — the quality of the facial motion capture, the hair rendering and grooming, and the eyes — I felt we could do a close-up human. We built on techniques developed for previous projects like Warcraft.
Did you come up with any new technology to help create the digital actors?
Flux was new. It tries to solve for shape and expression from archival footage. Knowing that we didn’t have Peter and couldn’t scan Carrie as she looked back then, we had to use other sources. I’ve been thinking for years about the idea that there’s 3D data encoded in old footage.
3D data encoded in old footage?
We often do photogrammetry, but the trick with photogrammetry is that the camera moves while the object is usually stationary and the lighting doesn’t change. So if you have a lot of photos of an object from different points of view, you can identify the same landmarks and solve for the shape. But as a human performs, the shape changes on every frame and the head turns through the lighting. The thought was that if you have an actor turning his head and presenting different parts to the camera, you could potentially do a variation of dense stereo to recover the shape. And you can estimate the lights and shape from the shading: you can look at the shading and figure out what shape was required to make that shading. So that’s how Flux helped us debug some of our likenesses. We could give it a base model, match-move an old plate, run Flux on the plate with the camera solve, and Flux would make various small-scale tweaks and adjustments to fit the base model to an image. We could look at that to see where we were off. It helped us nail a likeness.
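Here is a highly simplified sketch of the shape-from-shading half of that idea, with invented function names and toy data (ILM’s Flux is far more sophisticated): under a Lambertian model, observed intensity is albedo times max(0, n·l), so once you estimate the light, shading residuals point to where the base model’s shape is off.

```python
# Toy shape-from-shading diagnostic: estimate the light from a base
# model's normals, then read residuals as evidence of shape error.
import numpy as np

def estimate_light(normals, intensities, albedo=1.0):
    """Least-squares light direction from known normals and shading.
    normals: (N, 3) unit normals; intensities: (N,) observed values."""
    light, *_ = np.linalg.lstsq(normals * albedo, intensities, rcond=None)
    return light

def shading_residual(normals, intensities, light, albedo=1.0):
    """Where predicted shading disagrees with the plate, the base
    model's shape needs a local tweak (the kind Flux solved for)."""
    predicted = albedo * np.clip(normals @ light, 0.0, None)
    return intensities - predicted

# Toy data: three known surface orientations lit from above-left.
n = np.array([[0.0, 0.0, 1.0], [0.7, 0.0, 0.714], [-0.7, 0.0, 0.714]])
true_light = np.array([-0.3, 0.1, 0.95])
observed = np.clip(n @ true_light, 0.0, None)
l_hat = estimate_light(n, observed)
print(shading_residual(n, observed, l_hat))  # ~zero: model fits the plate
```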
Which visual effects studios worked on the film?
All four ILM studios, plus 11 other companies including Hybride, Ghost, Scanline, Atomic Fiction, Whiskytree, [and] Virtuous. Even Stereo D. We had massive amounts of roto, so I thought, “We’ll have a lot of roto. How do we mitigate the cost? What if we do a partnership with Stereo D? They’ll end up doing the stereo conversion and will have to roto a number of things for that anyway.” So we tried to kill two birds with one stone by having them do some of that roto early. When it was time to do the stereo conversion, replacing backgrounds behind characters, they had already completed some of it.
How did being an executive producer and writer change your role as visual effects supervisor — or did it?
The boundaries of responsibility are always a little blurry. I’ve been on projects where I’ve made comments and suggestions about story, and been involved in casting decisions, so it wasn’t foreign territory. But it was nice to officially be part of the production team and have a say in things across the whole picture. My voice perhaps carried a little more weight in production meetings. If I had a strong viewpoint that we should do something in a particular way, my comments might have been given more consideration than they would have been otherwise.
[Photo: Felicity Jones as Jyn Erso]
Why did you give this story a female protagonist?
I have four kids — three girls and a boy. The girls were all little when we were working on the [Star Wars] prequels. I felt like there were plenty of really great male characters to identify with. There were a few female characters, but there could be more, and why not? I wanted Jyn to be the character that I wished had been around for my girls to identify with.
Did you learn anything new from this project?
If I ever worked on a project where I didn’t learn, I’d think that was a tragic waste of time. I learned a lot about what it takes to do realistic skin shading and motion. Things like blood flow from heartbeats, and compression and tension on the skin. It’s a subtle thing, but it helps make a character look alive. The importance of micro displacement and fine wrinkles and skin texture, and how that changes as the skin flexes. We had a mechanism for dialing in displacement maps as the skin pulls tighter or compresses, which was important when we framed in tight. This was the most challenging bit of digital human work I’ve been involved with.
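A minimal sketch of that dialing-in idea, with invented names and toy numbers rather than ILM’s actual rig: blend between compressed and stretched fine-wrinkle displacement maps according to local skin strain, so micro detail deepens under compression and flattens under tension.

```python
# Hypothetical strain-driven blend of fine-wrinkle displacement maps.
import numpy as np

def blend_displacement(neutral, compressed, stretched, strain):
    """strain per point: <0 compression, 0 at rest, >0 stretch (clamped).
    Compression deepens wrinkles; stretching pulls them flat."""
    s = np.clip(strain, -1.0, 1.0)
    compress_w = np.clip(-s, 0.0, 1.0)
    stretch_w = np.clip(s, 0.0, 1.0)
    neutral_w = 1.0 - compress_w - stretch_w
    return (neutral_w * neutral + compress_w * compressed
            + stretch_w * stretched)

# Toy example: three skin samples at rest, compressing, stretching.
neutral = np.array([0.10, 0.10, 0.10])     # displacement amplitudes
compressed = np.array([0.30, 0.30, 0.30])  # deeper wrinkles
stretched = np.array([0.02, 0.02, 0.02])   # wrinkles pulled flat
strain = np.array([0.0, -0.8, 0.9])
print(blend_displacement(neutral, compressed, stretched, strain))
```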
What was it like to see your story on screen in a theater?
Very exciting. I had a bunch of goals: I wanted this to be a movie I liked, and in the end I was happy with it. I hoped audiences would like it, and it is popular with fans. I hoped it would get good reviews, and it did. I hoped it would make money, and it did. So, it succeeded.
But, also, I worked with George Lucas for 25 years and directly with him on a number of projects. I really hoped he would like this movie. It was so important to me that we did something worthwhile with this beautiful playground he built for us.
And did he?
I got a call right after he got out of the screening. And, yeah, he really liked it.