Six Ways Blackmagic Fusion Built a Better Workflow for Temp Effects, Performance Tweaks and More
A brooding, female-driven fight to the death against hypocrisy and political corruption, The Hunger Games: Mockingjay – Part 2 closes the dystopian franchise based on Suzanne Collins' popular young adult books. Editor Alan Edward Bell, ACE, has edited the last three films in the series for director Francis Lawrence, collaborating with editor Mark Yoshikawa on both Mockingjay installments, which were shot simultaneously in Atlanta, in Germany and on the streets of Paris. Cutting scenes for the last film while still in production on Part 1 "made it feel like we were working on one really long movie," says Bell.
An accomplished compositor and long-time user of Blackmagic Fusion, Bell edits in Avid Media Composer and uses the Edit Connect plug-in to pull scenes into Fusion to perform all sorts of temporary effects and editorial magic. We asked him to tell us what Fusion lets him do during the edit by detailing some of the Hunger Games scenes it touched along the way.
1. Build temp effects using actual VFX elements
"There were numerous elements that were shot, both for Catching Fire and the Mockingjays," says Bell, "that were effects elements I had access to at any time. By the end of the final movie, we had all of the elements from all four Hunger Games films on an Avid Isis system we could pull anything from, even if it just meant we were able to do research. But that also meant I was able to access all my past temp effects to refer to them or utilize them across the films. With the Avid's Edit Connect plug-in, I could pull something from the Avid ISIS and throw something in my source monitor, cut it into the timeline and immediately start using it. If I needed a fireball, I had a library of fireballs that were shot and I could pull from them at any time. Because I was using actual effects elements to build these temp effects, when those temp effects went to final, they knew exactly which element I used. If we liked that fireball in the rough cuts, the compositors could use that actual fireball in the final render."
2. Go nodal and fast-track temp effects to VFX
"In Catching Fire, when a number of tributes had died, you'd hear the cannon sounds and see an image of them with a lower third of who they were projected onto the sky," he explains. "We needed to do that again in Mockingjay 2. So rather than starting from scratch I was able to go back and open up the original Fusion project when I was temping them in Catching Fire, and then just swap out the images on the monitor used to suggest that Katniss and her crew have perished in the apartment building collapse. Not only was I able to pull up the project and utilize it but I was also able to rapidly figure out exactly what was going on in the scene, shortening and simplifying my workflow. If I had to do that with a layering tool like After Effects or even in the Avid, it would have been very difficult. It's just a lot harder when you're looking at layers to figure out how things are effecting each other. When you have this nodal schematic in front of you and you can see very quickly that this is piping into that—or this is isolated here and is not affecting that node, and everything is going downstream and converging at this merge—you know exactly what your effect is popping out of. If you want to add something on top of an effect with a layer-based system, you are often affecting all the layers beneath. It's harder to isolate that single layer just to make your change. With nodal compositing, it's much easier to make that change and understand what's happening while you're doing it."
3. Paint it out to keep your PG-13 rating
"Fusion was really useful when we got to the MPAA ratings review," says Bell. "We had to start thinking about, how are we going to keep this in the PG-13 zone and not veer into an R rating? There's an awful lot of give and take when dealing with the MPAA. They may broadly say, 'That's too grim' or 'There are too many dead bodies and many of them are children,' which we knew we'd have to deal with, given the subject of the books and film. But we took great care to make sure the only dead bodies you see after an explosion were adults. It was a conscious choice by the filmmakers. I ended up using Fusion a lot to paint people out and make them less gruesome so we could turn it around quickly and create clean plates. I actually started using the paint node a lot more than I normally would. Traditionally, I would paint things outside of Fusion, in Photoshop, and bring the clean plate back in. But I didn't have the time on the last Mockingjay, so I would do it all in the comp. It worked really well. I was surprised I'd waited this long to actually use the paint node."
4. Animate paint effects you could never do in Photoshop
Bell says he has experimented with Fusion both on the job and after hours to better learn what its feature set could help him with in the edit suite. "Because Fusion can be time-based as well, you can animate your paint strokes in ways you can't in Photoshop," he says. "The animation aspect of it is pretty powerful. The fact that you can paint something in Fusion and then, without doing a render, just right-click to save the image at the comp's highest resolution is pretty mind-blowing. There's your clean plate, which you can re-import into Media Composer as a piece of footage. It's so great. You can throw a paint node on your source footage, stop anywhere you want, paint your clean frame, save it and pop it back in, then just turn the paint node off and write a little note to yourself about what you did." On Mockingjay – Parts 1 and 2, he did some "very preliminary" temp 3D effects as well. "I used some 3D camera projection to tweak some of the performances. I discovered that the particle system is super powerful and very easy to work with. And because fire creates heat distortion, I started playing around with displacements, warping the image the way heat distortion would. I did a little of that, not a tremendous amount, in the movie."
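Inside Fusion, Bell does that with displacement and warp tools. To give a rough sense of what "warping the image the way heat distortion would" means, here is a hypothetical Python/OpenCV sketch that pushes pixels around with a smooth, rising noise field; all the function names and parameter values are invented for illustration.

```python
import numpy as np
import cv2

def heat_shimmer(frame, t, strength=3.0, scale=24, seed=0):
    """Warp a frame with smooth noise to fake heat distortion.

    frame:    HxWx3 uint8 image
    t:        time in frames; the noise scrolls upward over time so ripples rise
    strength: maximum pixel displacement
    scale:    blur sigma controlling how blobby the distortion is
    """
    h, w = frame.shape[:2]
    rng = np.random.default_rng(seed)

    # Smooth random field, scrolled vertically over time.
    noise = rng.standard_normal((h, w)).astype(np.float32)
    noise = np.roll(noise, -2 * t, axis=0)
    noise = cv2.GaussianBlur(noise, (0, 0), scale)
    noise /= np.abs(noise).max() + 1e-6

    # Absolute sampling coordinates, offset by the noise field.
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    map_x = xs + strength * noise
    map_y = ys + strength * np.roll(noise, scale, axis=1)
    return cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)
```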
5. Perfect performances that don't even need it
As Bell discovered when cutting Catching Fire, the cast performances, especially those delivered by Jennifer Lawrence, were nothing short of exceptional. "It was like picking gold from a pile of treasures," he says. "Every single take was great, and if I picked all the worst takes, the film would still be fantastic. It is true that Jennifer's worst take is often a lot better than the best take from some actors I've worked with in the past. She is that good. But I still used Fusion to change performances or modulate timings throughout the movie, based on what Francis and I felt needed to happen in the story. There's a moment in Mockingjay – Part 1 where I used Fusion to drop one of Jennifer's lines, but not because her performance wasn't good. Katniss is at the bottom of a crater after District 13 has gone through an intensive bombing, and they are asking her to do a propaganda film. She says, 'I can't do this. He is going to kill Peeta. He is going to kill Peeta.' That's how all the takes were filmed and how it played in dailies. But Francis and I felt it would be stronger for her to build up to the decision and think about it, so we could see on her face what she's experiencing internally. Instead, we had her say, 'He's going to kill Peeta. He's going to kill Peeta. I can't do this.' I didn't want to cut away from her, because the moment is all about her and what she's thinking. If I had cut away and dropped the line in, it would have killed the moment. I used Fusion to slow her face down and morph it over itself, covering the spot where she'd said the line 'I can't do this' at the very beginning. This is a hand-held shot, too. We stayed on her through the duration of the take. If you look closely in the final film, you can actually see her swallow and then say the line. But she's not saying the line on camera any more. I showed this scene to an ACE tech group recently, then had my wife pretend to be Katniss Everdeen and did the effect live on her. I also routinely use Fusion to do split screens to tighten performances between actors, or to do head replacements. It's really indispensable in those situations."
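The split screens he mentions come down to blending two aligned takes across a soft matte, keeping one actor from take A and the other from take B. A minimal numpy sketch of that idea (the function and its parameters are hypothetical, not anything from Fusion or Avid):

```python
import numpy as np

def split_screen(take_a, take_b, seam_x, feather=40):
    """Blend two aligned takes across a soft vertical seam.

    take_a / take_b: HxWx3 float arrays with the same framing, e.g. a
                     locked-off two-shot; take_a fills the left side,
                     take_b the right.
    seam_x:          column where the split sits (put it in dead space
                     between the actors)
    feather:         width of the soft edge in pixels
    """
    w = take_a.shape[1]
    x = np.arange(w, dtype=np.float32)
    # 0.0 left of the seam, 1.0 right of it, ramping across the feather zone.
    matte = np.clip((x - (seam_x - feather / 2)) / feather, 0.0, 1.0)
    matte = matte[None, :, None]                  # broadcast to HxWx1
    return take_a * (1.0 - matte) + take_b * matte
```

In practice the seam can also be animated or hand-drawn per shot, which is where a compositing tool earns its keep over a straight vertical wipe.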
6. Let sleeping babies lie—and fix it in Fusion
"The baby at the end of the film is actually Jennifer's nephew," says Bell. "He slept most of the time and only woke up and made eye contact with her once before falling back asleep again. I had to replace the baby's eyes and mouth and other things throughout the whole scene. All the over-the-shoulder shots were replaced because Jennifer was holding either a different baby or a fake baby. At one point, I think I grafted on the arm that the baby lifts over its head from some of the footage. There isn't a single shot in that final sequence that hasn't been affected one way or the other to make it play as intimately as possible. We used Fusion for every one of the temps. That's the thing about it: once you know how to use it, and you're in the Avid, it's just a pleasure to use. I love it."