On the Role of the Colorist, the Scratch VR Suite, and Making All 360 Degrees of Picture Look Their Best

Proving once again that classic rockers can leverage new media as well as the young'uns, Pure McCartney VR, a five-part series directed by music-video veteran Tony Kaye (American History X, Lake of Fire) and produced by the VR specialists at Jaunt Studios, puts viewers in the studio with McCartney himself as he discusses the creation stories behind some of his most famous music. 360-degree interview footage was captured at McCartney's rural recording studio with the new Jaunt One camera so viewers can look around the room and check out the scattered gear and other artifacts while McCartney talks. RED motion picture cameras and still cameras captured additional picture elements, some of which are composited into the scene along with archival video projections and other imagery that add visual panache to the proceedings.

Jaunt Cloud Services was used to stitch the 24 separate 1920×1200 video feeds into stereo 3D, 4K-resolution footage for editorial (led by Duncan Shepherd) to work with. Once Shepherd handed over the sequences, color grading and finishing were managed by digital colorist Dave Franks in the new Scratch VR suite from Assimilate. Franks worked in consultation with Kaye, grading each sequence independently in between sessions where he dialed in looks with the director. His color environment is carefully controlled, and he carries a set of "confidence images" he knows intimately, which help him confirm that any new color suite has been properly calibrated. He used Scratch's 360-degree toolset on a calibrated viewing monitor, while QC and client review were done on head-mounted displays.
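
To give a sense of the stitching workload implied by those numbers, here is a rough back-of-the-envelope sketch. The 24-module count and the 1920×1200 per-feed resolution come from the production details above; the frame rate and per-eye output dimensions are illustrative assumptions, not figures confirmed by the production.

```python
# Back-of-the-envelope math for the stitching workload described above.
# The 24-module count and 1920x1200 per-feed resolution come from the article;
# the frame rate and per-eye output size are illustrative assumptions.
CAMERA_MODULES = 24             # individual sensors on the Jaunt One rig
FEED_W, FEED_H = 1920, 1200     # resolution of each raw feed
FPS = 30                        # assumed capture frame rate
OUT_W, OUT_H = 3840, 2160       # assumed per-eye 4K output frame
EYES = 2                        # stereo 3D: one panorama per eye

raw_pixels_per_frame = CAMERA_MODULES * FEED_W * FEED_H
out_pixels_per_frame = EYES * OUT_W * OUT_H

print(f"Raw capture: {raw_pixels_per_frame / 1e6:.1f} MP per frame, "
      f"{raw_pixels_per_frame * FPS / 1e9:.2f} gigapixels/second at {FPS} fps")
print(f"Stitched stereo master: {out_pixels_per_frame / 1e6:.1f} MP per frame")
```

Even at these rough, assumed numbers, every second of capture represents well over a gigapixel of raw sensor data to align and blend into the stereo panoramas, which helps explain why a cloud service handled the stitching rather than a single workstation.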

StudioDaily asked Franks about his experience on Pure McCartney VR, the current state of the technology, and where VR narratives might take us in the future.


Pure McCartney VR

StudioDaily: Many VR experiences are still sub-optimal — the cameras being used to capture them don't have the quality of cinema cameras, and the devices being used to view them lack the resolution and field of view that would make a truly enveloping experience. How do you make VR look its best under those less-than-ideal conditions?

Dave Franks: You simply do the best you can. I have a certain aesthetic, which tends to steer away from a super-poppy look, where the imagery looks overly processed. But sometimes the original footage is actually turned over to me already that way, unfortunately. So I work to clean up the image as much as possible, removing some digital artifacts by keying, and generally toning down the picture, all while trying to hold true to my personal aesthetic and make the project look as compelling as I can. While I'm thoroughly enjoying it all, I accept that VR is a bit of a work in progress right now, both in terms of imaging and in improving the post-production processes so they can deliver higher fidelity for a given bandwidth.

Also, when grading a VR project, you have to know what your hero release format or final output device is going to be. If the hero device is a [Samsung] Gear VR, then that's what you'll ultimately need to look at when grading a project, and that's also what you'll need to have your client look at when they approve your work.

If it's not as good or less compelling on other devices, that's unfortunately just how it is. I follow the same approach when grading a feature. It's never going to look any better than it does in my color bay, or in the client's calibrated viewing environment. So rather than chasing yourself, trying to perfect it for every screen, which often leads to a dulled, mediocre result, you need to make it look great in the DI environment and let go of worries about how it appears elsewhere, since that's beyond your control. Similarly with VR, we simply can't manage it throughout the entire ecosystem, so we have to make it look the best we can in the environment we're grading in and on the hero device that's been specified.

I still remember seeing a screening of [concert documentary] U2 3D, which you worked on, in a Manhattan screening room under basically ideal viewing conditions. That's still my favorite 3D viewing experience ever. When I mention that to people who worked on it, they always say that film was an enormously difficult project, demanding a lot of time and money to get it just right. But the results were fantastic.

Thanks. It was an incredibly hard show, both politically and technically challenging, but at the end of the day, I think we did good, and I have no regrets on the choices we made to perfect that imagery. Every time I see it, I sit back, look at it and take it in, and think, “We really nailed this.” I was the VFX supervisor for the show, but I also built the team of post personnel, handled the imaging, supervised the color grading at Technicolor and guided all things picture well into release. It was great.

How do you generally see your role as a colorist?

Apart from the creative and technical skills I bring to a project, I see my role almost like, I suppose, a legal justice: someone with a very straight and narrow obligation, who needs to separate that obligation from personal bias that can often distract from the task at hand. As a colorist, my obligation is to the DP, the director, and the other filmmakers. Whatever my specific role on a show might be, my responsibility is ultimately to support their careful work and vision and shepherd the picture through the process as best I can. Sometimes the color or VFX choices that wind up in a project are not my preference, because I might have a different aesthetic or wouldn't make the same choices a client would. But, ultimately, worrying about those issues distracts from the job. The job, as I see it, is to learn and deliver on the filmmakers' vision, separate from my own. I compare this role to that of a WWII submarine crew: we're a silent service. You guide your clients as best you can, pitching ideas where you can, all toward a good aesthetic. But at the end of the day, the obligation remains to faithfully support the vision of the filmmakers and guide their work through the process, not stepping on the DP's imagery along the way. And it's thrilling for me to have this responsibility when delivering a client's picture.

Paul McCartney

You used Assimilate Scratch on Pure McCartney. What are your feelings about Scratch as an overall finishing tool?

I like Scratch a great deal, and think it’s extremely versatile. It definitely allows me to do everything I need to do. From a strictly color-grading standpoint, I’ve used other tools that I find can be more of a visceral extension of my own hand sometimes, similar to a favorite hand tool or the precision of a Leica or Panavision camera. Really. Scratch is working toward being that extension for me, although I've found that sometimes other tools offer a bit more nuance, precision, and control in certain areas.

That said, we simply could not have done Pure McCartney without Scratch, nor could I have completed many other projects in my career without Scratch’s breadth, versatility, and Assimilate’s commitment to supporting its users. For example, we couldn't have done U2 3D or Superman Returns without Scratch, either. It's a Swiss Army knife. It isn't always the best tool for all tasks, but as an arsenal it’s invaluable, and often the first tool on the scene. It's now very strong in the VR space.

Having worked on a high-profile VR project, what do you think of VR’s potential as a storytelling medium?

To be honest, I really was on the fence when considering working in VR, having ridden out the 3D wave. 3D was huge for a few years until it tapered off and found its niche in animated features and converted tentpole movies, apart from those unique projects from strong 3D proponents like Ang Lee, James Cameron, etc. So I wondered if VR would be another passing trend to sell a bunch of hardware, or if it was really legit as a compelling medium. Having come to Jaunt, seeing what it is and working with it, well … I wouldn't say that I drank the Kool-Aid completely, but my perspective on VR has shifted toward the positive, realizing it's definitely for real. These are the early days, however, and the technology may actually be farther along than the storytelling language.

We're going to need early-adopter filmmakers to find unique ways of using the medium. Consider the 2D language of the establishing shot, the two-shot, the reverse, and so on: all the things we take for granted when viewing a story. That now-traditional cinematic language ported easily to stereoscopic features, but it isn't yet established for VR, and I think those early adopters will be the ones working to develop it.

Also, while I'm a picture person, I think the significance of sound sync and integration with picture hasn't been fully realized in VR. Three-dimensional sound, as a cue to where someone might want to look [in a 360-degree environment], still needs to be fleshed out and prioritized when developing a VR experience. When we did U2 3D, sound sync was critical. Depending on how big a movie theater was, with sound traveling at roughly 1,100 feet per second, there would be places in the theater where Bono appeared to be in sync and others where the sync was soft. But when it was shit-tight, pardon the expression, the movie was so much more compelling. I think the same is true for VR. Sound direction and sync are so critical to being fully immersed, enveloped, and engaged.
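
As a rough illustration of the theater sync math Franks is describing, the sketch below converts a seat's distance from the screen speakers into audio delay and film frames. The 1,100-feet-per-second figure is his; the 24 fps frame rate is an assumption for illustration.

```python
# Rough illustration of the theater sound-sync point above: how much later
# audio arrives at seats farther from the screen speakers, and what that
# lag means in film frames. The 1,100 ft/s figure is from the interview;
# the 24 fps frame rate is an assumption for illustration.
SPEED_OF_SOUND_FT_PER_S = 1100.0
FPS = 24
frame_ms = 1000.0 / FPS  # about 41.7 ms per frame

for distance_ft in (30, 60, 90, 120):
    delay_ms = distance_ft / SPEED_OF_SOUND_FT_PER_S * 1000.0
    print(f"{distance_ft:>3} ft from the speakers: audio lags picture by "
          f"{delay_ms:5.1f} ms (about {delay_ms / frame_ms:.1f} frames)")
```

Under these assumptions, a seat roughly 46 feet farther from the speakers than the reference position is already a full frame late at 24 fps, which is why the same print can feel tight in one row and soft in another.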

Another thing that is unique to VR, and still being fleshed out, is active vs. passive viewing by an audience. Unlike a 2D movie, which you watch once and where each re-watch yields little new information, re-watching a VR experience often gives you significantly more information that you might not have taken in the first time. Unlike a traditional proscenium-style movie, where someone lets the movie happen to them, VR has the potential for more active engagement, and it truly requires an engaged and active viewer. Sound and picture sync are key to that engagement through synchronized sensory cues.

Have you seen anything recently that you thought really does point the way forward for what's possible in VR?

Yes. At the Kaleidoscope VR Showcase in the NAB VR Pavilion this year, I saw a VR experience created for the HTC Vive called "La Péri." It's a French VR piece in which you experience a wire-frame ballet dancer, a motion-captured CG character, dancing before you and through you. It's a very intimate, moving experience. You can be as close to or as far from her as you want, and follow her with your display and a flashlight you hold as one of the virtual tools, all while completely immersed in the classical music, La Péri by the French composer Paul Dukas. You're standing in the middle of the Las Vegas Convention Center, but it's still so moving. It's truly one of the most compelling VR experiences I've ever seen.