Shooting a New York Train on a U.K. Set, Pulling Depth-of-Field with CineFade, and Keeping HDR Under Control
In The Commuter, Michael MacCauley [Liam Neeson] seems to be having one of those days. First, the ex-cop loses his insurance job. Then, while commuting home, he finds himself drawn into a conspiracy that will play out with deadly consequences on — and eventually right off — the rails. The film marks the fourth time Neeson has worked with director Jaume Collet-Serra, after their popular collaborations on the thrillers Unknown, Non-Stop and Run All Night. The filmmaker selected Paul Cameron, ASC, as his director of photography. Cameron, whose work on the pilot episode of HBO’s Westworld was profiled in Studio Daily last year, relates the challenges of filming a New York-set tale entirely on U.K. soil, with all of its train interiors shot on soundstages.
Studio Daily: With all the action and camera movement, did you rely on any previs?
Paul Cameron: It wasn’t necessary. I had worked with the director years back on commercials, so we were already comfortable designing sequences together on our own. There was some previs [from Nvizible] for visual effects supervisor Steve Begg to plan out the actual train crash, but that was about the CG effort [handled principally by Cinesite, aided by Iloura], not the live-action. However, we did do some storyboarding for our end of things, which helped with planning the dynamics. There were a lot of really extreme and dramatic moves to accomplish in very tight spaces, so I chose the ARRI Alexa Mini [provided by ARRI London].
I’ve read it was the director’s idea right from the outset to shoot all of the live-action train sequences entirely on stage with an articulated 30-ton set. How did that impact your approach?
First off, it meant we were going to be shooting blue screen for the windows, so I had to figure out a lighting plan that would take us all the way through the afternoon and into evening for the character’s ride. In New York, I shot 5D reference plates while traveling north on the actual train route, at the times of day that were reflected in the script. So that gave us a solid idea of what we’d need to emulate in terms of light levels and interactives. Most of it takes place on the train after it leaves Grand Central; they’re in the tunnel for 10 minutes, then pop out in late afternoon. The art department [under production designer Richard Bridgland] built a representation at Pinewood of the Grand Central platform, so we actually drive our train in and out of that station.
So your recon of the route gave you an idea of the various looks. What kinds of units were used to recreate that on stage?
My gaffer Mark Clayton did a terrific job, building a rig with 60 vari-lites [Martin Mac Viper Performance moving lights] and 60 ARRI SkyPanels. We used LEDs that let us make various color changes, programming patterns of light that gave us the dappling effect of sunlight through trees. Once that was programmed, we were able to switch very quickly during shooting from scenes taking place at one time of day to another.
Were you able to use the same lighting for close-ups, or did you usually enhance those shots?
Many times, when you want to get some real interest for the faces or create a pattern on the wall, it is smart to bring in another bar of effects lights. Whenever we’re in a tunnel, I like to play some other interactive aspect. It’s always about how to best enhance the natural reality of the moment while at the same time embellishing it for the drama, so I often pushed as far as I could.
I got a kind of Das Train vibe from the shots that rush ahead on the Z-axis through the compartment.
I think it works pretty well, and that immersive you-are-there approach helped to ensure audience suspension of disbelief, which becomes a real concern when you’re faking the whole thing with blue screen. I needed to develop a tracking system for use while moving through the length of the train. We buried our track up inside the ceiling of the cars, then used a special stabilized remote head called Stabileye that is currently only available in the U.K. We packed a stripped-down Mini in there, which made for a super-small profile, letting us race down the aisle between the seats and passengers. We could also pull the camera from the rig, plug it into a backpack and go stabilized handheld right away.
It sounds like you must have taken some time to engineer all this.
The rig featured a computerized winch system. Not only could we track up and down, but there was an arm so we could spin right, which let us fly right around a stationary character at high speed. There was potential danger when maneuvering this close to the actors, but the system was quite reliable. We also disengaged the winch system quite a bit of the time. Key grip Paul Hymns pushed the rig up and down manually, so he could react to the way an actor moved, which was important for telling this story, since several characters are suspects, and the camera looking at them reflects a certain paranoia at times, not knowing whom to trust.
The other big challenge was where to hide our lighting on our set; we had lights up at the top of the frame and hidden down below the frameline as well, along with lights on the sides that we could cover with blue screen. [Additional units were mounted above the windows, which helped provide pools of illumination during the tunnel passages.] I used [uncoated Zeiss] Master Primes [from CW Sonderoptic] in spherical 2.40, so that aspect ratio gave us a little space to hide things at the top and bottom. Then again, we were on wider lenses most of the time, so the battle of where to light from rears its head. I used lots of LED mats taped to the ceiling, plus small panels wherever we could find space, and some handheld units too when we could get away with it, to provide eyelight or some fill. It was definitely tricky dealing with this enclosed reality, like shooting in a storage facility.
It looks like you used a bit of atmosphere on the train interiors.
It helped to have some atmosphere in the daytime scenes; the shafts of light it created gave a cinematic dimension to the interior of the train. There’s a fine line when using smoke, and we did find it advisable to back off somewhat a few times. Like so much of the job, determining the aesthetic is very subjective, and as much about taste as it is about experience. We found the smoke affected our vari-lites and also the sunlight effects coming in through the windows. Digital is just so sensitive that when there are color shifts in the light, it becomes very apparent. It is very difficult even with newer light meters that read different spectrums of light; it seems your eyes can pick up on those subtle differences better than any tool, at least right now. So we got into a routine of dropping the lights down every three or four days to clean the filters inside them, doing what we could to maintain the color temperature as precisely as possible.
Can you discuss the camera workflow?
We captured in ARRIRAW. DIT Tom Gough built a couple of look-up tables, one more contrasty than the other, and we switched between them depending on the levels of smoke used on the train. I chose them to emulate a print stock look. We did some work on the ASC CDLs later on, but these LUTs translated well all the way through editorial to visual effects. Pinewood Digital handled our dailies, and I worked with Goldcrest colorist Adam Glasman on the DI.
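The ASC CDL mentioned here is a small, standardized per-channel color transform that travels alongside the footage through editorial and VFX. As a rough illustration of how it works (the slope/offset/power values below are made up, not from the film's actual grade):

```python
# Illustrative sketch of the ASC CDL per-channel transform.
# The formula is: out = clamp(in * slope + offset, 0) ** power
# Values here are hypothetical examples, not from The Commuter's grade.

def apply_cdl(value, slope=1.0, offset=0.0, power=1.0):
    """Apply the ASC CDL transform to a single channel value (0.0-1.0 scale)."""
    v = value * slope + offset
    v = max(v, 0.0)  # negative results are clamped before the power step
    return v ** power

# Identity settings leave the pixel untouched:
print(apply_cdl(0.5))  # 0.5
# A mild contrast tweak with hypothetical values:
print(apply_cdl(0.5, slope=1.1, offset=-0.02, power=1.2))
```

Because the transform is just three numbers per channel, it can be exchanged between dailies, editorial, and the DI without baking the look into the media, which is why Cameron's LUT-plus-CDL approach "translated well all the way through."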
Did you wind up using drones for establishing views of the train?
We shot helicopter aerials on the Mini with the Shotover system for establishing shots of the city and all up and down the line, with trains emerging from tunnels and crossing the countryside, building a path from New York to the final destination. During the main unit shoot in London, we only used aerials [by Flying Pictures] on the final sequence, with the train settled after crashing, which was all shot practically on the backlot, where we built the set of the train as it came to rest after the crash.
I noticed one of those simultaneous zoom lens/dolly shots of Neeson as well. Were there any other tools and tricks you tried out on this film?
There are a couple of moments when I was able to use Cinefade, which is an in-camera way to alter depth of field. It is a complicated piece of equipment that uses two spinning polarizers, so there’s severe light loss involved. I have to say it is much easier to use when shooting in daylight, but since we were lighting everything on the train set ourselves, that complicated matters. We’d have to light things up by five or six more stops to be able to make it work. When Liam finds out he is fired, we go into slow motion as the news begins to register on his face, and Cinefade let us lose the depth on the background while he reacts. It was a very different feeling from just doing a typical push-in; except for his nose and eyes, everything is soft by the end of the shot.
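Since each stop of exposure represents a doubling of light, the five-to-six-stop penalty Cameron describes means flooding the set with roughly 32 to 64 times the light. A quick back-of-the-envelope check:

```python
# Each f-stop is a doubling of light, so compensating for an n-stop
# loss means multiplying the light level on set by 2**n.

def light_multiplier(stops):
    """Return the factor by which light must increase to recover `stops` of loss."""
    return 2 ** stops

for stops in (5, 6):
    print(f"{stops} stops of loss -> {light_multiplier(stops)}x the light")
# 5 stops of loss -> 32x the light
# 6 stops of loss -> 64x the light
```

That multiplier is why a device like Cinefade, with its two polarizers stacked in the light path, is far easier to use in daylight than on a stage lit entirely with fixtures.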
Are there any technologies or new approaches that have you excited about what is on the horizon for cinematographers?
Dolby Vision is extremely exciting to me, but also daunting. Dolby is just about the only place doing HDR releases right now, but there are some issues for filmmakers and cinematographers like Jaume and me, who want to maintain the look of the film we shot as much as possible. We don’t want to see the look taken so far out from ordinary that it negatively impacts the filmgoing experience, just because there is pressure to use the expanded dynamic range. I’m on the board of governors at the ASC, and currently this matter of controlling what happens with HDR is a very important topic for us. Dolby has certain expectations, but cinematographers need some say on this as well. Yet we aren’t even always invited to be present for HDR transfers.
And the possibilities for higher quality levels for projected imagery?
I find 4K laser projection quite stunning in how it displays true blacks. I can’t tell you just how terrific I felt when watching Blade Runner 2049 in HDR; seeing it in standard after that, there was just no comparison, with one immersive and almost 3D, while the other felt like all the flat gray projection we’ve been looking at for the past decade. Theatrical releases really do need the highest quality projection to satisfy paying audiences; digital has let us duck the mismatched reels and bad registration that would make you squirm and go nuts, and now it can take things to a whole new level.
About six months back, when we discussed your work on the Westworld pilot, you were supervising the HDR transfer for the whole first season.
Jonathan Nolan brought me in specifically to do that, and there were some surprises; people actually recoil sometimes from the image because it can be so intense. The one I remember most clearly was a shot of Evan Rachel Wood in the pilot with the sun right behind her. When you see it in HDR, the image is almost piercing; the intensity is actually more than your eye is used to dealing with in real life, because out in the world, you close your eyes a bit to help adjust to the light. There’s a bit of trickery with HDR, since it stretches the range from pure black to intense white, and this brightness is measured in nits. Cinematographers are wondering if we can cap the nit level, because above a certain point it can change the impact of the image rather drastically. It is nice to be able to let things go a bit more intense on occasion, just to expand possibilities, but you have to be careful. The other aspect is color rendition. You have to be careful with just how much chromatic impact is useful vs. hurtful.
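The nit ceiling under discussion maps to the SMPTE ST 2084 (PQ) transfer function used by Dolby Vision, in which the signal encodes absolute luminance up to 10,000 cd/m² (nits). A minimal sketch of decoding a PQ code value to nits and applying the kind of cap cinematographers are asking about (the 1,000-nit cap below is purely illustrative, not an actual mastering recommendation):

```python
# Decode a normalized PQ (SMPTE ST 2084) signal value to absolute
# luminance in nits, then optionally cap it.

M1 = 2610 / 16384          # PQ constants from SMPTE ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal):
    """EOTF: normalized PQ code value (0.0-1.0) -> luminance in nits."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

def capped_nits(signal, cap=1000.0):
    """Decode to nits, then clip at an illustrative display/mastering cap."""
    return min(pq_to_nits(signal), cap)

print(pq_to_nits(1.0))   # 10000.0 -- full signal hits the PQ ceiling
print(pq_to_nits(0.0))   # 0.0
```

Because PQ encodes absolute luminance rather than a relative range, a peak highlight like the sun behind an actor can legitimately be mastered far brighter than anything SDR could carry, which is exactly why viewers sometimes recoil.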