Portraying a Future Beyond Blade Runner, Divided into Haves and Have-Nots
Novelist Richard Morgan's Altered Carbon posits a distant future in which lives can be extended by porting consciousness into a new body, or sleeve, in the era's parlance. Mercenary Takeshi Kovacs finds himself awakened in a new form (played by RoboCop's Joel Kinnaman), then tasked by the wealthy Bancroft (James Purefoy) with solving a murder: that of the rich man himself.
Devising an appropriately exotic mix of high-tech and used-future elements called for a combination of in-camera solutions and extensive visual effects. Pilot executive producer Ralph Winter, a veteran of the Star Trek and X-Men feature film franchises, recruited several vendors, with Double Negative serving as lead house, abetted by Atomic Arts, Milk VFX and Lola Visual Effects, plus graphics house Rushes.
Production recruited a pair of cinematographers, Martin Ahlgren (Daredevil) and Neville Kidd (Sherlock, Outlander), to shoot showrunner Laeta Kalogridis' 10-episode series. Kidd, who is currently shooting Umbrella Academy, another Netflix project for Carbon producer Steve Blackman, continues to employ the camera he and Ahlgren relied upon for this project: the Arri Alexa 65.
StudioDaily: Was shooting with a higher-res camera mandated, given the VFX demands and the nature of the science-fiction setting?
Kidd: The sheer volume of data can be quite daunting when you're recording everything in ArriRaw, whether you're shooting 4K, 6.5K or, as we chose, 5K. [Codex Vaults were used by Encore Vancouver, which processed the 4K ProRes dailies.] That was a happy medium, letting us reduce costs while still retaining higher image quality, sacrificing just a bit of resolution while keeping the advantage of this large sensor. Arri was very generous, giving our show a great deal on these very expensive Alexa 65s. Going with 5K also let us use lenses faster than the dedicated 65-format primes; we used Canon Cine Primes plus Cooke 5/i and S4 lenses, which looked good down past T1.5.
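For a sense of the data volumes Kidd is describing, here is a back-of-envelope sketch (not production figures) of uncompressed ArriRaw data rates, assuming 12-bit Bayer data at 24 fps; the frame dimensions are representative approximations rather than exact Alexa 65 sensor modes:

```python
# Back-of-envelope ArriRaw data-rate estimate, assuming uncompressed
# 12-bit Bayer data at 24 fps. Frame dimensions are representative
# approximations, not exact Alexa 65 sensor modes.

def arriraw_gb_per_hour(width, height, bit_depth=12, fps=24):
    bytes_per_frame = width * height * bit_depth / 8
    return bytes_per_frame * fps * 3600 / 1e9  # decimal gigabytes

for label, (w, h) in {
    "6.5K open gate": (6560, 3100),
    "5K":             (5120, 2880),
    "4K":             (4096, 2304),
}.items():
    print(f"{label:>14}: ~{arriraw_gb_per_hour(w, h):,.0f} GB/hour")
```

Even at these rough numbers, stepping down from 6.5K to 5K trims roughly a quarter of the storage per recorded hour, which is the "happy medium" logic in miniature.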
How did the partnership between you and Martin Ahlgren develop?
Both DPs were brought on by the producers at the same time, which was a bit different. Three months of prep was a more than decent amount of time, but what was really extraordinary was having that extra time for two DPs to collaborate. The level of collaboration on Altered Carbon was really an outstanding takeaway for me; great ideas could live together without overwhelming the thrust of the story, and we bounced ideas off one another to help create a visual continuity for this world that we would both be happy with. And while Martin shot the pilot, I was there from day one too, shooting additional units. Production got two solid opinions on every issue from us, and that was important because we needed to balance the principal objective, attaining a cinematic vision, against remaining on budget. That required making the most of what we had with our sets and what we could achieve in-camera.
So you were trying to limit the need for VFX set extensions?
We thought long and hard about ways to convince the viewer that they were seeing enormous depth in this future city. Our production designer was Carey Meyer [Firefly], and he built the main street set in a Vancouver, BC, warehouse that had held an old printing press. The set was five blocks long and the stage let him build up to 80 feet high, so that gave us a lot of framing options and things to see without always needing VFX set extensions. But the other big decision we made to keep the effects down was to make sure that whenever we were shooting in either direction down the street, there would be an in-camera solution. This time out, it was something new to me: black-light translights.
Can they be lit conventionally as well as with UV?
For daylight scenes, we'd backlight them with HMIs, but for nights we used black lights, which made those dark streets seem more alive and three-dimensional. Translights can often come off as rather one-dimensional solutions, so we were looking for ways to bring a greater sense of depth and life to these static representations. We used UV-reactive paints to give a glowing neon look, and also put tiny magnetic LED lights on them to make the painted signs read as practical sources and to suggest a false sense of perspective. Those tiny sources would give off a little glimmer as the camera panned left to right, which brought that bit of life suggesting a real environment.
Were there other aspects that you could animate on the translights?
We projected a moving pill of light onto the translight's painted metro train. That meant using black light on the front [camera] side of the translight while rear-projecting those pills across the backside.
There are ocular equivalents to cell phones and all sorts of other graphic advertising throughout the series. How did you go about creating practical interactive lighting for these hologram-like forms?
We both had a very close relationship with VFX supervisor Everett Burrell [whose background includes a decade devising makeup effects before his segue into visual effects]. That was essential, since we had to know which things on set they could fix in post, plus how much they would be adding to what we did. We broke down which of the holographic displays gave off interactive light that we needed to create live and which didn't. Sometimes you didn't want to bring in too much interactive light, because it would spill onto the rest of the set when you really only wanted it to read on a particular character. We might use something like a ball of LEDs on a stick to give the impression of light from the hologram. So there'd be one pass with the actor playing the hologram scene right there in the set, then a clean plate, plus an additional pass for the LED lights. It got very complex, but when you're storytelling, you have to be very clear about these tech aspects. The audience sees all of these holograms, but sometimes the images are only supposed to be visible to a single character while others remain oblivious to them, so we had to create a 'bible' for the lights, explaining which could be seen by everyone, so we'd know what lighting effects needed to land on each person at any given time.
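The 'bible' Kidd describes is a paper production document, but the bookkeeping problem it solves maps naturally onto a small lookup table: for each holographic element, who can perceive it, and therefore who needs interactive light landing on them. A minimal, purely hypothetical sketch (the names and fields are invented for illustration):

```python
# Hypothetical sketch of a hologram "bible" as a lookup table: for
# each holographic element, which characters can perceive it, so the
# crew knows who needs interactive light landing on them in a setup.

from dataclasses import dataclass

@dataclass
class Hologram:
    name: str
    visible_to: set[str] | None = None  # None = visible to everyone

    def lights_character(self, character: str) -> bool:
        return self.visible_to is None or character in self.visible_to

bible = [
    Hologram("street_ad"),                            # public display
    Hologram("private_call", visible_to={"Kovacs"}),  # one viewer only
]

for holo in bible:
    for character in ("Kovacs", "Ortega"):
        if holo.lights_character(character):
            print(f"{holo.name}: interactive light on {character}")
```

The payoff is the same as the paper version: a public ad spills light on everyone in the scene, while a private call only justifies a lighting effect on its single viewer.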
Were LEDs your go-to for lighting throughout?
Altered Carbon is lighting heaven for DPs. We could explore looks in an AI world as well as the real world, so Martin and I ended up using just about every type of lighting unit imaginable. It became a kind of obsession, actually, because that was necessary to depict the range of environments in this future. Since we were showing so many different living conditions, there was not just one default lighting approach but a whole range of scenarios. We used a lot of SkyPanels and could change the color on those easily, making huge washes of color as needed. We could put car chases through these streets, changing light color and temperature to vary the environment. LED let us develop so many variations that would have been difficult, or at least very time-consuming, with tungsten.
Was most of this equipment off-the-shelf?
We ended up making a lot of our own custom LED units, which required a lot of quality control, because there's so much LED product out there that, for one reason or another, just doesn't work right out of the box. When you shoot high-speed, some cheaper units will flicker. And there's the potential for banding and frequency problems that weren't issues with traditional lighting.
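The flicker and banding Kidd mentions come from mains ripple or PWM dimming: the light's output pulses faster than the eye can see, but not always faster than a high-speed shutter. A rough rule is that an exposure stays clean when it integrates enough whole flicker cycles that frame-to-frame variation is negligible. A toy check under those assumptions (the driver frequencies are illustrative, not measured fixtures):

```python
# Toy flicker check: frame-to-frame brightness varies by at most the
# fractional flicker cycle caught in each exposure; when that fraction
# is large relative to the total cycles integrated, it reads as
# flicker or banding. Driver frequencies below are illustrative.

def flicker_risk(fps, shutter_deg, flicker_hz, max_ripple=0.02):
    exposure_s = (shutter_deg / 360.0) / fps
    cycles = exposure_s * flicker_hz
    frac = abs(cycles - round(cycles))
    ripple = frac / cycles if cycles >= 1 else 1.0  # worst-case variation
    return ripple > max_ripple

for flicker_hz in (120, 1_000, 25_000):  # mains ripple, cheap PWM, fast driver
    for fps in (24, 1000):               # normal speed vs. Phantom high-speed
        status = "flicker risk" if flicker_risk(fps, 180, flicker_hz) else "safe"
        print(f"{flicker_hz:>6} Hz light at {fps:>4} fps: {status}")
```

This is the pattern Kidd describes: a unit that looks perfectly steady at sync speed can flicker or band badly once a high-speed camera comes out, unless its driver frequency is very high.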
Did the dramatic situation, differentiating between the rich living in the sky and the masses below, suggest distinct approaches for camera movement?
We wanted the moves to be more sleek and considered in the tower scenes, while things were often handheld down on the streets. Then, when things change, the whole approach reverses, and we start shooting handheld during the frenetic activity in the towers as the super-rich see it all start to fall apart. We spent a couple of days with two Technocranes going when we had [performers] jumping into the air and in zero-gravity situations. Technos were great for that. When we were slightly cramped for space on set, we liked using remote heads too. You could send in a dolly and stay on the wheels, which worked brilliantly.
On big VFX shows, it is a reasonable approach to work out every last detail in advance. Do you find that cuts into your ability to take advantage of accidents that come up on the day?
You try to know the important details going in. That is to say, if you're shooting a scene in backlight, know that the sun will be behind your cast in the morning. It's like going to war, owing to so many variables, but even if you prep to within an inch of your life, things happen and you wind up flying by the seat of your pants because the sun went behind a cloud. It's like the fingers of God have entered your shot, but you just have to go for it anyway. I always try to embrace that, while still making the best use of a given location. And a lot of the success or failure with this depends on your crew. We had phenomenal people behind the camera, including focus pullers like 1st AC Kieran Humphries, who had to have a very Zen relationship with the camera. It was like a dance he shared with the actors and the camera.
Were there many instances where you shot with multiple cameras?
For the big battles, we always operated additional cameras. Episode seven is a prequel episode, and at one point there were five cameras shooting, including a drone and a high-speed Phantom. The high-speed shoot required some tremendous engineering, because we wanted this 1,000-fps camera to make a circle at extremely high speed, and this had to work during a forest fire while an army attacks. Grip Kim Olsen and [rigging key grip] David McIntosh, who won an Oscar for the inflatable green screen, built us this rotating camera rig, which, though made very quickly, worked fantastically well while still providing complete safety.
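To see why the rig engineering was so demanding: footage captured at 1,000 fps and played back at 24 fps is slowed roughly 42 times, so any camera move meant to read at normal speed on screen has to happen roughly 42 times faster on set. A quick worked example (the orbit timing is hypothetical, not the production rig's actual spec):

```python
# Why a 1,000 fps orbit rig must move so fast: the retime slows on-set
# motion ~42x on screen. The orbit figures below are hypothetical.

capture_fps, playback_fps = 1000, 24
slowdown = capture_fps / playback_fps       # ~41.7x

screen_orbit_s = 8.0                        # desired on-screen orbit time
set_orbit_s = screen_orbit_s / slowdown     # real time the rig gets to do it
rpm = 60.0 / set_orbit_s

print(f"Slowdown factor: {slowdown:.1f}x")
print(f"An {screen_orbit_s:.0f} s on-screen orbit takes {set_orbit_s:.3f} s on set")
print(f"That's about {rpm:.0f} rpm for the rotating rig")
```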
There are often issues with shooting flames because the highlights can be lost …
But we didn’t experience any of that blowing out, because the latitude with the Alexa 65 was just so phenomenal. You could see all the textures and details in the fire. During prep, we actually tried pretty hard to trash the image, but found that there’s so much data that it is really tough to ruin the picture. I will say that you do have to watch the whites in the HDR phase of the DI to make sure they don’t go too crazy. You want the whites to bloom but not become those clipped, overexposed whites.
How was the DI handled?
Martin and I had both gone on to new jobs, so we did our HDR grades remotely, which is a fantastic development of just the last few years. Before, you'd have to give a lot of notes beforehand and then trust that things would work out, but now you just need access to a 4K HDR monitor. I did mine from Blazing Griffin Post in Glasgow, on a conference call with [Deluxe's Company 3] colorist Jill Bogdanowicz in L.A. We did the HDR first, and then there was a separate SDR pass afterward.
Is Altered Carbon anything like what you imagined it might be visually when first reading the script?
Martin and I had both read the book as well as the script, and when author Richard Morgan came on set, I had a few discussions with him. Getting a lot of his vision on-screen was a genuine concern for me, along with making sure it lived with our intentions for the city and world. Whenever I read something new, I always get a very strong picture in my head, no matter how complex a project it might become. Even with all the efforts and brilliant innovations by the effects crews and the art department, it really does look similar to how I first envisioned it.