More Tales of Brave Engineering from 2008 HPA Retreat
Wayne Tidwell, the data-capture engineer on Benjamin Button, entered basic scene-and-take information. The system was developed specifically so that Fincher could delete takes on set. Footage is recorded to one of roughly 25 to 30 400 GB hard drives, each of which holds 30 minutes of material. The drives are delivered to the “digital lab” – in other words, the film’s editing room. (“There’s a security benefit implicit right there,” Mavromates noted.) The footage is batch-digitized, with audio, into Final Cut Pro as DVCPRO HD files. It is then backed up, at full resolution and with no compression, onto two LTO tapes, and finally the hard drive is recycled back to set. On the rare occasions when 35mm film was shot, the reels were telecined to a D5 tape deck on one-day rental; otherwise, there were no tape decks in the edit room.
Dailies are viewed via a secure, Web-based system that lets access be granted to specific people for a specific window of time. “VFX companies send pre-comps to [Fincher] that way,” according to Mavromates. “A lot of that is happening at David’s desk, on his desktop computer. We visit Digital Domain every Friday, but they’re feeding him shots every other day of the week, and sometimes getting feedback in 20 minutes. He can draw notes on frames, and type notes, and the VFX people can look at them. By the time we get there on Friday, we might be seeing shots on the big screen that have already gone through four or five iterations with director feedback.”
Filmmaker Suny Behar took a crack at explaining how to stop worrying and love the solid-state P2 system from Panasonic. “P2 is not your master,” he insisted. “It’s a temporary transport medium. It allows you to not be tethered to a record deck but still get very high quality.” He pointed to a P2 workflow where crew members would carry P2 cards around in “hot pouches” and “cold pouches,” depending on whether they contained “hot” camera footage or were in “cold” mode, ready to be reused. Large blue and yellow stickers were affixed to each card depending on where it stood in the post workflow.
And Panavision’s Marker Karahadian demonstrated the new SSR-1 solid-state recorder that snaps onto the Genesis (or the Sony F23) just like an HDCAM SR tape deck. Plugging the unit into a docking station (dubbed the SSRD) gives it full VTR functionality, including HD SDI output.
Paul Bamborough of Codex Digital appealed for a dramatic change in the production and post-production mindset, arguing that the two disciplines are bleeding into each other. “There are complicated commercial, political and all kinds of other issues, and nobody is forced to do anything overnight,” he said. “But this is the way people will end up working, and this is the right thing to happen.”
“More pixels do not necessarily create more resolution,” declared Galt, “but can harm overall image performance.” To explain, he recalled “a big argument with Japan” from his tenure as project leader on the group that developed the Panavision Genesis. “I argued vociferously for 1920×1080 RGB,” Galt said. “They were keen on building a 4K camera – which would have been one of the UHD 3840×2160 versions [proposed by NHK]. The main reason we didn’t do that is the pixel would get so small we’d lose two stops of sensitivity and two stops of dynamic range.” That’s because the size of individual pixels on a 35mm imager determines those stats – bigger is better because bigger photosensors can capture more light.
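Galt's two-stop figure follows from simple area arithmetic: dividing the same imager into 3840×2160 photosites instead of 1920×1080 halves each pixel's linear dimensions, quartering its light-gathering area, and log₂(4) = 2 stops. A quick illustration (a simplification that ignores fill factor and read noise):

```python
import math

def sensitivity_loss_stops(old_res, new_res):
    """Stops of light lost per photosite when a fixed sensor area is
    divided into more, smaller pixels (ignores fill factor and noise)."""
    old_area = 1.0 / (old_res[0] * old_res[1])   # relative area per pixel
    new_area = 1.0 / (new_res[0] * new_res[1])
    return math.log2(old_area / new_area)

print(sensitivity_loss_stops((1920, 1080), (3840, 2160)))  # 2.0 stops
```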
Galt then mounted an argument for MTF (modulation transfer function) – cascaded across all components in an imaging system – as the single best measurement of “system resolution.” Determining the resolution of a film-based system, for instance, would require accounting for the MTF of a camera’s lens, the film negative, the interpositive, the internegative, the print, and the lens in the projector. The weakest link in that chain can have a dramatic detrimental effect on the final quality of an image. “Even if each of those parameters has a 90 percent MTF, the final system is only 53 percent. If each parameter is 90 percent except for one parameter that is 60, the final will be 35 percent.
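Cascaded MTF is simply the product of the per-component MTFs at a given spatial frequency, and Galt's figures check out. A short sketch:

```python
def system_mtf(component_mtfs):
    """Cascaded system MTF is the product of the per-component MTFs
    (each expressed as a fraction) at a given spatial frequency."""
    result = 1.0
    for m in component_mtfs:
        result *= m
    return result

# Six stages (lens, negative, IP, IN, print, projector lens) at 90% each:
print(round(system_mtf([0.9] * 6), 2))          # 0.53
# The same chain with one weak 60% link:
print(round(system_mtf([0.9] * 5 + [0.6]), 2))  # 0.35
```

The multiplication also explains Galt's later point about scanning: a lens delivering only 30 percent MTF at 4K caps the whole chain well below that figure, no matter how good the scanner is.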
“We have fallen into this trap of defining cameras in the context of 1K, 2K, 4K or megapixels,” Galt continued. “It’s only one parameter. The system MTF measurement is the only way to characterize a complete system. If the MTF is less than 35 or 40 percent, the image is going to be out of focus.”
A working knowledge of MTF factors in a given system can lead to some important conclusions, Galt said. For instance, he estimated that a “good Nikkor lens” has an MTF of only about 30 percent at 4K resolution. “If you’re scanning film at 4K – unless you have extraordinary scanning optics – you’re wasting money,” he said.
Thorpe said camera design involves a “fundamental compromise between MTF, sharpness and aliasing,” but noted that much of the pertinent information about a camera’s resolution – the designers’ use of optical or electrical low-pass filters to tweak the captured image or the sensor’s “fill factor” (the percentage of an imaging pixel that is actually light-sensitive rather than taken up by circuitry) – is rarely published by manufacturers. He recommended the “MTF profile” as an important camera metric, suggesting that four MTF measurements be taken, at 200, 400, 600, and 800 lines of resolution, to create an accurate profile of a given camera.
Hanniball brought some horror stories with him, including the skin-of-our-teeth tale of how American Gangster was prepped for digital-cinema release. Previously, Universal had been taking a pass on digital releases of movies created without a DI – like the film-finished Breach earlier in 2007 – since it complicated the issue of creating a digital-cinema master. But by the time American Gangster hit screens, some circuits had actually gotten rid of their film projectors (!), which meant a digital release was suddenly mandatory.
Universal started with an HD telecine master (in 709 color space) and tried to convert the whole thing to a digital cinema master (in XYZ color space). They got close, but ended up doing what Hanniball called “a painstaking, cut-by-cut conform of the picture” to get the color exactly right. The DCP was completed less than a week before the film’s release. “We learned a very valuable lesson,” Hanniball said. “Never do it this way again.”
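Part of why the conversion wasn't a simple retag: linear Rec. 709 RGB relates to CIE XYZ through a fixed 3×3 matrix, but a full HD-to-DCDM conversion also has to undo the 709 transfer function, apply the digital cinema encoding, and resolve gamut and creative intent – the parts that forced the cut-by-cut trim pass. A minimal sketch of just the matrix step (the surrounding transfer-function and gamut handling is deliberately omitted):

```python
# Linear BT.709 RGB -> CIE XYZ (D65 white point). This is only the
# matrix step of an HD-to-digital-cinema conversion; transfer functions
# and gamut mapping are where the hard, shot-by-shot decisions live.
BT709_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def rgb_to_xyz(rgb):
    """Multiply a linear-light RGB triple by the BT.709 primaries matrix."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in BT709_TO_XYZ)

print(rgb_to_xyz((1.0, 1.0, 1.0)))  # D65 white: ~(0.9505, 1.0000, 1.0890)
```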