SMPTE Conference's Hit List of Digital Cinema Technology
Bancroft proposed a way to compare “true resolution” among widely different camera designs. “The word pixel has an agreed-upon meaning in file and stream delivery formats,” he said. “But, with increasing variety in [camera] architecture, continued use of the word can cause misunderstandings. Using the word photosite in specifications would help end users.
“Complexity in today’s sensors means remembering the underlying principles of sampling, aliasing and filtering,” he continued. “You may have to use these to determine how many pixels you’re really getting from your camera. Every camera architecture employs trade-offs among resolution, aliasing and sensitivity, and none is immune to this rule.”
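Bancroft didn’t present code, but the sampling arithmetic he alludes to is straightforward to sketch. The snippet below is our illustration, not his: the 0.7 demosaic factor for single-chip Bayer sensors is an assumed rule of thumb, and the function names are hypothetical.

```python
# Illustrative sketch: estimating how many resolvable pixels a sensor
# really delivers from its photosite count. The 0.7 Bayer "demosaic
# efficiency" factor is an assumed rule of thumb, not a measured value.

def effective_resolution(photosites_h, photosites_v, bayer=True):
    """Rough deliverable resolution after demosaicing losses."""
    factor = 0.7 if bayer else 1.0  # single-chip Bayer vs. 3-chip/monochrome
    return int(photosites_h * factor), int(photosites_v * factor)

def nyquist_limit_lp_mm(photosite_pitch_um):
    """Highest spatial frequency (line pairs/mm) the sampling grid supports."""
    pitch_mm = photosite_pitch_um / 1000.0
    return 1.0 / (2.0 * pitch_mm)  # Nyquist: one line pair per two photosites

# A nominal "4K" Bayer sensor resolves closer to 2.8K of true detail:
print(effective_resolution(4096, 2304))           # -> (2867, 1612)
print(nyquist_limit_lp_mm(photosite_pitch_um=8))  # -> 62.5 lp/mm
```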
The Esmeralda Stage, recounted Erland, is based on the Laboratory Aim Density frame created by the late John Pytlak of Eastman Kodak, and was further modeled on a multi-plane matte-painting stage. That provided the ability to image large-scale flat test targets such as Macbeth, DSC Labs and ISO 12233 charts, as well as 3D targets, including color-difference traveling-matte backings and motion-controlled targets for imaging motion streak and blur. “When Eastman Kodak introduced 5294, a high-speed color negative, it was quite disastrous for blue-screen photography,” he said. “We had to test the problem and present it to Eastman Kodak to convince them to produce a film stock that could read the blue screen.”
As a result of the early work on the Esmeralda Stage, Kodak introduced 5295, a blue-screen-compatible stock. Kodak later introduced “T” grain with the 5296 stock, which was quickly adopted by the visual-effects market and cinematographers. But this stock created the problem of high-speed emulsion stress syndrome. At that time, Erland took over responsibility for the Esmeralda Stage, and he and his wife Kay founded Composite Components. With a project team led by Bill Taylor and including Jim Danforth, Ray Feeney, LeRoy DeMarsh, Phil Feiner and Bill Hogan, the actual Esmeralda Stage was hosted at various visual-effects facilities.
In 2004, Esmeralda moved to the Pickford Center. It has since been redesigned as a free-standing structure employing speed-rail pipe. Erland is slated to receive a Scientific and Technical Award of Commendation for “his leadership and efforts toward identifying and solving the problem of High-Speed Emulsion Stress Syndrome.”
“Cinematographers love to capture with anamorphic lenses, not just because of the aspect ratio but because of how anamorphic lenses deal with space, sharpness, backgrounds and flaring,” Krsljanin added. “But up until now, that was a difficult task for the simple reason that the majority of digital cameras have a 16×9 sensor. If you wish to shoot ‘scope, you have to shoot … and then crop it [to the desired aspect ratio]. In some cases, people do a kind of half-squeeze, but it doesn’t allow them to use the lenses they want. It’s a half-baked solution.”
The Arri D-21 uses a 4×3 sensor shaped like a frame of 35mm film. The question for digital cinematography has always been how to deal with an anamorphic image squeezed into that space without losing horizontal and/or vertical resolution. The best solution, Krsljanin averred, was to sub-sample all the even lines into one HD stream and all the odd lines into another. “When recombined in post, it gives a perfect anamorphic image,” he said. “The benefit is that each of these two streams gives a completely viable and recordable image. So you start with a 4×3 sensor and divide the image into two streams, each 1920×1080 with 720 active lines. Starting with 1920×1440 in the HD domain, that gives you two streams in post, and you end up with 3840×1440 after de-squeezing the image.”
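Here’s a minimal sketch of that split-and-recombine scheme, assuming the 1920×1440 active area arrives as a NumPy array; the variable names and the nearest-neighbor de-squeeze are purely illustrative, not Arri’s actual pipeline.

```python
import numpy as np

# Stand-in for the 1920x1440 active sensor region (our assumption
# about the data layout; the real camera outputs HD streams).
frame = np.zeros((1440, 1920, 3), dtype=np.uint16)

even_stream = frame[0::2, :, :]  # 720 active lines -> HD stream A
odd_stream  = frame[1::2, :, :]  # 720 active lines -> HD stream B

# Recombination in post: interleave the two streams back together.
recombined = np.empty_like(frame)
recombined[0::2] = even_stream
recombined[1::2] = odd_stream

# 2x horizontal de-squeeze of the anamorphic image (nearest-neighbor
# pixel doubling, purely for illustration) yields 3840x1440.
desqueezed = np.repeat(recombined, 2, axis=1)
assert desqueezed.shape[:2] == (1440, 3840)
```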
In post-production, he continued, the Quantel 3D stereoscopic system is able to combine the two streams. “Post-production is exactly the same as if it were captured on film,” he said. “It works at 24, 25, 30 fps. You record it live, straight from the camera, combine it in the DI and you have an anamorphic image. It’s not rocket science.” The benefit is that the process retains all 1,440 active lines, roughly 640 more than the approximately 800 lines left after cropping 2.40:1 ‘scope from ordinary 16×9 HD. “Plus, with M-Scope, the production is shooting with anamorphic lenses, which gives very different aesthetics, particularly in terms of depth of field, the out-of-focus-background look and incidental light flares, while benefiting from HD’s cost effectiveness and flexibility.”
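The arithmetic behind that comparison is quick to check (the figures come straight from the numbers above):

```python
# Cropping 16x9 HD to 2.40:1 'scope leaves 1920 / 2.40 = 800 active
# lines; the two-stream method keeps all 1440 sensor lines.
hd_scope_lines = round(1920 / 2.40)   # 800
mscope_lines = 1440
print(mscope_lines - hd_scope_lines)  # 640 additional lines of resolution
```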
For more information: www.arri.de/prod/cam/mscope/details.html
The solution, he said, is to widen the shutter angle of digital cameras to 360 degrees. “Digital cameras are now available that can capture images up to and including a 360-degree capture, or approximately 21 ms at a 48 Hz capture rate,” he said. “The 21 ms capture time represents exactly the same period as traditionally captured with a 24 fps camera and a 180-degree shutter, so when alternate frames of the 48 Hz content are extracted and played at 24 Hz, the captured images show the same motion blur as they would at 24 fps.”
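The exposure-time relation behind those figures is the standard shutter-angle formula; a quick sketch (the function name is ours):

```python
# exposure = (shutter_angle / 360) / frame_rate
def exposure_ms(shutter_angle_deg, fps):
    return (shutter_angle_deg / 360.0) / fps * 1000.0

print(exposure_ms(360, 48))  # ~20.8 ms: 360-degree shutter at 48 Hz
print(exposure_ms(180, 24))  # ~20.8 ms: classic 180-degree shutter at 24 fps
```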
To demonstrate, he showed some test material shot by cinematographer Robert Primes, ASC, at Panavision with the support of Panavision’s Nolan Murdock.
Also new are interactive dailies, which are beginning to happen in the commercial production world. “You send them to clients who draw circles or add comments and send them back and forth,” said Gaffney, who also noted the move to Blu-ray dailies and, at the other end of the spectrum, desktop dailies.
For images, visit www.jaxa.jp/video/index_e.html