New Light-Field System Captures Info About Light in a Space, Not Just an Image, Allowing Unprecedented Manipulation
A new type of digital camera coming to NAB next week aims to break through fundamental limitations of cinematography, allowing the manipulation of basic image properties including plane of focus, depth of field and frame rate in post-production. If Lytro Cinema delivers on its promise, two-camera 3D rigs and green-screen photography could become obsolete.
Lytro is debuting a light-field cinema camera that captures volumetric data about a scene rather than a single image from one fixed perspective. That means it captures information about the direction light is traveling, along with the intensity of light hitting the camera’s sensor — enough to partially reconstruct the actual 3D space in front of the camera. Of course the camera can’t see what’s happening behind walls or other objects in the frame. But it captures enough information to allow the camera to be repositioned in post.
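To picture what that volumetric data means: a light field adds two angular dimensions to a photograph's two spatial ones. The sketch below (a toy illustration, not Lytro's proprietary format) stores a discretized light field as a 4D array under the common two-plane parameterization:

```python
import numpy as np

# Toy two-plane light field: L[u, v, s, t] is the intensity of the ray that
# crosses the lens plane at (u, v) and the sensor plane at (s, t).
# The sample counts are illustrative stand-ins, not Lytro's real sampling.
N_U, N_V = 9, 9        # angular samples across the aperture
N_S, N_T = 480, 270    # spatial samples (a tiny stand-in for 755 MP)

light_field = np.zeros((N_U, N_V, N_S, N_T), dtype=np.float32)

# A conventional photo is a single slice through the angular dimensions;
# integrating over (u, v) instead simulates a wide-open aperture, and
# shifting each angular slice before summing refocuses after capture.
center_view = light_field[N_U // 2, N_V // 2]   # one fixed perspective
wide_aperture = light_field.mean(axis=(0, 1))   # all rays combined
```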
“It’s about 100mm of complete light-field parallax, which means you can shift your camera by 100mm on X, Y and Z with every single ray of light accurately reprojected,” Jon Karafin, Lytro’s head of light field video, told StudioDaily. “That size is defined by the lens itself.”
This has implications, as you might expect, for 3D cinematography — you don’t really need a two-camera rig if one camera can acquire your left-eye and right-eye views simultaneously. But Lytro goes further than that, promising the ability to manipulate focus, depth of field and frame rate after the fact. You can move the camera, too, pushing it slightly forward or back and from side to side within the parallax range, after the image has actually been captured.
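Building on the toy array above, here is a hedged sketch of how a stereo pair could be pulled from a single capture by choosing two virtual viewpoints inside the 100mm parallax budget Karafin describes. A production renderer would interpolate rays using depth; this simplified version just snaps to the nearest captured angular sample:

```python
def virtual_view(light_field, shift_x_mm, shift_y_mm, parallax_mm=100.0):
    """Return the captured angular sample nearest a virtual camera position.

    shift_x_mm / shift_y_mm are offsets from the lens center; the usable
    range is +/- parallax_mm / 2 on each axis, per Lytro's stated budget.
    """
    n_u, n_v = light_field.shape[:2]
    half = parallax_mm / 2.0
    if abs(shift_x_mm) > half or abs(shift_y_mm) > half:
        raise ValueError("virtual camera is outside the captured parallax volume")
    u = round((shift_x_mm + half) / parallax_mm * (n_u - 1))
    v = round((shift_y_mm + half) / parallax_mm * (n_v - 1))
    return light_field[u, v]

# A stereo pair from one capture: two virtual cameras 65mm apart,
# comfortably inside the 100mm parallax budget.
left_eye = virtual_view(light_field, -32.5, 0.0)
right_eye = virtual_view(light_field, +32.5, 0.0)
```

The 65mm separation here approximates human interocular distance; because both views come from the same capture, they are perfectly synchronized and matched by construction.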
At Last: Really Fix It in Post
“Any decision you would have normally had to bake in to the image at the time of capture, like focus, depth of field or frame rate, is now a completely computational process,” Karafin said. “That has fundamentally game-changing benefits for VFX.” For example, there would be no need for a green screen to separate a performer from the background, since the camera data would include enough volumetric information about the scene to isolate the actor computationally, rather than by looking at the color values of individual pixels. Camera-tracking and image stabilization become trivial processes.
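As a concrete illustration (a minimal sketch, not Lytro's actual pipeline), once per-pixel depth has been recovered from the light field, pulling a matte becomes a threshold on distance rather than on color:

```python
import numpy as np

def depth_matte(depth_m, near_m, far_m, softness_m=0.1):
    """Build an alpha matte from per-pixel depth instead of a green screen.

    depth_m: per-pixel camera-space depth in meters (from light-field data).
    Pixels between near_m and far_m are foreground; the matte is feathered
    over softness_m at both boundaries so edges don't alias.
    """
    fg = np.clip((depth_m - near_m) / softness_m, 0.0, 1.0)
    bg = np.clip((far_m - depth_m) / softness_m, 0.0, 1.0)
    return np.minimum(fg, bg)  # 1.0 inside [near, far], falling to 0 outside

# Example: keep everything between 2 m and 3.5 m from the camera.
depth = np.random.uniform(1.0, 6.0, size=(270, 480)).astype(np.float32)
alpha = depth_matte(depth, near_m=2.0, far_m=3.5)
```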
The Lytro Cinema camera won’t be cheap. The company says it will be available on a per-day and per-production basis, with production packages starting at $125,000, making it a tool for “tentpole” feature films and high-end broadcast customers. If you want to see the camera in action, block out some time for the NAB super session “Light Field Technology & the Future of Cinema,” which will include the premiere of “Life,” a short film shot with Lytro Cinema by director Robert Stromberg, DGA, and cinematographer David Stump, ASC. The session takes place Tuesday, April 19, from 4 to 5 p.m. in room S222 in the South Hall of the Las Vegas Convention Center.
Spectacular Specs
Lytro says the camera can shoot at a resolution of 755 megapixels at up to 300 fps, with up to 16 stops of dynamic range. If the camera is shooting at greater than 120 fps, the resulting data can be temporally sampled and adjusted for output at any frame rate or shutter angle.
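The arithmetic behind that temporal resampling is simple to sketch. Assuming a 300 fps capture rendered at 24 fps with a 180° shutter, each output frame integrates 1/48 of a second of captured material, or 300/48 = 6.25 source frames (an illustrative calculation, not Lytro's implementation):

```python
def shutter_window(capture_fps, output_fps, shutter_deg, out_frame):
    """Map one output frame to the span of captured frames it integrates.

    A shutter of shutter_deg at output_fps exposes for
    (shutter_deg / 360) / output_fps seconds; multiplying by capture_fps
    gives the window in source-frame units. Fractional endpoints would be
    handled by weighting the first and last frames.
    """
    exposure_s = (shutter_deg / 360.0) / output_fps
    start = out_frame * capture_fps / output_fps
    return start, start + exposure_s * capture_fps

# 300 fps capture, 24 fps output, 180-degree shutter:
# each output frame averages 6.25 captured frames.
print(shutter_window(300, 24, 180, out_frame=0))   # (0.0, 6.25)
```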
The digital camera “negative” is known as a light field master, and it allows footage to be rendered in formats including traditional cinema and broadcast as well as IMAX, RealD, Dolby Vision and more. Light-field-specific metadata is integrated into the file using the OpenEXR standard. The camera records to a server array via a PCIe fiber connection for storage and processing (either on site or in the cloud), and plug-ins will allow existing third-party software to manipulate the light-field data.
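For a sense of what riding metadata inside an OpenEXR file looks like, here is a minimal sketch using the open-source OpenEXR Python bindings. The attribute names are hypothetical, since Lytro hasn't published its schema, and the sketch assumes the bindings serialize plain strings as EXR string attributes:

```python
import numpy as np
import OpenEXR
import Imath

width, height = 480, 270
header = OpenEXR.Header(width, height)

# EXR headers carry arbitrary named attributes alongside the pixels, which
# is how light-field metadata can ride inside a standard file. These
# attribute names are invented for illustration.
header["lfParallaxMM"] = "100.0"
header["lfCaptureFPS"] = "300"

# Write a single-channel float image so the file is complete and valid.
f32 = Imath.Channel(Imath.PixelType(Imath.PixelType.FLOAT))
header["channels"] = {"Y": f32}
pixels = np.zeros((height, width), dtype=np.float32).tobytes()

out = OpenEXR.OutputFile("light_field_master.exr", header)
out.writePixels({"Y": pixels})
out.close()
```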
At launch, that software is The Foundry’s Nuke. Standard Merge nodes can be used to animate light-field properties across multiple assets — say, live-action footage combined with a 3D CG background — or to decouple assets in a scene and work on them individually. “It’s as if you had a virtual camera with virtual properties for every single element,” Karafin said.
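In scripting terms, such a comp is still an ordinary Merge. Here is a hedged sketch of what that might look like in Nuke's Python API; the file paths are placeholders, and any light-field-specific knobs Lytro's plug-in exposes are not public:

```python
# Run inside Nuke's script editor: composite a light-field plate over a CG
# background with a standard Merge node, as Karafin describes.
import nuke

plate = nuke.nodes.Read(file="/shots/sc01/lf_plate.####.exr")       # placeholder
cg_bg = nuke.nodes.Read(file="/shots/sc01/cg_background.####.exr")  # placeholder

# Merge2 input 0 is the background (B), input 1 the foreground (A);
# "over" lays the live-action element on top of the CG background.
comp = nuke.nodes.Merge2(operation="over", inputs=[cg_bg, plate])
write = nuke.nodes.Write(file="/shots/sc01/comp.####.exr", inputs=[comp])
```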
Keep It in the Cloud
In discussions with Hollywood studios, Lytro found that management of the massive amounts of data involved in light-field photography was a concern, so a cloud-based workflow option was implemented that allows data to live entirely in the cloud, including any uploaded VFX elements required for a scene. “You don’t have to shuttle the data back and forth [to and from the cloud],” Karafin explained. “You’re just shuttling a low-bitrate UI back to the end user, who can use a laptop to do full, thousand-core processing interactively on the cloud. So we have the ability to provide all of the cloud architecture, including the GPU and CPU computing that’s necessary for an interactive session and post-processing content.”
Lytro has ideas for streamlining and accelerating workflow, like moving to a solid-state architecture and using more aggressive compression, including light-field-specific schemes to reduce the overall data requirements.
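One of those light-field-specific schemes can be sketched in a few lines: adjacent angular views are highly correlated, so storing residuals against a reference view (a generic inter-view prediction idea, not a published Lytro codec) strips out much of the redundancy before a general-purpose compressor ever runs:

```python
import numpy as np

def encode_angular_residuals(light_field):
    """Store each angular view as a residual against the center view.

    Neighboring views in a light field differ only by small parallax
    shifts, so residuals are mostly near zero and compress far better
    than the raw views under any entropy coder.
    """
    n_u, n_v = light_field.shape[:2]
    reference = light_field[n_u // 2, n_v // 2]
    residuals = light_field - reference  # broadcast over (u, v)
    return reference, residuals

def decode_angular_residuals(reference, residuals):
    """Invert the encoding: add the reference back to every residual."""
    return residuals + reference
```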
Asked what Lytro has done, as a relative newcomer to Hollywood, to ensure its camera performs at the level of usability and reliability required for high-level production, Karafin said the company has been “very sensitive” to those needs. “From the very inception of the concept, we’ve been interviewing cinematographers and studio executives to make sure what we’re building is viable for their productions,” he said. “The entire production workflow will be similar to what they are doing today — you still have a focus controller, you still have an aperture controller, and you still have a live display. The director can create their look, a light-field LUT for the picture they want to see, which is streamed as keyframe data into the file itself. Obviously they can manipulate it beyond that point in time, but the system is giving you a real-time QuickTime file that can go to editorial, with no latency or delay with production workflow. The system is tethered to our back-end server architecture, and there is redundancy in every piece of hardware, so that a failure means no downtime to production.”
Lytro is positioning the camera first and foremost as a VFX tool, which is the area where its firepower seems to have the broadest range of applications today. But if it’s a hit, the technology will surely find its way into more parts of the filmmaking process. And as impressive and powerful as the technology demos are, directors and cinematographers are sure to find something at least mildly unsettling about the idea that creative decisions that they previously thought were baked into the image — everything from depth of field effects to the precise positioning of the camera — can now be altered in post.
“This is a fundamentally disruptive technology, so we’re being sensitive to the changes it would incur,” Karafin said. “We have to make sure what we’re building is going to work for the camera guys, and we make sure everything we do has been considered by the studios and the end customers.”