3D Workflow Featured Proprietary Software
The verboten list that 3ality tackled included dissolves, fast cuts and visual effects like smoke and multi-layered composites. From the beginning, the goal was to build a toolset that would automatically correct as much as possible. “We used to do this frame-by-frame with a Quantel iQ or Autodesk Flame or Inferno,” says Schklair. “But we had to screen footage for the band, sometimes dailies, and we needed to correct as much as we could.” The result was software that let artists do “3D leveling” on PC-based workstations running Windows XP. “By that, I mean it corrected aberrations,” says Schklair. “It didn’t correct everything, but a lot more than what ended up in the movie.”
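In stereo post, “leveling” typically means removing geometric mismatches between the two eyes, vertical offset above all, so that matching features sit on the same scanline. The sketch below is a generic illustration of that idea using OpenCV feature matching; the file names, feature count and shift-only correction are assumptions for the example, not a description of 3ality’s software.

import cv2
import numpy as np

def estimate_vertical_offset(left_gray, right_gray, max_features=2000):
    """Median vertical misalignment (pixels) between the eyes. Horizontal
    disparity is expected in stereo; vertical disparity is an aberration."""
    orb = cv2.ORB_create(max_features)
    kp_l, desc_l = orb.detectAndCompute(left_gray, None)
    kp_r, desc_r = orb.detectAndCompute(right_gray, None)
    if desc_l is None or desc_r is None:
        return 0.0
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(desc_l, desc_r)
    dy = [kp_r[m.trainIdx].pt[1] - kp_l[m.queryIdx].pt[1] for m in matches]
    return float(np.median(dy)) if dy else 0.0

def level_right_eye(right_bgr, vertical_offset):
    """Shift the right eye so matched features land on the same scanline."""
    h, w = right_bgr.shape[:2]
    shift = np.float32([[1, 0, 0], [0, 1, -vertical_offset]])
    return cv2.warpAffine(right_bgr, shift, (w, h))

left = cv2.imread("left_eye.png")     # assumed file names for a left/right frame pair
right = cv2.imread("right_eye.png")
dy = estimate_vertical_offset(cv2.cvtColor(left, cv2.COLOR_BGR2GRAY),
                              cv2.cvtColor(right, cv2.COLOR_BGR2GRAY))
cv2.imwrite("right_eye_leveled.png", level_right_eye(right, dy))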
The toolset also allows for multiple convergence points. “This is something that doesn’t make sense at all in 2D,” says Postley. “You can have not only multiple 3D layers, but each one of the layers has a different focal plane or convergence point. If I took a shot of Bono, a shot of Edge and so on into editing, I can cut up the images and layer them to make them look like they’re standing in the same depth in the screen. It’s a 3D effect for which there is no 2D corollary.”
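The mechanics behind a per-layer convergence change can be sketched in a few lines: shifting one eye of a stereo layer horizontally moves that layer’s apparent depth relative to the screen plane, so separately shot subjects can be composited to read at the same depth. The Python below is a hypothetical illustration of that idea only; the function names, sign convention and pixel offsets are assumptions, not 3ality’s code.

import numpy as np

def set_layer_convergence(left, right, shift_px):
    """Shift a layer's right eye horizontally to change its apparent depth.
    np.roll wraps pixels at the frame edge; a production tool would pad or
    crop instead. The sign convention here is an assumption."""
    return left, np.roll(right, shift_px, axis=1)

def composite_over(base, layer, alpha):
    """Simple alpha-over composite; alpha is a per-pixel matte in [0, 1]."""
    return alpha[..., None] * layer + (1.0 - alpha[..., None]) * base

# Two separately shot stereo layers, each given its own convergence offset
# so both subjects appear to stand at the same depth in the final frame.
h, w = 540, 960
bg_left = bg_right = np.zeros((h, w, 3), dtype=np.float32)
bono_left = bono_right = np.full((h, w, 3), 0.8, dtype=np.float32)
edge_left = edge_right = np.full((h, w, 3), 0.5, dtype=np.float32)
matte = np.zeros((h, w), dtype=np.float32)
matte[200:400, 300:700] = 1.0   # stand-in matte for the foreground subject

b_l, b_r = set_layer_convergence(bono_left, bono_right, shift_px=6)
e_l, e_r = set_layer_convergence(edge_left, edge_right, shift_px=-3)
out_left = composite_over(composite_over(bg_left, b_l, matte), e_l, matte)
out_right = composite_over(composite_over(bg_right, b_r, matte), e_r, matte)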
“We had a stereographer who worked much like a colorist does,” Schklair says. “He did 3D depth control, which allowed us to transition the depth across each edit. Within 12 to 14 frames, we move the depth to match a mid-point of the incoming shot. You don’t notice these depth changes because they’re so fast, but it allowed us to violate the rules of 3D editing and do fast cuts.” “Depth control” is a post-production process, part of the conform; the stereographer works in real time to find the sweet spot for each dissolve or cut.
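The ramp Schklair describes is simple to sketch: over a short run of frames (12 to 14 in the film), convergence eases from the outgoing shot’s setting toward a mid-point of the incoming shot. The snippet below is a hedged illustration of that interpolation; the linear easing, units and function name are assumptions, not a description of 3ality’s implementation.

def depth_ramp(outgoing_convergence, incoming_depth_min, incoming_depth_max, frames=13):
    """Per-frame convergence values easing from the outgoing shot's setting
    toward the mid-point of the incoming shot's depth range."""
    target = 0.5 * (incoming_depth_min + incoming_depth_max)
    return [(1.0 - (i + 1) / frames) * outgoing_convergence + ((i + 1) / frames) * target
            for i in range(frames)]

# e.g. outgoing shot converged at +4 px, incoming shot's depth spans -2 to +10 px
print(depth_ramp(4.0, -2.0, 10.0))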
Handling color is another proprietary issue. “The way color is traditionally dealt with in 3D is that you take the dominant eye, color-correct that and then apply those changes to the other eye,” says Postley. “It turns out this is the wrong thing to do, because those two cameras aren’t in the same place, so the light doesn’t hit them the same way. You need to introduce the same color characteristics that the eye actually sees.” Postley won’t, however, reveal any details of how the 3ality system does that. “It’s too proprietary,” he says.
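3ality’s actual method stays under wraps, but a generic stand-in makes the point concrete: rather than copying one eye’s grade verbatim, a simple approach is to match each channel’s statistics between the eyes so the second eye picks up the same overall color characteristics. The sketch below shows only that common generic technique, with assumed names and float 0–1 images; it says nothing about what 3ality actually does.

import numpy as np

def match_eye_statistics(ungraded_eye, graded_eye):
    """Scale and offset each channel of the ungraded eye so its mean and
    standard deviation match the graded eye (float arrays, HxWx3, 0-1 range)."""
    out = np.empty_like(ungraded_eye)
    for c in range(3):
        u_mean, u_std = ungraded_eye[..., c].mean(), ungraded_eye[..., c].std()
        g_mean, g_std = graded_eye[..., c].mean(), graded_eye[..., c].std()
        gain = g_std / u_std if u_std > 1e-6 else 1.0
        out[..., c] = (ungraded_eye[..., c] - u_mean) * gain + g_mean
    return np.clip(out, 0.0, 1.0)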
The infrastructure to support post was as massive as the number of original tapes cloned. With one petabyte on an Isilon cluster, the workstations were connected via 10 Gb Ethernet or 8 Gb Fibre Channel. “We added more 10 Gb Ethernet towards the end,” says Postley.
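Those link speeds put the petabyte in perspective. A back-of-the-envelope calculation (assuming ideal, overhead-free links and decimal units) suggests why more 10 Gb Ethernet kept being added:

PETABYTE_BITS = 1e15 * 8  # one petabyte expressed in bits (decimal units assumed)

for name, gbit_per_s in (("10 Gb Ethernet", 10.0), ("8 Gb Fibre Channel", 8.0)):
    hours = PETABYTE_BITS / (gbit_per_s * 1e9) / 3600
    print(f"{name}: roughly {hours:.0f} hours to move 1 PB over a single link")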
Since there aren’t many 3D tools – or, to date, many 3D movies – there’s a dearth of skilled artists who can do in stereo what they’re used to doing in 2D. “It is an ongoing battle,” says Postley. “We have people who are very, very experienced in working in stereo with all the nuances. We have others who are very talented, who started off in 2D and have learned a lot about 3D from working on the movie. To get them to the next, highest-end, level is challenging. That’s why we’re building more automation into our tools, so the operator can focus more on what he wants to do rather than how he wants to do it.”