Sampling the High-Res Workflow For Dalsa's Origin
If you’re a resolution hound (and these days, who isn’t?), odds are you get a special twinkle in your eye when you look at output from the Dalsa Origin camera, which delivers images at 4K resolution. Shooting pretty pictures is one thing, but figuring out the most efficient way to handle that kind of data flow is something else entirely. To prove that it could be done, Alan Lasky set up a model 4K workflow at the Dalsa Digital Cinema Center in Woodland Hills, CA.
To ensure quality signals, Lasky needed to modernize previous-generation workflows to use DVI in place of analog connectors. “They decided to do it in the digital way, while a lot of people are still doing it in the analog way,” said Hagai Gefen, president of Gefen, which had DVI boosters, splitters and CAT5 extenders ready to go when Dalsa came knocking. “It was a good fit for them, because they were only running in high resolution, and it looks much better when you use digital connectivity. You don’t have to deal with phase adjustments and loss due to cable-related problems. You can do a whole studio set-up and you don’t have to worry about equalization, matching colors and things like that.”
A key piece of gear was Gefen’s 4×4 HDTV Matrix, which allowed video from three sources – PC and Mac workstations and a DVS Clipster – to be patched into Dalsa’s screening room. “It used to be a real chore to get something into high resolution and scale it down and do all those things in a digital way,” Gefen said. “Now, going between formats is basically just going through a menu and picking which way you want to go. Everything’s very simple.”
We asked Lasky to walk us through the pipeline on a typical Origin-originated project. His comments follow each step of the process, below.
1) Send data out from Dalsa Origin camera as DPX files (4096×2048 in RAW Bayer format) via InfiniBand
The way to think about it is: it’s a digital still camera running at 24 frames per second. We’ve just taken the film scanner off the post line and put it on the set. The InfiniBand cable is really small, but it’s a fat pipe. It’s smaller than a BNC cable – it’s four braided fiber-optic cables together, essentially carrying all of the data from the camera to the recorder.
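For a sense of just how fat that pipe has to be, a little arithmetic goes a long way. The sketch below assumes one 16-bit sample per Bayer photosite – an assumption for illustration, not Dalsa’s published packing, which may well be 10-bit:

```python
# Back-of-the-envelope data rate for the Origin's 4K stream.
# Assumes one 16-bit sample per Bayer photosite (our assumption,
# not a published spec -- actual DPX packing may be 10-bit).
width, height = 4096, 2048
bytes_per_sample = 2
fps = 24

frame_bytes = width * height * bytes_per_sample   # ~16.8 MB per frame
rate_mb_s = frame_bytes * fps / 1e6               # sustained stream rate

print(f"frame: {frame_bytes / 1e6:.1f} MB, stream: ~{rate_mb_s:.0f} MB/sec")
```

At roughly 400 MB/sec before headers and filesystem overhead, the figure squares with the 500 MB/sec requirement cited in the editor’s note below.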
2) Record to Maximum Throughput DDR
It’s a 3.5 TB disk recorder based on the Sledgehammer. It’s all RAID-0 and RAID-5. It records 4096×2048 in basically a RAW Bayer format as standard SMPTE DPX files. Whatever you would see on the script supervisor’s notes and slate is piped into the metadata header of the frame. We also store timecode from the master clock on the set, and since every frame has a unique identifier, we can always use that to go back to conform. [According to Maximum Throughput CEO Giovanni Tagliamonti, the original DDR was custom-built to handle Dalsa’s data requirements of 500 MB/sec, a spec that has now been built into the Sledgehammer product – ed.]
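Because the slate and script-supervisor metadata ride in the DPX header itself, any tool that can parse SMPTE 268M can get at them. Here is a minimal sketch that pulls the timecode out of the television-information header, assuming the standard v2.0 header layout – writers vary in how much of the header they fill in, so verify offsets against your own files:

```python
import struct

def read_dpx_timecode(path):
    """Read the SMPTE timecode from a DPX file's television header.
    Offsets follow the SMPTE 268M v2.0 layout (an assumption here);
    the television-info header begins at byte 1920."""
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic == b"SDPX":
            endian = ">"          # big-endian file
        elif magic == b"XPDS":
            endian = "<"          # byte-swapped file
        else:
            raise ValueError("not a DPX file")
        f.seek(1920)              # start of television-info header
        (bcd,) = struct.unpack(endian + "I", f.read(4))
        # Timecode is packed as BCD: hh mm ss ff, one digit per nibble.
        digits = [(bcd >> shift) & 0xF for shift in range(28, -4, -4)]
        hh, mm, ss, ff = (digits[i] * 10 + digits[i + 1] for i in (0, 2, 4, 6))
        return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"
```

With a unique timecode stamped into every frame this way, the conform step later in the pipeline becomes a lookup rather than an eye-match.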
3) Back up to Field Transfer Module (FTM) and deliver to post
We’re backing up constantly to these FTMs, which are just another, more portable, set of disks. They’re like the film negatives that you box up at the end of the day to go to the lab. They hold three and some odd hours of footage, and they go in a round robin back and forth to the post house.
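A quick capacity check suggests where “three and some odd hours” comes from, if the FTMs hold about as much as the 3.5 TB recorder. Bit depth and packing below are our assumptions, not stated specs:

```python
# Rough capacity check on the quoted figures (assumptions, not specs):
# hours of 4096x2048/24fps footage that fit in a 3.5 TB store,
# under two hypothetical Bayer packings.
capacity_tb = 3.5
frame_px = 4096 * 2048
fps = 24

hours = {}
for label, bytes_per_px in [("10-bit packed", 1.25), ("16-bit", 2.0)]:
    frame_bytes = frame_px * bytes_per_px
    hours[label] = capacity_tb * 1e12 / (frame_bytes * fps) / 3600
    print(f"{label}: ~{hours[label]:.1f} h")
```

The 10-bit-packed case works out to just under four hours, which lines up neatly with Lasky’s figure.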
4) Ingest data from FTM to post house’s internal server
It’s close to real time, depending on what connection they have. A lot of people are switching to InfiniBand, and you just need one point of InfiniBand [to meet the bandwidth requirements for real-time ingest]. But you can do it over Gigabit Ethernet or Fibre Channel – whatever you want.
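The bandwidth math behind that advice can be sketched with nominal link rates – ballpark figures, with real-world usable throughput running lower:

```python
# Which links sustain a ~500 MB/sec 4K stream in real time?
# Rates are nominal, ballpark assumptions -- not measurements.
required_mb_s = 500
links_mb_s = {
    "Gigabit Ethernet": 125,     # 1 Gb/sec
    "4Gb Fibre Channel": 400,    # ~4 Gb/sec class links
    "4x SDR InfiniBand": 1000,   # 8 Gb/sec data rate after encoding
}
for name, rate in links_mb_s.items():
    verdict = "real time" if rate >= required_mb_s else "slower than real time"
    print(f"{name}: {rate} MB/sec -> {verdict}")
```

Hence the single point of InfiniBand: of the three, only it clears the real-time bar; the others still work, just not at speed.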
5) Generate proxies via ProxyGen software for editing, then conform automatically using embedded timecode data
Once you get off the set and want to see dailies, you don’t have to render that material to a 4K pipeline. You want to do as much as you can with a low-res proxy. We have a piece of software called ProxyGen that generates files you can send out to video, to QuickTime files, and to Avid Meridian files. It gives you the image at a much lower resolution that you use for your edit – 720p or 1080i or even NTSC. It essentially does a window-burn with unique scene-name and timecode from the metadata header. If you’re using video, you have the burned-in timecode – but if you’re using files generated with ProxyGen, you will have embedded timecode right in the file going into Avid or Final Cut Pro instead of grabbing it from the window burn. And then you use that timecode to create an EDL that matches back directly to the original. You don’t render the original RAW Bayer material to RGB until you have an EDL that gives you a sort of pull list. We give users a proprietary piece of software called Bayer-to-RGB that lets them convert to any format they want.
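The matchback step Lasky describes can be sketched in a few lines: embedded timecode converts to absolute frame numbers, and each EDL event becomes a frame range to render from RAW Bayer – only what the cut actually uses. The function names here are illustrative, not ProxyGen’s actual interface:

```python
# Sketch of timecode matchback: EDL source in/out timecodes map
# straight to original DPX frame numbers. Illustrative names only.
FPS = 24

def tc_to_frames(tc, fps=FPS):
    """Convert non-drop 'hh:mm:ss:ff' timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 3600 + mm * 60 + ss) * fps) + ff

def pull_list(edl_events):
    """Turn EDL (source_in, source_out) timecode pairs into the frame
    ranges that need rendering from RAW Bayer to RGB."""
    return [(tc_to_frames(tin), tc_to_frames(tout)) for tin, tout in edl_events]

events = [("01:00:10:00", "01:00:12:12"), ("01:02:00:00", "01:02:01:00")]
print(pull_list(events))  # [(86640, 86700), (89280, 89304)]
```

Because the pull list covers only the frames in the cut, the expensive Bayer-to-RGB render touches a fraction of the material shot.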
6) Screen directly from disks to digital projection or dump to HDCAM SR and play from tape
Most people, up until the very end, screen everything directly off of the hard disks onto a digital projector. You can do film-outs. And a lot of people have been dumping it down to HDCAM SR and playing it out from tape.