For more on collaborative editorial workflow, including 10 tips for fruitful collaboration, be sure to read the second installment of this two-part feature: How Collaborative Editing Lets You Do It Your Way
When it comes to the evolution of collaborative editing, Simon Haywood, Dell EMC Solutions Architect, Media and Entertainment, takes the long view.
Haywood put in time as a satellite engineer for the BBC before joining Isilon, and he remembers when editing was an intuitive, linear process that resulted in a videotape that you could hold in your hand, put in a VTR, and play out to air. For years, the videotape was a kind of security blanket for video pros. It was their camera negative, or their broadcast master. If you could hold a tape in your hands, or label it and put it on a shelf, you knew where your footage was at all times.
“It was really easy to understand — tactile — but entirely unsuited to any sort of collaboration or enhanced workflow,” Haywood says. “If you were doing one thing on your own, great. But the technology we have now allows you to do many things, in many different places, in parallel. We’ve gone from a serial world to a parallel world where we can all work together on the same material.”
At Isilon, immersed in the IT-based world of network-attached storage, Haywood was a video guy who knew more than how many bits and bytes a storage system could push down the pipe. He knew how many simultaneous video streams it could support. That was the most important metric in a media world whose storage was quickly evolving from basic one-to-one, SDI-based VTR replacements into networked systems with unprecedented capabilities. “EMC storage can support an astonishing number of playout clients, and that wasn’t even dreamt of when digital storage of video first came about,” Haywood says. “Collaborative workflows, multiple versioning — if you build the technology, they will come.”
When you’re building an efficient editorial pipeline, it’s important to remember that collaborative editing means different things to different people. “To a news operation, collaborative editing means the raw material comes into the facility and one team will edit for the 1 p.m. show, one team will edit for the bulletins, another team will edit for online and another team will edit for graphics. So: multiple teams working simultaneously with the same raw material. They’re all collaborating to make the output for that facility,” Haywood explains. “For a Hollywood post-production studio, where different teams specializing in different parts of the workflow are geographically distributed, collaborative editing means a bit of work in Los Angeles, and then a bit of work in New Zealand, a bit of work in Singapore and a bit of work in London. It may be a chase-the-sun workflow. They’re all working on the same output, but not necessarily at the same time.”
VFX facilities are also part of the new collaborative landscape, their work now ubiquitous in all genres, not just Hollywood tentpoles. “Even the silliest non-blockbuster picture will have more VFX than anybody realizes,” says Tom Burns, Dell EMC’s CTO of media and entertainment. “As just one example, every single show in China goes through sky replacements to present everything shot in Beijing under a blue sky. It’s now routine. An editor has to cope with material incoming from the VFX department as well as from the camera department, and in all kinds of different formats.”
And those working formats are more varied than ever. Gone are the days when every editing session necessitated a lengthy process of transcoding footage into a standard format dictated by the software vendor. That matters, because today’s editor is expected to deal with footage shot on everything from high-end digital cinema cameras to palm-sized camcorders, one-off action cams and, in some cases, even iPhones. “It’s a combination of codec development and processing hardware getting cheaper,” Burns explains. “Adobe’s Mercury Playback Engine will run on any GPU installed in your workstation, and that means you no longer need to transcode everything into a working codec. Now you can just edit Canon Raw, R3D files, ProRes — the mixing and matching of codecs is the most important thing that has happened in the last five years or so.”
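For an assistant editor, the practical upshot is an inventory problem: knowing which codecs are actually sitting in the day's bin before the edit begins. The Python snippet below is a minimal, hypothetical sketch rather than a feature of any NLE; it uses FFmpeg's ffprobe to count clips by video codec. The wrapper extensions and the example path are assumptions, and camera-raw formats such as R3D generally need vendor SDKs rather than ffprobe, so they are left out.

```python
# Hypothetical sketch: inventory a mixed-format bin by video codec with
# ffprobe (ships with FFmpeg). Wrapper extensions below are an assumption;
# camera-raw formats (e.g., R3D) usually need vendor SDKs and are omitted.

import subprocess
from collections import Counter
from pathlib import Path

MEDIA_EXTS = {".mov", ".mxf", ".mp4"}  # common wrappers only (assumption)

def probe_codec(path: Path) -> str:
    """Return the first video stream's codec name as reported by ffprobe."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1", str(path)],
        capture_output=True, text=True,
    )
    return result.stdout.strip() or "unknown"

def inventory(bin_dir: Path) -> Counter:
    """Count clips per codec so mixed material is visible up front."""
    return Counter(
        probe_codec(p) for p in bin_dir.rglob("*") if p.suffix.lower() in MEDIA_EXTS
    )

# Example (hypothetical path): inventory(Path("/mnt/isilon/day_012"))
# might return Counter({"prores": 41, "h264": 9, "dnxhd": 3})
```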
Netflix and the popularity of binge-viewing are putting ever greater strain on post-production infrastructure, because it has become important, for the first time in television history, to keep as many as eight, 10 or 12 hours of content available and open for changes throughout the production of a complete season of a show. “Say you make a script change in episode 8 that requires you to fix something in episode 2,” Burns suggests. “That’s really hard on the post house. They have to keep all 13 episodes of a show unlocked and then deliver the whole series in one go.”
Another challenge? The sheer volume of content on a typical project. With digital acquisition, shooters often take a less conservative approach to how much time they spend rolling. In short, they don’t turn the camera off. “There’s a big paradigm shift that came with the transition to nonlinear editing,” Haywood says. “People who learned in the film days, when film was expensive and difficult, came to you with very few rushes — but exactly the right rushes. Now, some people literally don’t think; they just shoot. And that makes post-production much more of a challenge, because you have so much more footage to sift through.
“You could say, ‘This is terrible. This camera guy is an idiot. He should have edited in camera.’ Or you could say, ‘This camera guy is a genius. He’s giving me everything I need.’”
Still, coping with the increased volume of footage is a new chore for editorial. “Logging and tagging material is 10 times harder because there’s 10 times as much material, never mind high frame rates or anything like that,” Burns says. “For a feature film, the on-set DIT used to process maybe one or two TB a night of new content and send it to post. But projects like Billy Lynn’s Long Halftime Walk show you where we’re going. They were shooting HFR, 3D, and 4K, and they generated 40 TB of content a night for two months.” At that rate, a two-month shoot accumulates on the order of 2.4 PB of raw material before post-production even begins. Of course, not everyone is going to shoot movies in HFR and 3D. But it’s a sure bet that those numbers are going up, not down.
How will we cope? One clear trend is that the process of logging footage is going to become increasingly automated, thanks in part to machine-learning techniques that can be integrated with the editorial infrastructure. “Everyone loves being able to consume metadata, but I haven’t met a single person who says, ‘I love to input metadata!’” Haywood observes. “The trend is toward automating metadata, which makes editing easier. And if the AI can mark up shots that might be of interest, then the next logical step is to have the AI propose an edit. We might see a lot more of that in the future, particularly for news. I can see AI composing news reports not far down the road.”
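Haywood is describing a direction of travel rather than a shipping product, but the shape of such a logging stage is easy to sketch. The Python snippet below is a hypothetical illustration, assuming OpenCV for frame sampling and a stock torchvision ImageNet classifier as a stand-in; a real system would use models trained on an editorial vocabulary (slates, interviews, crowds) and write its results into the asset manager's metadata fields.

```python
# Hypothetical sketch of automated shot tagging: sample roughly one frame per
# second from a clip with OpenCV and label each sample with a stock ImageNet
# classifier. A production logger would swap in editorial-specific models.

import cv2                                   # pip install opencv-python
import torch
from torchvision import models              # pip install torchvision

weights = models.ResNet50_Weights.DEFAULT    # pretrained weights, fetched on first use
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()            # matching resize/crop/normalize pipeline
labels = weights.meta["categories"]

def tag_clip(path: str, samples_per_sec: float = 1.0) -> list[tuple[float, str]]:
    """Return (seconds_into_clip, label) pairs for sampled frames."""
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
    step = max(int(fps / samples_per_sec), 1)
    tags, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # OpenCV frames are BGR
            batch = preprocess(torch.from_numpy(rgb).permute(2, 0, 1)).unsqueeze(0)
            with torch.no_grad():
                probs = model(batch).softmax(dim=1)
            tags.append((idx / fps, labels[int(probs.argmax())]))
        idx += 1
    cap.release()
    return tags
```

Even this toy version illustrates the economics Haywood points to: the model never tires of entering metadata, and every tag it writes is one an editor doesn't have to type.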
Heading into the future, the challenge for technologists will remain keeping up with — and staying out of the way of — increasingly demanding creatives who expect more flexibility and support from their toolkits. Haywood notes that it never gets easy, as loosening up one bottleneck in the pipeline just reveals the next one clogging the way behind it, from storage to networking to workstation hardware to software architecture. He says it feels like a game of Whac-A-Mole.
“That’s one for us to solve as technology vendors, and we solve it by being invisible,” he says. “The minute an editor or creative has to solve a problem with their computer, or their network, or their storage, we’ve interrupted their flow by giving them a boring job to solve. We strive to provide solutions that are invisible. Isilon is low maintenance, fast, efficient storage that’s easily scalable, does its job, and is invisible. And that’s what creatives need.”
Burns thinks collaborative editorial processes can improve the “could be betters,” or CBBs. In VFX lingo, those are the shots that are good enough to go into the show but could benefit from a little more work. With a live and unlocked workflow, the VFX house can deliver its shots on schedule and then keep working on those CBBs without holding up the rest of the team. “Orchestration layers are so important [in collaborative workflow architecture],” he says. “Collaborative editorial needs to have open and well-integrated version control to allow for CBBs to be instantly updated without a manual export and import process.”
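Burns doesn't name a particular mechanism, but the underlying contract (the cut follows a version pointer, not a frozen media file) is easy to sketch. The Python snippet below is a hypothetical illustration: each shot directory holds numbered renders plus a "latest" symlink that is swapped atomically when a new version is published. The layout and naming are assumptions, not any vendor's scheme; a real orchestration layer would wrap the same move in a database and event notifications.

```python
# Hypothetical version-pointer scheme for CBB updates: each shot directory
# holds numbered renders (v001, v002, ...) plus a "latest" symlink. The edit
# references the symlink, so publishing a new render updates the cut without
# a manual export/import round trip. Assumes POSIX and a single filesystem.

import os
import re
from pathlib import Path

VERSION_RE = re.compile(r"^v(\d{3})$")

def publish_version(shot_dir: Path, render: Path) -> Path:
    """Register `render` as the shot's next version and repoint `latest` atomically."""
    versions = sorted(
        int(m.group(1)) for p in shot_dir.iterdir() if (m := VERSION_RE.match(p.name))
    )
    next_dir = shot_dir / f"v{(versions[-1] + 1 if versions else 1):03d}"
    render.rename(next_dir)                    # move the new render (file or frame dir) into place
    tmp_link = shot_dir / ".latest.tmp"
    tmp_link.unlink(missing_ok=True)           # clear any leftover from a crashed publish
    tmp_link.symlink_to(next_dir.name)         # relative link within the shot directory
    os.replace(tmp_link, shot_dir / "latest")  # atomic swap: readers never see a gap
    return next_dir
```

With that pointer in place, the conform tool simply references each shot's latest link; when the VFX house publishes a new take of a CBB, the cut picks it up on the next refresh, with no manual export and import.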
On that level, collaborative editorial is all about getting out of the way and letting creative people be creative. That’s what live and unlocked workflow is all about — allowing for a fluid working environment, rather than snarling creatives up in rigid patterns and forcing early decisions. Burns stresses that it’s important for vendors not to get hung up on their expectation of how creatives should use their technology, and instead learn about how creatives really are using their technology. “I used to get frustrated with artists,” Burns admits. “I’d say, ‘Why can’t they just use the tools the way they’re supposed to be used?’ But then I realized the infrastructure doesn’t matter. It’s the creativity that gets bums in the seats, not the fact that someone is delivering 1.2 GB/second behind the scenes.
“Artists don’t care about infrastructure. And that’s as it should be.”