Sponsored Q&A: Tom Burns, CTO M&E, EMC
Tom Burns has seen a lot. He’s watched the rise and fall of media formats, witnessed the passing of eras in technology and workflow paradigms, and taken in the cultural changes that come with them. And his view is usually from the trenches. As a workflow guru, he’s dealt with everything from film and tape formats to SDI and IP, overseeing post-production infrastructure for Technicolor and Oprah Winfrey’s Oxygen Network (now part of NBCUniversal). For a 2004 animated feature, he deployed what he describes as the 77th fastest supercomputer in the world. He gets a charge from moments of great change in technology and culture, and that means today’s media-and-entertainment business suits him well, with its epochal transition from packaged media consumption to streaming media. “It’s more than a sea change,” Burns says. “It’s a tsunami.”
Today, Burns is the CTO of media and entertainment at EMC Isilon – a position that demands he remain on the leading edge of rapid transformation in the content industry. As he prepares to hit the road for a series of meetings with leading customers and vendors that will help crystallize the challenges and opportunities offered by his new role, Burns sat down with StudioDaily to discuss the move to 4K, the ongoing transition to IP infrastructure, and why our best computer algorithms, for now, still need the helping hand of a human being.
Q: You have an interesting background in this industry. How have you ended up in the storage field today?
A: I have a degree in cultural studies, and yet I've worked in hardcore technology jobs throughout my career. I've always been either the video guy at a computing company or the computing guy at a video company. I have experience in traditional broadcast, VFX and animation, post-production for film and television, and traditional software development. I was employee number six at Alias Research, which later grew and went through some acquisitions before becoming part of Autodesk. I spent the first 13 years of my career on the vendor side, my middle 15 on the user side, and now I'm back on the vendor side. Why am I in storage now? Because storage is just about the most important thing in the known universe.
Q: What are your insights on the needs of Isilon’s customers in media and entertainment?
A: One thing I've learned over the years is that everyone's storage footprint is different. The needs of an OTT operator are entirely different from those of a VFX facility. The OTT operator needs a transcode farm with tightly integrated quality control, metadata extraction and, ultimately, metadata analytics and monetization by way of a recommendation engine. We're not quite there yet, although we have a number of initiatives to help M&E customers explore the value of analytics to their business. Meanwhile, the difference between first- and second-tier VFX providers comes down to bespoke software tools and integrated workflow solutions versus off-the-shelf toolsets. Both are tightly coupled with post-production, deliverables, and 3D conversion. And all these different footprints have to coexist in a highly automated, global pipeline.
Q: Why is extensive automation important?
A: I don't like artists and operators to interact directly with the file system. I like them to interact with a portal that hands out tasks to them while making sure rights are committed. I want to make sure that geosync and tiering are taken care of through automation. Even when you have good tools like WAN accelerators and our own SyncIQ replication software, you never want these large global pipelines to be controlled manually. Automation gives you all kinds of wins, like the ability to notify the client that tasks were completed successfully and the elimination of errors, including simple typing errors.
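A minimal sketch of that portal pattern, built around a hypothetical TaskPortal class (not an actual EMC or Isilon API): automation supplies every file path, rights are checked before a task is handed out, and the client is notified automatically on completion.

```python
# Hypothetical task portal illustrating the pattern Burns describes:
# artists never touch the file system directly, and no path is typed by hand.
import uuid

class TaskPortal:
    def __init__(self):
        self.tasks = {}  # task_id -> task record

    def submit(self, asset_path, operation, rights_cleared):
        # Refuse to dispatch work until rights are committed.
        if not rights_cleared:
            raise PermissionError("rights must be committed before dispatch")
        task_id = str(uuid.uuid4())
        self.tasks[task_id] = {"asset": asset_path, "op": operation, "done": False}
        return task_id

    def complete(self, task_id, notify):
        self.tasks[task_id]["done"] = True
        notify(f"task {task_id} completed successfully")  # automatic client notification

portal = TaskPortal()
tid = portal.submit("/ifs/projects/show01/shot_010.exr", "color_grade", rights_cleared=True)
portal.complete(tid, notify=print)
```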
Q: Do you see the 4K transition driving activity in storage?
A: 4K is very interesting to me, but at the root it’s just a way for consumer electronics manufacturers to sell you another TV set. I don't believe 4K adds a lot of value — except for sports content, where it can be really compelling. But I do believe that it's the stalking horse required to utterly change the nature of broadcast, OTT, and streaming/packaged media as we know it. Along with 4K, we are going to get HDR and wide color gamut, and those fantastic enhancements in color reproduction and dynamic range will ultimately boost the sale of TVs. I believe 4K adoption will take place faster than the overall transition from SD to HD. But it will require the availability of content, media players, streaming applications, and so on.
Actually doing a 4K show, end-to-end, is difficult. Production pipelines are already crazy with the proliferation of digital camera formats that have to be put into a common resolution and color space. I almost long for the days when an HD signal running through a router in real time on coax was the only format you had to worry about. From a vendor’s point of view, it was proprietary hardware in a very small market, so customers would pay huge amounts of money for an HD-SDI router. Look at the economics of post, broadcast, production, and distribution. Distribution used to be about high-powered RF signals being sent through space. Now, it's all about packets.
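To make that conform step concrete, here is a minimal sketch that shells out to the open-source ffmpeg tool (an assumption; the interview names no specific software) to bring any camera original to a house resolution and tag it with a common color space.

```python
# Sketch of a conform step: scale to a common resolution and label the
# output as Rec.709 so every downstream tool agrees on color.
import subprocess

def conform_to_house_format(src, dst, width=1920, height=1080):
    subprocess.run([
        "ffmpeg", "-i", src,
        "-vf", f"scale={width}:{height}:flags=lanczos",   # common resolution
        "-colorspace", "bt709",                           # common color space tags
        "-color_primaries", "bt709",
        "-color_trc", "bt709",
        "-c:v", "prores_ks", "-profile:v", "3",           # mezzanine codec (ProRes 422 HQ)
        dst,
    ], check=True)

conform_to_house_format("camera_original.mov", "shot_010_conformed.mov")
```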
Q: You're already addressing the next question, which is about the ongoing transition to IP infrastructure.
A: This is an incredibly interesting moment. Here in Toronto, a major post-production facility had to move offices recently, and the big question was whether you should put any coax in the new facility. It was a little early to rely on all-IP distribution at a high-end post-production facility for film and television. Yes, HD-SDI may be expensive and clunky, but it's a known quantity. It's on or it's off, meaning you don't worry about dropped packets. Even if I could put in a 100 Gb backbone and go 40 Gb to the edge and 10 Gb to each workstation, would I have the same confidence in a networked infrastructure that I do in HD-SDI? I don’t think so. In five years? Absolutely. It will be all IP. Five years ago? It would be coax. But today, it's a difficult transition. And I think they did a fantastic job with their hybrid coax and IP infrastructure.
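The arithmetic behind that hesitation is easy to check. Here is a rough payload-only calculation (ignoring blanking and network overhead) of how many uncompressed streams fit on the 10 Gb/s workstation links Burns mentions:

```python
# How many uncompressed video streams fit on a 10 Gb/s link?
def stream_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

hd  = stream_gbps(1920, 1080, 30, 20)  # 10-bit 4:2:2 HD,  ~1.24 Gb/s
uhd = stream_gbps(3840, 2160, 60, 20)  # 10-bit 4:2:2 UHD, ~9.95 Gb/s

for name, rate in [("HD", hd), ("UHD", uhd)]:
    print(f"{name}: {rate:.2f} Gb/s -> {int(10 // rate)} stream(s) per 10 GbE link")
```

A single uncompressed UHD stream nearly saturates a 10 GbE link, which helps explain why a deterministic SDI cable still inspires more confidence than a shared network today.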
Q: Speaking of IP, what happens as OTT delivery becomes even more widespread over the next few years?
A: You've got the granddaddy of all OTT networks, Netflix, building out their global footprint, with Amazon Prime and AT&T/DirecTV hard on their heels. iTunes is in the mix. And everybody is trying to crack a business model with inverted economics. If you're pumping out RF signals, it's a huge capital investment up front, and it doesn't cost you anything to add subscribers. You want to get as big as possible, which, in the history of broadcast networks in the United States, has meant appealing to the lowest common denominator. But now you have the opposite economic condition. I can stream a baby monitor channel to my mom, and it costs me nothing. But the more subscribers I add, the more it costs. How do we come up with an appropriate financial model for a distributor when adding subscribers costs them money?
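A toy cost model makes the inversion concrete; every dollar figure here is invented purely for illustration.

```python
# Broadcast: large fixed cost, near-zero marginal cost per viewer.
# OTT: smaller fixed cost, but delivery (CDN egress) scales with audience.
def broadcast_cost(subscribers, fixed=50_000_000):
    return fixed  # the transmitter costs the same for 1 viewer or 10 million

def ott_cost(subscribers, fixed=5_000_000, per_sub_delivery=12.0):
    return fixed + subscribers * per_sub_delivery

for subs in (100_000, 1_000_000, 10_000_000):
    print(f"{subs:>10,} subs: broadcast ${broadcast_cost(subs):>12,.0f}"
          f"   OTT ${ott_cost(subs):>12,.0f}")
```

Past a certain audience size, the per-subscriber delivery cost dominates, which is exactly the modeling problem distributors now face.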
Did you happen to see the episode of HBO's Silicon Valley where the boys working on a tech start-up from their garage have invented a crazy codec and they're streaming live video that's going viral, with their numbers going up and up and up? They were just barely hanging in there, trying to keep their system up. It was a happy ending for the show, but I can't think of many commercial systems out there that could withstand a viral load that adds a million subscribers on the strength of one well-placed tweet. And yet that's the big driver of this industry: virality. If something is trending, it can be incredibly valuable for the people monetizing those clips. A clip that goes viral can make you $100,000. Look at these Vine stars with more followers than Nickelodeon has subscribers. How does that happen?
Q: What’s changing in security best practices? The requirements can be daunting when you're working for customers with extremely high-value media.
A: File-based technologies with good media management and logging can really help a small studio get into the game on a big picture that requires an MPAA audit. We're focused on helping both large and small customers adhere to MPAA specifications. A customer that wants to maintain best practices in security may know, realistically, that VLANs and directory permissions in separate directories on the same volume can't be easily circumvented, but they want all of their materials kept on a separate filer, SAN, or volume anyway. That's onerous, and it cuts out the small shops, but really big companies will demand it. Those companies are victims of their own popularity. They have kids diving through the dumpsters behind their production office looking for discarded script pages. And that's why they are so absolute in their pursuit of security. But security is nearing an inflection point. If we show a track record of being able to do multi-tenancy and secure VLANs in a network world — and both of those are features that EMC has proven to work for federal government accounts — that will gradually allow content owners to be comfortable that their content is secure when it's stored within a multi-tenanted environment.
Q: You mentioned "recommendation engines" earlier. Is that all about allowing content owners to more effectively monetize their old programming as the industry transitions from a traditional broadcast model to OTT and multiscreen delivery?
A: I'm not writing off the broadcast industry, but the OTT upstarts do allow for distribution of what's been called long-tail content. It offers less revenue than packaged media, and yet there is still a recognizable revenue stream, so people are willing to make investments in ingest, transcoding, QC, and metadata for the recommendation engine. That makes the OTT providers the new darlings of the industry—as long as you have metadata about what your subscribers are watching, when, and why. It's an interesting area for Isilon because of our strength in analytics. We can combine traditional file-serving architectures with an analytics play, allowing our customers to create their own recommendation engines or do something else with that extracted metadata.
Q: How is the ability to track that information about the habits of individual subscribers going to change the industry?
A: I'm not sure how it will change things from the consumer's point of view. I know it will, because companies are frantically mining big data to make what they offer more relevant to me. And I willingly give up my clickstream data if they can use those algorithms to increase my satisfaction as a consumer. As a technologist, let's say I'm offering both 4×3 and 16×9 versions of my content. If people are only clicking through to select the 16×9 version, I might be able to gradually stop encoding the 4×3 versions. Likewise, I can tell which bandwidth tiers of my multi-tier encoded assets are being used. If I get 100 percent HD end-to-end throughput on one CDN but never on the other, I'd better rethink my SLAs with each CDN.
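A minimal sketch of that CDN comparison, assuming a hypothetical per-session playback log that records the bitrate actually delivered alongside the top encoded tier:

```python
# From playback logs, compute the share of sessions that sustained the
# top HD tier on each CDN -- the basis for rethinking each SLA.
from collections import defaultdict

sessions = [  # (cdn, delivered_kbps, top_tier_kbps) -- invented sample data
    ("cdn_a", 8000, 8000), ("cdn_a", 8000, 8000), ("cdn_a", 4500, 8000),
    ("cdn_b", 3000, 8000), ("cdn_b", 4500, 8000), ("cdn_b", 8000, 8000),
]

totals, full_hd = defaultdict(int), defaultdict(int)
for cdn, delivered, top in sessions:
    totals[cdn] += 1
    if delivered >= top:
        full_hd[cdn] += 1

for cdn in sorted(totals):
    print(f"{cdn}: {100 * full_hd[cdn] / totals[cdn]:.0f}% of sessions at full HD")
```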
I haven't even mentioned targeted advertising. That is the best thing to happen with analytics in media and entertainment — the ability to deliver real-time data to the dynamic ad insertion marketplace. Isilon is poised to inject high-quality data into those programmatic ad-buy engines to help them get more value for their dollar.
Q: Does this tie into the idea of hyperlocal and hypertargeted advertising?
A: With varying degrees of success, absolutely. But the algorithms aren't smart enough to distinguish between people who search for a term because they want more information on the subject and people who search for it because they have experienced some kind of personal tragedy. If you're searching for something delicate or tragic, you can imagine how painful it is to be bombarded by ads keyed to that term. So we're not at the holy grail of hyperlocal just yet.
For a long, long time, ad buys have been controlled by a media buyer's gut sense of what works. Remember the computer at Don Draper's firm in Mad Men? All of a sudden, they wanted to spin off a whole new division that just did ad buys. They could only guess, using gut feeling to target a media placement, but the computer would analyze the sales numbers that came back. So they had half of this equation back in the 1960s. And we've been trying to get better and better at matching our gut guesses on media buys with the actual sales results that come back later.
Q: So a lot of the grunt work can be done through data analysis, but we're still relying on that human touch to make the sale?
A: Have you read the story in Ken Auletta's book, Googled, about Google executives Larry Page and Eric Schmidt meeting with Viacom's [then-president and COO] Mel Karmazin back in 2003? Schmidt told Karmazin that Google could quantify the advertising business: if you spend this much on ads in this industry, you'll get this much in revenue. And Karmazin shot back, "You're f***ing with the magic!" It's the difference between doing things by experience, intuition, and gut feeling, and doing them the Google way, which is rigorously A/B-tested and algorithmically driven. I thrive on these kinds of huge transitions in the market. There is a lot of leverage tipping over into the purely data-driven world, with enormous economic activity around online marketplaces for dynamic ad placement and ad buying. All the different players are trying to define the market before everything settles down. It's a superb time to be in that business.