Why the Company Built Out a Cost-Effective Stereographic Production Truck
Jerry Gepner: BBS1 started life eight years ago as a really good mobile edit facility. It was a linear digital mobile edit facility, but it wasn’t completely HD-capable at the time. In the intervening years we overbuilt the infrastructure to handle HD. Coming into this year, we adopted an interesting strategy. We’ve been in and around 3D since 2007, and that’s a fact we don’t talk about enough. When the NBA All-Star Game was produced in 3D, we designed and built the entire control room. We were also involved with the Jonas Brothers and Miley Cyrus projects that were shot in 3D – not in every aspect, but we had to learn a lot about stereographic television, and we understood it pretty well from a technology standpoint.
Coming into 2010, we realized NEP Broadcasting and Pace had shown the power of their relationship, which I think is great for the business. Thank god there are Paces, NEPs, and All Mobile Videos building dedicated, very high-end [3D production] facilities. On the other hand, that’s the tip of the iceberg. What about the rest of the production community? Their content is valuable.
That’s a question that comes up a lot when you look at consumer adoption of 3D television sets. Beyond the big events like NFL football or the U.S. Open, we’re going to need more content.
Those events are drivers. That’s first gear in the transmission. What do you do when you shift? You can’t drive around in first gear all the time. Our strategy coming into 2010 was based on our belief that a large portion of the marketplace wants the ability to experiment but cannot afford the price of a major 3D production.
When you say “experiment,” do you mean broadcasting a more low-key production live? Or just running tests that never see air?
I think you’ll see a combination of the two. At the most basic level, we believe there are customers – whether it’s stations, station groups, production companies, or smaller networks – that will go as basic as putting a camera at a location or in a studio for a period of time to try to understand whether the content easily translates to 3D, and then all the way up to a smaller-scale production with a two- or three-camera flypack, or a truck and a control room for up to six cameras. I could imagine the 3D experiment on a studio talk show being two cameras that may or may not see air. That will depend on the distribution deals the content owner has. But it would make sense to experiment first. People need a cost-effective, flexible solution to be able to understand what 3D means to their particular content.
We’ve got several inquiries from customers that produce studio-based content. They want one camera for a week or two. They want to put it on a pedestal, move it around, shoot and record, and then do post-mortem analysis. We’re back to some of the most basic questions for production. How do I produce this type of content for 3D? Can I do the same kind of thing that I do for 2D? Maybe not.
Do you have a real-world example?
At the Major League Baseball All-Star Game, the truck didn’t produce 3D, but it provided an environment for broadcast executives and league executives to come in, sit down, and truly evaluate it. The truck was converted from a control room to a viewing room, with couches and that sort of thing. It’s designed to do everything from that to functioning as a full six-camera production control room. And we can do that conversion in less than a week. Is it as full-featured as a purpose-built truck? It can’t be. But it’s more flexible than a purpose-built truck.
When I met with Panasonic, I found out that their strategy for priming the content pump relied on that basic assumption. People will want to experiment. It’s nowhere near as sexy as producing the U.S. Open. But, in many respects, it’s every bit as important. We’re the only ones offering this kind of service to the industry. It’s a way of de-risking the next steps.
How much do you advise a production during the shoot? Would you offer the services of a stereographer?
No, but we can provide customers with a list of names. It’s an emerging craft in the television industry, much like what happened in the transition from videotape to disk recorders several years ago, when an entirely new craft developed – what we generically call the EVS operator. But we don’t want to be in the crewing business. We want to enable it and put customers in touch with that community. Ultimately, this thing called the convergence operator, or stereographer, is going to be primarily a freelance craft. It’s a craft in the true sense – a combination of technological skill and a creative eye. We do not purport to have a creative eye.
Thinking about how 3D acquisition can be made easier, do you think Panasonic’s new all-in-one AG-3DA1 camera fits in with the bigger rigs on a shoot?
We hope so. Honestly, we haven’t gotten our hands on ours yet. They’re arriving this week. At least conceptually, we think there are definitely some places where these can add value. One of them is B-roll. We know about the limitations of the camera, but B-roll is B-roll and you can shoot lots of it on this camera because it records on compact flash media, which is very inexpensive. In sports, a couple of areas present themselves immediately. One of them is the fixed beauty shot, typically a high wide shot in an arena or stadium. This could be ideal. The other one could be a fixed shot looking into the commentary booth. You have enough light, you’re not worrying too much about depth of field. You put this thing out on a piece of speed rail out in front of the booth looking back in, or in the corner looking over to the side, and it could be a very cool shot. I haven’t played with one yet, but there are natural areas where it would be valuable. In a studio environment, you could conceivably use it for a master shot.
I have not evaluated it qualitatively, but I know it gives you two HD outputs, and unless the video quality is just dreadful, you should be able to get some pretty good usage out of it even though it’s a very limited piece of equipment. At the very worst, it’s a first-generation product. I got a photo sent to me last night showing that Sony has, in a glass case at IBC, a prototype all-in-one 3D camcorder. That whole genre of early experimentation with binocular lenses tied to multiple imagers – there’s a lot of physics you have to overcome to make that work properly. That said, if 3D does catch on, I believe all the major manufacturers will get there in the next few years.
What are some commonly held misconceptions about 3D production?
There are two that we hear a lot right now. The first one is around cost. Most people believe they can’t afford it. I don’t think that needs to be true. We’re the first wave, but I’m sure there will be others behind us who will be able to make 3D much more affordable. That is one of our real goals. And it’s not that they’re blatantly wrong – it’s that there are alternatives emerging in the marketplace. Call it dated as opposed to wrong.
The other one is that people tend to fall hard into one camp or another. They either believe it’s incredibly difficult and way beyond their ken, or that it’s like falling off a log. And the truth is in between. It depends on the person’s exposure and experience, who they’ve talked to, what their role in the process is, that sort of thing. It’s not like there are widely held misconceptions, but there’s a whole bunch of bad data running around. It’s not helped by the fact that there are multiple technology standards emerging.
What standards are you referring to?
From a production standpoint, one of the first decisions that needs to be made is which of the two commonly accepted formats you’re going to produce in, because that choice has implications all the way down the line. One is side-by-side, the same frame-compatible format that’s distributed to the home, which is transportable through a standard HD-SDI infrastructure. The other is what’s becoming called discrete, meaning you preserve two channels of video through the production chain, and when you switch cameras, you switch the left eye and the right eye together. Side-by-side, while less than full resolution in each eye, is a lot easier on your infrastructure. On the other hand, if you go fully discrete, your engineering costs and costs to produce could be much higher, and you could find yourself far more limited with respect to the suite of tools available in the live production environment. These are important decisions.
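To make the resolution trade-off concrete, here is a minimal, purely illustrative Python sketch – not part of any workflow described in the interview – that packs two hypothetical full-resolution eye images into a single frame-compatible side-by-side frame. It shows why each eye ends up with half its horizontal resolution, whereas a discrete workflow would carry both eyes as two separate full-resolution signals.

```python
import numpy as np

# Illustrative only: a hypothetical side-by-side packer, assuming 1080-line HD
# frames represented as NumPy arrays (height x width x channels).
HEIGHT, WIDTH = 1080, 1920

def pack_side_by_side(left_eye: np.ndarray, right_eye: np.ndarray) -> np.ndarray:
    """Pack two full-resolution eye images into one 1920x1080 frame.

    Each eye is decimated to 960 columns (half horizontal resolution),
    which is the trade-off of the frame-compatible format: the combined
    signal still fits a single HD-SDI feed, but neither eye is full HD.
    """
    assert left_eye.shape[:2] == (HEIGHT, WIDTH)
    assert right_eye.shape[:2] == (HEIGHT, WIDTH)
    half = WIDTH // 2
    frame = np.empty_like(left_eye)
    frame[:, :half] = left_eye[:, ::2]   # left eye in the left half
    frame[:, half:] = right_eye[:, ::2]  # right eye in the right half
    return frame

# A discrete workflow, by contrast, would keep left_eye and right_eye as two
# separate full-resolution signals through the entire production chain.
```

Fed two synthetic 1920×1080 arrays, the function returns a single frame that a standard HD switcher could treat like any other camera source, which is the practical appeal Gepner describes.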
And that’s the kind of experiment that you’re trying to enable – to judge the trade-offs between quality and bandwidth and flexibility.
Absolutely. In fact, we can provide solutions that let them look at both simultaneously. A 3ality rig is simply a camera rig – a very good one, with precision motors to make sure the interocular distance is maintained properly and convergence is maintained accurately. It’s a beautiful piece of engineering. From a video signal standpoint, we now have the ability to provide discrete left-eye and right-eye feeds and, at the same time, run them through what’s called a SIP, or stereo image processor, which creates the side-by-side [image]. We can feed both sets of signals out wherever the customer wants them to go – we can put the discrete feeds into a Sony SRW tape machine so they can record them, and we can put the side-by-side into their production switcher, so that when they press the camera 1 button it takes camera 1 and they don’t have to feed two buttons. If somebody wanted to make that evaluation, they could literally put [the two versions] side by side.
And there’s going to be disagreement, too, even among smart, high-level people. We still do 1080i and 720p for broadcast television.
Exactly! That’s a religious war I don’t want to fight. It feels a whole lot like the Crusades. The same thing is happening with 3D. There are people who will argue vehemently for one or the other. I had a conversation with somebody a few months ago who said their company’s testing had shown subjectively that people can’t tell the difference. I don’t know if that’s true; I wasn’t part of the tests. But obviously people are going to a lot of effort to try to figure out the right answer here. I think the answer will be similar to where we ended up with HD, which is that the two formats can co-exist.