How Much RAM Is Really Necessary, How Big Should Your SSD Be, and What's More Important: The CPU or the GPU?
When Jon Bach founded Puget Systems in December 2000, the company focused on supplying computers for gaming enthusiasts. As time went on, Bach realized he could apply Puget’s system-building know-how to other industries. Today, Puget Systems optimizes systems for content creation, engineering, and scientific computing as well as for video game streamers and VR enthusiasts. The company studies spec sheets, but it also runs detailed tests, publishing the results on its website. All that information is free for public consumption, but customers who buy workstations from Puget can also take advantage of the company’s one-on-one customer service process, which is designed to help them work out the best hardware value for their workflows.
“We ask, ‘What’s your workflow, what are your pain points, and what’s your budget?’” Bach says. “If you have a $2,500 budget and you blow all that on a high-end video card, well, that’s not the best use of funds for rendering in [Adobe] Premiere Pro. Oftentimes, you don’t have to buy the best hardware. But you can spend a lot of time researching that, and the internet isn’t always right.”
Asked about trends in workstations, Bach says a move away from Mac to PC is continuing. “Confidence that Apple is committed to the content creation crowd is still pretty low,” he explains. “That ebbs and flows, but every time the PC gets one step faster we have a new influx of people from the Apple community looking for more value for their dollar. In the past, there’s been the stigma of a PC as a huge spaghetti-monster mess, and so people get decision paralysis. Our role is to take this huge PC ecosystem and pare it down. But we look at the whole workflow — what balance can we achieve? A PC running DaVinci Resolve full time is a different PC from one that runs 10% Resolve and 90% Photoshop.”
Your goal, Bach says, should be a workstation that never cramps your style. “Our biggest motivation is to provide a product that doesn’t interrupt creative flow,” he says. “You don’t want to have to go down to 1/8 resolution to get real-time playback — that takes you out of the zone. You need to keep your blood pressure down, so you shouldn’t be worrying about performance. Your computer is a tool, and it needs to get out of the way and let you do your job.”
With all that in mind, we asked Bach for some free advice — answers to some evergreen issues that face anyone looking to select a pro video workstation, or to build their own from components.
StudioDaily: How much RAM does video editing really require?
Jon Bach: I have a good answer to all these questions: It depends. In this case, it depends mostly on resolution. If someone’s working in 1080p, they don’t have to spend much on RAM. At higher resolutions, it has a real impact. So I can’t give a one-size-fits-all answer. We have these conversations with each individual client. We ask what codecs they’re using, what resolutions they’re working at, and we have published way too many articles on performance measurements under different circumstances. It’s pretty common for us to see 64 GB or 128 GB of RAM for 6K and 8K.

As for memory speed, it is not worth going down the rabbit hole from DDR4-2666 to DDR4-3200 or DDR4-4000. Our experience has been that as your frequency goes up, your timings loosen and reliability goes into the tank. We tend to be very conservative from a frequency standpoint and, as a result, we end up with 100x better reliability on our memory. Unreliable memory is a huge problem for us.
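To see why resolution drives RAM requirements so sharply, it helps to run the raw frame-cache arithmetic. The following is a back-of-the-envelope sketch, not Puget’s sizing model: the frame dimensions, the assumed 16-bit RGBA frames, and the 25% reserve for the OS and applications are all illustrative assumptions.

```python
# Back-of-the-envelope frame-cache arithmetic. The resolutions, bit depth,
# and 25% OS/application reserve are illustrative assumptions, not a
# Puget Systems sizing model.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "6K": (6144, 3160),
    "8K": (8192, 4320),
}

BYTES_PER_PIXEL = 4 * 2  # assume 4 channels (RGBA) at 16 bits per channel

for name, (w, h) in RESOLUTIONS.items():
    frame_mb = w * h * BYTES_PER_PIXEL / 1024**2
    for ram_gb in (32, 64, 128):
        usable_mb = ram_gb * 1024 * 0.75       # leave ~25% for OS and apps
        frames = int(usable_mb / frame_mb)
        seconds = frames / 24                  # at 24 fps
        print(f"{name:7s} {frame_mb:6.1f} MB/frame | {ram_gb:3d} GB RAM "
              f"caches ~{frames:4d} frames (~{seconds:.0f} s at 24 fps)")
```

On these assumptions, 128 GB holds several minutes of uncompressed 1080p frames but well under a minute of 8K, which is consistent with the 64 GB and 128 GB configurations Bach describes for 6K and 8K work.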
How big does my SSD need to be, and how do I get the most speed from SSD storage?
First of all, for your project files and your exports, in most cases any SSD can get you there. NVMe [Non-Volatile Memory Express, a faster PCIe-based interface for solid-state storage] is obviously much faster, and it’s a great technology. As we hit price parity, why not? But project files and export files are not large enough to saturate the throughput of even SATA SSDs. When you talk about source files — let’s say 6K RED footage, just moving those files around — we find NVMe can be a boost for that.

There are some oddities that are software quirks. As you import media into Premiere Pro, you get dramatically better performance on a secondary drive than on your primary drive. That’s what the benchmarking shows. So when we do a workstation for Premiere, we’ll have an NVMe drive that runs your OS and your software, a footage drive for source footage and projects, and then another for archival or projects you’re not working on now. A three-drive mix is really common for us.

When we talk about external storage, it’s the Wild West out there. Small shops often have it duct-taped together, and they don’t like it, but it’s what they’ve always done. We approach it by bringing in best practices from multiple shops and making recommendations for people. And sometimes people have already put so much money into going down one road — we see that a lot with Mac-to-PC converts, so we try to make sure that we’re good to go with Thunderbolt on the PC. That’s not so easy to do, but it helps if they don’t have to buy everything brand new.
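Puget’s published benchmarks measure real application performance, but a crude sequential-throughput test is an easy sanity check on your own drives. This is a minimal sketch: the file paths are placeholders for a hypothetical OS drive and footage drive, and raw sequential numbers won’t capture application quirks like Premiere’s preference for a secondary media drive.

```python
# Crude sequential-write throughput check for comparing drives. Paths are
# placeholders; adjust them for your system. Raw I/O numbers are only a
# sanity check, not a substitute for application-level benchmarks.
import os
import time

def sequential_write_mb_s(path, size_mb=1024, chunk_mb=16):
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb", buffering=0) as f:
        for _ in range(size_mb // chunk_mb):
            f.write(chunk)
        os.fsync(f.fileno())  # make sure data actually reaches the disk
    elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / elapsed

# Hypothetical mount points for an OS/NVMe drive and a footage drive.
for label, path in [("OS drive (NVMe)", "C:/temp/_iotest.bin"),
                    ("footage drive", "D:/temp/_iotest.bin")]:
    print(f"{label}: ~{sequential_write_mb_s(path):.0f} MB/s sequential write")
```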
Is Thunderbolt 3 actually stable on the PC?
Yes, I think it is now. It took years. We’ve done work with ASUS and Gigabyte, and the first year was a nightmare. The X99 [released in late 2014] was one of the first platforms where we said, “This is ready for prime time.” There are certain Thunderbolt devices that are not certified for PC, including some monitors, but storage is in a pretty good spot. And we work with Apple users, using software to move their file systems over to the PC so they don’t have to reformat anything. And we publish everything we do as we encounter these Mac-to-PC issues.
What’s the best practice for balancing CPU power vs GPU power?
That’s always changing. It depends on how much work the software developer has put into GPU acceleration. For Premiere, you can build a really high-end box with an NVIDIA GeForce GTX 1070 video card and that’s still not your bottleneck — you’re still CPU-bound. Resolve is much better at scaling across video cards. Adobe is very siloed. The After Effects team and the Photoshop team might as well be part of separate companies. As we see things like Lightroom and After Effects get major updates, it’s interesting to see how one year After Effects scales well across many cores, and the next year it’s more about CPU frequency. It changes all the time, and you have to be on top of it. But as long as you’re doing something that’s very parallel, like rendering, you are going to benefit from more cores. Premiere scales up to the eight- or 10-core range, and then it’s in the zone of diminishing returns. A CPU in the Intel Core i9-7960X range with 64 GB of memory and a [GeForce] GTX 1070 video card tends to be a very common configuration.
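One practical way to see whether a given export is CPU-bound, as Bach describes for Premiere, is to sample utilization while the render runs. The sketch below assumes an NVIDIA card with the nvidia-smi command-line tool on the PATH and the third-party psutil package installed; it is a rough diagnostic, not a formal benchmark.

```python
# Rough check of whether an export is CPU- or GPU-bound: sample utilization
# while the render runs. Assumes an NVIDIA GPU with the nvidia-smi CLI on
# the PATH, plus the third-party psutil package (pip install psutil).
import subprocess

import psutil

def gpu_utilization_percent():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    return float(out.strip().splitlines()[0])

# Sample once per second for ~30 seconds in the middle of an export.
for _ in range(30):
    cpu = psutil.cpu_percent(interval=1.0)  # blocks for the 1 s sample window
    gpu = gpu_utilization_percent()
    print(f"CPU {cpu:5.1f}%   GPU {gpu:5.1f}%")

# If the CPU sits near 100% while the GPU idles, a faster video card
# won't help: the workload is CPU-bound, as described above.
```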
What are common bottlenecks?
It’s probably most often the CPU. People might even go in the opposite direction and overbuy — you can blow a lot of your budget on your CPU and not have enough left for memory or proper storage. That tends to be where the battleground is, the difficult decisions with the most ramifications elsewhere. But if you don’t go high enough [on CPU frequency], your live playback isn’t there unless you go down to 1/8 or 1/4 resolution. The CPU can also impact your export times, but that’s not going to ruin your day; the difference between a 45-minute and a 90-minute export isn’t as big a deal. There is a prevailing thought that if you spend more, it gets faster. But even if you take budget out of the equation, there are a lot of tasks where an 18-core CPU is going to be slower than a really quick 8-core. For rendering, great, go for the cores. But a lot of software hasn’t made that jump. It comes down to Amdahl’s law — where is the tipping point where adding more cores actually makes your CPU slower?
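Amdahl’s law makes that tipping point concrete: speedup on n cores is 1 / ((1 − p) + p/n), where p is the fraction of the workload that parallelizes. The sketch below uses illustrative clock speeds and an assumed 80% parallel fraction (not measured application data) to show how a fast 8-core can beat a slower 18-core.

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
# parallel fraction of the workload. The clocks and the 80% parallel
# fraction are illustrative assumptions, not measured application data.

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

P = 0.80  # assume 80% of the work scales across cores

# Higher-core-count parts usually run lower all-core clocks, so model
# effective throughput as clock speed x Amdahl speedup.
cpus = [
    ("8-core @ 4.5 GHz", 8, 4.5),
    ("10-core @ 4.3 GHz", 10, 4.3),
    ("18-core @ 3.4 GHz", 18, 3.4),
]

for name, cores, ghz in cpus:
    throughput = ghz * amdahl_speedup(P, cores)
    print(f"{name}: relative throughput ~{throughput:.1f}")
```

With these numbers, the 8- and 10-core parts come out ahead of the 18-core part (roughly 15.0 and 15.4 versus 13.9), matching the diminishing-returns zone Bach places at eight to 10 cores for Premiere.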
What are the best practices for cooling and overclocking? And is overclocking reliable for production systems?
Not so much. We don’t get into it much anymore. Reliability is more important than a small gain in performance. And, as Turbo Boost gets more aggressive, there’s less of a need for overclocking.

One of the differences between the small guys and the big guys, meaning Dell or HP, is thermal limits. Intel has base clocks they guarantee at the Thermal Design Power [TDP] figures they publish, and that’s what the big guys design their cooling for. But base clock is meaningless. There is an unpublished spec of all-core Turbo frequencies and, to me, that’s the most meaningful spec. Overclocking is built into the processor [via Turbo Boost], but the dirty secret of the industry is that it’s considered non-guaranteed performance, and you don’t get it without beastly cooling — a processor rated at 95 W may run at 165 W with all-core Turbo. So when we’re designing a workstation, we throw TDPs out the window and design our thermals around real measurements, so that our workstations can sustain that Turbo frequency indefinitely. We don’t care about the tech specs as much as the real-world performance. And, because real-world performance always outstrips the tech specs, we have to over-engineer our workstations.
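The arithmetic behind that design philosophy is simple. The sketch below uses the 95 W TDP and roughly 165 W all-core figures from Bach’s example; the 20% headroom margin is our own illustrative assumption, not a Puget specification.

```python
# Cooling budget: design for measured all-core Turbo draw, not published
# TDP. The 95 W and 165 W figures come from the example above; the 20%
# headroom margin is an illustrative assumption.

tdp_watts = 95        # published TDP; only guarantees base clock
all_core_watts = 165  # measured sustained all-core Turbo draw
margin = 1.20         # assumed headroom for ambient temperature and aging

design_watts = all_core_watts * margin
print(f"TDP-based cooling budget:  {tdp_watts} W")
print(f"All-core Turbo draw:       {all_core_watts} W "
      f"({all_core_watts / tdp_watts:.0%} of TDP)")
print(f"Design target with margin: {design_watts:.0f} W")
```

On these numbers, a cooler sized to the published TDP is short by roughly three quarters of the heat the chip actually produces under sustained all-core Turbo, which is why designing to real measurements matters.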
Puget Systems: www.pugetsystems.com