ProRes: The Codec of the Past, Present, or Future?
We started out researching and writing this article to answer an important workflow question for video Producers, ENG/EFP video crews, and Editors. What is going to happen to ProRes when Apple Final Cut Pro 7 is no longer a viable NLE option? This question was somewhat like opening Pandora’s Box.
From production to post, then archival and distribution, the life of a video project wears many codecs. The acquisition codecs were generally defined by SMPTE and designed by the camera manufacturer. The footage was ingested into an NLE and transcoded into a new codec. The project was approved by the client, mastered in another codec defined by the organization, then compressed into one or more codecs for distribution. Mobile recording devices, and now camera manufacturers, have started the trend of eliminating that transcode step. They offer the ability to capture projects via a mobile recorder, or in camera, directly in the intermediate/finishing codecs: Apple ProRes, Avid DNxHD, and CinemaDNG, an open file format spearheaded by Adobe.
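For teams that still need that transcode step, here is a minimal sketch of what it can look like using the free ffmpeg tool and its prores_ks encoder. The file names are placeholders, and your facility's transcode tool of choice may well be different; this is an illustration, not a prescription.

```python
import subprocess

# Minimal sketch: transcode a camera original into ProRes 422 HQ for editing.
# Assumes ffmpeg is installed with its prores_ks encoder; the file names
# below are hypothetical stand-ins for whatever your camera delivers.
source = "camera_original.mp4"
output = "edit_ready_prores.mov"

subprocess.run(
    [
        "ffmpeg",
        "-i", source,           # camera acquisition file
        "-c:v", "prores_ks",    # ffmpeg's ProRes encoder
        "-profile:v", "3",      # 3 = ProRes 422 HQ (0 = Proxy, 1 = LT, 2 = 422, 4 = 4444)
        "-c:a", "pcm_s16le",    # uncompressed audio, typical for mezzanine files
        output,
    ],
    check=True,
)
```

In practice most NLEs, dedicated transcoders, or the recorders themselves handle this step, which is exactly the trend described above.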
Questions Asked and Answered
- Which edit suites are most professional Editors currently using? The answer is FCP 7, Adobe Premiere Pro CC, and Avid Media Composer.
- Are Editors going to migrate from FCP 7 to FCP X? We spoke to video Editors all over the country; some have a copy of FCP X, but very few use it to edit professionally.
- How well can Editors ingest ProRes files into an Avid NLE? The AMA (Avid Media Access) plug-in allows you to view ProRes files and start cutting, but Avid Editors prefer to use Avid’s native codec, DNxHD, whenever possible.
- How well can Editors ingest ProRes files into Adobe Premiere Pro? Premiere can see almost any file, and you can use a handful of different codecs within the same project’s timeline.
- How does the cloud affect which platform corporations are choosing to migrate to? Adobe Premiere Pro CC is a subscription-based cloud platform and great for freelance Editors, but it raises questions in the corporate environment because some institutions have clear rules against cloud software. MediaCentral Platform is Avid’s answer to the cloud question, and it seems that the user can turn cloud functions on or off as needed. More research needs to be done with both platforms in relation to the corporate environment.
Why Does the Codec Matter?
Every codec can be transcoded and managed in a non-linear, tapeless world, so why does it matter how the video is recorded and delivered to the Editor? There are several very important reasons why video Producers, Directors of Photography, Editors, and Communications Directors need to pay attention to this question: time, money, and human considerations.
PRODUCERS: We all know that video Producers are responsible for seeing the project from pre-production through post while keeping an ever-vigilant eye on the bottom line. Producers need to make sure they request a capture codec that their in-house Editor prefers. Producers who transfer and view their files on set, add metadata, or do paper edits must have a viewer that can read the captured codec. Every manufacturer that creates video codecs provides a free or almost-free viewer for those codecs, but those players aren’t evergreen. Example: if you were a Sony PMW-EX3 user, you could view your files with the Sony Clip Browser. When you upgraded to the PMW-F3, you had to download the Content Browser even though you were still viewing XDCAM EX clips. (Imagine Products makes some great viewers and digital media transfer tools that work with most formats.)
DIRECTORS OF PHOTOGRAPHY: Since the majority of DPs are freelance, it is important to know which codecs will capture beautiful images without bogging down transfer time on set or ingest time and processing power in post. Because DPs capture images in the middle of the life of a project, they are automatically the middle-men arranging the best codecs with the Producers and Editors. It is a good practice to ask before the shoot which codec needs to be recorded, and to ask the Producer if they have a viewer for that codec.
EDITORS: We know that Editors have “codec fatigue” just like the rest of the production team. Editors who know a bit about the production process make the best team members. Understanding how the footage is going to be captured on site, and making suggestions based on how the footage will be manipulated in post, is invaluable.
COMMUNICATIONS DIRECTORS: Being able to see a few steps down the production road is essential for the budget, educating employees, and putting the best archival practices in place to future-proof footage.
ProRes Everywhere
Have you noticed a storm of acquisition companies announcing they now record in one or more flavors of ProRes? ARRI just announced that the Amira will capture ProRes. Blackmagic Design cameras/recorders have always recorded ProRes 422 HQ, but now they can also capture ProRes 422, ProRes 422 LT, and ProRes 422 Proxy. Sony announced at NAB that the PMW-F55 camera will record in ProRes at some point in the near future. The AJA CION camera also captures ProRes 4444. In addition, there are many field recorder companies that capture different flavors of ProRes: AJA, Sound Devices, Atomos, Blackmagic Design, and Convergent Design. A full list of authorized Apple ProRes devices can be viewed here.
About ProRes
The ProRes family can be broken down into six codecs, listed here from highest to lowest quality: ProRes 4444 XQ, ProRes 4444, ProRes 422 HQ, ProRes 422, ProRes 422 LT, and ProRes 422 Proxy. As Apple defines it, “The role of a codec is to preserve image quality as much as possible at a particular reduced data rate, while delivering the fastest encoding and decoding speed” (Apple ProRes White Paper, June 2014, p. 7). The white paper goes on to say, “As a variable bit rate (VBR) codec technology, Apple ProRes uses fewer bits on simple frames that would not benefit from encoding at a higher data rate. All Apple ProRes codecs are frame-independent (or “intra-frame”) codecs, meaning that each frame is encoded and decoded independently of any other frame.” Basically, a lens and camera sensor capture more data than the human eye can process, so a codec throws away a little or a lot of information to make computer processing more efficient. This can be done in a handful of ways, but the best codecs are the ones that are fast and maintain enough detail to manipulate the image in post. The end deliverable, time, and budget are the factors that determine how much information from the camera lens/sensor needs to be captured and ingested into an NLE.
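To put those data rates in rough perspective, here is a quick back-of-the-envelope calculation, assuming the approximately 220 Mb/s target data rate Apple’s white paper lists for ProRes 422 HQ at 1920x1080, 29.97 fps (the exact figure varies with frame size and frame rate):

```python
# Rough storage estimate for an hour of ProRes footage.
# The 220 Mb/s figure is the approximate target data rate Apple lists
# for ProRes 422 HQ at 1920x1080, 29.97 fps; treat it as a ballpark.

data_rate_mbps = 220        # megabits per second (approximate)
seconds_per_hour = 3600

megabytes_per_second = data_rate_mbps / 8                    # 8 bits per byte
gigabytes_per_hour = megabytes_per_second * seconds_per_hour / 1000

print(f"~{gigabytes_per_hour:.0f} GB per hour of footage")   # ~99 GB
```

Roughly 99 GB per hour of footage is a useful number to have in hand when budgeting storage for the shoot, the edit, and the archive.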
Since the decline of Betacam SP, video Producers, Camera Operators, and Editors have yearned for another go-to video format. Is ProRes rising from the pack as the predominant codec? Will the codec die out with Final Cut Pro 7? Why do so many third-party recorders include this codec: is it because of past performance or future need?
If everybody can agree on one thing, it is that more Producers, DPs, and Editors need to talk to each other. So let’s chat: please comment about your organization’s workflow, and tell us how the trends in cloud post-production affect you. Let’s talk about the best format to capture your projects, no matter the locale, in order to future-proof your footage. Please comment about trends in your geographic region, industry, or corner of production. And by all means, Editors, please give us the pros and cons of your current NLE! As the dust settles we will continue to share the trends with you.