This article looks purely at the speed of the two most popular video editing apps on the Mac: Final Cut Pro X and Adobe Premiere Pro. We’ll compare the speed of common editing operations on the same Mac, with the same hard drives and the same footage: optimized ProRes footage and native H.264 footage from a Canon EOS M (like a baby DSLR) and an iPhone. There are many other camera formats out there, but that’s our representative sample.
Wherever possible, we’ll do things natively, manually putting together the same edits in the same places to avoid any (unlikely) problems from conversion. We’ll also treat each program the same way: restarting before running the tests, and making sure no other apps or processes are running at the same time. If there are multiple ways to do something, we’ll try them all to find the best one. All timings are approximate, within a second or so, but the general results are pretty clear. We’ll mostly be looking at 1080p HD video, though we’ll briefly discuss 4K as well.
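The stopwatch approach described above can be sketched in code. Here’s a minimal Python timing helper, purely illustrative: the real tests were timed by hand against each app, and the `operation` callable here is a stand-in, not a hook into either program.

```python
import statistics
import time

def time_operation(operation, runs=3):
    """Time a callable several times; return the median wall-clock seconds.

    Using the median rather than the mean damps one-off spikes from
    background processes, mirroring the "restart first, quit everything
    else" precautions described above.
    """
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

# Example: timing a stand-in operation that sleeps for ~0.1 s.
elapsed = time_operation(lambda: time.sleep(0.1))
```

The same caveat from the article applies to the helper: results are approximate, within a fraction of a second, and only comparisons on the same machine are meaningful.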
We won’t be looking at the interface differences, any bugs, minor workflow differences, or the whole “magnetic timeline vs. tracks” debate. While either program can get most tasks done, there are definitely things that are easier in one app than the other. Not all features are common to both apps, and we won’t be going over those either. Of course, we also aren’t looking at Windows, because FCP X simply isn’t available there.
The test machine is a top-spec Retina iMac with all the trimmings: the 4 GB VRAM GPU, the higher-end 4 GHz i7 chip, 24 GB RAM, and a four-drive ThunderBay RAID 0 for storage, with just under 4 TB of its 12 TB free. The system drive is a 3 TB Fusion Drive, but it won’t be used much; all media and project files live on the RAID. For software, I’m running OS X 10.10.5, FCP X 10.2.1, and Premiere Pro CC 2015 (v2015.0.1).
For footage, I’ve shot a scintillating 2-minute multicam production of a fish tank, with external audio from a Tascam DR-60D, a Blackmagic Cinema Camera and Pocket Cinema Camera shooting ProRes, a Canon EOS M shooting H.264, and intermittent additional shots from an iPhone shooting H.264. All the cameras were set to 25fps, except the iPhone, whose native camera app shot at 30fps, just for fun and added challenge. We did try optimizing the H.264 footage, but it made very little difference. As before, we’d recommend optimizing only if your particular camera doesn’t work well natively.
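Mixing that 30fps iPhone clip into a 25fps project means some source frames have to be skipped (or repeated, going the other way). Here’s a rough sketch of nearest-frame conforming, purely to illustrate the idea; neither app documents its exact method, which may involve blending or optical flow rather than simple dropping:

```python
def conform_frame_indices(src_fps, dst_fps, dst_frame_count):
    """Map each output (timeline) frame to the nearest source frame
    by timestamp, the simplest way to conform mismatched frame rates."""
    return [round(i * src_fps / dst_fps) for i in range(dst_frame_count)]

# One second of 30fps footage conformed to a 25fps timeline:
# 25 of the 30 source frames are used, so 5 per second are skipped.
indices = conform_frame_indices(30, 25, 25)
```

When source and timeline rates match, the mapping is the identity and every frame is used, which is why the 25fps cameras need no conforming at all.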
How long does it take to synchronize those tracks?
Armed with my excellent fish footage, I threw it all in a pile and asked the apps to sort it out. Surprisingly, both apps were very fast at synchronizing the footage: FCP X took about 6 seconds, and Premiere about 3 seconds. Unfortunately, Premiere couldn’t synchronize one of the iPhone clips, which ruins its good showing. Both are quick, though.
Can I play this stuff in real time? How about color correction?
Both apps stutter at full resolution, but at least adding color correction doesn’t make it any worse. Neither app could play back my sample 4K UHD RED footage at full resolution and full frame rate, but setting Premiere to 1/2 resolution and FCP X to Better Performance allowed full-speed playback. Your mileage may vary, but 4K REDCODE is tough enough that you need either a very grunty machine or to work with optimized or proxy media.
How long does it take to stabilize a shot?
Though there are third-party stabilizers available, we’ll go with the built-in options in each program. Note that FCP X’s SmoothCam stabilizer will behave differently on different clips: sometimes a single pass is enough, and other times a second pass is performed. We’ll use a few separate shots to even things out, though as it happens all three shots required two passes in FCP X.
While I wouldn’t want to use any of the resulting shots (garbage in, garbage out!), I could use them if I had to after further tweaking. To my taste, Premiere’s Warp Stabilizer, at 2–3 times slower, produced slightly more obvious artifacts than FCP X’s SmoothCam.
If I add effects to a sequence of shots, can I play them back in real time? And how long does it take to render them?
You don’t necessarily need to render simple effects at all, because both apps are usually happy to calculate them in real time during normal playback. More complex effects can’t be played back in real time, so rendering first lets you play through your timeline at full speed.
Note that by default, FCP X renders whenever your computer is idle for just a few seconds, so you don’t notice rendering at all. For a fair playing field, we have disabled background rendering in FCP X.
We tested a color correction and a sharpen operation across our 2-minute timeline: FCP X took 31 seconds to render, while Premiere took 73 seconds. (In FCP X, the Color Board and the Sharpen filter were used; in Premiere Pro, the Lumetri Color effect did the trick.)
I’ve got a two minute sequence with titles, effects and color correction. How long does it take to export to ProRes, without rendering first? And what if it’s been rendered already?
Good question! We took a 2-minute 1080p sequence, half H.264 and half ProRes, added a simple 10-second title at the start, then added color correction and sharpening across the whole sequence (via an Adjustment Layer in Premiere), in both FCP X and Premiere. It’s meant to represent a typical edit with basic effects.
Exporting to ProRes 422 from FCP X, with background rendering off, took 28 seconds. Rendering the timeline took slightly longer, at 33 seconds, but a subsequent export of the rendered timeline took only 15 seconds. That’s good news if you export to ProRes and leave background rendering on; just don’t wait for a manual render before exporting, as it’ll only slow you down.
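The measured numbers make that “don’t pre-render” advice concrete. A quick sanity check using the timings above (a rough model that treats render and export as strictly sequential steps):

```python
# FCP X ProRes timings from the tests above, in seconds.
direct_export = 28            # export with no render files present
render_then_export = 33 + 15  # manual render, then export of the rendered timeline

# Waiting for a manual render before exporting is slower overall...
slowdown = render_then_export - direct_export

# ...but if background rendering already finished while you worked,
# only the 15-second export remains at delivery time.
export_after_background_render = 15
```

In other words, rendering only pays off when it happens in the background, on time you weren’t going to use anyway.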
Exporting from Premiere to ProRes took much longer: around 65 seconds to export (before or after rendering) or to render the video. The situation doesn’t improve with H.264 exports either.
What about exporting the same thing to H.264?
Another good question! While ProRes makes for a good archival master, it’s overkill for uploading to YouTube or Vimeo. Both apps can export a high-quality 10–20 Mbps H.264 file at full resolution, but FCP X can make use of Quick Sync (Intel’s built-in hardware encoding module) on all Macs except the Mac Pro. Premiere Pro can’t.
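For context on that 10–20 Mbps delivery range, file size is just bitrate times duration. A back-of-the-envelope helper (video stream only; audio and container overhead add a little more):

```python
def h264_size_mb(bitrate_mbps, duration_s):
    """Approximate file size in megabytes for a constant-bitrate stream:
    megabits per second x seconds, divided by 8 bits per byte."""
    return bitrate_mbps * duration_s / 8

# The 2-minute test sequence at a mid-range 15 Mbps comes out
# around 225 MB -- a fraction of the equivalent ProRes master.
size = h264_size_mb(15, 120)
```

That compact size, not quality, is why H.264 is the right target for upload while ProRes stays the archival format.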
Exporting to H.264 from FCP X took just 24 seconds, or 23 seconds if you rendered first: a dead heat. In fact, it took the same amount of time with no effects applied at all. I’m sure you could add enough effects to slow it down, but basic effects aren’t a challenge at all.
Exporting to H.264 from Premiere took nearly twice as long as exporting to ProRes: about 122 seconds, or 126 after a pre-render. While a roughly real-time export to H.264 would have been impressive a few years ago, Final Cut Pro is five times faster today.
If you’re working with 1080p video, both apps will let you work with common types of footage in real time, without transcoding, and both will sync at least most of your multicam footage quite quickly. However, when it comes to stabilizing, rendering and exporting, FCP X is simply much faster. While it may well be possible to obtain different results on a different machine or with many more effects (feel free to share your own results in the comments), the outcome here was pretty clear. Here’s hoping that Adobe can narrow the gap in a future version.
Note: you can greatly improve the Premiere export time if you:

- set your sequence preview settings to match your export settings,
- render previews before exporting, and
- tick “Use Previews” in the export dialog.
If you get all those settings right, you can get ProRes export times on par with FCP X. However, rendering still takes more than twice as long as in FCP X, and there’s no need to render beforehand in FCP X anyway, so it’s not a big win. You’d also have to put that file through Adobe Media Encoder to create an H.264 file.
Bottom line: If you are using Premiere and need to export to ProRes, try to tick "Use Previews" and match sequence and export settings if you can.