Rather than get into what can almost be called a religious debate, akin to Mac versus PC or Coke versus Pepsi, we’re going to provide a few decision points to ponder when it comes to choosing the right tool for the right project. Notice that we didn’t say the right tool, period. With as many workflows as there are NLEs (nonlinear editors), the time is right to consider how much to beef up your toolbox before determining which editing interface will serve you best.
Where Will You Edit?
We’ve traditionally had to choose between speed and portability, with the desktop computer and laptop being the respective hardware platforms on which these two choices rest. Yet even today’s thinnest laptop — be it a Windows 8 ultrabook or a MacBook Air — possesses a number of key features that make portability a real option for about 90% of all editing projects.
For those projects requiring significant compositing, whether in a stand-alone motion graphics tool or directly within the NLE timeline, the key factor to consider in portable hardware is the graphics card (GPU). Many of today’s NLEs are GPU-enabled, meaning the full render of multiple video layers is deferred until the timeline is exported to a single, flat linear video file. In the meantime, the GPU renders as many layers as it can at once — including effects applied to discrete layers — to provide a real-time preview the editor can use for decision-making. Some GPU-accelerated tools can also scale back preview resolution in favor of consistent playback timing, an option that’s especially handy on portable devices.
What About the Cloud?
The secondary question that’s arisen in the last two years, but with a heritage more than a decade in the making, is whether the project will be edited in the cloud or on the local machine.
In that time, a number of online tools have popped up that offer the ability to trim, tag, segment, splice, catalog, comment on, and proof your basic online video edits.
Some editors choose this for their rough cut — or a “paper edit,” as we used to call it — while others use it for crash editing and output to a streaming format.
A few solutions are now beginning to integrate this online clip storage and collaboration with the desktop- or laptop-based NLE product. It’s a normal progression to use the tool you’re most comfortable with, leveraging the cloud to store and share both the raw video clips and in-progress timelines.
DASH or Flash?
Everyone knows Flash, and it’s part of the legacy that brought us online interactive content as well as more recent online video distribution. But you may not be as familiar with DASH, the new MPEG standard for dynamic adaptive streaming over HTTP. DASH is designed to allow content owners to deliver streaming content across a standard web server (HTTP) in multiple bitrates and resolutions, with the best resolution delivered dynamically based on the native screen resolution of the client device as well as currently available bandwidth.
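To make the “best resolution delivered dynamically” idea concrete, here is a minimal sketch of the selection logic a DASH client performs: pick the highest-bitrate rendition that fits both the measured bandwidth and the device’s screen. The rendition table, function name, and figures are hypothetical examples, not taken from any real manifest or player.

```python
# Hypothetical rendition ladder: (bitrate in kbps, frame height in pixels).
RENDITIONS = [
    (400, 360),
    (1200, 720),
    (3500, 1080),
    (8000, 2160),
]

def pick_rendition(bandwidth_kbps, screen_height, renditions=RENDITIONS):
    """Return the best (bitrate, height) pair the client can sustain.

    A rendition qualifies if its bitrate fits the available bandwidth
    and its resolution does not exceed the device's native screen height.
    """
    candidates = [
        r for r in renditions
        if r[0] <= bandwidth_kbps and r[1] <= screen_height
    ]
    # If even the lowest rung doesn't fit, fall back to it anyway.
    return max(candidates) if candidates else min(renditions)
```

A real player re-runs a decision like this every few segments, which is what lets the stream step up or down as bandwidth changes mid-playback.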
The reason these two streaming delivery options enter into our NLE discussion is that the output formats often determine a large part of the workflow, which itself determines the proper tool.
DASH has equivalents in three other formats: Apple’s HTTP Live Streaming (HLS), Adobe HTTP Dynamic Streaming (HDS), and Microsoft Smooth Streaming. HLS uses MPEG-2 Transport Stream (M2TS), but all the others use a newer technology called fragmented MP4 (FMP4). M2TS requires that audio and video be multiplexed together, whereas FMP4 allows separate audio and video files to be segmented independently at the time of streaming — so one video stream can be paired with any number of discrete audio streams, lowering the overall bitrate required to deliver to any given user.
All rely on an MP4 elementary stream (audio, video, or a combination of both) that can be segmented — also called fragmented or chunked, among other names — into small 2- to 10-second pieces at a number of bitrates. Until recently, HLS required this segmentation beforehand, resulting in a massive number of small files stored at the HTTP server ahead of time. The most recent draft specification of HLS allows the MP4 files to remain intact, with segmentation occurring just before they are streamed.
Going back to the direct “DASH or Flash?” question, the good news is that prototype versions of the Flash Player have demonstrated DASH integration. The question isn’t completely moot, however, as the use of RTMP or RTSP rather than HTTP still requires a dedicated streaming server for delivery.
Audio: Now Hear This
One final area to consider is the audio capabilities of your chosen NLE. Unfortunately, the average NLE lacks robust audio capabilities, so listen as well as watch when it comes to choosing the best NLE tool for your project. Some NLEs began life as — or spun off from — audio software, and their native audio capabilities are likely to be stronger as a result. Alternatively, if you need more sophisticated audio features, consider an NLE that ships as part of a suite that includes a dedicated audio tool.
Think through these decision points, but also weigh the input formats you’ll be handed (and any subsequent requirement to transcode to a “native” NLE video format) when deciding which tool to use.
Above all, remember that workflows are the most critical aspect of choosing an NLE. Don’t be afraid to try out a number of trial versions of tools, including robust asset management and pre-edit logging tools, before you settle on just one. After all, a toolbox is only as good as the wide variety of tools you store in it.
Five Key Questions to Ask When Choosing an Editing Solution
- Will you edit on a desktop? On a laptop? In a studio? Or in the field?
- Will you be doing a significant amount of compositing or applying multiple layers of video or effects?
- Will your project be edited locally or on a network? Will your project involve collaborative editing in the cloud?
- Which streaming formats will matter most in your outputs?
- Do you need sophisticated pro audio features for editing or sweetening your projects?