Wednesday, November 23, 2011

Output and Intermediary Formats

Did you know that After Effects will cycle mask colors? Why isn't this the default? Sheesh.

More conversation between me and Nathan Vegdahl regarding delivery formats.

Me: I worry about legacy issues. In 10 or 15 years, will a movie I make today still be in a readable format? I dunno!

Nathan: The more open the standard, the more likely it is to be readable in the future. H.264 and AAC in an mp4 container is good, because it has open-source implementations. Worst-case scenario: someone has to pull up old source code and use it to convert your movie to a new format.
OpenEXR is also open source, so same deal.
The formats you have to worry about are proprietary formats, because if the companies behind them stop supporting them, you're out of luck.
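
[editor's note: if you're ever unsure what a file actually contains, running ffmpeg with just an input and no output file will print the container, codecs, resolution, and frame rate it detects before complaining that no output was specified. The filename here is just a placeholder:

ffmpeg -i your_movie.mp4 ]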

Me: The main trick for our lab is: will they be able to take the file we give them and make a DigiBeta tape out of it? Because we still need to deliver DigiBetas.

Nathan: [editor's note: TL;DR: use Handbrake and/or skip to*] Hmm... I don't know if ffmpeg can encode for DigiBeta. I'll look into it. I currently use ffmpeg 0.8.5 on Linux for all of my final encoding. I do everything from the command-line, because it gives me more precise control over the encoding process, and I get to play with all kinds of weird settings. But for the most part I keep it simple.
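
[editor's note: you can check which version of ffmpeg is installed on your own machine by running:

ffmpeg -version ]
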
If I want to encode to a nearly (but not quite) lossless H.264 file that is widely compatible with other software, I use this command-line:
ffmpeg -i input_file -vcodec libx264 -vprofile baseline -crf 1 output_file.mp4
Where "input_file" is the name of your input video file, and "output_file.mp4" is the name of the file you want to create. Ffmpeg auto-detects what container format you want from the output file extension. In this case, it knows that .mp4 means the mpeg4 container.

The "-vcodec libx264" tells it to encode the video as H.264. The "-vprofile baseline" tells it to only use the most widely supported H.264 features, for maximum compatibility. The "-crf 1" tells it to encode nearly lossless (0 would be lossless, but isn't supported in the baseline profile; higher numbers are more lossy).

Sometimes you also need to tell it what pixel format to use, especially if you're encoding from an image sequence (I'll get to image sequences in a moment). You can do that with the pix_fmt option:
ffmpeg -i input_file -vcodec libx264 -vprofile baseline -crf 1 -pix_fmt yuv420p output_file.mp4

For fully lossless encoding, we drop the profile specification and use crf 0 and 4:4:4 chroma:

ffmpeg -i input_file -vcodec libx264 -crf 0 -pix_fmt yuv444p output_file.mp4

The resulting file, however, will not be widely supported; Apple's H.264 implementation, for example, can't handle it.
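
[editor's note: a common workflow, sketched here with placeholder filenames, is to keep that lossless file as a master and transcode it down to the widely compatible near-lossless version from above whenever you need to send a copy out:

ffmpeg -i lossless_master.mp4 -vcodec libx264 -vprofile baseline -crf 1 -pix_fmt yuv420p delivery_copy.mp4 ]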

You will also probably want to use AAC for your audio. Ffmpeg uses AAC by default with the mp4 container, but we can specify it manually to be certain:

ffmpeg -i input_file -vcodec libx264 -vprofile baseline -crf 1 -acodec libfaac output_file.mp4

And if we want to specify the bitrate of the audio (for example, 320kb/s):

ffmpeg -i input_file -vcodec libx264 -vprofile baseline -crf 1 -acodec libfaac -b:a 320k output_file.mp4

If you have separate video and audio source files, you can specify them both, and ffmpeg will merge them:

ffmpeg -i input_video_file -i input_audio_file -vcodec libx264 -vprofile baseline -crf 1 output_file.mp4

When I render anything from a 3D application (for example, Blender), I always render to an image sequence. When I'm ready for final encoding of the animation, I render to PNGs, which are lossless, and then use ffmpeg afterwards to manually encode them into a video file. To do this you need to tell ffmpeg where in the file names the frame number is. You do this with "%d" and some variants thereof.

If your files are named like this:
image_1.png
image_2.png
image_3.png
...
image_125.png

Then you specify the image sequence as "image_%d.png". The "%d" goes wherever the frame number is. Ffmpeg will then find all the files matching that pattern.

If your files are named like this:
image_0001.png
image_0002.png
image_0003.png
...
image_0125.png

Then you specify the image sequence as "image_%04d.png". The "04" (that's zero four) between the % and the d tells ffmpeg how many digits long the number is.

So, using this in an actual ffmpeg command-line:

ffmpeg -i image_%04d.png -vcodec libx264 -vprofile baseline -crf 1 output_file.mp4

The problem with image sequences, though, is that they contain no information about frame-rate. So we need to tell ffmpeg what frame-rate they are supposed to be played at. You must specify this _before_ the image sequence on the command-line. This, for example, would give the image sequence a frame-rate of 24fps:

ffmpeg -r 24 -i image_%04d.png -vcodec libx264 -vprofile baseline -crf 1 output_file.mp4

You can also specify the frame rate with decimals and fractions:

ffmpeg -r 29.97 ...
ffmpeg -r 30/1.001 ...
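
[editor's note: dropping the fractional rate into the full image-sequence command from above, with the same placeholder names, looks like this:

ffmpeg -r 30/1.001 -i image_%04d.png -vcodec libx264 -vprofile baseline -crf 1 output_file.mp4 ]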

If you plan to use the file for video editing, make sure to set the GOP (group of pictures) size to 1, which means that every frame will be encoded on its own without reference to other frames in the video (such frames are called "intra frames" or "I-frames"). This makes the file size much larger, but it means that a video editor can pull frames out at random very easily, which is good for scrubbing etc. You do this by adding "-g 1" to the command-line:

ffmpeg -i input_file -vcodec libx264 -vprofile baseline -crf 1 -g 1 output_file.mp4

So there you go, that's a quick-and-dirty tutorial on how I do my video encoding. Although, internally I usually use Matroska (http://matroska.org) as my container format instead of mp4. But... then again, I use an open-source pipeline, where Matroska is well supported. I always use mp4 containers when sending material to other people.
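
[editor's note: since ffmpeg picks the container from the output extension, writing to Matroska is just a matter of naming the output file accordingly; filenames here are placeholders:

ffmpeg -i input_file -vcodec libx264 -crf 1 output_file.mkv ]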

If you don't want to use command-line ffmpeg, you can use Handbrake (http://handbrake.fr), which is a cross-platform GUI-ified version of ffmpeg. It exposes most of these options, though sometimes they can be hard to find.
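
[editor's note: Handbrake also ships a command-line version, HandBrakeCLI, which, if memory serves, takes roughly this form, with -e choosing the encoder and -q the quality level; filenames are placeholders:

HandBrakeCLI -i input_file -o output_file.mp4 -e x264 -q 20 ]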

*Come to think of it, ffmpeg has an open-source ProRes decoder as well. So, for example, you could use ffmpeg to convert from ProRes to H.264 if you wanted to. I don't recall if it supports ProRes 422 yet, though. But presumably it will in the future if it doesn't already.
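
[editor's note: such a conversion is just another ffmpeg transcode, e.g., with a placeholder source name:

ffmpeg -i prores_source.mov -vcodec libx264 -vprofile baseline -crf 1 output_file.mp4 ]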
