I’ve been trying to understand some of the ffmpeg libraries well enough to use them for decoding audio in a personal application I’m working on. Documentation and learning resources are sparse, inconsistent, and often outdated. Here I try to maintain a list of resources and information I’ve learned about the library.
These notes will be updated as I learn more about ffmpeg.
A problem I’ve encountered with ffmpeg is that many of the articles and open-source projects found online target APIs that are already outdated. Modern applications like MPC-HC and Chromium do seem to track the latest versions of the APIs. I also quickly learned that most of the learning happens by reading the examples shipped with ffmpeg, as well as the ffplay source. Some resources I’ve found useful:
- the updated version of the ffmpeg tutorial hosted here
- the ffmpeg examples
- a post on gamedev
- the mpc-hc source
- the chromium source
- C++11 example by tomaka17
- ffmpeg + libao example
One particular thing about decoding audio is that frames generally come out in whatever sample format was used during the encoding process. This could be PCM float planar format, for example, where each sample is a float and each channel is stored in a separate buffer. However, when you want to use the audio or play it through speakers, the format required can differ from the format it was decoded to. Fortunately ffmpeg has libswresample, which makes these conversions easy. There is also a higher-level abstraction of this in libavfilter.
- commit e96175ad that adds avfilter support to ffplay
- libswresample’s resampling audio example
- libav’s libavresample