...in which the author creates entirely inappropriate amounts of eye candy by testing the ranges of most of the video generation options within ffmpeg - but also manages, as luck would have it, to identify some valuable testing tools in the process...
If you haven't started following Eyevinn's ffmpeg of the day series on Instagram (of which I am the author), perhaps you should! Anyway, the very first command that I looked at in the series was an example of the cellauto filter in ffmpeg. While trawling through the long, long list of ffmpeg filters available for audio and video, I noticed that video generators have their own quite sizeable section in the list. There are a great many of these generators available, and many of them could turn out to be useful for various purposes. Not only are there many separate generators in the list, but several of them are also extensible with code or plugins in a specific format. In this article, I will try to get a sense of what all of these generators can do by examining their various parameter ranges.
A caveat, first. I am on macOS, so I will be focusing on the filters that are readily available to me - not least the coreimagesrc generator, with its considerable list of built-in effects.
In fact, coreimagesrc seems like a pretty good place to start...
The coreimagesrc generator
Several of the generators we will look at are also available as more general filters, with some slight variation in options. Generators are more specific in that they are designed to come first in the filter chain, as inputs. This coreimagesrc generator has a more general counterpart in the coreimage filter, for example.
The first thing we want to do is identify which particular generators are available on our Mac:
ffmpeg -f lavfi -i coreimagesrc=list_generators=true null
which will give you a long list of generators with all their parameters. Let's grep it (notice the little pirouette of piping everything from stderr to stdout, which is necessary in order to pipe the ffmpeg output to grep successfully). We may as well cut the first part of each line too, for a nice tidy list:
ffmpeg -f lavfi -i coreimagesrc=list_generators=true 2>&1 | grep Filter: | cut -c 32-
which gives us the following list (in my case):
Filter: CIAttributedTextImageGenerator
Filter: CIAztecCodeGenerator
Filter: CIBarcodeGenerator
Filter: CICheckerboardGenerator
Filter: CICode128BarcodeGenerator
Filter: CIConstantColorGenerator
Filter: CILenticularHaloGenerator
Filter: CIMeshGenerator
Filter: CIPDF417BarcodeGenerator
Filter: CIQRCodeGenerator
Filter: CIRandomGenerator
Filter: CIRoundedRectangleGenerator
Filter: CIStarShineGenerator
Filter: CIStripesGenerator
Filter: CISunbeamsGenerator
Filter: CITextImageGenerator
Quite a mixed bag. How about a lenticular halo generator?
With many filters and/or generators in ffmpeg, you can just give the filter name and the default parameters will be used. That's only partially true with the coreimage generators; there are default parameters available, but all of these generators have one or more parameters with no default, so you will get an error if you try something like:
ffmpeg -f lavfi -i \
"coreimagesrc=s=600x600:\
filter=CILenticularHaloGenerator" \
-t 30 -pix_fmt yuv420p output.mov
which will give:
[coreimagesrc @ 0x7fa3fc205a00] Parsing of filters failed.
[lavfi @ 0x7fa3fb7045c0] Error initializing filter 'coreimagesrc' with args 's=600x600:filter=CILenticularHaloGenerator'
coreimagesrc=s=600x600:filter=CILenticularHaloGenerator: Input/output error
Let's examine the parameters of the lenticular halo generator in the list from earlier (we filtered them out last time):
ffmpeg -f lavfi -i coreimagesrc=list_generators=true 2>&1 | grep -A 8 'Filter: CILenticularHaloGenerator' | cut -c 32-
giving
Option: inputCenter [CIVector]
Option: inputColor [CIColor]
Option: inputHaloRadius [NSNumber] [0 1000][70]
Option: inputHaloWidth [NSNumber] [0 300][87]
Option: inputHaloOverlap [NSNumber] [0 1][0.77]
Option: inputStriationStrength [NSNumber] [0 3][0.5]
Option: inputStriationContrast [NSNumber] [0 5][1]
Option: inputTime [NSNumber] [0 1][0]
which is very useful in that we can see which parameters have defaults (the second pair of square brackets holds the default value). So at the very least we need a CIVector and a CIColor. Next problem: how do we represent a CIVector and a CIColor as parameters to the generator within an ffmpeg command? It took me a while to work this out, but with the help of the very sparse ffmpeg documentation, the ffmpeg source code and a quick look at Apple's Core Image documentation, I came up with the answer, as follows:
ffmpeg -f lavfi -i \
"coreimagesrc=s=600x600:\
filter=CILenticularHaloGenerator\
@inputCenter=200.0 200.0\
@inputColor=0.0 0.0 1.0" \
-t 5 -pix_fmt yuv420p output.mov
Which gives us five seconds of a rather nice sort of focusing iris effect:
Ok, how about a star shine generator:
ffmpeg -f lavfi -i \
"coreimagesrc=s=600x600:\
filter=CIStarShineGenerator\
@inputCenter=300.0 300.0\
@inputRadius=25\
@inputCrossAngle=0\
@inputColor=1.0 0.0 1.0" \
-t 5 -pix_fmt yuv420p starshine.mov
which yields:
The gradients generator
There are several more of these coreimagesrc generators to explore; maybe we'll come back to them in a bit. As there are so many more types of ffmpeg generator, let's move on for the moment and examine the gradients generator next. This is a powerful filter that can generate several different types of gradient, with random or designated colours, and it can even rotate them if you wish (merely a convenience, of course, as it's pretty trivial to add a rotation filter to any filter graph). An example:
ffmpeg -f lavfi -i \
gradients=duration=5\
:nb_colors=5:x0=320:y0=240\
:type=spiral:speed=0.1 \
-pix_fmt yuv420p gradients.mov
Here, in fact, it becomes apparent that rotation is very much meant to be part of the gradient effect - the lowest speed is not 0 but 1e-05, so it always rotates even at the lowest setting, just very slowly. You could always use the rotate filter in the opposite direction to keep it stationary, of course... but for that you would need to know exactly what that speed parameter represents. Here's a clue: in the ffmpeg source code, video sources are sensibly prefixed with vsrc_, so we can take a look in vsrc_gradients.c and see that the speed parameter is used to calculate an angle as follows:
float angle = fmodf(s->pts * s->speed, 2.f * M_PI);
So basically the angle is in radians and is a function of pts. Since pts counts output frames here, a speed of 0.1 at the default 25 fps works out to roughly 2.5 radians per second, or a full turn every two and a half seconds or so. We'll leave that topic there this time round, just as a little note on how to compensate for the occasional shortfalls in the documentation by taking a quick look at the source code. Anyway, this is what we got from the last command:
Certainly pleasing to the eye, but the back-and-forth partial rotation is a bit odd. I haven't yet found a combination of gradients parameters that makes a spiral gradient rotate around its centre (and I welcome suggestions), but I did notice that setting parameters x1 and y1 to the same values as x0 and y0 is a handy way to stop all rotation. After that we can simply apply the rotate filter instead, which looks... pretty good, I suppose:
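A command along these lines should do the trick - a sketch only, as the centre coordinates, rotation rate and fill colour here are illustrative guesses rather than the exact values used for the clip below:
ffmpeg -f lavfi -i \
"gradients=duration=10:type=spiral\
:x0=320:y0=240:x1=320:y1=240,\
rotate=a=t*PI/5:c=black" \
-pix_fmt yuv420p spiralRotate.mov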
When I'm going through all the options of a filter or generator to get a feel for them, I often want to see some kind of overview of all the options together. In those cases, I typically reach for JavaScript and build an ffmpeg command that way - one that results in either a grid showing all the different options or one option after another in a concatenated video. For example, here is a grid of all four gradient types in the gradients generator. In this case, some of the default options are used, so the colours are randomly chosen and will vary each time the video is generated:
Take a look at the output ffmpeg command to understand why you might want to offload things like this to a custom script:
ffmpeg -y -filter_complex \
"gradients=duration=10:type=spiral[out0];\
gradients=duration=10:type=radial[out1];\
gradients=duration=10:type=linear[out2];\
gradients=duration=10:type=circular[out3];\
[out0]drawtext=fontfile='fonts/crystal-radio-kit/crystal radio kit.ttf'\
:text=spiral:fontsize=35:x=(w-text_w)/2:y=(h-text_h)/10:fontcolor=black[text_out0];\
[out1]drawtext=fontfile='fonts/crystal-radio-kit/crystal radio kit.ttf'\
:text=radial:fontsize=35:x=(w-text_w)/2:y=(h-text_h)/10:fontcolor=black[text_out1];\
[out2]drawtext=fontfile='fonts/crystal-radio-kit/crystal radio kit.ttf'\
:text=linear:fontsize=35:x=(w-text_w)/2:y=(h-text_h)/10:fontcolor=black[text_out2];\
[out3]drawtext=fontfile='fonts/crystal-radio-kit/crystal radio kit.ttf'\
:text=circular:fontsize=35:x=(w-text_w)/2:y=(h-text_h)/10:fontcolor=black[text_out3];\
[text_out0][text_out1][text_out2][text_out3]xstack=inputs=4:grid=2x2" \
-t 10 allGradients.mp4
There is clearly a whole load of repetition, so this should be fairly easy to build and parameterise. Essentially this is all just string building, so we won't need any particular libraries for most of this script. We will need a way to call ffmpeg though - and ffmpeg will need to be present too, of course. To run the command we can use exec from Node's built-in child_process module.
We can see from a glance at the above command that we need:
- The start line:
ffmpeg -y -filter_complex
- A line for each gradient type with a uniquely-named output at the end:
gradients=duration=10:type=spiral[out0];\
- A line for each text label, with the matching input from the previous step and a uniquely-named output:
[out0]drawtext=fontfile='fonts/crystal-radio-kit/crystal radio kit.ttf'\
:text=spiral:fontsize=35:x=(w-text_w)/2:y=(h-text_h)/10:fontcolor=black[text_out0];\
- An xstack of the appropriate size that collects all of the uniquely-named drawtext outputs as its inputs.
Apart from that, there will be a few syntax issues to deal with, as regards adding line ends and quotation marks and then building the final command. When it's built, we can call exec() from child_process to invoke ffmpeg. I tend to build line breaks into the final command so that it is easy to debug in the console.
Without further ado here is the code for that:
// gradientsDemo.js
const { exec } = require("child_process");
// Set up variables used to construct lines
const allGradientsTypes = [
'spiral',
'radial',
'linear',
'circular',
];
const TEXT_FONT = '\'fonts/crystal-radio-kit/crystal radio kit.ttf\'';
const TEXT_POS = 'x=(w-text_w)/2:y=(h-text_h)/10';
const TEXT_COLOR = 'black';
const FONT_SIZE = 35;
const TIME = 10;
// Setup lines
let filterLinesStart= "ffmpeg -y -filter_complex \\";
let filterLinesGenerator = ""; // generator lines
let filterLinesText = ""; // text lines
let filterLinesXStackInputs = "";
let filterLinesStacker = "";
let filterLinesLast = ` -t ${TIME} -an -pix_fmt yuv420p gradients.mp4`;
// Loop through lines
allGradientsTypes.forEach((gradient, index, gradients) => {
filterLinesGenerator += `gradients=duration=${TIME}:type=${gradient}[out${index}];`;
filterLinesText += `[out${index}]drawtext=fontfile=${TEXT_FONT}\
:text=${gradient}:fontsize=${FONT_SIZE}:${TEXT_POS}:fontcolor=${TEXT_COLOR}[text_out${index}];`;
filterLinesXStackInputs += `[text_out${index}]`;
if (Object.is(gradients.length - 1, index)) {
// Finish up lines
filterLinesGenerator += `\\`;
filterLinesText += '\\';
filterLinesStacker = `xstack=inputs=${allGradientsTypes.length}:grid=${2}x${2}`;
filterLinesXStackInputs += ``;
} else {
filterLinesGenerator += `\\\n`;
filterLinesText += `\\\n`;
}
});
// Build command...
const command = `${filterLinesStart}
"${filterLinesGenerator}
${filterLinesText}
${filterLinesXStackInputs}${filterLinesStacker}"\\
${filterLinesLast}`;
console.info("Command is: ", command);
exec(command, (error, stdout, stderr) => {
if (error) {
console.log(`error: ${error.message}`);
return;
}
console.log(`stdout: ${stdout}`);
if (stderr) {
console.log(`stderr: ${stderr}`);
}
console.info(`Generated a file with ${allGradientsTypes.length} gradient demos in a ${2} by ${2} grid`);
return;
});
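If you want to try it yourself, save the script as gradientsDemo.js and run it with Node (this assumes ffmpeg is on your PATH, and you will want to point TEXT_FONT at a font that actually exists on your machine):
node gradientsDemo.js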
And as we get completely different colours each time, let's test that code and produce another instance of the video:
Already at this stage we are able to alter some of the text properties (and, as you have probably noticed, you will want a path to the font in question on your own machine). It's now fairly easy to add other parameters. For example, if the intention is to use these generated videos for analytical purposes, you may want to understand how the colours look for each of the gradients, so why not make the colours settable? By default, gradients picks just two random colours, but if we pass in the colour list as an array we can set the colour count parameter nb_colors too.
// gradientsDemo2.js
const { exec } = require("child_process");
// Set up variables used to construct lines
const allGradientsTypes = [
'spiral',
'radial',
'linear',
'circular',
];
// NEW - define the colours
const colours = [
'blue',
'red',
'yellow'
];
const TEXT_FONT = '\'fonts/crystal-radio-kit/crystal radio kit.ttf\'';
const TEXT_POS = 'x=(w-text_w)/2:y=(h-text_h)/10';
const TEXT_COLOR = 'black';
const FONT_SIZE = 35;
const TIME = 10;
// Setup lines
let filterLinesStart= "ffmpeg -y -filter_complex \\";
let filterLinesGenerator = ""; // generator lines
let filterLinesText = ""; // text lines
let filterLinesXStackInputs = "";
let filterLinesStacker = "";
let filterLinesLast = ` -t ${TIME} -an -pix_fmt yuv420p gradients2.mp4`;
// NEW - Build colours list (and set nb_colors so the filter actually uses them all)
let colourParams = `:nb_colors=${colours.length}`;
colours.forEach((colour, index) => {
colourParams += `:c${index}=${colour}`;
});
// Loop through lines
allGradientsTypes.forEach((gradient, index, gradients) => {
filterLinesGenerator += `gradients=duration=${TIME}:type=${gradient}${colourParams}[out${index}];`; // NEW - add colours to command
filterLinesText += `[out${index}]drawtext=fontfile=${TEXT_FONT}\
:text=${gradient}:fontsize=${FONT_SIZE}:${TEXT_POS}:fontcolor=${TEXT_COLOR}[text_out${index}];`;
filterLinesXStackInputs += `[text_out${index}]`;
if (Object.is(gradients.length - 1, index)) {
// Finish up lines
filterLinesGenerator += `\\`;
filterLinesText += '\\';
filterLinesStacker = `xstack=inputs=${allGradientsTypes.length}:grid=${2}x${2}`;
filterLinesXStackInputs += ``;
} else {
filterLinesGenerator += `\\\n`;
filterLinesText += `\\\n`;
}
});
// Build command...
const command = `${filterLinesStart}
"${filterLinesGenerator}
${filterLinesText}
${filterLinesXStackInputs}${filterLinesStacker}"\\
${filterLinesLast}`;
console.info("Command is: ", command);
exec(command, (error, stdout, stderr) => {
if (error) {
console.log(`error: ${error.message}`);
return;
}
console.log(`stdout: ${stdout}`);
if (stderr) {
console.log(`stderr: ${stderr}`);
}
// NEW - Show the colours
console.info(`Generated a file with ${allGradientsTypes.length} gradient demos in a ${2} by ${2} grid with colours: ${colours}`);
return;
});
This, to me, makes it a lot easier to understand how each gradient uses the colours it is given... well, I'm sure code like this will be useful again soon. Perhaps we should move on to the next generator - mandelbrot.
The mandelbrot generator
This filter will take you right back to the 80s or 90s with its classic (and slightly cheesy) psychedelic effects. Here's one example of a rapid descent into the world of chaos fractals:
ffmpeg -f lavfi -i \
mandelbrot=end_pts=100 \
-pix_fmt yuv420p -t 30 mandelbrot1.mov
(Here it is the end_pts value that speeds up the journey.)
Let's do something similar to last time in JavaScript to explore the different shading presets for the inner parameter:
const { exec } = require("child_process");
// Set up variables used to construct lines
const allInnerPresets = [
'black',
'convergence',
'mincol',
'period',
];
const TEXT_FONT = '\'fonts/crystal-radio-kit/crystal radio kit.ttf\'';
const TEXT_POS = 'x=(w-text_w)/2:y=(h-text_h)/10';
const TEXT_COLOR = 'black';
const FONT_SIZE = 35;
const TIME = 30;
// Setup lines
let filterLinesStart= "ffmpeg -y -filter_complex \\";
let filterLinesGenerator = ""; // generator lines
let filterLinesText = ""; // text lines
let filterLinesXStackInputs = "";
let filterLinesStacker = "";
let filterLinesLast = ` -t ${TIME} -an -pix_fmt yuv420p mandelbrotDemo.mp4`;
// Loop through lines
allInnerPresets.forEach((innerPreset, index, allInnerPresets) => {
filterLinesGenerator += `mandelbrot=inner=${innerPreset}:end_pts=100[out${index}];`;
filterLinesText += `[out${index}]drawtext=fontfile=${TEXT_FONT}\
:text=${innerPreset}:fontsize=${FONT_SIZE}:${TEXT_POS}:fontcolor=${TEXT_COLOR}[text_out${index}];`;
filterLinesXStackInputs += `[text_out${index}]`;
if (Object.is(allInnerPresets.length - 1, index)) {
// Finish up lines
filterLinesGenerator += `\\`;
filterLinesText += '\\';
filterLinesStacker = `xstack=inputs=${allInnerPresets.length}:grid=${2}x${2}`;
filterLinesXStackInputs += ``;
} else {
filterLinesGenerator += `\\\n`;
filterLinesText += `\\\n`;
}
});
// Build command...
const command = `${filterLinesStart}
"${filterLinesGenerator}
${filterLinesText}
${filterLinesXStackInputs}${filterLinesStacker}"\\
${filterLinesLast}`;
console.info("Command is: ", command);
exec(command, (error, stdout, stderr) => {
if (error) {
console.log(`error: ${error.message}`);
return;
}
console.log(`stdout: ${stdout}`);
if (stderr) {
console.log(`stderr: ${stderr}`);
}
console.info(`Generated a file with ${allInnerPresets.length} mandelbrot demos in a ${2} by ${2} grid`);
return;
});
This yields even more trippiness:
As a next step, we could use a similar process to handle the outer parameter, for example by using a nested loop over both preset lists. This is also something we might return to later, but there's a quick sketch of the idea just below.
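Here is a rough, untested sketch of that nested-loop idea. It follows the same structure as the scripts above, and as before the font path is an assumption - point it at any font you have locally:
// mandelbrotGrid.js - sketch: one grid cell per inner/outer preset combination
const { exec } = require("child_process");
const innerPresets = ['black', 'convergence', 'mincol', 'period'];
const outerPresets = ['iteration_count', 'normalized_iteration_count', 'white', 'outz'];
const TEXT_FONT = '\'fonts/crystal-radio-kit/crystal radio kit.ttf\'';
const TEXT_POS = 'x=(w-text_w)/2:y=(h-text_h)/10';
const FONT_SIZE = 20;
const TIME = 10;
let generators = "";
let texts = "";
let stackInputs = "";
let index = 0;
// Nested loop: every inner preset paired with every outer preset
innerPresets.forEach((inner) => {
  outerPresets.forEach((outer) => {
    generators += `mandelbrot=s=320x240:inner=${inner}:outer=${outer}:end_pts=100[out${index}];`;
    texts += `[out${index}]drawtext=fontfile=${TEXT_FONT}:text=${inner}-${outer}:fontsize=${FONT_SIZE}:${TEXT_POS}:fontcolor=white[text_out${index}];`;
    stackInputs += `[text_out${index}]`;
    index++;
  });
});
// 4 inner presets x 4 outer presets -> a 4x4 grid of 320x240 cells
const command = `ffmpeg -y -filter_complex "${generators}${texts}${stackInputs}xstack=inputs=${index}:grid=4x4" -t ${TIME} -an -pix_fmt yuv420p mandelbrotGrid.mp4`;
console.info("Command is: ", command);
exec(command, (error, stdout, stderr) => {
  if (error) {
    console.log(`error: ${error.message}`);
    return;
  }
  if (stderr) {
    console.log(`stderr: ${stderr}`);
  }
  console.info(`Generated ${index} mandelbrot inner/outer combinations in a 4x4 grid`);
});
Moving on though, we can use very similar code for the next filter, which is mptestsrc.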
The mptestsrc filter
mptestsrc is yet another option for generating useful test patterns and is apparently based on the MPlayer test filter. It only generates patterns of size 256x256 and has an option to cycle through all of them, which is handy (and otherwise something we might do using the concat filter):
ffmpeg -f lavfi -i \
mptestsrc=d=40\
:max_frames=100 \
-pix_fmt yuv420p mptestsrc.mov
That gives us 4 seconds of each test source (max_frames=100 frames per test at the default 25 fps):
It becomes apparent that the size is in fact different for some of the tests, which could be useful to know. We could try applying a similar approach to the above JavaScript, but using this list of tests instead:
const allPresets = [
'dc_luma',
'dc_chroma',
'freq_luma',
'freq_chroma',
'amp_luma',
'amp_chroma',
'cbp',
'mv',
'ring1',
'ring2'
];
which produces:
The mysterious section '15.10'...
We could theoretically do a similar thing with another group of generators - basically another list of test sources. The group doesn't seem to have a name of its own; at the time of writing it simply sits under section 15.10 of the ffmpeg filters documentation. This is the list:
const allPresets = [
'allrgb',
'allyuv',
'color',
'colorchart',
'colorspectrum',
'haldclutsrc',
'nullsrc',
'pal75bars',
'pal100bars',
'rgbtestsrc',
'smptebars',
'smptehdbars',
'testsrc',
'testsrc2',
'yuvtestsrc'
];
However, in this particular case several of the sources are differently sized and best viewed at full resolution, so we can't really present them all at once (well, I'm sure we could really, and maybe we will, although not right now). This is an example that is probably best suited to the approach of concatenating the test source outputs (much like the mptestsrc filter does). That, too, is an exercise for a later article. Here are a couple of the available test sources though:
ffmpeg -f lavfi -i \
allrgb \
-update 1 -frames:v 1 testSrc1.png
(this one is static so I present it as an image here).
ffmpeg -f lavfi -i \
testsrc \
-pix_fmt yuv420p -t 6 testSrc2.mov
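Just to give a flavour of the concat approach mentioned above, here is a minimal sketch using three of the sources that accept a size option (the sources, sizes and durations are picked arbitrarily):
ffmpeg -y -filter_complex \
"smptebars=duration=5:size=640x480[v0];\
rgbtestsrc=duration=5:size=640x480[v1];\
testsrc2=duration=5:size=640x480[v2];\
[v0][v1][v2]concat=n=3:v=1:a=0" \
-pix_fmt yuv420p concatDemo.mp4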
Various other generators...
So we have several more generators to explore... Just to round things off nicely, here are a couple more. There is an implementation of Conway's Game of Life as a video source, in which you can alter various parameters:
ffmpeg -f lavfi -i \
life=ratio=0.1:death_color=#FF0000:\
life_color=#00FF00:mold_color=yellow:mold=1:s=400x400 \
-t 30 life3.mp4
And there are also some other, simpler fractal generators - the sierpinski and cellauto test sources shown here:
All three of the above have many parameters to test out, so they would be ideal to present with different parameters in an xstack demo. Also something to explore in the future.
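In the meantime, here is a minimal sketch of what such a demo could look like for sierpinski and cellauto (the rules, types and sizes are arbitrary choices, not recommendations):
ffmpeg -y -filter_complex \
"cellauto=rule=30:s=320x240[v0];\
cellauto=rule=110:s=320x240[v1];\
sierpinski=type=carpet:s=320x240[v2];\
sierpinski=type=triangle:s=320x240[v3];\
[v0][v1][v2][v3]xstack=inputs=4:grid=2x2" \
-t 10 -pix_fmt yuv420p fractalGrid.mp4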
Generators with plug-in capability
Beyond that, we still have two larger generators yet to explore. These are the openclsrc generator, which is a way of using OpenCL code to generate video sources, and the frei0r_src generator, which enables using frei0r scripts as video generators. As you might expect, there is a lot to examine there - and that is what we'll do in a later article.
Finally, all of the videos in this article were converted to gifs with code such as:
ffmpeg -t 6 -i testSrc2.mov \
-vf "fps=10,scale=320:-1:flags=lanczos,split[s0][s1];[s0]palettegen[p];[s1][p]paletteuse" \
-loop 0 testSrc2.gif
Alan Allard is a developer at Eyevinn Technology, Europe's leading independent consultancy firm specializing in video technology and media distribution.
If you need assistance in the development and implementation of this, our team of video developers are happy to help out. If you have any questions or comments, just drop us a line in the comments section of this post.