Browser features such as SharedArrayBuffer and WebAssembly have recently become widely available, and the video/audio converter FFmpeg now also works in browsers.
So I experimented with building video editing software that runs entirely in the browser.
It is built from a few basic materials: images, videos, audio, and text.
To display these materials on screen, each one is wrapped in a container holding information such as its playback position, length, and screen position; this container is called a strip.
The strips are then converted into the final video in a way that suits each material.
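A strip can be sketched as a plain object plus a lookup of which strips are active at a given timeline position. The field and function names below are illustrative, not the actual implementation:

```javascript
// A minimal sketch of a "strip" (names/fields are assumptions for illustration).
// start:  playback position on the timeline (seconds)
// length: duration of the strip (seconds)
// x, y:   screen position of the material
function createStrip(material, start, length, x, y) {
  return { material, start, length, x, y };
}

// Strips that should be visible/audible at timeline time t.
function activeStrips(strips, t) {
  return strips.filter((s) => t >= s.start && t < s.start + s.length);
}

const strips = [
  createStrip("video.mp4", 0.26, 5.0, 0, 0),
  createStrip("audio.mp3", 5.715, 3.0, 0, 0),
];
console.log(activeStrips(strips, 1.0).length); // only the first strip is active at t = 1.0s
```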
FFmpeg's inputs here are image sequences or videos; if you want to add text, you first have to generate an image containing that text.
So everything (images and videos alike) is first converted to an image sequence and then passed to FFmpeg. I used three.js as the renderer: three.js can handle an image as a Texture and a video as a VideoTexture.
The image capture step uses CCapture to ensure that every playback frame is converted to an image; CCapture then encodes all the frame images into a WebM video.
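The capture step walks the timeline frame by frame rather than in real time. The timing math can be sketched as below; the fps value and helper names are assumptions, and the renderer/CCapture calls are shown only as comments since they need a browser:

```javascript
// Map the timeline onto discrete frames for image-sequence capture.
// FPS and the function names are assumptions for illustration.
const FPS = 60;

// Total number of frames to capture for a project of the given duration (s).
function frameCount(durationSec, fps = FPS) {
  return Math.ceil(durationSec * fps);
}

// Timeline time (s) at which frame i should be rendered before capture.
function frameTime(i, fps = FPS) {
  return i / fps;
}

// Capture loop (browser-only calls shown as comments):
// for (let i = 0; i < frameCount(total); i++) {
//   seekAllStripsTo(frameTime(i));          // update the three.js scene to this time
//   renderer.render(scene, camera);
//   capturer.capture(renderer.domElement);  // CCapture grabs the canvas as one frame
// }
console.log(frameCount(2.5)); // 150 frames at 60 fps
```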
Next, merge the image sequence with the audio. This is the FFmpeg part.
For video strips, extract the relevant range of audio from the video.
For audio strips, trim the audio to the strip's range.
Then use filter_complex to mix all the audio streams, each delayed to its strip's start time.
```
-i _vega_video.webm -i video.mp3 -i audio.mp4 \
  -filter_complex "[2:a]adelay=5715|5715[out2];[1:a]adelay=260|260[out1];[out1][out2]amix=inputs=2[out]" \
  -map [out]:a
```
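The adelay/amix graph above can be generated from the strips' start times. A sketch (the function name is illustrative; delays are in milliseconds, as adelay expects, and audio inputs start at index 1 because input 0 is the captured video):

```javascript
// Build an ffmpeg -filter_complex string that delays each audio input to its
// strip's start time and mixes them all. Names are assumptions, not the actual code.
function buildAudioFilter(startTimesSec) {
  const labels = [];
  const filters = startTimesSec.map((start, i) => {
    const ms = Math.round(start * 1000);
    const label = `out${i + 1}`;
    labels.push(`[${label}]`);
    // adelay takes one delay per channel, separated by "|" (stereo here)
    return `[${i + 1}:a]adelay=${ms}|${ms}[${label}]`;
  });
  return `${filters.join(";")};${labels.join("")}amix=inputs=${labels.length}[out]`;
}

console.log(buildAudioFilter([0.26, 5.715]));
// [1:a]adelay=260|260[out1];[2:a]adelay=5715|5715[out2];[out1][out2]amix=inputs=2[out]
```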
This is the overall flow diagram.
- Video/Audio Converter
- Capture canvas
- 3D library (as Renderer)
- Audio visualization
- Monospace font
- Design System
- Test Asset
I also created UI components to use for this. If you like it or are interested, check those out too.