When building a video editing pipeline, a tempting early design is: export each clip first, then stitch them together. It seems clean — each clip is independent, re-stitch whenever you want. But it creates painful friction: users must export every clip before stitching works, and a single missing export stalls the whole pipeline.
A better approach: cut segments directly from the source video at stitch time using ffmpeg -ss/-to.
Here's how to implement this pattern in a Node.js/TypeScript background job.
## The Problem with Pre-Exporting
If your stitch step requires `export_url` to be set on every clip:
```typescript
// ❌ Old pattern — every clip must be individually exported first
const missing = clips.filter((c) => !c.export_url);
if (missing.length > 0) {
  throw new Error(`${missing.length} clip(s) not yet exported.`);
}
```
Users get blocked: one un-exported clip breaks the whole flow. Worse, if clips were exported days ago and the storage URL expires — you're stuck again.
## Cut Directly from Source Instead
Each clip has start_sec and end_sec. The source video is either in cloud storage (e.g. Cloudflare R2) or a YouTube URL. At stitch time, fetch the source once and cut all segments from it.
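For reference, here is a minimal shape for those clip records — an assumption for illustration; a real schema likely carries more fields:

```typescript
// Hypothetical clip record — only the fields the stitch step needs.
interface Clip {
  id: string;
  start_sec: number;   // segment start within the source video, in seconds
  end_sec: number;     // segment end, in seconds
  export_url?: string; // optional now: stitching no longer requires it
}
```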
### 1. Download the source video
```typescript
import { promises as fs } from 'node:fs';
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';
import { GetObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const execFileAsync = promisify(execFile);

function isYouTubeUrl(str: string): boolean {
  return /^https?:\/\/(www\.)?(youtube\.com|youtu\.be)\//.test(str);
}

async function downloadYouTubeVideo(url: string, outputPath: string) {
  await execFileAsync('yt-dlp', [
    '--format', 'bestvideo[ext=mp4]+bestaudio[ext=m4a]/best[ext=mp4]/best',
    '--merge-output-format', 'mp4',
    '--output', outputPath,
    '--no-playlist',
    '--extractor-args', 'youtube:player_client=web,android',
    '--socket-timeout', '30',
    url,
  ], { timeout: 5 * 60 * 1000 }); // kill the download if it exceeds 5 minutes
}

async function downloadFromR2(r2Key: string, outputPath: string) {
  // r2Client / R2_BUCKET come from your app's S3-compatible client setup
  const url = await getSignedUrl(
    r2Client,
    new GetObjectCommand({ Bucket: R2_BUCKET, Key: r2Key }),
    { expiresIn: 3600 },
  );
  const res = await fetch(url);
  if (!res.ok) throw new Error(`R2 download failed: ${res.status}`);
  const buffer = await res.arrayBuffer();
  await fs.writeFile(outputPath, Buffer.from(buffer));
}
```
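Tying the two together, a small dispatcher can route each source to the right downloader. This is a sketch — the injected-function shape is mine, so the routing logic stays testable without hitting the network:

```typescript
// A download function: takes a source reference (URL or storage key)
// and writes the video to outputPath.
type Downloader = (ref: string, outputPath: string) => Promise<void>;

function isYouTubeUrl(str: string): boolean {
  return /^https?:\/\/(www\.)?(youtube\.com|youtu\.be)\//.test(str);
}

// Hypothetical dispatcher: YouTube URLs go through yt-dlp, everything
// else is fetched from R2 by storage key.
async function downloadSource(
  source: { url?: string; r2Key?: string },
  outputPath: string,
  fromYouTube: Downloader,
  fromR2: Downloader,
): Promise<void> {
  if (source.url && isYouTubeUrl(source.url)) {
    return fromYouTube(source.url, outputPath);
  }
  if (source.r2Key) {
    return fromR2(source.r2Key, outputPath);
  }
  throw new Error('Clip source has neither a YouTube URL nor an R2 key');
}
```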
### 2. Sort clips and build a concat list
Always sort by start_sec so the final video is chronological — don't rely on the order clips were selected:
```typescript
// Copy before sorting so the caller's array isn't mutated
const sorted = [...clips].sort((a, b) => a.start_sec - b.start_sec);

// Cut each segment from the source video
const segmentPaths: string[] = [];
for (const clip of sorted) {
  const segPath = path.join(tmpDir, `seg-${clip.id}.mp4`);
  await execFileAsync('ffmpeg', [
    '-y',
    '-ss', String(clip.start_sec),
    '-to', String(clip.end_sec),
    '-i', sourceVideoPath,
    '-c', 'copy', // stream copy — fast, lossless
    segPath,
  ]);
  segmentPaths.push(segPath);
}
```
### 3. Concatenate all segments
Use ffmpeg's concat demuxer:
```typescript
// The concat demuxer reads a text file listing one segment per line
const concatContent = segmentPaths.map((p) => `file '${p}'`).join('\n');
await fs.writeFile(concatPath, concatContent);

await execFileAsync('ffmpeg', [
  '-y',
  '-f', 'concat',
  '-safe', '0', // allow absolute paths in the concat list
  '-i', concatPath,
  '-c', 'copy',
  outputPath,
]);
```
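One edge case worth guarding: ffmpeg's quoting rules don't allow a literal `'` inside a single-quoted string, so if a segment path can ever contain one, the concat-list entry needs escaping. A small helper, as a sketch (`concatListEntry` is my name, not part of the pipeline):

```typescript
// Escape a path for an ffmpeg concat-list entry. A literal ' inside a
// quoted string is written by closing the quote, emitting an escaped
// quote, and reopening: it's.mp4 -> file 'it'\''s.mp4'
function concatListEntry(p: string): string {
  return `file '${p.replace(/'/g, `'\\''`)}'`;
}
```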
### 4. Use stable tmp paths for retries
If your background job framework (Inngest, BullMQ, etc.) retries steps, you want idempotent tmp file paths. Key them off the event/job ID:
```typescript
// Derive deterministic tmp paths from the event/job ID
const stableId = event.id.replace(/[^a-z0-9]/gi, '-').slice(0, 40);
const sourceVideoPath = path.join(os.tmpdir(), `stitch-source-${stableId}.mp4`);
const concatPath = path.join(os.tmpdir(), `stitch-concat-${stableId}.txt`);
const outputPath = path.join(os.tmpdir(), `stitch-out-${stableId}.mp4`);
```
If a step retries, it can check for the already-downloaded source file and skip the slow download.
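That skip is just an existence check before downloading. A sketch — `ensureSourceDownloaded` is a hypothetical helper, with the actual fetch (yt-dlp or R2) injected:

```typescript
import { promises as fs } from 'node:fs';

// Skip the slow download when a previous attempt already produced the file.
async function ensureSourceDownloaded(
  sourceVideoPath: string,
  download: () => Promise<void>,
): Promise<void> {
  try {
    const stat = await fs.stat(sourceVideoPath);
    if (stat.size > 0) return; // left behind by an earlier retry — reuse it
  } catch {
    // ENOENT: first attempt, fall through to download
  }
  await download();
}
```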
## Why `-c copy` is Key
-c copy tells ffmpeg to stream-copy audio and video without re-encoding. This makes cuts nearly instantaneous — just seeking and reading bytes. The tradeoff: cuts happen at keyframe boundaries, so you may get a frame or two of drift on segment edges. For most editing use cases (highlight reels, transcript-based clips) this is perfectly acceptable.
If you need frame-accurate cuts, drop -c copy and specify encoding params — but expect 5-20× slower processing.
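A re-encoding variant might look like this — the codec choices (`libx264`/`aac`, `veryfast`, CRF 20) are my assumptions, not settings from the original pipeline:

```typescript
// Frame-accurate cut: re-encode instead of stream copy (much slower).
function accurateCutArgs(
  src: string,
  startSec: number,
  endSec: number,
  out: string,
): string[] {
  return [
    '-y',
    '-ss', String(startSec),
    '-to', String(endSec),
    '-i', src,
    '-c:v', 'libx264', '-preset', 'veryfast', '-crf', '20', // re-encode video
    '-c:a', 'aac', // re-encode audio
    out,
  ];
}
```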
## Full Pattern Summary
- Store only `start_sec`/`end_sec` on each clip — no `export_url` required
- At stitch time: fetch the source video once (R2 or yt-dlp)
- Loop over sorted clips, cut each with `ffmpeg -ss/-to -c copy`
- Concat with `ffmpeg -f concat`
- Upload the result to storage, clean up tmp files
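The cleanup step can be a best-effort unlink — a sketch that swallows per-file errors so cleanup never fails the job:

```typescript
import { promises as fs } from 'node:fs';

// Best-effort tmp cleanup: a missing file or failed unlink is ignored.
async function cleanupTmpFiles(paths: string[]): Promise<void> {
  await Promise.all(paths.map((p) => fs.unlink(p).catch(() => {})));
}
```

Run it in a `finally` block so tmp files are removed whether the stitch succeeds or throws.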
This collapses a two-stage pipeline (export-then-stitch) into a single step, removes the pre-export bottleneck, and makes the stitch operation self-contained and retry-safe.
Building something with ffmpeg and Node.js? Drop your questions or war stories in the comments — happy to dig into edge cases.