If you run a video processing worker that downloads YouTube content with yt-dlp, you've probably hit the n-challenge wall: downloads stall or fail with a "Sign in to confirm you're not a bot" error, or the n parameter isn't decoded correctly, leaving you with throttled or broken streams. After a few rounds of debugging our Railway-hosted worker, here's everything we fixed and why.
## Background: what is the n-challenge?
YouTube attaches a per-request throttling parameter called `n` to video URLs. yt-dlp has to solve a JavaScript challenge (embedded in the YouTube player) to decode this parameter into a valid token. If the challenge isn't solved, requests are throttled to ~50 KB/s or rejected outright.
The solver is written in JavaScript, and yt-dlp ships with multiple backends for running it: a pure-Python fallback, and native JS runtimes (Node.js, Deno, PhantomJS). The JS path is dramatically faster and more reliable — but only if the runtime is actually available and the yt-dlp installation includes the EJS scripts that invoke it.
We had three separate bugs causing failures. Here's each one.
## Bug 1: Installing yt-dlp as a curl binary skips the EJS scripts
The most common Docker pattern for yt-dlp is to grab the standalone binary from GitHub releases:
```dockerfile
RUN curl -L https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp \
    -o /usr/local/bin/yt-dlp \
    && chmod a+rx /usr/local/bin/yt-dlp
```
This works for basic downloads, but the standalone binary bundles only a subset of yt-dlp's assets. The EJS (Embedded JavaScript) scripts that yt-dlp uses to run the n-challenge solver in an external JS runtime are not included in the single-file binary. They live in the yt_dlp/ package directory and are only present when you install via pip.
The fix is to install with pip using the `[default]` extra, which pulls in all optional dependencies, including the JS solver scripts:

```dockerfile
RUN pip3 install --break-system-packages "yt-dlp[default]"
```
The --break-system-packages flag is required on Debian/Ubuntu images that use externally-managed Python environments (PEP 668). Without it, pip refuses to install into the system Python on newer base images. This flag is safe here because we're in a container — there's no system package manager conflict to worry about.
Why `[default]` and not plain `yt-dlp`? The `[default]` extra includes:

- `brotli` — support for Brotli-compressed responses
- `certifi` — updated CA bundle
- `mutagen` — audio metadata writing
- `pycryptodomex` — AES decryption for some streams
- `websockets` — live stream support
None of these are strictly required for n-challenge solving, but they prevent silent fallbacks to slower code paths.
## Bug 2: No JS runtime available for the n-challenge solver
Once the EJS scripts are present, yt-dlp needs a JS runtime to execute them. Our base image (node:20-slim) has Node.js, but we wanted a second option that's more isolated and doesn't require the full Node module resolution chain.
We added Deno 2.x as the JS runtime:
```dockerfile
RUN curl -fsSL https://deno.land/install.sh \
    | DENO_INSTALL=/usr/local sh \
    && deno --version
```
Setting `DENO_INSTALL=/usr/local` puts the deno binary at /usr/local/bin/deno, which is already on PATH, so no extra PATH entry for ~/.deno/bin is needed.
yt-dlp auto-detects available JS runtimes at runtime in this priority order: deno, node, phantomjs. With Deno present, it takes priority. Deno's V8 sandbox also means the n-challenge JS executes with no filesystem or network access by default — a minor but real security improvement over running arbitrary YouTube JS in Node.
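The detection logic amounts to "probe each candidate, take the first that works." Here's a TypeScript sketch of that idea (our illustration, not yt-dlp's actual implementation), which you can also use in a worker health check to confirm a runtime is really reachable:

```typescript
import { spawnSync } from "node:child_process";

// Probe each candidate runtime in priority order and return the first
// binary that responds successfully to --version, or null if none do.
const RUNTIME_PRIORITY = ["deno", "node", "phantomjs"];

function detectJsRuntime(candidates: string[] = RUNTIME_PRIORITY): string | null {
  for (const bin of candidates) {
    const probe = spawnSync(bin, ["--version"], { stdio: "ignore" });
    if (probe.status === 0) {
      return bin;
    }
  }
  return null;
}
```

Logging the detected runtime at container start makes it obvious when an image rebuild accidentally drops Deno and the solver silently falls back to a slower path.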
The full updated system deps block:
```dockerfile
RUN apt-get update && apt-get install -y \
    ffmpeg python3 python3-pip curl ca-certificates unzip \
    --no-install-recommends \
    && rm -rf /var/lib/apt/lists/* \
    && pip3 install --break-system-packages "yt-dlp[default]" \
    && yt-dlp --version \
    && curl -fsSL https://deno.land/install.sh | DENO_INSTALL=/usr/local sh \
    && deno --version
```
Note that `unzip` is required by the Deno installer.
## Bug 3: `player_client=ios` silently drops cookies
This one was subtle. Our yt-dlp invocation was passing --extractor-args youtube:player_client=ios to use the iOS player client, which historically bypassed some bot detection. But it has a critical flaw: the iOS client ignores cookies.
YouTube's iOS API endpoint doesn't use browser cookie authentication — it uses OAuth tokens. When you pass --cookies with a Netscape-format cookies file (the kind you export from a browser), the iOS client simply ignores it. For age-restricted videos, private videos, or accounts that have verified status, this means your cookies are silently dropped and you get a public-access response (or a rejection).
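For reference, the Netscape format that `--cookies` expects is seven tab-separated fields per line. This minimal parser sketch (ours, not yt-dlp's code) shows exactly what the iOS client was silently discarding:

```typescript
// One parsed line of a Netscape-format cookies file. Field order:
// domain, include-subdomains flag, path, secure flag,
// expiry (Unix seconds, 0 = session cookie), name, value.
interface Cookie {
  domain: string;
  includeSubdomains: boolean;
  path: string;
  secure: boolean;
  expires: number;
  name: string;
  value: string;
}

function parseNetscapeCookieLine(line: string): Cookie | null {
  // Comment and blank lines carry no cookie data.
  if (line.startsWith("#") || line.trim() === "") return null;
  const fields = line.split("\t");
  if (fields.length !== 7) return null; // malformed line
  const [domain, sub, path, secure, expires, name, value] = fields;
  return {
    domain,
    includeSubdomains: sub === "TRUE",
    path,
    secure: secure === "TRUE",
    expires: Number(expires),
    name,
    value,
  };
}
```

Every one of those session cookies is simply never sent when the iOS client is selected, which is why the failure looks like an auth problem rather than a configuration problem.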
The fix is to switch to clients that actually honor cookies:
```javascript
// Before
"--extractor-args",
"youtube:player_client=ios",

// After
"--extractor-args",
"youtube:player_client=web,mweb,android",
```
The `web` client is the standard browser client and is fully compatible with cookie authentication. `mweb` is the mobile web client, useful as a fallback. `android` is the Android app client, which helps bypass some rate limits while still respecting cookies.
yt-dlp tries them in order and uses the first one that succeeds. In practice, web handles the vast majority of cases, and mweb/android serve as automatic fallbacks without any extra code.
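In our worker the argv is assembled in one helper so the client list and cookie flag can't drift apart. A sketch of that helper (the function name and paths are ours; only the flags shown in this post are real yt-dlp options):

```typescript
// Hypothetical helper: build the yt-dlp argv with the cookie-aware
// client list, appending --cookies only when a cookies file is given.
function buildYtdlpArgs(url: string, cookiesPath?: string): string[] {
  const args = [
    "--extractor-args",
    "youtube:player_client=web,mweb,android",
  ];
  if (cookiesPath) {
    // Honored by web/mweb/android; the old ios client ignored this.
    args.push("--cookies", cookiesPath);
  }
  args.push(url);
  return args;
}

// Usage (Node): spawn("yt-dlp", buildYtdlpArgs(videoUrl, "/data/cookies.txt"))
```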
## Keeping yt-dlp current without rebuilding
YouTube frequently updates the player JS, which changes the n-challenge algorithm. A yt-dlp binary that was current two weeks ago may fail today. We handle this by self-updating on every container start:
```dockerfile
CMD ["sh", "-c", "yt-dlp -U && node --max-old-space-size=200 dist/server.js"]
```
`yt-dlp -U` checks for a newer release and upgrades in place if one exists. This adds ~2-3 seconds to cold start, but it means you never need to rebuild the image just to pick up a YouTube extractor fix.
## Summary of changes
| What | Before | After |
|---|---|---|
| yt-dlp install | curl binary from GitHub | `pip install "yt-dlp[default]"` |
| JS runtime | None (Python fallback) | Deno 2.x + Node.js |
| `player_client` | `ios` | `web,mweb,android` |
| Cookie support | Silently broken | Working |
| EJS scripts | Missing | Included via pip package |
If you're running yt-dlp in Docker and seeing mysterious throttling, bot-detection failures, or cookies that seem to be ignored, this combination of fixes is likely what you need. The pip install + Deno runtime is especially easy to overlook when the standalone binary appears to work fine for public videos.