Bruce Mbudi
When curl Fails You: Why aria2 is Your Download Hero

The Problem That (Almost) Ruined My Saturday Morning ☕

Picture this: you're trying to install Ollama on your Ubuntu machine. Should be simple, right? Just run the install script and boom, you're ready to chat with AI models locally. I've done it a million times across the VM instances in my homelab.

curl -fsSL https://ollama.com/install.sh | sh

But instead of a smooth installation, you get hit with this beauty:

>>> Downloading Linux amd64 bundle
########################                                          34.4%
curl: (92) HTTP/2 stream 1 was not closed cleanly: PROTOCOL_ERROR (err 1)
gzip: stdin: unexpected end of file
tar: Unexpected EOF in archive
tar: Unexpected EOF in archive
tar: Error is not recoverable: exiting now

And it happens. Again. And again. And AGAIN.

At first, I thought it was just a temporary network hiccup. You know how it is—sometimes the internet gods (in our country, Safaricom) are angry and you just need to wait a few minutes. But after the third failed attempt, I knew I was in for a long debugging session.

The Usual Suspects (That Didn't Work)

Like any good developer, I went down the rabbit hole of "obvious" fixes. Each failure taught me something new about how fragile download protocols can be.

Attempt 1: Force HTTP/1.1

"Maybe it's just an HTTP/2 issue," I thought. Surely forcing HTTP/1.1 would fix it:

curl -fsSL --http1.1 https://ollama.com/install.sh | sh

Result:

>>> Downloading Linux amd64 bundle
###########################                                       37.6%
curl: (92) HTTP/2 stream 1 was not closed cleanly: PROTOCOL_ERROR (err 1)

Still HTTP/2 errors even with --http1.1! My flag only applied to fetching install.sh itself; the curl calls inside the script made their own requests and ignored it.

Attempt 2: Download the script first

"Maybe I need to examine what the script is actually doing," I reasoned:

curl -fsSL https://ollama.com/install.sh -o ollama-install.sh
head -10 ollama-install.sh  # Looks legit
sudo bash ollama-install.sh

Result:

>>> Downloading Linux amd64 bundle
########################################################################  39.9%
curl: (92) HTTP/2 stream 1 was not closed cleanly: PROTOCOL_ERROR (err 1)
gzip: stdin: unexpected end of file
tar: Unexpected EOF in archive

Same error! The script itself was using curl internally and hitting the same protocol issues.

Attempt 3: Try the direct binary approach

At this point, I decided to bypass the script entirely and download the binary manually. First, I needed to find the actual download URL:

curl -I https://ollama.com/download/ollama-linux-amd64

Result:

HTTP/2 307  
location: https://github.com/ollama/ollama/releases/latest/download/ollama-linux-amd64

Ah! It redirects to GitHub. But when I tried the GitHub URL:

sudo curl -L "https://github.com/ollama/ollama/releases/download/v0.11.4/ollama-linux-amd64" -o /usr/local/bin/ollama

Result:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100     9  100     9    0     0      9      0  0:00:01 --:--:--  0:00:01     0

Only 9 bytes downloaded. When I checked the file:

cat /usr/local/bin/ollama
# Output: "Not Found"

The URL was returning a 404! Turns out the latest version packages the binary in a .tgz file, not as a direct binary.

Attempt 4: Download the correct archive

After some API investigation, I found the right download:

curl -s https://api.github.com/repos/ollama/ollama/releases/latest | grep -E '"tag_name"|"browser_download_url".*linux-amd64'

Result:

"tag_name": "v0.11.4",
"browser_download_url": "https://github.com/ollama/ollama/releases/download/v0.11.4/ollama-linux-amd64.tgz"
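(As an aside: if jq happens to be installed, the same API lookup is a bit less brittle than grep. A quick sketch, with jq as an added assumption on my part:)

```shell
# Same lookup with jq instead of grep (assumes jq is installed).
# Pull every asset URL from the latest release and keep the linux-amd64 one.
curl -s https://api.github.com/repos/ollama/ollama/releases/latest |
  jq -r '.assets[].browser_download_url | select(endswith("linux-amd64.tgz"))'
```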

Great! Now let's download the 1.2GB archive:

sudo curl -L "https://github.com/ollama/ollama/releases/download/v0.11.4/ollama-linux-amd64.tgz" -o /tmp/ollama.tgz

Result:

 39 1251M   39  500M    0     0  1572k      0  0:13:35  0:05:25  0:08:10 1745k
curl: (92) HTTP/2 stream 1 was not closed cleanly: PROTOCOL_ERROR (err 1)

500MB downloaded, then BOOM—same error! At least I was making progress, but this was getting ridiculous.

Attempt 5: Resume the download

"Fine," I thought, "I'll just resume where I left off":

sudo curl -L --http1.1 -C - "https://github.com/ollama/ollama/releases/download/v0.11.4/ollama-linux-amd64.tgz" -o /tmp/ollama.tgz

Result:

** Resuming transfer from byte position 524288000
 57  751M   57  430M    0     0  1456k      0  0:08:48  0:05:02  0:03:46 1722k
curl: (18) transfer closed with 337016947 bytes remaining to read

So close! But different error this time. The resume worked, but the connection still couldn't stay stable for the full download.
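In hindsight, one way to stay with curl here is to wrap the resume in a retry loop so each failure just picks up from the last byte. A minimal sketch (retry_resume, the attempt limit, and the sleep are my own illustration, not something I ran that morning):

```shell
#!/bin/sh
# Retry wrapper: rerun a command until it succeeds or we hit a limit.
retry_resume() {
  max=$1; shift          # $1 = max attempts; the rest is the command to run
  n=0
  until "$@"; do
    n=$((n + 1))
    if [ "$n" -ge "$max" ]; then
      echo "giving up after $max attempts" >&2
      return 1
    fi
    sleep 2              # brief pause before resuming
  done
}

# Usage (illustrative): resume the curl download until the archive completes.
# retry_resume 10 curl -L --http1.1 -C - -o /tmp/ollama.tgz \
#   "https://github.com/ollama/ollama/releases/download/v0.11.4/ollama-linux-amd64.tgz"
```

Because -C - asks curl to continue from the existing partial file, each pass through the loop only re-fetches what's still missing.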

Attempt 6: Try wget instead

"Maybe curl is just cursed today. Let's try wget":

sudo wget -c -O /tmp/ollama.tgz "https://github.com/ollama/ollama/releases/download/v0.11.4/ollama-linux-amd64.tgz"

Result:

--2025-08-16 04:13:15--  https://github.com/ollama/ollama/releases/download/v0.11.4/ollama-linux-amd64.tgz
Resolving github.com (github.com)... 20.87.245.0
Connecting to github.com (github.com)|20.87.245.0|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://github.com/ollama/ollama/releases/download/v0.11.4/ollama-linux-amd64 [following]
HTTP request sent, awaiting response... 404 Not Found
2025-08-16 04:13:16 ERROR 404: Not Found.

Even wget was getting confused by the redirects!

At this point, I was ready to throw my laptop out the window. A 1.2GB file should not be this hard to download in 2025! I had spent over an hour trying different approaches, and nothing was working consistently.

The Network Detective Work

Before giving up completely, I decided to dig deeper into what was actually happening. The consistent HTTP/2 protocol errors suggested this wasn't just bad luck—there was something fundamentally wrong with how my connection was handling the download.

Some theories I considered:

  • ISP throttling: Maybe my ISP was interfering with large downloads
  • DNS issues: Could be routing through problematic CDN nodes
  • Corporate firewall: Even though I was on home WiFi, maybe some proxy was involved
  • GitHub's CDN: Perhaps their content delivery network was having issues
  • Ubuntu's HTTP/2 implementation: Maybe there was a bug in my curl version

I tried a few diagnostic commands:

# Check curl version
curl --version  # curl 7.81.0 with HTTP/2 support

# Test basic connectivity  
ping -c 4 github.com  # Normal latency

# Check DNS resolution
nslookup github.com  # Resolving correctly

Everything looked normal from a network perspective, but the large file downloads kept failing in the same way. It was time to try a different approach entirely.

Enter aria2: The Download Superhero

Then I remembered aria2 exists. If you haven't heard of it, aria2 is like curl's older, wiser sibling who's been to the gym and learned some serious download kung fu.

sudo apt install aria2 -y

Here's the magic command that saved my Saturday:

aria2c -x 8 -s 8 -k 1M --max-tries=10 --retry-wait=2 \
  "https://github.com/ollama/ollama/releases/download/v0.11.4/ollama-linux-amd64.tgz" \
  -d /tmp/

Let me break down why this worked when everything else failed:

Why aria2 is Amazing

Multiple Connections (-x 8 -s 8)

Instead of downloading through one connection like curl, aria2 opens 8 parallel connections and splits the file into 8 segments. If one connection fails, the others keep going.

Smart Chunking (-k 1M)

Downloads in 1MB pieces. If a chunk fails, only that 1MB needs to be re-downloaded, not the entire file.
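Whenever a file is reassembled from chunks like this, I also like to check the result against a published checksum before trusting it. A small sketch (verify_sha256 is my own helper, and the checksum shown is just the SHA-256 of the string "hello\n" for demonstration; a real run would use the value from the project's release page):

```shell
# Compare a file's SHA-256 against an expected value.
verify_sha256() {
  file=$1; expected=$2
  actual=$(sha256sum "$file" | awk '{print $1}')
  [ "$actual" = "$expected" ]
}

# Demo with a known input.
printf 'hello\n' > /tmp/demo.bin
if verify_sha256 /tmp/demo.bin \
  5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03; then
  echo "checksum OK"
fi
```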

Built-in Retry Logic (--max-tries=10 --retry-wait=2)

When HTTP/2 streams get wonky, aria2 retries on its own: up to --max-tries attempts, waiting --retry-wait seconds (here 2) between them. No manual intervention needed.

Resume-by-Design

Unlike curl's -C - flag, which can be finicky, aria2 writes a small .aria2 control file next to the partial download and picks up exactly where it left off when you rerun the command.

Better Protocol Handling

aria2 seamlessly handles redirects, protocol downgrades, and server quirks that trip up other tools.

The Beautiful Result

Watching aria2 work was like poetry in motion:

[#d1ba64 1.1GiB/1.2GiB(96%) CN:8 DL:1.6MiB ETA:26s]
08/16 05:03:15 [NOTICE] Download complete: /tmp/ollama-linux-amd64.1.tgz

1.2GB downloaded in ~12 minutes (1.6MiB/s, definitely not the best) with zero babysitting. It handled redirects, renewed authentication tokens, and managed connection failures automatically.
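From there the install is just unpacking the archive. Roughly, following Ollama's manual-install instructions (the archive path matches the download above, but verify the paths against the current docs for your release):

```shell
# Remaining manual-install steps, sketched per Ollama's manual install docs.
ARCHIVE=/tmp/ollama-linux-amd64.tgz    # where the download landed
if [ -f "$ARCHIVE" ]; then
  sudo tar -C /usr -xzf "$ARCHIVE"     # unpacks the ollama binary and its libs under /usr
  ollama --version                     # sanity check
fi
```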

When Should You Reach for aria2?

Perfect for:

  • Large files (>100MB) where connection stability matters
  • Unreliable networks (coffee shop WiFi, anyone?)
  • Enterprise environments with proxy/firewall quirks
  • CI/CD pipelines downloading dependencies
  • Docker images and container layers
  • Any time curl gives you grief
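For scripts and pipelines, a pattern I like is a tiny fetch helper that prefers aria2 when it's installed and falls back to curl's own retry flags otherwise. A sketch (the fetch name and the flag choices are mine):

```shell
# fetch URL OUTFILE: use aria2c if available, else curl with retries.
fetch() {
  url=$1; out=$2
  if command -v aria2c >/dev/null 2>&1; then
    aria2c -x 8 -s 8 -k 1M --max-tries=10 --retry-wait=2 \
      -d "$(dirname "$out")" -o "$(basename "$out")" "$url"
  else
    curl -fL --retry 10 --retry-delay 2 -C - -o "$out" "$url"
  fi
}

# Usage (illustrative):
# fetch "https://github.com/ollama/ollama/releases/download/v0.11.4/ollama-linux-amd64.tgz" /tmp/ollama.tgz
```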

curl vs aria2 Quick Comparison:

  Feature                   curl            aria2
  Parallel connections      No              Yes
  Auto-retry logic          Basic           Advanced
  Resume reliability        Okay            Excellent
  Protocol error handling   Limited         Robust
  Progress reporting        Basic           Detailed
  Installation              Pre-installed   apt install aria2

Pro Tips for aria2 Mastery 💡

For CI/CD scripts:

# Download with timeout and specific user agent
aria2c --connect-timeout=30 --timeout=300 \
       --user-agent="MyApp/1.0" \
       --max-tries=5 \
       "$DOWNLOAD_URL"

For multiple files:

# Create a download list
echo "https://example.com/file1.zip" > downloads.txt
echo "https://example.com/file2.zip" >> downloads.txt
aria2c -i downloads.txt -j 4  # 4 concurrent downloads

For bandwidth control:

# Limit to 1MB/s to be nice to the server
aria2c --max-download-limit=1M "$URL"

The Moral of the Story

Sometimes the tool you think you need isn't the tool you actually need. curl is fantastic for APIs, quick downloads, and scripting. But when you need to download large files reliably over unreliable connections, aria2 is your friend.

Next time a download fails mysteriously, don't fight it—just aria2 it.


Have you had similar download nightmares? What's your go-to solution for problematic downloads? Drop a comment below!

