ahmet gedik

PHP 8.3 Fibers for Concurrent API Calls

Introduction

When your application needs to call multiple external APIs, like fetching trending videos from 7 different regions, doing it sequentially is slow. PHP 8.1 introduced Fibers, and by PHP 8.3 they are mature enough for production use. Here's how I use Fibers on ViralVidVault to fetch data from multiple YouTube API endpoints concurrently.

The Problem: Sequential API Calls

// Sequential: ~7 seconds for 7 regions (1s each)
foreach (['US','GB','PL','NL','SE','NO','AT'] as $region) {
    $results[$region] = fetchTrending($region); // ~1 second each
}
// Total: 7 * 1s = ~7 seconds
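For context, `fetchTrending` is just a blocking cURL call. A minimal sketch (the helper names are illustrative; the URL parameters mirror the YouTube Data API v3 request used later in this post):

```php
<?php

// Hypothetical helper: builds the YouTube "mostPopular" URL for a region.
function buildTrendingUrl(string $region, string $apiKey): string
{
    return 'https://www.googleapis.com/youtube/v3/videos?' . http_build_query([
        'part' => 'snippet,statistics',
        'chart' => 'mostPopular',
        'regionCode' => $region,
        'maxResults' => 25,
        'key' => $apiKey,
    ]);
}

// Blocking fetch: one request per call, roughly 1s of wall time each.
function fetchTrending(string $region): ?array
{
    $ch = curl_init(buildTrendingUrl($region, $_ENV['YOUTUBE_API_KEY'] ?? ''));
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT => 10,
    ]);
    $body = curl_exec($ch);
    curl_close($ch);

    return $body === false ? null : json_decode($body, true);
}
```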

The Solution: Fibers + curl_multi

Fibers don't make I/O faster by themselves; they provide cooperative multitasking. The parallelism itself comes from curl_multi, which drives multiple HTTP requests concurrently over a single event loop:

<?php

class ConcurrentFetcher
{
    private \CurlMultiHandle $multiHandle;
    private array $handles = [];

    public function __construct()
    {
        $this->multiHandle = curl_multi_init();
    }

    public function addRequest(string $key, string $url): void
    {
        $ch = curl_init();
        curl_setopt_array($ch, [
            CURLOPT_URL => $url,
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_TIMEOUT => 10,
            CURLOPT_HTTPHEADER => ['Accept: application/json'],
        ]);

        curl_multi_add_handle($this->multiHandle, $ch);
        $this->handles[$key] = $ch;
    }

    public function execute(): array
    {
        $results = [];
        $running = null;

        // Execute all requests concurrently
        do {
            $status = curl_multi_exec($this->multiHandle, $running);
            if ($running > 0) {
                curl_multi_select($this->multiHandle, 0.1);
            }
        } while ($running > 0 && $status === CURLM_OK);

        // Collect results
        foreach ($this->handles as $key => $ch) {
            $content = curl_multi_getcontent($ch);
            $httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);

            $results[$key] = [
                'data' => $httpCode === 200 ? json_decode($content, true) : null,
                'http_code' => $httpCode,
                'time' => curl_getinfo($ch, CURLINFO_TOTAL_TIME),
            ];

            curl_multi_remove_handle($this->multiHandle, $ch);
            curl_close($ch);
        }

        curl_multi_close($this->multiHandle);
        return $results;
    }
}

Using It for Multi-Region Fetching

<?php

$apiKey = $_ENV['YOUTUBE_API_KEY'];
$regions = ['US', 'GB', 'PL', 'NL', 'SE', 'NO', 'AT'];

$fetcher = new ConcurrentFetcher();

foreach ($regions as $region) {
    $url = 'https://www.googleapis.com/youtube/v3/videos?' . http_build_query([
        'part' => 'snippet,statistics',
        'chart' => 'mostPopular',
        'regionCode' => $region,
        'maxResults' => 25,
        'key' => $apiKey,
    ]);

    $fetcher->addRequest($region, $url);
}

$startTime = microtime(true);
$results = $fetcher->execute();
$elapsed = round(microtime(true) - $startTime, 2);

echo "Fetched " . count($regions) . " regions in {$elapsed}s\n";

foreach ($results as $region => $result) {
    $count = count($result['data']['items'] ?? []);
    echo "[{$region}] {$count} videos ({$result['time']}s)\n";
}

Output

Fetched 7 regions in 1.23s
[US] 25 videos (0.89s)
[GB] 25 videos (0.95s)
[PL] 25 videos (1.02s)
[NL] 25 videos (0.91s)
[SE] 25 videos (0.98s)
[NO] 25 videos (1.12s)
[AT] 25 videos (1.01s)

All 7 regions fetched in 1.23 seconds instead of ~7 seconds sequentially. The total time equals the slowest individual request, not the sum.

Adding Fibers for Structured Concurrency

For more complex scenarios where you need to process results as they arrive:

<?php

class FiberPool
{
    private array $fibers = [];
    private array $results = [];

    public function add(string $key, callable $task): void
    {
        $this->fibers[$key] = new \Fiber(function () use ($key, $task) {
            $result = $task();
            \Fiber::suspend($result);
        });
    }

    public function run(): array
    {
        // start() runs each task until its first suspend and returns the
        // suspended value. Note: Fibers alone do not run tasks in parallel;
        // unless a task suspends mid-I/O, the tasks execute one after another.
        foreach ($this->fibers as $key => $fiber) {
            $result = $fiber->start();
            if ($fiber->isSuspended()) {
                $this->results[$key] = $result;
                $fiber->resume(); // let the fiber run to completion
            }
        }

        return $this->results;
    }
}

// Usage
$pool = new FiberPool();

$pool->add('trending_PL', fn() => fetchAndProcess('PL'));
$pool->add('trending_NL', fn() => fetchAndProcess('NL'));
$pool->add('trending_SE', fn() => fetchAndProcess('SE'));

$results = $pool->run();
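To see why `start()` can return a value here, a minimal self-contained Fiber round trip (no network, pure illustration):

```php
<?php

// Minimal Fiber round trip: start() runs the closure until its first
// Fiber::suspend() and returns the suspended value to the caller.
$fiber = new Fiber(function (): void {
    $data = strtoupper('pl');       // some work inside the fiber
    $back = Fiber::suspend($data);  // hand $data to the caller, pause
    echo "resumed with {$back}\n";  // runs only after resume()
});

$value = $fiber->start();  // 'PL'
$fiber->resume('done');    // prints "resumed with done", fiber finishes
```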

When to Use Fibers vs curl_multi

| Approach | Best for |
| --- | --- |
| curl_multi alone | Pure HTTP parallelism |
| Fibers + curl_multi | Complex async workflows |
| Fibers alone | Cooperative task switching (no parallelism by itself) |

For ViralVidVault, curl_multi is sufficient since our concurrent work is purely HTTP-based. Fibers become valuable when you need to interleave processing between requests.
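The "Fibers + curl_multi" combination can be sketched like this: each fiber queues a curl handle, suspends while its request is in flight, and is resumed with the response body when curl_multi reports completion. `fetchAll` and its scheduler loop are a simplified illustration, not the exact ViralVidVault code:

```php
<?php

// Hypothetical fetchAll(): each fiber registers a curl handle, suspends,
// and is resumed with the response body once curl_multi finishes it.
function fetchAll(array $urls): array
{
    $mh = curl_multi_init();
    $fibers = [];
    $owner = [];   // spl_object_id(handle) => request key
    $results = [];

    foreach ($urls as $key => $url) {
        $fibers[$key] = new Fiber(function () use ($mh, $url, $key, &$owner): string {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 10);
            curl_multi_add_handle($mh, $ch);
            $owner[spl_object_id($ch)] = $key;
            // Pause until the scheduler below resumes us with the body.
            return Fiber::suspend();
        });
        $fibers[$key]->start(); // run up to suspend(); handle is now queued
    }

    do {
        curl_multi_exec($mh, $running);
        // Resume each fiber whose transfer just completed.
        while ($info = curl_multi_info_read($mh)) {
            $ch = $info['handle'];
            $key = $owner[spl_object_id($ch)];
            $body = curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            $fibers[$key]->resume($body);            // suspend() returns $body
            $results[$key] = $fibers[$key]->getReturn();
        }
        if ($running > 0) {
            curl_multi_select($mh, 0.1);
        }
    } while ($running > 0);

    curl_multi_close($mh);
    return $results;
}
```

This is where the per-fiber structure pays off: the code between the request and the suspend point can do arbitrary setup, and the code after `Fiber::suspend()` runs as soon as that particular response arrives, without waiting for the slowest request.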

Key Takeaways

  1. curl_multi gives you parallel HTTP without Fibers
  2. Fibers add structured concurrency for complex workflows
  3. Total time = slowest request, not sum of all requests
  4. Always set CURLOPT_TIMEOUT to prevent hanging fibers
  5. PHP 8.3 makes this pattern production-ready
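To make takeaway 4 concrete: a handle that can't connect (or times out) makes curl_exec return false, so check curl_errno per handle. A small sketch, using a deliberately unreachable local endpoint for illustration:

```php
<?php

// A capped-timeout request against a port with nothing listening.
$ch = curl_init('http://127.0.0.1:9/'); // port 9 (discard): unreachable
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_CONNECTTIMEOUT => 2,  // cap time spent connecting
    CURLOPT_TIMEOUT        => 5,  // cap total transfer time
]);

$body  = curl_exec($ch);
$errno = curl_errno($ch);

if ($body === false) {
    // e.g. CURLE_COULDNT_CONNECT (7) or CURLE_OPERATION_TIMEDOUT (28)
    error_log("request failed, curl error {$errno}: " . curl_error($ch));
}
```

In the multi-handle version, do the same check per handle via `$info['result']` from curl_multi_info_read before trusting the content.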

This concurrent fetching approach is how viralvidvault.com keeps its 7-region data fresh without spending 7x the cron time.


Part of the "Building ViralVidVault" series.
