Did you know you can send multiple HTTP requests in parallel in Laravel, instead of one after another?
That's what Http::pool() does. It's built on top of Guzzle and can massively boost performance when fetching data from several APIs at once.
🧠 The idea
Normally, you might do this:
$response1 = Http::get('https://api.example.com/users');
$response2 = Http::get('https://api.example.com/posts');
$response3 = Http::get('https://api.example.com/comments');
Each waits for the previous one.
If each takes 1 second → total ≈ 3 seconds.
With Http::pool(), Laravel runs them all at once, so the total time ≈ the longest single request (around 1 second here).
🧩 Example
use Illuminate\Support\Facades\Http;
$responses = Http::pool(fn ($pool) => [
$pool->as('users')->get('https://api.example.com/users'),
$pool->as('posts')->get('https://api.example.com/posts'),
$pool->as('comments')->get('https://api.example.com/comments'),
]);
$users = $responses['users'];
$posts = $responses['posts'];
$comments = $responses['comments'];
$usersData = $users->json();
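Each entry in the returned array is a normal response object, so it's worth checking the status before trusting the payload. A minimal sketch, reusing the 'users' key from the example above (the logging fallback is just one option):

if ($responses['users']->successful()) {
    $usersData = $responses['users']->json();
} else {
    // Log the failure and fall back to an empty list.
    logger()->warning('Users API request failed', [
        'status' => $responses['users']->status(),
    ]);
    $usersData = [];
}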
⚙️ How it works
- The pool() method accepts a closure.
- Inside the closure, you define multiple requests.
- Laravel runs them concurrently using Guzzle promises.
- It returns an array of responses (each is a standard Illuminate\Http\Client\Response instance).
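Each pooled request also accepts the usual request-building methods (headers, tokens, timeouts) chained before the verb method, much like a regular Http:: call. A small sketch, where the token value and endpoints are placeholders:

use Illuminate\Support\Facades\Http;

$responses = Http::pool(fn ($pool) => [
    // Per-request options are chained before get(), just like Http::withToken()->get().
    $pool->as('users')->withToken('your-api-token')->timeout(5)->get('https://api.example.com/users'),
    $pool->as('posts')->acceptJson()->get('https://api.example.com/posts'),
]);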
🧩 Dynamic example
$urls = [
'https://api.example.com/products/1',
'https://api.example.com/products/2',
'https://api.example.com/products/3',
];
$responses = Http::pool(fn ($pool) =>
collect($urls)->map(fn($url) => $pool->get($url))->all()
);
foreach ($responses as $response) {
dump($response->json());
}
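If one of the pooled requests fails at the connection level, its entry in the array may not be a usable response, so a defensive loop checks each item before reading it. A cautious sketch (the instanceof guard is an assumption worth keeping even if your requests normally succeed):

use Illuminate\Http\Client\Response;

foreach ($responses as $index => $response) {
    // Only treat entries that are real Response objects with a 2xx status as data.
    if ($response instanceof Response && $response->successful()) {
        dump($response->json());
    } else {
        dump("Request {$index} did not complete successfully");
    }
}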
🚀 When to use it
✅ Fetch data from multiple APIs simultaneously
✅ Speed up microservice calls
✅ Reduce latency in dashboards or data aggregators (see the sketch below)
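For the dashboard case, here's a rough sketch of a controller that fans out to several internal services in parallel (the service URLs and view name are made up):

<?php

namespace App\Http\Controllers;

use Illuminate\Support\Facades\Http;

class DashboardController extends Controller
{
    public function index()
    {
        // One pool call instead of three sequential HTTP requests.
        $responses = Http::pool(fn ($pool) => [
            $pool->as('orders')->get('https://orders.internal/api/summary'),
            $pool->as('billing')->get('https://billing.internal/api/summary'),
            $pool->as('support')->get('https://support.internal/api/summary'),
        ]);

        return view('dashboard', [
            'orders'  => $responses['orders']->json(),
            'billing' => $responses['billing']->json(),
            'support' => $responses['support']->json(),
        ]);
    }
}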
💬 Have you used Http::pool() before?
Share your favorite use case or performance boost story!