DEV Community

Umer Dev

Handling API Rate Limits in Laravel (With Code)

When you integrate external APIs, everything usually works fine in development. The real problems start in production, when multiple users trigger requests at the same time and you begin hitting rate limits.

You might see failed requests, slow responses, or inconsistent data. Most of the time, the issue isn’t your logic. It’s the fact that you’re calling the API too frequently without any control.

Here’s how to handle it properly in Laravel.

- Don’t Call APIs Directly in Controllers

This is the most common mistake:

$response = Http::post('https://api.example.com/data', [
    'key' => 'value'
]);

This approach gives you no control over retries, failures, or rate limits. If multiple users hit this endpoint, you can easily overwhelm the API.

- Use Queue Jobs for API Calls

Move API calls into a queued job so they run in the background.

Create a job:

php artisan make:job SendApiRequestJob

Job example:

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\Http;

class SendApiRequestJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public $tries = 5;

    // Wait 10s, 30s, then 60s between retry attempts
    public function backoff()
    {
        return [10, 30, 60];
    }

    public function handle()
    {
        $response = Http::post('https://api.example.com/data', [
            'key' => 'value'
        ]);

        // Throwing releases the job for retry, so the backoff schedule applies
        if ($response->status() === 429) {
            throw new \Exception('Rate limit exceeded');
        }

        if ($response->failed()) {
            throw new \Exception('API request failed');
        }

        // Process response here
    }
}

Dispatch the job:

SendApiRequestJob::dispatch();

This moves the API call off the user's request cycle and lets Laravel retry failures safely with backoff instead of returning an error to the user.
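To actually cap how fast queued jobs hit the API, Laravel ships a `RateLimited` job middleware. A minimal sketch, assuming you register a limiter named `external-api` (a name chosen here for illustration) in a service provider:

```php
use Illuminate\Cache\RateLimiting\Limit;
use Illuminate\Queue\Middleware\RateLimited;
use Illuminate\Support\Facades\RateLimiter;

// In a service provider's boot() method:
// allow at most 10 of these jobs to run per minute
RateLimiter::for('external-api', function () {
    return Limit::perMinute(10);
});

// In SendApiRequestJob: attach the middleware so jobs over the
// limit are released back onto the queue instead of running
public function middleware()
{
    return [new RateLimited('external-api')];
}
```

With this in place, the queue worker enforces the limit for you and you don't have to count requests by hand inside the job.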

- Use Built-in HTTP Retry

Laravel’s HTTP client supports retries out of the box:

$response = Http::retry(3, 2000)
    ->post('https://api.example.com/data');

This will attempt the request up to three times, waiting two seconds between attempts.
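`retry()` also accepts a callback as its third argument, so you can retry only when it makes sense, e.g. on a 429. A sketch, with the endpoint URL as a placeholder:

```php
use Exception;
use Illuminate\Http\Client\RequestException;
use Illuminate\Support\Facades\Http;

$response = Http::retry(3, 2000, function (Exception $e, $request) {
    // Retry only when the API rate-limited us
    return $e instanceof RequestException
        && $e->response->status() === 429;
})->post('https://api.example.com/data', [
    'key' => 'value'
]);
```

Without the callback, the client retries on every failure, which wastes attempts on errors that will never succeed, such as a 401.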

- Cache Responses When Possible

If the same data is requested frequently, cache it to reduce API calls:

use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;

$data = Cache::remember('api_data', 60, function () {
    // Cached for 60 seconds; tune the TTL to how fresh the data must be
    return Http::get('https://api.example.com/data')->json();
});

This reduces load on the external API and improves performance.

- Add Rate Limiting on Your Side

Control how often your app sends requests:

use Illuminate\Support\Facades\RateLimiter;

$key = 'external-api';

// Allow at most 10 attempts per decay window
if (RateLimiter::tooManyAttempts($key, 10)) {
    return 'Too many requests. Try again later.';
}

// Record this attempt; the counter resets after 60 seconds
RateLimiter::hit($key, 60);

// Safe to call the external API here

This limits your own application before the external API does.
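The check-then-hit pair can also be collapsed into a single call with `RateLimiter::attempt`, which runs the callback only when the limit allows it:

```php
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\RateLimiter;

// Run the callback at most 10 times per 60 seconds
$executed = RateLimiter::attempt('external-api', 10, function () {
    Http::post('https://api.example.com/data', [
        'key' => 'value'
    ]);
}, 60);

if (! $executed) {
    // Over the limit: queue the request or fail gracefully
}
```

This keeps the check and the increment atomic in one place, so you can't forget to call `hit()` after a successful check.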

- Basic Circuit Breaker Pattern

If an API keeps failing, stop calling it temporarily:

use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;

// Breaker is open: skip the call entirely
if (Cache::get('api_down')) {
    return 'Service temporarily unavailable';
}

$response = Http::post('https://api.example.com');

if ($response->failed()) {
    // Open the breaker for 60 seconds before trying again
    Cache::put('api_down', true, 60);
}

This protects your system from repeated failures and unnecessary load.
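The snippet above opens the breaker after a single failure. In practice you usually want a failure threshold. Here is a minimal sketch using a cache-backed counter; `callExternalApi`, the cache keys, and the threshold of 5 are all made up for illustration:

```php
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;

function callExternalApi(): ?array
{
    // Breaker open: skip the call entirely
    if (Cache::get('api_down')) {
        return null;
    }

    $response = Http::post('https://api.example.com/data');

    if ($response->failed()) {
        // Count consecutive failures; trip the breaker at 5
        $failures = Cache::increment('api_failures');

        if ($failures >= 5) {
            Cache::put('api_down', true, 60); // open for 60 seconds
            Cache::forget('api_failures');
        }

        return null;
    }

    // Success: reset the failure counter
    Cache::forget('api_failures');

    return $response->json();
}
```

One transient error no longer takes the integration offline, but a run of failures still stops the hammering.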

Top comments (1)

Sheikh Shahzaman: Rate limiting in production is always the real challenge, good breakdown.