Boost Laravel Performance with LazyCollection and cursor()

Laravel LazyCollection and cursor() are game-changers for handling large datasets efficiently. Instead of consuming all memory upfront, these tools stream data line-by-line or record-by-record. For instance, using LazyCollection to parse a user activity file or cursor() for high-value order reports ensures your application remains fast and responsive even with massive data.

These techniques not only improve Laravel performance but also help with tasks like CSV processing, large database queries, and general memory optimization. Building them into your Laravel workflow keeps your application architecture scalable and memory-efficient.
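As a quick illustration of the CSV case, here is a minimal sketch that streams a CSV file through LazyCollection with PHP's fgetcsv(); the file path and column layout are hypothetical placeholders:

use Illuminate\Support\LazyCollection;

// Hypothetical CSV: storage/imports/orders.csv with columns sku, qty, price
LazyCollection::make(function () {
    $handle = fopen(storage_path('imports/orders.csv'), 'r');

    while (($row = fgetcsv($handle)) !== false) {
        yield $row; // only the current row lives in memory
    }

    fclose($handle);
})
->skip(1) // skip the header row
->each(function (array $row) {
    [$sku, $qty, $price] = $row;
    // Process each row here (e.g., validate or persist)
});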

Managing large datasets in Laravel can be overwhelming, especially when server memory becomes a bottleneck. Laravel's LazyCollection and cursor() method offer powerful tools to handle this challenge effectively by reducing memory consumption during data processing.

What is LazyCollection?

LazyCollection is a memory-friendly feature in Laravel that reads data only when it’s needed, instead of loading everything into memory at once. This lazy-loading behavior is perfect for working with big files, streaming responses, or looping through extensive data.

Here’s a simple use-case of reading a log file and processing lines containing errors:

use Illuminate\Support\LazyCollection;

LazyCollection::make(function () {
    // Open the log file and yield it one line at a time
    $handle = fopen(storage_path('logs/system.log'), 'r');

    while (($line = fgets($handle)) !== false) {
        yield $line;
    }

    fclose($handle); // release the file handle once the generator finishes
})
->filter(fn($line) => str_contains($line, 'ERROR'))
->each(function ($errorLine) {
    // Handle each error line (e.g., store or alert)
});

This allows Laravel to process a huge log file without crashing due to memory overload.
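For contrast, the eager equivalent would read the entire file into memory up front, which is exactly what LazyCollection avoids. A rough sketch, assuming the same log path as above:

// Eager: file() loads every line of system.log into one array at once
$errors = collect(file(storage_path('logs/system.log')))
    ->filter(fn($line) => str_contains($line, 'ERROR'));

// Lazy: the generator-based version above keeps only the current line in memory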

Real-world Application: Importing User Activity

Suppose you’re importing user activity from a text file for analysis. LazyCollection can help transform and insert data in chunks without overloading the system.

use App\Models\UserActivity;
use Illuminate\Support\LazyCollection;

class UserActivityImporter
{
    public function import(string $filepath)
    {
        LazyCollection::make(function () use ($filepath) {
            $file = fopen($filepath, 'r');

            // Yield one parsed line at a time: user_id|action|timestamp
            while (($line = fgets($file)) !== false) {
                yield explode('|', trim($line));
            }

            fclose($file);
        })
        ->map(fn($entry) => [
            'user_id' => $entry[0],
            'action' => $entry[1],
            'timestamp' => $entry[2],
        ])
        ->chunk(300)
        ->each(fn($chunk) => UserActivity::insert($chunk->all()));
    }
}
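Calling the importer is then straightforward, for example from an Artisan command or a queued job; the file path below is just a placeholder:

$importer = new UserActivityImporter();
$importer->import(storage_path('imports/user_activity.txt'));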

Using cursor() for Large DB Queries

For database-heavy applications, Laravel’s cursor() method retrieves results lazily using PHP generators, which lets you process millions of rows without running into memory issues.

use App\Models\Order;

class OrderAnalyzer
{
    public function analyze()
    {
        // cursor() hydrates one Order model at a time instead of loading them all
        Order::cursor()
            ->filter(fn($order) => $order->total > 5000)
            ->each(fn($order) => $this->notifyHighSpender($order));
    }

    protected function notifyHighSpender($order)
    {
        // Send notification or flag order
    }
}

This technique avoids loading all orders into memory and allows on-the-fly processing.
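If the threshold is fixed, you could also push the filter into the query itself so the database returns only high-value orders, while cursor() still streams one model at a time. A sketch of that variant of analyze() (not from the original example, just an alternative worth considering):

public function analyze()
{
    // Let the database do the filtering; cursor() still hydrates one Order at a time
    Order::where('total', '>', 5000)
        ->cursor()
        ->each(fn($order) => $this->notifyHighSpender($order));
}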

Final Thoughts

Leveraging Laravel's LazyCollection and cursor() transforms how you handle large datasets, turning memory-heavy tasks into streamlined, efficient processes. Whether you're importing logs, processing user activity, or analyzing millions of records, these tools help keep performance high and memory usage low. Start using them today to build scalable, high-performing Laravel applications.

🚀 Want more powerful Laravel tips like this?
Join my newsletter and stay ahead with exclusive insights 👉 LaravelDailyTips.com 💡
