Iheb Jabri
Building a Quick CSV Export Command in Laravel

When working with Laravel applications, you might find yourself needing to export data to a CSV file. Whether you're building a reporting feature or simply exporting data for analysis, it’s essential to implement a solution that is both efficient and straightforward.

Here’s a recommended approach for creating a quick CSV export command in Laravel. This method uses Laravel’s chunking functionality to handle large datasets gracefully, writing directly to the output stream with PHP’s fputcsv function.

Why This Approach?

  1. Efficiency: By leveraging Laravel’s chunking, you process the data in smaller batches, reducing memory usage.
  2. Scalability: You can export large datasets without running into memory issues.
  3. Simplicity: Using fputcsv ensures that CSV formatting is handled correctly without needing additional libraries.

Code Example

use App\Models\User;
use Illuminate\Database\Eloquent\Collection;

$chunk = 500; // Number of records per chunk
$output = fopen('php://stdout', 'wb+'); // Open a stream to StdOut

// Write the CSV headers
fputcsv($output, ['email', 'name']);

// Query the data and write to CSV in chunks
User::select(['email', 'name'])
    ->chunk($chunk, function (Collection $users) use ($output) {
        $users->each(
            fn($user) => fputcsv($output, [$user->email, $user->name])
        );
    });

Breaking Down the Code

1. Define the Chunk Size

The $chunk variable determines how many records are processed in each iteration. Adjust this number based on your dataset size and memory availability.

$chunk = 500;

2. Open a Stream to StdOut

Using php://stdout, you can write directly to the output stream. This is particularly useful for commands that generate CSV exports, as the output can be piped directly to a file or another process.

$output = fopen('php://stdout', 'wb+');
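Because the rows are written to stdout, the export composes naturally with shell redirection. As a sketch, assuming the command is registered under the `export:users` signature shown later in this post, you could run:

```shell
# Redirect the CSV into a file...
php artisan export:users > users.csv

# ...or stream it straight into another process
php artisan export:users | gzip > users.csv.gz
```

Nothing is buffered in the application beyond one chunk at a time, so even multi-gigabyte exports flow through with flat memory usage.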

3. Write the CSV Headers

Before writing the data rows, define the CSV headers with fputcsv.

fputcsv($output, ['email', 'name']);

4. Query the Data in Chunks

Use Laravel’s chunk method to process the data in smaller batches. This prevents memory exhaustion when dealing with large datasets.

User::select(['email', 'name'])
    ->chunk($chunk, function (Collection $users) use ($output) {
        $users->each(
            fn($user) => fputcsv($output, [$user->email, $user->name])
        );
    });
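One caveat worth knowing: `chunk()` paginates with offsets, so if rows are inserted or deleted while the export runs, records can be skipped or duplicated. Laravel's `chunkById()` avoids this by keying each batch off the last seen primary key; the trade-off is that the key column must then be part of the select. A minimal sketch of that variant:

```php
use App\Models\User;
use Illuminate\Database\Eloquent\Collection;

$output = fopen('php://stdout', 'wb+');
fputcsv($output, ['email', 'name']);

// chunkById pages with "WHERE id > ?" instead of OFFSET, so concurrent
// inserts/deletes cannot shift rows between batches. The primary key
// must be included in the select for this to work.
User::select(['id', 'email', 'name'])
    ->chunkById(500, function (Collection $users) use ($output) {
        $users->each(
            fn ($user) => fputcsv($output, [$user->email, $user->name])
        );
    });

fclose($output);
```

For a one-off export of a mostly static table, plain `chunk()` is fine; prefer `chunkById()` when the table is written to concurrently.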

5. Write Each Row

For each user in the chunk, use fputcsv to format and write their data as a CSV row.

fn($user) => fputcsv($output, [$user->email, $user->name]);

Use Case: Artisan Command

You can encapsulate this functionality within an Artisan command, allowing you to execute the export with a single terminal command. Here’s how you can structure your Artisan command:

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Database\Eloquent\Collection;
use App\Models\User;

class ExportUsersToCsv extends Command
{
    protected $signature = 'export:users';
    protected $description = 'Export users to a CSV file';

    public function handle(): int
    {
        $chunk = 500;
        $output = fopen('php://stdout', 'wb+');

        // Write the CSV headers
        fputcsv($output, ['email', 'name']);

        // Query the data and write to CSV in chunks
        User::select(['email', 'name'])
            ->chunk($chunk, function (Collection $users) use ($output) {
                $users->each(
                    fn($user) => fputcsv($output, [$user->email, $user->name])
                );
            });

        // Release the stream handle and signal success to the shell
        fclose($output);

        return self::SUCCESS;
    }
}

Conclusion

This approach to CSV export is efficient, scalable, and simple to implement. By leveraging Laravel’s chunking and PHP’s native fputcsv, you ensure your application can handle even large datasets without performance issues.
