Techsolutionstuff • Originally published at techsolutionstuff.com

How to Process Large CSV Files with Laravel 10

Hey folks! Are you tired of feeling overwhelmed by those massive CSV files in your Laravel projects? Well, I've got some good news for you!

In this guide, I'm going to walk you through the process of handling those hefty CSV files with ease using Laravel.

By the end of this guide, you'll have the skills and confidence to tackle those giant CSV files like a pro, thanks to the power of Laravel and some nifty tools.

In this article, we'll see how to process large CSV files in Laravel 10 using the spatie/simple-excel package. The same approach also works in Laravel 8 and 9.

Step 1: Set Up Your Laravel Project

First things first, let's get our Laravel project up and running. If you haven't already, install Laravel by running:

laravel new my-laravel-project
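If you already have Composer set up, the equivalent command is:

composer create-project laravel/laravel my-laravel-project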

Step 2: Install the spatie/simple-excel Package

Next, let's install the spatie/simple-excel package via Composer. In your terminal, navigate to your project directory and run:

composer require spatie/simple-excel

We'll use SimpleExcelReader to load the file. By default, getRows() returns a LazyCollection – think of it as a more considerate way to handle your data without exhausting your server's memory. This means you can process the file bit by bit.
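To see that lazy behaviour on its own, here's a minimal sketch (using the same storage/app/public/products.csv path as the job below) that streams the file row by row instead of loading everything into memory:

use Spatie\SimpleExcel\SimpleExcelReader;

// getRows() returns a LazyCollection, so rows are pulled from the file one at a time
SimpleExcelReader::create(storage_path('app/public/products.csv'))
    ->getRows()
    ->each(function (array $row) {
        // $row is an associative array keyed by the CSV's header row,
        // e.g. $row['title'] for the products.csv used in this article
    });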

Step 3: Create a Laravel Job for importing CSV

Now, we'll create a job using the following command.

php artisan make:job ImportCSV

app/Jobs/ImportCSV.php

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Spatie\SimpleExcel\SimpleExcelReader;

class ImportCSV implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /**
     * Create a new job instance.
     */
    public function __construct()
    {
        //
    }

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        SimpleExcelReader::create(storage_path('app/public/products.csv'))
            ->useDelimiter(',')
            ->useHeaders(['ID', 'title', 'description'])
            ->getRows()
            ->chunk(5000)
            ->each(function ($chunk) {
                // Here we have a LazyCollection chunk of 5000 products
            });
    }
}

Here are the steps we'll follow to import a large CSV file:

Chunking the CSV: We'll split the big file into smaller chunks, so each piece is small enough to work with comfortably.

Job Dispatching: We'll dispatch each of these chunks as a separate queued job. This helps us manage the workload better and is gentler on your server.

Database Insertion: Then, we'll smoothly add each chunk to the database. It's a straightforward process that ensures everything goes in neatly.

Chunking the CSV

With our LazyCollection set up, we'll break down the CSV into smaller parts. It's similar to slicing a huge sandwich into smaller, manageable pieces.

php artisan make:job ImportProductChunk

For each part of the CSV file, we'll create and start a job. These jobs act like dedicated workers, each handling a portion of the data and inserting it into your database with care.

<?php

namespace App\Jobs;

use App\Models\Product;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldBeUnique;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Database\Eloquent\Model;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Str;

class ImportProductChunk implements ShouldBeUnique, ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // Release the job's unique lock after one hour
    public $uniqueFor = 3600;

    /**
     * Create a new job instance.
     */
    public function __construct(
        public $chunk
    ) {
        //
    }

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        $this->chunk->each(function (array $row) {
            // Match on product_id, then create or update the remaining columns
            Model::withoutTimestamps(fn () => Product::updateOrCreate(
                ['product_id' => $row['ID']],
                [
                    'title' => $row['title'],
                    'description' => $row['description'],
                ]
            ));
        });
    }

    public function uniqueId(): string
    {
        return Str::uuid()->toString();
    }
}
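The job above assumes a Product model backed by a products table with product_id, title, and description columns. The article doesn't show the model or migration, so here's a minimal sketch – the column types and the unique index are assumptions, adapt them to your own schema:

// app/Models/Product.php
namespace App\Models;

use Illuminate\Database\Eloquent\Model;

class Product extends Model
{
    protected $fillable = ['product_id', 'title', 'description'];
}

// database/migrations/xxxx_create_products_table.php – inside up()
// (Blueprint and Schema are already imported by the default migration stub)
Schema::create('products', function (Blueprint $table) {
    $table->id();
    $table->string('product_id')->unique(); // assumed key matched by updateOrCreate
    $table->string('title');
    $table->text('description');
    $table->timestamps();
});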

Ensuring Uniqueness

One important thing to remember is to use $uniqueFor and uniqueId in your jobs. It's similar to giving each worker a special ID badge: it ensures that no two workers end up doing the same job, which would waste effort and could insert duplicate data.

Dispatching Jobs

In our ImportCSV job, we'll dispatch a job for every chunk using the each method.

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Spatie\SimpleExcel\SimpleExcelReader;

class ImportCSV implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /**
     * Create a new job instance.
     */
    public function __construct()
    {
        //
    }

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        SimpleExcelReader::create(storage_path('app/public/products.csv'))
            ->useDelimiter(',')
            ->useHeaders(['ID', 'title', 'description'])
            ->getRows()
            ->chunk(5000)
            ->each(
                fn ($chunk) => ImportProductChunk::dispatch($chunk)
            );
    }
}

And there you have it! Your large CSV files can now be processed efficiently, with each chunk handled separately, avoiding any memory issues. If you're pressed for time, simply run more queue workers to speed up the process.
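To actually kick things off, dispatch the ImportCSV job (from a controller, an Artisan command, or php artisan tinker) and make sure at least one queue worker is running. A minimal sketch, assuming the database queue driver (QUEUE_CONNECTION=database in your .env):

// Anywhere in your app, e.g. a controller or console command
App\Jobs\ImportCSV::dispatch();

Then create the jobs table (only needed for the database driver) and start a worker – or several, if you want the chunks processed in parallel:

php artisan queue:table
php artisan migrate
php artisan queue:work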


You might also like:

Read Also: How to Import CSV File in Laravel 10 using League/csv

Read Also: Efficient CSV File Uploads in Laravel 10 with Python

Read Also: How To Import CSV File In MySQL Using PHP
