The Problem We All Face
How many times have you opened your storage folder only to find multiple copies of the same file? 😫
We've all been there:
- Users upload the same file multiple times
- Your storage grows uncontrollably
- Backups take forever
- Data becomes inconsistent across your application
I faced this exact problem in my own projects, and that's why I built Dedupler for Laravel Storage.
What is Dedupler?
Dedupler is a Laravel package that automatically prevents duplicate file storage using SHA-1 hashing. It gives you a beautiful polymorphic API to manage file attachments while ensuring zero duplicates.
Check potential disk space savings
If you are installing the package in a legacy project, you can estimate how much disk space it would save with the following command:
php artisan dedupler:analyse-legacy /absolute/path/to/legacy/storage/directory
How It Works Under the Hood
// Every file gets a SHA-1 hash fingerprint
$hash = sha1_file($file);
The package's benefit comes from what happens next:
- If hash exists → return existing file record
- If not → store file and create new record
- All models share the same physical file
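To make that concrete, here is a minimal sketch of the check-then-store flow. It is illustrative only, not the package's actual implementation: the sha1_hash/path/size columns, the two-level hash directory layout, and the "deduplicated" disk name are assumptions borrowed from the REST response example further down.

// Illustrative sketch only, not the real package internals
use Illuminate\Support\Facades\Storage;
use Maxkhim\Dedupler\Models\UniqueFile;

function storeDeduplicated(string $absolutePath): UniqueFile
{
    $hash = sha1_file($absolutePath);

    // Known hash? Reuse the existing record, no new file hits the disk.
    if ($existing = UniqueFile::where('sha1_hash', $hash)->first()) {
        return $existing;
    }

    // First time we see this content: store the physical file once.
    $relativePath = substr($hash, 0, 2).'/'.substr($hash, 2, 2).'/'.$hash;
    Storage::disk('deduplicated')->put($relativePath, file_get_contents($absolutePath));

    return UniqueFile::create([
        'sha1_hash' => $hash,
        'path'      => $relativePath,
        'size'      => filesize($absolutePath),
    ]);
}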
🔧 How to use
1. Using the Facade to store deduplicated files
/** @var \Illuminate\Http\UploadedFile $file */
/** @var \Maxkhim\Dedupler\Models\UniqueFile $uniqueFile */
$uniqueFile = Dedupler::storeFromUploadedFile($file);
// OR
$uniqueFile = Dedupler::storeFromPath($absolutePathToFile);
// OR
$uniqueFile = Dedupler::storeFromContent($content, 'direct_content_file.ext');
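In practice the facade call drops straight into an upload action. The controller below is a hypothetical example: the facade import path and the sha1_hash attribute are my assumptions, everything else uses only the methods shown above.

use Illuminate\Http\Request;
use Maxkhim\Dedupler\Facades\Dedupler; // import path assumed

class DocumentUploadController
{
    public function store(Request $request)
    {
        $request->validate(['document' => 'required|file']);

        // Re-uploading identical content returns the already existing UniqueFile
        $uniqueFile = Dedupler::storeFromUploadedFile($request->file('document'));

        return response()->json(['sha1' => $uniqueFile->sha1_hash]);
    }
}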
2. Using the Deduplable trait to keep deduplicated files attached to your models
<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Model;
use Maxkhim\Dedupler\Traits\Deduplable;

class Post extends Model
{
    use Deduplable;
}
$post = new Post([...]);
/** @var \Illuminate\Http\UploadedFile $file */
/** @var \Maxkhim\Dedupler\Models\UniqueFile $uniqueFile */
$uniqueFile = $post->storeUploadedFile($file);
// OR
$uniqueFile = $post->storeLocalFile($absolutePathToFile);
// OR
$uniqueFile = $post->storeContentFile($content, 'direct_content_file.ext');
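With the trait in place, attaching an upload to a specific model can look like this hypothetical controller action, which relies only on the storeUploadedFile method shown above:

use App\Models\Post;
use Illuminate\Http\Request;

class PostAttachmentController
{
    public function store(Request $request, Post $post)
    {
        $request->validate(['attachment' => 'required|file']);

        // Identical content is linked to the post without writing a second copy
        $post->storeUploadedFile($request->file('attachment'));

        return back()->with('status', 'File attached');
    }
}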
Check a file's SHA-1 via the REST API
Enable the REST API endpoint for checking file existence:
Why REST API? This endpoint allows you to check if a file already exists in the system by its SHA-1 hash without uploading it again. Useful for frontend checks or integration with other services.
DEDUPLER_REST_ENABLED=true
Make an HTTP request:
GET http://server.name/api/dedupler/v1/files/{sha1_hash}
Receive the file info (or an error message):
{
  "success": true,
  "data": {
    "hash": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
    "sha1_hash": "da39a3ee5e6b4b0d3255bfef95601890afd80709",
    "md5_hash": "d41d8cd98f00b204e9800998ecf8427e",
    "exists": false,
    "filename": "da39a3ee5e6b4b0d3255bfef95601890afd80709.pdf",
    "path": "da\/39\/da39a3ee5e6b4b0d3255bfef95601890afd80709.pdf",
    "mime_type": "application\/pdf",
    "size": 102400,
    "size_human": "100 KB",
    "disk": "deduplicated",
    "status": "completed",
    "created_at": "2025-10-22T18:40:41.000000Z",
    "updated_at": "2025-10-22T18:40:41.000000Z",
    "links_count": 94
  }
}
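One practical use: compute the hash locally and ask the API whether the content is already stored before pushing bytes over the network. Here is a sketch using Laravel's HTTP client, assuming the endpoint URL above and the data.exists field from the sample response; the file path is hypothetical.

use Illuminate\Support\Facades\Http;

$localPath = '/path/to/report.pdf'; // hypothetical file
$hash = sha1_file($localPath);

$response = Http::get("https://server.name/api/dedupler/v1/files/{$hash}");

if ($response->successful() && $response->json('data.exists')) {
    // Content is already deduplicated on the server: skip the upload
} else {
    // Unknown hash (or lookup failed): upload the file as usual
}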
🚀 Try It Out!
I'd love your feedback:
- GitHub: https://github.com/maxkhim/laravel-storage-dedupler
- Packagist: https://packagist.org/packages/maxkhim/laravel-storage-dedupler
If you find it useful, open an Issue report: it truly inspires me to keep improving!
🏆 Special Offer: the first 5 quality Issue reports will get a special mention in the Release Notes!
Your contribution will be permanently visible to all package users with a link to your GitHub profile.
What's your experience with duplicate file management in Laravel? Share your stories and solutions!
P.S. This package was approved by Laravel News!