DEV Community

Marx Oblan III


Why I ditched "Soft Deletes" for S3: Building a Physical Purge Workflow

As a Laravel dev, I’ve relied on SoftDeletes for years. It’s great for recovering accidentally deleted users or blog posts, but it’s a massive liability when you’re handling sensitive files like .env configurations, private keys, or PII.

I started auditing how the "big" file-sharing services handle data, and I realized most of them are just flipping a deleted_at flag in a DB. The actual object stays in an S3 bucket for 30 days "just in case."
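The key point is that a soft delete is just an UPDATE, not a DELETE: the row (and the object it points to) is still there. A minimal illustration of the pattern using Python's built-in sqlite3 (the table name and column are hypothetical, but `deleted_at` mirrors Laravel's SoftDeletes convention):

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (id INTEGER PRIMARY KEY, path TEXT, deleted_at TEXT)")
conn.execute("INSERT INTO files (path) VALUES ('secrets/.env')")

# "Soft delete": the row is merely flagged as deleted, not removed.
conn.execute(
    "UPDATE files SET deleted_at = ? WHERE id = 1",
    (datetime.now(timezone.utc).isoformat(),),
)

# The record still exists -- and in the S3 case, so does the actual object.
row = conn.execute("SELECT path, deleted_at FROM files WHERE id = 1").fetchone()
```

After the "delete", `row` still comes back with the path and a timestamp; nothing was actually destroyed.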

I got a bit paranoid about this "ghost data" sitting in cloud storage, so I spent my weekends building FortByte.io.

The Goal: True Zero-Persistence
I wanted a "burn after reading" workflow where the second a link expires (or is viewed), the S3 object is physically and permanently wiped. No 30-day retention, no "trash" folder, and no recovery.

The Stack:
Backend: Laravel

Frontend: Inertia.js/Vue.js

Storage: S3

Infrastructure: DigitalOcean

How it works:
Instead of just hiding the record, I’ve set up a triggered workflow. When a file reaches its expiry:

The database record is purged.

An S3 DeleteObject command is issued immediately.

Any temporary logs associated with the transfer are cleared.
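The three steps above can be sketched roughly like this. This is a Python sketch, not the actual Laravel implementation: the in-memory dicts stand in for the DB and log tables, and `FakeS3` stands in for a real S3 client (the real call would be the S3 DeleteObject API, e.g. boto3's `delete_object(Bucket=..., Key=...)`):

```python
from datetime import datetime, timezone

class FakeS3:
    """In-memory stand-in for an S3 client; delete_object mimics the real API shape."""
    def __init__(self):
        self.objects = {"transfers/abc123": b"...file bytes..."}

    def delete_object(self, Bucket, Key):
        self.objects.pop(Key, None)

# Hypothetical stand-ins for the database record and temporary transfer logs.
db_records = {
    "abc123": {"key": "transfers/abc123",
               "expires_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
}
transfer_logs = {"abc123": ["uploaded", "link generated"]}

def purge_expired(s3, now):
    """Run the three-step purge for every expired file."""
    expired = [i for i, r in db_records.items() if r["expires_at"] <= now]
    for file_id in expired:
        record = db_records.pop(file_id)                        # 1. purge the DB record
        s3.delete_object(Bucket="my-bucket", Key=record["key"])  # 2. delete the S3 object
        transfer_logs.pop(file_id, None)                         # 3. clear transfer logs

s3 = FakeS3()
purge_expired(s3, now=datetime.now(timezone.utc))
```

After the run, the record, the object, and the logs are all gone; there is no flag to flip back.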

I did include an optional 30-day retention toggle for users who actually want a safety net, but by default, “Delete” means “Gone.”
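The toggle is just a branch at expiry time: purge now, or schedule a delayed purge. A hedged sketch (the function name and return shape are hypothetical, not the actual API):

```python
from datetime import datetime, timedelta, timezone

def on_expiry(file_id, retention_enabled, now=None):
    """Hypothetical dispatch: immediate purge by default, or a 30-day
    delayed purge when the user opted into the safety net."""
    now = now or datetime.now(timezone.utc)
    if retention_enabled:
        # Safety net: keep the object and schedule the real purge for later.
        return ("scheduled", now + timedelta(days=30))
    # Default: run the purge workflow right now.
    return ("purged", now)

action, when = on_expiry("abc123", retention_enabled=False)
```

With the toggle off, `action` is `"purged"` immediately; with it on, the purge date lands 30 days out.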

Why I built it:
I originally built it to solve my own workflow headaches when sending keys to clients, but it’s now live with a free tier for anyone else who is tired of leaving a digital trail.

I’d honestly love some feedback from other devs on the UI or the purging logic. If you've ever dealt with the "ghost data" problem in your own apps, how did you handle it?
