I am 19 years old. I set out to build a production-grade e-commerce system from scratch. No Laravel. No React. Just raw PHP, MySQL, and a lot of stubbornness.
I call it Commerza. It is a security-first storefront and operations panel with 238 files of pure, custom architecture.
But I almost killed the entire project before anyone even saw a single line of code. This is the story of how relying too much on AI almost wrecked my system, and how the rebuild forced me to become a real software engineer.
The Code Red: A 9,800-Line Disaster
Like many developers today, I leverage AI tools. Gemini, Claude, Copilot — they act as a sounding board to accelerate development. But they are tools, not architects. I learned that the hard way.
Nearing what I thought was the “completion” phase of my backend, things had gotten messy. I had files ballooning to anywhere between 6,000 and 9,800 lines of code. The logic was tangling. Instead of doing the hard work of refactoring it myself, I got lazy. I asked GitHub Copilot to logically split those massive files into modular components.
It did not split them. It butchered them.
In a matter of seconds, Copilot scrambled the logic and outright deleted 40% of my backend. Critical functionalities vanished. My custom API endpoints, my image compressor, the parser feature, and several core JavaScript files were just gone.
I sat staring at the screen. I was ready to turn off the computer, delete the folder, and abandon the project completely.
The Biggest Mistake Was Not the AI
It took me two days and roughly 9.6 hours of pure screen time to manually stitch the backend back together.
Why did I have to do it manually? Because of the ultimate rookie mistake: It wasn’t on Git.
If I had been committing my code properly, a simple git restore . would have saved me. Instead, I paid in blood and time.
The Hard Truths About Version Control:
- Push constantly: A local copy is not a backup. A Git repository is.
- Protect your secrets: Never, ever share your .env file or database credentials on GitHub. I spent hours analyzing the difference between public and private repositories and using .gitignore and git filter-branch to ensure my commit history was scrubbed of leaked SMTP and database passwords before going public.
```gitignore
# Environment secrets
.env
.env.*
!.env.example

# Temporary local maintenance scripts
.history-scrub.ps1

# Dependency directories
node_modules/
vendor/
!frontend/assets/vendor/

# Local runtime artifacts
*.log
*.tmp
*.cache
*.out
*.pid
*.seed

# Package manager / test artifacts
npm-debug.log*
yarn-debug.log*
pnpm-debug.log*
coverage/
.phpunit.result.cache

# OS noise
.DS_Store
Thumbs.db
Desktop.ini
```
- Own your code: AI can write syntax, but it cannot hold the context of your entire system. If you don’t understand the architecture, you can’t fix it when the AI breaks it.
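The history scrub mentioned above can be demonstrated end-to-end in a throw-away repository. This is an illustrative sketch, not Commerza's actual cleanup script; note that git's own documentation now recommends the faster `git filter-repo` over `filter-branch` for real projects:

```shell
# Demo: commit a secret, then rewrite history so it never existed.
demo=$(mktemp -d) && cd "$demo"
git init -q
git config user.email "dev@example.com" && git config user.name "dev"

echo "DB_PASSWORD=hunter2" > .env                 # the leaked secret
git add .env && git commit -qm "oops: committed .env"
echo "# Commerza" > README.md
git add README.md && git commit -qm "add readme"

# Rewrite every commit, removing .env from each tree
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch --force \
  --index-filter 'git rm --cached --ignore-unmatch .env' \
  --prune-empty --tag-name-filter cat -- --all

# Drop filter-branch's backup refs so the old history is really gone
rm -rf .git/refs/original/
```

After this, `git log --all --name-only` no longer shows `.env` anywhere, and the commit that only contained the secret is pruned entirely.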
The Frontend-First Trap
Not using Git wasn’t my only architectural sin. I made the massive mistake of building the frontend first.
I designed the UI using .html files and relied on hardcoded JSON fetching to populate the data. It looked great, but when it was time to actually connect the PHP backend, the data mutation flow was a nightmare. I had to rip out the static HTML, convert everything to server-rendered .php templates, and entirely rebuild how the client communicated with the server.
Always build the engine first. Paint the car later.
Rising from the Wreckage: The Architecture of Commerza
After the 9.6-hour rebuild, my mindset shifted. I stopped coding like a hobbyist and started coding defensively. I didn’t just rebuild the APIs; I built a fortress.
Here is what the architecture looks like today:
```
commerza/                 # Example
├── admin/                # Restricted operations UI and Auth
│   ├── api/              # Admin-specific backend endpoints
│   └── frontend/         # Admin dashboards and management views
├── backend/              # The Engine
│   ├── core/data.php     # Shared bootstrap and DB connection
│   ├── security/         # Argon2id, CSRF, and CAPTCHA models
│   ├── payment/          # Stripe verification and COD logic
│   └── jobs/             # Cron-based automation scripts
├── frontend/             # Storefront static assets (CSS, JS, media)
├── .env.example          # Secure environment template
└── .htaccess             # Apache routing and security gatekeeper
```
Engineering for Production
Building without a framework means you have to write your own security, performance, and SEO layers. I couldn’t just run an artisan command.
1. Database Safety & Concurrency
I implemented strict transactional locking. When an order is placed, the system executes a SELECT ... FOR UPDATE lock on the database row. This prevents race conditions, double-orders, and negative stock if two users try to buy the last item at the exact same millisecond. Every user-facing query uses strict prepared statements to eliminate SQL injection.
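As a rough sketch, the locked checkout path looks something like this. The table and function names are illustrative, not Commerza's actual code, and the FOR UPDATE clause is made driver-conditional so the snippet also runs against SQLite, which lacks that syntax and serializes writes instead:

```php
<?php
// Sketch: decrement stock under a row lock so two concurrent checkouts
// cannot both buy the last item. Assumes a products(id, stock) table.
function reserveStock(PDO $pdo, int $productId, int $qty): bool
{
    $pdo->beginTransaction();
    try {
        // MySQL/InnoDB: FOR UPDATE blocks other transactions on this row
        $lock = $pdo->getAttribute(PDO::ATTR_DRIVER_NAME) === 'mysql'
            ? ' FOR UPDATE' : '';
        $stmt = $pdo->prepare("SELECT stock FROM products WHERE id = ?$lock");
        $stmt->execute([$productId]);
        $stock = $stmt->fetchColumn();

        if ($stock === false || (int) $stock < $qty) {
            $pdo->rollBack();        // sold out or unknown product: refuse
            return false;
        }

        $pdo->prepare('UPDATE products SET stock = stock - ? WHERE id = ?')
            ->execute([$qty, $productId]);
        $pdo->commit();
        return true;
    } catch (Throwable $e) {
        if ($pdo->inTransaction()) {
            $pdo->rollBack();
        }
        throw $e;
    }
}
```

The prepared statements double as the SQL-injection defense: user input only ever travels through bound parameters, never string concatenation.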
2. The Anti-Bot Layer
The login and checkout flows are protected by a layered CAPTCHA model. It attempts a silent reCAPTCHA v3 verification first. If it detects suspicious behavior, it degrades to a v2 checkbox, and finally to a custom mathematical fallback challenge. Combined with strict CSRF tokens and rate limiting, brute-force attacks are neutralized at the door.
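The degradation ladder boils down to a score-based decision. A minimal sketch, where the 0.7 and 0.3 thresholds are assumptions for illustration rather than Commerza's real tuning (reCAPTCHA v3 returns a score between 0.0 and 1.0):

```php
<?php
// Illustrative decision ladder for the layered CAPTCHA model.
// Thresholds are hypothetical; real values should be tuned per site.
function captchaStepFor(float $v3Score): string
{
    if ($v3Score >= 0.7) {
        return 'pass';           // silent v3 verification succeeded
    }
    if ($v3Score >= 0.3) {
        return 'v2-checkbox';    // suspicious: degrade to interactive checkbox
    }
    return 'math-challenge';     // likely bot: custom mathematical fallback
}
```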
3. Identity & Access
Passwords are hashed using Argon2id. But security isn’t just about passwords. I built a sub-admin lifecycle where, if a staff member is suspended or their permissions are altered, their session is immediately revoked server-side. No waiting for a cookie to expire.
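In PHP this hashing layer is a thin wrapper around the built-in `password_hash` API. A minimal sketch, falling back to bcrypt when the PHP build was compiled without libargon2 (the helper names here are illustrative):

```php
<?php
// Argon2id when the build supports it, bcrypt otherwise.
// Both store salt and cost parameters inside the hash string itself.
function hashPassword(string $password): string
{
    $algo = defined('PASSWORD_ARGON2ID') ? PASSWORD_ARGON2ID : PASSWORD_BCRYPT;
    return password_hash($password, $algo);
}

function verifyPassword(string $password, string $hash): bool
{
    // Constant-time comparison handled by password_verify
    return password_verify($password, $hash);
}
```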
4. Performance & Dual-Theme UI
The frontend is highly optimized. I configured .htaccess to handle static asset caching and clean-routing for SEO (turning messy PHP queries into clean URLs). The UI features a native, zero-dependency Dual-Theme system. Users can toggle between a high-contrast Dark Mode (OrangeRed + Black) and a clean Light Mode (NavyBlue + White), with the preference persisting via local storage.
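For a sense of what that `.htaccess` layer involves, here is a hedged sketch; the rewrite pattern and cache lifetimes are illustrative, not Commerza's actual rules:

```apache
RewriteEngine On
# Clean URL: /product/red-mug -> product.php?slug=red-mug
RewriteRule ^product/([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]

# Long-lived caching for static assets
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css               "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
    ExpiresByType image/webp             "access plus 1 month"
</IfModule>
```

Rules like these require `mod_rewrite`/`mod_expires` to be enabled and `AllowOverride` to permit `.htaccess` processing.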
5. Payments
Checkout handles both Cash on Delivery (with configurable OTP limits for high-value orders) and Stripe card payments, with server-side PaymentIntent verification to ensure no transaction is manipulated on the client side.
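Server-side verification amounts to re-checking the PaymentIntent against the order record before fulfilment, never trusting the client's claim that payment succeeded. A hedged sketch; the function name is illustrative, and with the official `stripe-php` SDK the intent object would come from `\Stripe\PaymentIntent::retrieve($id)`:

```php
<?php
// Compare the Stripe PaymentIntent against the server-side order record.
// Stripe amounts are integers in the smallest currency unit (e.g. cents),
// and currencies are lowercase ISO codes.
function paymentMatchesOrder(object $intent, int $orderAmountCents, string $orderCurrency): bool
{
    return $intent->status === 'succeeded'
        && $intent->amount === $orderAmountCents
        && $intent->currency === strtolower($orderCurrency);
}
```

Only when all three checks pass should the order flip to "paid"; a client-side tampered amount or an unconfirmed intent fails the comparison.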
The Final Lesson
Frameworks are incredibly useful, but they abstract the hard parts of engineering. By avoiding them, I crashed my own system, lost my code, and had to learn the raw mechanics of web security, transactional databases, and server routing.
It was brutal. But because of that Code Red, I don’t just know how to use a tool — I know how to build the machine.
Explore the Codebase: You can view the full repository, including the [SECURITY.md](https://github.com/ahmershahdev/commerza?tab=security-ov-file#) and detailed documentation, on GitHub: ahmershahdev/commerza
Live Deployment: The production environment will be officially live between 15 and 20 May 2026 here: commerza.ahmershah.dev
Connect With the Author
| Platform | Link |
|---|---|
| ✍️ Medium | @syedahmershah |
| 💬 Dev.to | @syedahmershah |
| 🧠 Hashnode | @syedahmershah |
| 💻 GitHub | @ahmershahdev |
| 🧭 Beacons | Syed Ahmer Shah |
| 🌐 Portfolio | ahmershah.dev |




Top comments (17)
Learning the importance of version control the hard way is a rite of passage for almost every developer. Relying on AI tools like Copilot is great for velocity, but as you experienced, it absolutely cannot replace the safety net that a proper Git workflow provides. Glad you were able to recover and rebuild the system even stronger than before.
"A rite of passage" is the perfect way to describe it. It’s a painful lesson, but I can guarantee I’ll never start another project without git init ever again. Losing 40% of your work in a split second changes how you look at the "Save" button forever.
The "no framework" war stories are always educational. The thing that usually bites teams later isn't the initial framework-free velocity — it's idempotency on payments, race conditions on inventory decrement, and the migration tooling that frameworks quietly solve. I'd love to hear which of those hit you hardest. Any chance you ended up re-implementing a mini-framework internally by the end (routing layer, ORM-ish abstraction, migration runner)?
To avoid those exact traps, I actually ended up building what feels like a 'security-first mini-framework' from scratch.
To handle inventory and payments, I implemented strict SQL row-locking (FOR UPDATE) and transactional idempotency. For the 'mini-framework' layers:
Routing: Custom clean-route layer with .htaccess orchestration.
Security: Layered hashing (Argon2id with bcrypt fallback), CSP with 128-bit nonces that refresh on every request, and XSS protection built into the core engine.
Anti-Abuse: A hybrid CAPTCHA (reCAPTCHA v3 + v2 fallback) combined with a custom-built challenge and mandatory 6-digit email OTPs for high-value actions.
DevOps: I built PowerShell-based migration scripts and CI/CD pipelines for smoke testing to replace what a framework usually handles.
The hardest part wasn't the code—it was ensuring that every alert, SMTP status change, and security log was intentional. It’s framework-less, but definitely not 'structure-less.'
This is a great reminder that AI tools can break production code fast. Solid insights on version control, security, and scalable ecommerce backend design.
It’s a scary realization how fast a tool meant to help can turn into a "delete" button. The lesson for me was that AI is a great assistant for writing a function, but a terrible partner for restructuring an entire system.
pushing back slightly - the lesson is not "no frameworks = real engineer." AI generates plausible code but has no model of your system constraints. that gap bites with or without a framework.
I agree with you that the lesson is about technical intentionality, not just 'no frameworks'. But the reason I chose to build framework-less is that frameworks provide a layer of abstraction that often masks those system constraints. When an AI generates a 'plausible' controller in Laravel, it’s easy to overlook the underlying race conditions or session vulnerabilities because the framework handles the heavy lifting.
By removing the framework, I forced the 'system constraints' to the surface. I had to manually architect the row-locking, the CSP nonce lifecycle, and the idempotency logic. That process bridged the gap between 'plausible code' and 'predictable systems' in a way that wouldn't have happened if I were just filling in framework boilerplate. It wasn't about being a 'real engineer'; it was about becoming the person who actually understands why the constraints exist in the first place.
right - that's what ai exploits. it writes confident laravel controllers because the pattern is in training data, not because it understood your constraints. going raw forces the modeling step.
Brutally honest story of building an ecommerce system without frameworks. Great insights on AI risks, backend architecture, Git mistakes, and real-world security lessons.
"Brutally honest" was the only way to write this. I think it’s important to share the failures, not just the finished products, especially with how much "perfect" AI-generated code we see online these days.
Raw 🔥
Thanks for the energy! It was a fiery process to get through, but the system is much stronger because of it.
Great real world lesson on building an ecommerce system without frameworks. The Git mistake and AI refactor issue highlight why version control and architecture matter in scalable backend development.
Exactly. Building without a framework is like taking the covers off the engine while it's running—it's messy and dangerous, but you finally see how the pistons move. Architecture isn't just about clean folders; it's about making sure those 238 files actually talk to each other without collapsing.
Kudos 🎉
Appreciate the kudos! It’s been a wild ride, and I’m just glad the project survived the "AI butchery" phase.