I graduated from Holberton School Tulsa — now Atlas School of Tulsa — as a full stack software engineer. Then I mostly didn't use those skills for a while. Life moved in other directions and the urgency faded.
Recently I started job hunting in software engineering and the market made things clear pretty quickly: AI has changed the landscape significantly, the competition for roles is intense, and a credential without visible work to back it up doesn't move the needle the way it used to. I needed more than a diploma.
So I thought about what actually separates developers who get hired from developers who don't. The answer I kept coming back to was documented, public work — projects where someone can see the decisions, the reasoning, the mistakes, and the progress. A build diary, essentially. A record of how you think.
That's what I didn't do at Atlas School. I built things, but I didn't document them. I'm doing that now, starting with my portfolio.
## The old site problem
There's already a version of frandy.dev. It's live right now — static, not particularly well thought out, built without much intention. It doesn't represent what I can do, and I've known that for a while.
The new version replaces it completely. This is a full stack application with a real backend, a projects CMS, animated theme backgrounds I designed myself, and a level of craft the original site never had. The goal: show what I've actually learned over the years, not just prove I can ship a webpage.
## What I built
The complete backend for frandy.dev:
- FastAPI (Python 3.12) with auto-generated OpenAPI docs
- PostgreSQL 16 with SQLAlchemy 2.x async ORM and Alembic migrations
- Docker Compose — five services: FastAPI, Next.js, PostgreSQL, Umami, Nginx
- JWT auth with httpOnly cookies and bcrypt password hashing
- Projects CMS — CRUD, image upload with validation, drag-and-drop reorder, publish/draft states
- Contact form with Resend transactional email (fire-and-forget — form always succeeds regardless of email delivery)
- GitHub stats cache — APScheduler pulls commits, languages, pinned repos every 3 hours into PostgreSQL
- Theme system — four dark themes, stored in the database, auto-rotates every 90 days, admin can override anytime
- Resume manager — upload a PDF, it updates the download URL sitewide instantly
Repo: github.com/frandy-slueue/frandy.dev
## The planning lesson — the real one
My old way of working: start coding, figure out the details as I go. Pick the tech stack mid-feature. Change the database schema after the routes that depend on it are already written. It works, eventually — but it's slower and more frustrating than it needs to be.
For this project I committed to something I'd never fully done before: a complete specification document before opening a code editor. Every section of the site. Every API endpoint with its request and response shape. Every database table with column names and types. The Docker Compose services. The Nginx config. The deployment steps. The DNS setup.
All of it on paper before line one.
Here's what it actually produced: every time I sat down to code and didn't know what to build next, the spec had the answer. Every time I was tempted to cut a corner, I checked the spec and remembered why I'd decided not to. The planning didn't slow the build down. It made the build predictable. Predictable is fast. This is the thing I wish someone had said clearly to me at school.
## First-time tools — stepping out deliberately
Part of how I'm approaching projects now is deliberately including tools I've never used. Real pressure — something that actually has to work — produces different learning than a tutorial.
Umami — self-hosted analytics. I wanted control over visitor data without handing it to a third party. Umami runs as a Docker service and I proxy its API through FastAPI. The frontend never talks directly to Umami.
Nginx — reverse proxy for SSL termination and traffic routing. I'd understood it conceptually but never configured one for a real multi-service application. It made sense once I reframed it: Nginx is a traffic director, not a server in the application sense. Every request comes in, Nginx decides which container handles it.
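In config terms, "traffic director" looks something like this — the upstream names and ports are assumptions based on the stack described above, not the site's actual nginx.conf:

```nginx
server {
    listen 443 ssl;
    server_name frandy.dev;

    # SSL terminates here; the containers behind it speak plain HTTP.
    ssl_certificate     /etc/nginx/certs/fullchain.pem;
    ssl_certificate_key /etc/nginx/certs/privkey.pem;

    # API traffic -> FastAPI container
    location /api/ {
        proxy_pass http://api:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

    # Everything else -> Next.js container
    location / {
        proxy_pass http://frontend:3000;
        proxy_set_header Host $host;
    }
}
```

Every request hits this one server block, and the `location` rules decide which container answers — that is the whole "traffic director" job.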
Resend — transactional email. Clean API, minimal SDK. The one design decision I was careful about: the email send is wrapped in try/except and returns False on failure without raising. The form submission always succeeds. Whether the email lands is a separate concern.
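The pattern deserves spelling out, because it decouples the user-facing result from the delivery side effect. A stdlib-only sketch — the real service calls the Resend SDK; `send_fn` here stands in for that call:

```python
import logging
from typing import Any, Callable

logger = logging.getLogger(__name__)

def send_safely(send_fn: Callable[[dict[str, Any]], Any],
                payload: dict[str, Any]) -> bool:
    """Fire-and-forget: attempt the send, log on failure, never raise.
    The contact form's success does not depend on the return value."""
    try:
        send_fn(payload)
        return True
    except Exception:
        logger.exception("email delivery failed; form submission still succeeds")
        return False
```

The route handler stores the submission first, then calls this and ignores a `False` return — so a Resend outage costs a notification, never a lost message.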
DigitalOcean — where the finished application will live. That comes after the frontend. The research is done.
The research into each of these tools was genuinely rewarding. Every technology made more sense once I understood the problem it was designed to solve. None of them were as intimidating on the other side as they looked from the outside. I keep learning this and then forgetting it when the next unfamiliar thing appears.
## The bugs and struggles — specific ones
The bcrypt/passlib compatibility issue. passlib[bcrypt] breaks silently with newer versions of the bcrypt package. The error — AttributeError: module 'bcrypt' has no attribute '__about__' — gives you nothing useful about what's actually wrong. I looked at the hashing logic, the database connection, the settings class. All fine. The problem was two lines in requirements.txt. Fix: pin passlib[bcrypt]==1.7.4 and bcrypt==4.0.1 together. I now pin every dependency before writing application code.
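In requirements.txt terms, the fix is two lines — these pins come straight from the debugging above; pinning everything else is the general rule:

```text
passlib[bcrypt]==1.7.4
bcrypt==4.0.1
```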
The swapped files. Built the contact router and email service in the same session, saved their contents into the wrong files. Both were syntactically valid Python — no errors, nothing flagged. The server told me the contact module had no router attribute and I had to trace it back by reading both files. Happens when you're moving fast across many new files. Small mistake, harder to catch than it looks.
Hardcoded container paths. Upload directory set to /app/uploads — correct in Docker, a PermissionError locally. Looked like a permissions problem before I realized the path simply doesn't exist outside the container. Fix: environment variable with a local fallback, declared in Pydantic Settings. Rule: runtime paths belong in environment config, not source code.
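Stripped to the stdlib, the fallback idea looks like this — the project declares it in Pydantic Settings rather than with a helper function; `UPLOAD_DIR` and the local default are the names from this bug, the helper itself is illustrative:

```python
import os
from pathlib import Path

def upload_dir() -> Path:
    """Resolve the upload directory from the environment, falling back to
    a local path so the same code runs inside and outside Docker."""
    # In Docker Compose, UPLOAD_DIR=/app/uploads; locally it is unset.
    path = Path(os.getenv("UPLOAD_DIR", "./uploads"))
    path.mkdir(parents=True, exist_ok=True)
    return path
```

Same code path everywhere; only the environment changes. That is the whole rule in one function.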
Alembic "target database is not up to date." Tried to generate a migration before applying existing ones. Always run alembic upgrade head before alembic revision --autogenerate. Every time.
Pydantic strict extra fields. Added UPLOAD_DIR to Docker Compose env before adding it to the Settings class. BaseSettings raises a validation error for any undeclared environment variable. Fix: declare in Settings first. Always.
## What I'd do differently
Pin all dependencies before writing line one. Write seed scripts alongside the models. Commit at a finer grain — one logical unit per commit so you can trace what broke and when. Test locally earlier in each session instead of writing a lot before running anything.
## Where things stand
Backend is done. Frontend is next — Next.js 16+, Framer Motion, four custom animated backgrounds, and a morphing mosaic grid. I'll document that build the same way I documented this one.
I'm excited to see what the finished thing looks like. The spec was ambitious. The backend delivered on it.
Wish me luck.
Frandy Slueue — Full Stack Software Engineer
Atlas School of Tulsa (formerly Holberton School Tulsa)