
楊東霖

Originally published at devtoolkit.cc

How I Built a Side Project in 1 Day With AI: A Real-World Walkthrough

Twelve months ago, building a deployable side project in a single day was ambitious — the kind of thing that required a rare combination of focus, experience, and luck. In 2026, with the right AI tools and workflow, it's repeatable. I've done it several times, and this is a detailed account of exactly how.

This isn't a "vibe coding" story where AI writes everything and you watch. It's a practical account of how AI tools accelerate specific parts of the development process — and where you still need to think carefully and make decisions yourself.

The Project: A Developer Tools Directory

The project I built: a curated directory of free developer tools, organized by category, with ratings, descriptions, and links. Sounds simple. The requirements that made it non-trivial: a static site that could handle hundreds of tools without a database, a filtering UI that worked client-side, an admin interface for adding new tools, and deployment to a CDN with automatic builds on content changes.

Tech stack I chose: Astro (static site generator), Preact (lightweight UI for the filter component), JSON files as the "database," Cloudflare Pages for deployment, and GitHub Actions for CI. Total cost: $0/month at the expected traffic level.

Time from idea to deployed site: about 9 hours, spread across a single day.

Hour 1: Architecture and Setup (60 minutes)

The first hour was planning — but AI-accelerated planning. I described the project to Claude and asked it to help me think through the architecture:

Prompt: "I want to build a curated directory of free developer tools. The site should be completely static (no server-side rendering at runtime), support filtering by category and search, load fast, and be easy to update by adding JSON files. What's the best architecture for this in 2026? Walk me through the trade-offs."

Claude's response covered five options (Jekyll, Hugo, Next.js static export, Astro, plain HTML + JS) with trade-offs for each. It recommended Astro because of its partial hydration model — the filter UI could be a Preact component (hydrated, interactive) while everything else was pure static HTML (fast, no JavaScript needed).

This is the kind of architectural reasoning that used to require either experience with all these frameworks or a lot of research time. AI compressed it to 10 minutes.

After choosing Astro, I asked Claude to generate the project scaffold command and initial file structure. Instead of reading Astro documentation for 30 minutes, I had a working project structure in 5.

Hour 2-3: Core Data Model and Content (90 minutes)

The most important early decision in any content project is the data model. Get this wrong and you'll refactor it for days. I asked Claude to design the JSON schema for a developer tool entry:

Prompt: "Design a JSON schema for a developer tool entry in my directory. Each tool needs: name, description, URL, categories (multiple), pricing (free/freemium/paid), tags, when it was added, and a rating. I also want to filter by category and tag. What schema would be easy to query, extend, and validate?"

Claude produced a schema with sensible field names, explained why it chose an array for categories (filtering by multiple categories simultaneously), suggested adding a slug field derived from the name (for clean URLs and deduplication), and recommended separating tools into individual JSON files rather than one large array (easier to add new tools via PRs or scripts).
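To make the shape concrete, here is a sketch of what such a schema might look like as a TypeScript type, with a slug helper and a minimal validation check. The field names and the `slugify`/`isValidTool` helpers are illustrative assumptions, not the exact schema Claude produced:

```typescript
// Hypothetical shape of a tool entry, inferred from the fields listed in the prompt.
interface Tool {
  name: string;
  slug: string;          // derived from name; used for clean URLs and deduplication
  description: string;
  url: string;
  categories: string[];  // array, so a tool can appear under several filters
  pricing: "free" | "freemium" | "paid";
  tags: string[];
  addedAt: string;       // ISO date, e.g. "2026-01-15"
  rating: number;        // e.g. 1-5
}

// Derive a URL-safe slug from the tool name.
function slugify(name: string): string {
  return name
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/(^-|-$)/g, "");
}

// Minimal structural check before committing a new JSON file.
function isValidTool(t: Partial<Tool>): boolean {
  const allowedPricing: string[] = ["free", "freemium", "paid"];
  return (
    typeof t.name === "string" &&
    typeof t.url === "string" &&
    Array.isArray(t.categories) &&
    t.categories.length > 0 &&
    allowedPricing.includes(t.pricing ?? "")
  );
}
```

A check like `isValidTool` can run in CI so that a malformed JSON file added via PR fails the build instead of breaking the site.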

I seeded the directory with 30 tools using another AI prompt: "Generate 30 real, free developer tools entries for this schema, covering these categories: JSON/data, terminal/CLI, API testing, code formatting, monitoring, deployment. Use real tools with accurate descriptions."

This saved about 2 hours of manual research and data entry. I verified the entries (Claude occasionally gets URLs slightly wrong — worth checking) and had a full dataset in under an hour.

Hour 3-5: UI Development (120 minutes)

The UI was the most complex part — specifically, the filter and search component. This is where I used the most AI assistance, and also where I had to be most careful about the output.

I started with the Astro page layout:

Prompt: "Write an Astro page that reads all JSON files from a data/tools/ directory, passes them as props to a Preact component called ToolDirectory, and renders a page with appropriate meta tags for SEO. The page should be fully static — no server endpoints."

Claude produced working Astro page code with correct frontmatter, glob imports for the JSON files, and proper prop passing. Minor issue: it used import.meta.glob syntax that was slightly outdated for the Astro version I was using. A quick follow-up prompt ("this is throwing an error with Astro 4.x, here's the error:") produced a corrected version.
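For reference, the corrected frontmatter pattern looks roughly like this. The paths, component name, and meta tags are placeholders, not the actual code from the project; the key detail is `import.meta.glob` with `{ eager: true }`, which is the current Vite/Astro 4.x form:

```astro
---
// Hypothetical page sketch: eagerly load every tool JSON file at build time.
import ToolDirectory from "../components/ToolDirectory";

const modules = import.meta.glob("../data/tools/*.json", { eager: true });
const tools = Object.values(modules).map((m: any) => m.default);
---
<html lang="en">
  <head>
    <title>Free Developer Tools Directory</title>
    <meta name="description" content="A curated directory of free developer tools." />
  </head>
  <body>
    <!-- client:load hydrates only this component; the rest stays static -->
    <ToolDirectory tools={tools} client:load />
  </body>
</html>
```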

For the filter component itself, I broke the task into pieces:

First prompt (state management): "Design the state management for a ToolDirectory Preact component. It needs to track: selected categories (array, multiple select), search query (string), sort order (newest/alphabetical/rating). Use Preact signals. Show me the state declarations and the derived filtered+sorted list logic."

Second prompt (rendering): "Now write the JSX for the ToolDirectory component using these signals. The filter UI should be a sidebar on desktop, collapsible on mobile. Each tool card should show: name, description, categories as badges, and a link. Use Tailwind for styling."

Breaking it into steps — state first, rendering second — consistently produces better output than asking for everything at once. Each piece is focused enough for Claude to reason about carefully.
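The derived filtered-and-sorted list from the first prompt can be sketched as a pure function. The real component wraps this in Preact signals; the names, fields, and exact matching rules here are illustrative assumptions:

```typescript
// Derived-list logic: filter by category and query, then sort.
// (In the component this runs inside a computed signal; here it is a pure function.)
type SortOrder = "newest" | "alphabetical" | "rating";

interface ToolEntry {
  name: string;
  description: string;
  categories: string[];
  rating: number;
  addedAt: string; // ISO date
}

function filterAndSort(
  tools: ToolEntry[],
  selectedCategories: string[],
  query: string,
  sort: SortOrder
): ToolEntry[] {
  const q = query.trim().toLowerCase();
  const filtered = tools.filter((t) => {
    const inCategory =
      selectedCategories.length === 0 ||
      selectedCategories.some((c) => t.categories.includes(c));
    const matchesQuery =
      q === "" ||
      t.name.toLowerCase().includes(q) ||
      t.description.toLowerCase().includes(q);
    return inCategory && matchesQuery;
  });
  return [...filtered].sort((a, b) => {
    if (sort === "alphabetical") return a.name.localeCompare(b.name);
    if (sort === "rating") return b.rating - a.rating;
    return b.addedAt.localeCompare(a.addedAt); // newest first; ISO dates sort lexically
  });
}
```

Keeping this logic pure also makes it trivially unit-testable, independent of any UI framework.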

The component worked on the first try with minor style adjustments. Total time for UI development including iteration: about 2 hours. The same UI without AI assistance would have taken me 4-5 hours minimum.

Hour 5-6: SEO and Performance (60 minutes)

SEO is an area where AI assistance is particularly valuable because it involves a lot of boilerplate that follows known patterns:

Prompt: "Write a reusable Astro layout component that handles all SEO meta tags for this site. Include: title, description, canonical URL, Open Graph tags (og:title, og:description, og:image, og:type), Twitter Card tags, and JSON-LD structured data for a website. Make the layout accept title and description as props with sensible defaults."

Claude produced a complete layout component with all the tags. I then asked it to add JSON-LD structured data for the individual tool pages (a different structured data type) and it extended the component correctly.
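The website-level JSON-LD piece of that layout reduces to a small builder like the following. The site name and URL are placeholders, and the actual layout likely emits more fields; this just shows the minimal shape that validates:

```typescript
// Build the JSON-LD payload for a schema.org WebSite entry.
// Rendered into the layout inside <script type="application/ld+json">…</script>.
function websiteJsonLd(name: string, url: string): string {
  const data = {
    "@context": "https://schema.org",
    "@type": "WebSite",
    name,
    url,
  };
  return JSON.stringify(data);
}
```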

For performance, I asked Claude to review my Astro config for optimization opportunities. It suggested enabling compression, using the format: 'directory' build option for clean URLs, and adding image optimization through Astro's built-in image service. All three suggestions were valid and took under 10 minutes to implement.
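Two of those three suggestions are one-line config options. A hypothetical `astro.config.mjs` reflecting them might look like this; the option names follow Astro 4.x, and the project's actual config may have differed:

```typescript
// Hypothetical astro.config.mjs sketch (Astro 4.x option names).
import { defineConfig } from "astro/config";

export default defineConfig({
  compressHTML: true,      // minify emitted HTML
  build: {
    format: "directory",   // /tools/foo/ instead of /tools/foo.html
  },
  // Image optimization uses Astro's built-in image service (astro:assets)
  // by importing images through the <Image /> component in templates.
});
```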

Hour 6-7: Testing and Bug Fixes (60 minutes)

I ran the site locally, found three bugs, and used AI to debug each one:

Bug 1: Filter categories not updating when clicking the same category twice (toggle behavior broken). Described the bug to Claude with the relevant component code → received a correct fix in about 3 minutes. The issue was comparing arrays by reference instead of by value in the toggle function.

Bug 2: Mobile filter sidebar not closing after selecting a category. Described the behavior → Claude identified that the signal controlling sidebar visibility wasn't being reset in the category click handler → fix was one line.

Bug 3: JSON-LD structured data had an invalid field name (Claude had used a non-standard property). I validated the output with a structured data testing tool, pasted the error to Claude, and it corrected the schema immediately.

Three bugs, about 15 minutes total to fix. The debugging workflow — describe symptoms, paste relevant code, receive diagnosis and fix — is where AI tools show some of their highest ROI.
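The Bug 1 fix in isolation: toggle by value rather than by reference, and return a new array so the signal actually updates. The function name and types are illustrative, not the component's exact code:

```typescript
// Toggle a category in the selected list by value, returning a new array
// (mutating or reference-comparing the old array is what broke the toggle).
function toggleCategory(selected: string[], category: string): string[] {
  return selected.includes(category)
    ? selected.filter((c) => c !== category) // remove: compare by value
    : [...selected, category];               // add: fresh array triggers update
}
```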

Hour 7-8: Deployment Setup (60 minutes)

I'd never used Cloudflare Pages before. Without AI assistance, this would have meant reading through documentation. With it:

Prompt: "Walk me through deploying an Astro static site to Cloudflare Pages. I'm using GitHub. I need: 1) the wrangler.toml configuration for the project, 2) the GitHub Actions workflow for CI/CD, 3) any Astro config changes needed for Cloudflare. The site should rebuild automatically on pushes to main."

Claude produced all three files with correct configuration. The wrangler.toml needed one change (the compatibility date was slightly outdated, so I updated it to the current date). The GitHub Actions workflow worked on the first deploy.

This single prompt saved at least an hour of documentation reading and trial-and-error.

Hour 8-9: Polish and Launch (60 minutes)

The final hour was polish: writing the about page, generating the sitemap, setting up the robots.txt, and testing the deployed site on mobile.

For the about page, I described the project to Claude and asked it to write a developer-focused about page that explained what the site was, why it existed, and how to submit new tools. Three minutes, done.

Sitemap generation was handled by an Astro plugin — Claude suggested the right package and configuration. Claude wrote the robots.txt in about 30 seconds.

By hour 9, the site was live, serving globally from Cloudflare's CDN, and working correctly on mobile and desktop.

What AI Did and Didn't Do

To be clear about what made this work, here's an honest breakdown:

What AI did:

- Generated initial code for ~80% of the components (which I then reviewed and modified)
- Handled all boilerplate (SEO meta tags, build configuration, deployment config)
- Debugged all three bugs
- Designed the data schema and produced the initial dataset
- Handled documentation I hadn't memorized (Cloudflare Pages setup, specific Astro APIs)

What I did:

- Made all architectural decisions (after AI-assisted evaluation of options)
- Reviewed every piece of AI-generated code before using it
- Caught the outdated API usage and incorrect JSON-LD
- Decided what the project should be and why
- Handled the product decisions (what categories to include, how to present the filter UI, what the brand voice should be)

The combination works because AI is extremely good at the execution layer — writing correct code for a well-specified task — while humans remain essential for the judgment layer — deciding what to build, why, and how to structure it.

The Repeatable Workflow

If you want to build side projects fast with AI, the pattern that works is:

1. Architecture first. Use AI to evaluate your tech stack options before writing any code. This is the highest-value use of AI in the early phase — you avoid building on the wrong foundation.

2. Break into components. Don't ask AI to build your entire application. Ask it to build one focused component or module at a time. Each piece will be higher quality.

3. Review everything. Read every line of AI-generated code. AI makes specific, subtle errors — outdated APIs, incorrect assumptions about your framework version, edge cases it didn't handle. The review step is where you catch these before they become bugs.

4. Iterate with follow-ups. When something doesn't work, describe the exact error or behavior and ask for a fix. This is faster than trying to fix it yourself first.

5. Use specialized tools for specialized tasks. For SEO validation, use an actual SEO validator. For code review, the DevToolkit AI Code Review tool is faster than prompting Claude manually. For SQL, the AI SQL Builder handles it directly.

For more on setting up this kind of workflow, see the Claude Code prompts guide and the AI code review automation guide.

Ready to launch your own side project? The Side Project Launch Playbook ($7.99) covers the full journey from idea to launch: validating the idea, choosing the right tech stack, setting up your development workflow, building an MVP, launching on Product Hunt and Hacker News, and finding your first users. A practical, opinionated guide for developers who want to ship, not plan.

Free Developer Tools

If you found this article helpful, check out DevToolkit — 40+ free browser-based developer tools with no signup required.

Popular tools: JSON Formatter · Regex Tester · JWT Decoder · Base64 Encoder

🛒 Get the DevToolkit Starter Kit on Gumroad — source code, deployment guide, and customization templates.
