DEV Community

Ooi Yee Fei

Resurrecting 2006: A Haunted S3 Manager Built Entirely with Kiro

What if the 2006 AWS Console came back from the dead? That's the question behind The Phantom of the Console - a fully functional S3 management tool disguised as a retro HTML dashboard, complete with a bitter AI ghost who despises modern technology.

Built for the Kiroween Hackathon's "Resurrection" category, this project showcases how Kiro IDE's advanced features can work together to create production-ready applications with artistic constraints.

The Concept

The inspiration comes from "The Phantom of the Opera": just as the Phantom controls the theater from the shadows, AWS operates behind the console, powering infrastructure invisibly. I wanted to resurrect the 2006 AWS experience - when "the cloud" was revolutionary, table-based layouts were professional, and sysadmins ruled physical servers.

The result: S3 Made Easy: Manage • Upload • Share Securely - with a haunted 2006 twist. But more importantly, it became an exploration of how AI-assisted development can maintain artistic constraints while building production-ready software.

The Journey: Learning to Code with Constraints

When I started this project, I had a simple question: Could I build a modern web application using only 2006 JavaScript patterns? Not as a novelty, but as a real constraint that would force me to think differently about code generation and AI assistance.

The answer surprised me. Not only was it possible, but the constraint made me a better developer.

Discovery #1: Steering Rules as Creative Constraints

The first breakthrough came when I discovered Kiro's steering rules. I'd used linters and formatters before, but this was different. Instead of checking code after it's written, steering rules guide the AI as it generates code.

I wrote a simple document describing 2006 patterns - use var instead of const, use XMLHttpRequest instead of fetch, avoid arrow functions. I added examples showing what I wanted and what I didn't want. Then I added one line at the top: inclusion: always.

That single line changed everything.

Suddenly, every conversation with Kiro happened in the context of 2006. When I asked for a file upload component, Kiro generated it with XMLHttpRequest and callbacks. When I needed state management, it used class components with this.setState. When I requested error handling, it used try-catch blocks with function expressions.

I didn't have to remind Kiro about the constraints. I didn't have to fix modern syntax after generation. The patterns were just... there. Automatic. Consistent. Across 2,500+ lines of code.
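To make the constraint concrete, here is a minimal sketch of the style the steering rules enforce - not code from the project, just the idiom: var declarations, function expressions, and error-first callbacks where modern code would reach for const, arrow functions, and async/await.

```javascript
// 2006-style async helper: var, a function expression, and an
// error-first callback. The modern equivalent would be
// `const doubleLater = async (n) => n * 2;`.
var doubleLater = function (n, callback) {
  setTimeout(function () {
    if (typeof n !== 'number') {
      callback(new Error('expected a number'), null);
      return;
    }
    callback(null, n * 2);
  }, 0);
};

doubleLater(21, function (err, result) {
  if (err) { throw err; }
  console.log(result); // logs 42
});
```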

This taught me something profound: AI-assisted development isn't about generating code faster. It's about maintaining consistency at scale. The steering rules became my design language, and Kiro became fluent in it.

Discovery #2: Vibe Coding with Context

The second revelation came from vibe coding - Kiro's conversational code generation. I'd used AI code assistants before, but they always felt like autocomplete on steroids. You'd get a function, maybe a component, but never a complete feature.

Kiro was different because of context. When I asked for "a retro S3 bucket browser with folder icons," Kiro didn't just generate a component. It generated a component that understood my entire project - the 2006 constraints, the retro styling, the haunted theme, the AWS integration patterns I'd established.

The most impressive moment came when I needed resumable uploads. I described the feature: "Implement true resumable uploads using S3 multipart API with localStorage progress tracking." I expected to iterate, to fix bugs, to refine the approach.

Instead, Kiro generated 200+ lines of working code. It used S3's CreateMultipartUploadCommand to initiate uploads, UploadPartCommand to send chunks, localStorage to track progress, and CompleteMultipartUploadCommand to finalize. All in 2006 style with var, callbacks, and XMLHttpRequest patterns.

The code worked on the first try. Not because Kiro is magic, but because it understood the context - both technical (AWS SDK patterns) and stylistic (2006 constraints).
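The part-boundary math at the heart of that flow can be sketched in the same 2006 style. This is a simplified illustration rather than the generated code; the 5 MB figure is S3's documented minimum size for every part except the last.

```javascript
// Compute multipart part boundaries for a file of a given size,
// 2006 style. S3 requires each part except the last to be >= 5 MB.
var PART_SIZE = 5 * 1024 * 1024;

var computeParts = function (fileSize) {
  var parts = [];
  var offset = 0;
  var partNumber = 1;
  while (offset < fileSize) {
    var end = Math.min(offset + PART_SIZE, fileSize);
    parts.push({ partNumber: partNumber, start: offset, end: end });
    offset = end;
    partNumber = partNumber + 1;
  }
  return parts;
};

// A 12 MB file yields three parts: 5 MB, 5 MB, and 2 MB.
console.log(computeParts(12 * 1024 * 1024).length); // logs 3
```

Each entry maps to one UploadPartCommand call; the start/end offsets are what you slice out of the file before sending.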

This taught me that vibe coding isn't about prompting. It's about building context. Every component I generated added to Kiro's understanding of my project. By the time I reached complex features, Kiro knew my codebase better than I did.

Discovery #3: Agent Hooks as Feedback Loops

The third insight came from agent hooks. I created a pre-commit hook that scans for modern JavaScript syntax and rejects commits with violations. It seemed like a simple quality gate.

But it became something more: a feedback loop.

Every time I accidentally used modern syntax (and I did, muscle memory is strong), the hook caught it immediately. Not in code review. Not in CI/CD. Right there, in the commit. With ASCII ghost art and a snarky error message.

This created a tight feedback loop that reinforced the constraints. I stopped thinking in modern JavaScript and started thinking in 2006 patterns. The constraint became natural, not forced.
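The core of such a hook is just a syntax scanner. The sketch below shows the idea with an illustrative subset of rules - the real hook's checks (and its ghost art) are the project's own.

```javascript
// Naive modern-syntax scanner of the kind a pre-commit hook might run.
// These regexes are an illustrative subset, not the project's rule set.
var MODERN_PATTERNS = [
  { name: 'const/let', regex: /\b(const|let)\s+\w/ },
  { name: 'arrow function', regex: /=>/ },
  { name: 'fetch', regex: /\bfetch\s*\(/ },
  { name: 'async/await', regex: /\b(async|await)\b/ }
];

var findViolations = function (source) {
  var violations = [];
  var lines = source.split('\n');
  for (var i = 0; i < lines.length; i++) {
    for (var j = 0; j < MODERN_PATTERNS.length; j++) {
      if (MODERN_PATTERNS[j].regex.test(lines[i])) {
        violations.push({ line: i + 1, rule: MODERN_PATTERNS[j].name });
      }
    }
  }
  return violations;
};

// One modern line trips three rules at once.
console.log(findViolations('const x = () => fetch("/api");'));
```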

More importantly, it showed me that AI-assisted development needs guardrails. Not to limit creativity, but to maintain consistency. The hook ensured that even when I manually edited code, I stayed within my design language.

Discovery #4: MCP as Conversational Infrastructure

The fourth breakthrough was MCP (Model Context Protocol). I built a custom server that wraps AWS CLI, letting Kiro manage S3 directly from chat.

At first, this seemed like a convenience feature. Instead of switching to the AWS Console, I could ask Kiro "list my S3 buckets" and get results in the IDE.

But it became more than that. It became conversational infrastructure.

When I needed to test uploads, I'd ask Kiro to create a test bucket. When I needed to debug CORS errors, I'd ask Kiro to check the bucket's CORS configuration. When I needed to verify region detection, I'd ask Kiro to list buckets with their regions.

The MCP server turned AWS from a separate system into part of my development conversation. I wasn't context-switching between IDE and Console. I was having a conversation with my infrastructure.
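The translation layer inside such a server is small: each MCP tool call becomes an AWS CLI invocation. The sketch below is hypothetical - the tool names and parameter shapes are invented for illustration, not the project's actual API.

```javascript
// Hypothetical mapping from MCP tool calls to AWS CLI arguments.
// Tool names and parameters here are illustrative only.
var buildCliArgs = function (tool, params) {
  if (tool === 'list_buckets') {
    return ['s3api', 'list-buckets'];
  }
  if (tool === 'get_bucket_cors') {
    return ['s3api', 'get-bucket-cors', '--bucket', params.bucket];
  }
  if (tool === 'create_bucket') {
    return ['s3api', 'create-bucket', '--bucket', params.bucket,
            '--region', params.region];
  }
  throw new Error('unknown tool: ' + tool);
};

// A server would hand these to child_process.execFile('aws', args).
console.log(buildCliArgs('get_bucket_cors', { bucket: 'demo' }).join(' '));
```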

This taught me that AI-assisted development can extend beyond code generation. With MCP, Kiro became my interface to external systems. Not through APIs I had to learn, but through natural language I already knew.

The Technical Challenge: Building Modern Features with Old Patterns

The real test came when I needed to implement complex features like resumable uploads and CORS configuration using 2006 patterns.

Modern JavaScript makes this easy: async/await for sequential operations, Promises for parallel operations, arrow functions for clean callbacks. But I couldn't use any of that.

Instead, I used nested callbacks, function expressions, and careful state management. I used XMLHttpRequest for network requests and localStorage for persistence. I used class components with lifecycle methods instead of hooks.

The surprising part? The code was clearer.

Without the syntactic sugar of modern JavaScript, I had to be explicit about control flow. Without Promises, I had to think carefully about error handling. Without arrow functions, I had to be deliberate about scope and binding.

The constraints forced me to write more intentional code. And Kiro, guided by steering rules, helped me maintain that intentionality across the entire codebase.
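Here is what that explicit control flow looks like in miniature - a generic three-step pipeline (the step functions are invented placeholders) written with nested error-first callbacks instead of async/await.

```javascript
// Sequential steps with nested callbacks instead of async/await.
// Every error path and every hand-off is spelled out explicitly.
var stepOne = function (callback) { callback(null, 'one'); };
var stepTwo = function (input, callback) { callback(null, input + '-two'); };
var stepThree = function (input, callback) { callback(null, input + '-three'); };

var runPipeline = function (done) {
  stepOne(function (err, a) {
    if (err) { done(err, null); return; }
    stepTwo(a, function (err, b) {
      if (err) { done(err, null); return; }
      stepThree(b, done);
    });
  });
};

runPipeline(function (err, result) {
  if (err) { throw err; }
  console.log(result); // logs "one-two-three"
});
```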

The Unexpected Benefit: Spec-Driven Development

For the most complex features, I used Kiro's spec-driven development. I'd write a requirements document with acceptance criteria, a design document with architecture, and a task list with implementation steps.

Then I'd let Kiro execute the tasks one by one.

This was different from vibe coding. Instead of conversational iteration, I had structured execution. Instead of exploring solutions, I had defined outcomes.

The two approaches complemented each other beautifully. Vibe coding for UI components and quick iterations. Spec-driven development for complex backend logic and coordinated changes.

Together, they gave me the best of both worlds: the speed of AI generation with the structure of traditional development.

The CORS Crisis: When Theory Meets Reality

My biggest technical challenge came from an unexpected place: CORS errors.

I'd built the upload system. I'd implemented multipart uploads. I'd tested it locally. Everything worked perfectly. Then I deployed to production and tried to upload a file to a bucket in a different region.

CORS error. Access denied. The browser refused to talk to S3.

This is where AI-assisted development showed its real value. I didn't just need code generation. I needed problem-solving.

I described the issue to Kiro: "Browser-to-S3 uploads are failing with CORS errors when the bucket is in a different region than the credentials." Kiro analyzed the problem and suggested a solution: detect the bucket's region before uploading, then use the correct regional endpoint.

But there was a catch. Detecting the region required calling S3's GetBucketLocationCommand, which itself could trigger CORS errors if called from the browser.

The solution? Move region detection to the server. When listing buckets, fetch each bucket's region server-side (no CORS issues), then pass that information to the client. When uploading, use the region I already know instead of detecting it again.
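Server-side region detection also has to deal with two legacy quirks of GetBucketLocation: a null or empty LocationConstraint means us-east-1, and the value "EU" means eu-west-1. A normalizer for those cases (a sketch, not the project's code) looks like this:

```javascript
// Normalize S3 GetBucketLocation results. Two legacy quirks:
// a null/empty LocationConstraint means us-east-1, and "EU"
// is the legacy name for eu-west-1.
var normalizeRegion = function (locationConstraint) {
  if (!locationConstraint) { return 'us-east-1'; }
  if (locationConstraint === 'EU') { return 'eu-west-1'; }
  return locationConstraint;
};

console.log(normalizeRegion(null));             // logs "us-east-1"
console.log(normalizeRegion('EU'));             // logs "eu-west-1"
console.log(normalizeRegion('ap-southeast-1')); // logs "ap-southeast-1"
```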

This taught me that AI-assisted development isn't just about generating code. It's about architectural problem-solving. Kiro didn't just write the fix - it helped me understand the problem and design the solution.

The Resumable Upload Saga

The second major challenge was implementing true resumable uploads - uploads that survive page refreshes.

Most "resumable" upload libraries aren't truly resumable. They chunk files and retry failed chunks, but if you refresh the page, you start over. I wanted real resumability: close the tab, come back tomorrow, pick up where you left off.

This required understanding S3's multipart upload API at a deep level. When you initiate a multipart upload, S3 gives you an UploadId. You upload parts and get back ETags. When all parts are uploaded, you complete the upload by sending the UploadId and all the ETags.

The key insight: if you store the UploadId and ETags in localStorage, you can resume the upload even after a page refresh. You just need to let the user re-select the file (so you can read it again), then skip the parts you've already uploaded.
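The bookkeeping behind that insight can be sketched as below. The key names and helper functions are illustrative, and the store parameter stands in for localStorage (same getItem/setItem shape) so the sketch runs anywhere.

```javascript
// Resume bookkeeping for a multipart upload, 2006 style.
// `store` stands in for localStorage; key names and helpers
// are illustrative, not the project's actual code.
var saveUploadState = function (store, fileKey, uploadId, completedParts) {
  store.setItem('upload:' + fileKey, JSON.stringify({
    uploadId: uploadId,
    parts: completedParts // e.g. [{ PartNumber: 1, ETag: '"abc"' }]
  }));
};

var loadUploadState = function (store, fileKey) {
  var raw = store.getItem('upload:' + fileKey);
  return raw ? JSON.parse(raw) : null;
};

var isPartDone = function (state, partNumber) {
  if (!state) { return false; }
  for (var i = 0; i < state.parts.length; i++) {
    if (state.parts[i].PartNumber === partNumber) { return true; }
  }
  return false;
};
```

After a refresh, loadUploadState recovers the UploadId and ETags; the uploader then skips every part where isPartDone returns true and resumes with the rest.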

I described this to Kiro through spec-driven development. I wrote requirements, designed the architecture, and broke it into tasks. Kiro implemented each task, maintaining 2006 patterns throughout.

The result was elegant: a resumable upload system that uses native S3 features, stores minimal state in localStorage, and works across page refreshes. All in 2006-style JavaScript.

This taught me that complex features don't require complex syntax. With clear architecture and good tooling, you can build sophisticated systems with simple patterns.

The One-Click CORS Solution

After solving the region detection problem, I realized users would still hit CORS errors if their buckets weren't configured correctly. I could document the CORS configuration, but that's not user-friendly.

So I built one-click CORS configuration.

When an upload fails with a CORS error, I detect it and show a button: "Configure CORS on this bucket." Click it, and I call S3's PutBucketCorsCommand with the correct headers for multipart uploads. Problem solved.
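The configuration that click applies looks roughly like the following. The allowed origin is a placeholder; the detail that matters for browser multipart uploads is exposing the ETag header, since the client must read each part's ETag to complete the upload.

```javascript
// A CORS configuration of the shape PutBucketCors expects for
// browser multipart uploads. The origin is a placeholder;
// ExposeHeaders must include ETag so the browser can read part ETags.
var corsConfiguration = {
  CORSRules: [{
    AllowedOrigins: ['https://example.com'], // placeholder origin
    AllowedMethods: ['GET', 'PUT', 'POST', 'DELETE', 'HEAD'],
    AllowedHeaders: ['*'],
    ExposeHeaders: ['ETag'],
    MaxAgeSeconds: 3000
  }]
};

console.log(corsConfiguration.CORSRules[0].ExposeHeaders.indexOf('ETag') !== -1); // logs true
```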

This was a small feature, but it taught me something important about user experience: the best solution isn't always the most technically elegant. Sometimes it's the one that removes friction.

And with Kiro, implementing that solution was conversational. I described what I wanted, Kiro generated the code, I tested it, and it worked. The entire feature took less than an hour.

What I Actually Learned

After building this project, I have a different perspective on AI-assisted development.

It's not about speed. Yes, Kiro generates code faster than I can type. But the real value isn't velocity - it's consistency. Steering rules ensure that every line of code follows the same patterns. Agent hooks ensure that manual edits don't break those patterns. The result is a codebase that feels like it was written by one person, even though it was generated by AI.

It's not about automation. Yes, Kiro automates repetitive tasks. But the real value isn't automation - it's context. Vibe coding works because Kiro understands my project. MCP works because Kiro can interact with my infrastructure. The result is a development experience that feels collaborative, not mechanical.

It's not about replacing developers. Yes, Kiro writes a lot of code. But the real value isn't replacement - it's augmentation. I still make architectural decisions. I still design solutions. I still solve problems. Kiro just helps me execute those decisions faster and more consistently.

It's about constraints. The most surprising lesson was that constraints make AI-assisted development better, not worse. By limiting Kiro to 2006 patterns, I forced it to be more intentional. By defining clear patterns in steering rules, I gave it a design language to work with. By creating feedback loops with agent hooks, I reinforced those patterns.

The constraint wasn't a limitation. It was a framework.

The Future of AI-Assisted Development

This project changed how I think about building software with AI.

Traditional development is about writing code. AI-assisted development is about defining patterns and letting the AI maintain them. Traditional development is about individual productivity. AI-assisted development is about consistency at scale. Traditional development is about tools that help you code. AI-assisted development is about environments that understand your project.

Kiro showed me what that future looks like. Not AI that replaces developers, but AI that amplifies them. Not AI that generates code without context, but AI that understands your design language. Not AI that works in isolation, but AI that integrates with your infrastructure.

The Phantom of the Console is a haunted S3 manager. But it's also a proof of concept: AI-assisted development can maintain artistic constraints, solve complex problems, and deliver production-ready software.

And it can do it all while pretending it's 2006.

Try It Yourself

The project is open source and deployed live. Experience the haunted 2006 AWS Console and see how constraints can make AI-assisted development better.

More importantly, try building something with Kiro. Define your own constraints. Create your own steering rules. Build your own MCP servers. See how AI-assisted development changes when you give it context, patterns, and purpose.

I built a time machine to 2006. You can build whatever you want.


Technical Note: All AWS operations use the AWS SDK for JavaScript v3. Client-side operations include multipart uploads with CreateMultipartUploadCommand, UploadPartCommand, and CompleteMultipartUploadCommand. Server-side operations include bucket listing with parallel region detection, CORS configuration with PutBucketCorsCommand, and secure sharing with pre-signed URLs. The entire implementation maintains 2006 code patterns while using modern SDK features - a unique technical challenge that proved constraints and capability can coexist.
