jack
I Tried "Vibe Coding" a Hardware Test Site: AI is Powerful, But It's Not Magic (Yet) 🛠️

👋 The Backstory
I want to share a recent "experimental project" of mine: HardwareTest.org.

The motivation was simple: I bought some new peripherals and wanted to test them. But I was fed up with the existing tools: screens full of ads, outdated UIs, or sketchy .exe files that I didn't want to download.

As a self-described "average developer," I recently got brainwashed by the concept of "Vibe Coding" (coding by natural language/AI intuition). I thought, "AI is so strong now. I'll just write the prompts, let the AI write the code, and I'll be done in minutes, right?"

Spoiler Alert: I was too naive. 😂
While AI absolutely lowered the barrier to entry and boosted my speed by 10x, taking a tool from "it works" to "it feels good to use" was full of hidden traps.

🚧 The Real Challenges
Here is a breakdown of the actual struggles I faced while pair-programming with AI:

  1. The Tooling Chaos
    My workflow was a bit of a mess. I started with Antigravity (it designed the initial UI), but ran out of credits. I switched to Codex to finish the logic. For the blog content, I used Gemini, but integrating that content back into the project via Codex resulted in a formatting nightmare. It was a lot of back-and-forth "fixing" what the AI broke.

  2. Browser Limitations vs. Physics (The Keyboard Test)
    I thought testing Keyboard Polling Rate would be simple: just tell the AI to "write an event listener."

The Reality: I discovered that the browser's Event Loop often can't even keep up with a 1000Hz gaming keyboard. The raw data coming out was jittery and unusable. The Fix: I was forced into dozens of rounds of conversation with the AI. We had to optimize the algorithm, add debounce logic, and implement sliding averages just to get a relatively accurate "Real-time Hz Dashboard" on the web.
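To make the struggle concrete, here's a minimal sketch of the sliding-window approach we landed on (hypothetical names, not the site's actual code): instead of trusting the jittery gap between any two consecutive events, count events over a short window and average.

```javascript
// Estimate an event rate (Hz) from event timestamps using a sliding window.
// A single inter-event gap is too noisy; averaging over ~250 ms smooths it out.
const WINDOW_MS = 250; // sliding window length (an assumed tuning value)

function makeHzMeter(windowMs = WINDOW_MS) {
  const stamps = [];
  return function record(timeStamp) {
    stamps.push(timeStamp);
    // Drop timestamps that have fallen out of the window.
    while (stamps.length && timeStamp - stamps[0] > windowMs) stamps.shift();
    if (stamps.length < 2) return 0;
    const span = stamps[stamps.length - 1] - stamps[0];
    // (n - 1) intervals over `span` milliseconds -> events per second.
    return span > 0 ? Math.round(((stamps.length - 1) * 1000) / span) : 0;
  };
}

// In the browser you would feed it real events, e.g.:
// const record = makeHzMeter();
// window.addEventListener('keydown', (e) => { hzLabel.textContent = record(e.timeStamp); });
```

Even this isn't enough on its own (key repeat and event coalescing still get in the way), but it was the core of getting a stable readout.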

  3. The Devil is in the Details (The Mouse Test)
    I assumed a mouse test was just listening for onClick. The Reality: To properly test for Double Click issues (a gamer's nightmare) and Scroll Wheel rollback, you need very precise counting logic. Also, the AI kept confusing "Middle Click" (pressing the wheel) with "Scrolling" (spinning the wheel). It took a lot of human intervention to separate those events cleanly.
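The confusion comes from the event model: pressing the wheel fires a `mousedown` with `button === 1`, while spinning it fires a completely separate `wheel` event. A minimal sketch of the separation plus timing-based double-click detection (hypothetical names and threshold, not the site's actual code):

```javascript
// Gaps between left clicks shorter than this suggest switch bounce
// (the "double click" hardware defect). The value is an assumed tuning choice.
const DOUBLE_CLICK_MS = 80;

function makeClickChecker(thresholdMs = DOUBLE_CLICK_MS) {
  let lastLeftDown = -Infinity;
  return function onMouseDown(button, timeStamp) {
    // MouseEvent.button: 0 = left, 1 = wheel *press* (middle), 2 = right.
    const kind = button === 0 ? 'left' : button === 1 ? 'middle' : 'right';
    // Only left clicks are checked for bounce in this sketch.
    const suspicious = kind === 'left' && timeStamp - lastLeftDown < thresholdMs;
    if (kind === 'left') lastLeftDown = timeStamp;
    return { kind, suspicious };
  };
}

// Browser wiring (sketch) -- note the two distinct event streams:
// el.addEventListener('mousedown', (e) => log(onMouseDown(e.button, e.timeStamp)));
// el.addEventListener('wheel', (e) => logScroll(Math.sign(e.deltaY))); // spin, not press
```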

  4. The SEO Battle
    Writing the code was just step one. To get this English-language site indexed by Google, I spent ages wrestling with Schema, FAQ, and JSON-LD. The Insight: AI writes syntactically correct code, but often logically nonsensical SEO tags. This led to Google Search Console errors that I had to manually debug and patch.
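For reference, the shape Google expects for FAQ rich results looks roughly like this (the question text here is illustrative, not the site's actual FAQ). Typical Search Console complaints are structural: a `Question` missing its `acceptedAnswer`, or an `Answer` with an empty `text` field.

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "How do I test my keyboard polling rate?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Press and hold a key while the dashboard counts events per second."
      }
    }
  ]
}
```

This goes in a `<script type="application/ld+json">` tag in the page head.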

✨ The Result
Despite the process being far more winding than I expected, I'm really proud of the final result. It's a purely static, ad-free, dark-mode online hardware diagnostic suite.

👉 Check it out here: www.hardwaretest.org

Current Features:

โŒจ๏ธ Keyboard Test: Visualizer with a real-time Hz polling rate dashboard (and Ghosting/NKRO support).

๐Ÿ–ฑ๏ธ Mouse Test: Left/Right/Middle buttons + Scroll Wheel + Double Click detection.

๐Ÿ–ฅ๏ธ Dead Pixel & Fixer: Standard color cycle test, plus a "High-Frequency Noise Repair" feature built with Canvas.

🎧 Audio Test: Left/Right channel separation + Logarithmic Sweep.
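As a footnote on that last feature: a logarithmic sweep spends equal time per octave, so the frequency curve is f(t) = f0 · (f1/f0)^(t/T). A sketch of the math (assumed parameter names, not the site's code):

```javascript
// Frequency of a logarithmic sweep at time t: equal time per octave,
// from f0 (20 Hz) up to f1 (20 kHz) over `duration` seconds.
function sweepFrequency(t, f0 = 20, f1 = 20000, duration = 10) {
  return f0 * Math.pow(f1 / f0, t / duration);
}

// With the Web Audio API, the same curve comes from an exponential ramp:
// const osc = new OscillatorNode(ctx, { frequency: 20 });
// osc.frequency.exponentialRampToValueAtTime(20000, ctx.currentTime + 10);
```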

This "Vibe Coding" experience taught me a valuable lesson: AI is an incredibly fast junior developer. It can speed up production by 1000%, but it cannot yet replace the human eye for product details, edge cases, and user experience.

๐Ÿ™ Feedback Welcome! The site just went live, so there are definitely bugs and rough edges. If you have a moment to try it out, Iโ€™d love to hear your feedback in the comments!
