Previously I talked about how I was tired of "flaky" tests and how I'm using AI to generate test steps, saving hours of my time.
I showed how I can create a 20-step E2E test in 6 minutes, just by describing it.
But there's a problem. Even an AI can create a "brittle" test.
Imagine you ask an AI: "Log in using 'admin@test.com' and password '12345', and check for the text 'Welcome, John Doe'".
The AI will do it. And then your test will fail.
It will fail when you run it on staging, where the login is 'staging_admin@test.com'. It will fail when 'John Doe' changes his name to 'Jane Doe'. It will fail because all of that data is hardcoded.
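To make the problem concrete, here's roughly what that hardcoded test looks like when written by hand. This is a Playwright/TypeScript sketch, not Debuggo output; the URL, selectors, and credentials are just the illustrative values from above.

```typescript
// A brittle, hardcoded E2E test (Playwright/TypeScript sketch).
// Everything here is an illustrative assumption: the URL, selectors,
// credentials, and expected text are all baked into the test.
import { test, expect } from '@playwright/test';

test('login (brittle version)', async ({ page }) => {
  await page.goto('https://my-site.com/login');  // breaks on staging
  await page.fill('#email', 'admin@test.com');   // breaks when the account changes
  await page.fill('#password', '12345');
  await page.click('button[type="submit"]');
  // Breaks the moment John renames himself.
  await expect(page.getByText('Welcome, John Doe')).toBeVisible();
});
```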
This is the number one enemy of stable automation. And when I, as a solo developer, started building Debuggo, I knew I had to solve this problem first.
AI is the "Magic," but Variables are the "Engineering"
The AI in Debuggo is great at "translating" English into steps. But real test stability isn't just about the steps; it's about the data those steps use.
From the very beginning, I designed Debuggo to work with two types of variables. This is the logic I wrote myself, on top of the AI core.
1. Environment Variables
The Problem: You have different URLs, logins, or API_KEYs for your dev, staging, and production environments.
The Solution: In Debuggo, you don't hardcode these values. You store them in your environment settings. Like this:
```
%BASE_URL%=https://my-staging-site.com
%ADMIN_LOGIN%=staging_admin@test.com
%ADMIN_PASSWORD%=superS3cret!
```
Now, here's the best part. I don't need to tell the AI: "Navigate to %BASE_URL%, then fill 'email' with %ADMIN_LOGIN%..."
I just write in plain English:
"login to the website"
And that's it.
The AI is smart enough to understand that "login" requires the %BASE_URL%, %ADMIN_LOGIN%, and %ADMIN_PASSWORD% variables. It will find the right elements on the page (even if the layout changes), plug in the values from your environment, and perform the login.
The Result: The exact same test. Zero changes to the text. I can run it on any environment just by switching the env in Debuggo. That's the foundation.
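Conceptually, the resolution step is just placeholder substitution against a per-environment map. Here's a simplified TypeScript sketch of the idea (not Debuggo's actual code; the environment names and values are hypothetical):

```typescript
// Minimal sketch of %VAR% substitution against per-environment settings.
// These environments and values are hypothetical, not Debuggo's API.
type Environment = Record<string, string>;

const environments: Record<string, Environment> = {
  staging: {
    BASE_URL: 'https://my-staging-site.com',
    ADMIN_LOGIN: 'staging_admin@test.com',
    ADMIN_PASSWORD: 'superS3cret!',
  },
  production: {
    BASE_URL: 'https://my-site.com',
    ADMIN_LOGIN: 'admin@my-site.com',
    ADMIN_PASSWORD: 'evenM0reS3cret!',
  },
};

// Replace every %NAME% placeholder with its value from the active environment.
function resolve(template: string, env: Environment): string {
  return template.replace(/%([A-Z_]+)%/g, (_, name) => {
    if (!(name in env)) throw new Error(`Unknown variable: %${name}%`);
    return env[name];
  });
}

const step = 'Navigate to %BASE_URL% and log in as %ADMIN_LOGIN%';
console.log(resolve(step, environments.staging));
// -> Navigate to https://my-staging-site.com and log in as staging_admin@test.com
```

Switching environments changes nothing in the test itself; only the map you resolve against.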
2. Temporary ("In-Test") Variables
This is my favorite part.
The Problem: How do you test a flow where data is generated during the test? The classic example: "Create a user" -> "Verify the user exists" -> "Delete that specific user."
You don't know the ID or email of this user before the test. You can't hardcode it.
The Solution: I gave the AI the ability to "remember" data from the page and use it in later steps.
Imagine a test case like this:
"Navigate to '/users' and click 'Create New User'"
"Fill 'email' with 'test.user.123@example.com' and click 'Save'"
"Find the new User ID from the table and save it as
%new_user_id%""Navigate to /users/delete/%new_user_id%"
"Click 'Confirm Delete'"
"Verify that the user with ID
%new_user_id%is no longer in the table"
Steps #3 and #6 are what make this a true E2E test. The AI isn't just performing actions; it's extracting dynamic data (%new_user_id%) and using it for test cleanup (teardown).
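If you were to hand-write that same flow in Playwright, the "remember" step is just extracting text into a variable and reusing it later. A simplified sketch (the selectors and routes are illustrative assumptions, not what Debuggo generates):

```typescript
// Sketch of the "remember and reuse" pattern in plain Playwright/TypeScript.
// Selectors and routes are illustrative; Debuggo derives these for you.
import { test, expect } from '@playwright/test';

test('create, verify, and clean up a user', async ({ page }) => {
  // Assumes baseURL is configured in playwright.config.ts.
  await page.goto('/users');
  await page.click('text=Create New User');
  await page.fill('#email', 'test.user.123@example.com');
  await page.click('text=Save');

  // Step 3: "save it as %new_user_id%" -- grab the dynamic ID
  // from the table row containing the email we just created.
  const newUserId = await page
    .locator('table tr', { hasText: 'test.user.123@example.com' })
    .locator('td')
    .first()
    .innerText();

  // Later steps reuse the remembered value.
  await page.goto(`/users/delete/${newUserId}`);
  await page.click('text=Confirm Delete');

  // Step 6: teardown verification with the same variable.
  await expect(page.locator('table')).not.toContainText(newUserId);
});
```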
What's the takeaway?
For me, this is the difference between a "toy recorder" and a professional QA tool.
A simple recorder logs: click("#btn-123"). That's brittle.
Debuggo (with AI) understands: click "Submit". That's better.
Debuggo (with AI + Variables) understands: "login to the website"... then... "save the new ID as %user_id%"... and then... "delete %user_id%".
That is a robust, maintainable automated test.
I'm building Debuggo not just to create tests quickly, but to create reliable tests that don't break every time the wind blows.
This approach is my biggest bet. And I'm very curious to see if it solves your "flaky" test pain just like it solves mine.
Try my approach. I'm actively looking for beta testers who understand this "hardcoded data" problem.