Most developers and technical marketers I see are either copying garbage code without thinking or avoiding AI entirely because "it doesn't understand my problem." Both approaches miss the point. AI won't write your app for you, but it can handle the parts of coding that make you want to quit and cry.
The ten techniques below work because they're grounded in how you actually build software. I've tested all of them, as have many of the people I learned them from.
Make AI Explain Your Code Back to You
This one changed everything for me. Instead of asking AI to fix broken code, I started explaining my code to it first and asking it to tell me what it does.
It's rubber duck debugging, but the duck actually talks back.
When you hit a bug, write out what you're trying to accomplish and what your code actually does. Then ask AI to explain your logic back to you in plain language. Nine times out of ten, when the AI restates your code, you'll spot the flaw immediately.
The Reverse Explanation Pattern:
I'm trying to calculate the total price with tax. Here's my function:
[paste your code]
Can you explain back to me, in plain English, what this function does step by step? Don't fix anything, just tell me what the code is actually doing.
When the AI says "this function multiplies the price by 0.08 and returns the result," you'll realize you forgot to add the original price back in. The bug becomes obvious because someone (or something) else is reading your intent.
I use this for almost every bug now. It catches logic errors way faster than staring at the screen for an hour.
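To make this concrete, here's a hypothetical version of that tax bug in Python. Reading the buggy function back as plain English makes the missing term obvious:

```python
# Hypothetical buggy function: the intent is "price plus tax", but an
# "explain back" reading shows it only ever computes the tax itself.
def total_with_tax_buggy(price, rate=0.08):
    return price * rate  # "multiplies price by the rate"... and stops there

# Fixed version: add the original price back in.
def total_with_tax(price, rate=0.08):
    return price + price * rate

print(total_with_tax_buggy(100))  # just the tax
print(total_with_tax(100))        # price plus tax
```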
⏱️ Time Saved: 30 minutes to 2 hours per bug
Generate Database Schemas That Actually Work
Setting up databases is one of those things that's never creative but always takes forever (at least for me). You know what you need: tables, relationships, indexes, etc. It's just annoying to write it all out.
AI can generate complete database schemas if you give it the full picture upfront. Don't just say "make me a user table." Describe your entire data model, how pieces relate, what needs to be fast, and what needs to be secure.
Schema Generation Checklist:
- Entities: List every type of thing (users, posts, comments, etc.)
- Relationships: One-to-many? Many-to-many? Be explicit
- Constraints: Required fields, unique values, foreign keys
- Performance: What queries will run most often?
- Indexing: Which columns need indexes for speed?
- Audit: Do you need created_at, updated_at, deleted_at?
For a blog platform side project, I asked for tables for users, posts, comments, and tags. Told it users write posts, posts have comments, and posts can have multiple tags. Needed it optimized for showing recent posts by author since that's the main query. Asked for PostgreSQL with proper relationships and timestamps on everything.
I got back complete table definitions with all the connections set up correctly. Ran it, worked immediately. Saved hours of looking up syntax (especially for a newbie like me).
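For a sense of shape, here's a simplified, runnable sketch of that kind of blog schema. The table and column names are my own illustration (not the generated output), and it uses SQLite in place of PostgreSQL so it runs anywhere:

```python
import sqlite3

# Illustrative blog schema: users write posts, posts have comments,
# posts can have multiple tags (many-to-many via a join table).
SCHEMA = """
CREATE TABLE users (
    id         INTEGER PRIMARY KEY,
    email      TEXT NOT NULL UNIQUE,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE posts (
    id         INTEGER PRIMARY KEY,
    author_id  INTEGER NOT NULL REFERENCES users(id),
    title      TEXT NOT NULL,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE comments (
    id         INTEGER PRIMARY KEY,
    post_id    INTEGER NOT NULL REFERENCES posts(id),
    author_id  INTEGER NOT NULL REFERENCES users(id),
    body       TEXT NOT NULL,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE tags (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL UNIQUE
);
CREATE TABLE post_tags (  -- many-to-many join table
    post_id INTEGER NOT NULL REFERENCES posts(id),
    tag_id  INTEGER NOT NULL REFERENCES tags(id),
    PRIMARY KEY (post_id, tag_id)
);
-- Index tuned for the main query: recent posts by author.
CREATE INDEX idx_posts_author_created ON posts(author_id, created_at);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
print("tables:", [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")])
```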
⏱️ Time Saved: 2 to 4 hours
Ask AI to Break Down Your Approach First
This is the opposite of what most people do. Instead of asking AI to solve your problem, ask it to help you plan your solution.
Tell AI what you're building and ask it to question your approach. What edge cases might you be missing? What's a simpler way to do this? Where might this break at scale?
The Interrogation Pattern:
I'm building [feature description]. My plan is to [your approach].
Before I start coding or working on this project, what questions should I be asking myself? What edge cases am I probably not thinking about? Is there a simpler architecture I'm overlooking?
I was planning to build a simple feature for handling file uploads on a personal project (the blog platform from before). Asked AI what could go wrong with my approach. It caught that I wasn't checking file size limits or validating file types. Would have been a security nightmare in production.
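Those two checks are cheap to add once you know to ask. A minimal hypothetical sketch:

```python
import os

# Hypothetical upload checks the AI review surfaced: cap size, allowlist
# extensions, and never trust the client-supplied filename.
MAX_BYTES = 5 * 1024 * 1024  # 5 MB cap (illustrative)
ALLOWED_EXTENSIONS = {".png", ".jpg", ".jpeg", ".gif", ".pdf"}

def validate_upload(filename, size_bytes):
    """Return a list of validation errors (empty list means OK)."""
    errors = []
    if size_bytes > MAX_BYTES:
        errors.append("file too large")
    ext = os.path.splitext(filename)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        errors.append("file type not allowed")
    if os.path.basename(filename) != filename:  # path traversal attempt
        errors.append("invalid filename")
    return errors

print(validate_upload("photo.png", 1024))                # no errors
print(validate_upload("../etc/passwd", 10 * MAX_BYTES))  # three errors
```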
Use this for any feature that feels like it might get complicated or if you always "forget" something. The AI won't always be right, but it'll force you to defend your choices, which often reveals flaws you didn't see.
⏱️ Time Saved: 4 to 8 hours (by avoiding wrong implementations)
Generate Migration Scripts Without the Headache
Database migrations are error-prone and boring. One typo in a migration and you're rolling back in production at 2am.
AI is perfect for generating migration scripts because it can be paranoid about edge cases in ways humans forget to be.
Give it your current schema and your target schema, and ask for migration scripts that handle existing data, add rollback commands, and preserve data integrity.
Migration Request Template:
Current schema: [paste current tables]
Target schema: [describe changes]
Generate migration scripts that:
- Add new columns with appropriate defaults
- Migrate existing data safely
- Create rollback scripts
- Add necessary indexes
- Handle foreign key constraints
- Work with [your DB engine]
Include warnings for any data loss scenarios.
Needed to split a users table into users and user_profiles to separate auth from profile data. Described the split and asked for both up and down migrations. AI generated scripts that created the new table, copied data over, set up foreign keys, added indexes, and included a rollback that would merge everything back. Even included warnings about potential data loss if email fields were longer than 255 characters. For best practices, see Database Rollback Strategies in DevOps.
Tested the migration on a copy of production data. Worked perfectly. Would have taken me all afternoon to write and test manually.
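Here's a heavily simplified sketch of that kind of split, with SQLite standing in for PostgreSQL so it runs anywhere (DROP COLUMN needs SQLite 3.35+). The table names and scripts are illustrative, not the generated output:

```python
import sqlite3

# Before: one users table mixing auth and profile data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (
    id INTEGER PRIMARY KEY, email TEXT NOT NULL,
    password_hash TEXT NOT NULL, display_name TEXT, bio TEXT
);
INSERT INTO users VALUES (1, 'ann@example.com', 'hash1', 'Ann', 'writes things');
""")

# Up migration: create the new table, copy data, drop the moved columns.
UP = """
CREATE TABLE user_profiles (
    user_id      INTEGER PRIMARY KEY REFERENCES users(id),
    display_name TEXT,
    bio          TEXT
);
INSERT INTO user_profiles (user_id, display_name, bio)
    SELECT id, display_name, bio FROM users;   -- copy existing data over
ALTER TABLE users DROP COLUMN display_name;
ALTER TABLE users DROP COLUMN bio;
"""

# Down migration: merge everything back and drop the new table.
DOWN = """
ALTER TABLE users ADD COLUMN display_name TEXT;
ALTER TABLE users ADD COLUMN bio TEXT;
UPDATE users SET
    display_name = (SELECT display_name FROM user_profiles WHERE user_id = users.id),
    bio          = (SELECT bio          FROM user_profiles WHERE user_id = users.id);
DROP TABLE user_profiles;
"""

conn.executescript(UP)
print(conn.execute("SELECT display_name FROM user_profiles").fetchone())
conn.executescript(DOWN)  # rollback: everything restored
print(conn.execute("SELECT display_name, bio FROM users").fetchone())
```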
⏱️ Time Saved: 1 to 3 hours
Build Multi-Step Workflows in One Shot
Once you understand individual prompting patterns, you can chain them together to generate entire features.
The trick is structuring your prompt as a sequence: models first, then business logic, then API, then UI, then tests. Each piece builds on the previous one.
Feature Blueprint Format:
Build a [feature name] with the following stack: [technologies]
Step 1 - Data Model:
[describe your entities and relationships]
Step 2 - Business Logic:
[describe the core functionality]
Step 3 - API Endpoints:
[list endpoints with methods and requirements]
Step 4 - Frontend:
[describe the UI components needed]
Step 5 - Tests:
[what needs test coverage]
Generate code for each step that works together as a cohesive feature.
I needed a content translation system. Described the setup: Streamlit for the UI, Playwright to scrape web pages, AI models for translation, Bright Data's SERP API to find relevant links to add, and Pyppeteer for simple browser actions like screenshots.
AI generated the complete pipeline in about 30 minutes of back-and-forth. The scraping worked immediately, translation was solid, and the UI had all the controls I needed. Code wasn't perfect (had to fix some error handling and rate limiting), but it was ~90% done and properly structured.
This works best for self-contained tools that don't need to integrate deeply with existing systems. For modifying production apps, it's better to do it piece by piece.
⏱️ Time Saved: 4 hours to 2 days
Let AI Write Your Documentation
Writing docs is important and nobody wants to do it. AI is actually pretty good at this if you give it the code and tell it who the audience is.
Don't ask for generic documentation. Specify the audience, the format, what needs examples, and what needs warnings.
Documentation Pattern:
Create documentation for [code/API/component] targeted at [audience].
Include:
- Brief overview of what it does
- Setup/installation steps
- Usage examples (at least 3 realistic scenarios)
- Common pitfalls and how to avoid them
- API reference (if applicable)
- Links to related docs
Format: [Markdown/Docstring/JSDoc/etc.]
Built a custom event tracking system that sends user behavior data to our analytics stack. Needed documentation so other marketers could add new events without breaking anything. Gave AI the code and asked for docs with examples of tracking button clicks, form submissions, and page views.
Got back documentation that actually made sense to non-developers. First time that's ever happened to me.
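To illustrate the output format, here's a hypothetical tracking helper documented roughly along those lines, as a docstring rather than standalone Markdown. The function and its behavior are my own illustration:

```python
def track_event(name, properties=None):
    """Queue a user-behavior event for the analytics pipeline (illustrative).

    Overview:
        Builds the event payload that gets sent downstream. Event names
        should be lowercase with underscores, e.g. "form_submit".

    Example:
        >>> track_event("button_click", {"button_id": "signup"})
        {'event': 'button_click', 'properties': {'button_id': 'signup'}}

    Common pitfalls:
        - Don't put personally identifiable information in `properties`.
        - Event names are case-sensitive: "Signup" and "signup" are
          counted as two different events.
    """
    return {"event": name, "properties": properties or {}}
```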
⏱️ Time Saved: 1 to 3 hours
Use AI for "What's Wrong With This" Reviews
When your code works but feels off, ask AI to review it like a senior engineer would. Not just for bugs, but for code smell, performance issues, security problems, and maintainability.
Code Review Prompts:
| Review Type | Prompt |
|---|---|
| Security | "Review this code for security vulnerabilities. Look for SQL injection, XSS, insecure dependencies, and auth bypass issues." |
| Performance | "Analyze this code for performance problems. Identify N+1 queries, unnecessary loops, and expensive operations." |
| Maintainability | "Review this code for maintainability. Point out complex logic, poor naming, missing error handling, and tight coupling." |
| Best Practices | "Review this against [language/framework] best practices. What patterns am I violating? What's going to confuse the next dev?" |
I had an API endpoint that worked fine in testing but felt slow. Asked AI to review it for performance and security. It caught that I was making 50 separate database calls inside a loop when I could do it in one query, spotted a missing rate limit, and found a security hole in how I was handling user input.
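The N+1 fix looks like this in miniature. A hypothetical SQLite sketch, not the actual endpoint:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL);
INSERT INTO users  VALUES (1, 'Ann'), (2, 'Bob');
INSERT INTO orders VALUES (1, 1, 10.0), (2, 1, 15.0), (3, 2, 7.5);
""")

# N+1 version: one extra query per user inside the loop.
def totals_n_plus_one():
    out = {}
    for uid, name in conn.execute("SELECT id, name FROM users"):
        total = conn.execute(
            "SELECT COALESCE(SUM(total), 0) FROM orders WHERE user_id = ?",
            (uid,)).fetchone()[0]
        out[name] = total
    return out

# Single-query version: one JOIN with GROUP BY, one round-trip.
def totals_single_query():
    rows = conn.execute("""
        SELECT u.name, COALESCE(SUM(o.total), 0)
        FROM users u LEFT JOIN orders o ON o.user_id = u.id
        GROUP BY u.id, u.name
    """)
    return dict(rows)

print(totals_n_plus_one())    # {'Ann': 25.0, 'Bob': 7.5}
print(totals_single_query())  # same result, one query
```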
⏱️ Time Saved: 2 to 6 hours (by catching issues early)
Generate Environment-Specific Configs
Dockerfiles, CI/CD configs, environment variable files: they're all important but boring. Each one has specific syntax that I often forget.
AI can generate complete config files for your stack if you tell it what you're deploying and where.
Config Generation Needs:
- Stack: Framework, database, services you're using
- Environments: Development, staging, production requirements
- Build process: Dependencies, build steps, optimization
- Testing: How tests should run in CI
- Artifacts: What gets deployed, where it goes
- Secrets: How environment variables get managed
I saw a LinkedIn post about someone needing a complete Docker setup for a Next.js app with PostgreSQL. He asked for a Dockerfile with multi-stage builds for production, docker-compose for local dev with hot reloading, and a GitHub Actions workflow that runs tests, builds the image, and pushes to AWS ECR.
And guess what? He got back everything he needed with comments explaining each step. The Dockerfile was optimized for layer caching, docker-compose had correct volume mounts, and the GitHub workflow included parallel test runs.
Disclaimer: The above section is based on someone else's story.
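For reference, here's a minimal multi-stage Dockerfile in the spirit of that request. This is a generic Next.js sketch (Node 20, default `npm` scripts assumed), not the output from the story:

```dockerfile
# Build stage: install deps and compile the Next.js app
FROM node:20-alpine AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Runtime stage: copy only what production needs (smaller image, better caching)
FROM node:20-alpine
WORKDIR /app
ENV NODE_ENV=production
COPY --from=builder /app/.next ./.next
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/package.json ./package.json
COPY --from=builder /app/public ./public
EXPOSE 3000
CMD ["npm", "start"]
```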
⏱️ Time Saved: 2 to 4 hours
Ask AI to Generate Only The "Boring" Parts
One of the best uses of AI is admitting which parts of your job are genuinely crushing your soul.
Form validation? Have AI write it. Type definitions? AI. Converting JSON to TypeScript interfaces? AI. Boilerplate CRUD endpoints? AI.
Keep the interesting problems for yourself. Let AI handle the stuff that makes you zone out.
High-Value Automation:
- Form validation with error messages
- TypeScript type definitions from JSON
- CRUD endpoints (basic create/read/update/delete)
- Data transformation utilities
- Mock data generators for testing
- Regex patterns for common validations
- Basic error handling wrappers
- Logging setup and configuration
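As one example, here's the kind of form-validation boilerplate worth delegating. A hypothetical sketch; the email regex is deliberately pragmatic rather than RFC-complete:

```python
import re

# Hypothetical AI-generated signup validator with human-readable messages.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # pragmatic, not RFC 5322

def validate_signup(form):
    """Return {field: error message} for every invalid field."""
    errors = {}
    if not EMAIL_RE.match(form.get("email", "")):
        errors["email"] = "Enter a valid email address."
    if len(form.get("password", "")) < 8:
        errors["password"] = "Password must be at least 8 characters."
    if not form.get("name", "").strip():
        errors["name"] = "Name is required."
    return errors

print(validate_signup({"email": "a@b.co", "password": "longenough", "name": "Ann"}))
print(validate_signup({"email": "nope", "password": "123"}))
```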
This is where AI really shines. Not replacing you, just doing the things you'd do anyway but faster.
⏱️ Time Saved: 15 minutes to 2 hours per task
Create Reusable Code Patterns
If you find yourself writing the same kind of code repeatedly, have AI generate a template you can reuse.
This works great for things like API error handlers, database query wrappers, authentication middleware, logging, or anything else that follows the same pattern every time.
Pattern Template Request:
Create a reusable template for [pattern type] that:
Base requirements:
- [list core functionality]
Must handle:
- [list edge cases]
Should be customizable by:
- [parameters/options]
Generate TypeScript/Python/etc. with inline comments explaining how to customize it for different use cases.
I kept writing the same code to connect to a CRM API every time we built a new automation. Asked AI to create a reusable connector that takes care of authentication, retries failed requests, and formats the data correctly. Said that I wanted detailed error messages during testing but simple ones in production.
Got back a wrapper function I now use for all integrations. Changed a few field names to match our CRM structure, but the connection logic worked perfectly.
Now whenever I need to pull CRM data, I copy the template and customize it in 2 minutes instead of fighting with API documentation.
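A stripped-down sketch of what such a template can look like. The CRM call itself is faked so the retry logic is runnable; in real use, `fetch` would wrap the actual API request, and the error types and backoff would match your client library:

```python
import time

# Hypothetical reusable connector core: retries failed requests with
# backoff, and switches between detailed errors (testing) and simple
# ones (production).
def fetch_with_retry(fetch, retries=3, delay=0.01, verbose_errors=True):
    """Call `fetch()` until it succeeds or retries are exhausted."""
    last_error = None
    for attempt in range(1, retries + 1):
        try:
            return fetch()
        except Exception as exc:        # real code: catch the client's error type
            last_error = exc
            time.sleep(delay * attempt)  # simple linear backoff
    if verbose_errors:
        raise RuntimeError(f"failed after {retries} attempts: {last_error!r}")
    raise RuntimeError("CRM request failed")

# Usage: a flaky fake endpoint that succeeds on the third call.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("timeout")
    return {"contacts": 42}

print(fetch_with_retry(flaky_fetch))  # {'contacts': 42}
```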
⏱️ Time Saved: 30 minutes to 2 hours (accumulated over multiple uses)
What This Actually Means
The pattern here is obvious: AI is best at structured, repetitive work with clear requirements. It's not good at creative problem-solving (yet).
That means your job changes, whether you're a dev or a marketer. Instead of writing every line of code, you're making architectural decisions, explaining requirements, reviewing generated code, and solving the interesting problems.
The boring parts (boilerplate, config files, CRUD, type definitions - you name it) happen faster. The creative parts (system design, user experience, performance optimization) get more of your time.
Start This Week
| When | Do This |
|---|---|
| Today | Use the "explain back" technique on your current bug |
| Tomorrow | Generate a database schema or migration script |
| This Week | Try the multi-step workflow for a small feature |
| Next Week | Review what saved you the most time and do more of that |
You don't need to use AI for everything. Use it for the parts that make sense for your workflow. If something works better the old way, keep doing it the old way. Thanks for reading!
