Stop Getting Mediocre Answers from AI Coding Assistants
Your AI coding assistant is probably smarter than you think. You're just asking the wrong way.
I spent the last 6 months using Claude, ChatGPT, and Copilot daily, and I noticed something weird: some developers get incredible help from AI, while others get generic garbage that barely compiles.
The difference? Not the AI. It's how you ask.
Here are 5 techniques that actually work.
1. Give Context Like You're Onboarding a Junior Dev
Bad prompt:
How do I add authentication to my app?
Good prompt:
I'm building a Next.js 14 app with TypeScript and Prisma.
Currently using API routes for backend. Need to add
JWT-based authentication with refresh tokens. Users already
exist in the database (User model has email/password fields).
How should I implement this?
Why it works: AI isn't psychic. It doesn't know your stack, your constraints, or what you've already tried. Give it the same context you'd give a new teammate.
Pro tip: Keep a "context template" for your project. Paste it at the start of new conversations:
Stack: Next.js 14, TypeScript, Prisma, PostgreSQL
Deployment: Vercel
Constraints: Need server-side rendering, keep bundle small
Current auth: None yet
2. Show Your Failed Attempt First
Instead of asking "how do I do X," show what you tried:
I'm trying to debounce this search input but it's not working:
[paste your code]
The search still fires on every keystroke. I tried using
setTimeout but the previous timeout isn't getting cleared.
What am I missing?
Why it works: You get a targeted fix instead of a generic tutorial. The AI can see your actual problem, not guess what it might be.
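For reference, the fix the AI usually hands back here is to store the timer and clear it before scheduling a new one. A minimal sketch (the `debounce` helper name is mine, not from the original prompt):

```typescript
// Generic debounce: delays `fn` until `waitMs` ms pass without a
// new call; each new call cancels the previously scheduled one.
function debounce<T extends unknown[]>(
  fn: (...args: T) => void,
  waitMs: number
): (...args: T) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: T) => {
    if (timer !== undefined) clearTimeout(timer); // the step that's easy to miss
    timer = setTimeout(() => fn(...args), waitMs);
  };
}

// Usage: fires the search once, 300ms after the user stops typing.
const runSearch = debounce((q: string) => console.log("searching:", q), 300);
runSearch("r");
runSearch("re");
runSearch("rea"); // only this call's query actually triggers a search
```

The bug in the broken version is almost always that the previous timer ID isn't saved anywhere, so there's nothing to pass to `clearTimeout`.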
This is how I debug now:
- Try to fix it myself (5 min)
- Paste broken code to AI with specific error
- Get solution in 30 seconds
- Move on
Saves me hours of Stack Overflow rabbit holes.
3. Ask for Explanations Like You're Five
When you get code back, don't just copy-paste it. Ask:
Explain this line by line like I'm a junior developer.
Why did you use useCallback here? What would break without it?
Real example from yesterday:
AI gave me this React optimization:
const memoizedValue = useMemo(() => computeExpensiveValue(a, b), [a, b]);
I asked "why useMemo instead of just calling the function?"
Got back: "computeExpensiveValue runs on every render. With 60fps that's 60 calls/sec even when a and b don't change. useMemo caches the result."
Now I actually understand why, not just what.
Side benefit: You'll spot when AI is wrong. It happens. Understanding the "why" is your BS detector.
4. Iterate in the Same Conversation
Don't start a fresh chat for every follow-up question. Build on what you've established.
Session example:
Me: How do I validate email input in React?
AI: [gives validation code]
Me: Now add password validation with these rules:
8+ chars, 1 number, 1 special char
AI: [adds to existing code]
Me: Show me how to display validation errors below each field
AI: [extends the solution]
Each response builds on the previous context. By message 3, you have a complete, working solution that fits together.
Reset when: You switch topics or the AI starts hallucinating. Otherwise, ride that context.
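The core of the validation logic from a session like that can be sketched framework-free. The function names and error messages below are mine, assumed for illustration; only the password rules come from the example above:

```typescript
// Email check: a pragmatic pattern, not a full RFC 5322 validator.
function validateEmail(email: string): string[] {
  const errors: string[] = [];
  if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email)) {
    errors.push("Enter a valid email address");
  }
  return errors;
}

// Password rules from the session: 8+ chars, 1 number, 1 special char.
function validatePassword(password: string): string[] {
  const errors: string[] = [];
  if (password.length < 8) errors.push("At least 8 characters");
  if (!/\d/.test(password)) errors.push("At least 1 number");
  if (!/[^A-Za-z0-9]/.test(password)) errors.push("At least 1 special character");
  return errors;
}
```

In the React version from message 3, each returned array would render as the error text below its field.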
5. Use "Rubber Duck Mode" for Complex Problems
Sometimes I'm not even sure what question to ask. That's when I rubber duck with AI:
I'm building a file upload system. Users can upload images,
and I need to:
- Validate file types
- Compress images before upload
- Show upload progress
- Store in S3
- Save metadata to database
I'm not sure where to start or what could go wrong.
What's the best architecture for this?
Why it works: By explaining the full problem (not just "how do I upload files"), you get a solution that considers the whole picture.
AI will often spot edge cases you missed:
- "What happens if upload fails halfway?"
- "How will you handle large files (>50MB)?"
- "Should compression happen client-side or server-side?"
These are questions that would've bitten you later.
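To make the first step of that upload pipeline concrete, here's a hedged sketch of the file-type and size check. The allowed-types list and the 50MB cap are my assumptions for illustration, not requirements from the prompt:

```typescript
// Assumed constraints for illustration: images only, 50MB cap.
const ALLOWED_TYPES = new Set(["image/jpeg", "image/png", "image/webp"]);
const MAX_BYTES = 50 * 1024 * 1024;

interface UploadCheck {
  ok: boolean;
  reason?: string;
}

// Validates MIME type and size before any compression or S3 upload happens.
function checkUpload(mimeType: string, sizeBytes: number): UploadCheck {
  if (!ALLOWED_TYPES.has(mimeType)) {
    return { ok: false, reason: `unsupported type: ${mimeType}` };
  }
  if (sizeBytes > MAX_BYTES) {
    return { ok: false, reason: "file exceeds 50MB limit" };
  }
  return { ok: true };
}
```

On the client these values come from `File.type` and `File.size`; run the same check again server-side, because the client can lie about both.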
The Meta-Skill: Treat AI Like a Pair Programming Partner
Here's the real unlock: stop treating AI like Google.
Google: Query → Answer → Done
AI Assistant: Problem → Discussion → Solution → Refinement
The best developers I know use AI like they'd use a senior dev sitting next to them:
- "Here's what I'm thinking, does this make sense?"
- "I'm stuck on this part, what am I missing?"
- "Is there a better way to do this?"
You wouldn't ask a teammate "how do I center a div" and walk away. You'd have a conversation.
Same with AI.
Bonus: The Copy-Paste Test
Before you paste AI code into your project, ask:
What could go wrong with this code in production?
What edge cases am I not handling?
Good AI will roast its own solution. You'll get:
- "This doesn't handle null values"
- "Memory leak if component unmounts during fetch"
- "No rate limiting on API calls"
Fix those before shipping. Your future self will thank you.
The Compound Effect
These techniques are small. Each one saves maybe 5 minutes per coding session.
But if you code daily, that's 30+ hours saved per year. Plus, better code. Plus, you actually learn instead of just copy-pasting.
I went from "AI tools are overhyped" to "can't work without them" by changing how I asked questions.
Your mileage may vary, but give these a shot for a week. You'll notice the difference.
Level Up Your AI + Dev Game
I write about AI tools, productivity hacks, and developer workflows every week. Real techniques that work, not generic "AI is amazing" content.
Subscribe to LearnAI Weekly — practical AI tips delivered to your inbox.
What's your best trick for working with AI assistants? Drop it in the comments — always looking to level up.