Midas Tools

10 Prompt Engineering Mistakes You Are Making (and How to Fix Each One)

If you use ChatGPT, Claude, or Gemini regularly, your prompts probably follow the same pattern: vague instruction, mediocre output, frustration, try again.

I built a free tool that roasts your prompts — tells you exactly what is wrong and how to fix it. But before you try it, here are the 10 most common prompt mistakes I found after analyzing hundreds of prompts.

1. The Naked Command

Bad: "Write me a blog post about AI"

Why it fails: No role, no context, no audience, no format, no length, no tone. You gave the AI nothing to work with. It will give you nothing back.

Fix: "Act as a senior tech journalist writing for a developer audience. Write a 1200-word blog post about how AI is changing code review workflows. Include 3 specific examples from real companies. Tone: practical and opinionated, not corporate."

2. No Role Assignment

Bad: "Help me with my resume"

Why it fails: A career coach, a recruiter, a hiring manager, and a resume writer would all give different advice. You did not tell the AI which expert to be.

Fix: "Act as a senior technical recruiter at a FAANG company who has reviewed 10,000+ engineering resumes. Review my resume and tell me: what would make me reject this in 6 seconds, and what would make me move it to the interview pile?"

3. Missing Context

Bad: "Make this email better"

Why it fails: Better for what audience? What is the goal? What is the relationship? A cold sales email and a follow-up to your CEO require completely different optimization.

Fix: "I am emailing a VP of Engineering at a Series B startup. We met at a conference last week. I want to schedule a demo of our dev tools platform. Make this email concise (under 100 words), warm but professional, with one clear CTA."

4. No Output Format

Bad: "Give me marketing ideas"

Why it fails: You will get a wall of text. No structure. No prioritization. No way to act on it.

Fix: "Give me 10 marketing ideas for a B2B SaaS tool targeting developers. Format as a table with columns: Idea, Effort (low/med/high), Expected Impact (low/med/high), Time to Result. Sort by impact descending."

5. Asking for Everything at Once

Bad: "Create a complete business plan for my startup"

Why it fails: Business plans have 10+ sections. Asking for everything at once guarantees shallow output on every section.

Fix: Break it into focused prompts. Start with: "Act as a startup advisor. I am building [X] for [Y audience]. Give me only the market analysis section: TAM/SAM/SOM with realistic numbers and your methodology for estimating each."

6. No Constraints

Bad: "Write me some code"

Why it fails: What language? What framework? What coding style? Should it have tests? Error handling? Comments?

Fix: "Write a Python function that validates email addresses. Use regex. Include type hints. Add docstring. Handle edge cases (empty string, None, unicode). Include 5 pytest test cases. No external dependencies."
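A prompt that specific leaves the model very little room to wander. For reference, here is roughly the shape of function it should hand back; the exact regex and helper name are illustrative, not something the prompt (or any model) is guaranteed to produce:

```python
import re
from typing import Optional

# A deliberately simple, ASCII-only pattern. Real-world email validation
# is famously messier than any single regex; this is just the shape the
# prompt's constraints ask for.
_EMAIL_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def is_valid_email(value: Optional[str]) -> bool:
    """Return True if `value` looks like a valid email address.

    Covers the edge cases the prompt names: None, empty string,
    and non-ASCII input (rejected by the ASCII-only pattern).
    """
    if not value or not isinstance(value, str):
        return False
    return _EMAIL_RE.fullmatch(value) is not None
```

Note how every constraint in the prompt (regex, type hints, docstring, edge cases, no dependencies) maps to a visible feature of the code. That is the point: constraints are not bureaucracy, they are the spec.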

7. Vague Success Criteria

Bad: "Make this better"

Why it fails: Better how? Shorter? More persuasive? More technical? Better structured? More engaging?

Fix: "Rewrite this paragraph to be: (1) half the length, (2) more specific with numbers instead of adjectives, (3) active voice only, (4) reading level: 8th grade."

8. Ignoring the Chain

Bad: Starting a brand new prompt when the AI already has context from previous messages.

Why it fails: You throw away everything the model already knows about your goal, audience, and tone, and you have to rebuild that context from scratch (or, worse, you forget to).

Fix: "Based on the marketing strategy we just discussed, now create the email sequence for segment 2 (churned users). Keep the same tone and CTA structure."

9. Not Asking for Reasoning

Bad: "Which framework should I use?"

Why it fails: You get an answer with no reasoning. You cannot evaluate if the recommendation fits your situation.

Fix: "I am choosing between Next.js, Remix, and Astro for a content-heavy marketing site with some interactive features. Compare all three on: build time, SEO, learning curve, ecosystem, and hosting costs. Recommend one with your reasoning."

10. Skipping Examples

Bad: "Write product descriptions for my store"

Why it fails: With no example to imitate, the AI has to guess at tone, length, and structure, and it guesses generic.

Fix: "Here is an example product description I like: [paste example]. Write 5 product descriptions in the same style for these products: [list]. Match the tone, length, and structure of the example."

Test Your Prompts

I built a Prompt Roaster that checks for all 10 of these mistakes. Paste in any prompt and it will:

  • Score it 0-100
  • Identify which of the 10 sins you are committing
  • Give you a specific, rewritten version

It is brutal, funny, and genuinely educational.

If you want to skip the roasting and just improve your prompt directly, the Prompt Enhancer rewrites any prompt with proper role assignment, context, format instructions, and constraints.

And if you want a quick score without the commentary, the Prompt Scorer gives you a 0-100 score with specific improvement suggestions.

All three are free, no signup.

The One-Liner Version

Good prompts have 5 things: Role (who), Context (background), Task (what), Format (how), and Constraints (boundaries).

If your prompt is missing any of those, it is leaving quality on the table.
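Because those five parts compose mechanically, you can even script the checklist. A minimal sketch in Python (the helper name and example values are mine, not part of any of the tools mentioned above):

```python
def build_prompt(role: str, context: str, task: str,
                 fmt: str, constraints: str) -> str:
    """Assemble the five components of a solid prompt into one string."""
    return "\n".join([
        f"Act as {role}.",              # Role: who the AI should be
        f"Context: {context}",           # Context: background it needs
        f"Task: {task}",                 # Task: what to actually do
        f"Format: {fmt}",                # Format: how to present it
        f"Constraints: {constraints}",   # Constraints: boundaries
    ])

prompt = build_prompt(
    role="a senior technical recruiter",
    context="I am applying for backend engineering roles",
    task="review my resume and flag anything that would get it rejected",
    fmt="a bulleted list, worst problems first",
    constraints="be blunt; no generic advice",
)
```

If you cannot fill in all five arguments, that gap is exactly what is wrong with your prompt.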


Tools: Prompt Roaster | Prompt Enhancer | Prompt Scorer | All tools at midastools.co/tools
