You've tried AI. You typed a question, got a mediocre answer, and thought: "This isn't as useful as everyone says."
The problem isn't the AI. It's the prompt.
The difference between a useless AI response and a genuinely helpful one is almost always in how you asked. This guide gives you practical frameworks for writing prompts that work — for sales emails, reports, analysis, customer support, and everything else you do at work.
No engineering background required. Just clear thinking.
Why prompts matter more than tools
Every AI tool — ChatGPT, Claude, Gemini, Copilot — is only as useful as the instructions you give it. A $20/month AI subscription with great prompts beats a $200/month subscription with vague ones.
Think of it this way: if you hired a brilliant assistant and said "write me something about marketing," you'd get something generic. But if you said "draft a 200-word LinkedIn post about how our Q1 product launch increased customer retention by 18%, aimed at B2B SaaS founders, in a conversational tone" — now they can deliver.
AI works the same way.
The RICE framework for business prompts
Every effective prompt has four components. Remember them as RICE:
R — Role
Tell the AI who to be.
- "Act as a senior financial analyst"
- "You're a customer support manager drafting a response to an upset enterprise client"
- "Act as a marketing strategist specializing in B2B SaaS"
The role sets the expertise level, vocabulary, and perspective of the response.
I — Input
Provide the specific context the AI needs.
- Paste the email you're replying to
- Include the data you want analyzed
- Share the brief, document, or background information
More context = better output. Don't make the AI guess.
C — Constraints
Define the boundaries.
- Word count: "Keep it under 150 words"
- Tone: "Professional but warm"
- Format: "Use bullet points, not paragraphs"
- Audience: "Written for a non-technical CEO"
- Exclusions: "Don't mention competitor names"
E — Example
Show what good looks like.
- "Here's an email I wrote last week that got a positive response: [paste example]"
- "The output should look like this: [paste format]"
- "Match the tone of this paragraph: [paste sample]"
Examples are the single most powerful lever for improving output quality.
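If your team ever wants to reuse RICE prompts programmatically, the four components can be assembled with simple string templating. This is a minimal sketch in Python; the function and field names are our own invention, not part of any AI tool's API, and the assembled text works as a plain prompt in any chat interface.

```python
# Assemble a RICE prompt (Role, Input, Constraints, Example) into one string.
# Any AI chat tool accepts the result as plain text.

def build_rice_prompt(role: str, input_text: str,
                      constraints: list[str], example: str) -> str:
    """Combine the four RICE components into a single prompt."""
    constraint_lines = "\n".join(f"- {c}" for c in constraints)
    return (
        f"Act as {role}.\n\n"
        f"Context:\n{input_text}\n\n"
        f"Constraints:\n{constraint_lines}\n\n"
        f"Here is an example of the output I want:\n{example}"
    )

prompt = build_rice_prompt(
    role="a senior sales rep at a B2B SaaS company",
    input_text="Meeting notes: prospect asked about onboarding time and pricing tiers.",
    constraints=["Under 150 words", "Professional, direct tone", "Include a clear next step"],
    example="Hi Dana, great talking today about your Q2 rollout...",
)
print(prompt)
```

The point of the helper is consistency: every prompt your team sends has all four components, in the same order, every time.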
Prompt templates by department
Sales: follow-up emails
Role: Senior sales rep at a B2B SaaS company
Input: [paste meeting notes or call summary]
Task: Write a follow-up email to the prospect.
Constraints:
- Under 150 words
- Reference one specific point from our conversation
- Include a clear next step with a specific date suggestion
- Professional, direct tone — no fluff
Example: [paste a follow-up email that worked well]
For more on AI-powered sales emails, see AI sales emails.
Marketing: content briefs
Role: Content marketing manager
Input: Target keyword is "[keyword]". Our audience is [description].
Task: Create a content brief for a blog post.
Constraints:
- Include suggested title, 5-7 subheadings, target word count
- Focus on practical advice, not theory
- Include 3 internal link opportunities
- Specify the search intent this article should satisfy
Support: customer responses
Role: Customer support specialist at [company]
Input: Customer message: "[paste message]"
Task: Draft a response that resolves their issue.
Constraints:
- Empathetic but efficient — acknowledge the problem, then solve it
- Under 100 words
- Include specific steps the customer should take
- End with a way to follow up if the issue persists
Analysis: data interpretation
Role: Business analyst
Input: [paste data or describe the dataset]
Task: Analyze this data and provide insights.
Constraints:
- Lead with the 3 most important findings
- Include specific numbers, not vague trends
- Flag any data quality issues you notice
- Suggest 2-3 actions based on the findings
Format: Numbered list with brief explanations
Operations: meeting summaries
Role: Executive assistant
Input: [paste meeting transcript or notes]
Task: Create a meeting summary.
Format:
- Decisions made (bulleted)
- Action items (with owner and deadline)
- Key discussion points (3-5 sentences max)
- Open questions for follow-up
Constraint: Total summary under 300 words
Five techniques that immediately improve results
1. Chain your prompts
Don't ask the AI to do everything in one shot. Break complex tasks into steps:
- "Analyze this customer feedback data and categorize the top 5 themes"
- "For each theme, draft a one-paragraph summary with specific quotes"
- "Based on these themes, recommend 3 product improvements with estimated impact"
Each step builds on the previous one, and you can correct course between steps.
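The same chaining idea applies if you script your prompts. The sketch below passes each step's output into the next step's prompt; `ask_ai` is a placeholder stub, not a real library call — substitute whatever API or tool your team actually uses.

```python
# Prompt chaining: each step's output becomes part of the next step's input.
# ask_ai() is a stand-in stub -- replace it with your AI tool's API call.

def ask_ai(prompt: str) -> str:
    """Placeholder for a real AI call (e.g., an API request)."""
    return f"[AI response to: {prompt[:40]}...]"

feedback_data = "Raw customer feedback goes here."

# Step 1: categorize themes
themes = ask_ai(
    f"Analyze this customer feedback and categorize the top 5 themes:\n{feedback_data}"
)

# Step 2: summarize each theme, building on step 1's output
summaries = ask_ai(
    f"For each theme below, draft a one-paragraph summary with specific quotes:\n{themes}"
)

# Step 3: recommend actions based on step 2's output
recommendations = ask_ai(
    f"Based on these summaries, recommend 3 product improvements:\n{summaries}"
)
```

Because each step is a separate call, you can inspect and correct the intermediate output before the next step runs — the same course-correction you'd do in a chat window.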
2. Use "before you answer" instructions
Tell the AI what to think about before responding:
"Before you write this email, consider: What is the recipient's biggest concern? What objection might they have? What would make them say yes?"
This forces more thoughtful output.
3. Ask for multiple versions
"Give me three versions of this headline: one provocative, one straightforward, one question-based."
Generating multiple options costs the AI almost nothing, and comparing them gives you better material to work with.
4. Specify what to avoid
"Don't use jargon. Don't start with 'In today's fast-paced world.' Don't use the word 'leverage.'"
Telling the AI what not to do is often as important as telling it what to do.
5. Iterate, don't restart
If the first response is 70% right, don't start over. Say:
"Good start. Make these changes: [specific edits]. Keep everything else the same."
Iteration is faster and more effective than rewriting your prompt from scratch.
Building a team prompt library
The biggest productivity multiplier isn't individual prompt skill — it's shared prompts across your team.
Step 1: Collect what works. When someone writes a prompt that produces great results, save it.
Step 2: Standardize templates. Create templates for your 10 most common tasks — emails, reports, summaries, analysis, responses.
Step 3: Share and iterate. Put templates in a shared doc (Notion, Google Docs, your wiki). Let team members suggest improvements.
Step 4: Include examples. Each template should include 1-2 examples of great output. Examples teach better than instructions.
For more on AI-powered content workflows, see AI content creation and AI writing assistant: keep your voice.
Common prompting mistakes
Being too vague. "Help me with marketing" → useless. "Draft 3 LinkedIn post hooks for our new expense tracking feature, aimed at finance directors" → useful.
Not providing context. The AI doesn't know your company, your customers, or your situation unless you tell it. Paste in relevant context every time.
Accepting the first output. Treat the first response as a draft. Push back. Ask for changes. The second or third version is almost always better.
Overcomplicating prompts. If your prompt is 500 words, you're probably asking for too much at once. Split it into steps.
Ignoring the AI's strengths. AI is great at first drafts, brainstorming, analysis, and reformatting. It's weaker at strategy, judgment calls, and anything that requires real-world context it doesn't have. Play to its strengths.
What's next
Prompt engineering isn't a one-time skill. It gets better with practice. Start with the RICE framework, use the templates above for your most common tasks, and build a team library over time.
The goal isn't to become a "prompt engineer." The goal is to communicate clearly enough with AI tools that they save you real time on real work. Every week, you'll get a little better at it — and the AI tools are getting better too.
Originally published on Superdots.