DEV Community

Odinaka Joy
Odinaka Joy

Posted on • Edited on

How To Use LLMs: Prompt Engineering - A Practical Guide for Beginners

In this article, I explained what LLMs are and how to use them to build smart applications.
But just using an LLM isn’t enough. We need to communicate with it clearly and strategically.

That’s where Prompt Engineering comes in.


💡 What is Prompt Engineering?

Prompts are instructions and context (clear and structured inputs) provided to a language model for a certain task.

Prompt engineering is the practice of crafting and refining prompts so a language model can generate outputs that are useful, accurate, and relevant.

These LLMs are very powerful assistants but need smart instructions to produce quality results and the quality of what we get depends on how we ask.

Why it matters:

  • It saves time and frustration
  • It reduces irrelevant or wrong answers
  • It unlocks advanced LLM capabilities like reasoning, coding, creativity
  • It is an essential skill for developers building AI-powered apps 😊

📌 Understanding How LLMs Respond to Prompts

Before writing prompts, know that LLMs:

  1. Predict the next token based on probability - They don’t understand like humans. They pattern-match words to generate content.
  2. Rely heavily on context - The quality of your input determines the quality of its output. So, Better prompts = Better outputs.
  3. Don’t know your intent unless you tell them - LLMs don’t read our minds, ambiguity leads to confusion. If you are vague, the model will be vague 😄.

📌 Hierarchy of Instructions

When you give an LLM instructions, they are not all treated equally. There’s a hierarchy that decides which rules the model follows first and which ones get ignored in the case of conflicts.

1. System Instructions (Highest Priority) - Set by the model provider (e.g., OpenAI, Anthropic) and invisible to the user. It define core behavior, safety rules, and identity. It cannot be overridden.

2. Developer Instructions - Set by the app developer through API or integration. It control tone, style, and behavior for a specific app.

3. User Instructions – Direct requests from the person interacting with the model. It can override some developer rules but never system rules.

4. Contextual/Embedded Instructions (Lowest Priority) – Found in documents, chat history, or examples. It is the weakest in priority and easily overridden.


📌 Core Components of a Good Prompt

A good prompt often has 3 parts:

Component Purpose Example
Role/Context Tell the model who it is or what perspective to take "You are a professional backend engineer…"
Task/Goal The exact thing you want done "Explain microservices in simple terms."
Format/Constraints How you want the output delivered "Use bullet points, under 200 words."

Example:

You are a career coach with 10 years experience.  
Explain to a fresh graduate how to prepare for a software engineering interview.  
Give me 5 bullet points and a short motivational ending.
Enter fullscreen mode Exit fullscreen mode

📌 Basic Prompting Techniques (With Examples)

Prompting techniques are styles or strategies for writing prompts.

1. Zero-shot prompting
Ask the model to perform a task with no example, just instructions.

Translate this sentence to French: "I love programming."
Enter fullscreen mode Exit fullscreen mode

Use this when the task is simple and clear.

2. One-shot prompting
Give one example before asking the model to perform the same task again.

Translate this sentence to French:
English: "I love cats."
French: "J'aime les chats."

English: "I love programming."
French:
Enter fullscreen mode Exit fullscreen mode

This is good for moderately complex tasks where one example helps show the pattern.

3. Few-shot prompting
Provide a few examples to help the model understand the expected format or logic. Between 5 to 8 is ideal according to reseaarch.

Convert the following to a formal business email tone:

Casual: "Need the report by tomorrow."
Formal: "Kindly ensure the report is ready by tomorrow."

Casual: "Can't make the meeting."
Formal: "Unfortunately, I won’t be able to attend the meeting."

Casual: "What's the update on the task?"
Formal:
Enter fullscreen mode Exit fullscreen mode

This is great when:

  • We need consistency.
  • The task involves writing style.
  • We want the model to follow a specific structure.

4. Chain-of-thought prompting
Ask the model to follow reasoning steps before answering by adding think step by step to the prompt.

Question: If Sarah has 3 apples and buys 4 more, then gives 2 to her friend, how many apples does she have?

Answer: Let's think step by step.
Enter fullscreen mode Exit fullscreen mode

This is best for tasks that require reasoning, calculation, or logic.

5. Role prompting
Give the model a role or identity to respond from.

You are a senior software engineer. Explain the difference between GraphQL and REST to a junior developer.
Enter fullscreen mode Exit fullscreen mode

This is perfect for:

  • Customer support bots
  • Teaching/educational apps
  • Task-specific assistants like lawyer, doctor, manager

There are more advanced prompting techniques and many more emerging as research continues but I will cover those in a separate post.


📌 Practical Tips for Better Prompts

1️⃣ Be clear and specific, not vague.

  • ❌ "Tell me about AI."
  • ✅ "Explain AI in under 150 words for a 10-year-old."

2️⃣ Break down complex requests

  • ❌ "Write me a business plan for a bakery"
  • ✅ "List 5 business model options for a bakery" 👉🏼 "Write an executive summary for model #3"

3️⃣ Use iteration: Your first prompt is rarely perfect. Tweak, re-run, and refine.

4️⃣ Set output boundaries

  • Word count (under 150 words)
  • Style (formal, casual, humorous)
  • Language tone (beginner-friendly, expert-level)

5️⃣ Use bullet points or steps if possible.

6️⃣ Provide examples if the task has a pattern.

7️⃣ Use delimiters like """ to separate instructions from data.

8️⃣ Use XML tags like <article>...<article> to group data within the instruction.


📌 How to Choose the Right Prompting Technique

Use Case Suggested Prompt Type
Simple data transformation Zero-shot
Text classification Few-shot
Reasoning tasks Chain-of-thought
Needs personality or tone Role prompt
New use cases, no examples Zero-shot + Instructions
Task where examples help Few-shot or One-shot

 

📌 Prompt Engineering in Real Projects

  • Chatbots: Role prompts + output format for consistent replies
  • Content Generation: Few-shot prompts for tone consistency
  • Code Assistants: Chain-of-thought for debugging explanations
  • Data Extraction: Instruction-based prompts returning JSON

📌 Real Example: Job Description Analyzer

I built a project called Job Application Assistant, which helps users understand and respond to job listings. Before I integrated Function Calling, I used Prompting techniques with the OpenAI API to extract structured data from job descriptions.

Here’s how I did it using a combination of Few-shot and Role-based prompting:

const jobDescriptionExample = "We need a frontend developer skilled in React, JavaScript, and TailwindCSS. You will build UIs and collaborate with backend teams. 2+ years experience required.";

const response = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [
    {
      role: "system",
      content:
        "You are an AI assistant that extracts key skills, responsibilities, and experience from job descriptions.",
    },
    {
      role: "user",
      content: `Extract the following from this job description:\n
        1. Required Skills  
        2. Responsibilities  
        3. Required Experience\n\n${jobDescriptionExample}`,
    },
  ],
  max_tokens: 200,
});

return response.choices[0]?.message?.content?.trim() || "";
Enter fullscreen mode Exit fullscreen mode

Sample output:

Skills: React, JavaScript, TailwindCSS  
Responsibilities: Build UIs, collaborate with backend  
Experience: 2+ years
Enter fullscreen mode Exit fullscreen mode

📌 Some sample projects that illustrates Prompt Engineering

  1. Soul Sync - A safe space where users can check in emotionally, express themselves, and receive gentle, AI-powered guidance that helps them reconnect with their inner self.
  2. Therabot - A web app where users can chat with an AI-powered therapist for emotional support.

I also use this guide when prompting LLMs.

Prompting is about clear communication, iteration, and testing.

The more intentional your prompt, the more reliable your LLM becomes.

Happy coding!!!

Top comments (17)

Collapse
 
smoothstay profile image
SmoothStay

Interesting. Thanks for sharing!

Collapse
 
tulsi_shuklag_d50e39ef8e profile image
Info Comment hidden by post author - thread only accessible via permalink
Tulsi Shukla

can i have meeting with you about career suggestion i need ur suggestion

Collapse
 
Sloan, the sloth mascot
Comment deleted
Collapse
 
tulsi_shuklag_d50e39ef8e profile image
Tulsi Shukla • Edited

yeah mam when u free thanks

Thread Thread
 
dinakajoy profile image
Odinaka Joy
Thread Thread
 
Sloan, the sloth mascot
Comment deleted
 
dinakajoy profile image
Odinaka Joy

Are you joining again?

Thread Thread
 
tulsi_shuklag_d50e39ef8e profile image
Tulsi Shukla • Edited

i comment on linkdin u see that

Thread Thread
 
tulsi_shuklag_d50e39ef8e profile image
Tulsi Shukla

mam i comment on linkdin u see that

Thread Thread
 
tulsi_shuklag_d50e39ef8e profile image
Tulsi Shukla

voice is not clear i hardly listen ur suggestion

Thread Thread
 
Sloan, the sloth mascot
Comment deleted
 
dinakajoy profile image
Odinaka Joy • Edited

Apologies, I waited but did not get a response from you.
I will chat with you on LinkedIn tomorrow please.
I will be stepping out soon

Thread Thread
 
tulsi_shuklag_d50e39ef8e profile image
Tulsi Shukla

when we meet

Some comments may only be visible to logged-in visitors. Sign in to view all comments. Some comments have been hidden by the post's author - find out more