In Casino Royale (2006), James Bond specifies the precise ingredients for his Vesper martini: three measures of Gordon's, one of vodka, half a measure of Kina Lillet, shaken over ice and served with a thin slice of lemon peel.
By detailing the components, Bond ensures a superior drink. Similarly, when interacting with large language models (LLMs), the specificity and clarity of your prompts determine the quality of the output.
This process, known as prompt engineering, is essential for obtaining accurate and relevant responses from AI systems.
What is Prompt Engineering?
Prompt engineering involves crafting precise instructions to optimize LLM responses. The more detailed and specific the input prompt, the more relevant and aligned the output will be with your needs.
Example:
If you ask an LLM, “Summarize prompt engineering,” the response may be vague. However, if you specify the request like this:
Prompt: “Give a 100-word summary of prompt engineering, aimed at non-technical users.”
The LLM is now guided to give a concise, audience-specific answer.
Output:
"Prompt engineering is crafting effective instructions for AI systems like ChatGPT to get the best results. It involves being specific, providing context, and breaking down complex questions. By specifying formats, providing examples, or setting boundaries, prompt engineering ensures clear communication with an AI assistant."
This method of asking for specific outputs demonstrates how to "engineer" a prompt to get a desired, precise result.
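For developers, the same idea in code: the API call is identical in both cases, and only the prompt string changes. The sketch below is a minimal illustration, not part of the original article; it assumes an OpenAI-style chat-completions endpoint, a model name of gpt-4o-mini, and an API key in an OPENAI_API_KEY environment variable, so adapt those details to whichever provider or SDK you actually use.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class PromptDemo
{
    // Sends a single user prompt and returns the raw JSON response.
    // Endpoint, model name, and payload shape follow the OpenAI-style
    // chat-completions API and are assumptions; adjust for your provider.
    static async Task<string> CompleteAsync(HttpClient http, string prompt)
    {
        var payload = JsonSerializer.Serialize(new
        {
            model = "gpt-4o-mini", // assumed model name
            messages = new[] { new { role = "user", content = prompt } }
        });

        var response = await http.PostAsync(
            "https://api.openai.com/v1/chat/completions",
            new StringContent(payload, Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();

        // In practice you would parse choices[0].message.content out of this JSON.
        return await response.Content.ReadAsStringAsync();
    }

    static async Task Main()
    {
        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Bearer", Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

        // Same request both times; only the prompt text changes.
        Console.WriteLine(await CompleteAsync(http, "Summarize prompt engineering"));
        Console.WriteLine(await CompleteAsync(http,
            "Give a 100-word summary of prompt engineering, aimed at non-technical users."));
    }
}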
Components of a Good LLM Prompt
- Context: The context provides the LLM with additional information that helps it understand your request more fully. By embedding relevant context, the AI can generate more accurate and tailored responses.
Example:
Suppose you are working on a C# project and want to modify a class definition. The context here could be a code snippet.
Prompt with Context:
"Given this C# code:
public class User { public int UserId { get; set; } public string Name { get; set; } public string Email { get; set; } public string PhoneNumber { get; set; } }
Modify the class to make UserId and Name read-only and set them in the constructor."
Output:
public class User
{
    public int UserId { get; }
    public string Name { get; }
    public string Email { get; set; }
    public string PhoneNumber { get; set; }

    public User(int userId, string name)
    {
        UserId = userId;
        Name = name;
    }
}
This example illustrates the importance of providing relevant context to guide the LLM in generating the correct response.
- User Question: The question is the main part of the prompt. It should be single-purpose, specific, and concise.
Example:
If you want to create a user class in C# with certain fields, specify the required fields and behavior clearly.
Vague Question:
"Create a user class."
Specific Question:
"Create a C# user class with fields: UserId
, Name
, PhoneNumber
. Make UserId
read-only and add a constructor to set these fields."
Output:
public class User
{
    public int UserId { get; }
    public string Name { get; set; }
    public string PhoneNumber { get; set; }

    public User(int userId, string name, string phoneNumber)
    {
        UserId = userId;
        Name = name;
        PhoneNumber = phoneNumber;
    }
}
- Output Guidance: You can guide the model’s output by providing examples of the format you want; a short sketch at the end of this section shows how all three components combine into a single prompt.
Example:
If you need to generate dummy data for a User class, provide an example of what the data should look like.
Prompt with Examples:
"Generate 5 instances of the User
class with these fields: UserId
, Name
, Email
, PhoneNumber
. Use the following format for examples:
var user1 = new User(1, "John Doe", "john.doe@example.com", "555-555-5555");
Here are some examples:
var user2 = new User(2, "Jane Doe", "jane.doe@example.com", "555-555-5556");
Now generate 5 instances."
Output:
var user1 = new User(1, "John Smith", "john.smith@example.com", "555-555-5555");
var user2 = new User(2, "Jane Doe", "jane.doe@example.com", "555-555-5556");
var user3 = new User(3, "Mary Johnson", "mary.johnson@example.com", "555-555-5557");
var user4 = new User(4, "David Lee", "david.lee@example.com", "555-555-5558");
var user5 = new User(5, "Linda White", "linda.white@example.com", "555-555-5559");
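Putting the three components together, here is a minimal sketch, not from the original article, of how context, a single-purpose question, and output guidance can be assembled into one prompt string before it is sent to an LLM. The BuildPrompt helper and the sample strings are illustrative assumptions; the class snippet mirrors the User examples above.
using System;
using System.Text;

class PromptBuilderDemo
{
    // Combines the three components of a good prompt into one string.
    static string BuildPrompt(string context, string question, string outputGuidance)
    {
        var sb = new StringBuilder();
        sb.AppendLine("Given this C# code:");                  // context
        sb.AppendLine(context);
        sb.AppendLine(question);                               // single-purpose, specific question
        sb.AppendLine("Use the following format for the output:");
        sb.AppendLine(outputGuidance);                         // output guidance / example format
        return sb.ToString();
    }

    static void Main()
    {
        var context = "public class User { public int UserId { get; set; } public string Name { get; set; } }";
        var question = "Generate 3 instances of the User class as C# object-creation statements.";
        var guidance = "var user1 = new User { UserId = 1, Name = \"John Doe\" };";

        Console.WriteLine(BuildPrompt(context, question, guidance));
    }
}
Keeping the pieces separate like this makes it easy to swap in a different code snippet or output format without rewriting the whole prompt.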
Techniques for Effective Prompt Engineering
- Zero-shot Prompting: The LLM generates a response based on its training data without explicit examples. This is effective for generating generic solutions or answers based on established patterns.
Example:
Prompt: "Create a unit test for the User
class using xUnit in C#."
Output:
public class UserTests
{
    [Fact]
    public void UserConstructor_SetsProperties()
    {
        // Arrange
        var user = new User(1, "John Doe", "john.doe@example.com", "555-555-5555");

        // Act & Assert
        Assert.Equal(1, user.UserId);
        Assert.Equal("John Doe", user.Name);
        Assert.Equal("john.doe@example.com", user.Email);
        Assert.Equal("555-555-5555", user.PhoneNumber);
    }
}
- Few-shot Prompting: Provide several examples to guide the model toward the desired output format; the sketch after this example shows how such example lines can be built programmatically.
Example:
To generate data for a User class, you might use a few example data points to guide the output format.
Prompt with Few-shot Examples:
"Here are some instances of the User
class:
var user1 = new User(1, "John Smith", "john.smith@example.com", "555-555-5555");
var user2 = new User(2, "Jane Doe", "jane.doe@example.com", "555-555-5556");
Now create 3 more instances following the same pattern."
Output:
var user3 = new User(3, "Alice Brown", "alice.brown@example.com", "555-555-5557");
var user4 = new User(4, "Bob Green", "bob.green@example.com", "555-555-5558");
var user5 = new User(5, "Charlie White", "charlie.white@example.com", "555-555-5559");
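The example lines above were written by hand, but in practice few-shot examples are often generated from real data. Below is a minimal sketch, not from the original article, that renders a couple of hypothetical User records in the exact format the model should imitate and appends the "now create N more" instruction; the UserExample record and BuildFewShotPrompt helper are assumptions for illustration.
using System;
using System.Collections.Generic;
using System.Linq;

class FewShotPromptDemo
{
    // Hypothetical record holding the data behind each few-shot example.
    record UserExample(int UserId, string Name, string Email, string PhoneNumber);

    static string BuildFewShotPrompt(IEnumerable<UserExample> examples, int howManyMore)
    {
        // Render each record in the exact output format the model should imitate.
        var exampleLines = examples.Select((u, i) =>
            $"var user{i + 1} = new User({u.UserId}, \"{u.Name}\", \"{u.Email}\", \"{u.PhoneNumber}\");");

        return "Here are some instances of the User class:" + Environment.NewLine
             + string.Join(Environment.NewLine, exampleLines) + Environment.NewLine
             + $"Now create {howManyMore} more instances following the same pattern.";
    }

    static void Main()
    {
        var examples = new[]
        {
            new UserExample(1, "John Smith", "john.smith@example.com", "555-555-5555"),
            new UserExample(2, "Jane Doe", "jane.doe@example.com", "555-555-5556")
        };
        Console.WriteLine(BuildFewShotPrompt(examples, 3));
    }
}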
- Prompt Chaining: Iteratively refine your queries based on previous responses, allowing the model to build on earlier interactions; the sketch after this example shows the same pattern expressed in code.
Example:
Start with a simple prompt and progressively modify it.
First Prompt:
"Create a basic User class in Python."
Output:
class User:
    def __init__(self, username, email, password):
        self.username = username
        self.email = email
        self.password = password
Follow-up Prompt:
"Convert this class to C#."
Output:
public class User
{
    public string Username { get; set; }
    public string Email { get; set; }
    public string Password { get; set; }
}
Final Prompt:
"Make the Password
property private, and add a DateTime CreatedAt
property initialized in the constructor."
Output:
public class User
{
    public string Username { get; set; }
    public string Email { get; set; }
    private string Password { get; set; }
    public DateTime CreatedAt { get; private set; }

    public User(string username, string email, string password)
    {
        Username = username;
        Email = email;
        Password = password;
        CreatedAt = DateTime.Now;
    }
}
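Behind the scenes, prompt chaining usually means re-sending the earlier prompts and the model's earlier answers along with each new request, so the model can build on them. The sketch below is an illustrative assumption rather than an API from the article: ILlmClient stands in for whichever chat-completion client you actually use.
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical abstraction over a chat-completion API; replace with your real client.
public interface ILlmClient
{
    // role is "user" or "assistant"; returns the assistant's reply to the whole history.
    Task<string> ChatAsync(IReadOnlyList<(string Role, string Content)> messages);
}

public class PromptChain
{
    private readonly ILlmClient _client;
    private readonly List<(string Role, string Content)> _history = new();

    public PromptChain(ILlmClient client) => _client = client;

    // Each call sends the full history plus the new prompt, then records the reply.
    public async Task<string> AskAsync(string prompt)
    {
        _history.Add(("user", prompt));
        var reply = await _client.ChatAsync(_history);
        _history.Add(("assistant", reply));
        return reply;
    }
}

// Usage, mirroring the chain above:
//   var chain = new PromptChain(client);
//   await chain.AskAsync("Create a basic User class in Python.");
//   await chain.AskAsync("Convert this class to C#.");
//   await chain.AskAsync("Make the Password property private, and add a DateTime CreatedAt property initialized in the constructor.");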
- Chain-of-Thought Prompting: Provide multi-step instructions, helping the LLM approach a complex problem step-by-step.
Example:
Prompt:
*"Create a set of unit tests for the following C# class:
public class User { public int UserId { get; set; } public string Name { get; set; } public string PhoneNumber { get; set; } }
Think step-by-step:
- Identify key scenarios to test.
- Write unit tests using xUnit.
- Consider edge cases."*
Output:
public class UserTests
{
    [Fact]
    public void UserProperties_CanBeSetAndRead()
    {
        // Arrange
        var user = new User { UserId = 1, Name = "John Doe", PhoneNumber = "555-555-5555" };

        // Act & Assert
        Assert.Equal(1, user.UserId);
        Assert.Equal("John Doe", user.Name);
        Assert.Equal("555-555-5555", user.PhoneNumber);
    }

    [Theory]
    [InlineData("555-555-5555")]
    [InlineData("123-456-7890")]
    public void UserPhoneNumber_StoresTheValueItWasGiven(string phoneNumber)
    {
        // Arrange
        var user = new User { UserId = 2, Name = "Jane Doe", PhoneNumber = phoneNumber };

        // Act & Assert
        Assert.Equal(phoneNumber, user.PhoneNumber);
    }

    [Fact]
    public void UserPhoneNumber_CanBeNull()
    {
        // Arrange: the class performs no validation, so a missing phone number is an allowed edge case
        var user = new User { UserId = 3, Name = "Invalid User", PhoneNumber = null };

        // Act & Assert
        Assert.Null(user.PhoneNumber);
    }
}
The article was originally written by Jim, Head of DevRel at Pieces for Developers. You can find more examples and nuances in the original article: https://pieces.app/blog/llm-prompt-engineering