Cris Crawford

Generating a result with a context

In video 1.4 of the llm-zoomcamp, we start by reviewing what happens when we ask the LLM a question without any context: we get a generic answer that isn't helpful.

from openai import OpenAI

client = OpenAI()

# q is the question from the previous post, e.g. "the course has already started, can I still enroll?"
response = client.chat.completions.create(
    model='gpt-4o',
    messages=[{"role": "user", "content": q}]
)

response.choices[0].message.content

"Whether you can still enroll in a course that has already started typically depends on the policies of the institution offering the course. Here are a few steps you can take:\n\n1. **Check the Course Enrollment Deadline:** Look for any specific deadlines mentioned on the institution's website or contact the admissions office to see if late enrollment is allowed.\n\n2. **Contact the Instructor:** Reach out to the course instructor directly. They might allow late entries if you're able to catch up on missed material.\n\n3. **Administrative Approval:** Some institutions require approval from the department or academic advisor for late enrollment.\n\n4. **Online Courses:** If it's an online course, there may be more flexibility with start dates, so check if you can still join and catch up at your own pace.\n\n5. **Catch-Up Plan:** Be prepared to ask about what materials you've missed and how you can make up for lost time. Showing a willingness to catch up might increase your chances of being allowed to enroll.\n\nEach institution has its own policies, so it's best to inquire directly with the relevant parties at your school."

Next, I created a prompt template. The prompt doesn't have to be worded exactly like this; writing a prompt is something of an art. Later we'll learn how to evaluate prompts with metrics to determine how good they are, but for now this is what I used.

prompt_template = """
You're a course teaching assistant. Answer the QUESTION based on the CONTEXT from the FAQ database.
Use only the facts from the CONTEXT when answering the question.
If the CONTEXT doesn't contain the answer, output NONE

QUESTION: {question}

CONTEXT: {context}  
""".strip()

Now I put what I got from the search engine into the context.

context = ""

for doc in results:
    context = context + f"section: {doc['section']}\nquestion: {doc['question']}\nanswer: {doc['text']}\n\n"
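For reference, results came from the search call in the previous post. Here's a minimal sketch of what that might look like, assuming the minsearch-style index built there; the filter and boost values are illustrative, so adjust them to whatever search engine you actually set up.

# Sketch only: assumes the index from the previous post
results = index.search(
    query=q,
    filter_dict={'course': 'data-engineering-zoomcamp'},
    boost_dict={'question': 3.0, 'section': 0.5},
    num_results=5
)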

Finally, I put the context and the question into the prompt_template and asked ChatGPT the question.

# Fill in the template with the question and the retrieved context
prompt = prompt_template.format(question=q, context=context).strip()

# Send the augmented prompt to the LLM
response = client.chat.completions.create(
    model='gpt-4o',
    messages=[{"role": "user", "content": prompt}]
)

response.choices[0].message.content

"Yes, even if you don't register, you're still eligible to submit the homeworks. Be aware, however, that there will be deadlines for turning in the final projects. So don't leave everything for the last minute."

Now the answer is relevant to the course. This is Retrieval-Augmented Generation, or RAG: the search engine retrieves relevant documents, and the LLM generates an answer grounded in them.
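Putting it all together, here's a minimal sketch of the whole flow as a single function. The search() helper is a placeholder for whatever retrieval step you set up in the previous post; it's an assumption, not code from this post.

def rag(query):
    # Retrieve: get relevant FAQ documents (placeholder search() helper)
    results = search(query)

    # Augment: build the context block and fill in the prompt template
    context = ""
    for doc in results:
        context += f"section: {doc['section']}\nquestion: {doc['question']}\nanswer: {doc['text']}\n\n"
    prompt = prompt_template.format(question=query, context=context).strip()

    # Generate: ask the LLM with the augmented prompt
    response = client.chat.completions.create(
        model='gpt-4o',
        messages=[{"role": "user", "content": prompt}]
    )
    return response.choices[0].message.content

# Example usage:
# rag("the course has already started, can I still enroll?")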

Previous post: Setting up the database and search for RAG
Next post: Swapping in elasticsearch to the proto-OLIVER
