I finished another short free course from DeepLearning.AI, this time LangChain for LLM Application Development.
In this short course, you learn about the common components of LangChain, including models, prompts, memory, and chains. You also learn about agents, a new type of end-to-end use case that uses the model as a reasoning engine.
After completing this course, you will be able to quickly put together some applications using LangChain.
The lessons of the course are:
1. Introduction
This lesson introduces the course and gives a short overview of LangChain. It is led by experts Harrison Chase (one of the creators of LangChain) and Andrew Ng.
~ 3 minutes long
2. Models, Prompts and Parsers
Here Andrew walks us through how to work with models, prompts, and output parsing in LangChain: models represent the language model to use, prompts are how you ask questions of the language model, and parsers are how you extract information from the output. This is demonstrated with LangChain code that translates a customer email from pirate English into polite American English. A lot of emphasis is put on prompt templates, a useful abstraction that is worth reusing when you can.
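The prompt-template idea can be sketched in plain Python. This mirrors the concept rather than LangChain's actual `ChatPromptTemplate` API, and the style and email text below are made up for illustration:

```python
# A minimal stand-in for a reusable prompt template: the template text is
# written once and re-filled with different variables on each use.
class SimplePromptTemplate:
    def __init__(self, template):
        self.template = template

    def format(self, **kwargs):
        return self.template.format(**kwargs)

translate_template = SimplePromptTemplate(
    "Translate the text delimited by triple dashes into a style that is "
    "{style}. text: ---{text}---"
)

prompt = translate_template.format(
    style="polite American English",
    text="Arrr, me coffee machine be broken!",
)
print(prompt)
```

The point of the abstraction is that the same template can be reused with a different `style` or `text` without rewriting the instructions each time.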
~ 18 minutes long
3. Memory
The lesson covers the basics of memory in LangChain and LLMs, which is used to remember previous parts of a conversation and feed that into a language model so that it can have a conversational flow as you interact with it.
LangChain offers multiple options for managing memory, including:
- ConversationBufferMemory: This memory stores messages and then extracts them into a variable.
- ConversationBufferWindowMemory: This memory keeps a list of the interactions of the conversation over time, but only uses the last K interactions.
- ConversationTokenBufferMemory: This memory keeps a buffer of recent interactions and uses token length, rather than the number of interactions, to determine when to flush interactions.
- ConversationSummaryBufferMemory: This memory creates a summary of the conversation over time.
There are additional memory types:
- Vector data memory: Stores text (from conversation or elsewhere) in a vector database and retrieves the most relevant blocks of text.
- Entity memories: Using an LLM, it remembers details about specific entities.
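The windowing idea behind ConversationBufferWindowMemory can be sketched in a few lines of plain Python. This illustrates the mechanism only; LangChain's real class has a different interface:

```python
# Conceptual sketch of a buffer-window memory: the full conversation is
# stored, but only the last k exchanges are fed back to the model.
class BufferWindowMemory:
    def __init__(self, k):
        self.k = k
        self.history = []  # list of (human, ai) exchange tuples

    def save_context(self, human, ai):
        self.history.append((human, ai))

    def load(self):
        # Only the last k exchanges are included in the prompt context.
        recent = self.history[-self.k:]
        return "\n".join(f"Human: {h}\nAI: {a}" for h, a in recent)

memory = BufferWindowMemory(k=1)
memory.save_context("Hi, my name is Andrew", "Hello Andrew!")
memory.save_context("What is 1+1?", "1+1 is 2")
print(memory.load())  # with k=1, only the most recent exchange survives
```

Swapping the window rule for a token count gives you the ConversationTokenBufferMemory idea, and replacing the buffer with a running summary gives you ConversationSummaryBufferMemory.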
~ 17 minutes long
4. Chains
This lesson teaches you about chains, one of the most important building blocks in LangChain (it's even in the name). Chains can be used to combine an LLM with a prompt, or to combine multiple chains together.
Here are the different types of chains covered in the lesson:
- Simple chain
- Sequential chain
- Router chain
- Multi-prompt chain
Chains are a powerful way to build complex language applications. By combining different types of chains, you can create applications that can perform a wide range of tasks.
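The chaining idea can be sketched without any LLM at all: each step transforms its input and passes the result to the next step. In LangChain a sequential chain would call a model at each step; here each "step" is a plain function standing in for a prompt-plus-model call, and the product name and steps are made up:

```python
# Conceptual sketch of a sequential chain: the output of one step
# becomes the input of the next.
def make_chain(*steps):
    def run(text):
        for step in steps:
            text = step(text)
        return text
    return run

# Two toy "LLM" steps (stand-ins for prompt + model calls):
def name_company(product):
    return f"Acme {product.title()} Co."

def write_slogan(company):
    return f"{company}: quality you can trust."

chain = make_chain(name_company, write_slogan)
print(chain("queen size sheet set"))
```

A router chain adds one more idea on top of this: a first step inspects the input and decides which downstream chain should handle it.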
~ 13 minutes long
5. Question and Answer
This lesson teaches you how to use LangChain to do question answering over documents. The key components involved are embeddings, vector stores, and retrievers.
The lesson also discusses different methods for question answering, which differ in how the retrieved documents are combined with the question before being sent to the model.
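The retrieval pipeline can be sketched end to end in plain Python. Here a toy bag-of-words "embedding" and word-overlap similarity stand in for a real embedding model and vector store, and the documents and question are made up:

```python
# Conceptual sketch of retrieval-based Q&A: embed the documents, find
# the most relevant one for the question, and stuff it into a prompt.
def embed(text):
    # Toy "embedding": the set of lowercased words in the text.
    return set(text.lower().split())

def retrieve(question, docs):
    q = embed(question)
    # Pick the document sharing the most words with the question.
    return max(docs, key=lambda d: len(q & embed(d)))

docs = [
    "The sun shirt offers UPF 50 sun protection.",
    "The fleece jacket is made from recycled polyester.",
]
question = "Which product has sun protection?"
context = retrieve(question, docs)
prompt = f"Answer using only this context: {context}\nQuestion: {question}"
print(prompt)
```

A real setup replaces `embed` with a learned embedding model and `retrieve` with a similarity search over a vector store, but the shape of the pipeline is the same.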
~ 15 minutes long
6. Evaluation
In this lesson, you will learn how to evaluate an LLM-based application using LangChain.
Once you have your examples, you can use LangChain's QAEvalChain to evaluate them. It uses a language model to grade each example and give you a score.
Evaluating LLM-based applications is important to ensure that they are meeting your needs and performing as expected. LangChain provides a number of tools for this, including the QAEvalChain chain and the LangChain Evaluation Platform.
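The grading step can be sketched as follows. QAEvalChain uses an LLM as the grader; here a simple normalized string comparison stands in for it, and the example questions and answers are made up:

```python
# Conceptual sketch of QA evaluation: compare each predicted answer
# against a reference answer and report a grade per example.
def grade(example, prediction):
    # Stand-in for the LLM grader: case-insensitive exact match.
    correct = example["answer"].strip().lower() == prediction.strip().lower()
    return "CORRECT" if correct else "INCORRECT"

examples = [
    {"question": "Does the pullover set have side pockets?", "answer": "Yes"},
    {"question": "Which collection is the jacket from?", "answer": "DownTek"},
]
predictions = ["yes", "The EcoFlex collection"]

grades = [grade(ex, pred) for ex, pred in zip(examples, predictions)]
print(grades)  # one grade per example
```

The reason an LLM grader is used in practice is that string matching fails on paraphrases: "Yes, it has two side pockets" would be marked INCORRECT here even though it answers the question.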
~ 15 minutes long
7. Agents
Here we learn more about agents in LangChain. Agents are a powerful tool that allow you to use a language model as a reasoning engine. You can connect agents to different tools, such as search engines and APIs, which allows them to access and process information from the real world.
Agents are still under development.
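The core agent loop can be sketched in plain Python. In a real agent the LLM chooses the tool and its input at each step; here a hard-coded policy stands in for that reasoning step, and the two tools are toys:

```python
# Conceptual sketch of an agent step: a reasoning engine picks a tool,
# the tool runs, and the observation is returned (a real agent would
# loop, feeding the observation back to the model).
def calculator(expression):
    return str(eval(expression))  # toy math tool, demo only

def search(query):
    return "LangChain is a framework for LLM applications."  # canned result

TOOLS = {"calculator": calculator, "search": search}

def fake_reasoner(question):
    # Stand-in for the LLM deciding which tool to call and with what input.
    if any(ch.isdigit() for ch in question):
        digits = "".join(c for c in question if c in "0123456789+-*/ ")
        return "calculator", digits
    return "search", question

def run_agent(question):
    tool_name, tool_input = fake_reasoner(question)
    observation = TOOLS[tool_name](tool_input)
    return f"[{tool_name}] {observation}"

print(run_agent("What is 25 * 4?"))
```

What makes real agents interesting is exactly the part faked here: the model itself reasons about which tool to use, interprets the observation, and decides whether to keep going or answer.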
~ 14 minutes long
Conclusion
A short summary of what we have learned during this course.
~ 2 minutes long