LangChain is one of the most popular frameworks for building LLM-powered applications. It connects LLMs to your data, APIs, and tools.
## What You Get for Free
- Chains — compose LLM calls with logic
- Agents — LLMs that use tools autonomously
- RAG — retrieval-augmented generation
- Memory — conversation history
- 50+ LLM providers — OpenAI, Anthropic, local models
- 100+ integrations — vector stores, loaders, tools
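At their core, chains are function composition: each step's output feeds the next. Here's a minimal pure-Python sketch of that idea — no LangChain required; the `pipe` helper and the toy steps are hypothetical stand-ins for a prompt template, an LLM call, and an output parser:

```python
from functools import reduce

def pipe(*steps):
    """Compose steps left to right: the output of one step is the input to the next."""
    return lambda x: reduce(lambda acc, step: step(acc), steps, x)

# Toy stand-ins — a real chain would use a prompt template, a chat model, and a parser
format_prompt = lambda question: f"Answer briefly: {question}"
fake_llm = lambda prompt: f"[LLM response to: {prompt}]"
strip_brackets = lambda text: text.strip("[]")

chain = pipe(format_prompt, fake_llm, strip_brackets)
print(chain("What is LangChain?"))  # → LLM response to: Answer briefly: What is LangChain?
```

LangChain's Expression Language (the `prompt | llm | parser` pipe syntax) is this same pattern with streaming, batching, and async support layered on top.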
## RAG Pipeline
```python
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.vectorstores import Chroma
from langchain.chains import RetrievalQA

llm = ChatOpenAI(model="gpt-4")

# `docs` is a list of Document objects, e.g. from a loader + text splitter
db = Chroma.from_documents(docs, OpenAIEmbeddings())

qa = RetrievalQA.from_chain_type(llm=llm, retriever=db.as_retriever())
result = qa.invoke({"query": "What is our refund policy?"})
```
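Under the hood, `as_retriever()` performs embedding similarity search: the query is embedded and compared against stored document vectors. A toy sketch of that idea in plain Python — the hand-written 2-D vectors stand in for real embeddings (OpenAI's are ~1536-dimensional), and the names are illustrative, not LangChain's API:

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" — a real vector store holds one vector per document chunk
store = {
    "Refunds are available within 30 days.": [0.9, 0.1],
    "Shipping takes 3-5 business days.": [0.1, 0.9],
}
query_vec = [0.85, 0.2]  # pretend embedding of "What is our refund policy?"

best = max(store, key=lambda doc: cosine(store[doc], query_vec))
print(best)  # → Refunds are available within 30 days.
```

The retrieved chunks are then stuffed into the prompt, which is why chunking strategy and embedding quality matter as much as the LLM itself.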
## LangChain vs LlamaIndex
| Feature | LangChain | LlamaIndex |
|---|---|---|
| Focus | General LLM apps | Data/RAG |
| Agents | Full framework | Data agents |
| Flexibility | Higher | More opinionated |
Need LLM app development? Check my work on GitHub or email spinov001@gmail.com for consulting.