Alex Spinov

LangChain Has a Free Framework for Building LLM-Powered Applications

LangChain connects LLMs to your data, tools, and APIs. RAG, agents, chains, and memory — the building blocks for AI applications beyond simple chat.

Beyond ChatGPT Wrappers

Calling an LLM API is easy. Building a production AI app is hard:

  • How do you search YOUR documents? (RAG)
  • How do you give the LLM access to tools? (Agents)
  • How do you maintain conversation history? (Memory)
  • How do you chain multiple LLM calls? (Chains)

LangChain provides abstractions for all of this.
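Before diving into LangChain's APIs, it helps to see that "memory" is conceptually simple: a buffer of past messages that gets prepended to the next LLM call. Here's a minimal plain-Python sketch (not LangChain code) of windowed conversation memory — real memory implementations add summarization, token counting, and persistence on top of this idea:

```python
# Minimal sketch of windowed conversation memory (plain Python, not LangChain).
class WindowMemory:
    def __init__(self, k=4):
        self.k = k          # keep only the last k messages
        self.messages = []

    def add(self, role, content):
        self.messages.append({'role': role, 'content': content})

    def context(self):
        # What gets prepended to the next LLM call
        return self.messages[-self.k:]

memory = WindowMemory(k=2)
memory.add('user', 'Hi, I am Alex.')
memory.add('assistant', 'Hello Alex!')
memory.add('user', 'What is my name?')
print(len(memory.context()))  # → 2: only the most recent messages survive
```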

What You Get for Free

RAG (Retrieval-Augmented Generation):

from langchain_community.document_loaders import PyPDFLoader
from langchain_openai import OpenAIEmbeddings, ChatOpenAI
from langchain_community.vectorstores import Chroma
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain.chains import RetrievalQA

# Load your documents and split them into chunks
loader = PyPDFLoader('company_docs.pdf')
docs = loader.load()
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_documents(docs)

# Embed the chunks into a vector store
vectorstore = Chroma.from_documents(chunks, OpenAIEmbeddings())

# Ask questions about your data
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model='gpt-4o'),
    retriever=vectorstore.as_retriever(),
)

result = qa.invoke({'query': 'What is our refund policy?'})
print(result['result'])

The LLM answers using YOUR documents, not its training data.
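Under the hood, the retrieval step boils down to "find the chunks most similar to the question, then stuff them into the prompt." A toy plain-Python version shows the shape of it — word overlap stands in for the embedding similarity a real vector store computes:

```python
import re

# Toy retriever: scores chunks by word overlap with the question.
# Real RAG uses embedding vectors and cosine similarity instead.
def words(text):
    return set(re.findall(r'\w+', text.lower()))

def retrieve(question, chunks, k=1):
    q = words(question)
    return sorted(chunks, key=lambda c: len(q & words(c)), reverse=True)[:k]

chunks = [
    'Refund policy: refunds are issued within 30 days of purchase.',
    'Our office is open Monday through Friday.',
]
top = retrieve('What is the refund policy?', chunks)
# The best-matching chunk becomes context for the LLM prompt
prompt = f'Answer using this context:\n{top[0]}\n\nQuestion: What is the refund policy?'
print(top[0])  # → the refund chunk
```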

Agents (LLM + Tools):

from langchain import hub
from langchain.agents import AgentExecutor, create_react_agent
from langchain_community.tools import DuckDuckGoSearchRun, WikipediaQueryRun
from langchain_community.utilities import WikipediaAPIWrapper
from langchain_openai import ChatOpenAI

tools = [DuckDuckGoSearchRun(), WikipediaQueryRun(api_wrapper=WikipediaAPIWrapper())]
prompt = hub.pull('hwchase17/react')  # standard ReAct prompt (needs the langchainhub package)

agent = create_react_agent(ChatOpenAI(model='gpt-4o'), tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools)

result = executor.invoke({'input': 'What happened in tech news today?'})

The agent decides which tools to use, calls them, and synthesizes results.
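The "decides which tools to use" part is the ReAct loop: the LLM emits either a tool call or a final answer, and the framework executes the tool and feeds the observation back until an answer appears. A stripped-down sketch of that loop, with a scripted fake LLM standing in for GPT-4o:

```python
# Stripped-down ReAct loop. `fake_llm` stands in for a real model call:
# it first requests a tool, then (after seeing the result) answers.
tools = {'search': lambda q: f'Results for {q!r}: ...'}

def fake_llm(transcript):
    if 'Observation:' not in transcript:
        return 'Action: search[tech news today]'
    return "Final Answer: summary of today's tech news"

def run_agent(question, llm, tools, max_steps=5):
    transcript = f'Question: {question}'
    for _ in range(max_steps):
        reply = llm(transcript)
        if reply.startswith('Final Answer:'):
            return reply.removeprefix('Final Answer:').strip()
        # Parse "Action: tool[argument]", run the tool, append the observation
        tool_name, arg = reply.removeprefix('Action: ').split('[', 1)
        observation = tools[tool_name](arg.rstrip(']'))
        transcript += f'\n{reply}\nObservation: {observation}'
    raise RuntimeError('agent did not finish')

print(run_agent('What happened in tech news today?', fake_llm, tools))
# → summary of today's tech news
```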

LangChain Expression Language (LCEL)

from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langchain_core.output_parsers import StrOutputParser

chain = (
    ChatPromptTemplate.from_template('Tell me a joke about {topic}')
    | ChatOpenAI(model='gpt-4o')
    | StrOutputParser()
)

result = chain.invoke({'topic': 'programming'})

Pipe syntax for composable, readable chains.
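The `|` operator isn't magic: each LCEL component implements Python's `__or__` to build a pipeline whose `invoke` runs the stages in order. A minimal plain-Python imitation (not the real implementation) makes the mechanics visible:

```python
# Minimal imitation of LCEL-style pipe composition (not LangChain internals).
class Runnable:
    def __init__(self, fn):
        self.fn = fn

    def invoke(self, x):
        return self.fn(x)

    def __or__(self, other):
        # left | right -> a new Runnable that runs left, then right
        return Runnable(lambda x: other.invoke(self.invoke(x)))

prompt = Runnable(lambda d: f"Tell me a joke about {d['topic']}")
fake_llm = Runnable(lambda p: f'LLM response to: {p}')
parser = Runnable(lambda s: s.strip())

chain = prompt | fake_llm | parser
print(chain.invoke({'topic': 'programming'}))
# → LLM response to: Tell me a joke about programming
```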

Quick Start

pip install langchain langchain-openai langchain-community
export OPENAI_API_KEY=sk-...
# individual integrations pull extra packages, e.g. chromadb and pypdf for the RAG example

LangSmith (Observability)

LangSmith traces every LLM call, shows token usage, latency, and lets you debug chains visually. Free tier included.
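Enabling tracing is configuration, not code. Setting these environment variables is typically enough for LangChain to start sending traces to LangSmith (the exact variable names have shifted between versions, so check the LangSmith docs for your release):

```shell
export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=<your-langsmith-key>
export LANGCHAIN_PROJECT=my-app   # optional: group traces by project
```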

If you're building AI features beyond simple chat — LangChain gives you the building blocks.


Need web scraping or data extraction? Check out my tools on Apify — get structured data from any website in minutes.

Custom solution? Email spinov001@gmail.com — quote in 2 hours.
