Introduction
Imagine having a personal assistant that can:
- Answer your questions from your own documents
- Search the internet for real-time information
- Execute code and automate tasks
- Remember your previous conversations
That's exactly what AI Agents do — and with LangChain, you can build one in Python in under 30 minutes.
In this post, I'll walk you through:
- What AI Agents are and why they matter
- How LangChain makes building agents easy
- Building a fully functional AI Chatbot Agent step-by-step
- Adding memory, tools, and real-world capabilities
What is an AI Agent?
An AI Agent is an autonomous system that can:
- Think — Understand your request using an LLM (Large Language Model)
- Decide — Choose the right tool or action to take
- Act — Execute the action (search, calculate, code, respond)
- Learn — Remember context from previous interactions
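Stripped of any framework, that loop is small enough to sketch in plain Python. The tools, the keyword-based router, and the `agent_step` helper below are all made up for illustration; a real agent would ask the LLM to do the deciding:

```python
# Toy sketch of the agent loop: think -> decide -> act -> remember.
# The tools and the routing rule here are illustrative, not LangChain's.

def calculator(query: str) -> str:
    # Demo only; never eval untrusted input in real code.
    return str(eval(query, {"__builtins__": {}}))

def echo(query: str) -> str:
    return f"You said: {query}"

TOOLS = {"calculate": calculator, "echo": echo}
memory: list[tuple[str, str]] = []  # (user, agent) turns

def decide_tool(request: str) -> str:
    # A real agent asks the LLM to pick a tool; here we route on a keyword.
    return "calculate" if any(c in request for c in "0123456789") else "echo"

def agent_step(request: str) -> str:
    tool = decide_tool(request)       # Decide
    result = TOOLS[tool](request)     # Act
    memory.append((request, result))  # Learn: remember the turn
    return result

print(agent_step("2 + 3"))  # -> 5
```

The LLM replaces `decide_tool`, and the rest of this post is about letting LangChain wire up the loop, the tools, and the memory for you.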
What is LangChain?
LangChain is a Python framework that makes it easy to build LLM-powered applications. Think of it as the glue between:
- Your LLM (OpenAI, Google Gemini, Llama, etc.)
- Your data (documents, databases, APIs)
- Your tools (search engines, calculators, code executors)
Why LangChain?
- Chains — Connect multiple LLM calls together
- Memory — Store and recall conversation history
- Tools — Give your agent superpowers (Google search, Wikipedia, Python REPL)
- RAG — Retrieve answers from your own documents
- Agents — Autonomous decision-making bots
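The "Chains" idea is essentially function composition: each step's output becomes the next step's input. A framework-free sketch (the `fake_llm` below is a stand-in for a real model call, so this runs offline):

```python
# A "chain" is a pipeline: each step feeds its output to the next step.
# fake_llm stands in for a real LLM call so the sketch runs offline.

def fake_llm(prompt: str) -> str:
    return prompt.upper()  # pretend the model "answers"

def make_chain(*steps):
    def run(text: str) -> str:
        for step in steps:
            text = step(text)
        return text
    return run

summarize = lambda text: fake_llm(f"Summarize: {text}")
translate = lambda text: fake_llm(f"Translate: {text}")

chain = make_chain(summarize, translate)
print(chain("hello"))  # -> TRANSLATE: SUMMARIZE: HELLO
```

LangChain gives you the same pattern with real prompts and real models, plus the memory, tools, and retrieval pieces listed above.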
Let's Build: AI Agent Chatbot with LangChain
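Before the code will run, install the dependencies and put your OpenAI key in a `.env` file as `OPENAI_API_KEY`. Package names below assume the post-0.1 LangChain split into `langchain-openai` and `langchain-community`:

```shell
pip install langchain langchain-openai langchain-community \
    faiss-cpu duckduckgo-search wikipedia python-dotenv
```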
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain.agents import initialize_agent, AgentType, Tool
from langchain_community.tools import DuckDuckGoSearchRun, WikipediaQueryRun
from langchain_community.utilities import WikipediaAPIWrapper
from langchain.memory import ConversationBufferMemory
from langchain_community.document_loaders import TextLoader
from langchain.text_splitter import CharacterTextSplitter
from langchain_community.vectorstores import FAISS
from langchain.chains import RetrievalQA
from dotenv import load_dotenv

load_dotenv()  # loads OPENAI_API_KEY from your .env file
llm = ChatOpenAI(model="gpt-4", temperature=0)
# ---- Tool 1: Internet Search ----
search = DuckDuckGoSearchRun()
# ---- Tool 2: Wikipedia ----
wikipedia = WikipediaQueryRun(api_wrapper=WikipediaAPIWrapper())
# ---- Tool 3: RAG (Your Documents) ----
loader = TextLoader("my_document.txt")
documents = loader.load()
text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=200)
docs = text_splitter.split_documents(documents)
embeddings = OpenAIEmbeddings()
vectorstore = FAISS.from_documents(docs, embeddings)
qa_chain = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=vectorstore.as_retriever()
)
# ---- Define All Tools ----
tools = [
    Tool(
        name="Internet Search",
        func=search.run,
        description="Use this to search the internet for current information"
    ),
    Tool(
        name="Wikipedia",
        func=wikipedia.run,
        description="Use this to look up detailed information from Wikipedia"
    ),
    Tool(
        name="Document QA",
        func=qa_chain.run,
        description="Use this to answer questions from uploaded documents"
    )
]
# ---- Memory ----
memory = ConversationBufferMemory(
    memory_key="chat_history",
    return_messages=True
)
# ---- Create Agent ----
agent = initialize_agent(
    tools=tools,
    llm=llm,
    agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
    memory=memory,
    verbose=True,
    handle_parsing_errors=True
)
# ---- Chat Loop ----
print("AI Agent Ready! Type 'quit' to exit.\n")
while True:
    user_input = input("You: ")
    if user_input.lower() == "quit":
        print("Goodbye!")
        break
    response = agent.run(user_input)
    print(f"Agent: {response}\n")