🌱 0️⃣ What We Are Building
We are building an AI Science Teacher backend.
This AI:
- answers science questions
- explains like a teacher
- remembers the conversation
- ignores non-science questions
Example:
You: What is gravity?
AI: Gravity is a force…
You: explain simpler
AI: Imagine dropping a ball…
👉 The AI remembered the context.
🧠 1️⃣ How the Chatbot Works (Easy)
Student → API → AI → Answer
                ↑
              Memory
Steps:
- Student sends question
- Backend receives
- Memory added
- AI generates answer
- Answer returned
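The five steps above can be sketched as one tiny function. This is only a conceptual sketch: `answerWithModel` is a stub standing in for the real OpenAI call.

```javascript
// Conceptual sketch of the request flow; answerWithModel is a stub
// standing in for the real OpenAI call.
const memory = [];

function answerWithModel(messages) {
  // A real backend would send `messages` to the LLM here.
  return "Gravity is a force that pulls objects together.";
}

function ask(question) {
  memory.push({ role: "human", content: question }); // memory added
  const answer = answerWithModel(memory);            // AI generates answer
  memory.push({ role: "ai", content: answer });      // reply also remembered
  return answer;                                     // answer returned
}
```

Calling `ask("What is gravity?")` returns the stubbed answer, and `memory` then holds both the question and the reply.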
🧰 2️⃣ Tools We Use (Simple)
Node.js → runs the backend
Express → API server
OpenAI → AI brain
LangChain → connects AI + memory
Memory → remembers the chat
dotenv → loads the API key
📦 3️⃣ Create Project
Open terminal:
mkdir science-teacher-bot
cd science-teacher-bot
pnpm init
📦 4️⃣ Install Packages
pnpm add express cors dotenv langchain @langchain/openai @langchain/core
pnpm add -D nodemon
📁 5️⃣ Project Structure
science-teacher-bot/
│
├── src/
│   ├── memory.mjs
│   ├── llm.mjs
│   ├── route.mjs
│   └── server.mjs
│
├── .env
└── package.json
🔑 6️⃣ Add OpenAI Key
Create a .env file:
OPENAI_API_KEY=your_key_here
PORT=3000
💾 7️⃣ memory.mjs – Chat History Storage
This file stores conversation history for each user/session.
import { InMemoryChatMessageHistory } from "@langchain/core/chat_history";

const histories = {};

export function getSessionHistory(sessionId = "default") {
  if (!histories[sessionId]) {
    histories[sessionId] = new InMemoryChatMessageHistory();
  }
  return histories[sessionId];
}
🧠 What This Code Does
- Creates memory storage for chatbot
- Stores chat separately per user
- Each user has unique sessionId
- Returns that userβs history
👉 Enables multi-user chatbot memory.
In plain words:
Each user's chat history is stored separately.
🤔 What are student1 and student2?
They are session IDs (user IDs).
Example:
- student1 → first user
- student2 → second user
You pass them when calling chatbot.
So each user gets separate memory.
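A self-contained stand-in for this pattern (plain arrays instead of InMemoryChatMessageHistory) shows why the same sessionId always gets the same history back:

```javascript
// Stand-in for getSessionHistory using plain arrays instead of
// InMemoryChatMessageHistory.
const histories = {};

function getSessionHistory(sessionId = "default") {
  if (!histories[sessionId]) {
    histories[sessionId] = []; // new empty history for a new user
  }
  return histories[sessionId]; // same array on every later call
}
```

So `getSessionHistory("student1")` returns the exact same array each time, while `getSessionHistory("student2")` gets its own separate one.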
📦 What new InMemoryChatMessageHistory() Creates
It creates an object like:
InMemoryChatMessageHistory {
  messages: []
}
This object stores:
- human messages
- AI messages
- conversation order
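Conceptually, the object behaves like this tiny stand-in class (the method names here are simplified for illustration, not the exact LangChain API):

```javascript
// Simplified stand-in for InMemoryChatMessageHistory; illustrative only.
class FakeChatMessageHistory {
  constructor() {
    this.messages = []; // messages stay in conversation order
  }
  addHumanMessage(content) {
    this.messages.push({ role: "human", content });
  }
  addAIMessage(content) {
    this.messages.push({ role: "ai", content });
  }
}
```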
📘 How the histories Object Works
histories is a dictionary:
sessionId → chat history object
So:
histories["student1"] = InMemoryChatMessageHistory
histories["student2"] = InMemoryChatMessageHistory
🧪 Example Conversation
User: student1
Question:
What is gravity?
AI:
Gravity is a force that pulls objects.
Memory stored for student1.
User: student2
Question:
What is photosynthesis?
Memory stored separately.
📘 Example Output
After chats:
histories = {
  student1: InMemoryChatMessageHistory {
    messages: [
      { role: "human", content: "What is gravity?" },
      { role: "ai", content: "Gravity is a force that pulls objects." }
    ]
  },
  student2: InMemoryChatMessageHistory {
    messages: [
      { role: "human", content: "What is photosynthesis?" }
    ]
  }
}
🧾 Simplified View
Sometimes shown as:
histories = {
  student1: [
    { role: "human", content: "What is gravity?" },
    { role: "ai", content: "Gravity is a force that pulls objects." }
  ],
  student2: [
    { role: "human", content: "What is photosynthesis?" }
  ]
}
(This is a conceptual view; the actual objects are InMemoryChatMessageHistory instances.)
🔁 How It Is Used by LangChain
When chatbot runs:
getSessionHistory("student1")
LangChain:
- Finds student1 history
- Loads messages
- Sends to AI
- Adds new messages
So AI remembers conversation per user.
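The load → send → store cycle can be sketched with a stubbed model. This is a conceptual sketch of what the memory wrapper does per call, not its real internals; fakeModel stands in for the LLM.

```javascript
// Conceptual sketch of the per-call memory cycle; fakeModel stands in
// for the real LLM.
const store = {};
const getHistory = (id) => (store[id] ??= []);

function fakeModel(messages) {
  return "Answer based on " + messages.length + " message(s).";
}

function invokeWithHistory(input, sessionId) {
  const history = getHistory(sessionId);           // 1. find the session history
  const messages = [...history,                    // 2. load previous messages
    { role: "human", content: input }];            // 3. add the new question
  const answer = fakeModel(messages);              //    send everything to the AI
  history.push({ role: "human", content: input },
               { role: "ai", content: answer });   // 4. store both new messages
  return answer;
}
```

The second call for the same session sees the stored history plus the new question, which is exactly how the AI "remembers" earlier turns.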
⚠️ Important Notes
- Memory is stored in RAM
- Lost on server restart
- Good for demo/testing
- Production should use a DB or Redis
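One simple way to survive restarts is persisting each session's messages to disk. Below is a minimal sketch using Node's built-in fs module; a real deployment would use a database or Redis instead, and the file naming here is hypothetical.

```javascript
import { writeFileSync, readFileSync, existsSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";

// Hypothetical file-backed history: one JSON file per session.
const dir = tmpdir();

function saveHistory(sessionId, messages) {
  writeFileSync(join(dir, `history-${sessionId}.json`), JSON.stringify(messages));
}

function loadHistory(sessionId) {
  const file = join(dir, `history-${sessionId}.json`);
  return existsSync(file) ? JSON.parse(readFileSync(file, "utf8")) : [];
}
```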
🧠 8️⃣ llm.mjs – AI Chatbot Logic
Creates the AI teacher and connects it to memory.
import { config } from "dotenv";
config();

import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts";
import { RunnableWithMessageHistory } from "@langchain/core/runnables";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { getSessionHistory } from "./memory.mjs";

const llm = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0.3,
});

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a Science Teacher. Answer only science questions. Explain simply."],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);

const chain = prompt.pipe(llm).pipe(new StringOutputParser());

export const chatChain = new RunnableWithMessageHistory({
  runnable: chain,
  getMessageHistory: getSessionHistory,
  inputMessagesKey: "input",
  historyMessagesKey: "chat_history",
});
🧠 What This File Does
- Creates AI model
- Defines teacher behavior
- Connects chat memory
- Creates chatbot system
👉 This is the main chatbot brain.
In plain words:
This file creates the AI teacher + memory chatbot.
🤖 ChatOpenAI
Creates the AI model.
- Connects to OpenAI
- Generates answers
- The brain of the chatbot
In plain words:
It creates the AI model.
🧾 Prompt (Teacher Rules)
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a Science Teacher. Answer only science questions. Explain simply."],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);
Defines how the AI behaves:
- science teacher role
- simple explanation
- science-only answers
In plain words:
It turns the AI into a teacher.
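A self-contained sketch of what the template assembles for each call (not the actual ChatPromptTemplate internals): the system rule first, then the stored history, then the new question.

```javascript
// Sketch of the message array the prompt template produces per call.
function buildMessages(chatHistory, input) {
  return [
    { role: "system", content: "You are a Science Teacher. Answer only science questions. Explain simply." },
    ...chatHistory,                    // MessagesPlaceholder("chat_history")
    { role: "human", content: input }, // ["human", "{input}"]
  ];
}
```

With an empty history and input "What is gravity?" this gives a two-message array: the system rule plus the question; stored history slots into the middle.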
🧠 MessagesPlaceholder("chat_history")
- Inserts previous conversation
- Loads user memory
- Gives context to AI
In plain words:
The previous chat goes here.
🔗 chain = prompt → AI → text
const chain = prompt.pipe(llm).pipe(new StringOutputParser());
Flow:
- Prompt prepared
- Sent to AI
- Text answer returned
In plain words:
Prompt → AI → answer
💾 RunnableWithMessageHistory
Adds memory to chatbot.
new RunnableWithMessageHistory({...})
- Loads user chat history
- Adds new messages
- Maintains conversation
In plain words:
It gives the AI each user's chat history.
🏷️ inputMessagesKey: "input"
- Name of the user message field
- Matches {input} in the prompt
In plain words:
The name for the student's question.
🏷️ historyMessagesKey: "chat_history"
- Name of the memory variable
- Matches the MessagesPlaceholder
In plain words:
The name of the memory.
🔁 How It Works (Example)
User: student1
Question: What is gravity?
LangChain sends to AI:
System: You are a science teacher
History: (student1 chat)
User: What is gravity?
AI replies → stored in memory.
Next question uses same history.
🌐 9️⃣ route.mjs
API endpoint /ask
import { Router } from "express";
import { chatChain } from "./llm.mjs";

export const router = Router();

router.post("/", async (req, res) => {
  try {
    const { text, sessionId = "default" } = req.body;

    if (!text) {
      return res.status(400).json({ error: "Question required" });
    }

    const answer = await chatChain.invoke(
      { input: text },
      { configurable: { sessionId } }
    );

    res.json({ answer });
  } catch (err) {
    res.status(500).json({ error: "AI error" });
  }
});
Easy Meaning
- Receives question
- Sends to AI
- Returns answer
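The handler's decision logic can be isolated as a plain function for illustration; askAI is a stand-in for chatChain.invoke, and the returned object mimics the HTTP response.

```javascript
// Sketch of the /ask handler logic without Express; askAI stands in
// for chatChain.invoke.
async function handleAsk(body, askAI) {
  const { text, sessionId = "default" } = body ?? {};
  if (!text) {
    return { status: 400, json: { error: "Question required" } }; // bad request
  }
  const answer = await askAI(text, sessionId);
  return { status: 200, json: { answer } }; // success
}
```

A request without text gets a 400, anything else is forwarded to the AI with its sessionId.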
🚀 1️⃣0️⃣ server.mjs
Starts backend server.
import express from "express";
import cors from "cors";
import { config } from "dotenv";
import { router } from "./route.mjs";

config();

const app = express();
app.use(cors());
app.use(express.json());

app.get("/", (req, res) => {
  res.send("Science Teacher AI running");
});

app.use("/ask", router);

const PORT = process.env.PORT || 3000;

app.listen(PORT, () => {
  console.log("Server running on http://localhost:" + PORT);
});
Easy Meaning
- Creates server
- Enables JSON
- Adds API route
- Starts app
▶️ 1️⃣1️⃣ Run Project
Add a script to package.json:
"scripts": {
  "dev": "nodemon src/server.mjs"
}
Run:
pnpm dev
🧪 1️⃣2️⃣ Test Chatbot
POST → http://localhost:3000/ask
Body:
{
  "text": "What is gravity?"
}
Response:
{
  "answer": "Gravity is a force..."
}
🧠 1️⃣3️⃣ Memory Test
Send:
{ "text": "What is gravity?", "sessionId": "u1" }
Then:
{ "text": "Explain simpler", "sessionId": "u1" }
The AI remembers 🎉
🎉 1️⃣4️⃣ What You Built
You created:
- AI backend
- LangChain chatbot
- Memory system
- Express API
- Science teacher AI
This is real AI architecture.



