Abhishek Gupta
🎓 Science Teacher Chatbot

🌱 0️⃣ What We Are Building

We are building an AI Science Teacher backend.

This AI:

  • answers science questions
  • explains like a teacher
  • remembers the conversation
  • ignores non-science questions

Example:

You: What is gravity?
AI: Gravity is a force…

You: explain simpler
AI: Imagine dropping a ball…

👉 The AI remembered the context.

🧠 1️⃣ How the Chatbot Works (Easy)

Student → API → AI → Answer
           ↑
        Memory

Steps:

  1. Student sends a question
  2. Backend receives it
  3. Memory is added
  4. AI generates an answer
  5. Answer is returned
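The five steps above can be sketched in plain JavaScript (here, `memory` and `askAI` are made-up stand-ins for the real pieces we build later):

```javascript
// Made-up stand-ins for the real memory and AI model
const memory = [];
const askAI = (messages) => "Gravity is a force...";

function handleQuestion(question) {
  // 1-2. Student sends a question, backend receives it
  // 3. Memory is added
  memory.push({ role: "human", content: question });
  // 4. AI generates an answer, seeing the full history
  const answer = askAI(memory);
  memory.push({ role: "ai", content: answer });
  // 5. Answer is returned
  return answer;
}

console.log(handleQuestion("What is gravity?")); // "Gravity is a force..."
```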

🧰 2️⃣ Tools We Use (Simple)

Node.js → runs the backend
Express → API server
OpenAI → AI brain
LangChain → connects AI + memory
Memory → remembers the chat
dotenv → loads the API key


📦 3️⃣ Create Project

Open a terminal:

mkdir science-teacher-bot
cd science-teacher-bot
pnpm init

📦 4️⃣ Install Packages

pnpm add express cors dotenv langchain @langchain/openai @langchain/core
pnpm add -D nodemon

πŸ“ 5️⃣ Project Structure

science-teacher-bot/
│
├── src/
│   ├── memory.mjs
│   ├── llm.mjs
│   ├── route.mjs
│   └── server.mjs
│
├── .env
└── package.json

πŸ” 6️⃣ Add OpenAI Key

Create a .env file:

OPENAI_API_KEY=your_key_here
PORT=3000

💾 7️⃣ memory.mjs — Chat History Storage

This file stores conversation history for each user/session.

import { InMemoryChatMessageHistory } from "@langchain/core/chat_history";

const histories = {};

export function getSessionHistory(sessionId = "default") {
  if (!histories[sessionId]) {
    histories[sessionId] = new InMemoryChatMessageHistory();
  }
  return histories[sessionId];
}


🧠 What This Code Does

  • Creates memory storage for the chatbot
  • Stores chat history separately per user
  • Each user has a unique sessionId
  • Returns that user's history

👉 Enables multi-user chatbot memory.

In simple words:
Each user's chat history is stored separately.
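The same pattern can be imitated without LangChain to see what is going on; `MiniHistory` below is a made-up stand-in for InMemoryChatMessageHistory:

```javascript
// Made-up stand-in for InMemoryChatMessageHistory
class MiniHistory {
  constructor() {
    this.messages = [];
  }
  addMessage(msg) {
    this.messages.push(msg);
  }
}

const histories = {};

function getSessionHistory(sessionId = "default") {
  // First time we see a sessionId, create a fresh history for it
  if (!histories[sessionId]) {
    histories[sessionId] = new MiniHistory();
  }
  return histories[sessionId];
}

getSessionHistory("student1").addMessage({ role: "human", content: "What is gravity?" });
getSessionHistory("student2").addMessage({ role: "human", content: "What is photosynthesis?" });

// Each student keeps a separate history
console.log(getSessionHistory("student1").messages.length); // 1
console.log(getSessionHistory("student2").messages.length); // 1
```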


👀 What Are student1 and student2?

They are session IDs (user IDs).

Example:

  • student1 → first user
  • student2 → second user

You pass them when calling the chatbot.

So each user gets separate memory.


📦 What new InMemoryChatMessageHistory() Creates

It creates an object like:

InMemoryChatMessageHistory {
  messages: []
}

This object stores:

  • human messages
  • AI messages
  • conversation order

πŸ” How histories Object Works

histories is a dictionary:

sessionId → chat history object

So:

histories["student1"] = InMemoryChatMessageHistory
histories["student2"] = InMemoryChatMessageHistory

🧪 Example Conversation

User: student1

Question:

What is gravity?

AI:

Gravity is a force that pulls objects.

Memory stored for student1.


User: student2

Question:

What is photosynthesis?

Memory stored separately.


📊 Example Output

After chats:

histories = {
  student1: InMemoryChatMessageHistory {
    messages: [
      { role: "human", content: "What is gravity?" },
      { role: "ai", content: "Gravity is a force that pulls objects." }
    ]
  },

  student2: InMemoryChatMessageHistory {
    messages: [
      { role: "human", content: "What is photosynthesis?" }
    ]
  }
}


🧾 Simplified View

Sometimes shown as:

histories = {
  student1: [
    { role: "human", content: "What is gravity?" },
    { role: "ai", content: "Gravity is a force that pulls objects." }
  ],
  student2: [
    { role: "human", content: "What is photosynthesis?" }
  ]
}

(This is a conceptual view; the actual object is an InMemoryChatMessageHistory holding HumanMessage and AIMessage instances.)


πŸ” How It Is Used by LangChain

When chatbot runs:

getSessionHistory("student1")

LangChain:

  1. Finds student1 history
  2. Loads messages
  3. Sends to AI
  4. Adds new messages

So the AI remembers each user's conversation separately.
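One turn of that cycle can be simulated in plain JavaScript (the model reply is faked here; only the memory mechanics are real):

```javascript
// Conceptually, what getSessionHistory("student1") hands back
const history = { messages: [] };

function chatTurn(question) {
  // 1-2. Find and load the previous messages
  const context = [...history.messages];
  // 3. System rules + context + new question would go to the AI (faked here)
  const answer = `Answer about: ${question}`;
  // 4. Store both new messages so the next turn sees them
  history.messages.push({ role: "human", content: question });
  history.messages.push({ role: "ai", content: answer });
  return { answer, contextSize: context.length };
}

console.log(chatTurn("What is gravity?").contextSize); // 0 (first turn, empty context)
console.log(chatTurn("Explain simpler").contextSize);  // 2 (previous Q&A is loaded)
```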


⚠️ Important Notes

  • Memory is stored in RAM
  • Lost on a server restart
  • Good for demos and testing
  • Production should use a DB or Redis
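To survive restarts you would swap the in-memory object for a persistent store. Below is a minimal sketch of the idea using plain JSON files (the function names are made up for illustration; a real app would use a database or a Redis-backed chat history):

```javascript
import { writeFileSync, readFileSync, existsSync } from "node:fs";
import { join } from "node:path";
import { tmpdir } from "node:os";

// Hypothetical file-backed replacement for the in-RAM histories object
function loadHistory(sessionId) {
  const file = join(tmpdir(), `history-${sessionId}.json`);
  return existsSync(file) ? JSON.parse(readFileSync(file, "utf8")) : [];
}

function saveMessage(sessionId, message) {
  const file = join(tmpdir(), `history-${sessionId}.json`);
  const messages = loadHistory(sessionId);
  messages.push(message);
  writeFileSync(file, JSON.stringify(messages));
}

// Unique demo session so reruns start clean
const sid = `demo-${process.pid}-${Date.now()}`;
saveMessage(sid, { role: "human", content: "What is gravity?" });
console.log(loadHistory(sid).length); // 1, and it would survive a server restart
```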


🧠 8️⃣ llm.mjs — AI Chatbot Logic

This file creates the AI teacher and connects it to chat memory.

import { config } from "dotenv";
config();

import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts";
import { RunnableWithMessageHistory } from "@langchain/core/runnables";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { getSessionHistory } from "./memory.mjs";

const llm = new ChatOpenAI({
  model: "gpt-4o-mini",
  temperature: 0.3,
});

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a Science Teacher. Answer only science questions. Explain simply."],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);

const chain = prompt.pipe(llm).pipe(new StringOutputParser());

export const chatChain = new RunnableWithMessageHistory({
  runnable: chain,
  getMessageHistory: getSessionHistory,
  inputMessagesKey: "input",
  historyMessagesKey: "chat_history",
});

🧠 What This File Does

  • Creates AI model
  • Defines teacher behavior
  • Connects chat memory
  • Creates chatbot system

👉 Main chatbot brain.

In simple words:
This file builds the AI teacher with memory.


🤖 ChatOpenAI

Creates AI model.

  • Connects to OpenAI
  • Generates answers
  • Brain of chatbot

In simple words:
It builds the AI's brain.


🧾 Prompt (Teacher Rules)


const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a Science Teacher. Answer only science questions. Explain simply."],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);


Defines how AI behaves:

  • science teacher role
  • simple explanation
  • science-only answers

In simple words:
It turns the AI into a teacher.


🧠 MessagesPlaceholder("chat_history")

  • Inserts previous conversation
  • Loads user memory
  • Gives context to AI

In simple words:
The previous chat is inserted here.


🔗 chain = prompt → AI → text

const chain = prompt.pipe(llm).pipe(new StringOutputParser());

Flow:

  1. Prompt prepared
  2. Sent to AI
  3. Text answer returned

In simple words:
Prompt → AI → answer
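.pipe() just connects steps so each output feeds the next input. Here is a toy imitation in plain JavaScript (this pipe helper is made up for illustration, not LangChain's):

```javascript
// Toy versions of the three chain steps
const prompt = (input) => `System: science teacher\nHuman: ${input}`;
const fakeLLM = (text) => ({ content: `AI saw: ${text}` }); // stands in for ChatOpenAI
const parser = (message) => message.content;                // like StringOutputParser

// pipe(a, b): run a, feed its result to b (the idea behind .pipe())
const pipe = (a, b) => (x) => b(a(x));

const chain = pipe(pipe(prompt, fakeLLM), parser);
console.log(chain("What is gravity?")); // plain string out, not a message object
```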


💾 RunnableWithMessageHistory

Adds memory to chatbot.

new RunnableWithMessageHistory({...})
  • Loads user chat history
  • Adds new messages
  • Maintains conversation

In simple words:
It gives the AI the user's chat history.


🏷️ inputMessagesKey: "input"

  • Name of user message field
  • Matches {input} in prompt

In simple words:
The name of the student's question field.


🏷️ historyMessagesKey: "chat_history"

  • Name of memory variable
  • Matches MessagesPlaceholder

In simple words:
The name of the memory field.


πŸ” How It Works (Example)

User: student1
Question: What is gravity?

LangChain sends to AI:

System: You are a science teacher
History: (student1 chat)
User: What is gravity?

AI replies → stored in memory.

Next question uses same history.

🌐 9️⃣ route.mjs

Defines the API endpoint /ask.

import { Router } from "express";
import { chatChain } from "./llm.mjs";

export const router = Router();

router.post("/", async (req, res) => {
  try {
    // Fall back to an empty object in case no JSON body was sent
    const { text, sessionId = "default" } = req.body ?? {};

    if (!text) {
      return res.status(400).json({ error: "Question required" });
    }

    const answer = await chatChain.invoke(
      { input: text },
      { configurable: { sessionId } }
    );

    res.json({ answer });

  } catch (err) {
    console.error(err); // log the real cause for debugging
    res.status(500).json({ error: "AI error" });
  }
});

Easy Meaning

  • Receives question
  • Sends to AI
  • Returns answer

🚀 1️⃣0️⃣ server.mjs

Starts backend server.

import express from "express";
import cors from "cors";
import { config } from "dotenv";
import { router } from "./route.mjs";

config();

const app = express();

app.use(cors());
app.use(express.json());

app.get("/", (req, res) => {
  res.send("Science Teacher AI running");
});

app.use("/ask", router);

const PORT = process.env.PORT || 3000;

app.listen(PORT, () => {
  console.log("Server running on http://localhost:" + PORT);
});

Easy Meaning

  • Creates server
  • Enables JSON
  • Adds API route
  • Starts app

▶️ 1️⃣1️⃣ Run Project

Add a script to package.json:

"scripts": {
  "dev": "nodemon src/server.mjs"
}

Run:

pnpm dev

🧪 1️⃣2️⃣ Test Chatbot

POST → http://localhost:3000/ask

Body:

{
  "text": "What is gravity?"
}

Response:

{
  "answer": "Gravity is a force..."
}
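From a terminal, the same request can be sent with curl (this assumes the server from the previous step is running on port 3000):

```shell
curl -X POST http://localhost:3000/ask \
  -H "Content-Type: application/json" \
  -d '{"text": "What is gravity?"}'
```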

🧠 1️⃣3️⃣ Memory Test

Send:

{ "text": "What is gravity?", "sessionId": "u1" }

Then:

{ "text": "Explain simpler", "sessionId": "u1" }

The AI remembers 👍


πŸ† 1️⃣4️⃣ What You Built

You created:

  • AI backend
  • LangChain chatbot
  • Memory system
  • Express API
  • Science teacher AI

This is a real AI backend architecture.
