Alex Spinov
LangChain.js Has a Free API — Build AI Agents in JavaScript

LangChain.js is a JavaScript/TypeScript framework for building LLM-powered applications. Chains, agents, RAG, memory: everything you need for production AI apps.

Quick Start

npm install langchain @langchain/openai

import { ChatOpenAI } from '@langchain/openai';

// Reads OPENAI_API_KEY from the environment
const model = new ChatOpenAI({
  model: 'gpt-4o-mini', // `model` is the current option name; `modelName` is the legacy alias
  temperature: 0,
});

const response = await model.invoke('What is TypeScript?');
console.log(response.content);

Prompt Templates

import { ChatPromptTemplate } from '@langchain/core/prompts';

const prompt = ChatPromptTemplate.fromMessages([
  ['system', 'You are a {role}. Respond in {language}.'],
  ['user', '{question}'],
]);

const chain = prompt.pipe(model);
const result = await chain.invoke({
  role: 'senior developer',
  language: 'English',
  question: 'Explain closures',
});

Output Parsers

import { StructuredOutputParser } from 'langchain/output_parsers';
import { z } from 'zod';

const parser = StructuredOutputParser.fromZodSchema(
  z.object({
    name: z.string(),
    pros: z.array(z.string()),
    cons: z.array(z.string()),
    rating: z.number().min(1).max(10),
  })
);

// The prompt must include the parser's format instructions
// and expose the {topic} variable we invoke with
const reviewPrompt = ChatPromptTemplate.fromMessages([
  ['system', 'Review the given technology.\n{format_instructions}'],
  ['user', '{topic}'],
]);

const chain = reviewPrompt.pipe(model).pipe(parser);
const result = await chain.invoke({
  topic: 'React',
  format_instructions: parser.getFormatInstructions(),
});
// e.g. { name: 'React', pros: [...], cons: [...], rating: 8 }

RAG (Retrieval-Augmented Generation)

import { MemoryVectorStore } from 'langchain/vectorstores/memory';
import { OpenAIEmbeddings } from '@langchain/openai';
import { RecursiveCharacterTextSplitter } from 'langchain/text_splitter';

// Split documents (assumes `documents` is an array of Document
// objects loaded earlier, e.g. with a document loader)
const splitter = new RecursiveCharacterTextSplitter({
  chunkSize: 1000,
  chunkOverlap: 200,
});
const docs = await splitter.splitDocuments(documents);

// Create vector store
const vectorStore = await MemoryVectorStore.fromDocuments(
  docs,
  new OpenAIEmbeddings()
);

// Search
const results = await vectorStore.similaritySearch('How to deploy?', 3);

Agents with Tools

import { TavilySearchResults } from '@langchain/community/tools/tavily_search';
import { createReactAgent } from '@langchain/langgraph/prebuilt';

// Requires TAVILY_API_KEY in the environment
const tools = [new TavilySearchResults({ maxResults: 3 })];

const agent = createReactAgent({
  llm: model,
  tools,
});

const result = await agent.invoke({
  messages: [{ role: 'user', content: 'What is the latest version of Node.js?' }],
});

Conversation Memory

// BufferMemory and ConversationChain are legacy APIs (still shipped in
// the `langchain` package); LangGraph is the recommended path for new code
import { BufferMemory } from 'langchain/memory';
import { ConversationChain } from 'langchain/chains';

const memory = new BufferMemory();
const chain = new ConversationChain({ llm: model, memory });

await chain.invoke({ input: 'My name is Alice' });
const response = await chain.invoke({ input: 'What is my name?' });
// "Your name is Alice"

Using with Ollama (Local LLMs)

import { ChatOllama } from '@langchain/ollama';

// Assumes Ollama is running locally and the model has been pulled:
//   ollama pull llama3.2
const localModel = new ChatOllama({
  model: 'llama3.2',
  temperature: 0,
});

// Works with all LangChain features
const chain = prompt.pipe(localModel).pipe(parser);

Need web data for your AI pipeline? Check out my Apify actors — scrape any site and feed it to LangChain. Email spinov001@gmail.com for custom AI data solutions.

LangChain or Vercel AI SDK — which do you use for AI apps? Comment below!
