Jerry Gathu

Building Memory-Enabled AI Agents with LangMem

Introduction

Modern AI agents need more than the ability to respond to queries; they need memory. They need to remember past interactions, learn from examples, and store important information for future use. This is where LangMem comes in.

LangMem is a powerful memory management system that integrates seamlessly with LangGraph agents, enabling them to store, search, and retrieve information across conversations. In this article, we'll explore how to build a customer support agent that uses LangMem to provide personalized, context-aware assistance.

What is LangMem?

LangMem provides two key capabilities for AI agents:

  1. Persistent Storage: Store information that persists across multiple conversations
  2. Semantic Search: Find relevant information using natural language queries powered by embeddings

Think of LangMem as giving your AI agent a notebook where it can write down important information and quickly find what it needs later.

Setting Up Your Environment

First, install the required packages:

pip install langchain langchain-openai langgraph langmem python-dotenv

Next, set up your environment with the necessary API keys:

import os
from dotenv import load_dotenv

# Load OPENAI_API_KEY (and any other secrets) from a .env file
load_dotenv()

# Alternatively, set the key directly (avoid hard-coding real keys in source)
# os.environ['OPENAI_API_KEY'] = 'your-api-key-here'

Creating a Memory Store

The foundation of this setup is a store. For local development, the simplest option is LangGraph's InMemoryStore; as the name suggests, it keeps everything in process memory, which is great for experimentation but means the data is gone when the program exits. This is where all your agent's memories will be stored:

from langgraph.store.memory import InMemoryStore

store = InMemoryStore(
    index={
        "dims": 1536,  # dimensionality of text-embedding-3-small vectors
        "embed": "openai:text-embedding-3-small",
    }
)

The index parameter enables semantic search by creating embeddings of stored content. This allows your agent to find relevant memories even when queries don't match exactly.
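
For example, a memory stored with one wording can be found later with completely different phrasing. Here is a minimal sketch that reuses the store above (it needs a valid OpenAI key, since the query is embedded too):

# Store a note about a billing incident
store.put(
    ("demo",),
    "billing_note",
    {"text": "Customer was charged twice for the March invoice"}
)

# A differently-worded query still surfaces it via embeddings
results = store.search(("demo",), query="duplicate payment problem")
for item in results:
    print(item.value["text"])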

Understanding Namespaces

LangMem uses namespaces to organize memories, similar to folders in a file system. A namespace is a tuple that creates a hierarchical structure:

# User-specific namespace
namespace = ("lance",)

# More specific namespace for examples
examples_namespace = (
    "support_assistant",
    "lance",
    "examples"
)

This organization lets you:

  • Separate memories by user
  • Categorize different types of information
  • Control access to specific memory sets
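
For example, the same key can live under two different users' namespaces without colliding (a quick sketch against the store created earlier; the user names are just placeholders):

# Each user gets their own namespace
store.put(("alice",), "preferences", {"tone": "formal"})
store.put(("bob",), "preferences", {"tone": "casual"})

# Reads are scoped to the namespace you ask for
print(store.get(("alice",), "preferences").value)  # {'tone': 'formal'}
print(store.get(("bob",), "preferences").value)    # {'tone': 'casual'}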

Storing Information in Memory

Basic Storage Operations

LangMem provides simple methods to store and retrieve data:

# Store a value
store.put(
    namespace=("lance",),
    key="triage_tech",
    value={"prompt": "Handle login issues, API problems, system errors"}
)

# Retrieve a value
result = store.get(namespace=("lance",), key="triage_tech")
if result is not None:
    print(result.value['prompt'])
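
The object returned by get isn't the raw dictionary but an item wrapper: .value holds what you stored, and a few metadata fields ride along. The attribute names below follow the current langgraph store interface; treat them as an assumption if you're on an older release:

item = store.get(namespace=("lance",), key="triage_tech")
if item is not None:
    print(item.key)         # "triage_tech"
    print(item.namespace)   # ("lance",)
    print(item.value)       # the stored dictionary
    print(item.created_at)  # set when the entry was first written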

Practical Example: Storing Configuration

Here's how our customer support agent stores triage rules:

def store_triage_rules(store, user_id):
    namespace = (user_id,)

    # Store tech support rules
    store.put(
        namespace,
        "triage_tech",
        {"prompt": "Login issues, API problems, system errors"}
    )

    # Store sales support rules
    store.put(
        namespace,
        "triage_sales",
        {"prompt": "Pricing questions, demos, upgrade requests"}
    )

    # Store finance support rules
    store.put(
        namespace,
        "triage_finance",
        {"prompt": "Payment issues, refunds, billing disputes"}
    )
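
Calling it once per user seeds the store; the rules can then be read back by key:

store_triage_rules(store, "lance")

rules = store.get(("lance",), "triage_sales")
print(rules.value["prompt"])  # Pricing questions, demos, upgrade requests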

Semantic Search with LangMem

One of LangMem's most powerful features is semantic search—finding relevant memories based on meaning, not just exact matches:

# Search for relevant examples
examples = store.search(
    ("support_assistant", "lance", "examples"),  # namespace prefix is passed positionally
    query="customer asking about payment problems"
)

# Process search results
for example in examples:
    print(f"Subject: {example.value['subject']}")
    print(f"Content: {example.value['content']}")
    print(f"Label: {example.value['label']}")

The search returns items ranked by semantic similarity to your query, even if they don't contain the exact words.
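
You can also cap how many items come back and inspect how close each match is. The limit argument and the per-result score come from langgraph's search API; if your version differs, treat the score field as an assumption:

examples = store.search(
    ("support_assistant", "lance", "examples"),
    query="customer asking about payment problems",
    limit=3  # only the three closest matches
)

for example in examples:
    print(f"{example.score:.3f}  {example.value['subject']}")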

Creating Memory Tools for Agents

LangMem provides pre-built tools that agents can use to manage their own memory:

from langmem import create_manage_memory_tool, create_search_memory_tool

# Tool for storing memories
manage_memory_tool = create_manage_memory_tool(
    namespace=(
        "support_assistant",
        "{langgraph_user_id}",
        "collection"
    )
)

# Tool for searching memories
search_memory_tool = create_search_memory_tool(
    namespace=(
        "support_assistant",
        "{langgraph_user_id}",
        "collection"
    )
)

The "{langgraph_user_id}" placeholder in the namespace is resolved at runtime from the config you pass when invoking the agent, so each user's memories stay separate. These tools allow your agent to:

  • Decide what information is important to remember
  • Store it for future reference
  • Search for relevant past information when needed

Building a Memory-Enabled Customer Support Agent

Let's put it all together with a complete example. Our agent will:

  1. Remember triage rules for different support categories
  2. Search for similar past tickets (few-shot examples)
  3. Store information about customer interactions

Step 1: Define the Agent's System Prompt

agent_system_prompt = """
You are ABC Company's customer support assistant.

You have access to the following tools:

1. send_to_tech_support() - Route technical issues
2. send_to_sales_support() - Route sales inquiries  
3. send_to_finance_support() - Route billing issues
4. manage_memory - Store relevant information for future reference
5. search_memory - Search for relevant past information

Use manage_memory to store important details about:
- Customer issues and resolutions
- Common problems and solutions
- Customer preferences and history

Use search_memory to find:
- Similar past tickets
- Previous interactions with this customer
- Relevant solutions or patterns
"""

Step 2: Create the Prompt Function

This function retrieves stored instructions from memory:

def create_prompt(state, config, store):
    user_id = config['configurable']['langgraph_user_id']
    namespace = (user_id,)

    # Try to get custom instructions from memory
    result = store.get(namespace, "agent_instructions")

    if result is None:
        # Use default instructions if none stored
        instructions = "Use these tools appropriately"
        store.put(namespace, "agent_instructions", {"prompt": instructions})
    else:
        instructions = result.value['prompt']

    return [
        {"role": "system", "content": agent_system_prompt.format(instructions=instructions)},
        *state['messages']
    ]

Step 3: Build the Agent with Memory Tools
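
The routing tools referenced throughout (send_to_tech_support and friends) aren't shown in the article; here's a minimal sketch of what they might look like using LangChain's @tool decorator. The names and bodies are placeholders; in a real system they would hand the ticket off to the right team:

from langchain_core.tools import tool

@tool
def send_to_tech_support(summary: str) -> str:
    """Route a technical issue (logins, APIs, system errors) to tech support."""
    return f"Ticket routed to tech support: {summary}"

@tool
def send_to_sales_support(summary: str) -> str:
    """Route a sales inquiry (pricing, demos, upgrades) to the sales team."""
    return f"Ticket routed to sales: {summary}"

@tool
def send_to_finance_support(summary: str) -> str:
    """Route a billing issue (payments, refunds, disputes) to finance."""
    return f"Ticket routed to finance: {summary}"

With those in place, assemble the agent and hand it the memory store: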

from langgraph.prebuilt import create_react_agent

tools = [
    send_to_tech_support,
    send_to_sales_support,
    send_to_finance_support,
    manage_memory_tool,  # Agent can store memories
    search_memory_tool   # Agent can search memories
]

agent = create_react_agent(
    model="openai:gpt-4o",
    tools=tools,
    prompt=create_prompt,
    store=store  # Pass the store to enable memory
)

Step 4: Using the Agent

customer_input = {
    "subject": "Payment refund",
    "message": "I am unsatisfied with the service and wish to receive a full refund"
}

config = {"configurable": {"langgraph_user_id": "lance"}}

# create_react_agent works on a messages-based state, so wrap the ticket in a user message
response = agent.invoke(
    {"messages": [{
        "role": "user",
        "content": f"Subject: {customer_input['subject']}\n\n{customer_input['message']}"
    }]},
    config=config
)

print(response["messages"][-1].content)  # the agent's final reply

Real-World Use Cases

1. Few-Shot Learning from Past Examples

Store successful ticket resolutions and retrieve similar ones:

# Store a successful resolution
store.put(
    namespace=("support_assistant", "lance", "examples"),
    key="ticket_001",
    value={
        "subject": "Cannot log in",
        "content": "User forgot password",
        "label": "tech_support",
        "resolution": "Password reset link sent"
    }
)

# Later, search for similar tickets
similar = store.search(
    ("support_assistant", "lance", "examples"),  # namespace prefix is passed positionally
    query="user can't access account"
)
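
LangMem hands back the raw stored items; turning them into few-shot examples is up to you. One simple approach (a sketch, not a LangMem API) is to format the closest matches into a block of text and feed it into the system prompt for the next run:

def format_few_shot_examples(examples):
    """Render retrieved tickets as few-shot examples for the system prompt."""
    blocks = []
    for ex in examples:
        blocks.append(
            f"Subject: {ex.value['subject']}\n"
            f"Issue: {ex.value['content']}\n"
            f"Routed to: {ex.value['label']}\n"
            f"Resolution: {ex.value.get('resolution', 'n/a')}"
        )
    return "\n\n".join(blocks)

few_shot_block = format_few_shot_examples(similar)
print(few_shot_block)  # append this to the system prompt before handling the new ticket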

2. Personalized User Preferences

Remember how each user prefers to be helped. (In a live system the agent calls these tools itself mid-conversation; they're invoked directly below just to show the payloads.)

# Agent stores a memory
manage_memory_tool.invoke({
    "content": "Customer prefers detailed technical explanations"
})

# Agent searches before responding
preferences = search_memory_tool.invoke({
    "query": "how does this customer like to receive help"
})

3. Dynamic Rule Updates

Update triage rules based on new policies:

# Update tech support criteria
store.put(
    namespace=("lance",),
    key="triage_tech",
    value={"prompt": "Now also includes mobile app issues"}
)
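
Keep in mind that put with an existing key overwrites the whole stored value. To extend the rules rather than replace them, read the current value, modify it, and write it back:

current = store.get(("lance",), "triage_tech")
existing_prompt = current.value["prompt"] if current is not None else ""

store.put(
    ("lance",),
    "triage_tech",
    {"prompt": existing_prompt + "; mobile app issues"}
)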

Best Practices

1. Use Hierarchical Namespaces

# Good: Organized and specific
("company", "user_id", "ticket_examples")

# Less ideal: Flat structure
("examples",)

2. Check for Existing Values

result = store.get(namespace, key)
if result is None:
    # Initialize with default
    store.put(namespace, key, default_value)
else:
    # Use existing value
    value = result.value

3. Use Descriptive Keys

# Good
store.put(namespace, "triage_rules_tech", data)

# Less clear
store.put(namespace, "tr_t", data)

4. Leverage Semantic Search

# Let the agent describe what it's looking for naturally
query = "customer who had billing problems last month"
results = store.search(namespace, query=query)

Conclusion

LangMem transforms stateless AI agents into intelligent assistants with memory. By combining persistent storage with semantic search, your agents can:

  • Learn from past interactions
  • Provide personalized experiences
  • Improve over time with few-shot learning
  • Maintain context across conversations

The key is to think about what information your agent needs to remember and to organize it effectively using namespaces. With LangMem, you're not just building a chatbot; you're building an assistant that gets smarter with every interaction.

