Function Calling and Memory: Taking AI Chat to the Next Level

Ever wished your AI could remember that you prefer short, direct answers? Or that you like more detailed responses on certain topics? AI memory makes this possible, allowing the system to recall your preferences and adapt across different conversations.

At LLMChat, we've been working on building AI chat experiences that feel more intuitive, not just by making the AI smarter but by making it more personal. One of the key ways we've done this is by giving the AI the ability to remember.

How AI Memory Works

AI memory stores user-specific information to personalize future interactions. It leverages a function-calling approach, triggering specific actions when new information needs to be added, updated, or removed. For example, if you tell the AI you prefer concise answers, it remembers that and adjusts its responses in future chats.

Here's the schema we use for managing memory:

import { z } from "zod";

// Schema for the memory tool's arguments: the new facts to store and the
// user request that triggered the call
const memoryToolSchema = z.object({
  memory: z
    .array(z.string().describe("Key information about the user"))
    .describe("New info to be added or updated"),
  question: z.string().describe("The user's request"),
});
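To make the shape concrete, here's a hedged sketch of what a tool call's arguments might look like once the model decides something is worth remembering (the values below are made up for illustration):

// Hypothetical arguments the model might send when invoking the memory tool
const exampleArgs = memoryToolSchema.parse({
  memory: ["Prefers short, direct answers"],
  question: "Can you keep your answers brief from now on?",
});
// exampleArgs.memory -> ["Prefers short, direct answers"]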

The Memory Function in Action

Let's look at the core of our AI memory system. When new information is provided, such as user preferences, our DynamicStructuredTool ensures that the AI updates or adds the necessary details dynamically. Here's a glimpse of how it works:

// Imports assume the split @langchain/core packages; exact paths can differ
// by LangChain version.
import { DynamicStructuredTool } from "@langchain/core/tools";
import { RunnableSequence } from "@langchain/core/runnables";
import { PromptTemplate } from "@langchain/core/prompts";

// ToolExecutionContext and memoryParser are defined elsewhere in our codebase:
// the context carries the model, the stored preferences, and an update
// callback; the parser pulls the merged memory list out of the model's output.
const memoryFunction = (context: ToolExecutionContext) => {
  return new DynamicStructuredTool({
    name: "memory",
    description: "Manages user preferences and adapts interactions...",
    schema: memoryToolSchema,
    func: async ({ memory, question }) => {
      const existingMemories = context.preferences?.memories || [];

      // Ask the model to merge the new info with what it already remembers
      const chain = RunnableSequence.from([
        PromptTemplate.fromTemplate(`
          User request: "{question}"
          New info: {new_memory}
          Existing memories: {existing_memory}

          Update memories:
          1. Update existing details
          2. Remove if necessary
          3. Add new unique memories`),
        context.model,
        memoryParser,
      ]);

      const response = await chain.invoke({
        new_memory: memory.join("\n"),
        existing_memory: existingMemories.join("\n"),
        question: question,
      });

      // Persist the merged list and hand the original question back to the chat flow
      context.updatePreferences?.({ memories: response.memories });
      return question;
    },
  });
};

This function ensures that the AI continuously adapts to user preferences, making every interaction feel tailored and more relevant.
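One piece not shown above is memoryParser, which lives elsewhere in our codebase. As a rough sketch (an assumption, not the production parser), it could be a structured output parser that expects the model to return the merged list of memories:

import { z } from "zod";
import { StructuredOutputParser } from "langchain/output_parsers";

// Assumed output shape: the model returns { memories: string[] } after merging
const memoryParser = StructuredOutputParser.fromZodSchema(
  z.object({
    memories: z.array(z.string()).describe("The consolidated list of memories"),
  })
);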

Why Memory Matters

AI memory enhances user experiences by making interactions more personalized. Whether it's remembering how you like your answers, tracking ongoing projects, or knowing your preferences, memory allows AI to operate more intelligently. It also gives users control—allowing them to manage what's remembered, update preferences, or clear everything if needed.

// Example: Updating user preferences in real-time
context.updatePreferences?.({
  memories: response.memories,
});
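Clearing everything works the same way. Assuming the same updatePreferences callback, wiping memory is just a matter of writing back an empty list:

// Example: clearing all stored memories
context.updatePreferences?.({ memories: [] });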

Conclusion

Memory makes AI more than just a tool—it becomes a companion that adapts to you. By using a function-calling approach, we've unlocked new possibilities for dynamic and personalized conversations. At LLMChat, we're excited about how memory can transform AI interactions, making them smarter and more human-like.

Top comments (1)

harshit_lakhani • Edited

Here are some screenshots of the memory plugin:

  1. Before it learned my preference for news reading
  2. After it learned my preference
  3. Memory management
