ASHDEEP SINGH

Relationships in LLMs

Hi!
So this week was spent learning how AI stores relationships in order to generate more relevant and precise answers to queries.
Up to this point we know we can use memory in AI to make its responses more organic, and it works too, but the issue is that we still can't expect the AI to know what I like or where I live if I haven't stored that in the vector space. Obviously we can store it there, but we'd have to manually extract the relevant info and then turn it into a vector entry.

Wouldn't it be cool if our gen-AI model could learn about us just by talking with us? That way, even while we are asking for something else, it can pick up and store some info about us. Then next time it can answer more precisely when the question revolves around something we never asked directly but mentioned in conversation.

Here's the GitHub repo: https://github.com/Ashdeep-Singh-97/gen-ai-graph

So here's an example: we can say this to the AI:
"When should I travel to Patiala? I live in Chandigarh and wanted to go for an outing."

Note that we are not explicitly feeding the AI the fact that we live in Chandigarh, but it will store this info in relation to us, so it can fetch it in the future whenever it needs it, or even use it unprompted when it has to.
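Conceptually (this is a sketch of the idea, not the library's exact output), the memory layer distills that one message into plain facts plus relations between entities:

vector memories: "Lives in Chandigarh", "Wants to visit Patiala"
graph edges: (arsh)-[:LIVES_IN]->(chandigarh), (arsh)-[:WANTS_TO_VISIT]->(patiala)

The actual labels and relationship types are whatever the extractor infers, so yours may differ.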
With all that said and done, let's see how we can do this. We will need to update the code from the previous week (in which we were using memory) to accommodate the relationship part. For the graph store, we can use Neo4j.
Let's have a code walkthrough:

  • Configuring Memory

Here comes the cool part: setting up where and how memories will be stored.

// Assuming the mem0 OSS SDK here, since its Memory class matches this config shape.
import { Memory } from 'mem0ai/oss';

const mem = new Memory({
  version: 'v1.1',
  enableGraph: true, // turn on relationship (graph) memory alongside vectors
  graphStore: {
    provider: 'neo4j',
    config: {
      url: 'neo4j://localhost:7687',
      username: 'neo4j',
      password: process.env.PASSWORD,
    },
  },
  vectorStore: {
    provider: 'qdrant',
    config: {
      collectionName: 'memories',
      embeddingModelDims: 1536, // matches OpenAI's 1536-dim embedding models
      host: 'localhost',
      port: 6333,
    },
  },
});


Graph Store (Neo4j)

Stores relationships between entities (like “Arsh likes Book A” or “Arsh asked about travel”).

Helps the AI understand connections.
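
Want to see what actually landed in the graph? Here's a minimal sketch using the official neo4j-driver package (run it as an ES module so top-level await works; the query just dumps triples, because the exact labels and relationship types are whatever the extractor created):

import neo4j from 'neo4j-driver';

const driver = neo4j.driver(
  'neo4j://localhost:7687',
  neo4j.auth.basic('neo4j', process.env.PASSWORD)
);

const session = driver.session();
try {
  // Grab a sample of (node)-[relationship]->(node) triples.
  const result = await session.run(
    'MATCH (n)-[r]->(m) RETURN n, type(r) AS rel, m LIMIT 25'
  );
  for (const rec of result.records) {
    console.log(rec.get('n').properties, rec.get('rel'), rec.get('m').properties);
  }
} finally {
  await session.close();
  await driver.close();
}

You can also just open Neo4j Browser at http://localhost:7474 and run the same Cypher there.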

  • The Chat Function
async function chat(query = '') {
  // Pull stored memories relevant to this query for this user.
  const memories = await mem.search(query, { userId: 'arsh' });
  const memStr = memories.results.map((e) => e.memory).join('\n');


First, it searches memory based on the current query.

It looks up the memories stored for the user "arsh" that are relevant to the query.

Then it joins those memories into a big string (memStr).

This ensures the AI knows the context about the user before replying.
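
To make that concrete, here's roughly what a search result can look like (illustrative only; the exact fields depend on your mem0 version and what's been stored so far):

// Hypothetical result -- the ids and scores are made up for illustration:
const memories = {
  results: [
    { id: 'a1', memory: 'Lives in Chandigarh', score: 0.89 },
    { id: 'b2', memory: 'Wants to travel to Patiala', score: 0.74 },
  ],
};

// memStr then becomes:
// "Lives in Chandigarh\nWants to travel to Patiala"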

  • System Prompt with Context
const SYSTEM_PROMPT = `
    Context About User:
    ${memStr}
  `;


The system prompt tells GPT:
“Here’s everything I know about the user from past conversations. Use this while responding.”

This is how GPT gets a kind of long-term memory.
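
With the two example memories from above, the interpolated system prompt would read something like this (whitespace aside):

Context About User:
Lives in Chandigarh
Wants to travel to Patiala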

  • Sending the Query to GPT
const response = await client.chat.completions.create({
  model: 'gpt-4.1-mini',
  messages: [
    { role: 'system', content: SYSTEM_PROMPT },
    { role: 'user', content: query },
  ],
});


Model used: gpt-4.1-mini (a smaller, cheaper variant of GPT-4.1).

Messages:

system → The context (memory of the user).

user → The actual question/query.

The model then generates a response considering both.
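
One thing the snippet above assumes is the client. That's just the standard OpenAI Node SDK client, created once near the top of the file; by default it reads your API key from the OPENAI_API_KEY environment variable:

import OpenAI from 'openai';

// The constructor picks up OPENAI_API_KEY from the environment automatically.
const client = new OpenAI();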

  • Showing and Saving the Response
console.log(`\n\n\nBot:`, response.choices[0].message.content);
console.log('Adding to memory...');
await mem.add(
  [
    { role: 'user', content: query },
    { role: 'assistant', content: response.choices[0].message.content },
  ],
  { userId: 'arsh' }
);
console.log('Adding to memory done...');


First, it prints GPT’s answer.

Then, both the user query and the assistant’s reply are stored back in memory.

This way, next time you ask something, the bot already knows you asked about books, movies, or whatever in the past.

  • Running the Chat
chat('Suggest me some books?');


When you run this, the chatbot will:

Check if you’ve asked about books before.

Use that past context in its answer.

Save this conversation for the future.

Over time, the bot starts feeling more personalized, because it remembers your preferences.
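
For example, using nothing but the chat function above, the payoff across runs looks something like this (run as an ES module so top-level await works):

// First run: mention where you live only in passing.
await chat('When should I travel to Patiala? I live in Chandigarh and want an outing.');

// A later run, even in a fresh process: the stored relationship fills the gap.
await chat('Suggest a day trip near my city.');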

And that's how you make an LLM store the relationships in a chat. Now your relationship with your LLM can improve too.

And that was all for this week, folks. Keep following for more.

Peace....
