No More Forgetful Robots: My Test Drive with Cognee AI's "AI Memory"

Pravesh Sudha

If you've played around with AI chatbots like Gemini or ChatGPT, you know they're super smart, but they have one huge flaw: they forget everything.

Every time you ask them something, it's a completely new request. They don't remember what happened last time, and they don't know anything about your personal files or documents. It's like talking to someone with total short-term memory loss when it comes to your information.

I recently got to try Cognee AI, and it's basically a tool that fixes this memory problem. It gives the AI a brain for your data.


How Cognee Teaches AI to Remember

The main idea is simple: instead of just handing your documents to the AI, Cognee first organizes them.

  1. You give Cognee your data (text, documents, whatever).
  2. It runs a special process called cognify.
  3. This process turns your messy data into a neat "Knowledge Map" (or graph) that shows how everything is connected.

Now, your data isn't just a list of words; it's an AI memory the LLM can understand deeply.
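
In code, that whole flow boils down to three calls: add, cognify, search. Here's a minimal sketch of it (the full script I actually ran is in the next section):

import asyncio
import cognee

async def demo():
    # 1. Give Cognee your data
    await cognee.add("Any text, notes, or documents you want it to remember.")
    # 2. Build the knowledge graph (the "cognify" step)
    await cognee.cognify()
    # 3. Ask questions against that memory
    print(await cognee.search(query_text="What is this data about?"))

asyncio.run(demo())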


The Experiment: Getting Only the Right Answer

I did a quick test: I fed Cognee a few sentences about myself.

from cognee import SearchType, visualize_graph
import cognee 
import asyncio
import os, pathlib

async def main():

    # Create a clean slate for cognee -- reset data and system state
    await cognee.prune.prune_data()
    await cognee.prune.prune_system(metadata=True)

    # Add sample content
    text = "Pravesh Sudha is a DevOps Engineer, AWS Community Builder and Content Creator. He loves to try out new AI tools like Cognee AI, Portia AI and Runner-H. He shares his learnings on his socials and detailed blogs on Hashnode, Dev.to and Medium, along with detailed project tutorials on YouTube."

    await cognee.add(text)

    # Process with LLMs to build the knowledge graph
    await cognee.cognify()

    graph_file_path = str(
        pathlib.Path(
            os.path.join(pathlib.Path(__file__).parent, ".artifacts/graph_visualization.html")
        ).resolve()
    )
    await visualize_graph(graph_file_path)


    # Search the knowledge graph
    graph_result = await cognee.search(
        query_text="What does Pravesh Sudha do?", query_type=SearchType.GRAPH_COMPLETION
    )
    print("Graph Result: ")
    print(graph_result)

    rag_result = await cognee.search(
        query_text="What does Pravesh Sudha do?", query_type=SearchType.RAG_COMPLETION
    )
    print("RAG Result: ")
    print(rag_result)

    basic_result = await cognee.search(
        query_text="What are the main themes in my data?"
    )
    print("Basic Result: ")
    print(basic_result)


if __name__ == '__main__':
    asyncio.run(main())

Then I used Gemini 2.5 Flash as the underlying LLM, wired up through Cognee, to ask a detailed question about that text.
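
For context on the setup: Cognee reads its LLM settings from environment variables, typically a .env file in the project root. Mine looked roughly like the snippet below; the exact variable names and the Gemini model string follow Cognee's .env template as I remember it, so double-check them against the docs for your Cognee version:

LLM_PROVIDER="gemini"
LLM_MODEL="gemini/gemini-2.5-flash"
LLM_API_KEY="<your-gemini-api-key>"

Embedding settings are configured the same way, through the EMBEDDING_* variables in the same file.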

The result? The answer was 100% accurate and used ONLY the specific facts I gave it; the model stayed completely focused. For my use case I used SearchType.GRAPH_COMPLETION (which generates the answer from the knowledge graph Cognee built) and SearchType.RAG_COMPLETION (which works like classic RAG, answering from the retrieved text chunks rather than the graph). There are other search types too, such as SUMMARIES and CHUNKS.
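
The other search types follow the exact same pattern. Inside the same main() coroutine you could also run something like this (I didn't include these in my test, so treat it as a sketch):

    summaries = await cognee.search(
        query_text="Summarize my data", query_type=SearchType.SUMMARIES
    )
    print("Summaries: ")
    print(summaries)

    chunks = await cognee.search(
        query_text="Pravesh Sudha", query_type=SearchType.CHUNKS
    )
    print("Chunks: ")
    print(chunks)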

When I tried to ask Gemini the same question without using Cognee, it couldn't answer the specific details because it had no memory of my text.

Cognee also generates a neat graph visualization so you can see how the entities in the cognified text relate to each other. The script above writes it to .artifacts/graph_visualization.html.

This showed me why a memory layer like Cognee matters for reliability: it stops the AI from guessing or making things up and forces it to stick to your facts. If you need an AI to be an expert on your documents, this memory layer is a total game-changer!

Have you tried anything like this to make your AI smarter about your own data? Let me know! I'm also working on a blog about deploying Cognee on AWS, so stay tuned for that one!

If you found this guide useful, feel free to connect with me and check out more of my work 👇

🔗 Connect with me

👋 Adios, see you in the next one!

Top comments (6)

Nadine

I'm interested in this Pravesh. Since Cognee creates knowledge graphs, entity extraction is part of the pipeline. It makes the difference between models that can simply answer questions and models that find connections between distinct pieces of information. Great job!

Pravesh Sudha

Happy to help!

Aaron Rose

nice one pravesh! 💯

Pravesh Sudha

Thanks 😄

Hande Kafkas

great work and can't wait for the next one!!

Pravesh Sudha

Thanks :)