DEV Community

Ruly Altamirano

LuminoraCore v1.1: Your AI Memory That Actually Travels 🧠✨

TL;DR: I built an open-source framework that gives AI persistent memory across platforms. Your ChatGPT conversations can inform Claude, your Claude chats can continue in Gemini. True data portability, privacy-first.

🎬 See It In Action (2 minutes)

Watch how memory persists when switching between ChatGPT, Claude, and other LLMs


🤔 The Problem I Was Trying to Solve

We've all been there:

  1. ChatGPT forgets your conversation from yesterday
  2. Claude doesn't know what you discussed with ChatGPT
  3. Gemini starts fresh every time
  4. Your data is locked to each platform
  5. Export options? LOL, good luck

I was frustrated. I wanted my AI to remember me, know my preferences, and travel with me across platforms.

So I built LuminoraCore.


✨ What Makes It Different

Not Just Another Chatbot Framework

LuminoraCore isn't about building yet another chatbot. It's about creating portable AI identities that persist across:

  • 🔄 Any LLM provider (OpenAI, Anthropic, DeepSeek, Groq, Mistral...)
  • 💾 Any storage backend (SQLite, PostgreSQL, MongoDB, DynamoDB...)
  • 🚀 Any application (your own apps, integrations, plugins...)

The Three Pillars

1. Persistent Memory 🧠

from luminoracore import LuminoraCoreClient

client = LuminoraCoreClient()

# (the awaits below assume an async context; see Getting Started for a runnable script)

# First conversation
response = await client.send_message_with_memory(
    session_id="user_123",
    message="I'm Carlos, I love Python and basketball"
)
# AI: "Nice to meet you, Carlos! Python and basketball, great combo!"

# Next day, different conversation
response = await client.send_message_with_memory(
    session_id="user_123",
    message="What do you know about me?"
)
# AI: "You're Carlos, you love Python and basketball!"

How it works:

  • Automatically extracts facts from conversations (9 categories)
  • Classifies importance (0-10 scale)
  • Creates episodic memories for significant events
  • Compiles relevant context for each response
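
To make the pipeline above concrete, here is a minimal sketch of what an extracted fact might look like. The `Fact` dataclass and `keyword_extract` are hypothetical names I chose for illustration, and the rule-based extraction is a toy stand-in: LuminoraCore's actual extraction step is LLM-driven.

```python
from dataclasses import dataclass

# Hypothetical shape of an extracted fact; the real extraction step
# asks the LLM to produce records along these lines.
@dataclass
class Fact:
    category: str    # one of the 9 categories, e.g. "interests"
    key: str         # e.g. "programming_language"
    value: str       # e.g. "Python"
    importance: int  # 0-10 scale described above

def keyword_extract(message: str) -> list[Fact]:
    """Toy rule-based stand-in for the LLM extraction step."""
    facts = []
    if "Python" in message:
        facts.append(Fact("interests", "programming_language", "Python", 7))
    if "basketball" in message:
        facts.append(Fact("interests", "sport", "basketball", 7))
    return facts

facts = keyword_extract("I'm Carlos, I love Python and basketball")
print([f.value for f in facts])  # ['Python', 'basketball']
```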

2. Affinity System 💖

The AI's relationship with you evolves naturally:

Stranger (0-10 points)    → Formal, polite
    ↓
Acquaintance (11-30)      → Friendly, helpful
    ↓
Friend (31-60)            → Warm, personal
    ↓
Confidant (61+)           → Very personal, empathetic

Real example from my own usage:

  • Day 1 (Stranger): "Hello. How may I assist you today?"
  • Week 2 (Acquaintance): "Hey! How's your Python project going?"
  • Month 2 (Friend): "Carlos! Ready to talk about that basketball game?"
  • Month 6 (Confidant): "I remember you mentioned family stress last week. How are you holding up?"

The tone, depth, and personalization adapt automatically based on your relationship.
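
The ladder above boils down to a simple threshold lookup. This `affinity_level` function is my sketch (the name is hypothetical, not LuminoraCore's API), using the point ranges from the diagram:

```python
def affinity_level(points: int) -> str:
    """Map accumulated affinity points (0-100) to a relationship level,
    using the thresholds from the ladder above."""
    if points <= 10:
        return "stranger"
    if points <= 30:
        return "acquaintance"
    if points <= 60:
        return "friend"
    return "confidant"

print(affinity_level(5))   # stranger
print(affinity_level(15))  # acquaintance
print(affinity_level(45))  # friend
print(affinity_level(80))  # confidant
```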

3. Provider Agnostic 🔄

Switch LLM providers mid-conversation without losing context:

# Start with ChatGPT
client = LuminoraCoreClient(provider="openai")
response = await client.send_message_with_memory(...)

# Switch to Claude (memory persists!)
client = LuminoraCoreClient(provider="anthropic")
response = await client.send_message_with_memory(...)
# Claude knows everything ChatGPT learned

# Try DeepSeek (20x cheaper!)
client = LuminoraCoreClient(provider="deepseek")
response = await client.send_message_with_memory(...)
# Still remembers everything

Supported providers:

  • OpenAI (GPT-4, GPT-3.5)
  • Anthropic (Claude 3.5, Claude 3)
  • DeepSeek
  • Groq
  • Mistral
  • Google (Gemini)
  • Local models via Ollama

πŸ—οΈ Architecture Deep Dive

How Memory Actually Works

User: "I'm learning React"
    ↓
┌─────────────────────┐
│  Fact Extraction    │ ← LLM analyzes message
│  (9 categories)     │
└──────────┬──────────┘
           │
           ├─> Category: "interests"
           ├─> Key: "programming_framework"
           ├─> Value: "React"
           ├─> Importance: 0.7
           ↓
┌─────────────────────┐
│  Update Affinity    │ ← Points +3 (shared interest)
│  (0-100 scale)      │
└──────────┬──────────┘
           │
           ├─> Total: 15 points
           ├─> Level: Acquaintance
           ↓
┌─────────────────────┐
│ Select Personality  │ ← Choose based on affinity
│ (4 hierarchical)    │
└──────────┬──────────┘
           │
           ├─> Use: "friendly_assistant"
           ↓
┌─────────────────────┐
│ Compile Context     │ ← Gather relevant facts
│ (Dynamic selection) │
└──────────┬──────────┘
           │
           ├─> Include: React interest, other programming facts
           ↓
┌─────────────────────┐
│ Generate Response   │ ← LLM with full context
│ (Provider-agnostic) │
└──────────┬──────────┘
           │
           ↓
Response: "That's awesome! React is great.
           Since you're into Python too,
           have you tried Django + React?"
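
To show how the five stages chain together, here is a self-contained toy version of the pipeline. Every name here (`STORE`, `extract_facts`, `respond`, the +3 affinity bump) is my own stand-in, and the rule-based helpers replace what are really LLM calls in LuminoraCore:

```python
import asyncio

# In-memory session store standing in for a real storage backend.
STORE: dict[str, dict] = {}

def extract_facts(message: str) -> list[tuple[str, str]]:
    # Stage 1: fact extraction (toy keyword version of the LLM step)
    return [("interests", w) for w in ("React", "Python") if w in message]

def affinity_level(points: int) -> str:
    # Stage 2: affinity points -> relationship level
    return ("stranger" if points <= 10 else
            "acquaintance" if points <= 30 else
            "friend" if points <= 60 else "confidant")

def pick_personality(level: str) -> str:
    # Stage 3: personality selection based on affinity
    return {"stranger": "formal_assistant"}.get(level, "friendly_assistant")

async def respond(session_id: str, message: str) -> str:
    s = STORE.setdefault(session_id, {"facts": [], "points": 0})
    s["facts"] += extract_facts(message)       # 1. extract facts
    s["points"] += 3                           # 2. affinity +3 (shared interest)
    persona = pick_personality(affinity_level(s["points"]))  # 3. personality
    context = ", ".join(v for _, v in s["facts"])            # 4. compile context
    # 5. generate response (a real LLM call with `persona` + `context` goes here)
    return f"[{persona}] I know you like: {context}"

print(asyncio.run(respond("user_123", "I'm learning React")))
# [formal_assistant] I know you like: React
```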

Fact Categories (9 Types)

  1. basics: Name, age, location, occupation
  2. preferences: Likes, dislikes, favorites
  3. interests: Hobbies, topics of interest
  4. family_friends: Relationships, social connections
  5. work: Career, projects, professional info
  6. health: Wellness, fitness, medical (sensitive)
  7. experiences: Past events, travel, life moments
  8. beliefs: Values, opinions, worldview
  9. goals: Aspirations, plans, ambitions

Episodic Memory (7 Types)

  • conversation: Significant discussions
  • achievement: Accomplishments, milestones
  • challenge: Obstacles, difficulties
  • decision: Important choices made
  • insight: Realizations, learnings
  • social: Interactions with others
  • milestone: Life events
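
The two taxonomies above can be written out as enums. The class and member names below are mine, chosen to mirror the documented categories rather than copied from LuminoraCore's source:

```python
from enum import Enum

class FactCategory(Enum):
    BASICS = "basics"
    PREFERENCES = "preferences"
    INTERESTS = "interests"
    FAMILY_FRIENDS = "family_friends"
    WORK = "work"
    HEALTH = "health"          # sensitive; handle with care
    EXPERIENCES = "experiences"
    BELIEFS = "beliefs"
    GOALS = "goals"

class EpisodeType(Enum):
    CONVERSATION = "conversation"
    ACHIEVEMENT = "achievement"
    CHALLENGE = "challenge"
    DECISION = "decision"
    INSIGHT = "insight"
    SOCIAL = "social"
    MILESTONE = "milestone"

print(len(FactCategory), len(EpisodeType))  # 9 7
```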

🚀 Getting Started (5 Minutes)

Installation

pip install luminoracore

Minimal Example

import asyncio
from luminoracore import LuminoraCoreClient

async def main():
    # Initialize client
    client = LuminoraCoreClient(
        provider="openai",  # or anthropic, deepseek, etc.
        api_key="your-api-key"
    )

    # Create session
    session_id = "user_carlos_session1"

    # First message
    response = await client.send_message_with_memory(
        session_id=session_id,
        message="Hi! I'm Carlos, I'm a software engineer from Madrid"
    )
    print(response['response'])
    print(f"Facts learned: {response['new_facts_count']}")

    # Second message (will remember Carlos)
    response = await client.send_message_with_memory(
        session_id=session_id,
        message="What do you know about me?"
    )
    print(response['response'])
    print(f"Total facts: {response['memory_facts_count']}")

    # Export all memory
    export = await client.export_session(session_id)
    print(export)  # Full JSON with all memories

asyncio.run(main())

Output:

Response: "Hello Carlos! Nice to meet you..."
Facts learned: 3

Response: "You're Carlos, a software engineer from Madrid!"
Total facts: 3

{
  "learned_facts": [
    {"category": "basics", "key": "name", "value": "Carlos"},
    {"category": "basics", "key": "occupation", "value": "software engineer"},
    {"category": "basics", "key": "location", "value": "Madrid"}
  ],
  "affinity": {"points": 5, "level": "stranger"}
}

💡 Real-World Use Cases

1. Personal AI Assistant

# Remembers your preferences, schedule, habits
assistant = LuminoraCoreClient(provider="anthropic")

await assistant.send_message_with_memory(
    session_id="my_assistant",
    message="Schedule my morning workout"
)
# AI knows: You prefer 7am, you do HIIT, 30 minutes

2. Customer Support Bot

# Remembers customer history, past issues, preferences
support = LuminoraCoreClient(provider="deepseek")  # Cheaper!

await support.send_message_with_memory(
    session_id=f"customer_{customer_id}",
    message="My order hasn't arrived"
)
# AI knows: Order #12345, placed 3 days ago, shipped to Madrid

3. Learning Tutor

# Adapts to student's level, tracks progress
tutor = LuminoraCoreClient(provider="openai")

await tutor.send_message_with_memory(
    session_id=f"student_{student_id}",
    message="I don't understand recursion"
)
# AI knows: Student struggles with CS concepts, prefers visual examples

4. Mental Health Companion

# Builds trust over time, remembers emotional patterns
companion = LuminoraCoreClient(provider="anthropic")

await companion.send_message_with_memory(
    session_id="user_therapy",
    message="I'm feeling anxious today"
)
# AI knows: User has anxiety triggers, past coping strategies
# Relationship: Confidant β†’ very empathetic, personal response

🎨 Advanced Features

PersonaBlend™: Mix Personalities

# 70% friendly + 30% professional
blended = client.blend_personalities(
    base_personality="friendly_assistant",
    blend_with="professional_consultant",
    ratio=0.7
)

response = await client.send_message_with_memory(
    session_id=session_id,
    message="Help me write a proposal",
    personality=blended
)
# Gets: Warm tone + professional structure
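
One plausible way to think about a 70/30 blend is linear interpolation over per-trait weights. The `blend` function and the trait dictionaries below are illustrative assumptions, not PersonaBlend's actual model:

```python
def blend(base: dict[str, float], other: dict[str, float],
          ratio: float) -> dict[str, float]:
    """Interpolate trait weights: ratio of `base`, (1 - ratio) of `other`."""
    keys = base.keys() | other.keys()
    return {k: ratio * base.get(k, 0.0) + (1 - ratio) * other.get(k, 0.0)
            for k in keys}

# Hypothetical trait vectors for the two personalities
friendly = {"warmth": 0.9, "formality": 0.2}
professional = {"warmth": 0.4, "formality": 0.9}

mixed = blend(friendly, professional, ratio=0.7)
print(round(mixed["warmth"], 2), round(mixed["formality"], 2))  # 0.75 0.41
```

The blended personality keeps most of the friendly warmth while picking up some professional structure, matching the "warm tone + professional structure" result described above.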

Multi-Storage Support

# Development: SQLite (local, fast)
client = LuminoraCoreClient(
    provider="openai",
    storage_backend="sqlite",
    db_path="./memory.db"
)

# Production: PostgreSQL (robust, scalable)
client = LuminoraCoreClient(
    provider="anthropic",
    storage_backend="postgresql",
    connection_string="postgresql://..."
)

# Cloud: DynamoDB (managed, serverless)
client = LuminoraCoreClient(
    provider="deepseek",
    storage_backend="dynamodb",
    table_name="luminora_memories"
)
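
One reason backend swapping can work is that every backend only needs to satisfy a small common interface. The `MemoryStore` protocol and `SQLiteMemoryStore` below are my sketch of that idea, not LuminoraCore's internal classes:

```python
import sqlite3
from typing import Protocol

class MemoryStore(Protocol):
    """Minimal contract any storage backend could implement."""
    def save_fact(self, session_id: str, key: str, value: str) -> None: ...
    def load_facts(self, session_id: str) -> dict[str, str]: ...

class SQLiteMemoryStore:
    def __init__(self, path: str = ":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS facts "
            "(session_id TEXT, key TEXT, value TEXT)")

    def save_fact(self, session_id: str, key: str, value: str) -> None:
        self.db.execute("INSERT INTO facts VALUES (?, ?, ?)",
                        (session_id, key, value))

    def load_facts(self, session_id: str) -> dict[str, str]:
        rows = self.db.execute(
            "SELECT key, value FROM facts WHERE session_id = ?",
            (session_id,))
        return dict(rows.fetchall())

store: MemoryStore = SQLiteMemoryStore()
store.save_fact("user_123", "location", "Madrid")
print(store.load_facts("user_123"))  # {'location': 'Madrid'}
```

A PostgreSQL or DynamoDB class implementing the same two methods would drop in without touching the calling code.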

Export & Migrate

# Export from ChatGPT setup
export = await client_gpt.export_session("user_123")

# Import to Claude setup
client_claude = LuminoraCoreClient(provider="anthropic")
await client_claude.import_session("user_123", export)

# All memories preserved!

🎯 Roadmap

v1.2 (Q1 2025): Semantic Search

# Instead of exact key matching
results = await client.semantic_search(
    user_id="carlos",
    query="what sports does carlos like",  # Natural language!
    top_k=5
)
# Returns: basketball, running, cycling...
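
The core idea behind semantic search is to embed both facts and the query, then rank by cosine similarity instead of exact key matching. The 3-dimensional vectors below are fake stand-ins for real embeddings, just to show the ranking mechanics:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Pretend embeddings for stored facts (a real system would use an
# embedding model, e.g. a sentence transformer)
fact_vectors = {
    "basketball": [0.9, 0.1, 0.0],
    "running": [0.8, 0.2, 0.1],
    "python": [0.0, 0.1, 0.9],
}
query = [0.85, 0.15, 0.05]  # pretend embedding of "what sports does carlos like"

ranked = sorted(fact_vectors,
                key=lambda k: cosine(query, fact_vectors[k]),
                reverse=True)
print(ranked[:2])  # ['basketball', 'running']
```

The sports facts rank above the programming fact because their vectors point in nearly the same direction as the query.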

v1.3 (Q2 2025): Knowledge Graphs

# Connect related facts
graph = await client.get_knowledge_graph(user_id="carlos")
# Carlos β†’ knows β†’ John β†’ plays β†’ basketball
#       ↓
#     likes β†’ basketball

v1.4 (Q3 2025): Memory Compression

# Problem: 6 months = 92K tokens = $2,775 per request!
# Solution: Tiered compression (recent=full, old=compressed)
# Result: 20K tokens = $600 per request (78% savings)
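
The savings figure above follows directly from the token counts. A quick check:

```python
# Compressing 92K tokens of history down to 20K cuts per-request
# context size (and hence context cost) proportionally.
full_tokens, compressed_tokens = 92_000, 20_000
savings = 1 - compressed_tokens / full_tokens
print(f"{savings:.0%}")  # 78%
```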

v2.0 (2026): API SaaS

  • Managed API service
  • Multi-tenant
  • Integrations (LangChain, N8N, etc.)
  • Browser extension marketplace

🤝 Contributing

LuminoraCore is MIT licensed and community-driven.

Ways to contribute:

  1. πŸ› Report bugs
  2. πŸ’‘ Suggest features
  3. πŸ“ Improve documentation
  4. πŸ§ͺ Add test cases
  5. πŸ”§ Submit PRs

Priority areas:

  • Semantic search implementation
  • Additional LLM providers
  • Storage backend optimizations
  • Real-world use case examples

📚 Resources

Code & Docs:


🎬 Try It Now

# Install
pip install luminoracore

# Clone examples
git clone https://github.com/luminoracore/luminoracore
cd luminoracore/examples

# Run simple chat
python simple_chat.py

πŸ™ Why I Built This

I'm tired of:

  • 😤 AI platforms that forget me
  • 😤 Vendor lock-in
  • 😤 Opaque costs
  • 😤 Lost conversation history

I wanted:

  • ✅ My data, my control
  • ✅ Privacy-first (self-hosted option)
  • ✅ Portable (works anywhere)
  • ✅ Transparent (open source)

If you feel the same, give it a try ⭐


💬 Let's Talk

What would you build with persistent AI memory?

  • Personal assistant?
  • Customer service bot?
  • Learning tutor?
  • Something else?

Drop a comment below! I'd love to hear your ideas and help you build it.


P.S. If you found this useful, consider starring the repo and sharing it.

Building in public. Let's make AI memory better together. 🚀


Updated: November 2025 | Version: 1.1 | Author: @rulyaltamirano
