TL;DR: I built an open-source framework that gives AI persistent memory across platforms. Your ChatGPT conversations can inform Claude, your Claude chats can continue in Gemini. True data portability, privacy-first.
See It In Action (2 minutes)
Watch how memory persists when switching between ChatGPT, Claude, and other LLMs
The Problem I Was Trying to Solve
We've all been there:
- ChatGPT forgets your conversation from yesterday
- Claude doesn't know what you discussed with ChatGPT
- Gemini starts fresh every time
- Your data is locked to each platform
- Export options? LOL, good luck
I was frustrated. I wanted my AI to remember me, know my preferences, and travel with me across platforms.
So I built LuminoraCore.
What Makes It Different
Not Just Another Chatbot Framework
LuminoraCore isn't about building yet another chatbot. It's about creating portable AI identities that persist across:
- Any LLM provider (OpenAI, Anthropic, DeepSeek, Groq, Mistral...)
- Any storage backend (SQLite, PostgreSQL, MongoDB, DynamoDB...)
- Any application (your own apps, integrations, plugins...)
The Three Pillars
1. Persistent Memory
```python
from luminoracore import LuminoraCoreClient

client = LuminoraCoreClient()

# First conversation
response = await client.send_message_with_memory(
    session_id="user_123",
    message="I'm Carlos, I love Python and basketball"
)
# AI: "Nice to meet you, Carlos! Python and basketball, great combo!"

# Next day, different conversation
response = await client.send_message_with_memory(
    session_id="user_123",
    message="What do you know about me?"
)
# AI: "You're Carlos, you love Python and basketball!"
```
How it works:
- Automatically extracts facts from conversations (9 categories)
- Classifies importance (0-10 scale)
- Creates episodic memories for significant events
- Compiles relevant context for each response
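To make the extraction step concrete, here is a sketch of what one extracted fact record might look like. The field names are my illustration of the categories/importance described above, not LuminoraCore's actual schema:

```python
from dataclasses import dataclass

# Hypothetical record for one extracted fact; field names are
# illustrative, not the library's real internal schema.
@dataclass
class Fact:
    category: str      # one of the 9 categories, e.g. "interests"
    key: str           # normalized slot, e.g. "favorite_sport"
    value: str         # extracted value, e.g. "basketball"
    importance: float  # importance classification on the 0-10 scale

fact = Fact(category="interests", key="favorite_sport",
            value="basketball", importance=7.0)
print(f"{fact.category}/{fact.key} = {fact.value} ({fact.importance})")
```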
2. Affinity System
The AI's relationship with you evolves naturally:
```
Stranger (0-10 points)  → Formal, polite
        ↓
Acquaintance (11-30)    → Friendly, helpful
        ↓
Friend (31-60)          → Warm, personal
        ↓
Confidant (61+)         → Very personal, empathetic
```
Real example from my own usage:
- Day 1 (Stranger): "Hello. How may I assist you today?"
- Week 2 (Acquaintance): "Hey! How's your Python project going?"
- Month 2 (Friend): "Carlos! Ready to talk about that basketball game?"
- Month 6 (Confidant): "I remember you mentioned family stress last week. How are you holding up?"
The tone, depth, and personalization adapt automatically based on your relationship.
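Under the hood, the level lookup implied by those thresholds can be as simple as the following sketch (the function name is mine, not the library's API):

```python
def affinity_level(points: int) -> str:
    # Thresholds taken from the ladder above: 0-10, 11-30, 31-60, 61+.
    if points <= 10:
        return "Stranger"
    if points <= 30:
        return "Acquaintance"
    if points <= 60:
        return "Friend"
    return "Confidant"

for p in (5, 15, 45, 75):
    print(p, affinity_level(p))
```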
3. Provider Agnostic
Switch LLM providers mid-conversation without losing context:
```python
# Start with ChatGPT
client = LuminoraCoreClient(provider="openai")
response = await client.send_message_with_memory(...)

# Switch to Claude (memory persists!)
client = LuminoraCoreClient(provider="anthropic")
response = await client.send_message_with_memory(...)
# Claude knows everything ChatGPT learned

# Try DeepSeek (20x cheaper!)
client = LuminoraCoreClient(provider="deepseek")
response = await client.send_message_with_memory(...)
# Still remembers everything
```
Supported providers:
- OpenAI (GPT-4, GPT-3.5)
- Anthropic (Claude 3.5, Claude 3)
- DeepSeek
- Groq
- Mistral
- Google (Gemini)
- Local models via Ollama
Architecture Deep Dive
How Memory Actually Works
```
User: "I'm learning React"
          │
┌─────────────────────┐
│  Fact Extraction    │ ← LLM analyzes message
│  (9 categories)     │
└─────────┬───────────┘
          ├─> Category: "interests"
          ├─> Key: "programming_framework"
          ├─> Value: "React"
          └─> Importance: 0.7
          │
┌─────────────────────┐
│  Update Affinity    │ ← Points +3 (shared interest)
│  (0-100 scale)      │
└─────────┬───────────┘
          ├─> Total: 15 points
          └─> Level: Acquaintance
          │
┌─────────────────────┐
│ Select Personality  │ ← Choose based on affinity
│  (4 hierarchical)   │
└─────────┬───────────┘
          └─> Use: "friendly_assistant"
          │
┌─────────────────────┐
│  Compile Context    │ ← Gather relevant facts
│ (Dynamic selection) │
└─────────┬───────────┘
          └─> Include: React interest, other programming facts
          │
┌─────────────────────┐
│ Generate Response   │ ← LLM with full context
│ (Provider-agnostic) │
└─────────┬───────────┘
          ▼
Response: "That's awesome! React is great.
           Since you're into Python too,
           have you tried Django + React?"
```
Fact Categories (9 Types)
- basics: Name, age, location, occupation
- preferences: Likes, dislikes, favorites
- interests: Hobbies, topics of interest
- family_friends: Relationships, social connections
- work: Career, projects, professional info
- health: Wellness, fitness, medical (sensitive)
- experiences: Past events, travel, life moments
- beliefs: Values, opinions, worldview
- goals: Aspirations, plans, ambitions
Episodic Memory (7 Types)
- conversation: Significant discussions
- achievement: Accomplishments, milestones
- challenge: Obstacles, difficulties
- decision: Important choices made
- insight: Realizations, learnings
- social: Interactions with others
- milestone: Life events
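If you validate records yourself, both taxonomies are easy to mirror as constant sets. This is a sketch: the strings mirror the two lists above but aren't guaranteed to match the library's internal identifiers:

```python
# The 9 fact categories and 7 episodic memory types listed above,
# as validation sets (illustrative; not the library's own constants).
FACT_CATEGORIES = {
    "basics", "preferences", "interests", "family_friends",
    "work", "health", "experiences", "beliefs", "goals",
}

EPISODE_TYPES = {
    "conversation", "achievement", "challenge", "decision",
    "insight", "social", "milestone",
}

def validate(category: str, episode_type: str) -> bool:
    # Reject records that fall outside the documented taxonomies.
    return category in FACT_CATEGORIES and episode_type in EPISODE_TYPES

print(validate("interests", "insight"))  # True
print(validate("secrets", "insight"))    # False
```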
Getting Started (5 Minutes)
Installation
```shell
pip install luminoracore
```
Minimal Example
```python
import asyncio
from luminoracore import LuminoraCoreClient

async def main():
    # Initialize client
    client = LuminoraCoreClient(
        provider="openai",  # or anthropic, deepseek, etc.
        api_key="your-api-key"
    )

    # Create session
    session_id = "user_carlos_session1"

    # First message
    response = await client.send_message_with_memory(
        session_id=session_id,
        message="Hi! I'm Carlos, I'm a software engineer from Madrid"
    )
    print(response['response'])
    print(f"Facts learned: {response['new_facts_count']}")

    # Second message (will remember Carlos)
    response = await client.send_message_with_memory(
        session_id=session_id,
        message="What do you know about me?"
    )
    print(response['response'])
    print(f"Total facts: {response['memory_facts_count']}")

    # Export all memory
    export = await client.export_session(session_id)
    print(export)  # Full JSON with all memories

asyncio.run(main())
```
Output:
```
Response: "Hello Carlos! Nice to meet you..."
Facts learned: 3
Response: "You're Carlos, a software engineer from Madrid!"
Total facts: 3
{
  "learned_facts": [
    {"category": "basics", "key": "name", "value": "Carlos"},
    {"category": "basics", "key": "occupation", "value": "software engineer"},
    {"category": "basics", "key": "location", "value": "Madrid"}
  ],
  "affinity": {"points": 5, "level": "stranger"}
}
```
Real-World Use Cases
1. Personal AI Assistant
```python
# Remembers your preferences, schedule, habits
assistant = LuminoraCoreClient(provider="anthropic")
await assistant.send_message_with_memory(
    session_id="my_assistant",
    message="Schedule my morning workout"
)
# AI knows: You prefer 7am, you do HIIT, 30 minutes
```
2. Customer Support Bot
```python
# Remembers customer history, past issues, preferences
support = LuminoraCoreClient(provider="deepseek")  # Cheaper!
await support.send_message_with_memory(
    session_id=f"customer_{customer_id}",
    message="My order hasn't arrived"
)
# AI knows: Order #12345, placed 3 days ago, shipped to Madrid
```
3. Learning Tutor
```python
# Adapts to student's level, tracks progress
tutor = LuminoraCoreClient(provider="openai")
await tutor.send_message_with_memory(
    session_id=f"student_{student_id}",
    message="I don't understand recursion"
)
# AI knows: Student struggles with CS concepts, prefers visual examples
```
4. Mental Health Companion
```python
# Builds trust over time, remembers emotional patterns
companion = LuminoraCoreClient(provider="anthropic")
await companion.send_message_with_memory(
    session_id="user_therapy",
    message="I'm feeling anxious today"
)
# AI knows: User has anxiety triggers, past coping strategies
# Relationship: Confidant → very empathetic, personal response
```
Advanced Features
PersonaBlend™: Mix Personalities
```python
# 70% friendly + 30% professional
blended = client.blend_personalities(
    base_personality="friendly_assistant",
    blend_with="professional_consultant",
    ratio=0.7
)

response = await client.send_message_with_memory(
    session_id=session_id,
    message="Help me write a proposal",
    personality=blended
)
# Gets: Warm tone + professional structure
```
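One way to picture a 70/30 blend is a weighted merge of trait scores. This is purely illustrative of the ratio semantics; the library may well blend prompts rather than numeric traits:

```python
def blend_traits(base: dict, other: dict, ratio: float) -> dict:
    # Weighted merge: `ratio` of the base personality plus
    # (1 - ratio) of the personality being blended in.
    keys = base.keys() | other.keys()
    return {k: ratio * base.get(k, 0.0) + (1 - ratio) * other.get(k, 0.0)
            for k in keys}

friendly = {"warmth": 0.9, "formality": 0.2}
professional = {"warmth": 0.4, "formality": 0.9}
blended = blend_traits(friendly, professional, ratio=0.7)
print(blended)  # warmth leans friendly, formality picks up from professional
```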
Multi-Storage Support
```python
# Development: SQLite (local, fast)
client = LuminoraCoreClient(
    provider="openai",
    storage_backend="sqlite",
    db_path="./memory.db"
)

# Production: PostgreSQL (robust, scalable)
client = LuminoraCoreClient(
    provider="anthropic",
    storage_backend="postgresql",
    connection_string="postgresql://..."
)

# Cloud: DynamoDB (managed, serverless)
client = LuminoraCoreClient(
    provider="deepseek",
    storage_backend="dynamodb",
    table_name="luminora_memories"
)
```
Export & Migrate
```python
# Export from ChatGPT setup
export = await client_gpt.export_session("user_123")

# Import to Claude setup
client_claude = LuminoraCoreClient(provider="anthropic")
await client_claude.import_session("user_123", export)
# All memories preserved!
```
Roadmap
v1.2 (Q1 2025): Semantic Search
```python
# Instead of exact key matching
results = await client.semantic_search(
    user_id="carlos",
    query="what sports does carlos like",  # Natural language!
    top_k=5
)
# Returns: basketball, running, cycling...
```
v1.3 (Q2 2025): Knowledge Graphs
```python
# Connect related facts
graph = await client.get_knowledge_graph(user_id="carlos")
# Carlos --knows--> John --plays--> basketball
#    |
#    +--likes--> basketball
```
v1.4 (Q3 2025): Memory Compression
```python
# Problem: 6 months = 92K tokens = $2,775 per request!
# Solution: Tiered compression (recent=full, old=compressed)
# Result: 20K tokens = $600 per request (78% savings)
```
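The 78% figure is consistent with the token counts quoted above; a quick check:

```python
# Sanity-check the compression savings: 92K tokens reduced to 20K.
full_tokens = 92_000        # ~6 months of uncompressed memory
compressed_tokens = 20_000  # tiered: recent=full, old=compressed
savings = 1 - compressed_tokens / full_tokens
print(f"{savings:.0%}")  # 78%
```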
v2.0 (2026): API SaaS
- Managed API service
- Multi-tenant
- Integrations (LangChain, N8N, etc.)
- Browser extension marketplace
Contributing
LuminoraCore is MIT licensed and community-driven.
Ways to contribute:
- Report bugs
- Suggest features
- Improve documentation
- Add test cases
- Submit PRs
Priority areas:
- Semantic search implementation
- Additional LLM providers
- Storage backend optimizations
- Real-world use case examples
Resources
Code & Docs:
Try It Now
```shell
# Install
pip install luminoracore

# Clone examples
git clone https://github.com/luminoracore/luminoracore
cd luminoracore/examples

# Run simple chat
python simple_chat.py
```
Why I Built This
I'm tired of:
- AI platforms that forget me
- Vendor lock-in
- Opaque costs
- Lost conversation history
I wanted:
- My data, my control
- Privacy-first (self-hosted option)
- Portable (works anywhere)
- Transparent (open source)
If you feel the same, give it a try.
Let's Talk
What would you build with persistent AI memory?
- Personal assistant?
- Customer service bot?
- Learning tutor?
- Something else?
Drop a comment below! I'd love to hear your ideas and help you build it.
P.S. If you found this useful, consider:
- Star the repo
Building in public. Let's make AI memory better together.
Updated: November 2025 | Version: 1.1 | Author: @rulyaltamirano