LangChain + ODEI: Persistent World Models
ConversationBufferMemory resets on restart. ODEI gives LangChain agents a persistent world model.
Quick Integration
```python
from langchain.tools import tool
import requests

@tool
def check_action(action: str) -> str:
    """Check a proposed action against the ODEI guardrail."""
    r = requests.post(
        "https://api.odei.ai/api/v2/guardrail/check",
        json={"action": action, "severity": "medium"},
        timeout=10,
    ).json()
    # Single quotes inside the f-string: nested double quotes
    # are a SyntaxError before Python 3.12.
    return f"{r['verdict']}: {r.get('reasoning', '')[:200]}"
```
```python
@tool
def query_memory(term: str) -> str:
    """Search the ODEI world model for a term."""
    r = requests.post(
        "https://api.odei.ai/api/v2/world-model/query",
        json={"queryType": "search", "searchTerm": term},
        timeout=10,
    ).json()
    return str(r)
```
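The guardrail tool assumes `verdict` is always present in the response. A defensive parser keeps the tool from raising on a malformed payload; `format_verdict` here is a hypothetical helper, not part of the ODEI API, shown as a stdlib-only sketch:

```python
def format_verdict(response: dict) -> str:
    # Defensive version of check_action's return line: tolerate a
    # missing verdict and truncate reasoning to 200 characters.
    verdict = response.get("verdict", "unknown")
    reasoning = response.get("reasoning", "")[:200]
    return f"{verdict}: {reasoning}" if reasoning else verdict

print(format_verdict({"verdict": "block", "reasoning": "destructive op"}))
# -> block: destructive op
print(format_verdict({}))
# -> unknown
```

Returning a string either way matters because LangChain feeds the tool's output straight back to the model; an unhandled `KeyError` would abort the agent run instead.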
Session Continuity
Inject world model at session start:
```python
def build_context():
    wm = requests.get("https://api.odei.ai/api/v2/world-model/live").json()
    active = [n["title"] for n in wm["nodes"] if n["domain"] == "TACTICS"]
    return f"Current tasks: {active}"
```
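The filter in `build_context` can be exercised offline. The sketch below assumes the node shape implied above (`title` and `domain` keys); the sample data is illustrative, not real ODEI output:

```python
def active_tasks(world_model: dict, domain: str = "TACTICS") -> list:
    # Same filter build_context applies to the live world model:
    # keep the titles of nodes belonging to the given domain.
    return [n["title"] for n in world_model["nodes"] if n["domain"] == domain]

sample = {
    "nodes": [
        {"title": "Ship v2 API", "domain": "TACTICS"},
        {"title": "Company history", "domain": "NARRATIVE"},
    ]
}
print(active_tasks(sample))  # -> ['Ship v2 API']
```

Pulling the filter into its own function also makes it easy to inject other domains into the prompt later without touching the HTTP call.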
Production Results
- 0 duplicate actions (deduplication layer)
- Session continuity across restarts
- Full audit trail in world model
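The post does not show how the deduplication layer works. One minimal way to suppress repeat actions, sketched here as a hypothetical `ActionDeduper` (not ODEI's actual implementation), is to hash each action string and reject ones already seen:

```python
import hashlib

class ActionDeduper:
    """Hypothetical sketch of a deduplication layer: remembers a
    digest of every action and flags exact repeats."""

    def __init__(self):
        self._seen = set()

    def is_duplicate(self, action: str) -> bool:
        digest = hashlib.sha256(action.encode("utf-8")).hexdigest()
        if digest in self._seen:
            return True
        self._seen.add(digest)
        return False

d = ActionDeduper()
print(d.is_duplicate("deploy v2"))  # -> False (first occurrence)
print(d.is_duplicate("deploy v2"))  # -> True  (exact repeat)
```

In production the seen-set would need to live somewhere persistent (the world model itself, for instance) to survive the restarts this post is about.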