# ARC-StreamMemory: Building a Local-First Visual Second Brain for AI-Readable Video Memory
I’m building ARC-StreamMemory, a local-first visual memory system for AI-readable video, screen, snapshot, robotics, DAW/plugin, game, and app UI sessions.
The goal is to turn visual activity into something an AI can inspect, replay, cite, verify, and attach to a module.
Instead of treating video as a flat recording, ARC-StreamMemory turns it into a structured memory object:
```
visual source
→ FFmpeg video/snapshot ingest
→ AI frame-speed schedule
→ frame hashes
→ seeded source spine
→ OCR-ready/event-ready timeline
→ AI digest
→ ARC-style receipts
→ OmniBinary-style chunk map
→ Arc-RAR-style bundle manifest
→ local source-spine viewer
→ AI module attachment JSON
```
## What ARC-StreamMemory does
ARC-StreamMemory can ingest visual sources such as:
- video files
- screen recordings
- screenshots
- DAW/plugin sessions
- game footage
- browser workflows
- robotics camera feeds
- app UI states
The output is not just a folder of screenshots.
The output is a deterministic visual memory bundle with:
- frame indexes
- frame hashes
- event timelines
- AI digest files
- module attachment JSON
- seeded memory spine
- validation reports
- bundle manifests
- a local HTML viewer
## Why this matters
A normal screen recording answers:
What happened?
→ Rewatch the whole video and hope you spot it.
ARC-StreamMemory is designed to answer:
What happened?
→ Read the AI digest.
→ Jump to the relevant event.
→ Open the frame.
→ Verify the frame hash.
→ Follow the receipt.
→ Follow the chunk pointer.
→ Restore or export the bundle.
That makes visual memory easier for an AI or developer to inspect and verify.
## Current capabilities
The current release foundation supports:
- demo visual-memory session generation
- snapshot folder ingest
- regular FFmpeg video ingest
- AI frame-speed policies
- per-frame SHA-256 hashing
- deterministic memory spine hashing
- seeded source-spine lineage
- Markdown and JSON AI digests
- AI module attachment output
- ARC-style receipt export
- OmniBinary-style chunk map export
- Arc-RAR-style bundle manifest export
- local HTML viewer
- validation reports
- ZIP bundle export
- ARC-FusionCapture adapter/spec layer
The repo intentionally avoids overclaiming unfinished integrations.
The current public foundation is complete for deterministic visual memory ingest, indexing, hashing, digesting, viewing, validating, and bundle export. Future gates include native live screen capture, full OCR engine hookup, native OmniBinary persistence, native Arc-RAR packaging, live ARC-Core sync, and production robotics sensor bus integration.
## AI frame-speed policy
ARC-StreamMemory supports different frame sampling speeds depending on what the AI needs to remember.
Recommended frame rates include:
- 0.2 FPS → long passive session memory
- 0.5 FPS → lightweight visual diary
- 1 FPS → general AI inspection default
- 2 FPS → UI debugging / GitHub / DAW workflows
- 5 FPS → detailed interaction review
- 10 FPS → motion-sensitive review
This matters because not every AI memory task needs full video.
A long passive session may only need sparse visual anchors, while a DAW/plugin bug or UI regression may need denser frame sampling.
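The recommended rates above can be expressed as a simple lookup policy. This is an illustrative sketch only: the task names and the `pick_fps` helper are hypothetical, not part of the ARC-StreamMemory API.

```python
# Hypothetical frame-speed policy table based on the recommended rates above.
FRAME_SPEED_POLICY = {
    "passive_session": 0.2,     # long passive session memory
    "visual_diary": 0.5,        # lightweight visual diary
    "general": 1.0,             # general AI inspection default
    "ui_debugging": 2.0,        # UI debugging / GitHub / DAW workflows
    "interaction_review": 5.0,  # detailed interaction review
    "motion_review": 10.0,      # motion-sensitive review
}

def pick_fps(task: str) -> float:
    """Return the sampling rate for a memory task, defaulting to 1 FPS."""
    return FRAME_SPEED_POLICY.get(task, 1.0)
```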
## Deterministic source-spine model
The memory spine is built around a deterministic seed chain:
```
capture_policy_hash
+ source_fingerprint
+ frame_schedule_hash
+ ordered_frame_hashes
+ chunk_hash
= session_root_seed
```
That creates a reproducible source spine:
```
root_seed
→ chunk
→ frame
→ frame_hash
→ event_receipt
→ module_attachment_pointer
```
The goal is to make visual memory verifiable and replayable instead of vague.
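The seed chain above can be sketched as straightforward SHA-256 chaining. This is illustrative only: the real scripts may serialize, order, and delimit these inputs differently, and `session_root_seed` here is a hypothetical helper, not the project's actual function.

```python
import hashlib
import json

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def session_root_seed(capture_policy: dict, source_fingerprint: str,
                      frame_schedule: dict, frame_hashes: list,
                      chunk_hash: str) -> str:
    """Combine the spine inputs into one deterministic root seed.

    Sketch of the formula: capture_policy_hash + source_fingerprint
    + frame_schedule_hash + ordered_frame_hashes + chunk_hash.
    """
    # Canonical JSON (sorted keys) keeps the policy/schedule hashes stable.
    capture_policy_hash = sha256_hex(
        json.dumps(capture_policy, sort_keys=True).encode())
    frame_schedule_hash = sha256_hex(
        json.dumps(frame_schedule, sort_keys=True).encode())
    material = "\n".join(
        [capture_policy_hash, source_fingerprint, frame_schedule_hash,
         *frame_hashes, chunk_hash])
    return sha256_hex(material.encode())
```

Because every input is hashed in a fixed order, the same session always yields the same root seed, and any tampered frame changes it.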
## Example workflows
A standard FFmpeg workflow looks like this:
```bash
python scripts/ffmpeg_probe.py
python scripts/ingest_video.py input.mp4 --fps 1 --out sessions/video_memory
python scripts/build_stream_memory.py sessions/video_memory --title "Video memory"
python scripts/hash_memory_spine.py sessions/video_memory
python scripts/build_seed_spine.py sessions/video_memory
python scripts/build_ai_digest.py sessions/video_memory
python scripts/validate_memory_bundle.py sessions/video_memory
```
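Under the hood, a script like `ingest_video.py` presumably drives FFmpeg's standard `fps` filter to sample frames at the chosen rate. The sketch below just builds the equivalent command; `ffmpeg_extract_cmd` is a hypothetical helper, though `-vf fps=...` and the numbered image output pattern are real FFmpeg usage.

```python
# Hypothetical helper: assemble the FFmpeg command an ingest script could
# use to extract frames at a fixed AI frame-speed.
def ffmpeg_extract_cmd(src: str, fps: float, out_dir: str) -> list:
    return [
        "ffmpeg", "-i", src,
        "-vf", f"fps={fps}",           # sample at the chosen frame speed
        f"{out_dir}/frame_%06d.png",   # numbered frame files
    ]
```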
A demo session workflow looks like this:
```bash
python scripts/create_demo_session.py
python scripts/build_stream_memory.py examples/demo_session --title "ARC demo visual memory"
python scripts/hash_memory_spine.py examples/demo_session
python scripts/build_seed_spine.py examples/demo_session
python scripts/build_ai_digest.py examples/demo_session
python scripts/validate_memory_bundle.py examples/demo_session
python scripts/make_bundle.py examples/demo_session --out release_evidence/demo_streammemory_bundle.zip
```
## Output structure
A memory session can include:
```
session/
├─ frames/
├─ memory/
│  ├─ capture_policy.json
│  ├─ frame_index.json
│  ├─ event_timeline.jsonl
│  ├─ ocr_index.jsonl
│  ├─ ai_digest.md
│  ├─ ai_digest.json
│  ├─ module_attachment.json
│  ├─ memory_spine.json
│  ├─ seed_spine.json
│  └─ session_summary.md
├─ receipts/arc_receipts.jsonl
├─ omnibinary/chunk_map.json
├─ arcrar/bundle_manifest.json
├─ reports/validation_report.json
└─ reports/bundle_export_report.json
```
This gives each visual memory session a structure an AI system can navigate.
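A structure like this also makes the "verify the frame hash" step mechanical. The sketch below assumes `frame_index.json` maps frame filenames to hex SHA-256 digests; the actual schema may differ, and `verify_frames` is a hypothetical helper, not one of the project's scripts.

```python
import hashlib
import json
from pathlib import Path

def verify_frames(session: Path) -> list:
    """Recompute per-frame SHA-256 hashes and compare them to the
    recorded index. Returns the names of frames that fail verification.
    """
    index = json.loads((session / "memory" / "frame_index.json").read_text())
    bad = []
    for name, expected in index.items():
        digest = hashlib.sha256(
            (session / "frames" / name).read_bytes()).hexdigest()
        if digest != expected:
            bad.append(name)
    return bad
```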
## ARC-FusionCapture direction
ARC-StreamMemory also includes a compatibility layer for the planned ARC-FusionCapture runtime.
The future capture layer is meant to wrap regular FFmpeg with:
- camera/feed profiles
- robotics capture modes
- hardware acceleration selection
- sensor timestamp sync
- rolling buffer policy
- event-triggered clips
- AI-friendly frame-speed output
- ARC receipts
- OmniBinary pointers
- Arc-RAR bundle manifests
This creates a path from simple video ingest today toward robotics/media capture workflows later.
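The rolling-buffer and event-triggered-clip ideas above can be sketched with a bounded queue: keep only the last N frames, and snapshot them when an event fires. `RollingFrameBuffer` is a hypothetical class, not part of the planned ARC-FusionCapture API.

```python
from collections import deque

class RollingFrameBuffer:
    """Sketch of a rolling buffer policy: retain the most recent
    max_frames frames, then emit them as a clip when an event triggers.
    """
    def __init__(self, max_frames: int):
        # deque with maxlen silently discards the oldest frame on overflow
        self.frames = deque(maxlen=max_frames)

    def push(self, frame):
        self.frames.append(frame)

    def on_event(self) -> list:
        """Return the buffered frames as an event-triggered clip."""
        return list(self.frames)
```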
## Public use cases
ARC-StreamMemory can be useful for:
**AI developers**
Turn debugging videos, browser workflows, and UI sessions into reproducible visual memory modules.
**Audio/plugin developers**
Archive DAW/plugin tests, plugin validation sessions, FreeEQ8 or FreeVox8 regressions, and visual evidence from test runs.
**Robotics developers**
Use FFmpeg now, then connect ARC-FusionCapture later for sensor-synced camera memory and robot black-box replay.
**Research and reproducibility**
Use seeded spines, hashes, citations, validation reports, and module attachments to make visual sessions inspectable and reproducible.
**Game and app developers**
Capture game states, UI flows, visual bugs, and build history as replayable evidence bundles.
## Repo
https://github.com/GareBear99/ARC-StreamMemory
## What I’m looking for
I’m looking for feedback from:
- AI developers
- computer vision developers
- robotics developers
- Python developers
- FFmpeg users
- local-first builders
- reproducibility researchers
- audio/plugin developers
- game developers
- people interested in AI visual memory
Useful feedback includes:
- frame sampling policy ideas
- OCR integration suggestions
- robotics capture suggestions
- viewer/UI feedback
- validation/reporting improvements
- bundle format feedback
- source-spine design feedback
- module attachment use cases
- local-first architecture feedback
## Long-term direction
The long-term goal is to make ARC-StreamMemory a local-first visual second brain for AI systems.
Not just video storage.
Not just screenshots.
A deterministic, replayable, source-verifiable memory spine that can turn visual sessions into AI-readable evidence.