This is a submission for the Gemma 4 Challenge: Build with Gemma 4
What I Built
Gemma.Witness is an offline-first multimodal evidence capture system built for environments where cloud access, trust, or chain-of-custody assumptions fail.
The system records audio alongside supporting images, runs local multimodal analysis through Gemma 4, and produces a signed evidence bundle containing:
- Structured incident reports
- Timestamped evidence metadata
- Local reasoning traces
- Hash-linked verification artifacts
- Exportable forensic bundles
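The bundle layout above can be sketched roughly in Rust. The struct fields and `chain_root` helper here are illustrative, not the project's actual schema; the real bundle uses SHA-256 and Ed25519, while `std`'s `DefaultHasher` stands in so the sketch compiles without external crates:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// One piece of captured evidence (audio clip, photo, transcript).
/// Hypothetical fields, not the real .witness schema.
struct EvidenceItem {
    path: String,
    captured_at_unix: u64,
    content_hash: u64, // a SHA-256 digest in the real bundle
}

/// Fold each item into a running hash chain, so altering or
/// reordering any single item changes the root the signature covers.
fn chain_root(items: &[EvidenceItem]) -> u64 {
    items.iter().fold(0u64, |acc, it| {
        let mut h = DefaultHasher::new();
        acc.hash(&mut h);
        it.path.hash(&mut h);
        it.captured_at_unix.hash(&mut h);
        it.content_hash.hash(&mut h);
        h.finish()
    })
}
```

Signing the chain root rather than each file individually is what makes the bundle tamper-evident as a whole: a verifier only has to recompute one value.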
The focus was reliability and local verification instead of "AI assistant" behavior.
Most evidence tooling today assumes internet access, centralized APIs, or mutable storage. Gemma.Witness was designed around the opposite assumption: the network may be unavailable, the machine may be isolated, and every generated output may eventually need independent verification.
The application runs fully local through a desktop interface using Rust, Tauri, and local inference orchestration.
Demo
GitHub: https://github.com/moonrunnerkc/gemma-witness
Code
Source code is available at the repository above.
How I Used Gemma 4
Gemma 4 is the reasoning layer behind the entire evidence pipeline.
I used Gemma 4's multimodal capabilities to process:
- Audio-derived transcripts
- Scene images
- Cross-evidence consistency analysis
- Structured incident extraction
- Reasoning trace generation
The model is used in a multi-pass workflow instead of a single prompt-response cycle. Each pass validates or expands on the previous stage before the final signed bundle is emitted.
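The multi-pass shape can be sketched as a short function chain. Everything here is hypothetical: the pass names, types, and the stand-in logic replacing the actual Gemma calls are mine, chosen only to show how a later pass can reject a draft instead of silently extending it:

```rust
// Illustrative three-pass pipeline; stand-ins for local Gemma calls.
struct Transcript(String);
struct Draft { summary: String }
struct Validated { summary: String, flags: Vec<String> }

/// Pass 1: extract a draft incident summary from raw inputs.
fn extract(t: &Transcript) -> Draft {
    Draft { summary: t.0.clone() } // stand-in for a model call
}

/// Pass 2: re-check the draft against the evidence; record
/// inconsistencies instead of rewriting them away.
fn validate(t: &Transcript, d: Draft) -> Validated {
    let mut flags = Vec::new();
    if !t.0.contains(d.summary.as_str()) {
        flags.push("summary not grounded in transcript".to_string());
    }
    Validated { summary: d.summary, flags }
}

/// Pass 3: only a draft that survived validation is emitted.
fn emit(v: Validated) -> Result<String, Vec<String>> {
    if v.flags.is_empty() { Ok(v.summary) } else { Err(v.flags) }
}
```

The point of the chain is that failure is an explicit `Err` carrying flags, not a degraded report.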
This matters because evidence systems fail quietly when models hallucinate details, merge assumptions into facts, or overstate certainty. The pipeline was intentionally designed to separate:
- Raw observations
- Inferred conclusions
- Confidence scoring
- Verifiable artifacts
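One way to encode that separation in types, with illustrative names of my own: an inference must cite the observations it rests on and carry a bounded confidence, so the report can never present a guess as a recording.

```rust
// Hypothetical statement types; not the project's actual data model.
enum Statement {
    /// Directly captured: a transcript line, an image timestamp.
    Observation { text: String },
    /// Model output: must cite observations and carry confidence.
    Inference { text: String, confidence: f32, sources: Vec<usize> },
}

/// Reject inferences that cite nothing, cite out-of-range sources,
/// or carry a confidence outside [0, 1].
fn well_formed(s: &Statement, n_observations: usize) -> bool {
    match s {
        Statement::Observation { .. } => true,
        Statement::Inference { confidence, sources, .. } => {
            !sources.is_empty()
                && sources.iter().all(|&i| i < n_observations)
                && (0.0..=1.0).contains(confidence)
        }
    }
}
```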
Gemma 4 was a strong fit because it could operate locally while still handling multimodal reasoning tasks without requiring cloud APIs or external orchestration services.
The project prioritizes:
- Offline operation
- Verifiable outputs
- Local ownership of evidence
- Minimal trust assumptions
- Reproducible forensic artifacts
A surprising challenge was not getting the model to generate reports. That part was easy.
The difficult part was building guardrails around evidence integrity so the system does not quietly become a very confident fiction generator wearing a necktie.
Tech Stack
- Gemma 4
- Rust
- Tauri
- Node.js
- Local multimodal inference
- Cryptographic hashing and bundle verification
Repository
moonrunnerkc/gemma-witness: Offline multimodal evidence capture that emits a signed, locally verifiable .witness bundle. Tauri + Rust + Gemma 4 + Ed25519. Static HTML verifier runs with no server.
Gemma.Witness
Offline, tamper-evident evidence capture for field journalism. Signed in your hand, verified in a browser, with no server in the loop.
Why this matters
A reporter is working in a country where journalists are detained for their reporting. She records a witness account. She attaches the photos she just took. She seals the file before she leaves the room.
A week later, an editor on another continent opens a single static HTML page in any browser and drags the file in. Three checks turn green:
- the signature comes from the reporter's device
- the audio and the photos have not been altered by a single byte
- the AI model in the chain is bit-for-bit the published Gemma model her manifest names, by `model_id`, `revision`, and `model.safetensors` SHA-256
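The three checks can be sketched as plain predicates. The names and values below are hypothetical, and string equality on stored digests stands in for real Ed25519 signature verification and SHA-256 recomputation:

```rust
/// The model identity a manifest pins, per the README's three fields.
struct ModelPin { model_id: String, revision: String, weights_sha256: String }

/// Result of the verifier's three independent checks.
struct Checks { signature_ok: bool, media_ok: bool, model_ok: bool }

fn verify(
    signed_by_reporter_key: bool,  // stand-in for an Ed25519 verify result
    stored_media_digest: &str,     // digest named in the manifest
    recomputed_media_digest: &str, // digest of the bytes actually received
    manifest_pin: &ModelPin,
    published_pin: &ModelPin,      // the published Gemma release
) -> Checks {
    Checks {
        signature_ok: signed_by_reporter_key,
        media_ok: stored_media_digest == recomputed_media_digest,
        model_ok: manifest_pin.model_id == published_pin.model_id
            && manifest_pin.revision == published_pin.revision
            && manifest_pin.weights_sha256 == published_pin.weights_sha256,
    }
}
```

Keeping the checks independent means a verifier can report exactly which guarantee failed, rather than a single pass/fail.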