
DVP: Why Your Self-Driving Car Needs an AI Flight Recorder

When a Self-Driving Car Kills Someone, What Do We Actually Know?

In March 2018, an Uber autonomous vehicle struck and killed a pedestrian in Tempe, Arizona. It was the first recorded pedestrian fatality involving a self-driving car.

In the aftermath, investigators recovered telemetry data, camera footage, and sensor logs. But here's the uncomfortable question that emerged: How do we know the AI actually made the decision the logs say it made?

The vehicle had a traditional Event Data Recorder (EDR)—the automotive equivalent of a flight recorder. But EDRs were designed for human drivers. They capture steering angle, brake pressure, speed. They don't capture why an AI decided not to brake.

The AI's perception system detected the pedestrian 6 seconds before impact. The system classified her as "unknown object," then "vehicle," then "bicycle"—flickering between classifications. The emergency braking system was disabled. The human safety driver wasn't watching the road.

We know these details because Uber provided them. But we have no cryptographic proof that the logs weren't modified after the fact.


The Gap Between Physical and AI Black Boxes

Every commercial aircraft has two flight recorders: the Flight Data Recorder (FDR) capturing physical parameters, and the Cockpit Voice Recorder (CVR) capturing human decisions. This system emerged from tragedy—128 deaths in the 1956 Grand Canyon collision taught us that understanding crashes requires understanding decisions.

Autonomous vehicles have inherited the physical black box tradition. The EDR captures:

  • Vehicle speed
  • Brake application
  • Steering input
  • Airbag deployment timing
  • Seatbelt status

But EDRs are silent about AI decisions:

  • Why did the perception system misclassify a pedestrian?
  • What confidence threshold triggered the lane change?
  • Which sensor inputs were weighted in the path planning decision?
  • Why didn't the emergency braking system activate?

We have a physical flight recorder. We don't have an AI flight recorder.


The Standardization Problem in Automotive

Here's what makes this hard: the automotive industry is fundamentally different from finance or aviation when it comes to data standards.

Finance vs. Automotive: A Comparison

| Factor | Finance | Automotive |
|---|---|---|
| Existing standards | FIX Protocol (30 years) | Fragmented (AUTOSAR, OpenDRIVE, ASAM...) |
| Regulatory clarity | MiFID II, EU AI Act Art. 12 | Emerging (ISO 21448 SOTIF, UNECE WP.29) |
| Data sharing culture | Audit/compliance culture exists | Competitive advantage → data hoarding |
| Industry structure | Broker/exchange separation | Vertical integration (Tesla, Waymo self-contained) |

The Current Reality

  • Tesla: Proprietary FSD stack, data not disclosed
  • Waymo: Google subsidiary, proprietary stack
  • Toyota/Honda: Fragmented approaches
  • Chinese OEMs: Entirely different regulatory environment

When everyone says "adapt to our data format," standardization becomes impossible.


DVP: The Minimum Viable Audit Interface

DVP (Driving Vehicle Protocol) takes a different approach. Instead of trying to standardize everything, we standardize only what's necessary for audit.

The Core Insight

"Trying to standardize everything is why standardization fails. Standardize only what auditors need."

Architecture: Competitive vs. Audit Layers

┌─────────────────────────────────────────────────────────────┐
│  PROPRIETARY LAYER (Competitive Advantage)                  │
│  ─────────────────────────────────────────────              │
│  • Sensor fusion algorithms                                 │
│  • ML model architectures                                   │
│  • Internal data formats                                    │
│  • Inference engine implementation                          │
│  • Training data and methods                                │
│                                                             │
│  → Remains 100% proprietary to each OEM                     │
└─────────────────────────────────────────────────────────────┘
                            ↓
              Output only (the "envelope header")
                            ↓
┌─────────────────────────────────────────────────────────────┐
│  DVP AUDIT LAYER (Common Standard)                          │
│  ─────────────────────────────────────────────              │
│  ① timestamp      – When                                    │
│  ② event_type     – What kind of decision                   │
│  ③ action         – What was decided                        │
│  ④ model_id       – Which model/rule decided                │
│  ⑤ prev_hash + signature – Cryptographic chain              │
│                                                             │
│  → Standardized across all OEMs                             │
└─────────────────────────────────────────────────────────────┘

The proprietary layer stays secret. Only the audit envelope is standardized.

This is exactly how postal systems work—the contents of the envelope are private, but the addressing format is universal. DVP applies the same principle to AI decisions.


The Five Common Fields

DVP defines exactly five mandatory fields for every AI decision event:

| Field | Purpose | Example |
|---|---|---|
| timestamp | When did this happen? | 2025-12-17T10:30:00.123456789Z |
| event_type | What kind of decision? | PERCEPTION, PATH_PLANNING, CONTROL_COMMAND |
| action | What was decided? | LANE_CHANGE_LEFT, EMERGENCY_BRAKE, YIELD_TO_PEDESTRIAN |
| model_id | Which model/rule? | perception_v3.2.1, safety_rule_017 |
| prev_hash + signature | Cryptographic proof | Hash chain + Ed25519 signature |

That's it. Five fields. Everything else—the sensor fusion details, the neural network weights, the proprietary algorithms—stays internal.
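
As a rough sketch (the field values here are illustrative, not normative), a bare audit envelope could look like the following, and serializes to only a few hundred bytes:

{
  "timestamp": "2025-12-17T10:30:00.123456789Z",
  "event_type": "CONTROL.EMERGENCY_BRAKE",
  "action": "EMERGENCY_BRAKE",
  "model_id": "safety_rule_017",
  "prev_hash": "sha3-256:2c26b46b...",
  "signature": "ed25519:base64..."
}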

Why This Works

| Concern | Resolution |
|---|---|
| Competitive leakage | Internal logic stays secret; only the "envelope" is shared |
| Implementation cost | Thin wrapper on existing systems |
| Data volume | Headers only = ~200-500 bytes per event |
| Industry buy-in | No changes to internal formats = lower resistance |

DVP Event Types

DVP defines a registry of event types covering the autonomous driving decision pipeline:

Perception Events

class PerceptionEvent:
    """Object detection, classification, tracking"""

    event_types = [
        "OBJECT_DETECTED",      # New object in sensor field
        "OBJECT_CLASSIFIED",    # Classification assigned
        "OBJECT_RECLASSIFIED",  # Classification changed
        "OBJECT_TRACKED",       # Object tracking update
        "OBJECT_LOST",          # Object left sensor field
        "SENSOR_FUSION",        # Multi-sensor fusion result
    ]

Planning Events

class PlanningEvent:
    """Path planning and decision making"""

    event_types = [
        "PATH_GENERATED",       # New path computed
        "PATH_UPDATED",         # Path modified
        "DECISION_POINT",       # Key decision made
        "MANEUVER_PLANNED",     # Specific maneuver queued
        "ROUTE_CHANGED",        # Navigation route update
    ]

Control Events

class ControlEvent:
    """Vehicle control commands"""

    event_types = [
        "STEERING_COMMAND",     # Steering angle change
        "ACCELERATION_COMMAND", # Throttle/brake command
        "LANE_CHANGE_INITIATED",
        "EMERGENCY_BRAKE",      # AEB activation
        "HANDOFF_REQUESTED",    # Request human takeover
        "HANDOFF_COMPLETED",    # Human took control
    ]

Safety Events

class SafetyEvent:
    """Safety system activations"""

    event_types = [
        "SAFETY_VIOLATION_DETECTED",
        "SAFETY_RULE_OVERRIDDEN",
        "MINIMAL_RISK_CONDITION",  # MRC activation
        "SYSTEM_DEGRADATION",
        "SENSOR_FAILURE",
    ]

The Data Model

Here's a complete DVP event structure:

{
  "dvp_version": "1.0",
  "event_id": "019400a2-7e5c-7000-8000-1a2b3c4d5e6f",
  "timestamp": {
    "utc_ns": 1734432600123456789,
    "clock_source": "GNSS_PPS",
    "uncertainty_ns": 50
  },
  "event_type": "PERCEPTION.OBJECT_CLASSIFIED",
  "vehicle": {
    "vin": "1HGBH41JXMN109186",
    "model_year": 2025,
    "autonomy_level": "L4"
  },
  "action": {
    "object_id": "obj_0042",
    "classification": "PEDESTRIAN",
    "confidence": 0.94,
    "previous_classification": "UNKNOWN",
    "bounding_box": {
      "x": 1024, "y": 512, "width": 64, "height": 128
    },
    "distance_m": 45.2,
    "velocity_mps": 1.2,
    "heading_deg": 270
  },
  "context": {
    "speed_mps": 13.4,
    "location": {
      "lat": 33.4255,
      "lon": -111.9400
    },
    "weather": "CLEAR",
    "time_of_day": "NIGHT",
    "road_type": "URBAN_ARTERIAL"
  },
  "provenance": {
    "model_id": "perception_resnet50_v3.2.1",
    "model_hash": "sha3-256:8f14e45f...",
    "sensor_sources": ["camera_front", "lidar_top", "radar_front"],
    "inference_time_ms": 12
  },
  "integrity": {
    "prev_hash": "sha3-256:2c26b46b...",
    "signature": "ed25519:base64...",
    "signer_id": "vehicle_019400a2"
  }
}

Real-World Scenario: Reconstructing the Uber Crash

Let's walk through how DVP would have recorded the 2018 Uber crash sequence:

T-6.0s: First Detection

{
  "event_type": "PERCEPTION.OBJECT_DETECTED",
  "action": {
    "object_id": "obj_0001",
    "classification": "UNKNOWN",
    "confidence": 0.31,
    "distance_m": 115
  },
  "integrity": {
    "prev_hash": "sha3-256:genesis...",
    "signature": "ed25519:..."
  }
}

T-5.2s: Classification Flicker

{
  "event_type": "PERCEPTION.OBJECT_RECLASSIFIED",
  "action": {
    "object_id": "obj_0001",
    "classification": "VEHICLE",
    "previous_classification": "UNKNOWN",
    "confidence": 0.42
  },
  "integrity": {
    "prev_hash": "sha3-256:abc123...",  // Links to previous event
    "signature": "ed25519:..."
  }
}

T-4.1s: Another Reclassification

{
  "event_type": "PERCEPTION.OBJECT_RECLASSIFIED",
  "action": {
    "object_id": "obj_0001",
    "classification": "BICYCLE",
    "previous_classification": "VEHICLE",
    "confidence": 0.38
  },
  "integrity": {
    "prev_hash": "sha3-256:def456..."
  }
}

T-1.3s: Path Planning Decision

{
  "event_type": "PLANNING.DECISION_POINT",
  "action": {
    "decision": "MAINTAIN_COURSE",
    "alternatives_considered": ["BRAKE", "STEER_LEFT", "STEER_RIGHT"],
    "selected_reason": "OBJECT_PATH_CLEAR_PREDICTED"
  },
  "provenance": {
    "model_id": "path_planner_v2.1.0",
    "confidence": 0.67
  }
}

T-0.0s: No Brake Command

{
  "event_type": "SAFETY.SAFETY_RULE_OVERRIDDEN",
  "action": {
    "rule_id": "AEB_001",
    "rule_description": "Automatic Emergency Braking",
    "status": "DISABLED",
    "disabled_by": "SYSTEM_CONFIG",
    "disabled_reason": "TESTING_MODE"
  }
}

The Difference

With DVP, investigators wouldn't need to trust Uber's word. They could:

  1. Verify the hash chain: Any modification breaks the chain
  2. Check signatures: Prove the vehicle actually generated these records
  3. Trace causality: Follow prev_hash links through the decision sequence
  4. Identify the failure point: The classification flickering is visible; the disabled AEB is recorded

No trust required. Just math.
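
As a sketch of what step 1 could look like from an investigator's side, working purely from exported JSON records: the canonical field set and serialization rules are assumed here, and the SDK later in this post shows the same check on live objects via verify_chain.

import hashlib
import json

def audit_exported_chain(events: list[dict]) -> bool:
    """Walk an exported DVP log (oldest first), recomputing each hash link.

    Assumption: the hash covers the canonical JSON of every field except the
    stored hash itself; the real spec would fix the exact canonicalization.
    """
    prev_hash = None
    for i, event in enumerate(events):
        body = {k: v for k, v in event.items() if k != "hash"}
        canonical = json.dumps(body, sort_keys=True, separators=(',', ':'))
        recomputed = "sha3-256:" + hashlib.sha3_256(canonical.encode()).hexdigest()
        if recomputed != event["hash"]:
            print(f"Tampering detected at event {i}")
            return False
        if prev_hash is not None and event.get("prev_hash") != prev_hash:
            print(f"Chain break at event {i}")
            return False
        prev_hash = event["hash"]
    return True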


Regulatory Alignment

DVP is designed to satisfy emerging autonomous vehicle regulations:

EU AI Act (2024)

Article 12 requires high-risk AI systems to have:

  • ✅ Automatic logging of events
  • ✅ Traceability of AI system operation
  • ✅ Monitoring capabilities

DVP provides cryptographically verifiable compliance.

UNECE WP.29 R157 (Automated Lane Keeping Systems)

The UN regulation for Level 3+ vehicles requires:

  • ✅ Data Storage System for Automated Driving (DSSAD)
  • ✅ Recording of system status and transitions
  • ✅ Minimum 6 months data retention

DVP extends DSSAD with AI decision provenance.

ISO 21448 SOTIF (Safety of the Intended Functionality)

SOTIF addresses hazards from:

  • Functional insufficiencies
  • Reasonably foreseeable misuse

DVP provides the audit trail needed for SOTIF compliance validation.

ISO 26262 (Functional Safety)

While ISO 26262 focuses on E/E system failures, DVP complements it by capturing AI behavioral failures that aren't traditional hardware/software faults.


Implementation: Python SDK

Here's a minimal DVP implementation:

import hashlib
import json
import time
from dataclasses import dataclass, asdict
from typing import Optional, List, Dict, Any
from enum import Enum
import uuid

class EventType(Enum):
    # Perception
    OBJECT_DETECTED = "PERCEPTION.OBJECT_DETECTED"
    OBJECT_CLASSIFIED = "PERCEPTION.OBJECT_CLASSIFIED"
    OBJECT_RECLASSIFIED = "PERCEPTION.OBJECT_RECLASSIFIED"
    SENSOR_FUSION = "PERCEPTION.SENSOR_FUSION"

    # Planning
    PATH_GENERATED = "PLANNING.PATH_GENERATED"
    DECISION_POINT = "PLANNING.DECISION_POINT"
    MANEUVER_PLANNED = "PLANNING.MANEUVER_PLANNED"

    # Control
    STEERING_COMMAND = "CONTROL.STEERING_COMMAND"
    ACCELERATION_COMMAND = "CONTROL.ACCELERATION_COMMAND"
    EMERGENCY_BRAKE = "CONTROL.EMERGENCY_BRAKE"
    HANDOFF_REQUESTED = "CONTROL.HANDOFF_REQUESTED"

    # Safety
    SAFETY_VIOLATION = "SAFETY.SAFETY_VIOLATION_DETECTED"
    SAFETY_OVERRIDE = "SAFETY.SAFETY_RULE_OVERRIDDEN"


@dataclass
class DVPEvent:
    """A single DVP audit event"""

    event_type: EventType
    action: Dict[str, Any]
    model_id: str
    context: Optional[Dict[str, Any]] = None
    prev_hash: Optional[str] = None

    def __post_init__(self):
        # UUIDv7 would be ideal (time-ordered), but it requires a very recent Python; uuid4 keeps this portable
        self.event_id = str(uuid.uuid4())
        self.timestamp_ns = time.time_ns()
        self.hash = self._compute_hash()

    def _compute_hash(self) -> str:
        """Compute SHA3-256 hash of canonical event data"""
        canonical = json.dumps({
            "event_id": self.event_id,
            "timestamp_ns": self.timestamp_ns,
            "event_type": self.event_type.value,
            "action": self.action,
            "model_id": self.model_id,
            "context": self.context,
            "prev_hash": self.prev_hash or "GENESIS"
        }, sort_keys=True, separators=(',', ':'))

        return f"sha3-256:{hashlib.sha3_256(canonical.encode()).hexdigest()}"

    def to_dict(self) -> Dict[str, Any]:
        return {
            "dvp_version": "1.0",
            "event_id": self.event_id,
            "timestamp_ns": self.timestamp_ns,
            "event_type": self.event_type.value,
            "action": self.action,
            "model_id": self.model_id,
            "context": self.context,
            "prev_hash": self.prev_hash,
            "hash": self.hash
        }


class DVPRecorder:
    """DVP event chain recorder"""

    def __init__(self, vehicle_id: str):
        self.vehicle_id = vehicle_id
        self.events: List[DVPEvent] = []
        self.last_hash: Optional[str] = None

    def record(
        self,
        event_type: EventType,
        action: Dict[str, Any],
        model_id: str,
        context: Optional[Dict[str, Any]] = None
    ) -> DVPEvent:
        """Record a new event in the chain"""

        event = DVPEvent(
            event_type=event_type,
            action=action,
            model_id=model_id,
            context=context,
            prev_hash=self.last_hash
        )

        self.events.append(event)
        self.last_hash = event.hash

        return event

    def verify_chain(self) -> bool:
        """Verify integrity of entire event chain"""

        for i, event in enumerate(self.events):
            # Recompute hash
            recomputed = event._compute_hash()
            if recomputed != event.hash:
                print(f"Hash mismatch at event {i}: {event.event_id}")
                return False

            # Verify linkage
            if i > 0:
                expected_prev = self.events[i-1].hash
                if event.prev_hash != expected_prev:
                    print(f"Chain break at event {i}: {event.event_id}")
                    return False

        return True

    def get_merkle_root(self) -> str:
        """Compute Merkle root for periodic anchoring"""

        if not self.events:
            return "sha3-256:" + hashlib.sha3_256(b"EMPTY").hexdigest()

        hashes = [e.hash.split(":")[1] for e in self.events]

        while len(hashes) > 1:
            if len(hashes) % 2 == 1:
                hashes.append(hashes[-1])

            hashes = [
                hashlib.sha3_256(
                    (hashes[i] + hashes[i+1]).encode()
                ).hexdigest()
                for i in range(0, len(hashes), 2)
            ]

        return f"sha3-256:{hashes[0]}"


# Usage Example
if __name__ == "__main__":
    recorder = DVPRecorder(vehicle_id="VIN_1HGBH41JXMN109186")

    # Simulate autonomous driving sequence

    # 1. Object detected
    recorder.record(
        event_type=EventType.OBJECT_DETECTED,
        action={
            "object_id": "obj_0042",
            "classification": "UNKNOWN",
            "confidence": 0.31,
            "distance_m": 115
        },
        model_id="perception_yolo_v8.1",
        context={"speed_mps": 15.6, "weather": "CLEAR"}
    )

    # 2. Object classified
    recorder.record(
        event_type=EventType.OBJECT_CLASSIFIED,
        action={
            "object_id": "obj_0042",
            "classification": "PEDESTRIAN",
            "confidence": 0.89,
            "distance_m": 85
        },
        model_id="perception_yolo_v8.1"
    )

    # 3. Path planning decision
    recorder.record(
        event_type=EventType.DECISION_POINT,
        action={
            "decision": "YIELD_TO_PEDESTRIAN",
            "alternatives": ["MAINTAIN_SPEED", "CHANGE_LANE"],
            "reason": "PEDESTRIAN_IN_PATH"
        },
        model_id="planner_mpc_v3.0"
    )

    # 4. Brake command
    recorder.record(
        event_type=EventType.ACCELERATION_COMMAND,
        action={
            "command": "DECELERATE",
            "target_decel_mps2": -3.5,
            "reason": "YIELD_TO_PEDESTRIAN"
        },
        model_id="controller_v2.5"
    )

    # Verify chain integrity
    print(f"Chain valid: {recorder.verify_chain()}")
    print(f"Events recorded: {len(recorder.events)}")
    print(f"Merkle root: {recorder.get_merkle_root()}")

    # Demonstrate tamper detection
    print("\n--- Tampering with event ---")
    recorder.events[1].action["classification"] = "VEHICLE"  # Tamper!
    print(f"Chain valid after tampering: {recorder.verify_chain()}")
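
The SDK above stops at hashing; it does not populate the signature field from the data model. A minimal sketch of how that could be added with the cryptography package follows. The signing scope (the event's chain hash) and the in-process key are assumptions for illustration; a production key would be provisioned into an HSM or secure element and never leave it.

import base64
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Illustration only: generate a throwaway per-vehicle key in process.
vehicle_key = Ed25519PrivateKey.generate()
vehicle_pub = vehicle_key.public_key()

def sign_event(event: DVPEvent, key: Ed25519PrivateKey) -> str:
    """Detached Ed25519 signature over the event's chain hash (assumed scope)."""
    return "ed25519:" + base64.b64encode(key.sign(event.hash.encode())).decode()

def verify_event(event: DVPEvent, signature: str, pub) -> bool:
    """True if the signature matches the event hash under the given public key."""
    raw = base64.b64decode(signature.split(":", 1)[1])
    try:
        pub.verify(raw, event.hash.encode())
        return True
    except InvalidSignature:
        return False

# Example usage against the recorder built above
sig = sign_event(recorder.events[-1], vehicle_key)
print(f"Signature valid: {verify_event(recorder.events[-1], sig, vehicle_pub)}")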

Anchoring Strategy: Not Real-Time

A common question: "Does this require constant blockchain writes?"

No. DVP uses periodic anchoring, not real-time writes.

┌─────────────────────────────────────────────────────────────┐
│  DURING DRIVING (Local Processing)                          │
│                                                             │
│  Sensors → AI Decision → DVP Event → Hash Chain             │
│                                    ↓                        │
│                              Local Storage                  │
│                              (Vehicle SSD)                  │
└─────────────────────────────────────────────────────────────┘

                            ↓ Periodic (every 10 min or trip end)

┌─────────────────────────────────────────────────────────────┐
│  ANCHORING (32-byte Merkle root only)                       │
│                                                             │
│  Merkle Root → Public Timestamp Authority                   │
│             → Optional: Public Blockchain                   │
│                                                             │
│  Cost: ~$0.01 per anchor                                    │
│  Bandwidth: 32 bytes                                        │
└─────────────────────────────────────────────────────────────┘

Why This Works

  • Cost: Anchoring a 32-byte hash is cheap
  • Bandwidth: Driving conditions may have poor connectivity
  • Sufficient: If the Merkle root is immutable, the entire dataset's integrity is provable
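
On top of the DVPRecorder from the SDK above, the periodic anchoring step could be as simple as the following sketch. The anchor payload and the ten-minute interval mirror the diagram; the actual submission to a timestamp authority or blockchain is left as an abstract hook (submit_anchor is a hypothetical callback, not a real API).

import time

ANCHOR_INTERVAL_S = 600  # every 10 minutes, or at trip end

def build_anchor(recorder: DVPRecorder) -> dict:
    """Bundle the 32-byte Merkle root with minimal metadata for anchoring."""
    return {
        "vehicle_id": recorder.vehicle_id,
        "anchored_at_ns": time.time_ns(),
        "event_count": len(recorder.events),
        "merkle_root": recorder.get_merkle_root(),  # "sha3-256:<hex>"
    }

def anchoring_loop(recorder: DVPRecorder, submit_anchor) -> None:
    """Periodically hand the current Merkle root to an external anchoring hook."""
    while True:
        time.sleep(ANCHOR_INTERVAL_S)
        submit_anchor(build_anchor(recorder))  # e.g. a timestamp authority client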

The Path to Adoption

Let's be honest about the adoption timeline:

Short-term (Unlikely)

  • Voluntary industry adoption → Low probability
  • OEMs have no incentive to standardize

Medium-term (Possible)

Two scenarios could force adoption:

  1. Regulatory mandate: A major fatal crash → public outcry → legislation
     • This mirrors aviation's path (1956 crash → black box mandates)
  2. Insurance pressure: "DVP-compliant vehicles get lower premiums"
     • Insurance actuaries love verifiable data

Long-term (Strategic)

  1. Prove the concept works in finance (VCP is already live)
  2. Demonstrate cross-domain applicability
  3. Wait for the inevitable regulatory window

DVP is a specification waiting for its moment.


The Relationship to VCP

DVP shares the same design philosophy as VCP (VeritasChain Protocol) for financial trading:

| Aspect | VCP (Finance) | DVP (Automotive) |
|---|---|---|
| What stays secret | Alpha generation logic | Sensor fusion algorithms |
| What's standardized | Trade audit trail | Decision audit trail |
| Hash chain | ✅ | ✅ |
| Merkle anchoring | ✅ | ✅ |
| Digital signatures | Ed25519 | Ed25519 |
| Regulatory driver | EU AI Act Art. 12, MiFID II | EU AI Act, UNECE WP.29 |

VCP's success in finance validates DVP's approach for automotive.

Both are profiles of VAP (Verifiable AI Provenance Framework)—the meta-standard for AI flight recorders across all high-risk domains.


Get Involved

DVP is currently in specification phase. We're looking for:

  • Automotive engineers with autonomous vehicle experience
  • Functional safety experts (ISO 26262, ISO 21448)
  • Regulatory specialists familiar with UNECE WP.29
  • Security researchers interested in automotive cryptography



Final Thought

When the next autonomous vehicle fatality happens—and it will—we'll face the same questions we faced in 2018:

  • What did the AI actually decide?
  • Can we trust the logs?
  • How do we prevent this from happening again?

Aviation answered these questions decades ago with flight recorders. The automotive industry is still pretending the question doesn't exist.

DVP isn't about blame. It's about learning.

If we can't verify what our autonomous systems actually did, we can't improve them. If we can't prove the logs are authentic, accountability becomes impossible.

The technology exists. The specification is ready. The only question is whether we'll adopt it before or after the next tragedy.


"Encoding Trust in the Age of Autonomous Machines"


About DVP: DVP (Driving Vehicle Protocol) is a planned profile of the VAP (Verifiable AI Provenance Framework), developed by the VeritasChain Standards Organization (VSO). VCP (VeritasChain Protocol), the finance profile, is already in production. DVP applies the same proven cryptographic architecture to autonomous vehicles. The specification is open source under CC BY 4.0.
