The EU AI Act entered into force in August 2024. Article 12 mandates "automatic recording of events (logs) over the lifetime of the system" for high-risk AI—but provides zero technical specification for how to implement it.
No format. No integrity mechanism. No retention architecture.
This isn't an oversight; it's deliberately technology-neutral regulation. But it leaves developers without concrete guidance.
VeritasChain Protocol (VCP) v1.1 fills this gap. It's an open standard for tamper-proof audit trails that enables third-party verification without trusting the log producer. Think Certificate Transparency, but for AI trading decisions.
Let's build one.
TL;DR
- 🔐 Three-layer architecture: Event hashes → Merkle trees → External anchoring
- 📜 EU AI Act Article 12 compliant (and exceeds requirements)
- 🆓 Free anchoring options: OpenTimestamps, FreeTSA
- 🐍 Python implementation included below
- 📊 Three compliance tiers: Silver (retail), Gold (institutional), Platinum (HFT)
Why Your AI Needs a Flight Recorder
Every time your trading algorithm makes a decision, you need to prove:
1. What decision was made (the event)
2. When it was made (timestamp)
3. That it hasn't been altered (integrity)
4. That no events were deleted (completeness)
Traditional logging gives you #1 and maybe #2. VCP gives you all four—with cryptographic proof that third parties can verify without trusting you.
Traditional Audit Trail:
┌──────────────────────────────────────────────────┐
│  Your System  →  Your Logs  →  "Trust me, bro"   │
└──────────────────────────────────────────────────┘
VCP Audit Trail:
┌──────────────────────────────────────────────────┐
│  Your System  →  Merkle Tree  →  Blockchain/TSA  │
│                                  ↓               │
│                       Third-party can verify     │
│                        WITHOUT trusting you      │
└──────────────────────────────────────────────────┘
The Three-Layer Architecture
VCP v1.1 separates concerns into three distinct layers:
Layer 1: Event Integrity
Every event gets a SHA-256 hash. Simple.
import hashlib
import json
def canonicalize_json(obj: dict) -> str:
"""RFC 8785 JSON Canonicalization (simplified)"""
return json.dumps(obj, sort_keys=True, separators=(',', ':'))
def calculate_event_hash(header: dict, payload: dict) -> str:
"""Calculate SHA-256 hash of canonical event"""
canonical = canonicalize_json(header) + canonicalize_json(payload)
return hashlib.sha256(canonical.encode()).hexdigest()
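For a quick sanity check, hashing a minimal event with these helpers (field values below are purely illustrative) yields a 64-character hex digest:

# Illustrative sanity check of the Layer 1 helpers above
sample_header = {"Version": "1.1", "EventType": "SIG", "Timestamp": 1735084800000000}
sample_payload = {"Symbol": "EURUSD", "Direction": "BUY", "Confidence": 0.87}
print(calculate_event_hash(sample_header, sample_payload))  # 64 hex characters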
Layer 2: Collection Integrity (Merkle Trees)
Individual hashes are great, but they don't prove completeness. Someone could delete events and you'd never know.
Merkle trees solve this. We batch events, build a tree, and get a single Merkle root that represents the entire batch. Delete one event? The root changes. Cryptographic proof of completeness.
def merkle_hash(data: bytes, is_leaf: bool = True) -> bytes:
"""
RFC 6962 compliant Merkle hashing with domain separation
Why domain separation? Prevents second preimage attacks.
Leaf nodes get 0x00 prefix, internal nodes get 0x01.
"""
prefix = b'\x00' if is_leaf else b'\x01'
return hashlib.sha256(prefix + data).digest()
def build_merkle_tree(event_hashes: list[str]) -> tuple[str, list]:
"""
    Build Merkle tree (RFC 6962-style domain separation) from event hashes
Returns: (merkle_root, tree_levels)
"""
if not event_hashes:
return "0" * 64, []
# Convert hex strings to bytes and hash as leaves
leaves = [merkle_hash(bytes.fromhex(h), is_leaf=True) for h in event_hashes]
tree = [leaves]
current_level = leaves
while len(current_level) > 1:
next_level = []
for i in range(0, len(current_level), 2):
left = current_level[i]
            # Odd node count: pair the last node with itself
            # (simplification; strict RFC 6962 splits unbalanced subtrees instead)
right = current_level[i + 1] if i + 1 < len(current_level) else left
parent = merkle_hash(left + right, is_leaf=False)
next_level.append(parent)
tree.append(next_level)
current_level = next_level
return current_level[0].hex(), tree
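The tree levels returned above are enough to produce per-event inclusion proofs (the audit paths referenced later in the compliance mapping). Here's a minimal sketch, assuming the duplicate-last-node layout used by build_merkle_tree above; it is illustrative, not a normative part of VCP:

def merkle_inclusion_proof(tree: list, index: int) -> list[tuple[bytes, bool]]:
    """Collect (sibling_hash, sibling_is_right) pairs from the leaf level to the root."""
    proof = []
    for level in tree[:-1]:  # every level except the root
        sibling_index = index ^ 1
        # With an odd node count the last node was paired with itself
        sibling = level[sibling_index] if sibling_index < len(level) else level[index]
        proof.append((sibling, sibling_index > index))
        index //= 2
    return proof

def verify_inclusion(event_hash: str, proof: list[tuple[bytes, bool]], root_hex: str) -> bool:
    """Recompute the path from a single event hash up to the Merkle root."""
    node = merkle_hash(bytes.fromhex(event_hash), is_leaf=True)
    for sibling, sibling_is_right in proof:
        pair = node + sibling if sibling_is_right else sibling + node
        node = merkle_hash(pair, is_leaf=False)
    return node.hex() == root_hex

# Usage: root, tree = build_merkle_tree(hashes)
#        assert verify_inclusion(hashes[i], merkle_inclusion_proof(tree, i), root)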
Layer 3: External Verifiability
Here's where VCP v1.1 gets interesting. We take that Merkle root and anchor it externally—to a blockchain, an RFC 3161 timestamp authority, or even OpenTimestamps (free!).
Now a third party can verify your logs against an independent timestamp. You can't modify history because the anchor exists outside your control.
import requests
from datetime import datetime
def anchor_to_opentimestamps(merkle_root: str) -> dict:
"""
Anchor Merkle root to Bitcoin via OpenTimestamps
FREE! Uses Bitcoin blockchain for immutable timestamping.
"""
# In production, use the opentimestamps-client library
# This is a simplified example
return {
"type": "PUBLIC_SERVICE",
"identifier": "opentimestamps.org",
"merkle_root": merkle_root,
"timestamp": datetime.utcnow().isoformat(),
"proof": "ots_proof_bytes_here"
}
def anchor_to_rfc3161_tsa(merkle_root: str, tsa_url: str) -> dict:
"""
Anchor to RFC 3161 Time Stamp Authority
Options:
- FreeTSA (free): https://freetsa.org/tsr
- DigiCert (commercial)
- GlobalSign (commercial)
"""
    # Build the RFC 3161 timestamp request (simplified; build_timestamp_request is a
    # placeholder -- in practice use a library such as rfc3161ng or asn1crypto to
    # construct the ASN.1 TimeStampReq)
    ts_request = build_timestamp_request(bytes.fromhex(merkle_root))
response = requests.post(
tsa_url,
data=ts_request,
headers={"Content-Type": "application/timestamp-query"}
)
return {
"type": "TSA",
"identifier": tsa_url,
"merkle_root": merkle_root,
"proof": response.content.hex()
}
Complete Implementation: Silver Tier
Silver tier is designed for retail traders (MT4/MT5, individual algos). Minimal infrastructure, daily anchoring, millisecond precision.
import hashlib
import json
import time
import os
from dataclasses import dataclass, field
from typing import Optional
from datetime import datetime
import base64
# Signature library (pip install pynacl)
from nacl.signing import SigningKey, VerifyKey
from nacl.encoding import HexEncoder
@dataclass
class VCPEvent:
"""A single VCP event"""
event_id: str
    event_type: str  # INIT, SIG, ORD, ACK, EXE, PRT, CXL, ERROR
timestamp: int # Microseconds since epoch
payload: dict
event_hash: str = ""
signature: str = ""
merkle_index: int = -1
merkle_root: str = ""
@dataclass
class AnchorRecord:
"""External anchor record"""
merkle_root: str
signature: str
timestamp: int
anchor_type: str
anchor_proof: str
event_count: int
first_event_id: str
last_event_id: str
policy_id: str
class SilverTierVCP:
"""
VCP v1.1 Silver Tier Implementation
Requirements:
- EventHash: REQUIRED
- PrevHash: OPTIONAL (not used in Silver for simplicity)
- Merkle Tree: REQUIRED (daily batches)
- Digital Signature: REQUIRED (Ed25519)
- External Anchor: REQUIRED (daily)
- Clock Sync: Best-effort
- Precision: Millisecond
"""
def __init__(self, policy_id: str, private_key_hex: Optional[str] = None):
self.policy_id = policy_id
# Generate or load Ed25519 key
if private_key_hex:
self.signing_key = SigningKey(bytes.fromhex(private_key_hex))
else:
self.signing_key = SigningKey.generate()
self.verify_key = self.signing_key.verify_key
self.pending_events: list[VCPEvent] = []
self.anchors: list[AnchorRecord] = []
self.event_counter = 0
def get_public_key_hex(self) -> str:
"""Get public key for verification"""
return self.verify_key.encode(encoder=HexEncoder).decode()
def _generate_event_id(self) -> str:
"""Generate UUIDv7-like event ID (time-ordered)"""
# Simplified: timestamp + counter
ts = int(time.time() * 1000)
self.event_counter += 1
return f"{ts:012x}-{self.event_counter:04x}"
def _canonicalize(self, obj: dict) -> str:
"""RFC 8785 JSON Canonicalization"""
return json.dumps(obj, sort_keys=True, separators=(',', ':'))
def _calculate_hash(self, header: dict, payload: dict) -> str:
"""Calculate SHA-256 event hash"""
canonical = self._canonicalize(header) + self._canonicalize(payload)
return hashlib.sha256(canonical.encode()).hexdigest()
def _sign(self, data: str) -> str:
"""Sign data with Ed25519"""
signature = self.signing_key.sign(data.encode())
return base64.b64encode(signature.signature).decode()
def log_event(
self,
event_type: str,
payload: dict,
timestamp_us: Optional[int] = None
) -> VCPEvent:
"""
Log a new VCP event
Args:
event_type: One of INIT, SIG, ORD, ACK, EXE, PRT, CXL, ERROR
payload: Event-specific data
timestamp_us: Optional timestamp in microseconds (uses current time if not provided)
Returns:
VCPEvent with hash and signature
"""
event_id = self._generate_event_id()
        timestamp = timestamp_us if timestamp_us is not None else int(time.time() * 1_000_000)
header = {
"Version": "1.1",
"EventID": event_id,
"EventType": event_type,
"Timestamp": timestamp,
"TimestampISO": datetime.utcfromtimestamp(timestamp / 1_000_000).isoformat() + "Z",
"PolicyID": self.policy_id,
"ConformanceTier": "SILVER"
}
# Calculate event hash
event_hash = self._calculate_hash(header, payload)
# Sign the hash
signature = self._sign(event_hash)
event = VCPEvent(
event_id=event_id,
event_type=event_type,
timestamp=timestamp,
payload=payload,
event_hash=event_hash,
signature=signature
)
self.pending_events.append(event)
return event
def _build_merkle_tree(self, hashes: list[str]) -> tuple[str, list]:
"""Build RFC 6962 Merkle tree"""
if not hashes:
return "0" * 64, []
def merkle_hash(data: bytes, is_leaf: bool) -> bytes:
prefix = b'\x00' if is_leaf else b'\x01'
return hashlib.sha256(prefix + data).digest()
leaves = [merkle_hash(bytes.fromhex(h), is_leaf=True) for h in hashes]
tree = [leaves]
current = leaves
while len(current) > 1:
next_level = []
for i in range(0, len(current), 2):
left = current[i]
right = current[i + 1] if i + 1 < len(current) else left
next_level.append(merkle_hash(left + right, is_leaf=False))
tree.append(next_level)
current = next_level
return current[0].hex(), tree
def anchor_batch(self, anchor_func=None) -> Optional[AnchorRecord]:
"""
Anchor pending events to external service
MUST be called at least once every 24 hours for Silver tier
Args:
anchor_func: Optional custom anchoring function.
If None, prints anchor record (for testing).
Returns:
AnchorRecord or None if no pending events
"""
if not self.pending_events:
return None
# Build Merkle tree from event hashes
hashes = [e.event_hash for e in self.pending_events]
merkle_root, tree = self._build_merkle_tree(hashes)
# Sign the Merkle root
root_signature = self._sign(merkle_root)
# Create anchor record
anchor = AnchorRecord(
merkle_root=merkle_root,
signature=root_signature,
timestamp=int(time.time() * 1_000_000),
anchor_type="PENDING", # Will be updated by anchor_func
anchor_proof="",
event_count=len(self.pending_events),
first_event_id=self.pending_events[0].event_id,
last_event_id=self.pending_events[-1].event_id,
policy_id=self.policy_id
)
# Execute external anchoring
if anchor_func:
proof = anchor_func(merkle_root)
anchor.anchor_type = proof.get("type", "CUSTOM")
anchor.anchor_proof = proof.get("proof", "")
else:
# Testing mode: just print
print(f"[ANCHOR] Merkle Root: {merkle_root}")
print(f"[ANCHOR] Events: {anchor.event_count}")
anchor.anchor_type = "TEST"
# Update events with Merkle info
for i, event in enumerate(self.pending_events):
event.merkle_index = i
event.merkle_root = merkle_root
self.anchors.append(anchor)
self.pending_events = []
return anchor
def export_event_json(self, event: VCPEvent) -> dict:
"""Export event as VCP-compliant JSON"""
return {
"Header": {
"Version": "1.1",
"EventID": event.event_id,
"EventType": event.event_type,
"Timestamp": event.timestamp,
"TimestampISO": datetime.utcfromtimestamp(
event.timestamp / 1_000_000
).isoformat() + "Z"
},
"Payload": event.payload,
"Security": {
"Version": "1.1",
"EventHash": event.event_hash,
"HashAlgo": "SHA256",
"Signature": event.signature,
"SignAlgo": "ED25519",
"MerkleRoot": event.merkle_root,
"MerkleIndex": event.merkle_index
},
"PolicyIdentification": {
"PolicyID": self.policy_id,
"ConformanceTier": "SILVER",
"VerificationDepth": {
"HashChainValidation": False,
"MerkleProofRequired": True,
"ExternalAnchorRequired": True
}
}
}
# ============================================
# USAGE EXAMPLE
# ============================================
if __name__ == "__main__":
# Initialize VCP logger
vcp = SilverTierVCP(policy_id="com.example.trading:silver-algo-v1")
print(f"Public Key: {vcp.get_public_key_hex()}")
print()
# Log trading events
# 1. System initialization
init_event = vcp.log_event("INIT", {
"SystemID": "algo-001",
"Version": "2.3.1",
"Config": {"risk_limit": 1000, "max_position": 10}
})
print(f"INIT Event: {init_event.event_id}")
# 2. Signal generation (AI decision)
signal_event = vcp.log_event("SIG", {
"Symbol": "EURUSD",
"Direction": "BUY",
"Confidence": 0.87,
"ModelVersion": "gpt-trade-v3",
"Features": {
"rsi_14": 32.5,
"macd_signal": 0.0012,
"sentiment_score": 0.65
}
})
print(f"SIG Event: {signal_event.event_id}")
# 3. Order submission
order_event = vcp.log_event("ORD", {
"Symbol": "EURUSD",
"Side": "BUY",
"Type": "LIMIT",
"Price": 1.0850,
"Quantity": 10000,
"OrderID": "ORD-2025-001234"
})
print(f"ORD Event: {order_event.event_id}")
# 4. Execution
exec_event = vcp.log_event("EXE", {
"OrderID": "ORD-2025-001234",
"ExecPrice": 1.0851,
"ExecQuantity": 10000,
"Commission": 0.70
})
print(f"EXE Event: {exec_event.event_id}")
print()
# Anchor the batch (would use OpenTimestamps or TSA in production)
anchor = vcp.anchor_batch()
print()
print("=== Exported Event (JSON) ===")
print(json.dumps(vcp.export_event_json(exec_event), indent=2))
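In production you would replace the test-mode call with a real anchoring function, for example wiring in the FreeTSA helper from Layer 3. A sketch (error handling and receipt storage omitted):

# Sketch: anchor the pending batch via FreeTSA instead of test mode
anchor = vcp.anchor_batch(
    anchor_func=lambda root: anchor_to_rfc3161_tsa(root, "https://freetsa.org/tsr")
)
if anchor:
    print(f"Anchored {anchor.event_count} events, Merkle root {anchor.merkle_root[:16]}...")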
Compliance Tiers at a Glance
| Tier | Target | Clock Sync | Anchor Frequency | Free Options |
|---|---|---|---|---|
| Silver | MT4/MT5, retail algos | Best-effort | 24 hours | OpenTimestamps, FreeTSA |
| Gold | Prop firms, institutions | NTP (<1ms) | 1 hour | FreeTSA |
| Platinum | HFT, exchanges | PTP (<1µs) | 10 minutes | ❌ (blockchain/commercial TSA) |
EU AI Act: The Regulatory Context
Quick fact-check of what we know (January 2026):
✅ Verified: EBA Says "No Significant Contradictions"
The European Banking Authority's November 2025 mapping exercise confirmed:
"No significant contradictions have been found between the AI Act and EU banking and payment sector legislation."
Translation: If you're already MiFID II compliant, you have a head start on AI Act compliance.
✅ Verified: Algorithmic Trading NOT High-Risk (Yet)
Annex III high-risk classifications for finance are limited to:
- Creditworthiness assessment/credit scoring (5b)
- Life and health insurance risk assessment (5c)
Algorithmic trading is NOT listed. The Commission's February 2, 2026 guidelines will provide clarification, but current statutory text doesn't classify trading AI as high-risk.
⚠️ Article 12: The Logging Mandate
For high-risk AI (if your trading system qualifies), Article 12 requires:
"automatic recording of events ('logs') over the lifetime of the system"
Logs must enable:
- Identifying risk situations
- Facilitating post-market monitoring
- Monitoring operations
VCP v1.1 exceeds all three requirements—and works whether or not you're classified as high-risk.
VCP vs Article 12: Feature Mapping
| AI Act Requirement | VCP v1.1 Implementation | Status |
|---|---|---|
| Automatic event recording | Layer 1 EventHash | ✅ Exceeds |
| Traceability | Layer 2 Merkle Tree + Audit Paths | ✅ Exceeds |
| Risk situation identification | VCP-RISK module + Error events | ✅ Full |
| Post-market monitoring | Persistent storage + VCP-GOV | ✅ Full |
| Tampering resistance | Layer 3 External Anchoring | ✅ Exceeds |
| Completeness guarantees | Merkle-based omission detection | ✅ Beyond requirement |
The "Completeness guarantees" row is key—Article 12 doesn't require proving that no events were deleted. VCP provides this anyway because "Verify, Don't Trust" demands it.
VCP-XREF: Dual Logging for Disputes
Here's a scenario: You're trading through a prop firm. You claim you hit your profit target. They claim you didn't. Who's right?
With single-party logging, it's your word against theirs. With VCP-XREF dual logging, both parties maintain independent VCP streams that can be cross-referenced:
# Trader-side event
trader_event = {
"Header": {"EventType": "ORD", "EventID": "019abc..."},
"VCP-XREF": {
"CrossReferenceID": "550e8400-e29b-41d4-a716-446655440000",
"PartyRole": "INITIATOR",
"CounterpartyID": "propfirm.example.com",
"SharedEventKey": {
"OrderID": "ORD-2025-001234",
"Timestamp": 1735084800123456789,
"ToleranceMs": 100
}
}
}
# Prop firm-side event (logged independently)
propfirm_event = {
"Header": {"EventType": "ACK", "EventID": "019def..."},
"VCP-XREF": {
"CrossReferenceID": "550e8400-e29b-41d4-a716-446655440000",
"PartyRole": "COUNTERPARTY",
"CounterpartyID": "trader.example.com",
"SharedEventKey": {
"OrderID": "ORD-2025-001234",
"Timestamp": 1735084800123789012,
"ToleranceMs": 100
}
}
}
If the events match, both parties logged consistently. If they don't, you have cryptographic evidence of a discrepancy. Either way, manipulation requires collusion between both parties AND compromise of external anchors.
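A cross-reference check can be as simple as joining the two streams on CrossReferenceID and comparing the shared key fields within the declared tolerance. A sketch against the field names shown above (not a normative matching algorithm):

def xref_match(initiator_event: dict, counterparty_event: dict) -> bool:
    """Check that two independently logged events describe the same order."""
    a, b = initiator_event["VCP-XREF"], counterparty_event["VCP-XREF"]
    if a["CrossReferenceID"] != b["CrossReferenceID"]:
        return False
    ka, kb = a["SharedEventKey"], b["SharedEventKey"]
    if ka["OrderID"] != kb["OrderID"]:
        return False
    # Timestamps in the example above are nanoseconds; compare within the declared tolerance
    tolerance_ns = max(ka["ToleranceMs"], kb["ToleranceMs"]) * 1_000_000
    return abs(ka["Timestamp"] - kb["Timestamp"]) <= tolerance_ns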
GDPR Compatibility: Crypto-Shredding
"But wait—immutable audit trails conflict with GDPR's right to erasure!"
Not with crypto-shredding:
def log_event_with_privacy(payload: dict, pii_fields: list[str]):
"""Log event with GDPR-compatible privacy protection"""
# Generate per-event encryption key
event_key = os.urandom(32)
# Encrypt PII fields
encrypted_payload = payload.copy()
for field in pii_fields:
if field in encrypted_payload:
encrypted_payload[field] = encrypt_aes_gcm(
payload[field],
event_key
)
# Store key in KMS with erasure capability
key_id = kms.store_key(
event_key,
retention_days=2555, # 7 years
erasure_eligible=True
)
return encrypted_payload, key_id
def execute_gdpr_erasure(key_id: str):
"""Delete encryption key = data becomes cryptographically unrecoverable"""
kms.delete_key(key_id)
# Encrypted data remains in audit trail but is now meaningless noise
The encrypted data stays in the immutable audit trail (for audit purposes), but personal data becomes cryptographically inaccessible when you delete the key.
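The encrypt_aes_gcm and kms calls above are placeholders. A minimal AES-GCM helper using the cryptography package could look like the sketch below; the KMS side is whatever your provider exposes (AWS KMS, Vault, an HSM):

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_aes_gcm(plaintext: str, key: bytes) -> str:
    """Encrypt a single field value; returns hex(nonce || ciphertext || tag)."""
    nonce = os.urandom(12)  # 96-bit nonce, unique per encryption
    ciphertext = AESGCM(key).encrypt(nonce, plaintext.encode(), None)
    return (nonce + ciphertext).hex()

def decrypt_aes_gcm(blob_hex: str, key: bytes) -> str:
    """Inverse operation; impossible once the key has been shredded."""
    blob = bytes.fromhex(blob_hex)
    return AESGCM(key).decrypt(blob[:12], blob[12:], None).decode()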
Post-Quantum Ready
Current VCP signatures use Ed25519—vulnerable to future quantum computers. VCP v1.1 includes crypto-agility with reserved algorithm slots:
| Algorithm | VCP Enum | Status |
|---|---|---|
| Ed25519 | ED25519 | DEFAULT |
| Dilithium | DILITHIUM2 | RESERVED (NIST FIPS 204) |
| FALCON | FALCON512 | RESERVED (NIST FIPS 206) |
When quantum computers become a threat (2030-2035 estimates), you upgrade the SignAlgo field—no protocol changes required.
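What crypto-agility means in code: signing is dispatched on the SignAlgo enum, so activating a reserved algorithm is a configuration change rather than a format change. A sketch (the Dilithium branch is deliberately a stub; no post-quantum library is assumed):

from nacl.signing import SigningKey

def sign_event_hash(event_hash: str, sign_algo: str, key_material: bytes) -> bytes:
    """Dispatch signing on the SignAlgo enum carried in each event's Security block."""
    if sign_algo == "ED25519":
        return SigningKey(key_material).sign(event_hash.encode()).signature
    if sign_algo == "DILITHIUM2":
        # Reserved slot: plug in an ML-DSA (FIPS 204) implementation once activated
        raise NotImplementedError("DILITHIUM2 is reserved, not yet activated")
    raise ValueError(f"Unsupported SignAlgo: {sign_algo}")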
Getting Started
1. Define Your PolicyID
# Format: reverse_domain:local_identifier
policy_id = "com.yourcompany.trading:silver-algo-v1"
2. Choose Your Anchoring Service
Free options:
- OpenTimestamps - Bitcoin-backed, completely free
- FreeTSA - RFC 3161 compliant, free
Commercial options:
- DigiCert TSA
- GlobalSign TSA
- AWS QLDB + SOC 2 attestation
3. Implement the Three Layers
Use the Silver tier implementation above as your starting point. Key checkpoints:
- [ ] Every event has an EventHash
- [ ] Events are batched into Merkle trees
- [ ] Merkle roots are signed (Ed25519)
- [ ] Roots are anchored externally (at least daily)
- [ ] PolicyIdentification included in every event
4. Test Verification
def verify_event(event: dict, public_key_hex: str, merkle_root: str) -> bool:
"""Verify a single event against its Merkle root"""
# 1. Recalculate event hash
recalculated_hash = calculate_event_hash(
event["Header"],
event["Payload"]
)
# 2. Verify hash matches
if recalculated_hash != event["Security"]["EventHash"]:
return False
# 3. Verify signature
verify_key = VerifyKey(bytes.fromhex(public_key_hex))
try:
verify_key.verify(
event["Security"]["EventHash"].encode(),
base64.b64decode(event["Security"]["Signature"])
)
    except Exception:  # nacl raises BadSignatureError on an invalid signature
return False
# 4. Verify Merkle inclusion (simplified)
if event["Security"]["MerkleRoot"] != merkle_root:
return False
return True
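Tying it together with the Silver tier walkthrough above (vcp, exec_event, and anchor are the objects from the usage example):

exported = vcp.export_event_json(exec_event)
assert verify_event(exported, vcp.get_public_key_hex(), anchor.merkle_root)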
Resources
- VCP Specification: github.com/veritaschain/vcp-spec
- IETF Internet-Draft: datatracker.ietf.org/doc/draft-kamimura-scitt-vcp/
- RFC 6962 (Merkle Trees): tools.ietf.org/html/rfc6962
- RFC 8785 (JSON Canonicalization): tools.ietf.org/html/rfc8785
Conclusion
The EU AI Act demands logging. It doesn't specify how. VCP v1.1 provides:
- Tamper-evidence through cryptographic hashing
- Completeness guarantees through Merkle trees
- External verifiability through anchoring
- Non-repudiation through dual logging
- GDPR compatibility through crypto-shredding
- Future-proofing through crypto-agility
Whether your trading AI gets classified as high-risk or not, building verifiable audit trails is just good engineering. When regulators ask "prove your AI made this decision," you'll have cryptographic proof—not just logs.
Questions? Reach out at technical@veritaschain.org or open an issue on GitHub.
VCP is an open standard licensed under CC BY 4.0. Contributions welcome.