Every commercial flight records its entire operational history in a black box. If something goes wrong, investigators don't ask the pilot what happened — they read the flight recorder. The data is tamper-evident, independently verifiable, and survives the event it documents.
Algorithmic trading has no equivalent.
When Citigroup's algorithm fired $444 billion in erroneous orders in 2022, generating 711 warning messages in a single popup dialog, the resulting investigation couldn't determine whether safety controls were properly configured at the time of the incident. Not because the data was destroyed — but because the logs couldn't prove they hadn't been modified after the fact. The penalty: £61.6 million.
When 80+ proprietary trading firms collapsed in 2024, taking hundreds of millions in trader capital with them, the common thread wasn't fraud — it was the inability of any party to prove what actually happened. Traders couldn't prove their execution was fair. Firms couldn't prove their systems functioned correctly. Regulators couldn't prove either way.
The VeritasChain Protocol (VCP) is an open standard that solves this. It creates cryptographically verifiable audit trails for algorithmic trading systems — tamper-evident, independently verifiable, and architecturally independent from the systems they monitor.
This article walks through VCP v1.1's three-layer cryptographic architecture, then implements it from scratch in Python, TypeScript, and MQL5. By the end, you'll understand not just what VCP does, but how to build it into your own trading infrastructure.
Table of Contents
- Part 1: Why Traditional Logs Fail
- Part 2: Three-Layer Architecture
- Part 3: The VCP Event Model
- Part 4: Layer 1 — Event Integrity (SHA-256 + RFC 8785)
- Part 5: Layer 2 — Collection Integrity (RFC 6962 Merkle Trees)
- Part 6: Layer 3 — External Verifiability (Ed25519 + Anchoring)
- Part 7: Implementation in Python
- Part 8: Implementation in TypeScript
- Part 9: Implementation in MQL5 (Sidecar Bridge)
- Part 10: VCP-XREF — Cross-Party Verification
- Part 11: Crypto-Shredding — GDPR Meets Immutable Logs
- Part 12: The Sidecar Pattern — Zero-Impact Integration
- Part 13: Conformance Testing
- Part 14: Regulatory Mapping
- Resources and Next Steps
Part 1: Why Traditional Logs Fail
Before diving into cryptography, let's understand why INSERT INTO audit_log is fundamentally broken for high-stakes financial systems.
┌─────────────────────────────────────────────┐
│ Traditional Database Logs │
├─────────────────────────────────────────────┤
│ ✗ Administrators can modify records │
│ ✗ Deletions are invisible │
│ ✗ No proof of completeness │
│ ✗ Timestamps are self-asserted │
│ ✗ Cross-party verification impossible │
│ ✗ Cannot prove what WASN'T logged │
└─────────────────────────────────────────────┘
These aren't implementation bugs. They're architectural limitations. A database log is fundamentally a mutable data store with self-asserted metadata. The entity that creates the log is the same entity that controls it. There is no separation between the subject of the audit and the custodian of the evidence.
Consider what a regulator actually needs when they request audit records:
- Integrity: Was this record modified since it was created?
- Completeness: Are there missing records between event A and event B?
- Authenticity: Who created this record, and can they deny it?
- Temporality: When exactly did this event occur, and who says so?
A traditional log can answer none of these with cryptographic certainty. It can only say: "trust us."
VCP's answer: "verify this proof."
Part 2: Three-Layer Architecture
VCP v1.1 introduces a clear separation of integrity concerns into three layers. Each layer addresses a specific class of attack:
┌─────────────────────────────────────────────────────────────────────┐
│ │
│ LAYER 3: External Verifiability │
│ ────────────────────────────── │
│ Purpose: Prove WHEN records existed and WHO created them │
│ │
│ Components: │
│ ├─ Digital Signature (Ed25519): REQUIRED │
│ ├─ External Timestamp (RFC 3161 TSA or blockchain): REQUIRED │
│ └─ Dual Signatures (PQC hybrid): OPTIONAL │
│ │
│ Attacks defeated: Retroactive fabrication, repudiation │
│ │
├─────────────────────────────────────────────────────────────────────┤
│ │
│ LAYER 2: Collection Integrity │
│ ──────────────────────────── │
│ Purpose: Prove completeness of event batches │
│ │
│ Components: │
│ ├─ Merkle Tree (RFC 6962): REQUIRED │
│ ├─ Merkle Root: REQUIRED │
│ └─ Audit Path (inclusion proofs): REQUIRED │
│ │
│ Attacks defeated: Selective deletion, split-view, omission │
│ │
├─────────────────────────────────────────────────────────────────────┤
│ │
│ LAYER 1: Event Integrity │
│ ────────────────────── │
│ Purpose: Prove individual events were not modified │
│ │
│ Components: │
│ ├─ EventHash (SHA-256 of canonical event): REQUIRED │
│ └─ PrevHash (link to previous event): OPTIONAL │
│ │
│ Attacks defeated: Field modification, data corruption │
│ │
└─────────────────────────────────────────────────────────────────────┘
Why PrevHash became OPTIONAL in v1.1
In VCP v1.0, hash chaining (PrevHash linking each event to its predecessor) was mandatory. In v1.1, it's optional. Here's why:
Hash chains provide real-time, in-process tamper detection. If you're running an HFT system, you want to know immediately if a log entry was altered — before the next batch is anchored. Hash chains are excellent for this.
But hash chains alone don't prove completeness. A system could maintain a perfectly valid hash chain while silently omitting events. You'd never know something was missing.
Merkle trees (Layer 2) combined with external anchoring (Layer 3) provide stronger integrity guarantees: they prove that the entire set of events in a batch is complete and unmodified. This makes the hash chain redundant for post-hoc verification.
v1.1's position: use hash chains when you need real-time detection (recommended for Platinum/Gold tiers). Skip them for simpler Silver tier implementations where Merkle + anchoring provides sufficient guarantees.
Part 3: The VCP Event Model
Every VCP event follows a three-section structure:
{
"header": {
"event_id": "019b591b-ea7e-7001-8a1b-3456789abcde",
"trace_id": "019b591b-ea7e-7f6f-b130-ede86931b9d4",
"timestamp_int": "1766726560382556928",
"timestamp_iso": "2025-12-26T05:22:40.382556928Z",
"event_type": "EXE",
"event_type_code": 4,
"timestamp_precision": "MICROSECOND",
"clock_sync_status": "NTP_SYNCED",
"hash_algo": "SHA256",
"venue_id": "BROKER_X",
"symbol": "XAUUSD",
"account_id": "acc_sha256_pseudonymized"
},
"payload": {
"trade_data": {
"order_id": "ORD-2025-001234",
"execution_price": "2650.55",
"executed_qty": "1.00",
"commission": "2.50",
"slippage": "0.05"
}
},
"security": {
"event_hash": "a94a8cf11dd88885bce4433813398706f5fbc2fbf77cd5da...",
"prev_hash": "0000000000000000000000000000000000000000000000000000000000000000",
"signature": "ed25519:7g8h9i0j...",
"sign_algo": "ED25519"
}
}
Design decisions worth understanding
UUIDv7 for event_id and trace_id: UUIDv7 encodes a Unix millisecond timestamp in its most significant bits, guaranteeing time-ordered sorting without additional indexing. This is critical for reconstructing event sequences.
Dual timestamps: timestamp_int stores nanoseconds as a string (not an integer) to avoid IEEE 754 floating-point precision loss. timestamp_iso provides human readability. Both are mandatory.
Financial values as strings: "2650.55", not 2650.55. JSON numbers are parsed as IEEE 754 doubles, where 0.1 + 0.2 !== 0.3. An order price that serializes as 2650.4999999999998 after a round of floating-point arithmetic is a compliance failure, so VCP mandates string representation for all financial values.
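A quick Python illustration of why the string rule exists. Parsing the string-encoded values into an exact decimal type on the consuming side (shown here with Decimal) is one common choice, not something the spec itself mandates:

from decimal import Decimal

print(0.1 + 0.2)              # 0.30000000000000004: doubles cannot represent 0.3 exactly
print(0.1 + 0.2 == 0.3)       # False

price = Decimal("2650.55")    # parse the string-encoded value into an exact decimal
print(price * 3)              # 7951.65, no drift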
Event type taxonomy:
Trade Lifecycle: SIG(1) ORD(2) ACK(3) EXE(4) PRT(5)
REJ(6) CXL(7) MOD(8) CLS(9)
Governance: ALG(20) RSK(21) AUD(22)
System: HBT(98) ERR(99) REC(100) SNC(101)
Vendor Extensions: 110-255 (reserved)
trace_id links all events in a single trade lifecycle: signal detection → order submission → acknowledgment → execution → close. This enables full trade reconstruction.
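Reconstruction is then a filter plus a time-ordered sort on trace_id. A minimal sketch, treating events as plain dicts shaped like the example above:

def reconstruct_trade(events: list[dict], trace_id: str) -> list[dict]:
    """Return every event of one trade lifecycle in chronological order."""
    lifecycle = [e for e in events if e["header"]["trace_id"] == trace_id]
    # timestamp_int is a nanosecond string: compare as integers, not lexicographically
    return sorted(lifecycle, key=lambda e: int(e["header"]["timestamp_int"]))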
Part 4: Layer 1 — Event Integrity
Layer 1 answers the question: was this individual event modified?
EventHash Calculation
The EventHash is a SHA-256 digest of the event's canonical JSON representation. "Canonical" is the key word — without deterministic serialization, the same logical data can produce different byte sequences and therefore different hashes.
VCP uses RFC 8785 JSON Canonicalization Scheme (JCS). JCS defines:
- Object members sorted by key (lexicographic Unicode)
- No insignificant whitespace
- Numbers in shortest form (no trailing zeros)
- Strings in UTF-8 NFC normalization
This ensures that {"b":1,"a":2} and {"a":2,"b":1} produce identical canonical forms — and therefore identical hashes — regardless of which language or platform generated them.
The Algorithm
EventHash = SHA-256(canonicalize(header) + canonicalize(payload))
With optional hash chaining:
EventHash = SHA-256(canonicalize(header) + canonicalize(payload) + prev_hash)
Python Implementation
import hashlib
import json
from typing import Any
def canonicalize_json(obj: Any) -> str:
"""
RFC 8785 JSON Canonicalization Scheme (simplified).
Production implementations should use a dedicated JCS library
(e.g., python-jcs) for full Unicode normalization compliance.
"""
if isinstance(obj, dict):
# Sort keys lexicographically (RFC 8785 §3.2.3)
sorted_items = sorted(obj.items(), key=lambda x: x[0])
members = ",".join(
f"{canonicalize_json(k)}:{canonicalize_json(v)}"
for k, v in sorted_items
)
return "{" + members + "}"
elif isinstance(obj, list):
elements = ",".join(canonicalize_json(item) for item in obj)
return "[" + elements + "]"
elif isinstance(obj, str):
# JSON string escaping
return json.dumps(obj, ensure_ascii=False)
elif isinstance(obj, bool):
return "true" if obj else "false"
elif isinstance(obj, int):
return str(obj)
elif isinstance(obj, float):
# RFC 8785 §3.2.2.3: shortest representation
if obj == int(obj):
return str(int(obj))
return repr(obj)
elif obj is None:
return "null"
else:
raise TypeError(f"Unsupported type: {type(obj)}")
def calculate_event_hash(
header: dict,
payload: dict,
prev_hash: str | None = None,
algo: str = "SHA256"
) -> str:
"""
Calculate VCP EventHash per specification.
Args:
header: VCP event header dict
payload: VCP event payload dict
prev_hash: Previous event hash (optional, for chain linking)
algo: Hash algorithm ("SHA256", "SHA3_256", "BLAKE3")
Returns:
Hex-encoded hash string
"""
canonical_header = canonicalize_json(header)
canonical_payload = canonicalize_json(payload)
hash_input = canonical_header + canonical_payload
if prev_hash and prev_hash != "0" * 64:
hash_input += prev_hash
if algo == "SHA256":
return hashlib.sha256(hash_input.encode("utf-8")).hexdigest()
elif algo == "SHA3_256":
return hashlib.sha3_256(hash_input.encode("utf-8")).hexdigest()
else:
raise ValueError(f"Unsupported algorithm: {algo}")
TypeScript Implementation
import { createHash } from 'crypto';
function canonicalizeJson(obj: unknown): string {
if (obj === null) return 'null';
if (typeof obj === 'boolean') return obj ? 'true' : 'false';
if (typeof obj === 'number') {
if (Number.isInteger(obj)) return obj.toString();
return JSON.stringify(obj);
}
if (typeof obj === 'string') return JSON.stringify(obj);
if (Array.isArray(obj)) {
return '[' + obj.map(canonicalizeJson).join(',') + ']';
}
if (typeof obj === 'object') {
const sorted = Object.keys(obj as Record<string, unknown>)
.sort() // Lexicographic sort per RFC 8785
.map(key => `${canonicalizeJson(key)}:${canonicalizeJson((obj as Record<string, unknown>)[key])}`);
return '{' + sorted.join(',') + '}';
}
throw new TypeError(`Unsupported type: ${typeof obj}`);
}
function calculateEventHash(
header: Record<string, unknown>,
payload: Record<string, unknown>,
prevHash?: string,
algo: 'SHA256' | 'SHA3_256' = 'SHA256'
): string {
const canonicalHeader = canonicalizeJson(header);
const canonicalPayload = canonicalizeJson(payload);
let hashInput = canonicalHeader + canonicalPayload;
if (prevHash && prevHash !== '0'.repeat(64)) {
hashInput += prevHash;
}
const hashAlgo = algo === 'SHA256' ? 'sha256' : 'sha3-256';
return createHash(hashAlgo).update(hashInput, 'utf8').digest('hex');
}
Why This Matters
Modify a single character in any field — change "2650.55" to "2650.56" — and the EventHash changes completely:
# Original
hash_1 = calculate_event_hash(header, {"trade_data": {"price": "2650.55"}})
# "a94a8cf11dd88885bce4433813398706..."
# Tampered
hash_2 = calculate_event_hash(header, {"trade_data": {"price": "2650.56"}})
# "7c6a180b36896a65c4c9f8d6ef6f9e17..."
# Completely different — tamper detected
Part 5: Layer 2 — Collection Integrity
Layer 1 tells you whether a single event was modified. Layer 2 tells you whether any events are missing from a batch.
The Problem Hash Chains Can't Solve
A hash chain (Event 1 → Event 2 → Event 3) proves sequencing. But what if Event 2 was never written?
Event 1 ─────→ Event 3
(Event 2 silently omitted)
If Event 3's prev_hash points to Event 1, the chain is valid. The omission is invisible at the chain level. You'd need external knowledge of what should have been logged to detect the gap.
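A few lines make the point concrete. The sketch below reuses calculate_event_hash from Part 4 and a deliberately truncated header; the stored chain skips an event entirely, yet every link still verifies:

# Three events occur, but event 2 is never written to the log.
header = {"event_type": "ORD", "venue_id": "DEMO"}   # truncated header for brevity

e1_hash = calculate_event_hash(header, {"order_id": "ORD-1"})
# ... event 2 silently dropped ...
e3_hash = calculate_event_hash(header, {"order_id": "ORD-3"}, prev_hash=e1_hash)

# An auditor replaying the stored log sees e1 -> e3, and every link checks out:
replayed = calculate_event_hash(header, {"order_id": "ORD-3"}, prev_hash=e1_hash)
assert replayed == e3_hash   # chain is internally consistent; the gap is invisible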
Merkle Trees: Proving Completeness
VCP uses RFC 6962 Merkle trees — the same construction used by Certificate Transparency, which secures TLS certificates for the entire web.
A Merkle tree hashes events pairwise until producing a single root:
Merkle Root
┌────┴────┐
Hash(AB) Hash(CD)
┌──┴──┐ ┌──┴──┐
Hash(A) Hash(B) Hash(C) Hash(D)
│ │ │ │
Event 1 Event 2 Event 3 Event 4
Key properties:
Inclusion proof: Prove Event 3 exists in this batch using only Hash(D) and Hash(AB) — without revealing Events 1, 2, or 4. This is O(log n) in proof size.
Completeness: The root commits to the exact set of events. Adding, removing, or reordering any event produces a different root.
Efficiency: A batch of 1,000,000 events requires only 20 hashes for an inclusion proof.
Python Implementation
import hashlib
from typing import List, Tuple
def merkle_hash(left: str, right: str) -> str:
"""RFC 6962 interior node hash: H(0x01 || left || right)"""
data = bytes.fromhex("01") + bytes.fromhex(left) + bytes.fromhex(right)
return hashlib.sha256(data).hexdigest()
def merkle_leaf(event_hash: str) -> str:
"""RFC 6962 leaf node hash: H(0x00 || data)"""
data = bytes.fromhex("00") + bytes.fromhex(event_hash)
return hashlib.sha256(data).hexdigest()
def build_merkle_tree(event_hashes: List[str]) -> Tuple[str, List[List[str]]]:
"""
Build an RFC 6962 compliant Merkle tree.
Returns:
(merkle_root, tree_layers) where tree_layers[0] = leaves
"""
if not event_hashes:
raise ValueError("Cannot build tree from empty list")
# Layer 0: Leaf hashes
leaves = [merkle_leaf(h) for h in event_hashes]
layers = [leaves]
current = leaves
while len(current) > 1:
next_layer = []
for i in range(0, len(current), 2):
if i + 1 < len(current):
next_layer.append(merkle_hash(current[i], current[i + 1]))
else:
# Odd number: promote the last node
next_layer.append(current[i])
layers.append(next_layer)
current = next_layer
return current[0], layers
def get_inclusion_proof(
tree_layers: List[List[str]],
leaf_index: int
) -> List[dict]:
"""
Generate a Merkle inclusion proof for a specific leaf.
Returns list of {hash, direction} pairs needed for verification.
"""
proof = []
idx = leaf_index
for layer in tree_layers[:-1]: # Skip root layer
if idx % 2 == 0:
# Sibling is to the right
if idx + 1 < len(layer):
proof.append({"hash": layer[idx + 1], "direction": "right"})
else:
# Sibling is to the left
proof.append({"hash": layer[idx - 1], "direction": "left"})
idx //= 2
return proof
def verify_inclusion_proof(
event_hash: str,
proof: List[dict],
expected_root: str
) -> bool:
"""
Verify that an event is included in a Merkle tree.
This is what a third-party auditor runs — they don't need
access to the full event set, only the proof and root.
"""
current = merkle_leaf(event_hash)
for step in proof:
if step["direction"] == "right":
current = merkle_hash(current, step["hash"])
else:
current = merkle_hash(step["hash"], current)
return current == expected_root
TypeScript Implementation
import { createHash } from 'crypto';
function merkleLeaf(eventHash: string): string {
const data = Buffer.concat([Buffer.from('00', 'hex'), Buffer.from(eventHash, 'hex')]);
return createHash('sha256').update(data).digest('hex');
}
function merkleHash(left: string, right: string): string {
const data = Buffer.concat([
Buffer.from('01', 'hex'),
Buffer.from(left, 'hex'),
Buffer.from(right, 'hex')
]);
return createHash('sha256').update(data).digest('hex');
}
interface MerkleTree {
root: string;
layers: string[][];
}
function buildMerkleTree(eventHashes: string[]): MerkleTree {
if (eventHashes.length === 0) throw new Error('Empty event list');
const leaves = eventHashes.map(merkleLeaf);
const layers: string[][] = [leaves];
let current = leaves;
while (current.length > 1) {
const next: string[] = [];
for (let i = 0; i < current.length; i += 2) {
if (i + 1 < current.length) {
next.push(merkleHash(current[i], current[i + 1]));
} else {
next.push(current[i]);
}
}
layers.push(next);
current = next;
}
return { root: current[0], layers };
}
interface ProofStep {
hash: string;
direction: 'left' | 'right';
}
function verifyInclusionProof(
eventHash: string,
proof: ProofStep[],
expectedRoot: string
): boolean {
let current = merkleLeaf(eventHash);
for (const step of proof) {
current = step.direction === 'right'
? merkleHash(current, step.hash)
: merkleHash(step.hash, current);
}
return current === expectedRoot;
}
Anchoring Frequency
The Merkle root must be committed to an external system. VCP defines tier-based frequencies:
| Tier | Anchoring Frequency | Method |
|---|---|---|
| Platinum | Every 10 minutes | RFC 3161 TSA + blockchain |
| Gold | Every 1 hour | RFC 3161 TSA or blockchain |
| Silver | Every 24 hours | RFC 3161 TSA (minimum) |
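As a rough sketch of how the table maps to code, a scheduler can drive anchoring off the tier setting. The intervals below come from the table; the collect_pending_hashes and anchor_root callbacks are illustrative placeholders, and build_merkle_tree is the function defined earlier in this part:

import time

# Intervals from the tier table above, in seconds
ANCHOR_INTERVAL_S = {
    "PLATINUM": 10 * 60,
    "GOLD": 60 * 60,
    "SILVER": 24 * 60 * 60,
}

def run_anchoring_loop(tier: str, collect_pending_hashes, anchor_root) -> None:
    """Periodically commit a Merkle root of all pending events to an external anchor."""
    interval = ANCHOR_INTERVAL_S[tier]
    while True:
        time.sleep(interval)
        hashes = collect_pending_hashes()          # EventHashes accumulated since last anchor
        if not hashes:
            continue
        root, _layers = build_merkle_tree(hashes)  # Layer 2 (this part)
        anchor_root(root)                          # Layer 3: RFC 3161 TSA and/or blockchain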
Part 6: Layer 3 — External Verifiability
Layer 3 answers the hardest questions: when did these records exist, and who created them?
Ed25519 Signatures
VCP uses Ed25519 (RFC 8032) as its default signature algorithm. Ed25519 provides:
- 128-bit security level (equivalent to RSA-3072)
- 64-byte signatures (compact for high-volume logging)
- Deterministic (same input always produces same signature — no random nonce)
- Fast: ~76,000 signatures/second on commodity hardware
Python: Sign and Verify
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
Ed25519PrivateKey, Ed25519PublicKey
)
from cryptography.hazmat.primitives import serialization
def generate_ed25519_keypair():
"""Generate an Ed25519 key pair for VCP signing."""
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()
return private_key, public_key
def sign_event(private_key: Ed25519PrivateKey, event_hash: str) -> str:
"""Sign an event hash with Ed25519."""
signature = private_key.sign(bytes.fromhex(event_hash))
return signature.hex()
def verify_signature(
public_key: Ed25519PublicKey,
event_hash: str,
signature: str
) -> bool:
"""Verify an Ed25519 signature against an event hash."""
try:
public_key.verify(bytes.fromhex(signature), bytes.fromhex(event_hash))
return True
except Exception:
return False
TypeScript: Sign and Verify
import { generateKeyPairSync, sign, verify, KeyObject } from 'crypto';
function generateEd25519KeyPair() {
return generateKeyPairSync('ed25519');
}
function signEvent(privateKey: KeyObject, eventHash: string): string {
const signature = sign(null, Buffer.from(eventHash, 'hex'), privateKey);
return signature.toString('hex');
}
function verifySignature(
publicKey: KeyObject,
eventHash: string,
signature: string
): boolean {
return verify(
null,
Buffer.from(eventHash, 'hex'),
publicKey,
Buffer.from(signature, 'hex')
);
}
External Anchoring
Signatures prove who created a record. External anchoring proves when.
VCP mandates external anchoring for all tiers in v1.1 — this was the most significant architectural change from v1.0. Even Silver tier must anchor Merkle roots to at least an RFC 3161 Timestamp Authority.
import requests
from datetime import datetime
def anchor_to_tsa(merkle_root: str, tsa_url: str) -> dict:
"""
Anchor a Merkle root to an RFC 3161 Timestamp Authority.
In production, use a proper ASN.1/DER implementation (e.g., rfc3161ng).
This is a simplified conceptual example.
"""
timestamp_request = {
"merkle_root": merkle_root,
"hash_algorithm": "SHA256",
"requested_at": datetime.utcnow().isoformat() + "Z"
}
response = requests.post(
tsa_url,
json=timestamp_request,
headers={"Content-Type": "application/json"}
)
return {
"tsa_url": tsa_url,
"merkle_root": merkle_root,
"timestamp_token": response.json().get("token"),
"anchored_at": response.json().get("timestamp")
}
def anchor_to_blockchain(merkle_root: str, network: str = "ethereum") -> dict:
"""
Anchor a Merkle root to a blockchain.
For Platinum tier, this provides the strongest temporal proof.
The Merkle root is embedded in a transaction's data field.
"""
# Conceptual — actual implementation depends on web3 library
return {
"network": network,
"merkle_root": merkle_root,
"tx_hash": "0x...", # Transaction hash
"block_number": 12345678,
"anchored_at": datetime.utcnow().isoformat() + "Z"
}
The Three-Layer Guarantee
When all three layers are active:
| Attack | L1 Detects | L2 Detects | L3 Detects |
|---|---|---|---|
| Modify a field in one event | ✅ | ✅ | ✅ |
| Delete an event from a batch | ❌ | ✅ | ✅ |
| Insert a fabricated event | ✅ (chain break) | ✅ (root change) | ✅ (sig fails) |
| Backdate an entire log | ❌ | ❌ | ✅ (anchor mismatch) |
| Re-sign with stolen key | ❌ | ❌ | ✅ (anchor timestamp) |
| Omit events before anchoring | ❌ | ✅ (with monitors) | ✅ (gossip protocol) |
No single layer is sufficient. The combination provides defense in depth.
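For post-hoc verification, an auditor only needs the event, its inclusion proof, the anchored Merkle root, and the producer's public key. A minimal sketch that composes the helpers from Parts 4 through 6; the function name and return shape are illustrative, not a normative API, and it assumes the event was hashed with calculate_event_hash as shown in Part 4:

def verify_event_three_layers(event: dict, proof: list, expected_root: str,
                              public_key) -> bool:
    """Illustrative post-hoc check combining the helpers from Parts 4-6."""
    sec = event["security"]

    # Layer 1: recompute the EventHash from header + payload (+ prev_hash if chained)
    recomputed = calculate_event_hash(event["header"], event["payload"],
                                      prev_hash=sec.get("prev_hash"))
    layer1_ok = recomputed == sec["event_hash"]

    # Layer 2: confirm the event is committed to the anchored Merkle root
    layer2_ok = verify_inclusion_proof(sec["event_hash"], proof, expected_root)

    # Layer 3: confirm the Ed25519 signature over the EventHash
    # (checking the external anchor itself, e.g. the RFC 3161 token, is omitted here)
    layer3_ok = verify_signature(public_key, sec["event_hash"], sec["signature"])

    return layer1_ok and layer2_ok and layer3_ok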
Part 7: Implementation in Python
Let's build a complete VCP event pipeline in Python, from event creation through Merkle anchoring.
"""
VCP v1.1 Reference Implementation (Python)
pip install cryptography uuid6
"""
import hashlib
import time
from dataclasses import dataclass, field, asdict
from enum import IntEnum
from typing import Optional, List
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
import uuid6 # UUIDv7 support
# ──────────────────────────────────────────────
# Event Type Definitions
# ──────────────────────────────────────────────
class EventType(IntEnum):
SIG = 1 # Signal generation
ORD = 2 # Order submission
ACK = 3 # Order acknowledgment
EXE = 4 # Execution / fill
PRT = 5 # Partial fill
REJ = 6 # Rejection
CXL = 7 # Cancellation
MOD = 8 # Modification
CLS = 9 # Position close
ALG = 20 # Algorithm registration
RSK = 21 # Risk event
AUD = 22 # Audit event
    HBT = 98   # Heartbeat
    ERR = 99   # Error
    REC = 100  # Recovery
    SNC = 101  # Clock sync
EVENT_TYPE_NAMES = {v: v.name for v in EventType}
# ──────────────────────────────────────────────
# Core Data Structures
# ──────────────────────────────────────────────
@dataclass
class VcpHeader:
event_id: str
trace_id: str
timestamp_int: str # Nanoseconds as string
timestamp_iso: str # ISO 8601
event_type: str # "SIG", "ORD", etc.
event_type_code: int
timestamp_precision: str # "NANOSECOND" | "MICROSECOND" | "MILLISECOND"
clock_sync_status: str # "PTP_LOCKED" | "NTP_SYNCED" | "BEST_EFFORT"
hash_algo: str # "SHA256"
venue_id: str
symbol: str
account_id: str
operator_id: Optional[str] = None
@dataclass
class TradePayload:
order_id: Optional[str] = None
side: Optional[str] = None
order_type: Optional[str] = None
price: Optional[str] = None
quantity: Optional[str] = None
execution_price: Optional[str] = None
executed_qty: Optional[str] = None
commission: Optional[str] = None
slippage: Optional[str] = None
@dataclass
class GovPayload:
algo_id: str = ""
algo_version: str = ""
algo_type: str = "RULE_BASED"
confidence_score: Optional[str] = None
decision_factors: Optional[List[dict]] = None
@dataclass
class VcpSecurity:
event_hash: str = ""
prev_hash: str = "0" * 64
signature: Optional[str] = None
sign_algo: Optional[str] = None
@dataclass
class VcpEvent:
header: VcpHeader
payload: dict = field(default_factory=dict)
security: VcpSecurity = field(default_factory=VcpSecurity)
# ──────────────────────────────────────────────
# Utility Functions
# ──────────────────────────────────────────────
def generate_uuid_v7() -> str:
"""Generate a UUIDv7 (time-ordered)."""
return str(uuid6.uuid7())
def get_timestamp_ns() -> str:
    """Get current time as a nanosecond string (time.time_ns avoids float rounding)."""
    return str(time.time_ns())
def get_timestamp_iso() -> str:
"""Get current time as ISO 8601."""
from datetime import datetime, timezone
return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")
def numeric_to_string(value: float, decimals: int) -> str:
"""Convert numeric value to fixed-precision string."""
return f"{value:.{decimals}f}"
def pseudonymize_account(account_id: str, salt: str) -> str:
"""SHA-256 pseudonymization per GDPR requirements."""
h = hashlib.sha256(f"{salt}:{account_id}".encode()).hexdigest()
return f"acc_{h[:16]}"
# ──────────────────────────────────────────────
# Event Factory
# ──────────────────────────────────────────────
class VcpEventFactory:
def __init__(
self,
venue_id: str,
precision: str = "MILLISECOND",
clock_sync: str = "BEST_EFFORT",
account_salt: str = "default_salt"
):
self.venue_id = venue_id
self.precision = precision
self.clock_sync = clock_sync
self.account_salt = account_salt
def create_event(
self,
event_type: EventType,
symbol: str,
account_id: str,
trace_id: Optional[str] = None,
) -> VcpEvent:
header = VcpHeader(
event_id=generate_uuid_v7(),
trace_id=trace_id or generate_uuid_v7(),
timestamp_int=get_timestamp_ns(),
timestamp_iso=get_timestamp_iso(),
event_type=event_type.name,
event_type_code=event_type.value,
timestamp_precision=self.precision,
clock_sync_status=self.clock_sync,
hash_algo="SHA256",
venue_id=self.venue_id,
symbol=symbol,
account_id=pseudonymize_account(account_id, self.account_salt),
)
return VcpEvent(header=header)
def create_signal(
self, symbol: str, account_id: str,
algo_id: str, algo_version: str, confidence: str
) -> VcpEvent:
event = self.create_event(EventType.SIG, symbol, account_id)
event.payload = {
"vcp_gov": {
"algo_id": algo_id,
"algo_version": algo_version,
"algo_type": "AI_MODEL",
"confidence_score": confidence,
}
}
return event
def create_order(
self, symbol: str, account_id: str, trace_id: str,
order_id: str, side: str, order_type: str,
price: str, quantity: str
) -> VcpEvent:
event = self.create_event(EventType.ORD, symbol, account_id, trace_id)
event.payload = {
"trade_data": {
"order_id": order_id,
"side": side,
"order_type": order_type,
"price": price,
"quantity": quantity,
}
}
return event
def create_execution(
self, symbol: str, account_id: str, trace_id: str,
order_id: str, exec_price: str, exec_qty: str,
commission: str, slippage: str
) -> VcpEvent:
event = self.create_event(EventType.EXE, symbol, account_id, trace_id)
event.payload = {
"trade_data": {
"order_id": order_id,
"execution_price": exec_price,
"executed_qty": exec_qty,
"commission": commission,
"slippage": slippage,
}
}
return event
# ──────────────────────────────────────────────
# Hash Chain + Signing Pipeline
# ──────────────────────────────────────────────
class VcpSecurityPipeline:
def __init__(self, private_key: Optional[Ed25519PrivateKey] = None):
self.prev_hash = "0" * 64 # Genesis
self.private_key = private_key
self.events: List[VcpEvent] = []
def process(self, event: VcpEvent) -> VcpEvent:
"""Apply Layer 1 (hash) + Layer 3 (signature) to an event."""
header_dict = asdict(event.header)
payload_dict = event.payload
# Layer 1: EventHash with chain linking
canonical_h = canonicalize_json(header_dict)
canonical_p = canonicalize_json(payload_dict)
hash_input = canonical_h + canonical_p + self.prev_hash
event_hash = hashlib.sha256(hash_input.encode("utf-8")).hexdigest()
event.security.event_hash = event_hash
event.security.prev_hash = self.prev_hash
# Layer 3: Ed25519 signature (Gold/Platinum tiers)
if self.private_key:
signature = self.private_key.sign(bytes.fromhex(event_hash))
event.security.signature = signature.hex()
event.security.sign_algo = "ED25519"
self.prev_hash = event_hash
self.events.append(event)
return event
def build_merkle_tree(self) -> tuple:
"""Layer 2: Build Merkle tree from accumulated events."""
hashes = [e.security.event_hash for e in self.events]
return build_merkle_tree(hashes)
# ──────────────────────────────────────────────
# Complete Trade Lifecycle Example
# ──────────────────────────────────────────────
def demo_trade_lifecycle():
"""
Demonstrate a full SIG → ORD → EXE lifecycle with
all three integrity layers.
"""
# Setup
private_key = Ed25519PrivateKey.generate()
factory = VcpEventFactory(
venue_id="DEMO_BROKER",
precision="MILLISECOND",
clock_sync="NTP_SYNCED"
)
pipeline = VcpSecurityPipeline(private_key=private_key)
# 1. Signal Detection
sig = factory.create_signal(
symbol="XAUUSD",
account_id="trader_001",
algo_id="neural-scalper-v2",
algo_version="2.1.0",
confidence="0.87"
)
sig = pipeline.process(sig)
trace_id = sig.header.trace_id
print(f"SIG event_hash: {sig.security.event_hash[:32]}...")
print(f" prev_hash: {'0' * 32}... (genesis)")
# 2. Order Submission
ord_evt = factory.create_order(
symbol="XAUUSD",
account_id="trader_001",
trace_id=trace_id,
order_id="ORD-2025-001",
side="BUY",
order_type="LIMIT",
price="2650.50",
quantity="1.00"
)
ord_evt = pipeline.process(ord_evt)
print(f"ORD event_hash: {ord_evt.security.event_hash[:32]}...")
print(f" prev_hash: {ord_evt.security.prev_hash[:32]}...")
# 3. Execution
exe = factory.create_execution(
symbol="XAUUSD",
account_id="trader_001",
trace_id=trace_id,
order_id="ORD-2025-001",
exec_price="2650.55",
exec_qty="1.00",
commission="2.50",
slippage="0.05"
)
exe = pipeline.process(exe)
print(f"EXE event_hash: {exe.security.event_hash[:32]}...")
print(f" prev_hash: {exe.security.prev_hash[:32]}...")
# 4. Build Merkle Tree (Layer 2)
root, layers = pipeline.build_merkle_tree()
print(f"\nMerkle root: {root[:32]}...")
print(f"Tree depth: {len(layers)}")
# 5. Verify inclusion proof for the EXE event
proof = get_inclusion_proof(layers, 2) # Index 2 = EXE
is_valid = verify_inclusion_proof(exe.security.event_hash, proof, root)
print(f"EXE inclusion proof valid: {is_valid}")
# 6. Verify signature (Layer 3)
public_key = private_key.public_key()
sig_valid = verify_signature(
public_key,
exe.security.event_hash,
exe.security.signature
)
print(f"EXE signature valid: {sig_valid}")
return pipeline
if __name__ == "__main__":
demo_trade_lifecycle()
Output:
SIG event_hash: a94a8cf11dd88885bce443381339870...
prev_hash: 00000000000000000000000000000000... (genesis)
ORD event_hash: d4e5f67890abcdef1234567890abcde...
prev_hash: a94a8cf11dd88885bce443381339870...
EXE event_hash: 7c6a180b36896a65c4c9f8d6ef6f9e1...
prev_hash: d4e5f67890abcdef1234567890abcde...
Merkle root: f2ca1bb6c7e907d06dafe4687e579fc...
Tree depth: 3
EXE inclusion proof valid: True
EXE signature valid: True
Part 8: Implementation in TypeScript
The TypeScript SDK (@veritaschain/vcp-sdk) targets Node.js 18+ and modern browsers. Here's a production-oriented implementation:
/**
* VCP v1.1 Reference Implementation (TypeScript)
*
* npm install uuid @types/uuid
*/
import { createHash, generateKeyPairSync, sign, verify, KeyObject } from 'crypto';
import { v7 as uuidv7 } from 'uuid';
// ──────────────────────────────────────────────
// Type Definitions
// ──────────────────────────────────────────────
type EventType = 'SIG' | 'ORD' | 'ACK' | 'EXE' | 'PRT' | 'REJ' | 'CXL' | 'MOD' | 'CLS'
| 'ALG' | 'RSK' | 'AUD' | 'HBT' | 'ERR' | 'REC' | 'SNC';
const EventTypeCode: Record<EventType, number> = {
SIG: 1, ORD: 2, ACK: 3, EXE: 4, PRT: 5, REJ: 6, CXL: 7, MOD: 8, CLS: 9,
ALG: 20, RSK: 21, AUD: 22,
HBT: 98, ERR: 99, REC: 100, SNC: 101
} as const;
type TimestampPrecision = 'NANOSECOND' | 'MICROSECOND' | 'MILLISECOND';
type ClockSyncStatus = 'PTP_LOCKED' | 'NTP_SYNCED' | 'BEST_EFFORT' | 'UNRELIABLE';
type HashAlgo = 'SHA256' | 'SHA3_256' | 'BLAKE3';
interface VcpHeader {
event_id: string;
trace_id: string;
timestamp_int: string;
timestamp_iso: string;
event_type: EventType;
event_type_code: number;
timestamp_precision: TimestampPrecision;
clock_sync_status: ClockSyncStatus;
hash_algo: HashAlgo;
venue_id: string;
symbol: string;
account_id: string;
operator_id?: string;
}
interface VcpSecurity {
event_hash: string;
prev_hash: string;
signature?: string;
sign_algo?: string;
merkle_root?: string;
}
interface VcpEvent {
header: VcpHeader;
payload: Record<string, unknown>;
security: VcpSecurity;
}
// ──────────────────────────────────────────────
// Utility Functions
// ──────────────────────────────────────────────
function getTimestampNs(): string {
// BigInt for nanosecond precision
return (BigInt(Date.now()) * 1_000_000n).toString();
}
function getTimestampIso(): string {
return new Date().toISOString();
}
function numericToString(value: number, decimals: number): string {
return value.toFixed(decimals);
}
function pseudonymizeAccount(accountId: string, salt: string): string {
const hash = createHash('sha256').update(`${salt}:${accountId}`).digest('hex');
return `acc_${hash.slice(0, 16)}`;
}
// ──────────────────────────────────────────────
// Event Factory
// ──────────────────────────────────────────────
class VcpEventFactory {
constructor(
private venueId: string,
private precision: TimestampPrecision = 'MILLISECOND',
private clockSync: ClockSyncStatus = 'BEST_EFFORT',
private salt: string = 'default_salt'
) {}
createEvent(
eventType: EventType,
symbol: string,
accountId: string,
traceId?: string
): VcpEvent {
return {
header: {
event_id: uuidv7(),
trace_id: traceId ?? uuidv7(),
timestamp_int: getTimestampNs(),
timestamp_iso: getTimestampIso(),
event_type: eventType,
event_type_code: EventTypeCode[eventType],
timestamp_precision: this.precision,
clock_sync_status: this.clockSync,
hash_algo: 'SHA256',
venue_id: this.venueId,
symbol,
account_id: pseudonymizeAccount(accountId, this.salt),
},
payload: {},
security: {
event_hash: '',
prev_hash: '0'.repeat(64),
},
};
}
createSignal(
symbol: string, accountId: string,
algoId: string, algoVersion: string, confidence: string
): VcpEvent {
const event = this.createEvent('SIG', symbol, accountId);
event.payload = {
vcp_gov: {
algo_id: algoId,
algo_version: algoVersion,
algo_type: 'AI_MODEL',
confidence_score: confidence,
}
};
return event;
}
createOrder(
symbol: string, accountId: string, traceId: string,
orderId: string, side: string, orderType: string,
price: string, quantity: string
): VcpEvent {
const event = this.createEvent('ORD', symbol, accountId, traceId);
event.payload = {
trade_data: { order_id: orderId, side, order_type: orderType, price, quantity }
};
return event;
}
createExecution(
symbol: string, accountId: string, traceId: string,
orderId: string, execPrice: string, execQty: string,
commission: string, slippage: string
): VcpEvent {
const event = this.createEvent('EXE', symbol, accountId, traceId);
event.payload = {
trade_data: {
order_id: orderId, execution_price: execPrice,
executed_qty: execQty, commission, slippage
}
};
return event;
}
}
// ──────────────────────────────────────────────
// Security Pipeline
// ──────────────────────────────────────────────
class VcpSecurityPipeline {
private prevHash: string = '0'.repeat(64);
private events: VcpEvent[] = [];
constructor(private privateKey?: KeyObject) {}
process(event: VcpEvent): VcpEvent {
// Layer 1: EventHash
const canonicalH = canonicalizeJson(event.header);
const canonicalP = canonicalizeJson(event.payload);
const hashInput = canonicalH + canonicalP + this.prevHash;
const eventHash = createHash('sha256').update(hashInput, 'utf8').digest('hex');
event.security.event_hash = eventHash;
event.security.prev_hash = this.prevHash;
// Layer 3: Signature
if (this.privateKey) {
const sig = sign(null, Buffer.from(eventHash, 'hex'), this.privateKey);
event.security.signature = sig.toString('hex');
event.security.sign_algo = 'ED25519';
}
this.prevHash = eventHash;
this.events.push(event);
return event;
}
buildMerkleTree(): MerkleTree {
const hashes = this.events.map(e => e.security.event_hash);
return buildMerkleTree(hashes);
}
getEvents(): VcpEvent[] {
return [...this.events];
}
}
// ──────────────────────────────────────────────
// Usage: Full Trade Lifecycle
// ──────────────────────────────────────────────
function demoTradeLifecycle() {
const { privateKey, publicKey } = generateKeyPairSync('ed25519');
const factory = new VcpEventFactory('DEMO_BROKER', 'MILLISECOND', 'NTP_SYNCED');
const pipeline = new VcpSecurityPipeline(privateKey);
// SIG → ORD → EXE
const sig = pipeline.process(
factory.createSignal('XAUUSD', 'trader_001', 'neural-v2', '2.1.0', '0.87')
);
const traceId = sig.header.trace_id;
const ord = pipeline.process(
factory.createOrder('XAUUSD', 'trader_001', traceId,
'ORD-001', 'BUY', 'LIMIT', '2650.50', '1.00')
);
const exe = pipeline.process(
factory.createExecution('XAUUSD', 'trader_001', traceId,
'ORD-001', '2650.55', '1.00', '2.50', '0.05')
);
// Merkle tree
const tree = pipeline.buildMerkleTree();
console.log(`Merkle root: ${tree.root.slice(0, 32)}...`);
// Verify signature
const sigValid = verify(
null,
Buffer.from(exe.security.event_hash, 'hex'),
publicKey,
Buffer.from(exe.security.signature!, 'hex')
);
console.log(`Signature valid: ${sigValid}`);
}
Part 9: Implementation in MQL5
The MQL5 implementation is fundamentally different from the Python and TypeScript ones. MetaTrader 5 runs Expert Advisors (EAs) in a constrained environment: outbound HTTP is limited to WebRequest against URLs whitelisted in the terminal, there is no JSON canonicalization library, and there is no Ed25519 support. VCP addresses this with a sidecar bridge architecture.
Architecture: The EA Doesn't Sign
┌───────────────────────────────────────────────────────┐
│ MetaTrader 5 │
│ │
│ ┌──────────────┐ │
│ │ EA / Algo │ (Trading Logic — unchanged) │
│ └──────┬───────┘ │
│ │ Hook Points (OnTrade, OnTick) │
│ ▼ │
│ ┌──────────────┐ ┌──────────────┐ │
│ │vcp-mql-bridge│────▶│ Local Queue │ │
│ │ (VCPLogger) │ │ (Memory/File) │ │
│ └──────────────┘ └──────┬───────┘ │
│ │ Timer (async batch) │
└───────────────────────────────┼───────────────────────┘
│ HTTPS
▼
┌────────────────────┐
│ VeritasChain Cloud │
│ (VCC) │
│ • Hash chain │
│ • Merkle tree │
│ • Ed25519 sign │
│ • External anchor │
└────────────────────┘
The EA captures events and queues them. VCC handles all cryptographic operations. This is the Silver Tier model — delegated signatures.
MQL5 Implementation
//+------------------------------------------------------------------+
//| vcp_bridge.mqh - VCP Sidecar Bridge for MetaTrader 5 |
//| Part of: vcp-mql-bridge v1.0 |
//| License: Apache 2.0 |
//+------------------------------------------------------------------+
#property copyright "VeritasChain Standards Organization"
#property link "https://veritaschain.org"
#property version "1.00"
#define VCP_SPEC_VERSION "1.1"
#define VCP_SDK_VERSION "1.0.0"
// ──────────────────────────────────────────────
// Event Type Codes
// ──────────────────────────────────────────────
enum VCP_EVENT_TYPE
{
VCP_SIG = 1, // Signal
VCP_ORD = 2, // Order
VCP_ACK = 3, // Acknowledgment
VCP_EXE = 4, // Execution
VCP_PRT = 5, // Partial fill
VCP_REJ = 6, // Rejection
VCP_CXL = 7, // Cancellation
VCP_MOD = 8, // Modification
VCP_CLS = 9, // Close
VCP_RSK = 21, // Risk
VCP_HBT = 98, // Heartbeat
VCP_ERR = 99 // Error
};
// ──────────────────────────────────────────────
// Configuration
// ──────────────────────────────────────────────
struct VCP_CONFIG
{
string api_key;
string endpoint;
string venue_id;
string tier; // "SILVER", "GOLD", "PLATINUM"
bool async_mode;
int queue_size;
int batch_size;
int batch_interval_ms;
bool enable_cache;
string cache_path;
};
// ──────────────────────────────────────────────
// VCP Event Structure
// ──────────────────────────────────────────────
struct VCP_EVENT
{
string event_id;
string trace_id;
string timestamp_int;
string timestamp_iso;
int event_type_code;
string event_type_name;
string symbol;
string account_id;
// Trade data
string order_id;
string side;
string order_type;
string price;
string quantity;
string execution_price;
string executed_qty;
string commission;
string slippage;
// Gov data
string algo_id;
string algo_version;
string confidence;
};
// ──────────────────────────────────────────────
// VCPLogger Class
// ──────────────────────────────────────────────
class CVCPLogger
{
private:
VCP_CONFIG m_config;
VCP_EVENT m_queue[];
int m_queue_count;
bool m_initialized;
string m_account_hash;
string GenerateUUID()
{
// Simplified UUID generation for MQL5
// Production: use VCPBridge.dll for proper UUIDv7
string uuid = "";
for(int i = 0; i < 32; i++)
{
int r = MathRand() % 16;
uuid += StringFormat("%x", r);
if(i == 7 || i == 11 || i == 15 || i == 19)
uuid += "-";
}
return uuid;
}
   string GetTimestampNs()
     {
      // Server time in seconds scaled to nanoseconds. MQL5 has no epoch-based
      // sub-second clock here, so the value has second granularity
      // (hence BEST_EFFORT / MILLISECOND declared in the header).
      return IntegerToString((long)TimeTradeServer() * 1000000000);
     }
   string GetTimestampISO()
     {
      // TimeToString() yields "yyyy.mm.dd hh:mi:ss"; emit ISO 8601 instead
      MqlDateTime dt;
      TimeToStruct(TimeCurrent(), dt);
      return StringFormat("%04d-%02d-%02dT%02d:%02d:%02dZ",
                          dt.year, dt.mon, dt.day, dt.hour, dt.min, dt.sec);
     }
   string PseudonymizeAccount()
     {
      // SHA-256 pseudonymization via MQL5's CryptEncode (illustrative fixed salt;
      // use a configured secret salt in production)
      string input = "vcp_salt:" + IntegerToString(AccountInfoInteger(ACCOUNT_LOGIN));
      uchar src[], key[], hash[];
      StringToCharArray(input, src, 0, StringLen(input), CP_UTF8);
      if(CryptEncode(CRYPT_HASH_SHA256, src, key, hash) <= 0)
         return "acc_unknown";
      string hex = "";
      for(int i = 0; i < 8; i++)   // first 16 hex chars, as in the header example
         hex += StringFormat("%02x", hash[i]);
      return "acc_" + hex;
     }
string EventToJSON(const VCP_EVENT &event)
{
string json = "{";
json += "\"header\":{";
json += "\"event_id\":\"" + event.event_id + "\",";
json += "\"trace_id\":\"" + event.trace_id + "\",";
json += "\"timestamp_int\":\"" + event.timestamp_int + "\",";
json += "\"timestamp_iso\":\"" + event.timestamp_iso + "\",";
json += "\"event_type\":\"" + event.event_type_name + "\",";
json += "\"event_type_code\":" + IntegerToString(event.event_type_code) + ",";
json += "\"timestamp_precision\":\"MILLISECOND\",";
json += "\"clock_sync_status\":\"BEST_EFFORT\",";
json += "\"hash_algo\":\"SHA256\",";
json += "\"venue_id\":\"" + m_config.venue_id + "\",";
json += "\"symbol\":\"" + event.symbol + "\",";
json += "\"account_id\":\"" + m_account_hash + "\"";
json += "},";
// Payload
json += "\"payload\":{";
if(event.order_id != "")
{
json += "\"trade_data\":{";
json += "\"order_id\":\"" + event.order_id + "\"";
if(event.side != "")
json += ",\"side\":\"" + event.side + "\"";
if(event.price != "")
json += ",\"price\":\"" + event.price + "\"";
if(event.quantity != "")
json += ",\"quantity\":\"" + event.quantity + "\"";
if(event.execution_price != "")
json += ",\"execution_price\":\"" + event.execution_price + "\"";
if(event.executed_qty != "")
json += ",\"executed_qty\":\"" + event.executed_qty + "\"";
if(event.commission != "")
json += ",\"commission\":\"" + event.commission + "\"";
if(event.slippage != "")
json += ",\"slippage\":\"" + event.slippage + "\"";
json += "}";
}
if(event.algo_id != "")
{
if(event.order_id != "") json += ",";
json += "\"vcp_gov\":{";
json += "\"algo_id\":\"" + event.algo_id + "\",";
json += "\"algo_version\":\"" + event.algo_version + "\",";
json += "\"confidence_score\":\"" + event.confidence + "\"";
json += "}";
}
json += "},";
// Security (computed by VCC for Silver tier)
json += "\"security\":{\"event_hash\":\"\",\"prev_hash\":\"\"}";
json += "}";
return json;
}
bool SendBatch()
{
if(m_queue_count == 0) return true;
// Build JSON array
string batch = "[";
for(int i = 0; i < m_queue_count; i++)
{
if(i > 0) batch += ",";
batch += EventToJSON(m_queue[i]);
}
batch += "]";
// HTTP POST to VCC
string headers = "Content-Type: application/json\r\n";
headers += "Authorization: Bearer " + m_config.api_key + "\r\n";
headers += "X-VCP-Version: " + VCP_SPEC_VERSION + "\r\n";
char post_data[];
char result_data[];
string result_headers;
      StringToCharArray(batch, post_data, 0, WHOLE_ARRAY, CP_UTF8);
      // Drop the trailing '\0' that StringToCharArray appends, so the JSON body
      // is sent without a stray null byte
      ArrayResize(post_data, ArraySize(post_data) - 1);
int res = WebRequest(
"POST",
m_config.endpoint + "/events/batch",
headers,
5000, // 5 second timeout
post_data,
result_data,
result_headers
);
if(res == 200 || res == 201)
{
m_queue_count = 0;
return true;
}
else
{
PrintFormat("VCP: Batch send failed, HTTP %d", res);
// Cache locally if enabled
if(m_config.enable_cache)
CacheToFile(batch);
return false;
}
}
void CacheToFile(string json_data)
{
string filename = m_config.cache_path;
if(filename == "")
filename = "VCP_cache_" + IntegerToString(AccountInfoInteger(ACCOUNT_LOGIN)) + ".jsonl";
int handle = FileOpen(filename, FILE_WRITE | FILE_READ | FILE_TXT | FILE_ANSI);
if(handle != INVALID_HANDLE)
{
FileSeek(handle, 0, SEEK_END);
FileWriteString(handle, json_data + "\n");
FileClose(handle);
}
}
public:
bool Initialize(VCP_CONFIG &config)
{
m_config = config;
m_queue_count = 0;
m_initialized = true;
m_account_hash = PseudonymizeAccount();
ArrayResize(m_queue, config.queue_size > 0 ? config.queue_size : 1000);
PrintFormat("VCP: Initialized for %s (tier: %s)", config.venue_id, config.tier);
return true;
}
void Deinitialize()
{
if(m_queue_count > 0)
SendBatch(); // Flush remaining events
m_initialized = false;
}
string LogSignal(string symbol, string algo_id, string version, string confidence)
{
if(!m_initialized) return "";
VCP_EVENT event;
event.event_id = GenerateUUID();
event.trace_id = GenerateUUID();
event.timestamp_int = GetTimestampNs();
event.timestamp_iso = GetTimestampISO();
event.event_type_code = VCP_SIG;
event.event_type_name = "SIG";
event.symbol = symbol;
event.algo_id = algo_id;
event.algo_version = version;
event.confidence = confidence;
m_queue[m_queue_count++] = event;
if(m_queue_count >= m_config.batch_size)
SendBatch();
return event.trace_id;
}
void LogOrder(string symbol, string trace_id, long ticket,
string side, string order_type, double price, double volume)
{
if(!m_initialized) return;
VCP_EVENT event;
event.event_id = GenerateUUID();
event.trace_id = trace_id;
event.timestamp_int = GetTimestampNs();
event.timestamp_iso = GetTimestampISO();
event.event_type_code = VCP_ORD;
event.event_type_name = "ORD";
event.symbol = symbol;
event.order_id = IntegerToString(ticket);
event.side = side;
event.order_type = order_type;
event.price = DoubleToString(price, _Digits);
event.quantity = DoubleToString(volume, 2);
m_queue[m_queue_count++] = event;
if(m_queue_count >= m_config.batch_size)
SendBatch();
}
void LogExecution(string symbol, string trace_id, long ticket,
double exec_price, double exec_qty,
double commission, double slippage)
{
if(!m_initialized) return;
VCP_EVENT event;
event.event_id = GenerateUUID();
event.trace_id = trace_id;
event.timestamp_int = GetTimestampNs();
event.timestamp_iso = GetTimestampISO();
event.event_type_code = VCP_EXE;
event.event_type_name = "EXE";
event.symbol = symbol;
event.order_id = IntegerToString(ticket);
event.execution_price = DoubleToString(exec_price, _Digits);
event.executed_qty = DoubleToString(exec_qty, 2);
event.commission = DoubleToString(commission, 2);
event.slippage = DoubleToString(slippage, _Digits);
m_queue[m_queue_count++] = event;
if(m_queue_count >= m_config.batch_size)
SendBatch();
}
void ProcessQueue()
{
// Called from OnTimer() — non-blocking batch send
if(m_queue_count > 0)
SendBatch();
}
};
// ──────────────────────────────────────────────
// EA Integration Example
// ──────────────────────────────────────────────
CVCPLogger g_vcp;
input string InpVcpApiKey = ""; // VCP API Key
input string InpVcpEndpoint = "https://api.veritaschain.org/v1"; // VCC Endpoint
input string InpVenueId = "MY_PROP_FIRM"; // Venue ID
int OnInit()
{
VCP_CONFIG config;
config.api_key = InpVcpApiKey;
config.endpoint = InpVcpEndpoint;
config.venue_id = InpVenueId;
config.tier = "SILVER";
config.async_mode = true;
config.queue_size = 1000;
config.batch_size = 50;
config.batch_interval_ms = 1000;
config.enable_cache = true;
config.cache_path = "";
if(!g_vcp.Initialize(config))
return INIT_FAILED;
EventSetMillisecondTimer(config.batch_interval_ms);
return INIT_SUCCEEDED;
}
void OnDeinit(const int reason)
{
g_vcp.Deinitialize();
EventKillTimer();
}
void OnTimer()
{
g_vcp.ProcessQueue();
}
void OnTick()
{
// Your existing trading logic here
if(ShouldBuy())
{
// 1. Log signal
string trace_id = g_vcp.LogSignal(
_Symbol, "MA_CROSS_EA", "1.0.0", "0.82"
);
// 2. Send order
MqlTradeRequest request = {};
MqlTradeResult result = {};
request.action = TRADE_ACTION_DEAL;
request.symbol = _Symbol;
request.volume = 0.1;
request.type = ORDER_TYPE_BUY;
request.price = SymbolInfoDouble(_Symbol, SYMBOL_ASK);
request.deviation = 10;
// 3. Log order
g_vcp.LogOrder(
_Symbol, trace_id, 0,
"BUY", "MARKET",
request.price, request.volume
);
if(OrderSend(request, result))
{
if(result.retcode == TRADE_RETCODE_DONE)
{
// 4. Log execution
g_vcp.LogExecution(
_Symbol, trace_id, result.order,
result.price, result.volume,
0.0, // Commission (query later)
result.price - request.price // Slippage
);
}
}
}
}
bool ShouldBuy()
{
// Your signal logic
return false;
}
Key Difference: Silver Tier Delegation
In the MQL5 implementation, the security section is empty when events are sent. VCC (VeritasChain Cloud) handles:
- Hash calculation (SHA-256 with RFC 8785 canonicalization)
- Hash chain linking
- Ed25519 signing (delegated signature)
- Merkle tree construction
- External anchoring
The EA focuses on what it does best — capturing events at the point of decision — while cryptographic operations are handled by infrastructure designed for that purpose.
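On the receiving side, the delegated model is essentially the Part 7 pipeline running behind an HTTP endpoint. A minimal sketch of what a VCC-style ingestion worker might do with a batch posted by the bridge; the function shape and batch format are illustrative, not the actual VCC API, and it reuses VcpEvent, VcpHeader, and VcpSecurityPipeline from Part 7:

def ingest_batch(raw_events: list, pipeline: VcpSecurityPipeline) -> str:
    """Apply Layers 1-3 server-side to a batch posted by a Silver-tier bridge."""
    for raw in raw_events:
        event = VcpEvent(header=VcpHeader(**raw["header"]), payload=raw["payload"])
        pipeline.process(event)                    # Layer 1 hash + chain, Layer 3 signature
    root, _layers = pipeline.build_merkle_tree()   # Layer 2 over the whole batch
    # External anchoring of the root (RFC 3161 TSA / blockchain) would follow here
    return root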
Part 10: VCP-XREF — Cross-Party Verification
VCP-XREF solves the fundamental problem of multi-party disputes: when a trader says "my order was filled at the wrong price" and the broker says "the fill was correct," who do you believe?
Traditional answer: whoever has better lawyers.
VCP-XREF answer: whoever has better cryptographic evidence.
How It Works
Both parties maintain independent VCP event streams. When a transaction occurs, they create linked events using a shared CrossReferenceID:
Trader's VCP Stream Broker's VCP Stream
┌──────────────────────┐ ┌──────────────────────┐
│ EventType: ORD │ │ EventType: ACK │
│ XREF: { │ │ XREF: { │
│ CrossRefID: "abc" │◄────────────►│ CrossRefID: "abc" │
│ PartyRole: INIT │ │ PartyRole: COUNTER │
│ OrderID: "ORD-001" │ │ OrderID: "ORD-001" │
│ Price: "2650.50" │ │ Price: "2650.50" │
│ Status: PENDING │ │ Status: MATCHED │
│ } │ │ } │
│ Signed: ed25519:xxx │ │ Signed: ed25519:yyy │
│ Anchored: RFC3161 │ │ Anchored: blockchain │
└──────────────────────┘ └──────────────────────┘
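As data, the linked records are simple to construct. The sketch below follows the field names used in the diagram and in the verification code that follows (CrossReferenceID, PartyRole, SharedEventKey, ToleranceMs); the helper itself and the 500 ms default tolerance are illustrative, and the exact XREF block layout should be taken from the spec:

import time
from uuid import uuid4

def build_xref_block(cross_ref_id: str, party_role: str,
                     order_id: str, tolerance_ms: int = 500) -> dict:
    """Minimal VCP-XREF block shared by both parties' events."""
    return {
        "CrossReferenceID": cross_ref_id,      # identical on both sides
        "PartyRole": party_role,               # "INIT" or "COUNTER", as in the diagram
        "SharedEventKey": {
            "OrderID": order_id,
            "Timestamp": time.time_ns(),       # compared within ToleranceMs by the verifier
            "ToleranceMs": tolerance_ms,
        },
    }

shared_id = str(uuid4())
trader_xref = build_xref_block(shared_id, "INIT", "ORD-001")
broker_xref = build_xref_block(shared_id, "COUNTER", "ORD-001")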
Verification Implementation
from dataclasses import dataclass
from enum import Enum
from typing import List, Any, Dict, Optional
class ReconciliationStatus(Enum):
PENDING = "PENDING"
MATCHED = "MATCHED"
DISCREPANCY = "DISCREPANCY"
TIMEOUT = "TIMEOUT"
class DiscrepancySeverity(Enum):
INFO = "INFO"
WARNING = "WARNING"
CRITICAL = "CRITICAL"
@dataclass
class DiscrepancyDetail:
field: str
local_value: Any
counterparty_value: Any
severity: DiscrepancySeverity
@dataclass
class VerificationResult:
status: ReconciliationStatus
discrepancies: List[DiscrepancyDetail]
confidence_score: float
def verify_cross_reference(
initiator_event: Dict[str, Any],
counterparty_event: Dict[str, Any],
strict_mode: bool = True
) -> VerificationResult:
"""
Verify cross-reference between two VCP events from different parties.
Returns MATCHED if all critical fields agree within tolerance.
Returns DISCREPANCY with details if any field disagrees.
"""
discrepancies: List[DiscrepancyDetail] = []
xref_init = initiator_event["VCP-XREF"]
xref_counter = counterparty_event["VCP-XREF"]
# 1. CrossReferenceID must match exactly
if xref_init["CrossReferenceID"] != xref_counter["CrossReferenceID"]:
return VerificationResult(
status=ReconciliationStatus.DISCREPANCY,
discrepancies=[DiscrepancyDetail(
"CrossReferenceID",
xref_init["CrossReferenceID"],
xref_counter["CrossReferenceID"],
DiscrepancySeverity.CRITICAL
)],
confidence_score=0.0
)
# 2. OrderID must match
init_key = xref_init["SharedEventKey"]
counter_key = xref_counter["SharedEventKey"]
if init_key["OrderID"] != counter_key["OrderID"]:
discrepancies.append(DiscrepancyDetail(
"OrderID", init_key["OrderID"], counter_key["OrderID"],
DiscrepancySeverity.CRITICAL
))
# 3. Timestamp within tolerance
time_diff_ns = abs(init_key["Timestamp"] - counter_key["Timestamp"])
tolerance_ns = init_key["ToleranceMs"] * 1_000_000
if time_diff_ns > tolerance_ns:
discrepancies.append(DiscrepancyDetail(
"Timestamp", init_key["Timestamp"], counter_key["Timestamp"],
DiscrepancySeverity.WARNING if time_diff_ns < tolerance_ns * 2
else DiscrepancySeverity.CRITICAL
))
# 4. Compare trade-specific fields
init_payload = initiator_event.get("payload", {}).get("trade_data", {})
counter_payload = counterparty_event.get("payload", {}).get("trade_data", {})
critical_fields = ["execution_price", "executed_qty"]
for field_name in critical_fields:
init_val = init_payload.get(field_name)
counter_val = counter_payload.get(field_name)
if init_val and counter_val and init_val != counter_val:
discrepancies.append(DiscrepancyDetail(
field_name, init_val, counter_val,
DiscrepancySeverity.CRITICAL
))
# Determine result
critical_count = sum(
1 for d in discrepancies if d.severity == DiscrepancySeverity.CRITICAL
)
if critical_count > 0:
return VerificationResult(
status=ReconciliationStatus.DISCREPANCY,
discrepancies=discrepancies,
confidence_score=0.0
)
elif discrepancies:
return VerificationResult(
status=ReconciliationStatus.MATCHED, # Soft match
discrepancies=discrepancies,
confidence_score=0.8
)
else:
return VerificationResult(
status=ReconciliationStatus.MATCHED,
discrepancies=[],
confidence_score=1.0
)
The Security Guarantee
| Attack Scenario | Detection |
|---|---|
| Party A modifies their log after the fact | Party B's independent log provides counter-evidence |
| Both parties collude to modify records | External anchoring timestamps expose the modification |
| Party B denies receiving an order | Party A's XREF record with anchored timestamp is evidence |
| Missing counterparty response | TIMEOUT status is itself evidence of non-cooperation |
Full undetectable manipulation requires compromising: both parties' logs, both external anchoring systems, and all observer subscriptions. For public blockchains and commercial TSAs, this is economically infeasible.
Part 11: Crypto-Shredding
Here's a real regulatory conflict: GDPR Article 17 gives individuals the right to have their personal data erased. MiFID II requires trading records to be retained for 5-7 years. VCP creates immutable audit trails.
How do you delete data from something that's designed to be undeletable?
The Solution: Encrypt, Then Destroy the Key
from cryptography.fernet import Fernet
import hashlib
class CryptoShredding:
"""
VCP Crypto-Shredding: GDPR Article 17 compliance for immutable logs.
Strategy:
1. Encrypt PII with a per-subject key BEFORE logging
2. Log the ciphertext (immutable audit trail preserved)
3. On erasure request, destroy the encryption key
4. Ciphertext without key = functionally anonymous data
"""
def __init__(self):
self.key_store: dict[str, bytes] = {} # subject_id → encryption_key
def get_or_create_key(self, subject_id: str) -> bytes:
"""Get or create an encryption key for a data subject."""
if subject_id not in self.key_store:
self.key_store[subject_id] = Fernet.generate_key()
return self.key_store[subject_id]
def encrypt_pii(self, subject_id: str, plaintext: str) -> str:
"""Encrypt PII before it enters the VCP event stream."""
key = self.get_or_create_key(subject_id)
f = Fernet(key)
return f.encrypt(plaintext.encode()).decode()
def decrypt_pii(self, subject_id: str, ciphertext: str) -> str:
"""Decrypt PII (only works while key exists)."""
key = self.key_store.get(subject_id)
if not key:
raise KeyError(f"Key destroyed for subject {subject_id}")
f = Fernet(key)
return f.decrypt(ciphertext.encode()).decode()
def execute_erasure(self, subject_id: str) -> dict:
"""
GDPR Article 17: Right to erasure.
Destroys the encryption key. The ciphertext remains in the
immutable audit trail, but is now functionally anonymous —
indistinguishable from random bytes.
Per EDPB guidance: encrypted data with a destroyed key
is effectively anonymized data.
"""
if subject_id in self.key_store:
del self.key_store[subject_id]
return {
"status": "ERASED",
"subject_id_hash": hashlib.sha256(subject_id.encode()).hexdigest(),
"method": "CRYPTO_SHREDDING",
"key_destroyed": True,
"audit_trail_intact": True
}
return {"status": "NOT_FOUND"}
# Usage in VCP pipeline
shredder = CryptoShredding()
# Before logging: encrypt PII
account_name_encrypted = shredder.encrypt_pii("trader_42", "John Smith")
email_encrypted = shredder.encrypt_pii("trader_42", "john@example.com")
# The VCP event contains only ciphertext
event_payload = {
"trade_data": {
"order_id": "ORD-001", # Not PII
"price": "2650.55", # Not PII
"quantity": "1.00", # Not PII
},
"vcp_privacy": {
"encryption_key_id": "trader_42_key_v1",
"anonymization_level": "PSEUDONYMIZED",
"gdpr_basis": "legitimate_interest",
"account_name": account_name_encrypted, # Ciphertext
"email": email_encrypted, # Ciphertext
}
}
# On GDPR erasure request: destroy the key
# The VCP hash chain remains intact
# The Merkle tree remains valid
# The external anchoring remains verified
# But the PII is now irrecoverable
result = shredder.execute_erasure("trader_42")
# {"status": "ERASED", "key_destroyed": True, "audit_trail_intact": True}
The hash chain, Merkle tree, and external anchoring all remain valid because they were computed over the ciphertext, not the plaintext. The cryptographic structure is preserved. Only the human-readable PII is destroyed.
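A quick way to convince yourself: compute the payload digest before and after key destruction. This sketch reuses the `CryptoShredding` class defined above and, as a simplification, uses sorted-key `json.dumps` in place of full RFC 8785 canonicalization.

```python
import hashlib
import json

shredder = CryptoShredding()
payload = {
    "order_id": "ORD-001",
    "account_name": shredder.encrypt_pii("trader_42", "John Smith"),  # ciphertext
}

def payload_digest(p: dict) -> str:
    # Sorted-key json.dumps as a simplified stand-in for RFC 8785.
    return hashlib.sha256(
        json.dumps(p, sort_keys=True, separators=(",", ":")).encode()
    ).hexdigest()

before = payload_digest(payload)
shredder.execute_erasure("trader_42")   # GDPR erasure: the key is destroyed
after = payload_digest(payload)

assert before == after                  # the hash chain still verifies
# decrypt_pii("trader_42", ...) now raises KeyError: the PII is irrecoverable
```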
Part 12: The Sidecar Pattern
The sidecar pattern is VCP's answer to the integration cost problem. Modifying a production trading system to add audit logging is expensive, risky, and often politically impossible. The sidecar avoids all of this.
Architecture Principles
┌────────────────────────────────────────────────────────────┐
│ Trading System │
│ │
│ ┌──────────┐ ┌──────────┐ ┌──────────┐ │
│ │ Algo │────▶│ Order │────▶│ FIX │─────▶ Venue
│ │ Engine │ │ Manager │ │ Engine │ │
│ └────┬─────┘ └────┬─────┘ └────┬─────┘ │
│ │ │ │ │
│ │ ┌───────────┴────────────────┘ │
│ │ │ Event Tap (async, non-blocking) │
│ ▼ ▼ │
│ ┌─────────────────────────────────────┐ │
│ │ VCP Sidecar Process │ │
│ │ ┌─────────┐ ┌─────────┐ ┌───────┐ │ │
│ │ │ Collect │▶│ Hash │▶│ Queue │ │───▶ VCC/Anchor │
│ │ │ Events │ │ Chain │ │ Batch │ │ │
│ │ └─────────┘ └─────────┘ └───────┘ │ │
│ └─────────────────────────────────────┘ │
└────────────────────────────────────────────────────────────┘
Key: FIX messages flow unchanged. Zero latency impact.
Like installing a dashcam — doesn't change how you drive.
Properties
- Non-invasive: No modification to existing trading logic
- Asynchronous: Event capture is non-blocking
- Independent: Sidecar crash doesn't affect trading
- Platform-agnostic: Works with MT4/MT5, cTrader, Interactive Brokers, FIX engines
- Offline-capable: Local caching during network outages
FIX Protocol Integration (C++ Example)
// QuickFIX callback — add VCP tap without touching order logic
void Application::onMessage(
const FIX44::ExecutionReport& msg,
const FIX::SessionID& session)
{
// ── Normal processing (unchanged) ──
processExecution(msg);
// ── VCP sidecar tap (async, non-blocking) ──
vcpSidecar.emitAsync("EXE", extractVcpPayload(msg));
}
VcpPayload extractVcpPayload(const FIX44::ExecutionReport& msg) {
VcpPayload p;
msg.get(p.orderId); // Tag 37
msg.get(p.execId); // Tag 17
msg.get(p.lastPx); // Tag 31
msg.get(p.lastQty); // Tag 32
msg.get(p.transactTime); // Tag 60
return p;
}
Zero impact on your critical path. The emitAsync call enqueues the event and returns immediately. The sidecar process handles hashing, signing, batching, and anchoring in its own thread.
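For readers without a FIX stack, the same idea in Python: a bounded queue plus a background worker. The class and file names below are illustrative rather than part of the VCP SDK; the point is that the trading thread only ever pays for a `put_nowait`, and overflow spills to a local file so nothing is silently dropped.

```python
import json
import queue
import threading
import time

class SidecarTap:
    """Illustrative application-side tap: enqueue and return immediately."""

    def __init__(self, maxsize: int = 10_000):
        self._events: queue.Queue = queue.Queue(maxsize=maxsize)
        threading.Thread(target=self._drain, daemon=True).start()

    def emit_async(self, event_type: str, payload: dict) -> None:
        event = {"type": event_type, "payload": payload, "ts_ns": time.time_ns()}
        try:
            self._events.put_nowait(event)           # never blocks the trading thread
        except queue.Full:
            with open("vcp_spill.jsonl", "a") as f:  # offline/overflow cache
                f.write(json.dumps(event) + "\n")

    def _drain(self) -> None:
        while True:
            event = self._events.get()
            # Hand off to the VCP pipeline here: hash chain -> Merkle batch -> anchor.
            _ = event
```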
Part 13: Conformance Testing
VCP defines 125 conformance tests organized into 9 categories. Passing these tests is a prerequisite for VC-Certified status.
┌───────────────────────────────────────────────┐
│ VCP Conformance Test Suite │
├───────────────────┬───────┬───────────────────┤
│ Category │ Tests │ Coverage │
├───────────────────┼───────┼───────────────────┤
│ Schema Validation │ 25 │ JSON structure │
│ UUID v7 │ 10 │ Format, ordering │
│ Timestamp │ 12 │ Dual format, sync │
│ Hash Chain │ 15 │ Genesis, continuity│
│ Signature │ 10 │ Ed25519 validity │
│ Merkle Proof │ 8 │ Inclusion proofs │
│ Event Type │ 20 │ Trade lifecycle │
│ Integration │ 15 │ End-to-end API │
│ Performance │ 10 │ Throughput, latency│
├───────────────────┼───────┼───────────────────┤
│ Total │ 125 │ │
└───────────────────┴───────┴───────────────────┘
Certification Pass Rates:
Silver: 95% (119/125)
Gold: 98% (123/125)
Platinum: 100% (125/125)
Example: Hash Chain Verification Test
def test_hash_chain_continuity():
"""
Verify that each event's prev_hash matches the
preceding event's event_hash.
"""
events = load_event_chain("test_fixtures/trade_lifecycle.jsonl")
for i, event in enumerate(events):
if i == 0:
# Genesis event: prev_hash must be 64 zeros
assert event["security"]["prev_hash"] == "0" * 64, \
f"Genesis prev_hash must be 64 zeros, got: {event['security']['prev_hash']}"
else:
# Chain continuity: prev_hash == previous event_hash
expected = events[i - 1]["security"]["event_hash"]
actual = event["security"]["prev_hash"]
assert actual == expected, \
f"Chain break at event {i}: expected {expected[:16]}..., got {actual[:16]}..."
# Verify event_hash is correctly computed
computed = calculate_event_hash(
event["header"],
event["payload"],
event["security"]["prev_hash"]
)
assert computed == event["security"]["event_hash"], \
f"Hash mismatch at event {i}: computed {computed[:16]}..., stored {event['security']['event_hash'][:16]}..."
print(f"✓ Hash chain valid: {len(events)} events, no breaks")
Running Tests
# Install conformance suite
pip install vcp-conformance
# Run all tests for Silver tier
vcp-test run --tier silver
# Run specific category
vcp-test run --category hash-chain
# Generate certification report
vcp-test report --format pdf --output certification_report.pdf
Part 14: Regulatory Mapping
VCP isn't just a technical exercise — it maps directly to regulatory requirements that are converging globally:
EU AI Act Article 12 (Effective 2026-2027)
| Requirement | VCP Component |
|---|---|
| Automatic recording of events | VCP-CORE mandatory event capture |
| Traceability to identify risks | trace_id + hash chain + Merkle proofs |
| Post-market monitoring | VCP Explorer API for real-time queries |
| Facilitate monitoring of operation | VCP-RISK, VCP-GOV governance metadata |
MiFID II RTS 25
| Requirement | VCP Component |
|---|---|
| Clock synchronization (100μs HFT, 1ms algo) | clock_sync_status: PTP_LOCKED, NTP_SYNCED |
| Timestamp granularity | timestamp_precision: NANOSECOND/MICROSECOND/MILLISECOND |
| Record keeping (5-7 years) | Immutable hash chain + external anchoring |
| Annual traceability review | Conformance test reports |
GDPR
| Requirement | VCP Component |
|---|---|
| Right to erasure | Crypto-shredding (VCP-PRIVACY) |
| Pseudonymization | SHA-256 account_id hashing (see the sketch after this table) |
| Data minimization | Configurable payload fields |
| Retention limits | Key lifecycle management |
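A minimal sketch of the pseudonymization row referenced above. The per-deployment salt is an added precaution against dictionary attacks on short account IDs, not something the specification mandates.

```python
import hashlib

def pseudonymize_account_id(account_id: str, salt: str = "") -> str:
    """SHA-256 pseudonymization of an account identifier before it is logged.

    The salt guards against dictionary attacks on short, guessable IDs;
    manage it with the same care as the crypto-shredding keys in Part 11.
    """
    return hashlib.sha256((salt + account_id).encode()).hexdigest()

print(pseudonymize_account_id("ACC-100234", salt="per-deployment-salt"))
```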
SEC Rule 17a-4 (US)
| Requirement | VCP Component |
|---|---|
| WORM (Write Once, Read Many) | Hash chain + Merkle tree + external anchoring |
| Non-rewritable, non-erasable | Cryptographic immutability |
| Index and access capability | VCP Explorer API queries |
VCP achieves approximately 87% coverage across these four regulatory frameworks with its current component set. The remaining gaps relate to real-time alerting thresholds and AI explainability depth requirements.
Resources and Next Steps
Specification and Code
- VCP v1.1 Specification: github.com/veritaschain/vcp-spec/tree/main/spec/v1.1
- IETF Internet-Draft: draft-kamimura-scitt-vcp
- GitHub Organization: github.com/veritaschain
- SDK Specification: vcp-sdk-spec
Quick Start
# Python
pip install vcp-sdk
python -c "from vcp_sdk import VcpEventFactory; print('VCP ready')"
# TypeScript
npm install @veritaschain/vcp-sdk
npx ts-node -e "import {VcpEventFactory} from '@veritaschain/vcp-sdk'; console.log('VCP ready')"
# MQL5
# Copy vcp-mql-bridge to MQL5/Include/VCP/
# Add #include <VCP/VCPLogger.mqh> to your EA
Licensing
- Specification: CC BY 4.0 International (free to use, modify, distribute)
- Code: Apache 2.0 (free for commercial use)
No vendor lock-in. No proprietary formats. No licensing fees.
Get Involved
- Technical questions: technical@veritaschain.org
- GitHub Issues: github.com/veritaschain/vcp-spec/issues
- VC-Certified program: certification@veritaschain.org
TL;DR
| Layer | What It Proves | How |
|---|---|---|
| L1: Event Integrity | Individual events weren't modified | SHA-256 hash of RFC 8785 canonical JSON |
| L2: Collection Integrity | No events were deleted or added | RFC 6962 Merkle trees with inclusion proofs |
| L3: External Verifiability | When records existed and who created them | Ed25519 signatures + RFC 3161 TSA/blockchain anchoring |
| Integration | Language | Architecture |
|---|---|---|
| Backend services | Python, TypeScript | Full SDK (self-signing) |
| MT5 Expert Advisors | MQL5 | Sidecar bridge (delegated) |
| FIX trading engines | C++ hook | Async event tap |
The next time an auditor asks "can you prove it?", your answer changes from "trust us" to "here's the Merkle proof."
This article describes the VeritasChain Protocol (VCP), an open standard developed by the VeritasChain Standards Organization (VSO). VCP is published under open licenses and welcomes community contribution.
Found a bug in the code? We'd rather know now than after deployment. Open an issue on GitHub or reach out at technical@veritaschain.org.