Building a Tamper-Evident Audit Log with SHA-256 Hash Chains (Zero Dependencies)
Ever wondered how blockchains detect tampering? The core mechanism is surprisingly simple: hash chains. Each record includes the hash of the previous record, creating a mathematical dependency that breaks if anything changes.
In this post, I'll walk through a complete implementation of tamper-evident logging using only vanilla JavaScript and the Web Crypto API. No npm packages, no build tools, no external dependencies.
Live Demo: veritaschain.org/regtech/demo
Source Code: All code shown below is self-contained and copy-pasteable.
Table of Contents
- The Problem We're Solving
- Hash Chain Architecture
- Implementation: SHA-256 with Web Crypto
- Implementation: Canonical JSON
- Implementation: Event Generation
- Implementation: Chain Verification
- Implementation: Tampering Detection
- Bonus: ZIP File Generation (No Libraries)
- Full Source Code
The Problem We're Solving {#the-problem}
Traditional audit logs are just text. Anyone with database access can:
// The attack
log[5].price = "1.09999"; // Changed from "1.08523"
delete log[3]; // Evidence removed
[log[7], log[8]] = [log[8], log[7]]; // Order swapped
After these changes, the log looks identical to a legitimate one. No visible corruption. No format errors. Undetectable.
We want a system where any modification — even a single character — becomes mathematically detectable.
Hash Chain Architecture {#architecture}
The solution is elegant:
┌──────────────┐ ┌──────────────┐ ┌──────────────┐
│ Event 0 │ │ Event 1 │ │ Event 2 │
├──────────────┤ ├──────────────┤ ├──────────────┤
│ prev: GENESIS│────▶│ prev: a3f2.. │────▶│ prev: 7bc1.. │
│ data: {...} │ │ data: {...} │ │ data: {...} │
│ hash: a3f2.. │ │ hash: 7bc1.. │ │ hash: e9f0.. │
└──────────────┘ └──────────────┘ └──────────────┘
Each event stores:
- prev_hash: The hash of the previous event (or "GENESIS" for the first)
- event_hash: SHA-256 of prev_hash + canonical(event_data)
If Event 1's data changes:
- Its hash changes from 7bc1.. to something else
- Event 2's prev_hash no longer matches
- Chain breaks → Tampering detected
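Concretely, a two-event chain has this shape (an illustrative sketch only: hashes are truncated, payloads omitted, and the event types are made up):
// Illustrative only: real hashes are 64 hex characters
[
  {
    event_id: '...', event_type: 'ORDER_NEW', payload: { /* ... */ },
    prev_hash: 'GENESIS',
    event_hash: 'a3f2...'  // sha256('GENESIS' + canonical(core fields))
  },
  {
    event_id: '...', event_type: 'ORDER_ACK', payload: { /* ... */ },
    prev_hash: 'a3f2...',  // copy of the previous event_hash
    event_hash: '7bc1...'  // sha256('a3f2...' + canonical(core fields))
  }
]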
Implementation: SHA-256 with Web Crypto {#sha256}
The Web Crypto API gives us native SHA-256. No libraries needed.
async function sha256(message) {
// Encode string to Uint8Array
const encoder = new TextEncoder();
const data = encoder.encode(message);
// Compute SHA-256 digest
const hashBuffer = await crypto.subtle.digest('SHA-256', data);
// Convert ArrayBuffer to hex string
const hashArray = Array.from(new Uint8Array(hashBuffer));
return hashArray.map(b => b.toString(16).padStart(2, '0')).join('');
}
// Usage
const hash = await sha256('hello world');
// Returns: "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9"
Key points:
- crypto.subtle is available in all modern browsers (Chrome 60+, Firefox 57+, Safari 11+), but only in secure contexts (HTTPS or localhost)
- Returns a Promise (the digest is computed asynchronously)
- Output is 64 hex characters (256 bits)
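Because crypto.subtle simply isn't there outside a secure context, a small guard at startup (a defensive sketch, not part of the demo) fails loudly instead of with a confusing digest error:
// crypto.subtle is undefined on plain-HTTP pages (outside secure contexts)
if (!globalThis.crypto?.subtle) {
  throw new Error('Web Crypto unavailable: serve this page over HTTPS or from localhost');
}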
Implementation: Canonical JSON {#canonical-json}
Here's the gotcha: JSON.stringify() serializes keys in property insertion order, not in any canonical order, so two logically identical objects built in different orders produce different strings.
JSON.stringify({b: 2, a: 1}); // '{"b":2,"a":1}'
JSON.stringify({a: 1, b: 2}); // '{"a":1,"b":2}' (same data, different string)
Different strings → different hashes → false tampering alerts.
We need canonical JSON: deterministic stringification with sorted keys.
function canonicalStringify(obj) {
// Primitives: use standard stringify
if (obj === null || typeof obj !== 'object') {
return JSON.stringify(obj);
}
// Arrays: preserve order, recurse on elements
if (Array.isArray(obj)) {
return '[' + obj.map(canonicalStringify).join(',') + ']';
}
// Objects: sort keys alphabetically, recurse on values
const sortedKeys = Object.keys(obj).sort();
const pairs = sortedKeys.map(key =>
JSON.stringify(key) + ':' + canonicalStringify(obj[key])
);
return '{' + pairs.join(',') + '}';
}
// Test it
canonicalStringify({zebra: 1, apple: {banana: 2, avocado: 1}});
// Always returns: '{"apple":{"avocado":1,"banana":2},"zebra":1}'
Key points:
- Keys sorted alphabetically at every nesting level
- Arrays maintain their order (only objects get sorted)
- No whitespace (compact format)
- Recursively handles nested structures
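A quick determinism check ties this back to the gotcha above: build the same data in two different orders, and both the canonical strings and the hashes agree.
// Same data, different insertion order
const first  = canonicalStringify({ b: 2, a: 1 });
const second = canonicalStringify({ a: 1, b: 2 });
console.log(first === second);                             // true
console.log(await sha256(first) === await sha256(second)); // true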
Implementation: Event Generation {#event-generation}
Now let's create a chain of events:
const GENESIS_HASH = 'GENESIS';
// Compute hash for a single event
async function computeEventHash(event, prevHash) {
// Only hash the core fields (exclude prev_hash and event_hash)
const eventCore = {
event_id: event.event_id,
trace_id: event.trace_id,
timestamp_ms: event.timestamp_ms,
event_type: event.event_type,
payload: event.payload
};
const canonical = canonicalStringify(eventCore);
const input = prevHash + canonical;
return await sha256(input);
}
// Generate a chain of events
async function generateEventChain(eventDataArray) {
const chain = [];
let prevHash = GENESIS_HASH;
for (const data of eventDataArray) {
const event = {
event_id: generateEventId(),
trace_id: data.trace_id,
timestamp_ms: Date.now(),
event_type: data.type,
payload: data.payload,
prev_hash: prevHash,
event_hash: '' // Will be computed
};
event.event_hash = await computeEventHash(event, prevHash);
chain.push(event);
prevHash = event.event_hash;
}
return chain;
}
// UUIDv7-like event ID (time-sortable)
function generateEventId() {
const timestamp = Date.now();
const hex = timestamp.toString(16).padStart(12, '0');
// 5 × 4 = 20 random hex chars; the last three ID groups consume 3 + 4 + 12 = 19 of them
const random = Array.from({ length: 5 }, () =>
Math.floor(Math.random() * 65536).toString(16).padStart(4, '0')
).join('');
return `${hex.slice(0, 8)}-${hex.slice(8, 12)}-7${random.slice(0, 3)}-${random.slice(3, 7)}-${random.slice(7, 19)}`;
}
Why UUIDv7-like IDs?
- First 48 bits are timestamp → lexicographic sort = chronological sort
- The 7 indicates the version 7 format
- Remaining bits are random for uniqueness
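Putting it together, here's a usage sketch (the trade-flow event types and payloads are made up for illustration); the resulting array is the originalEvents chain that the tampering scenarios below start from:
const originalEvents = await generateEventChain([
  { trace_id: 'T-001', type: 'ORDER_NEW',   payload: { symbol: 'EURUSD', qty: 100000 } },
  { trace_id: 'T-001', type: 'ORDER_ACK',   payload: { status: 'accepted' } },
  { trace_id: 'T-001', type: 'ORDER_ROUTE', payload: { venue: 'LP-1' } },
  { trace_id: 'T-001', type: 'ORDER_FILL',  payload: { price: '1.08523', qty: 100000 } }
]);
console.log(originalEvents[0].prev_hash);                                  // 'GENESIS'
console.log(originalEvents[1].prev_hash === originalEvents[0].event_hash); // true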
Implementation: Chain Verification {#verification}
The verification algorithm checks two things for each event:
- Linkage: Does prev_hash match the previous event's event_hash?
- Integrity: Does event_hash match a fresh recomputation from the event's own data?
async function verifyChain(events) {
const results = {
valid: true,
checks: [],
firstFailure: null
};
for (let i = 0; i < events.length; i++) {
const event = events[i];
// Check 1: prev_hash linkage
const expectedPrevHash = i === 0 ? GENESIS_HASH : events[i - 1].event_hash;
const linkageValid = event.prev_hash === expectedPrevHash;
results.checks.push({
index: i,
check: 'prev_hash linkage',
pass: linkageValid,
detail: linkageValid
? `Links to ${i === 0 ? 'GENESIS' : 'Event #' + (i-1)}`
: `Expected ${expectedPrevHash.slice(0,16)}..., got ${event.prev_hash.slice(0,16)}...`
});
if (!linkageValid && !results.firstFailure) {
results.valid = false;
results.firstFailure = { index: i, reason: 'prev_hash mismatch' };
}
// Check 2: event_hash integrity
const recomputed = await computeEventHash(event, event.prev_hash);
const integrityValid = event.event_hash === recomputed;
results.checks.push({
index: i,
check: 'event_hash integrity',
pass: integrityValid,
detail: integrityValid
? `Hash verified: ${event.event_hash.slice(0,24)}...`
: `Content modified - hash mismatch`
});
if (!integrityValid && !results.firstFailure) {
results.valid = false;
results.firstFailure = {
index: i,
reason: 'event_hash mismatch',
expected: recomputed,
actual: event.event_hash
};
}
}
return results;
}
Verification complexity: O(n) — one SHA-256 computation per event.
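As a baseline, an untampered chain (the originalEvents sketch from above) passes every check:
const baseline = await verifyChain(originalEvents);
console.log(baseline.valid);         // true
console.log(baseline.firstFailure);  // null
console.log(baseline.checks.length); // 2 checks per event: linkage + integrity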
Implementation: Tampering Detection {#tampering}
Let's implement three attack scenarios:
1. Payload Modification (Post-hoc Edit)
function tamperModify(events, targetIndex, field, newValue) {
const tampered = JSON.parse(JSON.stringify(events)); // Deep clone
// Modify the field
tampered[targetIndex].payload[field] = newValue;
// Note: We're NOT recalculating the hash
// This simulates an attacker who changes data but can't update hashes
// (because they don't know the future events' data)
return tampered;
}
// Example: Change execution price
const tampered = tamperModify(originalEvents, 3, 'price', '1.09999');
const result = await verifyChain(tampered);
// result.valid = false
// result.firstFailure = { index: 3, reason: 'event_hash mismatch' }
2. Event Deletion (Omission)
function tamperDelete(events, targetIndex) {
const tampered = JSON.parse(JSON.stringify(events));
tampered.splice(targetIndex, 1); // Remove event
return tampered;
}
// Example: Delete ACK event
const tampered = tamperDelete(originalEvents, 2);
const result = await verifyChain(tampered);
// result.valid = false
// result.firstFailure = { index: 2, reason: 'prev_hash mismatch' }
// (Event #3's prev_hash points to deleted Event #2's hash)
3. Event Reordering (Sequence Tampering)
function tamperSwap(events, index1, index2) {
const tampered = JSON.parse(JSON.stringify(events));
[tampered[index1], tampered[index2]] = [tampered[index2], tampered[index1]];
return tampered;
}
// Example: Swap events 2 and 3
const tampered = tamperSwap(originalEvents, 2, 3);
const result = await verifyChain(tampered);
// result.valid = false
// (Both events now have incorrect prev_hash values)
Bonus: ZIP File Generation (No Libraries) {#zip}
For the evidence pack feature, I needed ZIP generation without JSZip or similar. Here's a pure JS implementation:
async function createZip(files) {
const encoder = new TextEncoder();
let offset = 0;
const localHeaders = [];
const centralDirectory = [];
// Build local file headers and content
for (const file of files) {
const content = encoder.encode(file.content);
const nameBytes = encoder.encode(file.name);
// Local file header (30 bytes + filename)
const localHeader = new Uint8Array(30 + nameBytes.length);
const view = new DataView(localHeader.buffer);
view.setUint32(0, 0x04034b50, true); // Local file signature
view.setUint16(4, 20, true); // Version needed
view.setUint16(6, 0, true); // Flags
view.setUint16(8, 0, true); // Compression (none)
view.setUint16(10, 0, true); // Mod time
view.setUint16(12, 0, true); // Mod date
view.setUint32(14, crc32(content), true); // CRC-32
view.setUint32(18, content.length, true); // Compressed size
view.setUint32(22, content.length, true); // Uncompressed size
view.setUint16(26, nameBytes.length, true); // Filename length
view.setUint16(28, 0, true); // Extra field length
localHeader.set(nameBytes, 30);
localHeaders.push({ header: localHeader, content, name: nameBytes, offset });
offset += localHeader.length + content.length;
}
// Build central directory
const cdStart = offset;
for (const item of localHeaders) {
const cdHeader = new Uint8Array(46 + item.name.length);
const view = new DataView(cdHeader.buffer);
view.setUint32(0, 0x02014b50, true); // Central directory signature
view.setUint16(4, 20, true); // Version made by
view.setUint16(6, 20, true); // Version needed
view.setUint32(16, crc32(item.content), true); // CRC-32 (same as local header)
view.setUint32(20, item.content.length, true); // Compressed size
view.setUint32(24, item.content.length, true); // Uncompressed size
view.setUint16(28, item.name.length, true); // Filename length
// Flags, compression, mod time/date, extra/comment lengths, disk number, attributes stay 0 (zero-filled buffer)
view.setUint32(42, item.offset, true); // Offset to local header
cdHeader.set(item.name, 46);
centralDirectory.push(cdHeader);
offset += cdHeader.length;
}
// End of central directory
const eocd = new Uint8Array(22);
const eocdView = new DataView(eocd.buffer);
eocdView.setUint32(0, 0x06054b50, true); // EOCD signature
eocdView.setUint16(8, files.length, true); // Entries on this disk
eocdView.setUint16(10, files.length, true); // Total entries
eocdView.setUint32(12, offset - cdStart, true); // CD size
eocdView.setUint32(16, cdStart, true); // CD offset
// Combine all parts
const result = new Uint8Array(offset + 22);
let pos = 0;
for (const item of localHeaders) {
result.set(item.header, pos);
pos += item.header.length;
result.set(item.content, pos);
pos += item.content.length;
}
for (const cd of centralDirectory) {
result.set(cd, pos);
pos += cd.length;
}
result.set(eocd, pos);
return result;
}
// CRC-32 implementation
function crc32(data) {
const table = new Uint32Array(256);
for (let i = 0; i < 256; i++) {
let c = i;
for (let j = 0; j < 8; j++) {
c = (c & 1) ? (0xEDB88320 ^ (c >>> 1)) : (c >>> 1);
}
table[i] = c;
}
let crc = 0xFFFFFFFF;
for (let i = 0; i < data.length; i++) {
crc = table[(crc ^ data[i]) & 0xFF] ^ (crc >>> 8);
}
return (crc ^ 0xFFFFFFFF) >>> 0;
}
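And here's how the resulting bytes might become a download in the browser (a sketch; the file names and contents are placeholders, not the demo's actual evidence-pack layout):
const zipBytes = await createZip([
  { name: 'events.json',       content: JSON.stringify(originalEvents, null, 2) },
  { name: 'verification.json', content: JSON.stringify(await verifyChain(originalEvents), null, 2) }
]);

const blob = new Blob([zipBytes], { type: 'application/zip' });
const link = document.createElement('a');
link.href = URL.createObjectURL(blob);
link.download = 'evidence-pack.zip';
link.click();
URL.revokeObjectURL(link.href);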
Why bother? Zero dependencies means the demo works offline, loads instantly, and has no supply chain risk.
Full Source Code {#full-source}
The complete implementation is ~500 lines of vanilla JS:
- Demo: veritaschain.org/regtech/demo
- GitHub: github.com/veritaschain
File structure:
├── index.html (80 lines)
├── style.css (300 lines)
└── app.js (500 lines)
Key Takeaways
Hash chains are simple. The core concept is ~50 lines of code.
Canonical JSON is essential. Without deterministic serialization, you get false positives.
Web Crypto API is underrated. SHA-256, AES, ECDSA — all available natively.
Zero dependencies is achievable. Even ZIP generation can be done in pure JS.
Tamper-evidence ≠ tamper-proof. Anyone with write access can still modify data, and an attacker who recomputes every later hash can defeat a bare chain (which is why external anchoring, below, matters). But ordinary edits are now detectable.
What's Next?
This demo covers the basics. A production system would add:
- Digital signatures (Ed25519) for non-repudiation (see the sketch after this list)
- Merkle trees for efficient batch verification
- External anchoring (timestamp authority or blockchain)
- PTP time synchronization for microsecond precision
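To give a flavour of the first item: the list calls for Ed25519, but browser support for Ed25519 in crypto.subtle has historically lagged, so here's a hedged sketch using ECDSA P-256 (which Web Crypto supports everywhere) to sign and verify one event hash. This is not part of the demo:
// Sign one event's hash so a third party can check who produced it
const keyPair = await crypto.subtle.generateKey(
  { name: 'ECDSA', namedCurve: 'P-256' },
  false,               // private key stays non-extractable
  ['sign', 'verify']
);
const bytes = new TextEncoder().encode(originalEvents[0].event_hash);
const signature = await crypto.subtle.sign(
  { name: 'ECDSA', hash: 'SHA-256' }, keyPair.privateKey, bytes
);
const ok = await crypto.subtle.verify(
  { name: 'ECDSA', hash: 'SHA-256' }, keyPair.publicKey, signature, bytes
);
console.log(ok); // true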
The VeritasChain Protocol (VCP) specification covers all of these. If you're building audit infrastructure for trading systems, check out the IETF draft.
Try It Yourself
🔗 veritaschain.org/regtech/demo
Generate events. Apply tampering. Watch verification fail. Download the evidence pack.
All in your browser. No setup required.
Questions? Find me at technical@veritaschain.org or open an issue on GitHub.
Tags: #javascript #cryptography #webdev #security #sha256 #auditlog #webapi #vanillajs