The New Standard in Workplace Surveillance: Meta's Project Cadence
Meta has just implemented a workplace monitoring program that makes previous surveillance efforts look tame. On April 22, 2026, an internal memo codenamed "Project Cadence" leaked, revealing plans to capture every mouse movement, keystroke cadence, scroll action, and application switch from the entire workforce. While framed as "training internal productivity AI," this represents the most comprehensive workplace surveillance ever deployed by a major American tech company. The program is not optional in any meaningful sense: opting out requires manager approval and cuts engineers off from the AI-augmented tools that will soon be essential to their jobs.
What Project Cadence Actually Captures
The technical scope of Project Cadence is alarmingly broad. According to internal specifications reviewed by the author, the system collects:
// Hypothetical telemetry data structure
const telemetryEvent = {
  timestamp: 1677496800000,          // Unix epoch, milliseconds
  eventType: "keystroke",
  userId: "hashed_id",
  interKeyTiming: [45, 120, 80],     // milliseconds between keystrokes
  keyCombo: "Shift+Control+P",       // logged chord, not raw text content
  windowFocus: "VS Code",
  application: "ide",
  mousePosition: { x: 1204, y: 768 },
  clickLatency: 234                  // ms since last mouse movement
};
This includes raw keystroke streams with millisecond timing precision, mouse movement traces sampled at 60Hz, application focus events, IDE telemetry (including file operations and copy-paste events), meeting presence signals, and workplace communication metadata. Meta claims "no raw content is used for training," but timing and cadence data alone can identify individuals, infer health conditions, and reconstruct work patterns with disturbing fidelity. The distinction between "content" and "metadata" is legally meaningless in this context.
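The claim that timing data without content is harmless does not survive contact with basic statistics. Keystroke-dynamics research has long shown that even crude summaries of inter-key intervals can distinguish one typist from another. The sketch below is a toy illustration of that point, not Meta's actual pipeline; the function names (`buildProfile`, `profileDistance`) and the sample numbers are hypothetical:

```javascript
// Hypothetical sketch: identifying a typist from inter-key timing alone.
// A profile is just the mean and standard deviation of the intervals (ms).
function buildProfile(interKeyTimings) {
  const n = interKeyTimings.length;
  const mean = interKeyTimings.reduce((a, b) => a + b, 0) / n;
  const variance = interKeyTimings.reduce((a, b) => a + (b - mean) ** 2, 0) / n;
  return { mean, stdDev: Math.sqrt(variance) };
}

// Euclidean distance between profiles: smaller means "more likely same typist".
function profileDistance(p, q) {
  return Math.hypot(p.mean - q.mean, p.stdDev - q.stdDev);
}

// Two sessions from the same hypothetical typist, and one from a stranger.
const sessionA = buildProfile([45, 120, 80, 95, 60]);
const sessionB = buildProfile([50, 110, 85, 100, 55]); // same typist, later
const stranger = buildProfile([200, 310, 250, 280, 230]);

console.log(profileDistance(sessionA, sessionB)); // small
console.log(profileDistance(sessionA, stranger)); // large
```

Real identification systems use far richer features (per-digraph timings, hold times, pressure), but the toy version already matches the two sessions from the same typist over the stranger. That is the point: if five keystrokes of timing carry signal, a billion mouse traces and keystroke streams carry a behavioral fingerprint.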
The "AI Training" Rhetorical Laundromat
Meta's framing of this as an "AI training" initiative is a deliberate rhetorical strategy to legitimize surveillance that would otherwise face immediate backlash. This isn't about building better code completion models—one senior Meta ML researcher confirmed, "You don't need a billion mouse traces to build a code completion model. You need them to build behavioral profiles." The company learned from Microsoft's Productivity Score fiasco in 2020, which was rapidly abandoned after public outcry. By repackaging surveillance as AI training, Meta hopes to bypass the established norms and legal precedents that have constrained workplace monitoring for decades.
Historical Precedents and Why This Time Is Different
Meta's executives appear to be ignoring a graveyard of failed surveillance programs. Barclays' 2020 deployment of heat sensors under desks in London triggered immediate backlash and was removed within weeks. Activision Blizzard's employee monitoring contributed to one of the largest tech unionization waves in recent history. Even Xerox's keystroke logging in 1997 ultimately backfired, damaging engineering productivity and morale.
What makes Project Cadence different is its scale and the AI laundering technique. Previous surveillance was typically limited to specific departments or narrowly focused metrics. Meta's system captures comprehensive behavioral data across the entire workforce, and the AI framing creates new legal and ethical challenges. Unlike past incidents, this program affects highly skilled workers who understand exactly what data is being collected and how it could be used against them.
The author spoke with three Meta engineers on condition of anonymity. All used the same phrase: "This is the moment I update my resume." Given Meta's dominance in the tech industry and the precedent this sets, the industry will be watching closely to see how this unfolds. The most likely outcome may not be regulatory intervention, but rather a talent exodus as engineers vote with their feet against a workplace they no longer trust.
Originally published at NovVista