
Sid Wudraq for Octo Browser


Canvas, Audio and WebGL: analysis of fingerprinting technologies

Today, websites don’t rely on cookies alone to recognize visitors. Even if you clear your history, switch IPs or use private browsing mode, your browser can still be identified through a digital fingerprint. This is a hidden set of technical signals that gets generated automatically on every visit.

A browser fingerprint is a combination of parameters that describe your device and runtime environment. Individual attributes like browser version, screen resolution or language are shared by many users. But when combined, they form a highly distinctive profile. This profile is calculated in real time as the page loads, so it doesn’t depend on stored data. That’s why clearing cookies or using incognito mode doesn’t really change much.
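The way common attributes combine into one distinctive value can be sketched in a few lines. This is an illustrative sketch, not a real tracker's attribute set; the hash is the same simple 32-bit scheme used in the Canvas example later in this article:

```javascript
// Simple 32-bit string hash (same scheme as the Canvas example below)
function hash32(str) {
  let h = 0;
  for (let i = 0; i < str.length; i++) {
    h = (h << 5) - h + str.charCodeAt(i);
    h |= 0; // keep it a 32-bit integer
  }
  return h >>> 0; // unsigned, for readable hex output
}

// Combine attributes into one composite fingerprint.
// Keys are sorted so the same attributes always hash identically.
function compositeFingerprint(attrs) {
  const canonical = Object.keys(attrs).sort()
    .map(k => `${k}=${attrs[k]}`)
    .join(';');
  return hash32(canonical).toString(16);
}

// Each value alone is common; the combination is what narrows you down.
console.log(compositeFingerprint({
  userAgent: 'Mozilla/5.0 ...',
  language: 'en-US',
  screen: '1920x1080',
  timezone: 'Europe/Berlin',
}));
```

Change any single attribute and the composite hash changes, which is exactly why the combined profile is so much more distinctive than its parts.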

Beyond basic indicators such as user agent, timezone or interface language, modern fingerprinting relies on more advanced techniques like Canvas, AudioContext and WebGL. These methods use the browser’s graphics and audio stack to extract deeper information about your hardware and system. What makes them so accurate, and how do they actually distinguish one device from another? Let’s take a closer look.

Canvas fingerprinting

The Canvas API allows browsers to render graphics using JavaScript. In fingerprinting, this feature is used in a less obvious way. A script on a webpage creates an invisible canvas element and executes a series of drawing commands, such as rendering text, shapes, shadows, and gradients. The resulting image is then read as pixel data and converted into a hash that acts as a unique identifier.

Here’s how the process works in practice:

  • Canvas creation. The page dynamically creates a canvas element, usually outside the visible area.

  • Drawing. JavaScript draws test graphics. This often includes text with specific fonts, geometric shapes, lines, shadows, and gradients to trigger different rendering behaviors.

  • Reading the pixels. The script reads the output using methods like toDataURL(), which converts the canvas content into a string representation of the image.

  • Hashing. The data is passed through a hashing function, producing a fingerprint that gets sent to the server.

Let’s draw the word “Fingerprint” on a Canvas and compute a primitive hash:

let canvas = document.createElement("canvas");
let ctx = canvas.getContext("2d");
canvas.width = 200; canvas.height = 50;

ctx.textBaseline = "top";
ctx.font = "20px Arial";
ctx.fillStyle = "#f60";
ctx.fillText("Fingerprint", 10, 10);

let data = canvas.toDataURL();
let hash = 0;
for (let i = 0; i < data.length; i++) {
  hash = (hash << 5) - hash + data.charCodeAt(i);
  hash |= 0;
}
console.log("Canvas hash:", hash);

This code outputs a 32-bit number (it can be negative due to overflow) that depends on exactly how the browser renders the text. In a different browser, such as Opera in this example, the result may change. Occasionally the hashes match, but that's relatively rare.

On a different machine, the chances of getting a different result are even higher, although matches are still possible in edge cases.

As shown in the screenshots below, even on the same device but across different browsers, the hash is not the same. In one case, the word “Fingerprint” was rendered in Chrome, and in the other, in Opera. You can try running this on your own machine and compare the results.

[Screenshot: Canvas hash computed in Chrome]

[Screenshot: Canvas hash computed in Opera]

The differences in a Canvas hash don’t come from the text itself. Visually, the word “Fingerprint” looks the same on almost any device. The uniqueness comes from how it’s rendered under the hood.

Every OS and browser handles font smoothing, curves, hinting, and rasterization a bit differently. Then you add GPU differences, drivers, and hardware quirks, and you end up with tiny variations in the final image. On one device a pixel might be slightly lighter, on another slightly darker, or edges may be smoothed in a different way. You won’t notice it visually, but the hash will.

To increase entropy, websites don’t just render simple text. They often use strings that cover most of the alphabet, such as “Cwm fjordbank glyphs vext quiz”. On top of that, they may add shadows, gradients, or colored shapes to extract more pixel-level differences.
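A sketch of what such an entropy-boosting scene might look like, assuming a 2D canvas context (the specific effects and values here are illustrative, not any particular tracker's exact scene):

```javascript
// Sketch: draw an entropy-rich scene on a 2D canvas context.
// Pangram text, a shadow, a gradient, and an overlapping shape each
// exercise a different part of the rasterizer.
function drawEntropyScene(ctx, width, height) {
  ctx.textBaseline = 'alphabetic';
  ctx.font = '16px Arial';

  // Shadowed pangram-style text covers many glyph shapes at once
  ctx.shadowColor = 'rgba(0, 0, 255, 0.5)';
  ctx.shadowBlur = 4;
  ctx.fillStyle = '#069';
  ctx.fillText('Cwm fjordbank glyphs vext quiz', 4, 20);
  ctx.shadowBlur = 0;

  // Gradient fill exposes color-interpolation differences
  const grad = ctx.createLinearGradient(0, 0, width, 0);
  grad.addColorStop(0, '#ff0000');
  grad.addColorStop(1, '#0000ff');
  ctx.fillStyle = grad;
  ctx.fillRect(0, 30, width, 10);

  // A semi-transparent circle forces alpha blending over the gradient
  ctx.globalAlpha = 0.7;
  ctx.beginPath();
  ctx.arc(width / 2, 35, 12, 0, Math.PI * 2);
  ctx.fillStyle = '#f60';
  ctx.fill();
  ctx.globalAlpha = 1;
}
```

In a browser you would call `drawEntropyScene(canvas.getContext('2d'), canvas.width, canvas.height)` and then hash `canvas.toDataURL()` exactly as in the earlier example.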

The result is a stable hash string (something like e3d52382d0…) that stays consistent across visits, as long as the browser environment doesn’t change. In theory, two machines with identical fonts, GPUs, and drivers could produce the same fingerprint, but in reality that almost never happens. The combination of rendering factors is almost always unique, which makes Canvas a very effective tracking method.

AudioContext fingerprinting

Another way to generate a unique device identifier is audio fingerprinting. This method uses the Web Audio API, but not for recording from a microphone or playing sound. Everything happens entirely inside the browser, and the user doesn’t notice a thing.

The script generates and processes a synthetic audio signal, then extracts numeric values that reflect characteristics of the hardware and software environment. The result is a stable “audio” identifier, similar to a Canvas hash, but based on sound data.

Here’s the idea behind it:

  1. AudioContext. The script initializes a hidden audio context, usually an OfflineAudioContext. Think of it as a virtual sound processor that runs calculations in memory without sending anything to your speakers.

  2. Signal generation. Inside this context, an oscillator (like OscillatorNode) produces a signal at a fixed frequency, often around 1000 Hz. This is a clean, synthetic tone generated directly in the browser.

  3. Effects processing. To make differences more noticeable, the signal is passed through effects. A common choice is a compressor (DynamicsCompressorNode), which alters the waveform. By tweaking parameters like threshold, ratio, attack, and release, the output becomes dependent on the system.

  4. Rendering and reading. The audio context processes a short chunk of sound. Once rendering is done, the script gets an array of sample values. For example, at a 44,100 Hz sample rate, a buffer of 5,000 samples covers roughly 0.11 seconds of audio.

  5. Fingerprint computation. The sample array is reduced to a compact number. One simple approach is summing the absolute values and taking the most significant digits. That number becomes the audio fingerprint.

Here’s a simple example you can run in your browser console to generate your own audio fingerprint:

(async () => {
  const AC = window.OfflineAudioContext || window.webkitOfflineAudioContext;
  const ctx = new AC(1, 5000, 44100);

  const osc = ctx.createOscillator();
  osc.type = 'triangle';
  osc.frequency.value = 1000;

  const comp = ctx.createDynamicsCompressor();
  comp.threshold.value = -50;
  comp.knee.value = 40;
  comp.ratio.value = 12;
  comp.attack.value = 0;
  comp.release.value = 0.25;

  osc.connect(comp);
  comp.connect(ctx.destination);
  osc.start(0);

  const rendered = await ctx.startRendering();

  const samples = rendered.getChannelData(0);

  let acc = 0;
  for (let i = 0; i < samples.length; i++) acc += Math.abs(samples[i]);
  const demo = Math.round(acc * 1e6) / 1e6;

  const buf = samples.buffer.slice(
    samples.byteOffset,
    samples.byteOffset + samples.byteLength
  );
  const hashBuf = await crypto.subtle.digest('SHA-256', buf);
  const hashHex = Array.from(new Uint8Array(hashBuf))
    .map(b => b.toString(16).padStart(2, '0'))
    .join('');

  console.log('Audio demo sum:', demo);
  console.log('Audio SHA-256 :', hashHex);
})();

The same device usually produces the same audio fingerprint across different browsers. For example, in Chrome and Opera the resulting value matched (in our case, 953.152941). You can run the script yourself and check your own result.

[Screenshot: audio fingerprint computed in Chrome]

[Screenshot: audio fingerprint computed in Opera]

Browsers have started to introduce protections against this kind of tracking. Apple was one of the first to implement it: starting with Safari 17, the browser adds slight randomness to audio processing in private browsing mode, which makes the fingerprint change from session to session.

In most other browsers, audio fingerprints can still be collected without much difficulty. When combined with Canvas and WebGL, they significantly increase the accuracy of device identification.
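That kind of randomization is itself detectable: a site can run the same deterministic pipeline twice and compare the results. A minimal sketch of the comparison (the sample values are hypothetical, echoing the sum from the earlier example):

```javascript
// Sketch: if a browser adds per-session noise to audio processing,
// two runs of the same deterministic pipeline disagree slightly.
function looksRandomized(runA, runB, epsilon = 1e-9) {
  // Same hardware + deterministic pipeline => near-identical sums
  return Math.abs(runA - runB) > epsilon;
}

// Hypothetical values: a deterministic browser repeats exactly,
// a noise-adding one drifts between runs.
console.log(looksRandomized(953.152941, 953.152941)); // deterministic run
console.log(looksRandomized(953.152941, 953.152198)); // randomized run
```

This is why blunt randomization tends to trade trackability for distinctiveness: the noise itself becomes a signal.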

WebGL fingerprinting

HTML5 WebGL gives the browser access to hardware-accelerated 3D rendering through a canvas element with a WebGL context. It was originally designed for interactive graphics, but over time it has also become a powerful source of fingerprinting data.

While Canvas captures differences in 2D rendering, WebGL goes deeper and operates at the GPU level. Because of this, it can reveal much more about the system, including the graphics card model, driver version, and subtle details of how a specific GPU behaves. In some cases, it can even distinguish between two machines with the same graphics hardware.

A typical WebGL fingerprint is built in a few steps.

First, the script initializes a WebGL context, for example via canvas.getContext("webgl2"). At this stage, it can already access some basic environment data like the GPU vendor and renderer, driver details, and supported extensions.

Next comes rendering. A hidden canvas is used to draw a 3D scene or a set of primitives. This usually involves shapes, shaders, lighting, and textures — enough to engage different parts of the graphics pipeline.

Once rendering is complete, the script collects the output. It reads pixel data using methods like gl.readPixels and queries various WebGL parameters such as supported extensions, maximum texture size, shader precision, and VENDOR/RENDERER strings. Together, these values form a kind of snapshot of the graphics environment.

Finally, all collected data is combined and hashed, often using something like SHA-256. The resulting hash becomes the WebGL fingerprint, which can then be sent to the server and reused to identify the same device across sessions.

Here’s a simplified example of how such a hash can be generated:

(async () => {
  const cv = document.createElement('canvas');
  cv.width = 400; cv.height = 200;
  const gl = cv.getContext('webgl2', {antialias:true})
        || cv.getContext('webgl', {antialias:true});
  if (!gl) { console.log('WebGL not available'); return; }

  const info = {};
  const dbg = gl.getExtension('WEBGL_debug_renderer_info');
  if (dbg) {
    info.vendor = gl.getParameter(dbg.UNMASKED_VENDOR_WEBGL);
    info.renderer = gl.getParameter(dbg.UNMASKED_RENDERER_WEBGL);
  } else {
    info.vendor = '(masked)';
    info.renderer = '(masked)';
  }
  info.version = gl.getParameter(gl.VERSION);
  info.glsl = gl.getParameter(gl.SHADING_LANGUAGE_VERSION);
  info.maxTex = gl.getParameter(gl.MAX_TEXTURE_SIZE);

  const vs = `
    attribute vec2 p;
    void main(){ gl_Position = vec4(p,0.0,1.0); }
  `;
  const fs = `
    precision highp float;
    uniform vec2 u_res;
    float h(vec2 v){
      float s = sin(dot(v, vec2(12.9898,78.233))) * 43758.5453;
      return fract(s);
    }
    void main(){
      vec2 uv = gl_FragCoord.xy / u_res;
      float r = h(uv + vec2(0.11,0.21));
      float g = h(uv*1.3 + vec2(0.31,0.41));
      float b = h(uv*1.7 + vec2(0.51,0.61));
      gl_FragColor = vec4(pow(vec3(r,g,b)*(0.6+0.4*uv.x), vec3(1.1)), 1.0);
    }
  `;

  function sh(type, src){
    const s = gl.createShader(type);
    gl.shaderSource(s, src); gl.compileShader(s);
    if (!gl.getShaderParameter(s, gl.COMPILE_STATUS))
      throw new Error(gl.getShaderInfoLog(s)||'shader error');
    return s;
  }
  const pr = gl.createProgram();
  gl.attachShader(pr, sh(gl.VERTEX_SHADER, vs));
  gl.attachShader(pr, sh(gl.FRAGMENT_SHADER, fs));
  gl.linkProgram(pr);
  if (!gl.getProgramParameter(pr, gl.LINK_STATUS))
    throw new Error(gl.getProgramInfoLog(pr)||'link error');
  gl.useProgram(pr);

  const buf = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buf);
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1,-1, 3,-1, -1,3]), gl.STATIC_DRAW);
  const loc = gl.getAttribLocation(pr, 'p');
  gl.enableVertexAttribArray(loc);
  gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

  const ures = gl.getUniformLocation(pr, 'u_res');
  gl.uniform2f(ures, gl.drawingBufferWidth, gl.drawingBufferHeight);

  gl.viewport(0,0,gl.drawingBufferWidth, gl.drawingBufferHeight);
  gl.clearColor(0,0,0,1); gl.clear(gl.COLOR_BUFFER_BIT);
  gl.drawArrays(gl.TRIANGLES, 0, 3);

  const w = gl.drawingBufferWidth, h = gl.drawingBufferHeight;
  const px = new Uint8Array(w*h*4);
  gl.readPixels(0,0,w,h, gl.RGBA, gl.UNSIGNED_BYTE, px);

  const enc = new TextEncoder();
  const meta = enc.encode(JSON.stringify(info));
  const full = new Uint8Array(meta.length + px.length);
  full.set(meta, 0); full.set(px, meta.length);

  const hashBuf = await crypto.subtle.digest('SHA-256', full.buffer);
  const hex = Array.from(new Uint8Array(hashBuf))
    .map(b=>b.toString(16).padStart(2,'0')).join('');

  let hi = 0>>>0, lo = 0>>>0;
  for (let i=0;i<px.length;i+=16){
    const a = px[i] | (px[i+1]<<8) | (px[i+2]<<16) | (px[i+3]<<24);
    hi = ((hi ^ a) + 0x9e3779b9) >>> 0;
    lo = ((lo ^ ((a<<7)|(a>>>25))) + 0x85ebca6b) >>> 0;
    hi ^= (hi<<13)>>>0; lo ^= (lo<<15)>>>0;
  }
  const sample64 = ('00000000'+hi.toString(16)).slice(-8)+('00000000'+lo.toString(16)).slice(-8);

  console.log('WebGL vendor  :', info.vendor);
  console.log('WebGL renderer:', info.renderer);
  console.log('WebGL version :', info.version, '| GLSL:', info.glsl);
  console.log('MAX_TEXTURE_SIZE:', info.maxTex);
  console.log('SHA-256(meta+pixels):', hex);
  console.log('Sample64:', sample64);
})();

Here's what we get:

[Screenshot: WebGL fingerprint output in the browser console]

So what exactly does WebGL capture?

First of all, GPU identifiers. Different graphics cards expose different capabilities. Integrated Intel Graphics differ from discrete NVIDIA GeForce or AMD Radeon in terms of memory, supported extensions, and driver behavior. All of this is reflected in the WebGL context parameters.

Then there are variations between seemingly identical devices. Even if two machines use the same GPU model, there can still be small hardware-level differences. WebGL can pick these up by measuring things like shader execution or by detecting subtle rendering artifacts at the pixel level.

The browser itself also plays a role. Different engines such as Blink, WebKit, and Gecko handle WebGL calls slightly differently, which leads to small but consistent differences in the final output. That’s why the same WebGL fingerprint can vary across browsers, even on the same machine.


WebGL fingerprints are often combined with Canvas and audio data for more robust tracking, but even on their own, they can reveal a surprisingly detailed picture of a user’s system.

How to reduce fingerprinting risks

Completely eliminating all fingerprinting vectors is almost impossible. There are simply too many signals a website can collect. Still, you can reduce how unique your setup looks and make tracking more difficult.

One approach is using hardened browsers and privacy modes. Tor Browser is designed so that all users look the same. It restricts Canvas and WebGL and standardizes many environment parameters. This is effective, but often breaks modern websites.

Brave Browser blocks many common fingerprinting techniques, and Safari’s private browsing adds noise to audio processing, making audio fingerprints less stable.

Another option is using blocking or spoofing extensions such as CanvasBlocker. They prevent websites from reading Canvas data or return modified images. Similar tools exist for AudioContext. The downside is that very few users run these setups, so this kind of behavior can actually stand out: a completely blank Canvas or a constantly changing noisy hash may look suspicious rather than anonymous.
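How a site might spot that kind of blocking can be sketched as a simple classification. The string values here stand in for `canvas.toDataURL()` results: two reads of the same drawn scene, plus a read of an untouched blank canvas for comparison:

```javascript
// Sketch: flag Canvas-blocking or noise-adding extensions by comparing
// three reads: two of the same drawn scene, one of a blank canvas.
function classifyCanvas(firstRead, secondRead, blankRead) {
  if (firstRead === blankRead) return 'blocked'; // drawing was suppressed
  if (firstRead !== secondRead) return 'noisy';  // output randomized per read
  return 'stable';                               // normal browser behavior
}

console.log(classifyCanvas('abc', 'abc', 'empty'));     // stable
console.log(classifyCanvas('empty', 'empty', 'empty')); // blocked
console.log(classifyCanvas('abc', 'abd', 'empty'));     // noisy
```

Both of the non-"stable" outcomes are rare in the wild, so either one can be treated as a tracking signal in its own right.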

There’s also the idea of reducing uniqueness by standardizing the environment. For example, running a browser inside a virtual machine or cloud environment where multiple users share the same configuration. Some anti-fraud systems use similar approaches. In practice, though, this is not very convenient for everyday use.

A more balanced approach is controlled spoofing. Anti-detect browsers allow you to configure your environment in a way that looks realistic rather than random. Instead of blocking signals, they make them consistent and plausible.

For example, Octo Browser lets you control what websites see in terms of Canvas, WebGL, and audio data. It introduces subtle, realistic variations and aligns parameters so that each profile looks plausibly unique without standing out among millions of other users.

You can try Octo Browser for free using promo code DEVTO.

If your goal is maximum anonymity, you would need to constantly rotate devices, browsers, and configurations, while keeping everything updated and minimizing extra signals. Even then, it only reduces risk to a certain extent. In practice, using specialized tools that manage fingerprints in a consistent and realistic way is often a more effective approach.
