x0101010011

Building a Production WebGPU Engine... for a psychotherapy practice?

The Challenge: Where Noise Becomes Flow

Flow field as psyche metaphor

When building the digital presence for Therapy Warsaw, we faced an unusual requirement. We didn't want stock photos or static illustrations. We wanted something that felt alive—a generative texture that was always changing, but never demanding attention.

The visual metaphor was simple: complex patterns finding clarity. A field of noise, slowly organizing itself into coherent, flowing lines.

The technical requirements were less simple:

  1. Organic & Dense: ~10,000 interacting particles.
  2. Performance Critical: 60FPS on mobile while users scroll.
  3. Resilient: Must work on 10-year-old laptops (WebGL2) and bleeding-edge devices (WebGPU).
  4. Framework-Free: No React, no Three.js. Just controlled, fluid logic.

Here is how we built a dual-stack WebGPU + WebGL2 engine to solve this.


Physics transitions

Architecture: Keeping the UI Responsive

The first rule of heavy graphics on the web: Get off the main thread.

We strictly separated the application:

  • Main Thread: DOM, accessibility, routing, UI state.
  • Worker Thread: Physics, geometry generation, rendering via OffscreenCanvas.

Even if the physics simulation hiccups, page scrolling stays smooth. Communication happens via a dedicated messaging system that syncs visual "Presets" (colors, speed, turbulence) without blocking.

// main.js
const worker = new Worker(new URL('./worker.js', import.meta.url), { type: 'module' });
const offscreen = canvas.transferControlToOffscreen();

// Hand ownership to the worker; after the transfer, the canvas
// can no longer be drawn to from the main thread.
worker.postMessage({ type: 'init', canvas: offscreen }, [offscreen]);
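The receiving side can be sketched as a small message router. The handler and method names below (`createMessageHandler`, `engine.start`, `engine.setTargets`) are illustrative, not the engine's actual API; the point is that `init` hands over the canvas once, and preset messages simply update targets without blocking either thread:

```javascript
// worker.js (sketch) — route messages from the main thread to the engine.
// Names here are illustrative; only the `init` message shape comes from
// the snippet above.
function createMessageHandler(engine) {
  return (msg) => {
    switch (msg.type) {
      case 'init':
        engine.start(msg.canvas);       // acquire a context, begin the render loop
        break;
      case 'preset':
        engine.setTargets(msg.preset);  // colors, speed, turbulence targets
        break;
      default:
        break;                          // ignore unknown messages
    }
  };
}
```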


Why WebGPU? (And Why We Still Needed WebGL2)

We started with WebGPU because Compute Shaders are a natural fit for particle systems.

The WebGPU Pipeline

We use Compute Shaders for the heavy lifting:

  1. Map Pass: Generates noise textures (Burn, Density, Void maps).
  2. Flow Pass: Calculates the vector field.
  3. Life Pass: Updates particle ages and handles resets.
  4. Physics Pass: Moves particles based on flow vectors.

The key performance win: avoiding CPU-GPU round trips. The entire simulation stays on the GPU.
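As a CPU-side reference for what the Physics Pass computes per particle (the real work happens in a compute shader), the update is essentially: sample the flow field at the particle's position, advect, and wrap back into UV space. `flowAt` here stands in for a texture fetch into the Flow Pass output; the names are ours, not the engine's:

```javascript
// CPU reference for one Physics Pass step (sketch). The shader does this
// in parallel for every particle; `flowAt` stands in for sampling the
// flow-field texture produced by the Flow Pass.
function stepParticle(p, flowAt, speed, dt) {
  const v = flowAt(p.x, p.y);              // flow vector at the particle
  return {
    x: (p.x + v.x * speed * dt + 1) % 1,   // advect, wrap into [0, 1) UV space
    y: (p.y + v.y * speed * dt + 1) % 1,
  };
}
```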

The WebGL2 Fallback

WebGPU support is growing but not universal. We had to support WebGL2—but we didn't want a "dumb" fallback.
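At startup the engine has to decide which stack to run. A minimal detection sketch (the function name `pickBackend` is ours; the probes themselves — `navigator.gpu` for WebGPU, `getContext('webgl2')` for the fallback — are the standard checks):

```javascript
// Sketch: choose a backend at startup. `navigator.gpu` only exists
// where WebGPU is available, so probe it first, then fall back to WebGL2.
function pickBackend(nav, canvas) {
  if (nav && nav.gpu) return 'webgpu';
  if (canvas && canvas.getContext && canvas.getContext('webgl2')) return 'webgl2';
  return 'none';
}
```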

To achieve feature parity without destroying the CPU, we used Transform Feedback. This allows WebGL2 to update particle positions in the Vertex Shader and write them back to a buffer, mimicking compute shaders.
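One practical detail: a buffer bound as a vertex input cannot simultaneously be the Transform Feedback target, so the positions live in two buffers that swap roles every frame. A sketch of that ping-pong bookkeeping (the helper is ours; the commented GL calls are the standard WebGL2 sequence):

```javascript
// Ping-pong bookkeeping for Transform Feedback (sketch): keep two
// position buffers and swap read/write roles each frame.
function createPingPong(bufferA, bufferB) {
  let read = bufferA, write = bufferB;
  return {
    get read() { return read; },
    get write() { return write; },
    swap() { [read, write] = [write, read]; },
  };
}

// Per frame, the WebGL2 side looks roughly like:
//   gl.bindBuffer(gl.ARRAY_BUFFER, pp.read);                      // input positions
//   gl.bindBufferBase(gl.TRANSFORM_FEEDBACK_BUFFER, 0, pp.write); // output positions
//   gl.beginTransformFeedback(gl.POINTS);
//   gl.drawArrays(gl.POINTS, 0, particleCount);
//   gl.endTransformFeedback();
//   pp.swap();
```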


The Spring Physics System

When a user navigates between pages, the visualization morphs:

  • Colors shift (e.g., warm orange → deep blue).
  • Chaos decreases or increases.
  • Speed adjusts.

We couldn't just lerp these values; linear interpolation looks robotic. We implemented a Critical Damping Spring System.

function updateSpring(state, target, dt) {
    const tension = 120;  // spring stiffness (per unit mass)
    const friction = 20;  // damping; ~= 2 * sqrt(tension), i.e. near-critical

    const displacement = target - state.value;
    const force = tension * displacement - friction * state.velocity;

    // Semi-implicit Euler: update velocity first, then position.
    state.velocity += force * dt;
    state.value += state.velocity * dt;
}

Every frame, we update ~20 spring-driven parameters and upload them to a Uniform Buffer Object (UBO). The result: transitions that feel physical, not computed.
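Stepped at 60 Hz, a spring with these constants settles on its target in well under a second without overshooting noticeably (with tension 120 and friction 20, the damping ratio is friction / (2 * sqrt(tension)) ≈ 0.91, just shy of critical). A self-contained demo:

```javascript
// Same spring as above, driven toward a new target at 60 FPS.
function updateSpring(state, target, dt) {
  const tension = 120;
  const friction = 20;
  const force = tension * (target - state.value) - friction * state.velocity;
  state.velocity += force * dt;
  state.value += state.velocity * dt;
}

const spring = { value: 0, velocity: 0 };
for (let frame = 0; frame < 120; frame++) {  // two seconds at 60 FPS
  updateSpring(spring, 1, 1 / 60);
}
// spring.value is now within a fraction of a percent of the target
```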


Optimization: Procedural Vertex Integration

Rendering thick lines usually means generating 2 triangles (6 vertices) per segment. For long trails, that's expensive memory bandwidth.

Our approach: store only the head position of each line.

Inside the Vertex Shader, we run a for loop (~60 iterations) to re-trace the path backwards through the flow field, reconstructing the trail on the fly.

  • Pros: Massive bandwidth reduction (1 point per line, not thousands of vertices).
  • Cons: Higher ALU cost per vertex.

On modern GPUs, ALU is cheap; bandwidth is expensive. This trade-off let us render thousands of long, smooth trails on mobile.
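A CPU sketch of the reconstruction: starting from the stored head, walk upstream through the flow field one step per segment. In the real vertex shader each vertex derives its own segment index from `gl_VertexID` instead of building an array; `flowAt` again stands in for a flow-texture sample, and all names here are illustrative:

```javascript
// Sketch: rebuild a trail from a single stored head position by
// re-tracing the flow field backwards, one step per segment.
function traceTrail(head, flowAt, stepSize, segments) {
  const points = [{ ...head }];
  let p = { ...head };
  for (let i = 0; i < segments; i++) {
    const v = flowAt(p.x, p.y);
    p = { x: p.x - v.x * stepSize, y: p.y - v.y * stepSize }; // walk upstream
    points.push({ ...p });
  }
  return points;
}
```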


The Result

The result is therapywarsaw.com—a site where the background is a living simulation, a quiet texture that reflects the nature of the work.

The engine is open source:

Repo: github.com/23x2/generative-flow-field


Questions about the shader pipeline or Transform Feedback? Ask below.
