Draw anything. Watch a chain of spinning circles reconstruct it — exactly as Fourier intended.
What Is It?
Epicycle Doodler is an interactive web app that takes anything you draw — a scribble, a star, your name — and reconstructs it using a chain of rotating circles called epicycles. The mathematics behind it is the Discrete Fourier Transform (DFT), one of the most fundamental algorithms in all of signal processing.
The result is genuinely mesmerizing: dozens of circles of different sizes spin at different speeds, and the tip of the outermost one traces out your original drawing almost perfectly.
But it goes beyond just visualization. The app also:
- Animates 18+ famous mathematical curves (Koch Snowflake, Butterfly Curve, Trefoil Knot…) with historical annotations
- Lets you type text and watch epicycles write it
- Features three game modes (Duel, Challenge, Guess the Curve)
- Sonifies the Fourier components via the Web Audio API
- Records the animation as a .webm video
- Ships with three visual themes: Dark, Paper, and Synthwave
This post is a deep technical walkthrough of how it works, what decisions were made, and how all the pieces fit together.
Architecture Overview
┌─────────────────────────────────────────────────────────┐
│ EpicycleDoodler.tsx │
│ (Single Component) │
│ │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐ │
│ │ Path Input │ │ DFT Engine │ │ Audio Engine │ │
│ │ │ │ │ │ (audio.ts) │ │
│ │ • Mouse/Touch│ │ computeDFT() │ │ │ │
│ │ • Text→Path │ │ resample() │ │ Web Audio API│ │
│ │ • Famous Eqs │ │ smooth() │ │ Additive │ │
│ │ │ │ fit() │ │ Synthesis │ │
│ └──────┬───────┘ └──────┬───────┘ └──────────────┘ │
│ │ │ │
│ └────────┬────────┘ │
│ ▼ │
│ ┌───────────────────────────────┐ │
│ │ Canvas 2D Renderer │ │
│ │ │ │
│ │ requestAnimationFrame loop │ │
│ │ • Background + grid │ │
│ │ • Epicycle arms │ │
│ │ • Glowing trace trail │ │
│ │ • Tip dot + halo │ │
│ │ • Tutor HUD overlay │ │
│ └───────────────────────────────┘ │
│ │ │
│ ┌────────┴────────┐ │
│ ▼ ▼ │
│ ┌─────────────┐ ┌──────────────┐ │
│ │ React UI │ │ MediaRecorder│ │
│ │ (Controls) │ │ (Export) │ │
│ └─────────────┘ └──────────────┘ │
└─────────────────────────────────────────────────────────┘
The key architectural decision: React is the shell, Canvas 2D is the engine. React manages all the discrete UI states (modes, menus, settings) while everything animation-critical — DFT components, trace buffers, frame counts — lives in useRef to avoid triggering re-renders during the animation loop.
Stack
| Layer | Technology |
|---|---|
| Framework | React 18 + Vite |
| Rendering | HTML Canvas 2D API |
| Audio | Web Audio API |
| Video Export | MediaRecorder API |
| UI Components | shadcn/ui + Radix UI |
| Styling | Tailwind CSS + inline Canvas styles |
| Build System | pnpm workspace monorepo |
| Language | TypeScript (strict) |
Core Algorithm: The Discrete Fourier Transform
The DFT is the mathematical heart of the app. Given a sequence of 2D points from your drawing, it decomposes the path into a sum of circular motions — each with a frequency, amplitude, and phase.
The implementation treats each point as a complex number: z_n = x_n + i·y_n.
function computeDFT(pts: Point[]): FreqComponent[] {
  const N = pts.length;
  const result: FreqComponent[] = [];
  for (let k = 0; k < N; k++) {
    let re = 0, im = 0;
    for (let n = 0; n < N; n++) {
      const a = (2 * Math.PI * k * n) / N;
      re += pts[n].x * Math.cos(a) + pts[n].y * Math.sin(a);
      im += -pts[n].x * Math.sin(a) + pts[n].y * Math.cos(a);
    }
    re /= N; im /= N;
    result.push({
      freq: k,
      amplitude: Math.sqrt(re * re + im * im),
      phase: Math.atan2(im, re),
    });
  }
  // Sort largest amplitude first — biggest circles lead the chain
  return result.sort((a, b) => b.amplitude - a.amplitude);
}
The output is an array of FreqComponent objects — one per frequency. Each component tells the animation loop: "spin a circle of this radius (amplitude), at this speed (freq), starting at this angle (phase)."
Why sort by amplitude? Because the largest circles approximate the overall "shape" of the drawing. The first few circles give you the blob; subsequent ones carve out finer and finer details. This also maps naturally to the Circles slider — set it to 4 and you get a crude approximation; crank it to 512 and you get the original drawing traced perfectly.
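As a sanity check, the transform and a truncated reconstruction can be exercised together on a path whose spectrum is known in advance: a circle sampled uniformly has exactly one nonzero component, so a single epicycle reproduces it. In the sketch below, computeDFT is reproduced from above; reconstructPoint is an illustrative helper, not the app's exact code.

```typescript
type Point = { x: number; y: number };
type FreqComponent = { freq: number; amplitude: number; phase: number };

function computeDFT(pts: Point[]): FreqComponent[] {
  const N = pts.length;
  const result: FreqComponent[] = [];
  for (let k = 0; k < N; k++) {
    let re = 0, im = 0;
    for (let n = 0; n < N; n++) {
      const a = (2 * Math.PI * k * n) / N;
      re += pts[n].x * Math.cos(a) + pts[n].y * Math.sin(a);
      im += -pts[n].x * Math.sin(a) + pts[n].y * Math.cos(a);
    }
    re /= N; im /= N;
    result.push({ freq: k, amplitude: Math.sqrt(re * re + im * im), phase: Math.atan2(im, re) });
  }
  return result.sort((a, b) => b.amplitude - a.amplitude);
}

// Partial inverse DFT: sum only the first `k` (largest) components.
function reconstructPoint(comps: FreqComponent[], k: number, frame: number, N: number): Point {
  let x = 0, y = 0;
  for (const c of comps.slice(0, k)) {
    const a = (2 * Math.PI * c.freq * frame) / N + c.phase;
    x += c.amplitude * Math.cos(a);
    y += c.amplitude * Math.sin(a);
  }
  return { x, y };
}

// A circle of radius 50, sampled at 8 points: all its energy sits at freq = 1,
// so after sorting, comps[0] has amplitude ~50 and every other amplitude is ~0.
const N = 8;
const circle = Array.from({ length: N }, (_, n) => ({
  x: 50 * Math.cos((2 * Math.PI * n) / N),
  y: 50 * Math.sin((2 * Math.PI * n) / N),
}));
const comps = computeDFT(circle);
```

With k = 1 the reconstruction is already exact here; for a hand-drawn path, raising k sweeps from blob to fine detail, which is exactly what the Circles slider does.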
Complexity
Naïve DFT is O(N²). For N=512 sample points this is ~262,000 multiply-add operations — fast enough in JS to complete in under a millisecond on modern hardware. For larger N, a Fast Fourier Transform (FFT) would be O(N log N), but the UX caps samples at 512 so naïve DFT is perfectly adequate.
Path Processing Pipeline
Raw mouse/touch input is noisy and variable-density. Before feeding it to the DFT, the path goes through three processing stages:
1. Smoothing
Removes jitter from shaky drawing via a sliding window moving average:
function smoothPath(pts: Point[], w = 7): Point[] {
  if (pts.length < w) return pts;
  const half = Math.floor(w / 2);
  return pts.map((_, i) => {
    let sx = 0, sy = 0, c = 0;
    for (let j = Math.max(0, i - half); j <= Math.min(pts.length - 1, i + half); j++) {
      sx += pts[j].x;
      sy += pts[j].y;
      c++;
    }
    return { x: sx / c, y: sy / c };
  });
}
2. Resampling
The DFT requires N evenly spaced samples. Raw drawing has dense clusters where the mouse moved slowly and sparse points where it moved fast. Resampling fixes this with arc-length parameterization:
function resamplePath(pts: Point[], N: number): Point[] {
  if (pts.length < 2) return pts;
  // Build cumulative arc-length lookup table
  const lens: number[] = [0];
  for (let i = 1; i < pts.length; i++) {
    const dx = pts[i].x - pts[i - 1].x;
    const dy = pts[i].y - pts[i - 1].y;
    lens.push(lens[i - 1] + Math.sqrt(dx * dx + dy * dy));
  }
  const total = lens[lens.length - 1];
  // Binary search to find the correct segment for each target distance
  return Array.from({ length: N }, (_, i) => {
    const target = (i / N) * total;
    let lo = 0, hi = lens.length - 1;
    while (lo < hi - 1) {
      const m = (lo + hi) >> 1;
      if (lens[m] <= target) lo = m; else hi = m;
    }
    const t = lens[lo] === lens[hi] ? 0 : (target - lens[lo]) / (lens[hi] - lens[lo]);
    return {
      x: pts[lo].x + t * (pts[hi].x - pts[lo].x),
      y: pts[lo].y + t * (pts[hi].y - pts[lo].y),
    };
  });
}
3. Fitting
Centers and scales the path so it fills the canvas without overflowing:
function fitPath(pts: Point[], cW: number, cH: number, frac = 0.75, pad = 40): Point[] {
  let minX = Infinity, maxX = -Infinity, minY = Infinity, maxY = -Infinity;
  for (const p of pts) {
    if (p.x < minX) minX = p.x; if (p.x > maxX) maxX = p.x;
    if (p.y < minY) minY = p.y; if (p.y > maxY) maxY = p.y;
  }
  const cx = (minX + maxX) / 2, cy = (minY + maxY) / 2;
  const bW = maxX - minX || 1, bH = maxY - minY || 1;
  const scale = Math.min((cW * frac - pad * 2) / bW, (cH * frac - pad * 2) / bH);
  return pts.map(p => ({ x: (p.x - cx) * scale, y: (p.y - cy) * scale }));
}
The Animation Loop
The rendering loop uses requestAnimationFrame and is deliberately kept outside React's render cycle. All mutable rendering state lives in refs:
const frameRef = useRef(0); // current frame (fractional for smooth speed)
const compsRef = useRef<FreqComponent[]>([]); // DFT output
const traceRef = useRef<Point[]>([]); // tip trail history
const loopGlowRef = useRef(0); // glow pulse on loop completion
At each frame, the loop:
- Clears and redraws the background + grid
- Iterates through the active FreqComponents, computing each circle's tip position:
  x_k = amplitude_k × cos(2π × freq_k × frame / N + phase_k)
  y_k = amplitude_k × sin(2π × freq_k × frame / N + phase_k)
- Draws the arm from center to tip, and a circle outline
- Accumulates the final tip position into the trace buffer
- Renders the trace as a gradient polyline with per-segment alpha and width
- Advances frameRef.current by the speed multiplier
- Triggers a glow pulse when frame wraps from N-1 back to 0 (loop complete)
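Concretely, the per-circle equations accumulate into a running sum: each circle sits at the tip of the previous one, and the final tip is the pen. A minimal self-contained sketch (epicycleJoints is an illustrative name, not the app's):

```typescript
type Point = { x: number; y: number };
type FreqComponent = { freq: number; amplitude: number; phase: number };

// joints[0] is the chain's origin; joints[k] is the tip of the k-th circle.
// The last entry is the pen position that gets pushed into the trace buffer.
function epicycleJoints(comps: FreqComponent[], frame: number, N: number): Point[] {
  const joints: Point[] = [{ x: 0, y: 0 }];
  let x = 0, y = 0;
  for (const { freq, amplitude, phase } of comps) {
    const a = (2 * Math.PI * freq * frame) / N + phase;
    x += amplitude * Math.cos(a);
    y += amplitude * Math.sin(a);
    joints.push({ x, y });
  }
  return joints;
}
```

Adjacent pairs in the returned array are the arms; the circle outline for component k is centered on joints[k] with radius equal to that component's amplitude.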
The trace rendering is the most visually impactful piece — each segment gets its own color (from the theme's traceRgb function), shadow blur for glow, and lineWidth — all varying with how far through the trace the segment is:
for (let i = 1; i < trace.length; i++) {
  const p = i / trace.length; // 0 = tail, 1 = current tip
  let alpha = p < fadeEnd ? p / fadeEnd : 1.0;
  alpha *= Math.pow(p, 0.55); // smooth fade-in at the tail
  const strokeW = 1.5 + p * 1.5;
  const rgb = T.traceRgb(p); // theme-defined gradient
  g.strokeStyle = `rgba(${rgb},${alpha})`;
  g.lineWidth = strokeW;
  // ... draw segment
}
Audio: Additive Synthesis from Fourier Components
Every Fourier component is also a sinusoidal frequency. The AudioManager class maps those directly to OscillatorNodes in the Web Audio API:
export class AudioManager {
  private ctx: AudioContext | null = null;
  private masterGain: GainNode | null = null;
  private oscillators: OscillatorNode[] = [];

  setCircles(comps: FreqComponent[], numActive: number) {
    this.clear();
    if (!this.ctx || !this.masterGain) return;
    const active = comps.slice(0, Math.min(numActive, 10));
    const maxAmp = active.reduce((m, c) => Math.max(m, c.amplitude), 1);
    active.forEach(({ freq, amplitude }) => {
      if (freq === 0 || amplitude < 1) return;
      const hz = Math.min(110 * (freq + 1), 1760); // map freq → Hz, clamp to audible range
      const osc = this.ctx!.createOscillator();
      const gain = this.ctx!.createGain();
      osc.type = "sine";
      osc.frequency.value = hz;
      gain.gain.value = (amplitude / maxAmp) * 0.045; // normalize by largest component
      osc.connect(gain);
      gain.connect(this.masterGain!);
      osc.start();
      this.oscillators.push(osc); // track so clear() can stop them later
    });
  }
}
This is pure additive synthesis: each circle's frequency maps to a tone, its amplitude maps to the tone's volume. The result is an eerie harmonic chord that changes whenever you adjust the Circles slider — because you're literally changing which frequency components are active.
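The pitch and gain mappings are small enough to restate and check in isolation (standalone restatements of the expressions inside setCircles; the function names here are illustrative):

```typescript
// Map a Fourier component's integer frequency to an audible pitch in Hz:
// freq = 1 lands on 220 Hz, and the series is clamped at 1760 Hz.
function componentToHz(freq: number): number {
  return Math.min(110 * (freq + 1), 1760);
}

// Normalized gain: the loudest component plays at the 0.045 master level,
// and every other component scales down proportionally.
function componentGain(amplitude: number, maxAmp: number): number {
  return (amplitude / maxAmp) * 0.045;
}
```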
The audio is always initialized on a user gesture (a click/touch) to comply with browser autoplay policies.
Famous Curves Library
One of the most educational parts of the app is its library of 18+ mathematically famous curves. Each is defined parametrically and includes a rich annotation explaining its history and significance. A few highlights:
Butterfly Curve (Temple H. Fay, 1989)
generate: () => Array.from({ length: 1024 }, (_, i) => {
  const t = (12 * Math.PI * i) / 1024;
  const r = 28 * (Math.exp(Math.sin(t)) - 2 * Math.cos(4 * t)
    + Math.pow(Math.sin((2 * t - Math.PI) / 24), 5));
  return { x: r * Math.cos(t), y: r * Math.sin(t) };
}),
The mix of exponential and trigonometric terms produces a strikingly organic shape that requires 12π of rotation — six full turns — before closing.
Koch Snowflake (iterative fractal)
generate: () => {
  let pts: [number, number][] = [[0, -120], [104, 60], [-104, 60], [0, -120]];
  for (let iter = 0; iter < 4; iter++) {
    const next: [number, number][] = [pts[0]];
    for (let i = 0; i < pts.length - 1; i++) {
      const [ax, ay] = pts[i], [bx, by] = pts[i + 1];
      const dx = bx - ax, dy = by - ay;
      const p1x = ax + dx / 3, p1y = ay + dy / 3;
      const p2x = ax + 2 * dx / 3, p2y = ay + 2 * dy / 3;
      const px = p1x + (dx / 3) * 0.5 - (dy / 3) * (Math.sqrt(3) / 2);
      const py = p1y + (dx / 3) * (Math.sqrt(3) / 2) + (dy / 3) * 0.5;
      next.push([p1x, p1y], [px, py], [p2x, p2y], [bx, by]);
    }
    pts = next;
  }
  return pts.slice(0, -1).map(([x, y]) => ({ x, y }));
},
After 4 iterations, the boundary has 768 segments. The DFT needs many epicycles to carve the sharp fractal corners — a perfect live demonstration of why more harmonics = more detail.
Hypotrochoid (Spirograph)
generate: () => {
  const R = 5, r = 3, d = 5, s = 22;
  return Array.from({ length: FC_N }, (_, i) => {
    const t = (6 * Math.PI * i) / FC_N;
    return {
      x: s * ((R - r) * Math.cos(t) + d * Math.cos(((R - r) * t) / r)),
      y: s * ((R - r) * Math.sin(t) - d * Math.sin(((R - r) * t) / r)),
    };
  });
},
The classic Spirograph mechanism — a point on a small circle rolling inside a larger one. The gear ratio R:r = 5:3 requires 3 full revolutions before the path closes.
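The closure claim is easy to verify: the curve repeats after r / gcd(r, R - r) full revolutions of the parameter, since the two cosine terms have periods 2π and 2πr/(R - r). A quick standalone helper (not code from the app):

```typescript
// Greatest common divisor via the Euclidean algorithm.
function gcd(a: number, b: number): number {
  return b === 0 ? a : gcd(b, a % b);
}

// Number of full 2π revolutions of t before a hypotrochoid with
// fixed radius R and rolling radius r closes on itself.
function revolutionsToClose(R: number, r: number): number {
  return r / gcd(r, R - r);
}
```

For R = 5, r = 3 this gives 3 revolutions, matching the 6π parameter range in the generator above.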
Text-to-Path: A Custom Single-Stroke Font
To let users type text for the epicycles to draw, the app implements a minimal single-stroke vector font where every character is defined as a sequence of (x, y) waypoints forming a continuous path:
const LETTER_PATH: Record<string, readonly (readonly [number, number])[]> = {
  A: [[0,10],[3,0],[6,10],[5,7],[1,7]],
  B: [[0,10],[0,0],[4,0],[5.5,1.5],[5.5,4],[4,5],[0,5],[4.5,5.5],[5.5,7],[5.5,9],[4,10],[0,10]],
  // ... 26 letters
};

function textToPath(text: string): Point[] {
  const CHAR_W = 7; // cell width (6 units wide + 1 gap)
  const SCALE = 12; // px per unit — fitPath rescales to fill canvas
  const toWorld = (raw: readonly (readonly [number, number])[], ci: number): Point[] =>
    raw.map(([x, y]) => ({ x: (ci * CHAR_W + x) * SCALE, y: y * SCALE }));
  const result: Point[] = [];
  for (let ci = 0; ci < text.length; ci++) {
    const ch = text[ci].toUpperCase();
    const raw = LETTER_PATH[ch];
    if (!raw) continue;
    // ... connect characters into one continuous path
  }
  return result;
}
The trick is that the epicycles must trace a single continuous curve. Multi-character text is concatenated with bridging segments between the last point of one letter and the first point of the next, so the pen never lifts. The result feeds directly into the standard DFT pipeline.
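That concatenation step can be sketched as follows; joinWithBridges and its simple linear interpolation are illustrative, since the app's exact bridging code isn't shown here:

```typescript
type Point = { x: number; y: number };

// Join letter strokes into one continuous pen path by inserting a few
// interpolated points between the end of one letter and the start of the next.
function joinWithBridges(letters: Point[][], bridgeSteps = 8): Point[] {
  const out: Point[] = [];
  for (const letter of letters) {
    if (out.length > 0 && letter.length > 0) {
      const a = out[out.length - 1]; // last point of the previous letter
      const b = letter[0];           // first point of the next letter
      for (let s = 1; s < bridgeSteps; s++) {
        const t = s / bridgeSteps;
        out.push({ x: a.x + t * (b.x - a.x), y: a.y + t * (b.y - a.y) });
      }
    }
    out.push(...letter);
  }
  return out;
}
```

Without the interpolated bridge points, the resampler would place very few samples on the long jump between letters, and the reconstruction would cut visible corners there.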
Visual Themes
The app supports three themes, each defined as a record of color-generating functions:
type ThemeId = "dark" | "paper" | "synthwave";
const THEMES: Record<ThemeId, {
  traceRgb: (p: number) => string;  // gradient along the trace
  circleRgb: (a: number) => string; // circle outline
  armRgb: (a: number) => string;    // connecting arm
  tipHalo: string;                  // glow around the tip dot
  // ...
}> = {
  dark: {
    traceRgb: (p) => `${~~(40+60*p)},${~~(140+100*p)},${~~(200+55*p)}`,
    // deep blue → electric blue gradient
  },
  paper: {
    traceRgb: (p) => `${~~(26+14*(1-p))},${~~(26+14*(1-p))},${~~(60+30*(1-p))}`,
    traceShadowAlpha: 0, // no glow — keeps the "ink on paper" look
  },
  synthwave: {
    traceRgb: (p) => `${~~(255*(1-p))},${~~(0+212*p)},${~~(110+145*p)}`,
    // hot pink → electric cyan gradient
  },
};
The traceRgb(p) function receives a value from 0 (trail tail) to 1 (current tip) and returns an r,g,b string. This makes it trivially easy to define smooth two-point gradients for the trail without pre-allocating gradient objects.
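For example, evaluating the dark theme's gradient at its endpoints (a standalone restatement of the traceRgb shown above):

```typescript
// Dark theme trail gradient: deep blue at the tail, electric blue at the tip.
const darkTraceRgb = (p: number): string =>
  `${~~(40 + 60 * p)},${~~(140 + 100 * p)},${~~(200 + 55 * p)}`;

// The renderer wraps the result in rgba() with the per-segment alpha.
const segmentStroke = (p: number, alpha: number): string =>
  `rgba(${darkTraceRgb(p)},${alpha})`;
```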
State Management Strategy
With ~2,500 lines in a single component, state management is carefully structured:
React useState for things that drive re-renders:
- appMode: "draw" | "playing"
- numCircles, speed
- guessPhase, duelPhase, challengePhase
- All modal open/close states

React useRef for things the animation loop reads every frame:
- compsRef — DFT output array
- frameRef — current animation frame (fractional)
- traceRef — tip position history buffer
- numCirclesRef, speedRef — mirrors of the sliders (refs read faster in RAF)
- themeRef — current theme (avoids closure capture issues)
- canvasRef — the canvas element
This split ensures that moving a slider doesn't cause React to re-render the component (which would interrupt the animation), but the animation loop still always reads the latest slider value.
Game Modes
Challenge Mode
Player 1 draws something and "locks" it — generating a Base64-encoded URL containing their DFT data. Player 2 opens the URL and must trace the ghost of the original. Their tracing accuracy is scored by computing an area-based similarity between the two paths.
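A sketch of how such a shareable payload could be built. The app's actual wire format isn't shown in this post, so the triple layout and rounding below are assumptions; Buffer stands in for the browser's btoa so the sketch runs under Node:

```typescript
type FreqComponent = { freq: number; amplitude: number; phase: number };

// Pack components as [freq, amplitude, phase] triples (rounded to keep the
// URL short), then encode as URL-safe Base64.
function encodeChallenge(comps: FreqComponent[]): string {
  const triples = comps.map(c => [c.freq, +c.amplitude.toFixed(3), +c.phase.toFixed(4)]);
  return Buffer.from(JSON.stringify(triples)).toString("base64url");
}

function decodeChallenge(encoded: string): FreqComponent[] {
  const triples: [number, number, number][] =
    JSON.parse(Buffer.from(encoded, "base64url").toString("utf8"));
  return triples.map(([freq, amplitude, phase]) => ({ freq, amplitude, phase }));
}
```

Player 2's client decodes the payload back into FreqComponent objects and feeds them straight into the animation loop as the "ghost" to trace.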
Duel of Circles
Two players each draw a shape. The app runs both DFT animations simultaneously, spawning collision particles where the two traces intersect. The winner is determined by canvas coverage — whose trace fills more area.
Guess the Curve
The app selects a famous mathematical curve at random and animates it. The player must identify it, with the curve's name progressively revealed as more circles are enabled.
Video Export
The app uses the MediaRecorder API to capture the canvas stream directly:
const stream = canvasRef.current.captureStream(60);
const recorder = new MediaRecorder(stream, { mimeType: "video/webm;codecs=vp9" });
const chunks: Blob[] = [];
recorder.ondataavailable = e => chunks.push(e.data);
recorder.onstop = () => {
  const blob = new Blob(chunks, { type: "video/webm" });
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = "epicycle.webm";
  a.click();
};
recorder.start();
No server involved — the entire recording, encoding, and download happens client-side.
Performance Notes
- N=512 sample points gives high-quality reconstruction. The DFT at N=512 runs in ~0.3ms on a modern CPU — well within the 16ms frame budget.
- Canvas 2D, not WebGL. The epicycle chain is at most ~512 circles, each needing a circle arc and a line stroke. The trace is at most 516 segments. Canvas 2D handles this comfortably at 60fps.
- Shadow blur is the most expensive Canvas 2D operation. It's disabled in Paper theme (no glow) and capped at reasonable values in other themes.
- The animation is a single requestAnimationFrame loop — no setInterval, no React state updates during playback.
What I Learned
DFT is surprisingly intuitive once you see it live. The "sort by amplitude descending" trick is non-obvious in the equations but immediately obvious visually — big circles get the rough shape right, small ones add the sharp corners.
Canvas 2D is underrated. WebGL is often the first instinct for "smooth 60fps graphics," but Canvas 2D is perfectly adequate for 2D line rendering at this scale. The API surface is far simpler.
Audio-visual synchrony is powerful. Mapping Fourier components to oscillator frequencies makes the math feel tangible in a completely different sensory channel.
useRef is the right tool for animation state. Reaching for useState for every piece of rendering state leads to janky animations. If the animation loop needs it, it goes in a ref.
Try It
The app is live. Draw something. Turn up the circle count slowly. Watch it reconstruct. Switch to Synthwave theme. Enable audio. Then open the Equations menu and watch the Butterfly Curve emerge from 12 full rotations of spinning circles.