Arthur

Posted on • Originally published at pickles.news

One 200-Year-Old Math Trick Powers Almost Every Pixel and Sound You Touch

In December 1807, a French mathematician named Joseph Fourier presented a memoir to the Paris Academy of Sciences claiming that any reasonable signal — any sound, any temperature distribution, any periodic process — could be written as a sum of sines and cosines. Lagrange, who had spent decades on trigonometric series, objected so forcefully that publication was blocked. The manuscript sat for fifteen years before it appeared in book form as Théorie analytique de la chaleur in 1822. Fourier was trying to model heat flow in a metal bar.

Two centuries later, every JPEG image, every MP3 track, every Wi-Fi packet, and every MRI scan in routine clinical use leans on the same idea. Fourier did not aim at any of that. The trick generalized in ways nobody alive in 1807 could have predicted, and the chain from a heat-equation paper to a 5G modem is short enough to walk in a single article.

What the trick actually is

Take a signal — a string of audio samples, a row of pixel intensities, a slice of MRI sensor data. The Fourier transform writes that signal as a sum of pure tones, each at a specific frequency, with a specific amplitude and phase. The inverse transform takes you back. Both directions lose nothing.

That sounds like an analytical curiosity. The reason it underpins so much engineering is that most signals worth caring about are sparse in the frequency domain even when they're dense in the time or space domain. A 30-second song is over a million audio samples at CD quality; the same song, transformed, concentrates most of its energy in a few hundred frequencies. Modify the frequency-domain version (zero out the inaudible bands, drop the small coefficients, pack different bits onto different frequencies) and transform back, and you've done compression, filtering, denoising, or modulation depending on what you modified.
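A minimal sketch of that move in NumPy. The sample rate, the tones, and the number of kept coefficients are all arbitrary choices for illustration, not values from any codec:

```python
import numpy as np

# Illustrative sketch: a signal that is dense in time but sparse in frequency.
fs = 8000                                       # samples per second (arbitrary)
t = np.arange(fs) / fs                          # one second of audio
signal = (np.sin(2 * np.pi * 440 * t)           # a 440 Hz tone
          + 0.5 * np.sin(2 * np.pi * 880 * t))  # plus its first overtone

spectrum = np.fft.rfft(signal)                  # 8000 samples -> 4001 coefficients

# "Drop the small coefficients": keep only the 16 largest by magnitude.
threshold = np.sort(np.abs(spectrum))[-16]
compressed = np.where(np.abs(spectrum) >= threshold, spectrum, 0)

reconstructed = np.fft.irfft(compressed, n=len(signal))
print("kept:", np.count_nonzero(compressed), "of", len(spectrum))
print("max error:", np.max(np.abs(signal - reconstructed)))
```

Because the two tones carry essentially all the energy, throwing away over 99% of the coefficients reproduces the signal almost exactly.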

Strictly: the Fourier transform is a basis change. It projects the signal onto an orthogonal set of basis functions — sines and cosines, or close relatives — and once you have a basis where the signal is sparse, every downstream operation gets cheaper.
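The projection view can be spelled out in a few lines. The signal length and the frequency index below are arbitrary; the point is only that one coefficient equals one inner product:

```python
import numpy as np

# The basis-change view: one Fourier coefficient is the inner product of the
# signal with one complex-exponential basis vector.
n = 64
x = np.random.default_rng(0).standard_normal(n)

k = 5                                           # pick one frequency index
basis_k = np.exp(-2j * np.pi * k * np.arange(n) / n)
by_projection = basis_k @ x                     # direct inner product, O(n)
by_fft = np.fft.fft(x)[k]                       # the same number, via the FFT

print(np.isclose(by_projection, by_fft))        # True
```

Computing every coefficient this way costs O(n) each, O(n²) total, which is exactly the cost the FFT attacks.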

The chain from 1822 to your phone

Two milestones did most of the heavy lifting between Fourier's manuscript and modern silicon.

The first was the Cooley–Tukey FFT algorithm, published in Mathematics of Computation in 1965. James Cooley and John Tukey reduced the cost of computing the discrete Fourier transform from O(n²) to O(n log n). For a million-sample signal, the difference is roughly 50,000× fewer operations. (Carl Friedrich Gauss had described essentially the same recursive structure around 1805 while interpolating the orbits of the asteroids Pallas and Juno; he didn't publish, the work appeared posthumously in Neo-Latin, and was rediscovered as having predated Cooley–Tukey only after the 1965 paper. Gauss had reasons to be modest about his side projects.)
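The recursive structure is short enough to sketch. This is an illustrative radix-2 version for power-of-two lengths only, not production code:

```python
import numpy as np

# Illustrative radix-2 Cooley-Tukey FFT (power-of-two lengths only).
# Splitting into even- and odd-indexed halves gives the recurrence
# T(n) = 2 T(n/2) + O(n), which solves to O(n log n).
def fft(x):
    x = np.asarray(x, dtype=complex)
    n = len(x)
    if n == 1:
        return x
    even = fft(x[0::2])                         # DFT of even-indexed samples
    odd = fft(x[1::2])                          # DFT of odd-indexed samples
    twiddle = np.exp(-2j * np.pi * np.arange(n // 2) / n)
    return np.concatenate([even + twiddle * odd,
                           even - twiddle * odd])

x = np.random.default_rng(1).standard_normal(1024)
print(np.allclose(fft(x), np.fft.fft(x)))       # True
```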

The second was the discrete cosine transform, proposed by Nasir Ahmed at Kansas State University to the NSF in 1972 and developed with T. Raj Natarajan and K. R. Rao in a January 1974 paper. The DCT is a Fourier-transform variant tailored for real-valued data and natural-image statistics. Eighteen years later, the JPEG standard (ISO/IEC 10918-1, published 1992) used 8×8 DCT blocks at its core; the next year, the MP3 standard wrapped a modified DCT in a psychoacoustic filterbank to throw out audio frequencies the ear couldn't hear. Both compression schemes are, mechanically, the same move: transform, drop the coefficients you can afford to lose, transform back.
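The mechanical move fits in a few lines. A sketch with SciPy's DCT on one 8×8 block; the smooth synthetic block and the hard threshold are made up for illustration, since real JPEG quantizes with a perceptual table instead:

```python
import numpy as np
from scipy.fft import dctn, idctn

# JPEG's core move on one 8x8 block: 2-D DCT, drop small coefficients, invert.
i = np.arange(8)[:, None]
j = np.arange(8)[None, :]
block = (128
         + 40 * np.cos(np.pi * 1 * (i + 0.5) / 8)    # smooth vertical variation
         + 30 * np.cos(np.pi * 2 * (j + 0.5) / 8))   # smooth horizontal variation

coeffs = dctn(block, norm="ortho")
kept = np.where(np.abs(coeffs) >= 1.0, coeffs, 0.0)  # crude threshold, assumed
approx = idctn(kept, norm="ortho")

print("kept", np.count_nonzero(kept), "of 64 coefficients")
print("max pixel error:", np.max(np.abs(block - approx)))
```

A smooth block like this collapses to three nonzero DCT coefficients, which is the natural-image statistic the DCT was designed around: smooth regions compress to almost nothing.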

The same FFT silicon that powers JPEG also runs the wireless stack. OFDM (orthogonal frequency-division multiplexing) packs data onto hundreds or thousands of separate sub-carriers, each carrying a small piece of the bitstream. The receiver pulls the streams apart with an FFT. Wi-Fi 6 (802.11ax) uses up to 2,048 sub-carriers in a 160 MHz channel and modulation up to 1024-QAM. 4G LTE, 5G NR, DSL, DAB digital radio, and DVB-T digital television are all OFDM. Every wireless packet on most of the planet's home and mobile networks is the same trick at the physical layer.
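A toy OFDM link, stripped of everything except the transform pair (no cyclic prefix, channel model, or equalization). The carrier count and QPSK mapping are illustrative, not taken from any standard:

```python
import numpy as np

# Toy OFDM link: the IFFT spreads symbols across sub-carriers at the
# transmitter; the FFT separates them again at the receiver.
n_carriers = 64
bits = np.random.default_rng(2).integers(0, 2, size=2 * n_carriers)

# QPSK: two bits -> one complex symbol per sub-carrier
symbols = ((2 * bits[0::2] - 1) + 1j * (2 * bits[1::2] - 1)) / np.sqrt(2)

waveform = np.fft.ifft(symbols)   # transmit: frequency domain -> time domain
received = np.fft.fft(waveform)   # receive:  time domain -> frequency domain

recovered = np.empty_like(bits)   # hard-decision demodulation back to bits
recovered[0::2] = (received.real > 0).astype(int)
recovered[1::2] = (received.imag > 0).astype(int)
print("bits recovered exactly:", np.array_equal(bits, recovered))
```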

MRI uses the transform more directly: the scanner does not collect a picture. It collects the spatial-frequency components of a slice of tissue (the k-space data), and the standard image-reconstruction step is the inverse Fourier transform of that array. Other reconstruction methods exist for special cases; the routine clinical pipeline is built on the inverse transform.
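A sketch of that pipeline with a toy phantom. Lacking a scanner, the k-space data is faked here by forward-transforming an image, then the routine reconstruction step runs in reverse:

```python
import numpy as np

# MRI sketch: the scanner records k-space (spatial frequencies); the image
# is the inverse 2-D Fourier transform of that array.
phantom = np.zeros((64, 64))
phantom[20:44, 24:40] = 1.0               # a bright rectangle of "tissue"

k_space = np.fft.fft2(phantom)            # stand-in for the raw scanner data
image = np.fft.ifft2(k_space)             # the routine reconstruction step

print("reconstruction exact:", np.allclose(image.real, phantom))
```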

Why one math fits all

The reason this trick works on audio, images, radio, and bodies is that physical reality is wave-shaped. Sound is air-pressure oscillation. Light and radio are electromagnetic oscillation. The molecules in your body absorb and re-emit radio at frequencies determined by their nuclear magnetic moments. None of these systems are modeled by sinusoids as a convenience; they are sinusoidal, and Fourier gave us the language to read them.

A 1965 algorithm made the language cheap to speak in real time. A 1974 paper specialized it for natural data. After that, the rest is engineering.

Two centuries of compounding interest

Most of what looks distinctly 21st century — your phone, your wireless connection, your medical imaging, your streaming music — traces back to an 1807 manuscript that was blocked from publication by the most respected mathematician in Europe. The applications change every decade. The math underneath has been stable since Cooley and Tukey made it cheap.

Fourier died in 1830. He never saw a JPEG, an MP3, an MRI scan, or a Wi-Fi handshake. He never saw the inside of a transistor, a vacuum tube, or any computational device more sophisticated than a logarithm table. The trick was complete before any of those things existed.

The interesting question is not what the next compression standard or wireless modulation will look like. Those will be small refinements on a settled idea. The interesting question is whether anyone is currently working on a piece of mathematics that will, in 2226, still be doing this much work — and whether the people doing it face the same low expectations Fourier did in 1807.