<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Aditya Anuragi</title>
    <description>The latest articles on DEV Community by Aditya Anuragi (@adityaanuragi00128).</description>
    <link>https://dev.to/adityaanuragi00128</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3910406%2Fea36f3b3-1490-4a39-a976-c4cf91d27cb3.png</url>
      <title>DEV Community: Aditya Anuragi</title>
      <link>https://dev.to/adityaanuragi00128</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/adityaanuragi00128"/>
    <language>en</language>
    <item>
      <title>Audio-to-haptics: perfectly syncing phone vibration to audio on the web — how I made it</title>
      <dc:creator>Aditya Anuragi</dc:creator>
      <pubDate>Tue, 05 May 2026 14:14:30 +0000</pubDate>
      <link>https://dev.to/adityaanuragi00128/audio-to-haptics-perfectly-syncing-phone-vibration-to-audio-on-the-web-how-i-made-it-3gip</link>
      <guid>https://dev.to/adityaanuragi00128/audio-to-haptics-perfectly-syncing-phone-vibration-to-audio-on-the-web-how-i-made-it-3gip</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq6zejhbpqpry942mv0tm.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq6zejhbpqpry942mv0tm.gif" alt=" " width="480" height="853"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The circle fills and pulses in sync with the audio — this is what your phone is feeling. The GIF shows it, but you won't really get it until you feel it. Open this on Android and &lt;a href="https://audio-to-haptics.pages.dev/" rel="noopener noreferrer"&gt;try it yourself →&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Other links:&lt;br&gt;
&lt;a href="https://github.com/AdityaAnuragi/audio-to-haptics" rel="noopener noreferrer"&gt;View on Github&lt;/a&gt;&lt;br&gt;
&lt;a href="https://www.npmjs.com/package/audio-to-haptics" rel="noopener noreferrer"&gt;View on npm&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  The haptics landscape
&lt;/h2&gt;

&lt;p&gt;Native platforms have solid haptics support, and if haptics are the core of your product, the native APIs are worth learning. But very few apps make haptics the focus; in most, they're an addition that polishes the UX. While native APIs on iOS and Android can create a more polished experience, they come with their own constraints.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;iOS Core Haptics&lt;/strong&gt; lets you author precise AHAP files — you define the exact timing, intensity, and sharpness of every pulse. That level of control is what makes native iOS haptics feel polished, both in the API and in what the user actually feels. The Taptic Engine is high-quality hardware, and Core Haptics is built to take full advantage of it — the result, when authored well, is haptics that feel genuinely premium. The trade-off is that it's entirely manual: deriving patterns from audio isn't something the API does for you, so syncing haptics to arbitrary audio means authoring by hand and re-authoring whenever the audio changes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Android 12+&lt;/strong&gt; ships &lt;code&gt;HapticGenerator&lt;/code&gt; — hardware-level automatic analysis, no code required. The HAL derives vibration patterns from audio directly, and the timing is exact. It's the most capable approach to audio-driven haptics that exists. It's also native-only.&lt;/p&gt;

&lt;p&gt;A few other gotchas worth knowing. Cross-platform coverage means two separate native implementations — though frameworks like Expo partially address this. &lt;code&gt;expo-haptics&lt;/code&gt; gives you a unified JS API that maps to the right native backend under the hood, which is a genuine improvement. The catch is that it only exposes preset haptic types: impact (light/medium/heavy), notification (success/warning/error), selection. It's designed for UI feedback — a tap, a confirmation, an error state — not audio analysis. If you want haptics to sync with what's actually playing, you'd still be manually triggering calls based on audio events, which is back to the same hand-authored timing problem. Audio-derived pattern analysis isn't in scope for any of these APIs. Beyond that: any audio change on iOS means re-authoring AHAP files from scratch, and every tweak ships through the app store review cycle.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The web&lt;/strong&gt; gets &lt;code&gt;navigator.vibrate(pattern)&lt;/code&gt;. You pass an array of millisecond durations alternating between vibrate and pause — &lt;code&gt;[200, 100, 200]&lt;/code&gt; means "on 200ms, off 100ms, on 200ms." The motor fires at full power for each on-duration. No amplitude parameter, no intensity control, no automatic analysis. If you want haptics to match specific moments in the audio, you write that pattern array yourself.&lt;/p&gt;
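&lt;p&gt;To make that concrete, here is a hedged sketch of what hand-authoring looks like: converting a list of hit timestamps into the alternating on/off array &lt;code&gt;navigator.vibrate&lt;/code&gt; expects. The helper name and the &lt;code&gt;pulseMs&lt;/code&gt; default are illustrative, not part of any API.&lt;/p&gt;

```javascript
// Hypothetical helper: turn hit times (ms from playback start) into the
// [on, off, on, off, ...] array navigator.vibrate() expects.
// The pattern always starts with an "on" entry, so a leading silence
// has to be encoded as a 0ms first pulse.
function hitsToPattern(hitTimesMs, pulseMs = 60) {
  const pattern = [0]; // 0ms "on" so the next entry is a pause
  let cursor = 0;
  for (const t of hitTimesMs) {
    pattern.push(Math.max(0, t - cursor)); // silence until this hit
    pattern.push(pulseMs);                 // the hit itself, full power
    cursor = t + pulseMs;
  }
  return pattern;
}

// Hits at 100ms and 300ms, 60ms pulses:
// hitsToPattern([100, 300]) → [0, 100, 60, 140, 60]
if (typeof navigator !== "undefined" && "vibrate" in navigator) {
  navigator.vibrate(hitsToPattern([100, 300]));
}
```

&lt;p&gt;Every timing in that array is maintained by hand, which is exactly the authoring burden the rest of this post is about removing.&lt;/p&gt;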

&lt;p&gt;None of that is a criticism of the web platform — &lt;code&gt;navigator.vibrate&lt;/code&gt; does exactly what it says. The gap is that there's no equivalent of HapticGenerator for the web: nothing that takes audio and derives a pattern automatically. That gap is what this library fills.&lt;/p&gt;

&lt;p&gt;A few places where it comes up:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Landing pages and product launches&lt;/strong&gt; — your Show HN link, Product Hunt page, or landing page opens in a mobile browser. There's no app to install. If you want haptics, you're building them yourself from timing arrays.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Web games&lt;/strong&gt; — browser games already have audio: explosions, impacts, pickups. The audio element is already there. Without a library, syncing haptics to it means manually mapping game events to &lt;code&gt;vibrate()&lt;/code&gt; calls and maintaining that mapping every time the audio changes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Web-based audio and video players&lt;/strong&gt; — any site that embeds audio or video with impactful sound. The &lt;code&gt;&amp;lt;audio&amp;gt;&lt;/code&gt; or &lt;code&gt;&amp;lt;video&amp;gt;&lt;/code&gt; element is already there.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Rapid prototyping for native&lt;/strong&gt; — even if you're eventually shipping iOS Core Haptics with hand-authored AHAP files, iterating in a browser first is much faster. Use the library to find what feels right, then port those timings to AHAP once you know the answer. The manual authoring step becomes a lot less painful when the creative work is already done.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;PWAs&lt;/strong&gt; — technically web, live on the home screen, run in the browser engine. &lt;code&gt;navigator.vibrate&lt;/code&gt; works identically. You could wrap a PWA in Capacitor or a native shell to access HapticGenerator and Core Haptics — but that means app store submissions, separate iOS/Android builds, and native maintenance overhead just to add haptics. Whether that trade-off is worth it comes down to one question: are haptics &lt;em&gt;the&lt;/em&gt; feature of your product, or are they an enhancement?&lt;/p&gt;

&lt;p&gt;If haptics are your core product — haptics &lt;em&gt;are&lt;/em&gt; why someone is using the app — the native path is worth the investment. The quality difference is real and it will matter to your users.&lt;/p&gt;

&lt;p&gt;But most apps aren't "haptics apps." A music player, a game, a product demo — haptics are the layer of polish on top, not the reason someone is there. A well-timed vibration on a beat, a gunshot, or a UI interaction adds to the experience. For that use case, the overhead of native builds, AHAP authoring, and cross-platform implementation costs more than the enhancement is worth. Two lines of JS get you there.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;audio-to-haptics&lt;/code&gt; fills that gap: analyze any audio URL or file, get a derived vibration pattern, attach it to a &lt;code&gt;&amp;lt;video&amp;gt;&lt;/code&gt; or &lt;code&gt;&amp;lt;audio&amp;gt;&lt;/code&gt; element — haptics fire automatically in sync with playback. Swap the audio and re-analyze; the haptics update to match. No manual authoring required.&lt;/p&gt;

&lt;p&gt;The Web Audio API gives you a &lt;code&gt;Float32Array&lt;/code&gt; of per-sample amplitude data across the full frequency spectrum to work with. Turning that into something that actually feels right on an on/off motor turns out to be harder than it looks — and produced some browser behavior findings that aren't in any documentation.&lt;/p&gt;


&lt;h2&gt;
  
  
  The algorithm
&lt;/h2&gt;

&lt;p&gt;The core problem: the Web Audio API gives you per-sample amplitude data across the full frequency spectrum, and &lt;code&gt;navigator.vibrate&lt;/code&gt; fires a motor. Deciding &lt;em&gt;when&lt;/em&gt; to fire — and &lt;em&gt;how hard&lt;/em&gt; to simulate — from raw amplitude data turns out to have a lot of wrong answers before you find a right one.&lt;/p&gt;

&lt;p&gt;The approach went through a few iterations. Fixed amplitude thresholds, bass filtering, frame-by-frame FFT analysis — each worked for some audio and broke on others. What eventually worked is a combination of three things: past-only neighbor comparison for onset detection, a sustain mechanism for decay tails, and PWM for intensity simulation. The sections below cover each in order, including why the earlier approaches failed.&lt;/p&gt;
&lt;h2&gt;
  
  
  Why fixed thresholds don't work
&lt;/h2&gt;

&lt;p&gt;The obvious first approach: fire &lt;code&gt;navigator.vibrate()&lt;/code&gt; whenever amplitude exceeds a fixed threshold. Works fine for isolated sounds. Here's what it does on real audio:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Audio&lt;/th&gt;
&lt;th&gt;Duration&lt;/th&gt;
&lt;th&gt;Haptic events (fixed threshold)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Bike rev&lt;/td&gt;
&lt;td&gt;~10s&lt;/td&gt;
&lt;td&gt;10&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Chainsaw&lt;/td&gt;
&lt;td&gt;~15s&lt;/td&gt;
&lt;td&gt;52&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Death metal drumming&lt;/td&gt;
&lt;td&gt;~30s&lt;/td&gt;
&lt;td&gt;396&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Chippin' In — Refused / Cyberpunk 2077 (guitar)&lt;/td&gt;
&lt;td&gt;~3min&lt;/td&gt;
&lt;td&gt;290&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The goal is to fire haptics on distinct moments — a beat, an impact, a transient. Haptics get their value from contrast: the motor punching on a loud moment feels meaningful because there was silence before it. For a 10-second bike rev, 10 events is reasonable — a handful of distinct peaks. For a 30-second clip, 396 means the motor is running almost continuously. There's no contrast, no punch — just a constant buzz that most users will turn off within seconds.&lt;/p&gt;

&lt;p&gt;A bass filter helps marginally but doesn't fix the root problem: sustained bass guitar still triggers constantly.&lt;/p&gt;

&lt;p&gt;Root cause: music has no absolute quiet. An absolute threshold answers "is this loud?" when the real question is "is this &lt;em&gt;louder than what just happened&lt;/em&gt;?"&lt;/p&gt;


&lt;h2&gt;
  
  
  Why FFT doesn't work either
&lt;/h2&gt;

&lt;p&gt;The next instinct is to use the Web Audio API's &lt;code&gt;AnalyserNode&lt;/code&gt; for real-time FFT — read &lt;code&gt;getByteFrequencyData()&lt;/code&gt; each RAF frame, average the bass bins (the low-frequency buckets where kicks and impacts live), and vibrate when that energy is high. Targeting specific frequency ranges seemed like a cleaner approach than raw amplitude — you could isolate the bass frequencies that actually feel like impacts and ignore the mids and highs.&lt;/p&gt;
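&lt;p&gt;A minimal sketch of that approach (bin math assumes the default 2048-point FFT at 44100Hz, so each bin spans roughly 21.5Hz and the first 8 bins cover roughly 0–172Hz; the 0.6 threshold is illustrative):&lt;/p&gt;

```javascript
// Average the low-frequency bins of an AnalyserNode frame and normalize
// to 0..1. freqData is the Uint8Array filled by getByteFrequencyData().
function bassEnergy(freqData, bassBins = 8) {
  let sum = 0;
  for (let i = 0; i < bassBins; i++) sum += freqData[i];
  return sum / bassBins / 255;
}

// Browser-only driver (sketch): vibrate whenever bass energy is high.
// const analyser = audioCtx.createAnalyser(); // fftSize defaults to 2048
// const bins = new Uint8Array(analyser.frequencyBinCount);
// (function onFrame() {
//   analyser.getByteFrequencyData(bins);
//   if (bassEnergy(bins) > 0.6) navigator.vibrate(30);
//   requestAnimationFrame(onFrame);
// })();
```

&lt;p&gt;A sustained bass note keeps &lt;code&gt;bassEnergy&lt;/code&gt; high on every frame, so this buzzes continuously: the static-threshold problem again, in a different signal.&lt;/p&gt;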

&lt;p&gt;It helped a little. But it introduced a new failure mode alongside the old one.&lt;/p&gt;

&lt;p&gt;The old problem: sustained bass notes and bass transients look identical in a single frequency frame. Both show high energy in the low bins. The static threshold problem was still there, just shifted to a different signal.&lt;/p&gt;

&lt;p&gt;The new problem: anything outside the targeted frequency range doesn't register at all, even when it clearly should. Testing against a simple beep-beep audio made this obvious — four beeps, same audio, same perceived loudness. The first three sent haptics. The fourth didn't, because it sat at a slightly different frequency and fell outside the bass bins being watched. From a haptic standpoint all four beeps are the same event. The frequency-based approach treated them as fundamentally different things.&lt;/p&gt;

&lt;p&gt;This is the core issue with targeting specific frequency ranges: you're deciding what &lt;em&gt;kind&lt;/em&gt; of sound triggers haptics, when what actually matters is whether a sound is &lt;em&gt;louder than what just happened&lt;/em&gt;. That's a relative amplitude question, not a frequency question.&lt;/p&gt;

&lt;p&gt;The next variant added frame-to-frame spike detection — instead of asking "is bass energy high?" it asked "did bass energy jump from the previous frame?" That's closer to the right question. But at 16ms per RAF frame, a transient spans 3–4 frames. The jump you're looking for arrives spread across multiple frames, and comparing frame &lt;em&gt;n&lt;/em&gt; to frame &lt;em&gt;n-1&lt;/em&gt; doesn't reliably catch it. Extending to a running average of recent frames helped at the edges but introduced its own lag.&lt;/p&gt;

&lt;p&gt;Five variants in total were tested. They all had the same structural problem: real-time frame-by-frame analysis can only see a tiny window at a time. It can't see the full &lt;em&gt;shape&lt;/em&gt; of a transient — the rise and fall across 60–120ms that makes a kick drum identifiable as a spike relative to what came before it. Each frame only knows its immediate neighbors.&lt;/p&gt;

&lt;p&gt;Pre-computed analysis sidesteps this entirely. Before playback starts, the entire audio file is processed into 60ms buckets. When the RAF loop runs, it's reading from a pre-built map — it knows exactly what the last 240ms of audio looked like before firing a single vibration. You could theoretically build something similar with real-time FFT using a rolling look-back buffer, but by this point the more fundamental problem was already clear: frequency targeting was the wrong frame for the question we were asking. Even with perfect historical context, watching bass bins specifically would still miss the fourth beep.&lt;/p&gt;


&lt;h2&gt;
  
  
  Past-only neighbor comparison
&lt;/h2&gt;

&lt;p&gt;Split the audio into ~60ms buckets (2646 samples at 44100Hz — small enough for timing precision, large enough for the motor to spin up). For each bucket, compare its peak amplitude to the mean of the last 4 buckets (~240ms of past audio):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ratio = trend.max / mean(last 4 buckets)
vibrate if ratio &amp;gt;= 1.5
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Two scenarios that illustrate why this works:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scenario A&lt;/strong&gt; — quiet past, big spike: &lt;code&gt;pastAvg = 0.08&lt;/code&gt;, &lt;code&gt;current.max = 0.29&lt;/code&gt;, ratio = 3.6x → fires&lt;br&gt;&lt;br&gt;
&lt;strong&gt;Scenario B&lt;/strong&gt; — loud past, modest rise: &lt;code&gt;pastAvg = 0.41&lt;/code&gt;, &lt;code&gt;current.max = 0.45&lt;/code&gt;, ratio = 1.1x → silent&lt;/p&gt;

&lt;p&gt;This is local edge detection applied to audio. A kick drum in a wall of guitar stands out &lt;em&gt;locally&lt;/em&gt; even when everything is globally loud.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why past-only, not symmetric neighbors?&lt;/strong&gt; Symmetric averaging pulls future silence backward into the window — it attenuates a spike's ratio by averaging in the quiet that follows. Past-only: a genuine onset always clears the threshold regardless of what comes after it. With symmetric neighbors, &lt;code&gt;spikeRatio&lt;/code&gt; needed to be ~2.0 for music but ~1.5 for isolated sounds — the same formula couldn't serve both. With past-only, 1.5 works for everything.&lt;/p&gt;
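&lt;p&gt;A runnable sketch of the pre-computed analysis step (bucketing the decoded samples, then the past-only check), with constants matching the values above; the library's internals may differ:&lt;/p&gt;

```javascript
const BUCKET_SAMPLES = 2646; // ~60ms at 44100Hz
const LOOKBACK = 4;          // ~240ms of past audio
const SPIKE_RATIO = 1.5;

// Reduce each ~60ms window of samples (e.g. from
// AudioBuffer.getChannelData(0)) to its peak absolute amplitude.
function bucketPeaks(samples) {
  const peaks = [];
  for (let start = 0; start < samples.length; start += BUCKET_SAMPLES) {
    const end = Math.min(start + BUCKET_SAMPLES, samples.length);
    let max = 0;
    for (let i = start; i < end; i++) {
      const a = Math.abs(samples[i]);
      if (a > max) max = a;
    }
    peaks.push(max);
  }
  return peaks;
}

// Does bucket i stand out against the mean of the previous LOOKBACK buckets?
function isOnset(peaks, i) {
  if (i < LOOKBACK) return false; // not enough history yet
  let sum = 0;
  for (let j = i - LOOKBACK; j < i; j++) sum += peaks[j];
  const pastAvg = sum / LOOKBACK;
  if (pastAvg === 0) return peaks[i] > 0; // don't divide by silence
  return peaks[i] / pastAvg >= SPIKE_RATIO;
}

// Scenario A: isOnset([0.08, 0.08, 0.08, 0.08, 0.29], 4) → true  (ratio ≈ 3.6)
// Scenario B: isOnset([0.41, 0.41, 0.41, 0.41, 0.45], 4) → false (ratio ≈ 1.1)
```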

&lt;p&gt;Results after switching:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Audio&lt;/th&gt;
&lt;th&gt;Duration&lt;/th&gt;
&lt;th&gt;Symmetric neighbors&lt;/th&gt;
&lt;th&gt;Past-only&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Bike rev&lt;/td&gt;
&lt;td&gt;~10s&lt;/td&gt;
&lt;td&gt;22&lt;/td&gt;
&lt;td&gt;10&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Chainsaw&lt;/td&gt;
&lt;td&gt;~15s&lt;/td&gt;
&lt;td&gt;108&lt;/td&gt;
&lt;td&gt;52&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Death metal&lt;/td&gt;
&lt;td&gt;~30s&lt;/td&gt;
&lt;td&gt;396&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;71&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Chippin' In (Refused / Cyberpunk 2077)&lt;/td&gt;
&lt;td&gt;~3min&lt;/td&gt;
&lt;td&gt;290&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;82&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The death metal clip is the clearest case: 396 events in 30 seconds is roughly 13 per second — the motor never stops, it's just on. Past-only brings that to 71, which is about 2–3 per second. For a fast drum track that's actually right: you feel distinct hits, not a constant buzz. Chippin' In tells a similar story from the other direction — 290 over 3 minutes sounds manageable until you realize that's still one haptic every 0.6 seconds for an entire song. At 82 it's roughly one every 2 seconds, which is what punctuation feels like.&lt;/p&gt;


&lt;h2&gt;
  
  
  Sustain: catching decay tails
&lt;/h2&gt;

&lt;p&gt;Past-only comparison drops off immediately after a spike's peak. But a thunderclap or an explosion has a natural decay tail — there's the initial bang, and then the rumble that follows for a second or two afterward. You want the haptics to fade with it, not cut off the moment the peak bucket passes.&lt;/p&gt;

&lt;p&gt;Natural decays are geometric: each bucket is roughly &lt;code&gt;prev × k&lt;/code&gt; for some constant &lt;code&gt;k &amp;lt; 1&lt;/code&gt;. That shape is scale-invariant, which means a percentage threshold works at any absolute amplitude. So before the spike check:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;if current.max &amp;gt;= prev.max × 0.75 → sustain (continue vibrating)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;sustainLowerBound=0.75&lt;/code&gt; catches the full decay tail. It's also self-terminating: as the chain decays geometrically, eventually &lt;code&gt;current.max &amp;lt; prev.max × 0.75&lt;/code&gt; and the chain ends naturally.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;sustainUpperBound=1.01&lt;/code&gt; blocks rising sections from being sustained through — only true decay tails qualify. The sustain check runs before the noise floor gate so quiet tails still fire.&lt;/p&gt;
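&lt;p&gt;Sketched as code, with the same assumption as before (a bucket is represented by its peak amplitude):&lt;/p&gt;

```javascript
const SUSTAIN_LOWER = 0.75; // sustainLowerBound
const SUSTAIN_UPPER = 1.01; // sustainUpperBound

// Continue vibrating through bucket i only if it looks like a decay
// tail (or a near-flat hold) relative to the bucket before it.
function isSustain(peaks, i) {
  if (i === 0) return false;
  const ratio = peaks[i] / peaks[i - 1];
  return ratio >= SUSTAIN_LOWER && ratio <= SUSTAIN_UPPER;
}

// 0.8 → 0.64 sustains (ratio 0.8); 0.8 → 0.5 ends the chain (0.625);
// 0.5 → 0.6 is a rise (1.2) and must re-earn firing via the spike check.
```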




&lt;h2&gt;
  
  
  PWM intensity simulation
&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;navigator.vibrate(duration)&lt;/code&gt; has no amplitude parameter. Simulating intensity requires PWM: shorter duty cycles for soft vibration, longer for hard.&lt;/p&gt;

&lt;p&gt;Each 20ms cycle is split into on and off based on intensity:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Intensity&lt;/th&gt;
&lt;th&gt;on&lt;/th&gt;
&lt;th&gt;off&lt;/th&gt;
&lt;th&gt;Pattern (one cycle)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;100%&lt;/td&gt;
&lt;td&gt;20ms&lt;/td&gt;
&lt;td&gt;0ms&lt;/td&gt;
&lt;td&gt;
&lt;code&gt;[20, 0, ...]&lt;/code&gt; — effectively solid on&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;75%&lt;/td&gt;
&lt;td&gt;15ms&lt;/td&gt;
&lt;td&gt;5ms&lt;/td&gt;
&lt;td&gt;&lt;code&gt;[15, 5, 15, 5, ...]&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;50%&lt;/td&gt;
&lt;td&gt;10ms&lt;/td&gt;
&lt;td&gt;10ms&lt;/td&gt;
&lt;td&gt;&lt;code&gt;[10, 10, 10, 10, ...]&lt;/code&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;25%&lt;/td&gt;
&lt;td&gt;5ms&lt;/td&gt;
&lt;td&gt;15ms&lt;/td&gt;
&lt;td&gt;
&lt;code&gt;[5, 15, ...]&lt;/code&gt; — motor stalls, too short to spin up&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;That pattern repeats to fill the full chain duration. A 200ms chain at 75% becomes &lt;code&gt;[15, 5, 15, 5, 15, 5, ...]&lt;/code&gt; — ten 20ms on/off cycles, the motor switching fast enough that inertia smooths it into perceived partial amplitude. The same principle as hardware PWM motor control, implemented in JS.&lt;/p&gt;
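&lt;p&gt;A sketch of the pattern generation (the cycle length and rounding are assumptions about the approach, not the library's exact internals):&lt;/p&gt;

```javascript
const CYCLE_MS = 20;

// Build an alternating [on, off, on, off, ...] array approximating
// `intensity` (0..1) over `totalMs` of vibration.
function pwmPattern(intensity, totalMs) {
  const on = Math.round(CYCLE_MS * intensity);
  const off = CYCLE_MS - on;
  const pattern = [];
  for (let t = 0; t < totalMs; t += CYCLE_MS) pattern.push(on, off);
  return pattern;
}

// pwmPattern(0.75, 200) → [15, 5, 15, 5, ...]: ten 20ms cycles
// pwmPattern(0.5, 40)   → [10, 10, 10, 10]
```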

&lt;p&gt;Two firing modes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Short chains&lt;/strong&gt; (&amp;lt; 4 buckets, ~240ms): solid &lt;code&gt;[remainingMs]&lt;/code&gt; pulse. Transients need impact; there's no room for PWM texture in a 1–2 bucket chain.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Long chains&lt;/strong&gt;: PWM pattern. Sustained audio gets textured vibration at &lt;code&gt;intensity × duty cycle&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;intensityFloor=0.5&lt;/code&gt; because below ~40% duty cycle the motor stalls rather than spinning weakly.&lt;/p&gt;

&lt;p&gt;Credit: PWM for pseudo-intensity on the web was first demonstrated by &lt;a href="https://haptics.lochie.me" rel="noopener noreferrer"&gt;web-haptics by Lochie&lt;/a&gt;. &lt;code&gt;audio-to-haptics&lt;/code&gt; applies the same technique to audio-driven timing rather than manually authored patterns.&lt;/p&gt;




&lt;h2&gt;
  
  
  Browser behavior worth knowing
&lt;/h2&gt;

&lt;p&gt;All three behaviours below are technically documented: the latency values behind the mute window are queryable via &lt;code&gt;AudioContext&lt;/code&gt;, the &lt;code&gt;decodeAudioData&lt;/code&gt; transfer behaviour is in the Web Audio spec, and MDN acknowledges that pattern arrays get truncated. All are easy to miss, though, and the specific truncation limit isn't stated anywhere; you only find the number by hitting it. Everything here was found through testing on Android Chrome on a budget Samsung Galaxy phone.&lt;/p&gt;

&lt;h3&gt;
  
  
  The mute window
&lt;/h3&gt;

&lt;p&gt;When you call &lt;code&gt;audio.play()&lt;/code&gt;, &lt;code&gt;audioEl.currentTime&lt;/code&gt; starts advancing immediately — but the audio hardware pipeline has latency. The sound doesn't physically reach the speakers for &lt;code&gt;outputLatency + baseLatency&lt;/code&gt; milliseconds. If you fire haptics the moment &lt;code&gt;currentTime&lt;/code&gt; hits a spike in the waveform, the vibration fires before the associated sound does. Early haptics feel awful — instead of a satisfying punch that lands with the audio, you get a phantom vibration followed by the sound a beat later. The sync illusion breaks completely. The library suppresses all &lt;code&gt;navigator.vibrate()&lt;/code&gt; calls for exactly &lt;code&gt;outputLatency + baseLatency&lt;/code&gt; ms after every play, seek, and pause to keep haptics aligned with the audio.&lt;/p&gt;

&lt;p&gt;The GIFs below show when sound plays and when haptics fire. Since GIFs can't carry audio, each one visualises both events in the UI instead. To reproduce it yourself, clone the &lt;a href="https://github.com/AdityaAnuragi/audio-to-haptics/tree/audio-haptic-desync-issue" rel="noopener noreferrer"&gt;issue&lt;/a&gt; branch and the &lt;a href="https://github.com/AdityaAnuragi/audio-to-haptics/tree/mute-window-to-sync-audio-and-haptics" rel="noopener noreferrer"&gt;mute window&lt;/a&gt; branch, then run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install
&lt;/span&gt;npm run dev
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8nctajfpmypfk9nbg6n0.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8nctajfpmypfk9nbg6n0.gif" alt=" " width="480" height="853"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As the gif above shows, the haptics fire before the sound even starts playing.&lt;/p&gt;

&lt;p&gt;For an even more exaggerated example: rapidly press play/pause and the haptics still fire even though the audio is never heard at all.&lt;/p&gt;

&lt;h3&gt;
  
  
  Now here's the same audio with the mute window implemented
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmszvlpiaf0ti16lfj5qw.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmszvlpiaf0ti16lfj5qw.gif" alt=" " width="560" height="996"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With the mute window in place, the haptics line up far better with the audio being played.&lt;/p&gt;

&lt;p&gt;This won't be a noticeable difference on a flagship phone. Older devices, however, have more delay, and left uncompensated it makes for an extremely poor UX. The phone shown above is a Samsung Galaxy M32 (approximately 4 years old now).&lt;/p&gt;

&lt;p&gt;[PS: why did I record this with another phone instead of screen capture? I recorded an mp4 on a second device so you could hear the vibration rattling against the table, only to realise I can't upload mp4 here, at least not directly 😅]&lt;/p&gt;

&lt;p&gt;Both values are properties on the same &lt;code&gt;AudioContext&lt;/code&gt; you're already using to decode audio — &lt;code&gt;ctx.baseLatency&lt;/code&gt; is the processing latency of the browser's own audio graph, and &lt;code&gt;ctx.outputLatency&lt;/code&gt; is the estimated delay between the browser handing a buffer to the system and the first sample reaching the output device. Together they give you the total delay between &lt;code&gt;currentTime&lt;/code&gt; advancing and sound reaching the speakers:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;ctx&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;AudioContext&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;muteWindowMs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;ctx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;outputLatency&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="nx"&gt;ctx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;baseLatency&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This varies significantly by device. On the budget Samsung tested: ~168ms + ~171ms = ~339ms total. Modern flagship Android devices can sit at sub-3ms — over 100× lower. Without this compensation, haptics that feel roughly synced on a flagship will fire noticeably early on older budget hardware. More noticeable on older devices, but worth fixing regardless.&lt;/p&gt;
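&lt;p&gt;The suppression itself can be sketched as a small gate. The clock is injected here for testability; real code would just use &lt;code&gt;performance.now()&lt;/code&gt;, and the shape of the API is mine, not the library's:&lt;/p&gt;

```javascript
// Gate that mutes haptics for muteWindowMs after any play/seek/pause.
function makeVibrateGate(muteWindowMs, now = () => performance.now()) {
  let mutedUntil = -Infinity;
  return {
    onTransportEvent() { mutedUntil = now() + muteWindowMs; }, // play/seek/pause
    canVibrate() { return now() >= mutedUntil; },
  };
}

// Usage sketch (browser):
// const gate = makeVibrateGate((ctx.outputLatency + ctx.baseLatency) * 1000);
// audioEl.addEventListener("play", () => gate.onTransportEvent());
// ...then check gate.canVibrate() before every navigator.vibrate() call.
```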

&lt;h3&gt;
  
  
  &lt;code&gt;decodeAudioData&lt;/code&gt; transfers the ArrayBuffer
&lt;/h3&gt;

&lt;p&gt;The Web Audio API &lt;em&gt;transfers&lt;/em&gt; the &lt;code&gt;ArrayBuffer&lt;/code&gt; into the decoder rather than copying it. Once &lt;code&gt;decodeAudioData&lt;/code&gt; is called, the original reference is detached and its &lt;code&gt;byteLength&lt;/code&gt; drops to 0.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;buffer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetchArrayBuffer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;ctx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;decodeAudioData&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;buffer&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="c1"&gt;// buffer.byteLength === 0 — it's empty now&lt;/span&gt;

&lt;span class="c1"&gt;// Re-analyzing with different options requires cloning first:&lt;/span&gt;
&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;ctx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;decodeAudioData&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;buffer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;slice&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;ctx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;decodeAudioData&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;buffer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;slice&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is spec-compliant behavior but easy to miss. The symptom is silent or garbage analysis on the second call with no error thrown — the kind of bug that takes a while to trace.&lt;/p&gt;

&lt;h3&gt;
  
  
  Android pattern array limit
&lt;/h3&gt;

&lt;p&gt;Chrome caps &lt;code&gt;navigator.vibrate()&lt;/code&gt; pattern arrays at around 128 entries in practice. The W3C Vibration API leaves the maximum length implementation-defined, and MDN acknowledges truncation happens without giving a number. A 45-bucket sustained chain (45 × 60ms = 2700ms) at 50% intensity generates &lt;code&gt;[10, 10] × 135&lt;/code&gt; = 270 entries — well over the ~128 limit, so the motor stops mid-chain.&lt;/p&gt;

&lt;p&gt;In practice this is fine — nobody wants 10 seconds of continuous haptics anyway. But if you're working with long sustained events, cap the pattern comfortably under the ~128-entry limit or re-fire mid-chain.&lt;/p&gt;
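&lt;p&gt;A sketch of the capping; the 120-entry ceiling is a safety margin under the observed ~128, not a documented constant:&lt;/p&gt;

```javascript
const MAX_ENTRIES = 120; // margin under the ~128-entry truncation seen on Chrome

// Trim a vibrate() pattern so it survives Chrome's truncation, keeping
// whole on/off pairs so the pattern still ends on a pair boundary.
function capPattern(pattern) {
  if (pattern.length <= MAX_ENTRIES) return pattern;
  return pattern.slice(0, MAX_ENTRIES - (MAX_ENTRIES % 2));
}

// The 45-bucket chain from above: 270 entries in, 120 out.
```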




&lt;h2&gt;
  
  
  The API
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;useRef&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;useEffect&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;useHaptics&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;audio-to-haptics/react&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;VideoPlayer&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;src&lt;/span&gt; &lt;span class="p"&gt;}:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nl"&gt;src&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;videoRef&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;useRef&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;HTMLVideoElement&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;analyze&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;useHaptics&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;videoRef&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="nf"&gt;useEffect&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="k"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;analyze&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;src&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;src&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;video&lt;/span&gt; &lt;span class="na"&gt;ref&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;videoRef&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;controls&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Haptics fire automatically on play and stop on pause or seek. The hook calls &lt;code&gt;detach()&lt;/code&gt; on unmount.&lt;/p&gt;

&lt;p&gt;For non-React projects, the vanilla JS API works the same way:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;video&lt;/span&gt; &lt;span class="na"&gt;id=&lt;/span&gt;&lt;span class="s"&gt;"player"&lt;/span&gt; &lt;span class="na"&gt;controls&lt;/span&gt; &lt;span class="na"&gt;src=&lt;/span&gt;&lt;span class="s"&gt;"your-video.mp4"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&amp;lt;/video&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;HapticEngine&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;audio-to-haptics&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;video&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getElementById&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;player&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;engine&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;HapticEngine&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;engine&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;analyze&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;video&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;src&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="nx"&gt;engine&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;attach&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;video&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// call when done — stops the RAF loop, cancels vibration, removes listeners&lt;/span&gt;
&lt;span class="c1"&gt;// unlike the React hook, this won't happen automatically&lt;/span&gt;
&lt;span class="nx"&gt;engine&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;detach&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Works on Android Chrome, Samsung Internet, and Opera Mobile. Desktop browsers either don't ship the Vibration API or have no motor to drive, so haptics are effectively mobile-only.&lt;/p&gt;
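&lt;p&gt;Given that support story, it's worth feature-detecting before doing any analysis work. A small sketch — nothing here is specific to the library:&lt;/p&gt;

```javascript
// Detect the Vibration API before spending time on haptic analysis.
function supportsHaptics() {
  return typeof navigator !== 'undefined' &&
    typeof navigator.vibrate === 'function';
}

if (supportsHaptics()) {
  // safe to analyze and attach; on desktop this branch is simply skipped
}
```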




&lt;h2&gt;
  
  
  Results
&lt;/h2&gt;

&lt;p&gt;A library that lets you sync your phone's vibration motor with any audio or video playing in the browser: point it at an &lt;code&gt;&amp;lt;audio&amp;gt;&lt;/code&gt; or &lt;code&gt;&amp;lt;video&amp;gt;&lt;/code&gt; element and the haptics follow automatically. A thunderclap, a punch landing, a jump scare, an explosion fading into rumble: whatever the audio has, the haptics track it. On the web, with two lines of code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://audio-to-haptics.pages.dev/" rel="noopener noreferrer"&gt;Try it here&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;The core insight that made this work, comparing each audio bucket to its recent past rather than to an absolute threshold, turned out to be the same principle as edge detection. Once that clicked, the rest followed: sustain for decay tails, PWM for intensity, pre-computed analysis for timing precision. The platform findings along the way (audio pipeline latency, &lt;code&gt;decodeAudioData&lt;/code&gt;'s transfer semantics, the pattern array cap) were the kind of thing you only find by testing on real hardware.&lt;/p&gt;

&lt;p&gt;If you're building something on the web that already has audio, adding haptics is two lines. Give it a try on Android and see how it feels — it's one of those things that's hard to appreciate until you actually feel it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://audio-to-haptics.pages.dev/usage/" rel="noopener noreferrer"&gt;Read the full docs here&lt;/a&gt;&lt;/p&gt;

</description>
      <category>mobile</category>
      <category>javascript</category>
      <category>webdev</category>
      <category>showdev</category>
    </item>
  </channel>
</rss>
