DEV Community

TommyDee

I just open-sourced 13 MIT libraries for web visual effects

After months of work, today I shipped Vysmo — a set of MIT-licensed libraries for web visual computing. All 13 packages are now on npm under @vysmo.

What's in it

  • @vysmo/transitions — 60 WebGL2 transition shaders defined as plain data. Includes a mesh-based page-curl with drag-scrub mid-flip, polygon flip, and classic crossfades/wipes. Tree-shakable to the byte.
  • @vysmo/text — Multi-property choreographed text animation with 300+ presets. Grapheme-safe splitting via Intl.Segmenter works for emoji, Arabic, and Devanagari.
  • @vysmo/effects — WebGL filter primitives (blur, bloom, glow, vignette, chromatic aberration, color grading).
  • @vysmo/easings — 40+ named curves, parametric builders (spring, bezier, wiggle, rough), composition modifiers, CSS export, reduced-motion helpers.
  • @vysmo/scroll — scroll-driven primitives that compose with transitions and effects.
  • @vysmo/flipbook — drag-scrub page-flip component built on the page-curl shader.
  • @vysmo/slideshow — image slideshow with opt-in chrome, drives any of the 60 transitions.
  • @vysmo/animations — value-based tweening: animate(), spring(), timeline(), interpolate().
  • @vysmo/gl-core — shared WebGL2 plumbing.
  • Plus React wrappers: @vysmo/transitions-react, @vysmo/text-react, @vysmo/flipbook-react, @vysmo/slideshow-react.

Design principles

  • Zero runtime dependencies in every package.
  • SSR-safe at module load — enforced by a Node import test per package.
  • Headless-first — components are opt-in wrappers around a vanilla TS core. The same code drives canvas, image, and video sources.
  • Plain-data API for transitions, text, and effects so the same definition can drive DOM today and a canvas renderer later.
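
The "Node import test" idea can be illustrated without the packages themselves: in plain Node there is no `window` or `document`, so any module that touches a browser global at module scope throws the moment it is evaluated. A minimal sketch of such a check (the helper and the two inline sample modules are mine, not Vysmo's actual test harness):

```typescript
// Minimal sketch of an SSR-safety import check: a module is "SSR-safe
// at module load" if plain Node can evaluate it without throwing.
// Two inline modules (via data: URLs) stand in for real packages.
const ssrSafe = "export const render = () => document.title;"; // lazy access: fine
const ssrUnsafe = "export const title = document.title;";      // load-time access: throws in Node

async function importsCleanly(source: string): Promise<boolean> {
  const url = "data:text/javascript," + encodeURIComponent(source);
  try {
    await import(url); // module scope executes here
    return true;
  } catch {
    return false; // e.g. ReferenceError: document is not defined
  }
}

async function main() {
  console.log(await importsCleanly(ssrSafe));   // true
  console.log(await importsCleanly(ssrUnsafe)); // false
}
main();
```

A real harness would `import()` each published package entry point instead of an inline string, but the pass/fail condition is the same.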

Quick example

Transition between two images using one of the 60 WebGL transitions (here, paintBleed):

```typescript
import { Runner, paintBleed } from "@vysmo/transitions";
import { animate } from "@vysmo/animations";

const canvas = document.querySelector<HTMLCanvasElement>("canvas")!;
const runner = new Runner({ canvas });

// Decode both images up front so the first rendered frame isn't blank.
const fromImg = new Image();
const toImg = new Image();
fromImg.src = "/photo-a.jpg";
toImg.src = "/photo-b.jpg";
await Promise.all([fromImg.decode(), toImg.decode()]);

// Tween progress from 0 to 1 and repaint the transition each frame.
animate({
  from: 0,
  to: 1,
  duration: 1200,
  onUpdate: (p) => runner.render(paintBleed, {
    from: fromImg,
    to: toImg,
    progress: p,
  }),
});
```

The transition is just plain data describing how to interpolate between two textures. The Runner handles WebGL plumbing.
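
The post doesn't show what such a definition looks like. As a hypothetical sketch only (field names and shader helpers are assumed, following the common gl-transitions-style convention, not Vysmo's actual schema), a plain-data crossfade could be as small as:

```typescript
// Hypothetical plain-data transition definition. The fields and the
// getFromColor/getToColor/progress helpers follow the widely used
// gl-transitions convention; Vysmo's real schema may differ.
interface TransitionDef {
  name: string;
  glsl: string;                       // fragment snippet blending from -> to
  defaults?: Record<string, number>;  // tweakable uniforms
}

const crossfade: TransitionDef = {
  name: "crossfade",
  glsl: `
    vec4 transition(vec2 uv) {
      return mix(getFromColor(uv), getToColor(uv), progress);
    }
  `,
  defaults: {},
};

// Because it's inert data, the same object can be serialized, diffed,
// or handed to any renderer that understands the contract.
console.log(crossfade.name);
```

The appeal of this style is that a definition carries no GL state of its own, so it tree-shakes like any other constant and can target different renderers.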


This is the first npm release (0.1.0), so APIs may still shift before 1.0. I'd love feedback on:

  • API design
  • Bugs or weird behavior in the playgrounds
  • Demos or use-cases you'd want to see covered

Thanks for reading.
