Paul H

TinyNN: A Fast, Zero-Dependency Neural Network Library for Node.js

A lightweight neural network library optimized for CPU performance. No dependencies, pure JavaScript. Trains MNIST to 91% accuracy in under a minute. Built for developers learning AI who want to understand the underlying mechanics without the overhead of heavy frameworks.

What It Is

TinyNN is a feedforward neural network implementation in ~500 lines of code. It uses typed arrays (Float64Array) and optimized loops for performance.
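
As a rough illustration (not TinyNN's actual source), a dense-layer forward pass over flat Float64Array buffers can look like this, with the weight matrix stored row-major so the inner loop reads memory sequentially:

// Hypothetical dense layer: weights[o * inSize + i] connects input i to output o
function forwardLayer(input, weights, biases, inSize, outSize) {
    const output = new Float64Array(outSize);
    for (let o = 0; o < outSize; o++) {
        const rowStart = o * inSize;   // cache the row offset outside the inner loop
        let sum = biases[o];
        for (let i = 0; i < inSize; i++) {
            sum += weights[rowStart + i] * input[i];
        }
        output[o] = Math.max(0, sum);  // ReLU activation
    }
    return output;
}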

Key Features

  • Zero dependencies - No external packages required
  • CPU optimized - Typed arrays and cache-friendly loops
  • Small footprint - 8.4 KB compressed
  • Educational - Clean, commented code you can actually read

Quick Example

import tinynn from 'tinynn';
import { relu } from 'tinynn/utils';

// Create network: 784 inputs, two hidden layers of 64, 10 outputs
const network = tinynn([784, 64, 64, 10], relu);

// Train on one mini-batch: each item holds normalized pixels and a digit label
for (const image of trainingBatch) {
    network.train(image.pixels, image.label);
}

// Apply the accumulated gradients: learning rate 0.005, averaged over the batch
network.updateWeights(0.005, trainingBatch.length);

MNIST Demo

The package includes a complete MNIST handwritten digit recognition demo with a proper train/test split (80/20).
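
The split itself is nothing exotic: shuffle the 60,000 labelled images once, keep 80% for training, and hold out the remaining 20% for testing. A generic sketch of that step (not the demo's actual code):

// Fisher-Yates shuffle followed by an 80/20 slice: 48,000 train / 12,000 test
function splitDataset(images, trainFraction = 0.8) {
    const shuffled = [...images];
    for (let i = shuffled.length - 1; i > 0; i--) {
        const j = Math.floor(Math.random() * (i + 1));
        [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
    }
    const cut = Math.floor(shuffled.length * trainFraction);
    return { train: shuffled.slice(0, cut), test: shuffled.slice(cut) };
}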

npm install tinynn
npm run demo  # Train on 48,000 images
npm test      # Test on 12,000 images (held-out)

Performance: Trains at ~1,000 images/second with batch size 20. Full training (48,000 images) completes in under a minute on standard hardware, reaching 91%+ accuracy on test data.
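
If you want to sanity-check that throughput figure on your own machine, you can time the loop yourself. The sketch below reuses only the calls from the quick example above and assumes trainingData is an array of { pixels, label } objects:

const batchSize = 20;
const start = Date.now();
let seen = 0;

for (let i = 0; i < trainingData.length; i += batchSize) {
    const batch = trainingData.slice(i, i + batchSize);
    for (const image of batch) {
        network.train(image.pixels, image.label);
    }
    network.updateWeights(0.005, batch.length);
    seen += batch.length;
}

const seconds = (Date.now() - start) / 1000;
console.log(`${Math.round(seen / seconds)} images/second`);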

Training shows:

  • Data loading and normalization (see the sketch after this list)
  • Mini-batch gradient descent
  • Real-time loss and accuracy tracking
  • Automatic weight saving
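
Normalization here just means scaling the raw 0-255 pixel bytes into the 0-1 range before they reach the network. A minimal sketch:

// Convert one raw MNIST image (784 bytes) into a normalized Float64Array input
function normalize(rawPixels) {
    const pixels = new Float64Array(rawPixels.length);
    for (let i = 0; i < rawPixels.length; i++) {
        pixels[i] = rawPixels[i] / 255;
    }
    return pixels;
}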

Testing displays:

  • Overall accuracy on held-out data
  • Per-digit accuracy breakdown
  • Confusion patterns (which digits get mistaken for others; sketched after this list)
  • Best and worst performing digits
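
The confusion breakdown is plain bookkeeping over predicted vs. true labels, independent of TinyNN's API. A generic sketch, assuming you already have two arrays of digits:

// Build a 10x10 confusion matrix and per-digit accuracy from label arrays
function confusionReport(predicted, actual) {
    const matrix = Array.from({ length: 10 }, () => new Array(10).fill(0));
    for (let i = 0; i < actual.length; i++) {
        matrix[actual[i]][predicted[i]]++;   // row = true digit, column = prediction
    }
    return matrix.map((row, digit) => {
        const total = row.reduce((a, b) => a + b, 0);
        const wrong = row.map((count, d) => ({ d, count }))
                         .filter((x) => x.d !== digit)
                         .sort((a, b) => b.count - a.count)[0];
        return { digit, accuracy: row[digit] / total, mostConfusedWith: wrong.d };
    });
}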

Example test output:

Overall Accuracy: 91.03%

Digit | Accuracy | Most Confused With
  1   |  96.9%   | 8 (15 times)
  0   |  95.7%   | 5 (16 times)
  3   |  84.1%   | 5 (72 times)

Technical Implementation

Optimizations applied:

  • Float64Array for numerical operations (faster than regular arrays)
  • Cached array references in hot loops
  • Pre-computed learning rate factors
  • Single-pass softmax computation (one version is sketched after this list)
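
As an example of the last point, a numerically stable softmax can fold the exponentiation and the normalizing sum into the same loop over the output logits (a sketch, not the library's internals):

function softmax(logits) {
    const out = new Float64Array(logits.length);
    let max = -Infinity;
    for (let i = 0; i < logits.length; i++) {
        if (logits[i] > max) max = logits[i];
    }
    let sum = 0;
    for (let i = 0; i < logits.length; i++) {
        const e = Math.exp(logits[i] - max);   // subtract max for numerical stability
        out[i] = e;
        sum += e;
    }
    const inv = 1 / sum;                       // pre-computed factor, applied once
    for (let i = 0; i < out.length; i++) out[i] *= inv;
    return out;
}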

Architecture:

  • He weight initialization (optimized for ReLU; sketched after this list)
  • Mini-batch gradient descent
  • Backpropagation with chain rule
  • Softmax + cross-entropy loss
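
He initialization draws each weight from a normal distribution with standard deviation sqrt(2 / fanIn), which keeps ReLU activations from shrinking or blowing up as layers stack. A sketch of one way to do it in plain JavaScript (not necessarily how TinyNN does it):

// He initialization for a layer with fanIn inputs and fanOut outputs
function heInit(fanIn, fanOut) {
    const std = Math.sqrt(2 / fanIn);
    const weights = new Float64Array(fanIn * fanOut);
    for (let i = 0; i < weights.length; i++) {
        // Box-Muller transform: two uniform samples -> one Gaussian sample
        const u1 = Math.random() || Number.EPSILON;
        const u2 = Math.random();
        weights[i] = std * Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
    }
    return weights;
}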

Purpose

Education - Understand neural networks by reading clean, well-documented code. Perfect for students and developers learning AI fundamentals.

Small Applications - Deploy models where you don't need distributed training or GPU acceleration. Ideal for:

  • Single-server deployments
  • Serverless functions (AWS Lambda, Cloudflare Workers)
  • Simple classification tasks (digit recognition, basic text classification)
  • Applications where simplicity and zero dependencies matter more than scale

With 91% accuracy in under a minute of training, it's fast enough for real-world use cases that don't require massive datasets.

Installation

npm install tinynn

Requirements: Node.js >= 18.0.0

Repository

GitHub: github.com/paulhodel/tinynn
NPM: npm install tinynn
License: MIT
