Nithin Bharadwaj
How to Optimize JavaScript Build Process: Tree Shaking, Code Splitting, and Performance Techniques


Understanding how to make your JavaScript applications faster and more efficient starts with the build process. I've spent years working with complex codebases, and the right build optimizations often make the difference between a sluggish application and a smooth one. Let’s walk through several practical methods to improve your tooling, focusing on clear explanations and code you can use.

Knowing What's in Your Bundle

Before you can optimize, you need to see what you're working with. A bundle analyzer is like an X-ray for your final JavaScript files. It shows you exactly which parts of your code are taking up the most space. I often start a project by setting one up. It helps me catch problems early, like a massive library I accidentally included twice. The goal is to understand the weight of each module.

Here’s a basic structure for an analyzer that you can integrate into your build process. It reads your bundled file and breaks it down.

// A simple analyzer to understand bundle composition
const fs = require('fs');

class SimpleBundleAuditor {
  constructor() {
    this.moduleSizes = new Map();
  }

  async auditFile(filePath) {
    const code = await fs.promises.readFile(filePath, 'utf-8');
    const sizeInBytes = Buffer.byteLength(code, 'utf-8');
    console.log(`File: ${filePath} - Total: ${(sizeInBytes / 1024).toFixed(2)} KB`);

    // A naive but useful regex to find potential module boundaries in a bundled file.
    const modulePattern = /\(function\(module,\s*exports[^)]*\)\s*{([\s\S]*?)}\)\(/g;
    let moduleMatch;
    let index = 0;

    while ((moduleMatch = modulePattern.exec(code)) !== null) {
      const moduleContent = moduleMatch[1];
      const moduleSize = Buffer.byteLength(moduleContent, 'utf-8');
      const moduleId = `module_${index}`;
      this.moduleSizes.set(moduleId, moduleSize);
      console.log(`  ${moduleId}: ${(moduleSize / 1024).toFixed(2)} KB`);
      index++;
    }
  }
}

// Use it (top-level await requires an ES module or an async wrapper)
const auditor = new SimpleBundleAuditor();
await auditor.auditFile('./dist/main.bundle.js');

Running this will give you a rough map. For a real project, you'd use established tools like webpack-bundle-analyzer, but building a simple version helps you understand the principle. The key is seeing which chunks are the largest. Is it your own code, a UI library, or a utility like lodash? Once you know, you can act.
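In practice I reach for webpack-bundle-analyzer once the principle is clear. A minimal setup, assuming the package is installed as a dev dependency, looks like this:

```javascript
// webpack.config.js — analyzer setup, assumes
// `npm install --save-dev webpack-bundle-analyzer` has been run
const { BundleAnalyzerPlugin } = require('webpack-bundle-analyzer');

module.exports = {
  // ... other config
  plugins: [
    new BundleAnalyzerPlugin({
      analyzerMode: 'static',              // write an HTML report instead of starting a server
      reportFilename: 'bundle-report.html',
      openAnalyzer: false                  // don't auto-open a browser in CI
    })
  ]
};
```

The production build then emits an interactive treemap you can open in any browser.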

Removing Code That Goes Nowhere

Tree shaking is a fancy term for a simple idea: don't ship code that is never used. If you import a whole library but only use one function, your build tool should ideally include just that function. This sounds automatic, but it often needs the right setup to work well. It requires your code and your dependencies to be written in a way that allows the tool to see what's unnecessary.

Imagine a utility library is a toolbox. Tree shaking lets you take out just the hammer and screwdriver you need, instead of carrying the whole heavy box to the job site. Here’s how I configure Webpack to be aggressive about it.

// webpack.config.js - Optimizing for tree shaking
module.exports = {
  mode: 'production',
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: 'babel-loader',
          options: {
            // Preset for modern JS, crucial for tree-shaking
            presets: [
              ['@babel/preset-env', { modules: false }] // 'modules: false' is key
            ]
          }
        }
      }
    ]
  },
  optimization: {
    usedExports: true, // Marks unused code
    minimize: true,    // Removes marked code
    sideEffects: true  // Honor the "sideEffects" flag in package.json to prune whole modules
  }
};

For this to work, your package.json must also signal which files are safe to shake.

{
  "name": "my-library",
  "sideEffects": false
}

If you have a file that causes a side effect just by being imported (like a polyfill), you can list it:

{
  "sideEffects": ["./src/polyfill.js"]
}

I’ve seen bundle sizes drop by 30% or more after properly configuring tree shaking. The trick is to verify it's working. Build your project and search for a module you're sure isn't used. If you can't find it, the shake was successful.

Splitting Code for Faster Loading

Sending one giant JavaScript file to a user's browser means they wait for everything to download before they can interact with anything. Code splitting changes that. It’s like splitting a book into chapters; the reader can start with Chapter 1 while the rest loads in the background. This improves the initial load time dramatically.

There are two main ways: splitting by route and splitting by component. Vendor splitting is also vital—it separates your code from third-party library code. Since vendor code changes less often, browsers can cache it separately.

Here is a Webpack configuration that sets up intelligent splitting.

// webpack.config.js - Advanced code splitting
module.exports = {
  // ... other config
  optimization: {
    splitChunks: {
      chunks: 'all',
      minSize: 20000, // Only chunk files over 20KB
      cacheGroups: {
        vendors: {
          test: /[\\/]node_modules[\\/]/, // Group all node_modules
          name: 'vendors',
          priority: 10 // High priority
        },
        reactBundle: {
          test: /[\\/]node_modules[\\/](react|react-dom)[\\/]/,
          name: 'react.bundle',
          priority: 20 // Even higher for React
        },
        shared: {
          minChunks: 2, // Code used in at least 2 entry points
          name: 'common',
          priority: 5
        }
      }
    }
  }
};

For dynamic splitting within your app, use dynamic imports. This tells the build tool to create a separate chunk for a module.

// In your React component, load a heavy feature only when needed
const HeavyChartComponent = React.lazy(() => import('./components/HeavyChartComponent'));

function Dashboard() {
  return (
    <div>
      <h1>Dashboard</h1>
      <React.Suspense fallback={<div>Loading chart...</div>}>
        <HeavyChartComponent />
      </React.Suspense>
    </div>
  );
}

When you build, HeavyChartComponent will be in its own file. It won't download until the user navigates to the Dashboard. I use this for admin panels, modals, and any feature not essential for the first screen.
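The same mechanism works outside React. A dynamic import() returns a promise for the module namespace, and bundlers create a separate chunk for every import() call site. A runnable sketch, using a Node built-in as a stand-in for a heavy local module:

```javascript
// Dynamic import() works in plain Node too; bundlers emit a chunk per call site.
async function loadHeavyFeature() {
  // 'node:path' stands in for something like './heavy-chart.js'
  const mod = await import('node:path');
  return mod.join('dist', 'main.js');
}

loadHeavyFeature().then((result) => console.log(result));
```

The await point is exactly where the network request for the new chunk would happen in a browser.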

Using a Persistent Cache

If you've ever been frustrated waiting for a full rebuild after changing one line of code, you need a persistent cache. It allows your build tool to remember the work it did on the parts of your code that haven't changed. The next build is much faster because it only processes the new or modified files.

Think of it like cooking a complex meal. If you already have the chopped vegetables from yesterday stored in the fridge (the cache), you don't need to chop them again today. You just cook the new meat. Webpack 5 introduced a built-in, filesystem-based cache that's simple to enable.

// webpack.config.js - Enabling persistent cache
module.exports = {
  // ... other config
  cache: {
    type: 'filesystem',
    buildDependencies: {
      config: [__filename], // Invalidate cache if webpack config changes
    },
    version: '1.0' // Change this to invalidate all cache
  }
};

For a task runner like Gulp, you can create a simple cache mechanism for expensive operations like image compression.

const gulp = require('gulp');
const imagemin = require('gulp-imagemin');
const cache = require('gulp-cache');

function optimizeImages() {
  return gulp.src('src/images/*')
    .pipe(cache(imagemin())) // Only processes changed images
    .pipe(gulp.dest('dist/images'));
}

// A task to clear the cache if needed
function clearCache(done) {
  return cache.clearAll(done);
}

After implementing this, you'll notice rebuilds are often seconds instead of minutes. It's one of the highest-impact changes for developer happiness on large projects.

Building Only What You Need

In development, you often work on one part of a large app. Lazy compilation takes advantage of this by only building the pages or modules you're actively using. When you navigate to a new route, it builds that part on the fly. It makes starting the development server almost instant.

Webpack supports this through its lazyCompilation option. It's perfect for applications with dozens of routes.

// webpack.config.js - For fast development startups
module.exports = {
  // ... other config for development
  devServer: {
    hot: true,
  },
  experiments: {
    lazyCompilation: {
      entries: false, // Don't lazy compile the entry point
      imports: true,  // Lazy compile dynamic imports
      backend: {
        // Specify a backend for the lazy compilation server
        listen: [{
          host: 'localhost',
          port: 8081
        }]
      }
    }
  }
};

When you run npm start, the server starts immediately. As you click around your app, you'll see network requests for new chunks as they are compiled for the first time. The trade-off is a slight delay when first visiting a new section, but the gain in initial startup time is worth it. I find this incredibly useful when working on a specific feature buried deep in the app.

Creating Your Own Processing Steps

Sometimes, the standard loaders don't do exactly what you need. A custom loader lets you intercept and transform source code as it passes through the build pipeline. I once needed to automatically replace a version string from package.json into hundreds of source files. A custom loader solved it in ten lines.

A loader is just a function that receives the source file content and returns transformed content. Here’s a loader that adds a custom header comment to every JavaScript file.

// loaders/add-header-loader.js
const packageJson = require('../package.json');

module.exports = function(source) {
  const header = `/* Application: ${packageJson.name} - Version: ${packageJson.version} */\n`;
  return header + source;
};

You then use it in your Webpack configuration.

// webpack.config.js
const path = require('path');

module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: [
          'babel-loader',
          {
            loader: path.resolve(__dirname, 'loaders/add-header-loader.js')
          }
        ]
      }
    ]
  }
};

Now every built file will have that comment at the top. You can write loaders to do almost anything: optimize SVGs, inject environment variables, or strip out debug statements for production. They give you fine-grained control over the build process.

Configuring Different Builds for Different Goals

Your development build needs speed and helpful error messages. Your production build needs to be as small and fast as possible. Creating separate configurations for each environment lets you optimize for these different goals.

I typically have a base configuration file with shared settings, then merge environment-specific options. Here’s a pattern using the webpack-merge package.

// webpack.config.base.js
const path = require('path');

module.exports = {
  entry: './src/index.js',
  output: {
    filename: '[name].bundle.js',
    path: path.resolve(__dirname, 'dist'),
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        use: 'babel-loader'
      }
    ]
  }
};

// webpack.config.dev.js
const { merge } = require('webpack-merge');
const baseConfig = require('./webpack.config.base.js');

module.exports = merge(baseConfig, {
  mode: 'development',
  devtool: 'eval-cheap-module-source-map', // Fast source maps for dev
  devServer: {
    static: './dist', // 'contentBase' in webpack-dev-server v3 and earlier
    hot: true,
  },
  optimization: {
    minimize: false // No minification for faster builds
  }
});

// webpack.config.prod.js
const { merge } = require('webpack-merge');
const baseConfig = require('./webpack.config.base.js');
const TerserPlugin = require('terser-webpack-plugin');

module.exports = merge(baseConfig, {
  mode: 'production',
  devtool: 'source-map', // Higher quality source maps for prod
  output: {
    filename: '[name].[contenthash].js', // Hashing for cache busting
  },
  optimization: {
    minimize: true,
    minimizer: [
      new TerserPlugin({
        terserOptions: {
          compress: {
            drop_console: true, // Remove console logs
          }
        }
      })
    ]
  }
});

Run them with different npm scripts.

{
  "scripts": {
    "start": "webpack serve --config webpack.config.dev.js",
    "build": "webpack --config webpack.config.prod.js"
  }
}

This separation keeps your configuration clean and purpose-driven. The production build will be lean and optimized, while the development build gives you a quick feedback loop.

Watching for Problems Automatically

The final technique is about maintaining your gains. It's easy for bundle size to slowly creep up as new features are added. Setting up automated monitoring can catch this before it becomes a problem. You can integrate size checks into your Continuous Integration (CI) pipeline.

The size-limit library is a great tool for this. You set a maximum size for your bundle, and it will fail the build if that limit is exceeded.

First, install it along with the file plugin it needs: npm install --save-dev size-limit @size-limit/file

Then, configure it in your package.json.

{
  "name": "my-app",
  "size-limit": [
    {
      "path": "dist/main.bundle.js",
      "limit": "150 kB"
    },
    {
      "path": "dist/vendors.bundle.js",
      "limit": "250 kB"
    }
  ],
  "scripts": {
    "size": "size-limit",
    "build": "webpack --config webpack.config.prod.js",
    "ci": "npm run build && npm run size"
  }
}

Now, running npm run size will check the current bundle against your limits. In your CI configuration (like GitHub Actions), you can run the ci script. If a pull request adds too much code, the check will fail, and the developer will need to optimize before merging.

I also like to generate a simple report after each production build. This script logs the size of key assets.

// scripts/report-size.js
const fs = require('fs');
const path = require('path');

function getFileSize(filePath) {
  const stats = fs.statSync(filePath);
  return (stats.size / 1024).toFixed(2);
}

const distPath = path.join(__dirname, '../dist');
const files = ['main', 'vendors', 'runtime'];

console.log('=== Bundle Size Report ===');
files.forEach(name => {
  const fullPath = path.join(distPath, `${name}.bundle.js`);
  if (fs.existsSync(fullPath)) {
    console.log(`${name}.bundle.js: ${getFileSize(fullPath)} KB`);
  }
});

Add it to your build script: "build": "webpack --config webpack.config.prod.js && node scripts/report-size.js". This constant feedback keeps performance in mind for the whole team.

Optimizing your build process is an ongoing task, not a one-time setup. Start with bundle analysis to know your problem areas, then apply techniques like tree shaking and splitting. Use caching to keep your development speed high, and automate checks to prevent regression. Each codebase is different, so experiment with these techniques and measure the results. The payoff is a faster, more pleasant experience for both developers and users.
