Austin Spivey for Wia

How to gzip and deploy your front end assets to Amazon S3 with Webpack

Gzipping your front end assets is a great way to drastically decrease their file size, and thus improve the overall page load time. If you're using Amazon S3 to host these assets, here's a simple way to add gzipping and front end deployment to your production build step.

Follow this tutorial to get a basic Webpack environment set up, if you haven't already.
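
If it helps, here's a minimal sketch of the kind of config this post assumes (the entry point, file name and the public/dist output path are placeholders, chosen to match the deployment step later on):

const path = require('path');

module.exports = {
  entry: './src/index.js', // your application entry point
  output: {
    path: path.resolve(__dirname, 'public/dist'), // built assets land here
    filename: 'bundle.js'
  },
  plugins: [] // the plugins from this post will go in this array
};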

You'll need two Webpack plugins for this tutorial: the Compression Plugin and the S3 Plugin.

Install the plugins like so:

npm install --save-dev compression-webpack-plugin webpack-s3-plugin

Import the plugins at the top of your Webpack config file:

const CompressionPlugin = require('compression-webpack-plugin');
const S3Plugin = require('webpack-s3-plugin');

Next, we're going to initialise the compression plugin. In the example below, the plugin will apply to all JavaScript and CSS files. New gzipped files are created in the same directory with the suffix .gz, and the original files are removed. See the Compression Plugin documentation for more options.

module.exports = {
  ...,
  plugins: [
    new CompressionPlugin({
      test: /\.(js|css)$/,
      asset: '[path].gz[query]',
      algorithm: 'gzip',
      deleteOriginalAssets: true
    })
  ]
}

After the CompressionPlugin config, we'll initialise the S3 plugin:

module.exports = {
  ...,
  plugins: [
    new CompressionPlugin({
      test: /\.(js|css)$/,
      asset: '[path].gz[query]',
      algorithm: 'gzip',
      deleteOriginalAssets: true
    }),
    new S3Plugin({
      s3Options: {
        accessKeyId: 'your-access-key', // Your AWS access key
        secretAccessKey: 'your-secret-key', // Your AWS secret key
        region: 'eu-west-1' // The region of your S3 bucket
      },
      s3UploadOptions: {
        Bucket: 'my-bucket', // Your bucket name
        // Here we set the Content-Encoding header for all the gzipped files to 'gzip'
        ContentEncoding(fileName) {
          if (/\.gz/.test(fileName)) {
            return 'gzip'
          }
        },
        // Here we set the Content-Type header for the gzipped files to their appropriate values, so the browser can interpret them properly
        ContentType(fileName) {
          if (/\.css/.test(fileName)) {
            return 'text/css'
          }
          if (/\.js/.test(fileName)) {
            return 'text/javascript'
          }
        }
      },
      basePath: 'my-dist', // This is the name the uploaded directory will be given
      directory: 'public/dist' // This is the directory you want to upload
    })
  ]
}

See the S3 Plugin documentation for more options.

With your Webpack config set up, running webpack will gzip your assets and upload your dist folder to your S3 bucket.
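
One practical note: rather than hard-coding your AWS credentials in the Webpack config, you may prefer to read them from environment variables so they stay out of version control. A minimal sketch, assuming you export the standard AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_REGION variables before running the build:

new S3Plugin({
  s3Options: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
    region: process.env.AWS_REGION
  },
  // ...the rest of the options stay the same as above
})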

Top comments (3)

vivek8568

Hi Austin, this is a great article, but I have one doubt: how would index.html know that it has to point to bundle.js.gz instead of bundle.js? Since I'm using split chunks, my bundle is pointing at all the chunks.js files instead of the individual chunks.js.gz files.
Is this tutorial using a server like Express, or something else? I want to serve my React build from S3 only (static hosting). Would you please help me out?
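
One common workaround for static hosting (not shown in the post) is to upload the gzipped bytes under the original file names and set Content-Encoding: gzip on each object, so index.html keeps referencing bundle.js and the chunk files as usual. Here's a rough Node sketch of that idea using the aws-sdk v2 S3 client; the bucket name, region and paths are placeholders:

const fs = require('fs');
const path = require('path');
const AWS = require('aws-sdk'); // v2 SDK

const s3 = new AWS.S3({ region: 'eu-west-1' });
const distDir = 'public/dist'; // wherever your .gz files ended up

fs.readdirSync(distDir)
  .filter(file => file.endsWith('.gz'))
  .forEach(file => {
    const originalName = file.replace(/\.gz$/, ''); // e.g. bundle.js.gz -> bundle.js
    s3.upload({
      Bucket: 'my-bucket',
      Key: `my-dist/${originalName}`, // keep the original name so index.html still matches
      Body: fs.createReadStream(path.join(distDir, file)),
      ContentEncoding: 'gzip', // tells the browser to decompress the response
      ContentType: originalName.endsWith('.css') ? 'text/css' : 'application/javascript'
    }, err => {
      if (err) console.error(`Failed to upload ${file}:`, err);
    });
  });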

Héctor Sifloz

Hi,
I have the same concern. Did you manage to figure this out?

I'm currently developing a React app with create-react-app, hosted on AWS. I created chunks.js.gz files using Webpack, but I have no idea how to serve them from AWS so that the gz-compressed files are loaded instead of just chunks.js.

Dani

Great post! It has been very useful for me :) One thing to watch out for: 'text/javascript' is obsolete; use 'application/javascript' instead, otherwise you can run into trouble when you upload js files to S3.
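
If you want to follow that advice, the only change is in the ContentType callback from the config above:

ContentType(fileName) {
  if (/\.css/.test(fileName)) {
    return 'text/css'
  }
  if (/\.js/.test(fileName)) {
    return 'application/javascript' // instead of 'text/javascript'
  }
}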