Roman Potashow

Good enough for production

I have been a frontend developer for more than 10 years, and to stay in shape I do small projects just for personal satisfaction, or to learn new techniques and tools.

Most of this work is open source (I believe in it) - every project I do for myself gives me experience, which is the biggest profit I can get, and I'm happy when someone can learn something from it.

For those "pet projects" there is a good place to share code - GitHub. But maybe not all of you know that GitHub also lets you publish your work as static web pages.

GitHub Pages is a static site hosting service that takes HTML, CSS, and JavaScript files straight from a repository on GitHub, optionally runs the files through a build process, and publishes a website.

In other words, everything that is pushed to the gh-pages branch is served as a static page under http(s)://<user>.github.io/<repository>.

For some time now all my pet projects have been written in Elm, and the Elm ecosystem is basically a single elm compiler that does the whole job (compile Elm to JS, run a local server, download packages, publish packages, run code optimisations, and more).
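
For reference, a quick sketch of what that single binary covers (these are the Elm 0.19 commands; exact flags may differ per version):

elm init                  # scaffold a new project (creates elm.json)
elm reactor               # local development server
elm install elm/http      # download a package
elm make src/Main.elm     # compile Elm to JS/HTML
elm publish               # publish your own package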

The problem comes when you need to publish it: just copying the development code to the public folder (the gh-pages branch) is not acceptable for me. Production code should not be the same as development code - it should not contain any debug information, and its bundle size should be optimised for download, not for rebuild time. The same goes for HTML (for development you don't really care how it looks as long as it works); for production the HTML should be valid, have social media meta tags (Twitter, Facebook), and CSS should live in separate files.

The other part is that I'm lazy (like most developers), and everything that can be automated must be automated. Deployment automation can be done via CI (continuous integration).

I use Travis CI - the main benefits:

  • easy configurable
  • free for open source projects

Tools:

GitHub Pages

It was kind of explained before, but there are a few hacks you can learn and implement.

404.html

Create a file 404.html in the gh-pages root that redirects all requests to index.html (it's 2019 - we are all about SPAs).

<!DOCTYPE html>
<html lang="en">

<head>
  <meta charset="utf-8" />

  <title></title>

  <script>
    sessionStorage.redirect = location.href;
  </script>

  <meta http-equiv="refresh" content="0;URL='http(s)://<user>.github.io/<repository>'" />
</head>

<body>
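  <!-- padding so the response is larger than ~512 bytes; some browsers replace smaller error pages with their own "friendly" error page -->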
  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
</body>

</html>

index.html

Add a script to index.html that restores the original URL from the redirect, if there was one:

<script>
    var redirect = sessionStorage.redirect;
    delete sessionStorage.redirect;
    if (redirect && redirect !== location.href) {
        history.replaceState(null, null, redirect);
    }
</script>

JavaScript bundle

The Elm part has a great explanation in the official guide, in the Asset Size section. But if you use custom elements - the main way to interact with low-level browser APIs from JS, or with external libraries (Google Maps, code editors, ...) - you cannot use only uglify-js, as it does not support the ES6 classes that custom elements need. That is where terser comes in: it is not as good for the Elm output, but it can handle the custom elements.
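
For context, a custom element is just an ES6 class registered with the browser - a hypothetical script.js could look like this (the tag name and body are made up for illustration):

// hypothetical custom element - the ES6 class syntax here is what uglify-js cannot parse
customElements.define(
  "code-editor",
  class extends HTMLElement {
    connectedCallback() {
      this.textContent = "editor goes here";
    }
  }
);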

Elm

elm can do it all by itself (as mentioned before):
elm make src/Main.elm --optimize --output=./gh-pages/bundle.js

JsCodeShift

Here it is: this transform takes care of GLSL string optimisations (if you use WebGL) and inlines the An and Fn wrapper functions in the resulting JavaScript - in other words, Elm JS voodoo.

jscodeshift -t transform.js ./gh-pages/bundle.js

transform.js
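
The original transform.js is not reproduced here; as a rough illustration of the shape of a jscodeshift transform (the GLSL whitespace heuristic below is an assumption, not the author's actual code):

// minimal jscodeshift transform skeleton (illustrative only)
module.exports = function (fileInfo, api) {
  const j = api.jscodeshift;
  const root = j(fileInfo.source);

  // example: collapse whitespace inside string literals that look like GLSL shaders
  root
    .find(j.Literal)
    .filter((p) => typeof p.node.value === "string" && p.node.value.includes("gl_Position"))
    .forEach((p) => {
      p.node.value = p.node.value.replace(/\s+/g, " ").trim();
    });

  return root.toSource();
};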

Uglify-js

If you don't use custom elements, just copy it from the Elm guide:

uglifyjs ./gh-pages/bundle.js --compress 'pure_funcs="F2,F3,F4,F5,F6,F7,F8,F9,A2,A3,A4,A5,A6,A7,A8,A9",pure_getters,keep_fargs=false,unsafe_comps,unsafe' | uglifyjs --mangle --output=./gh-pages/bundle.js

Or split that into 2 commands, where the first one is done with uglify-js (and the second with terser):

uglifyjs ./gh-pages/bundle.js --compress "pure_funcs=[F2,F3,F4,F5,F6,F7,F8,F9,A2,A3,A4,A5,A6,A7,A8,A9],pure_getters" --output=./gh-pages/bundle.js

Prepack

Prepack is a tool that optimizes JavaScript source code: Computations that can be done at compile-time instead of run-time get eliminated. Prepack replaces the global code of a JavaScript bundle with equivalent code that is a simple sequence of assignments. This gets rid of most intermediate computations and object allocations.

Prepack will increase the bundle size (most of the time), but the application will run faster, as a lot of math and code gets inlined (where it can be), and with a pure functional language like Elm it can do a lot.

prepack ./gh-pages/bundle.js --inlineExpressions --out ./gh-pages/bundle.js

Terser

Basically the same as uglify-js, but here you also merge the ES6 custom elements (script.js) into the resulting Elm bundle.js:

terser ./gh-pages/bundle.js ./gh-pages/script.js --compress 'keep_fargs=false,unsafe_comps,unsafe' --mangle --output=./gh-pages/bundle.js

The resulting pipeline for bundle.js looks like this:

elm make src/Main.elm --optimize --output=./gh-pages/bundle.js
jscodeshift -t transform.js ./gh-pages/bundle.js
uglifyjs ./gh-pages/bundle.js --compress "pure_funcs=[F2,F3,F4,F5,F6,F7,F8,F9,A2,A3,A4,A5,A6,A7,A8,A9],pure_getters" --output=./gh-pages/bundle.js
prepack ./gh-pages/bundle.js --inlineExpressions --out ./gh-pages/bundle.js
terser ./gh-pages/bundle.js ./gh-pages/script.js --compress 'keep_fargs=false,unsafe_comps,unsafe' --mangle --output=./gh-pages/bundle.js

Now we have a production-ready bundle.js, and we have done as much as we can at build time (steps that we should not run during development, as they take a lot of time).

PostHTML

For your project you would like to attract people (or at least share it) on Facebook, Twitter and others, and make it look nice there. You can run your links through the validators (Twitter, Facebook) to see how they will be represented. The main idea is that bots (scrapers) follow the link and try to find information to present your application (link) as a nice post/card.
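
For orientation, the kind of tags those bots look for - the values here are placeholders, not taken from any real project:

<meta property="og:title" content="My pet project" />
<meta property="og:description" content="Short description shown in the share card" />
<meta property="og:image" content="http(s)://<user>.github.io/<repository>/preview.png" />
<meta property="fb:app_id" content="<facebook-app-id>" />
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="My pet project" />
<meta name="twitter:image" content="http(s)://<user>.github.io/<repository>/preview.png" />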

But as a developer (at least that is true for me) you also want your HTML to download fast.

PostHTML & Plugins

  • posthtml-cli - PostHTML command line interface
  • htmlnano - Modular HTML minifier, built on top of PostHTML. Inspired by cssnano.

  • posthtml-content - Flexible content transform for posthtml

  • posthtml-favicons - This module uses favicons to generate all the favicons from a single image

  • posthtml-head-elements - This plugin is intended to work with PostHTML. It will allow you to keep HTML head elements - title, script, link, base and meta - in a separate JSON file.

  • posthtml-style-to-file - Save DOM styles to CSS file

Each application needs its own favicon, and it must be converted for lots of devices (iPhone, Android, and different sizes for different screen sizes, icons, pixel ratios).

Each application needs its own Facebook App ID and special meta tags.

posthtml ./gh-pages/index.html -o ./gh-pages/index.html -c posthtml.config.js

posthtml.config.js
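
The original posthtml.config.js is not shown here; a minimal sketch of what it could look like - the plugin options below are assumptions, check each plugin's README for the real ones:

module.exports = {
  plugins: {
    // keep <head> elements (title, meta, link, ...) in a separate JSON file
    "posthtml-head-elements": { headElements: "head.json" },
    // generate favicons for all devices from a single source image
    "posthtml-favicons": { outDir: "./gh-pages" },
    // move inline styles into a separate CSS file
    "posthtml-style-to-file": { path: "./gh-pages/style.css" },
    // minify the final HTML
    htmlnano: {},
  },
};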

Screenshots

To create nice social media meta tags you need an image of your application. You could create the images yourself, but I'm too lazy for that, and a screenshot is good enough (and if I don't like that idea, I can always create a custom URL just for taking the screenshot).
There is a nice tool that can be used to take a screenshot in CI, so the image is updated on each build: node-server-screenshot.

const http = require("http");
const fs = require("fs");
const takeScreenShot = require("node-server-screenshot");
const port = 3000;

// tiny static file server that serves the built gh-pages folder locally
const server = http.createServer((req, res) => {
    if (req.url === "/") {
        req.url = `/index.html`;
    }
    const path = __dirname + "/gh-pages" + req.url;
    fs.access(path, fs.constants.F_OK, (err) => {
        if (err) {
            // a missing asset means the build output is broken - fail the CI run
            console.error(err);
            res.writeHead(404);
            res.end();
            process.exit(-1);
            return;
        }
        res.writeHead(200);
        res.end(fs.readFileSync(path));
    });
});

server.listen(port, () => {
    // once the server is up, take the screenshot and then shut the server down
    console.log("server started");
    screenshot(() => server.close());
});

function screenshot(done) {
    const url = `http://localhost:${port}/`;
    const savePreview = `gh-pages/${process.env.APP_NAME}.png`;
    const preview = new Promise((resolve, reject) =>
        takeScreenShot.fromURL(url, savePreview,
            {
                show: true,
                width: 1200,
                height: 675,
                waitAfterSelector: "body > *",
                waitMilliseconds: 100,
            },
            () => {
                console.log(`Screenshot: ${savePreview}`);
                resolve()
            }
        ));
     preview.then(done);
}

Deploy

Deploying all of that to gh-pages is a really easy task after all this, as Travis CI has a pages provider that does all the magic for you (I will not go into details on how to set it up - all the info is in the guide). The relevant deploy section of .travis.yml:

deploy:
  provider: pages
  skip-cleanup: true
  github-token: $GITHUB_TOKEN
  keep-history: true
  local-dir: gh-pages
  on:
    branch: master
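
The post does not show the rest of the .travis.yml; for completeness, a rough sketch of how the build steps could be wired in (the package list and the build.sh script are assumptions, not the author's actual config):

language: node_js
node_js: node
install:
  - npm install --global elm uglify-js terser prepack jscodeshift posthtml-cli
script:
  - ./build.sh   # runs the elm make / jscodeshift / uglify / prepack / terser / posthtml / screenshot steps above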

Conclusions

Let's make our projects look like production applications even if they are just "pet experiment projects". It is not that hard, and after publishing a few projects like this, you will no longer be able to just push something out to a public domain until it looks like a professional app (even if it isn't).
With the right tools and a few minutes of preparation you can automate the whole process and not think about it any more.

Here are some examples:
web-tiled - a current work in progress where I did all of this in a few hours
platformer - an extended version with binary encoding of the game level, which I'll explain some day.
