
Dynamic Rendering, A Simple Solution for SPA when Shared on Social Media

Burhanuddin Ahmed ・ 4 min read

The original article, written in Bahasa Indonesia, can be read here.

When you build a website, the goal is for users or customers to visit it, right? There are various ways to make that happen. Yep, one of them is SEO, a technique for making your website easy to find through search engines like Google, Bing, or DuckDuckGo.

Everything will be fine until you realize your website is built entirely with JavaScript and most of its content is generated by JavaScript. But calm down: search engines like Google are now much better at reading JavaScript. Since May 2019, Google has been using the evergreen Googlebot (more about it can be read here). They claim the latest evergreen bot is more reliable at rendering JavaScript content; it now runs Chrome 74, which has the same rendering capabilities as your Chrome browser.

That's Google, though. What happens when you share your website on social media? What about Facebook's or Twitter's crawlers?

As it turns out, it's not only Google, Bing, and DuckDuckGo that have crawlers. Social media platforms like Facebook and Twitter have crawlers too, whose purpose is to read a page's meta tags and display them as a preview object when the page is shared.

[Image: social media preview objects]

How to do this?

Facebook and Twitter define their own meta tags, so that their bots can detect them and build the preview objects shown above.

<!-- Open Graph / Facebook -->
<meta property="og:type" content="website">
<meta property="og:url" content="">
<meta property="og:title" content="Lorem ipsum dolor sit amet, consectetur adipiscing elit">
<meta property="og:description" content="Lorem ipsum dolor sit amet, consectetur adipiscing elit">
<meta property="og:image" content="">

<!-- Twitter -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:url" content="">
<meta name="twitter:title" content="Lorem ipsum dolor sit amet, consectetur adipiscing elit">
<meta name="twitter:description" content="Lorem ipsum dolor sit amet, consectetur adipiscing elit">
<meta name="twitter:image" content="">

BUT, if your website is a single-page application, be prepared for the Facebook or Twitter bot not being able to read the meta tags or content on your website. Based on my experiment (done in May 2020, when this article was written), the Facebook bot is not capable of reading an SPA, or any website whose content is generated by JavaScript. Pathetic.
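You can see the problem by looking at what a crawler actually receives: the raw HTML of an SPA before any JavaScript runs. The snippet below is an illustration (not from the original article) that scans a raw HTML string for Open Graph tags; for a typical SPA shell there is nothing to find, so the preview card comes up empty.

```javascript
// Raw HTML of a typical SPA shell, as a crawler receives it.
// Content and meta tags would only be added later by JavaScript.
const spaHtml = `
<!DOCTYPE html>
<html>
  <head><title>My SPA</title></head>
  <body><div id="app"></div><script src="/app.js"></script></body>
</html>`;

// Collect og:* meta tags from a raw HTML string.
function extractOpenGraph (html) {
  const tags = {};
  const re = /<meta\s+property="(og:[^"]+)"\s+content="([^"]*)"\s*\/?>/g;
  let m;
  while ((m = re.exec(html)) !== null) {
    tags[m[1]] = m[2];
  }
  return tags;
}

console.log(extractOpenGraph(spaHtml)); // → {} (nothing for the bot to show)
```

A crawler that doesn't execute JavaScript sees exactly this empty result, which is why the Open Graph tags your SPA injects at runtime never reach Facebook's preview.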

Then, how?

Dynamic rendering can be your best friend here, although there are other options, like converting your website into a static site.

OK, let's say you want to keep your SPA; in that case, all we need is dynamic rendering.

So what is Dynamic Rendering?

As the name "dynamic" suggests, dynamic rendering requires a server; in my case I use a Node.js server. With dynamic rendering, the server delivers a different page depending on the detected user agent. If the detected user agent is a bot, the client receives a statically generated version of the requested page: before responding, Puppeteer loads the page and renders it first. If the detected user agent is a real human, the client receives the usual HTML, JS, and CSS, which is rendered right in the user's browser.
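The branching described above can be sketched in a few lines. This is only an illustration: `isBot` here is a deliberately naive placeholder, and the return values stand in for the two kinds of response; the real Express + Puppeteer implementation comes in the next section.

```javascript
// Naive stand-in for bot detection, for illustration only.
function isBot (userAgentHeader) {
  return /facebookexternalhit|Twitterbot|Googlebot/i.test(userAgentHeader || '');
}

// One request, two possible responses, chosen by user agent.
function handleRequest (userAgentHeader) {
  if (isBot(userAgentHeader)) {
    // crawler: Puppeteer renders the page first; the bot gets static HTML
    return 'pre-rendered static HTML';
  }
  // human: the browser gets the SPA assets and renders them itself
  return 'SPA shell (index.html + js + css)';
}

console.log(handleRequest('Twitterbot/1.0'));          // → pre-rendered static HTML
console.log(handleRequest('Mozilla/5.0 Chrome/90.0')); // → SPA shell (index.html + js + css)
```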

[Image: Google's dynamic rendering diagram]

How can we implement it?

First, you need a server that can run Node.js; if you don't have one, you can use Heroku.

The easy way is to create your project folder and run npm init inside it.

Then install the following packages:

ExpressJS: npm install express

Puppeteer: npm install puppeteer

Useragent: npm install useragent

After all three packages are installed, create an index.js file as the entry point of your server.


const express = require('express');
const puppeteer = require('puppeteer');
const ua = require('useragent');
const path = require('path');

const app = express();

const directory = 'dist';
const dist = path.join(__dirname, directory);

const port = process.env.PORT || 3000;

// you can put your Puppeteer middleware here later

// serve the build assets (js, css, images) from dist
app.use(express.static(dist));

// send index.html for every remaining route (it's an SPA)
app.use('*', (req, res) => {
    res.sendFile(path.join(dist, 'index.html'));
});

app.listen(port, () => {
    console.log(`Web server is running at port ${port}`);
});

Add the following middleware code to detect user agents.

function isBot (useragent) {
    const agent = ua.is(useragent);
    return !agent.webkit && !agent.opera && ! &&
        ! && !agent.safari && !agent.mobile_safari &&
        !agent.firefox && !agent.mozilla && !;
}

const uAgentMiddleware = async (req, res, next) => {
    // base URL of the site being rendered, e.g. https://yourdomain.com
    const local_url = 'YOUR_BASE_URL'

    if (!isBot(req.headers['user-agent'])) {
        next()
    } else {
        try {
            const browser = await puppeteer.launch({
                // '--no-sandbox' is a common flag when running
                // Puppeteer on hosts such as Heroku
                'args': ['--no-sandbox']
            });
            const page = await browser.newPage();
            await page.setUserAgent('Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36');
            // tip: append req.originalUrl here to render the page that was actually requested
            await page.goto(local_url, {
                waitUntil: "networkidle0",
            });
            // take the fully rendered markup and send it to the bot
            const html = await page.evaluate(() => {
                return document.documentElement.innerHTML;
            });
            await browser.close();
            res.send(html);
        } catch (err) {
            res.status(500).send(err.toString());
        }
    }
}

// register the middleware (put it before the static handlers in index.js)
app.use(uAgentMiddleware)

After adding the code above, make sure you have copied your dist folder, i.e. the Vue build output (in this case I use Vue.js), into the same folder as index.js.

Finally, in package.json, add a script like the following to run index.js.

Then just run npm run start to start the server.


"scripts": {
  "start": "node index.js"
}
