I recently started learning React and decided to go practical by building a site called Make.rs - a place where makers show what they are working on - which, in essence, is a Create React App powered by an API built in Node.js. I soon realized that social sharing is nearly impossible due to a technical limitation: CRA executes its JavaScript on the client side, so social crawlers cannot read the updated meta tags. And while Helmet works well for SEO purposes, since search engine crawlers are able to execute JavaScript, the social crawlers (Twitter, Facebook, LinkedIn..) cannot do that yet.
I came up with a solution that works wonders and wanted to share it with you, in case you face the same problem. I see it more as a hack, due to the way it is implemented; nonetheless, it's simple to set up and does the job 💪
Before I dig into the details, you should know that there are other ways to accomplish social sharing, such as server-side rendering or a prerendering solution. Both, however, require you to serve the app from a server, which is exactly what I wanted to avoid (I'm happy to have my app on a CDN).
Here's how Make.rs is structured:
- front-end: CRA running at https://make.rs, served from a CDN
- back-end: Node.js with Express running at https://api.make.rs
So here's what I did:
I created a new route at the API level called /sharer: https://api.make.rs/sharer
In it, I check whether the request is made by a bot (based on the user agent). If it comes from a crawler, I return a simple HTML page containing only the meta tags; if it comes from a user who clicked the shared link, I do a permanent 301 redirect to the public page.
Example of my middleware:
module.exports = (req, res, next) => {
  const bots = [
    'facebot',
    'facebookexternalhit/1.0',
    'facebookexternalhit/1.1',
    'twitterbot',
    'telegrambot',
    'linkedinbot', // linkedin
    'skypeuripreview', // microsoft teams
    'pinterest',
    'discordbot',
    'disqus'
  ];

  // the User-Agent header may be absent, so fall back to an empty string
  const source = (req.headers['user-agent'] || '').toLowerCase();
  req.isSocialBot = checkArray(source, bots);
  next();
};

// returns true if str contains any of the entries in arr
function checkArray(str, arr) {
  for (let i = 0; i < arr.length; i++) {
    if (str.includes(arr[i])) {
      return true;
    }
  }
  return false;
}
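You can sanity-check the detection logic without spinning up a server by calling the middleware with mock request objects. Below is a minimal standalone sketch of the same logic (the inlined function and the sample user agents are mine, for illustration):

```javascript
// Same bot-detection logic as the middleware above, inlined so it can
// be run standalone with mock req/res/next objects.
const socialbot = (req, res, next) => {
  const bots = ['facebot', 'facebookexternalhit', 'twitterbot', 'linkedinbot'];
  const source = (req.headers['user-agent'] || '').toLowerCase();
  req.isSocialBot = bots.some((bot) => source.includes(bot));
  next();
};

// Simulate a request from Twitter's crawler
const botReq = { headers: { 'user-agent': 'Twitterbot/1.0' } };
socialbot(botReq, {}, () => {});
console.log(botReq.isSocialBot); // true

// Simulate a regular browser request
const userReq = { headers: { 'user-agent': 'Mozilla/5.0 (Windows NT 10.0) Chrome/120.0' } };
socialbot(userReq, {}, () => {});
console.log(userReq.isSocialBot); // false
```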
Now, with this middleware in place, here's what my /sharer route looks like:
router.get('/project/:slug', socialbot, async function (req, res, next) {
  if (req.isSocialBot) {
    const html = `
      <html>
        <head>
          <title>xxx</title>
          <meta property="og:title" content="xxx">
          <meta property="og:description" content="xxx">
          <meta property="og:url" content="xxx">
          <meta property="og:site_name" content="xxx">
          <meta name="twitter:title" content="xxx"/>
          <meta name="twitter:description" content="xxx">
          <meta name="twitter:image:alt" content="xxx">
          <meta name="twitter:site" content="xxx">
        </head>
        <body>
        </body>
      </html>
    `;
    // return the html to the crawler
    res.set('Content-Type', 'text/html');
    res.send(html);
  } else {
    // do the permanent redirect to the CRA site
    res.set('location', 'your_url_here');
    res.status(301).send();
  }
});
With all this in place, I now generate Twitter/Facebook shareable links in the front-end using the new /sharer route, which does the job of feeding crawlers the right meta tags and redirecting users to the original/public URL.
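As a sketch of that last step, here is how the front-end could build share links that point at the /sharer route instead of the public page. The helper names and the exact URL shape of the sharer route are my assumptions; the share-intent URLs are the standard Twitter and Facebook ones:

```javascript
// Hypothetical front-end helpers: share links target the API's /sharer
// route, so crawlers hit it first and get the meta tags.
const SHARER_BASE = 'https://api.make.rs/sharer';

function twitterShareUrl(slug, text) {
  const target = `${SHARER_BASE}/project/${encodeURIComponent(slug)}`;
  return `https://twitter.com/intent/tweet?text=${encodeURIComponent(text)}&url=${encodeURIComponent(target)}`;
}

function facebookShareUrl(slug) {
  const target = `${SHARER_BASE}/project/${encodeURIComponent(slug)}`;
  return `https://www.facebook.com/sharer/sharer.php?u=${encodeURIComponent(target)}`;
}

console.log(twitterShareUrl('my-project', 'Check out my project'));
console.log(facebookShareUrl('my-project'));
```

When the crawler follows the link it gets the meta-tag HTML, and when a person clicks it they are redirected straight to the public page.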
Hope you liked it and you found it useful 🙂
Cheers.