<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Rachel Costello</title>
    <description>The latest articles on DEV Community by Rachel Costello (@rachellcostello).</description>
    <link>https://dev.to/rachellcostello</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F190448%2Fc5b78e73-60da-4783-b39e-49c48c5eef57.jpg</url>
      <title>DEV Community: Rachel Costello</title>
      <link>https://dev.to/rachellcostello</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/rachellcostello"/>
    <language>en</language>
    <item>
      <title>The 8 main ways JavaScript can impact SEO performance</title>
      <dc:creator>Rachel Costello</dc:creator>
      <pubDate>Thu, 17 Oct 2019 09:19:22 +0000</pubDate>
      <link>https://dev.to/rachellcostello/the-8-main-ways-javascript-can-impact-seo-performance-4op7</link>
      <guid>https://dev.to/rachellcostello/the-8-main-ways-javascript-can-impact-seo-performance-4op7</guid>
      <description>&lt;p&gt;JavaScript rendering is often a complicated and resource-intensive process, and can significantly impact a variety of different performance and user experience factors which SEO success depends on.&lt;/p&gt;

&lt;p&gt;This is why it’s crucial to understand where these issues can occur and how they can impact your website.&lt;/p&gt;

&lt;p&gt;Here are the 8 main things to watch out for within a JavaScript-powered website that can impact SEO performance:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;b&gt;Rendering speed&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Main thread activity&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Conflicting signals between HTML and JavaScript&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Blocked scripts&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Scripts in the head&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Content duplication&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;User events&lt;/b&gt;&lt;/li&gt;
&lt;li&gt;&lt;b&gt;Service workers&lt;/b&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;&lt;b&gt;1. Rendering speed&lt;/b&gt;&lt;/h2&gt;

&lt;p&gt;Rendering can be an expensive and strenuous process because of the different stages required to download, parse, compile and execute JavaScript. This causes significant issues when that work falls on a user’s browser or a search engine crawler.&lt;/p&gt;

&lt;p&gt;JavaScript-heavy pages that take a long time to process and render are at risk of not being rendered or processed by search engines at all.&lt;/p&gt;

&lt;blockquote&gt;
“If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on our capability to render your pages. If you’d like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources.”
&lt;/blockquote&gt;

&lt;p&gt;-&lt;a href="https://webmasters.googleblog.com/2014/05/understanding-web-pages-better.html"&gt;Google Webmaster Central Blog&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;JavaScript that renders slowly will also impact your users because, as page load time increases, bounce rates rise with it. Users now expect a page to load within a few seconds, but getting a page that requires JavaScript rendering to load that quickly can be challenging.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YGzmsM-U--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/gzcpqq1xkcxskhchpodm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YGzmsM-U--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/gzcpqq1xkcxskhchpodm.png" alt="Page load speed vs bounce rate chart"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;
&lt;center&gt;Source: &lt;a href="https://www.thinkwithgoogle.com/marketing-resources/data-measurement/mobile-page-speed-new-industry-benchmarks/"&gt;Think with Google&lt;/a&gt;
&lt;/center&gt;

&lt;p&gt;Another issue to consider is that a user’s device and CPU will usually have to do the hard work with JavaScript rendering, but not all CPUs are up for the challenge. It’s important to be aware that users will experience page load times differently depending on their device.&lt;/p&gt;

&lt;p&gt;Just because a site appears to load quickly on a high-end phone, it doesn’t mean that this will be the case for a user accessing the same page with a lower-end phone.&lt;/p&gt;

&lt;blockquote&gt;
“What about a real-world site, like CNN.com? On the high-end iPhone 8 it takes just ~4s to parse/compile CNN’s JS compared to ~13s for an average phone (Moto G4). This can significantly impact how quickly a user can fully interact with this site.”
&lt;/blockquote&gt;

&lt;p&gt;-&lt;a href="https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/javascript-startup-optimization/"&gt;Google Web Fundamentals&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4-9xepAc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/n78zsuvdd46v2vuhe139.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4-9xepAc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/n78zsuvdd46v2vuhe139.png" alt="Chart showing JavaScript processing times for different devices"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;
&lt;center&gt;Source: &lt;a href="https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/javascript-startup-optimization/"&gt;Google Developers&lt;/a&gt;
&lt;/center&gt;


&lt;h2&gt;&lt;b&gt;2. Main thread activity&lt;/b&gt;&lt;/h2&gt;
&lt;p&gt;JavaScript execution is single-threaded, meaning that the browser’s main thread is blocked while JavaScript is parsed, compiled and executed. With this kind of setup, queues can form and bottlenecks can occur, so the entire process of loading a page can be delayed and a search engine won’t be able to see any content on the page until the scripts have been executed.&lt;/p&gt;

&lt;p&gt;Delays within the main thread can significantly increase the time it takes to load a page for search engines, and for the page to become interactive for users, so avoid blocking main thread activity wherever possible.&lt;/p&gt;

&lt;p&gt;Keep an eye on how many resources are being executed and where request timeouts are happening, as these can be some of the main culprits which create bottlenecks.&lt;/p&gt;
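
&lt;p&gt;One common mitigation, sketched below with illustrative function names, is to split long-running work into small chunks that yield the main thread back to the browser between tasks:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Split a long-running job into small chunks so rendering and
// input handling can run in between (names are illustrative)
function processInChunks(items, handleItem, chunkSize) {
  const chunk = items.splice(0, chunkSize);
  chunk.forEach(handleItem);
  if (items.length) {
    // Yield the main thread back to the browser before continuing
    setTimeout(function () {
      processInChunks(items, handleItem, chunkSize);
    }, 0);
  }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;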


&lt;h2&gt;&lt;b&gt;3. Conflicting signals between HTML and JavaScript&lt;/b&gt;&lt;/h2&gt;
&lt;p&gt;First impressions count with search engines, so make sure you’re giving them clear, straightforward instructions about your website in the HTML as soon as they come across the page.&lt;/p&gt;

&lt;p&gt;Adding important meta tags with JavaScript rather than in the HTML is advised against: Google won’t see these tags straight away because of its &lt;a href="https://dev.to/rachellcostello/how-search-engines-social-media-crawlers-render-javascript-438e"&gt;delayed rendering process&lt;/a&gt;, and other search engines won’t see them at all because they can’t render.&lt;/p&gt;

&lt;p&gt;All search engines will use the signals from the HTML in the initial fetch to determine crawling and indexing. Google and the few search engines that have rendering capabilities will then render pages at a later date, but if the signals served via JavaScript differ from what was initially found in the HTML, then this will contradict what the search engine has already been told about the page.&lt;/p&gt;

&lt;p&gt;For example, if you use JavaScript to remove a robots meta tag like noindex, Google will have already seen the noindex tag in the HTML and won’t waste resources rendering a page it has been told not to include in its index. This means that the instructions to remove the noindex won’t even be seen as they’re hidden behind JavaScript which won’t be rendered in the first place.&lt;/p&gt;

&lt;p&gt;Aim to include the most important tags and signals within the HTML where possible and make sure they’re not being altered by JavaScript. This includes page titles, content, hreflang and any other elements that are used for indexing.&lt;/p&gt;
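
&lt;p&gt;As an illustration of the noindex scenario above, the following anti-pattern serves a noindex directive in the HTML and then tries to remove it with JavaScript; because Google acts on the initial HTML, the removal may never be seen:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;!-- In the initial HTML: Google reads this and skips rendering the page --&amp;gt;
&amp;lt;meta name="robots" content="noindex"&amp;gt;

&amp;lt;script&amp;gt;
  // Anti-pattern: this removal may never be executed by a search engine
  document.querySelector('meta[name="robots"]').remove();
&amp;lt;/script&amp;gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;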

&lt;blockquote&gt;
“The signals you provide via JavaScript shouldn't conflict with the ones in the HTML.”
&lt;/blockquote&gt;

&lt;p&gt;-John Mueller, &lt;a href="https://www.deepcrawl.com/blog/news/google-webmaster-hangout-notes-june-26th-2018/#13"&gt;Google Webmaster Hangout&lt;/a&gt;&lt;/p&gt;


&lt;h2&gt;&lt;b&gt;4. Blocked scripts&lt;/b&gt;&lt;/h2&gt;
&lt;p&gt;If a script is blocked, for example in the &lt;a href="https://www.deepcrawl.com/knowledge/technical-seo-library/robots-txt/"&gt;robots.txt file&lt;/a&gt;, this will impact how search engines can see and understand a website. Scripts that are crucial to the layout and content of a page need to be accessible so that the page can be rendered properly.&lt;/p&gt;

&lt;blockquote&gt;
“Blocking scripts for Googlebot can impact its ability to render pages.”
&lt;/blockquote&gt;

&lt;p&gt;-John Mueller, &lt;a href="https://www.deepcrawl.com/blog/news/google-webmaster-hangout-notes-june-1st-2018/#1"&gt;Google Webmaster Hangout&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bqhdZUD5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/sv31vdr34d2943r0blbh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bqhdZUD5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/sv31vdr34d2943r0blbh.png" alt="Render-blocking scripts report in PageSpeed Insights"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;
&lt;center&gt;Source: &lt;a href="https://developers.google.com/speed/pagespeed/insights/"&gt;PageSpeed Insights&lt;/a&gt;
&lt;/center&gt;

&lt;p&gt;This is especially important for mobile devices, as search engines rely on being able to fetch external resources to be able to display mobile results correctly.&lt;/p&gt;

&lt;blockquote&gt;
“If resources like JavaScript or CSS in separate files are blocked (say, with robots.txt) so that Googlebot can’t retrieve them, our indexing systems won’t be able to see your site like an average user. This is especially important for mobile websites, where external resources like CSS and JavaScript help our algorithms understand that the pages are optimized for mobile.”
&lt;/blockquote&gt;

&lt;p&gt;-&lt;a href="https://webmasters.googleblog.com/2014/05/understanding-web-pages-better.html"&gt;Google Webmaster Central Blog&lt;/a&gt;&lt;/p&gt;
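
&lt;p&gt;As a hedged sketch, a robots.txt rule like the first block below would stop crawlers from fetching rendering-critical scripts; the directory name is hypothetical:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Anti-pattern: blocks every crawler from a (hypothetical) scripts
# directory, so rendering-critical JavaScript can’t be fetched
User-agent: *
Disallow: /assets/js/

# Fix: remove the Disallow rule, or explicitly allow the resources:
# User-agent: *
# Allow: /assets/js/
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;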


&lt;h2&gt;&lt;b&gt;5. Scripts in the head&lt;/b&gt;&lt;/h2&gt;
&lt;p&gt;When JavaScript is served in the head, this can delay the rendering and loading of the entire page. This is because everything in the head is loaded as a priority before the body can start to be loaded.&lt;/p&gt;

&lt;blockquote&gt;
“Don't serve critical JavaScript in the head as this can block rendering.”
&lt;/blockquote&gt;

&lt;p&gt;-John Mueller, &lt;a href="https://www.deepcrawl.com/blog/news/google-webmaster-hangout-notes-october-30th-2018/#6"&gt;Google Webmaster Hangout&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Serving JavaScript in the head is also advised against because it can cause search engines to ignore any other head tags below it. If Google sees a JavaScript tag within the contents of the head, it can assume that the body section has begun and ignore any other elements below it that were meant to be included in the head.&lt;/p&gt;

&lt;blockquote&gt;
“JavaScript snippets can close the head prematurely and cause any elements below to be overlooked.”
&lt;/blockquote&gt;

&lt;p&gt;-John Mueller, &lt;a href="https://www.deepcrawl.com/blog/news/google-webmaster-hangout-notes-october-19th-2018/#8"&gt;Google Webmaster Hangout&lt;/a&gt;&lt;/p&gt;
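
&lt;p&gt;One way to keep scripts out of the critical path, sketched below with placeholder file names, is to load them with the defer or async attributes so they don’t block parsing of the rest of the page:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;head&amp;gt;
  &amp;lt;title&amp;gt;Example page&amp;lt;/title&amp;gt;
  &amp;lt;!-- defer: download in parallel, execute after the HTML is parsed --&amp;gt;
  &amp;lt;script src="app.js" defer&amp;gt;&amp;lt;/script&amp;gt;
  &amp;lt;!-- async: execute as soon as downloaded; order is not guaranteed --&amp;gt;
  &amp;lt;script src="analytics.js" async&amp;gt;&amp;lt;/script&amp;gt;
&amp;lt;/head&amp;gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;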


&lt;h2&gt;&lt;b&gt;6. Content duplication&lt;/b&gt;&lt;/h2&gt;
&lt;p&gt;JavaScript can cause duplication and canonicalisation issues when it is used to serve content, because if scripts take too long to process, the content they generate won’t be seen.&lt;/p&gt;

&lt;p&gt;This can cause Google to only see boilerplate, duplicate content across a site that experiences rendering issues, meaning that Google won’t be able to find any unique content to rank pages with. This can often be an issue for Single Page Applications (SPAs) where the content dynamically changes without having to reload the page.&lt;/p&gt;

&lt;p&gt;Here are Google Webmaster Trends Analyst, &lt;a href="https://www.reddit.com/r/TechSEO/comments/ag77vi/canonicals_and_angular_js/"&gt;John Mueller’s thoughts on managing SPAs&lt;/a&gt;:&lt;/p&gt;

&lt;blockquote&gt;
“If you're using a SPA-type setup where the static HTML is mostly the same, and JavaScript has to be run in order to see any of the unique content, then if that JavaScript can't be executed properly, then the content ends up looking the same. This is probably a sign that it's too hard to get to your unique content -- it takes too many requests to load, the responses (in sum across the required requests) take too long to get back, so the focus stays on the boilerplate HTML rather than the JS-loaded content.”
&lt;/blockquote&gt;


&lt;h2&gt;&lt;b&gt;7. User events&lt;/b&gt;&lt;/h2&gt;
&lt;p&gt;JavaScript elements that require interactivity may work well for users, but they don’t work for search engines, which experience JavaScript very differently from a regular user.&lt;/p&gt;

&lt;p&gt;This is because search engine bots can’t interact with a page in the same way that a human being would. They don’t click, scroll or select options from menus. Their main purpose is to discover and follow links to content that they can add to their index.&lt;/p&gt;

&lt;p&gt;This means that any content that depends on JavaScript interactions to be generated won’t be indexed. For example, search engines will struggle to see any content hidden behind an &lt;a href="https://www.w3schools.com/jsref/event_onclick.asp"&gt;‘onclick’ event&lt;/a&gt;.&lt;/p&gt;
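
&lt;p&gt;To illustrate, content that only loads via a click handler like the hypothetical one below is unlikely to be indexed, whereas a plain link to a crawlable URL is:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;!-- Anti-pattern: content only appears after a user event --&amp;gt;
&amp;lt;button onclick="loadMoreProducts()"&amp;gt;Show more&amp;lt;/button&amp;gt;

&amp;lt;!-- Crawlable alternative: a real link to a real URL --&amp;gt;
&amp;lt;a href="/products?page=2"&amp;gt;Show more&amp;lt;/a&amp;gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;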

&lt;p&gt;Another thing to bear in mind is that Googlebot and the other search engine crawlers clear cookies, local storage and session storage data after each page load, so this will be a problem for website owners who rely on cookies to serve any kind of personalised, unique content that they want to have indexed.&lt;/p&gt;

&lt;blockquote&gt;
“Any feature that requires user consent is auto-declined by Googlebot.”
&lt;/blockquote&gt;

&lt;p&gt;-&lt;a href="https://developers.google.com/search/docs/guides/rendering"&gt;Google Search&lt;/a&gt;&lt;/p&gt;


&lt;h2&gt;&lt;b&gt;8. Service workers&lt;/b&gt;&lt;/h2&gt;
&lt;p&gt;A &lt;a href="https://developers.google.com/web/fundamentals/primers/service-workers/"&gt;service worker&lt;/a&gt; is a script that runs in the background of the browser on a separate thread. Service workers can serve pages and provide content from their own cache, meaning they can work offline without the server being involved.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BpMu7PWY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/u6xcq8tomsqari3n6my9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BpMu7PWY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/u6xcq8tomsqari3n6my9.png" alt="Diagram showing how Service Workers work"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;
&lt;center&gt;Source: &lt;a href="https://developers.redhat.com/blog/2017/03/30/service-workers-in-the-browser-not-clerks-the-movie/"&gt;Red Hat&lt;/a&gt;
&lt;/center&gt;

&lt;p&gt;The benefit of using a service worker is that it decreases page load time because it doesn’t reload assets that aren’t needed. However, the issue is that Google and other search engine crawlers don’t support service workers.&lt;/p&gt;

&lt;p&gt;The service worker can make it look like content is rendering correctly, but this may not be the case. Make sure your website and its key content still works properly without a service worker, and test your server configuration to avoid this issue.&lt;/p&gt;
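
&lt;p&gt;Registering a service worker as a progressive enhancement helps here; in the minimal sketch below (the file name is a placeholder), browsers and crawlers without support simply skip the registration and fall back to the server:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Feature-detect so browsers and crawlers without support
// skip the registration and use the server as normal
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .catch(function (err) {
      console.error('Service worker registration failed:', err);
    });
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;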

&lt;p&gt;Hopefully this guide has provided you with some new insights into the impact that JavaScript can have on SEO performance, as well as some areas that you can look into for the websites that you manage.&lt;/p&gt;

</description>
      <category>seo</category>
      <category>javascript</category>
    </item>
    <item>
      <title>How search engines &amp; social media crawlers render JavaScript</title>
      <dc:creator>Rachel Costello</dc:creator>
      <pubDate>Fri, 04 Oct 2019 13:50:35 +0000</pubDate>
      <link>https://dev.to/rachellcostello/how-search-engines-social-media-crawlers-render-javascript-438e</link>
      <guid>https://dev.to/rachellcostello/how-search-engines-social-media-crawlers-render-javascript-438e</guid>
      <description>&lt;p&gt;JavaScript is a widely discussed topic in the SEO community, because it can cause significant issues for search engines and other crawlers that are trying to access the pages on our sites.&lt;/p&gt;

&lt;p&gt;The information that SEOs are gathering on the topic of JavaScript rendering should be more widely shared, as these findings will impact everyone who has a JavaScript-heavy website that they want to be visible to new users.&lt;/p&gt;

&lt;p&gt;That’s why I’ve put together this guide to explain some of the key considerations to be aware of.&lt;/p&gt;

&lt;h2&gt;&lt;b&gt;How search engines render JavaScript&lt;/b&gt;&lt;/h2&gt;

&lt;p&gt;From looking at this example code, a search engine like Google won’t have any idea what the page is meant to be about:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;body&amp;gt;
&amp;lt;app-root&amp;gt;&amp;lt;/app-root&amp;gt;
&amp;lt;script src="runtime.js"&amp;gt;&amp;lt;/script&amp;gt;
&amp;lt;script src="polyfills.js"&amp;gt;&amp;lt;/script&amp;gt;
&amp;lt;script src="main.js"&amp;gt;&amp;lt;/script&amp;gt;
&amp;lt;/body&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The JavaScript contained within this code needs to be processed and executed so that the output code can be displayed for the client. For the contents of a JavaScript-heavy page to mean anything to a search engine or social media crawler, they need to render the page.&lt;/p&gt;

&lt;p&gt;However, rendering is an expensive, resource-intensive process that the majority of search engine and social media bots struggle with, so it’s important to understand their rendering capabilities and be aware of what they will struggle to see on your site.&lt;/p&gt;

&lt;p&gt;It’s important to bear in mind that most search engines can’t render at all, and those that do have their own rendering limitations, as I’ll explain later in this article.&lt;/p&gt;

&lt;p&gt;If your website relies on JavaScript to power its content and navigation, search engines could end up seeing a blank screen with nothing of value to crawl or index.&lt;/p&gt;

&lt;p&gt;I’ve put together the latest updates on how the main search engines are currently equipped for rendering, as well as some key considerations for building sites that can be crawled and indexed.&lt;/p&gt;

&lt;h3&gt;&lt;b&gt;Google’s rendering capabilities&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;Google is one of the few search engines that currently renders JavaScript, and provides a lot of &lt;a href="https://developers.google.com/search/docs/guides/rendering" rel="noopener noreferrer"&gt;documentation and resources on JavaScript best practice for search&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This means we’re able to build a pretty clear picture of what we need to do to get our websites indexed in Google’s SERPs (Search Engine Results Pages).&lt;/p&gt;

&lt;p&gt;When Google renders, it generates markup from templates and the data available from a database or an API. The key step in this process is to get this fully generated markup, because this is what’s readable for Google’s web crawler, Googlebot.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.deepcrawl.com%2Fwp-content%2Fuploads%2F2019%2F07%2Fgoogle-javascript-rendering.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.deepcrawl.com%2Fwp-content%2Fuploads%2F2019%2F07%2Fgoogle-javascript-rendering.png" alt="The key stage of rendering for Google"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;center&gt;Source: &lt;a href="https://www.youtube.com/watch?v=Z9mHopLeE40"&gt;Martin Splitt, AngularUP Conference&lt;/a&gt;&lt;/center&gt;

&lt;p&gt;To carry out this process, Googlebot uses a headless browser for its web rendering service (WRS). Google’s WRS used to be based on Chrome 41, which was an outdated version launched in 2015.&lt;/p&gt;

&lt;p&gt;However, &lt;a href="https://webmasters.googleblog.com/2019/05/the-new-evergreen-googlebot.html" rel="noopener noreferrer"&gt;Google have now made their WRS ‘evergreen’&lt;/a&gt;, meaning that it will be regularly updated to run the &lt;a href="https://www.chromestatus.com/features" rel="noopener noreferrer"&gt;latest version of Chrome&lt;/a&gt; on an ongoing basis.&lt;/p&gt;

&lt;p&gt;This change allows Googlebot to process features that it was previously unable to, such as ES6, IntersectionObserver and Web Components.&lt;/p&gt;

&lt;p&gt;The crawling and indexing process is usually very quick for sites that don’t rely on JavaScript. However, Google can’t crawl, render and index in one instantaneous process, due to the scale of the internet and the processing power that would be required to do so.&lt;/p&gt;

&lt;blockquote&gt;“The internet is gigantic, that’s the problem. We see over 160 trillion documents on the web daily, so Googlebot is very busy. Computing resources, even in the cloud age, are pretty tricky to come by. This process takes a lot of time, especially if your pages are really large, we have to render a lot of images, or we have to process a lot of megabytes of JavaScript.”&lt;/blockquote&gt;

&lt;p&gt;-&lt;a href="https://twitter.com/g33konaut" rel="noopener noreferrer"&gt;Martin Splitt&lt;/a&gt;, Webmaster Trends Analyst at Google&lt;/p&gt;

&lt;p&gt;This is why Google has a two-wave indexing process. In the first wave of indexing, HTML pages are crawled and indexed and Googlebot will use a classifier to determine pages with JavaScript on them that need to be rendered.&lt;/p&gt;

&lt;p&gt;These pages will be added to a queue to be rendered at a later date when enough resources become available, in the second wave of indexing. A page will only be added to the index in the second wave after it has been rendered.&lt;/p&gt;

&lt;blockquote&gt;“Google determines if pages need to be rendered by comparing content found in the initial HTML and the rendered DOM.”&lt;/blockquote&gt;

&lt;p&gt;-Martin Splitt, &lt;a href="https://www.deepcrawl.com/blog/news/google-webmaster-hangout-notes-august-23rd-2019/#7" rel="noopener noreferrer"&gt;Google Webmaster Hangout&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.deepcrawl.com%2Fwp-content%2Fuploads%2F2019%2F07%2Fgoogle-two-wave-indexing.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.deepcrawl.com%2Fwp-content%2Fuploads%2F2019%2F07%2Fgoogle-two-wave-indexing.png" alt="Google's two waves of indexing"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;center&gt;Source: &lt;a href="https://www.youtube.com/watch?v=PFwUbgvpdaQ"&gt;Google I/O 2018&lt;/a&gt;&lt;/center&gt;

&lt;p&gt;When resources do become available, there isn’t a specific way of prioritising the pages that will be rendered first, which means that there are no guarantees on when pages will actually be rendered after they are initially discovered by Googlebot.&lt;/p&gt;

&lt;p&gt;What is the gap between the first and second wave of indexing then? According to Google’s Tom Greenaway and Martin Splitt during &lt;a href="https://www.deepcrawl.com/blog/events/google-javascript-rendering-secrets-chrome-dev-summit-2018/?utm_source=white_paper&amp;amp;utm_medium=content&amp;amp;utm_campaign=javascript_white_paper" rel="noopener noreferrer"&gt;Chrome Dev Summit 2018&lt;/a&gt;, it could take “minutes, an hour, a day or up to a week” for Google to render content after a page has been crawled.&lt;/p&gt;

&lt;p&gt;If your website gets stuck between these two waves of indexing, any new content you add or any changes you make to your website won’t be seen or indexed for an undetermined amount of time.&lt;/p&gt;

&lt;p&gt;This will have the biggest impact on sites that rely on fresh search results, such as ecommerce or news sites.&lt;/p&gt;

&lt;blockquote&gt;“Ecommerce websites should avoid serving product page content via JavaScript.”&lt;/blockquote&gt;

&lt;p&gt;-John Mueller, &lt;a href="https://www.deepcrawl.com/blog/news/google-webmaster-hangout-notes-september-21st-2018/#6" rel="noopener noreferrer"&gt;Google Webmaster Hangout&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;“News sites should avoid content that requires JavaScript to load.”&lt;/blockquote&gt;

&lt;p&gt;-John Mueller, &lt;a href="https://www.deepcrawl.com/blog/news/google-webmaster-hangout-notes-september-4th-2018/#9" rel="noopener noreferrer"&gt;Google Webmaster Hangout&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;&lt;b&gt;Bing’s rendering capabilities&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;Bing’s crawler allegedly does render JavaScript, but is limited in being able to process the latest browser features and render at scale.&lt;/p&gt;

&lt;p&gt;The team at Bing recommended implementing dynamic rendering to make sure Bingbot is able to crawl and index your JavaScript-powered content and links.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“In general, Bing does not have crawling and indexing issues with web sites using JavaScript, but occasionally Bingbot encounters websites heavily relying on JavaScript to render their content, especially in the past few years. Some of these sites require far more than one HTTP request per web page to render the whole page, meaning that it is difficult for Bingbot, like other search engines, to process at scale on every page of every large website.&lt;/p&gt;

&lt;p&gt;Therefore, in order to increase the predictability of crawling and indexing of websites relying heavily on JavaScript by Bing, we recommend dynamic rendering as a great alternative. Dynamic rendering is about detecting the search engine’s bot by parsing the HTTP request user agent, prerendering the content on the server-side and outputting static HTML, helping to minimize the number of HTTP requests needed per web page and ensure we get the best and most complete version of your web pages every time Bingbot visits your site.&lt;/p&gt;                                                  

&lt;p&gt;When it comes to rendering content specifically for search engine crawlers, we inevitably get asked whether this is considered cloaking... and there is nothing scarier for the SEO community than getting penalized for cloaking. The good news is that as long as you make a good faith effort to return the same content to all visitors, with the only difference being that the content is rendered on the server for bots and on the client for real users, this is acceptable and not considered cloaking.”&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;-&lt;a href="https://twitter.com/facan" rel="noopener noreferrer"&gt;Fabrice Canel&lt;/a&gt;, Principal Program Manager at Bing&lt;/p&gt;
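
&lt;p&gt;As a rough sketch of the dynamic rendering setup Bing describes, a server can inspect the User-Agent header and route known bots to prerendered HTML. This example assumes an Express-style app and a hypothetical renderToStaticHtml helper:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const BOT_PATTERN = /bingbot|googlebot|twitterbot|facebookexternalhit/i;

// Hypothetical Express middleware: bots receive prerendered HTML,
// regular users receive the client-side rendered app
app.use(function (req, res, next) {
  const userAgent = req.headers['user-agent'] || '';
  if (BOT_PATTERN.test(userAgent)) {
    // renderToStaticHtml is a placeholder for your prerenderer
    renderToStaticHtml(req.url).then(function (html) {
      res.send(html);
    });
  } else {
    next(); // serve the normal JavaScript-powered page
  }
});
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;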

&lt;p&gt;Even though Bing can render in some capacity, it isn’t able to extract and follow URLs that are contained within JavaScript.&lt;/p&gt;

&lt;blockquote&gt;“Don’t bury links to content inside JavaScript.”&lt;/blockquote&gt;

&lt;p&gt;-&lt;a href="https://www.bing.com/webmaster/help/webmaster-guidelines-30fba23a" rel="noopener noreferrer"&gt;Bing Webmaster Guidelines&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;&lt;b&gt;Yahoo’s rendering capabilities&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;Yahoo cannot currently render at all, so make sure that content isn’t ‘hidden’ behind JavaScript, as the search engine won’t be able to run the script to find any content it generates. Only content that is served within the HTML will be picked up.&lt;/p&gt;

&lt;p&gt;You can get around this by using the &lt;code&gt;&amp;lt;noscript&amp;gt;&lt;/code&gt; element.&lt;/p&gt;

&lt;blockquote&gt;“Unhide content that’s behind JavaScript. Content that’s available only through JavaScript should be presented to non-JavaScript user agents and crawlers with noscript HTML elements.”&lt;/blockquote&gt;

&lt;p&gt;-&lt;a href="https://help.yahoo.com/kb/higher-website-rank-sln2216.html" rel="noopener noreferrer"&gt;Yahoo Webmaster Resources&lt;/a&gt;&lt;/p&gt;
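
&lt;p&gt;A simple illustration of the noscript fallback: the static copy inside the noscript element is what a non-rendering crawler can pick up. The content shown is a placeholder:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;div id="app"&amp;gt;&amp;lt;/div&amp;gt;
&amp;lt;script src="app.js"&amp;gt;&amp;lt;/script&amp;gt;

&amp;lt;!-- Static fallback for crawlers and users without JavaScript --&amp;gt;
&amp;lt;noscript&amp;gt;
  &amp;lt;h1&amp;gt;Example product&amp;lt;/h1&amp;gt;
  &amp;lt;p&amp;gt;Key content served directly in the HTML.&amp;lt;/p&amp;gt;
&amp;lt;/noscript&amp;gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;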

&lt;h3&gt;&lt;b&gt;Yandex’s rendering capabilities&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;Yandex’s documentation explains that their search engine doesn’t render JavaScript and can’t index any content that is generated by it. If you want your site to appear in Yandex, make sure your key content is returned in the HTML upon the initial request for the page.&lt;/p&gt;

&lt;blockquote&gt;“Make sure that the pages return the full content to the robot. If they use JavaScript code, the robot will not be able to index the content generated by the script. The content you want to include in the search should be available in the HTML code immediately after requesting the page, without using JavaScript code. To do this, use HTML copies.”&lt;/blockquote&gt;

&lt;p&gt;-&lt;a href="https://yandex.com/support/webmaster/recommendations/changing-site-structure.html" rel="noopener noreferrer"&gt;Yandex Support&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;&lt;b&gt;Other search engines’ rendering capabilities&lt;/b&gt;&lt;/h3&gt;

&lt;p&gt;DuckDuckGo, Baidu, AOL and Ask are much less open about their rendering capabilities and lack official documentation as reference guides. The only way to find this out currently is to run tests ourselves.&lt;/p&gt;

&lt;p&gt;In 2017, &lt;a href="https://twitter.com/bart_goralewicz" rel="noopener noreferrer"&gt;Bartosz Góralewicz&lt;/a&gt; ran some experiments using a test site that used different JavaScript frameworks to serve content and analysed which search engines were able to render and index the content they generated.&lt;/p&gt;

&lt;p&gt;We can never make definitive conclusions based on the indexing of test sites alone, but the results showed that only Google and, surprisingly, Ask were able to index rendered content.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.deepcrawl.com%2Fwp-content%2Fuploads%2F2019%2F07%2Fsearch-engine-rendering-comparison.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.deepcrawl.com%2Fwp-content%2Fuploads%2F2019%2F07%2Fsearch-engine-rendering-comparison.png" alt="Search engine JavaScript rendering comparison chart"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;center&gt;Source: &lt;a href="https://moz.com/blog/search-engines-ready-for-javascript-crawling"&gt;Moz&lt;/a&gt;&lt;/center&gt;

&lt;blockquote&gt;“Bing, Yahoo, AOL, DuckDuckGo, and Yandex are completely JavaScript-blind and won’t see your content if it isn’t in the HTML.”&lt;/blockquote&gt;

&lt;p&gt;-Bartosz Góralewicz, CEO of Onely&lt;/p&gt;

&lt;p&gt;Take a look at the full &lt;a href="https://moz.com/blog/search-engines-ready-for-javascript-crawling" rel="noopener noreferrer"&gt;article covering the experiment and results&lt;/a&gt; to learn more about Bartosz’s conclusions.&lt;/p&gt;

&lt;h2&gt;&lt;b&gt;How social media platforms render JavaScript&lt;/b&gt;&lt;/h2&gt;

&lt;p&gt;It’s important to know that social media and sharing platforms generally can’t render any JavaScript client-side.&lt;/p&gt;

&lt;blockquote&gt;“Facebook and Twitter’s crawlers can’t render JavaScript.”&lt;/blockquote&gt;

&lt;p&gt;-Martin Splitt, &lt;a href="https://www.deepcrawl.com/blog/news/google-webmaster-hangout-notes-october-30th-2018/#2" rel="noopener noreferrer"&gt;Google Webmaster Hangout&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you rely on JavaScript to serve content that would feed into Open Graph tags, Twitter Cards or even the meta descriptions shown when you share an article on Slack, for example, this content won’t be shown.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.deepcrawl.com%2Fwp-content%2Fuploads%2F2019%2F07%2Fslack-sharing.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fwww.deepcrawl.com%2Fwp-content%2Fuploads%2F2019%2F07%2Fslack-sharing.png" alt="URL sharing content dropdown example in Slack"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Make sure you pre-render, server-side render or dynamically render content like featured images, titles and descriptions for crawlers like Twitterbot and Facebot, so they can display your site and its content properly.&lt;/p&gt;
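
&lt;p&gt;For instance, sharing metadata like the following needs to be present in the server-returned HTML rather than injected client-side; the values are placeholders:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;head&amp;gt;
  &amp;lt;!-- Served in the initial HTML so Facebot and Twitterbot can read them --&amp;gt;
  &amp;lt;meta property="og:title" content="Example article title"&amp;gt;
  &amp;lt;meta property="og:description" content="A short summary of the article."&amp;gt;
  &amp;lt;meta property="og:image" content="https://example.com/featured-image.png"&amp;gt;
  &amp;lt;meta name="twitter:card" content="summary_large_image"&amp;gt;
&amp;lt;/head&amp;gt;
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;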

</description>
      <category>seo</category>
      <category>javascript</category>
    </item>
  </channel>
</rss>
