<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Damien Jubeau</title>
    <description>The latest articles on DEV Community by Damien Jubeau (@damienjubeau).</description>
    <link>https://dev.to/damienjubeau</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F11381%2FP36kkJ7M.jpeg</url>
      <title>DEV Community: Damien Jubeau</title>
      <link>https://dev.to/damienjubeau</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/damienjubeau"/>
    <language>en</language>
    <item>
      <title>Core Web Vitals: a new SEO factor focusing on website speed (LCP, FID and CLS)</title>
      <dc:creator>Damien Jubeau</dc:creator>
      <pubDate>Tue, 09 Jun 2020 16:02:18 +0000</pubDate>
      <link>https://dev.to/damienjubeau/core-web-vitals-a-new-seo-factor-focusing-on-website-speed-lcp-fid-and-cls-2848</link>
      <guid>https://dev.to/damienjubeau/core-web-vitals-a-new-seo-factor-focusing-on-website-speed-lcp-fid-and-cls-2848</guid>
      <description>&lt;p&gt;&lt;em&gt;The Speed Report in the Google Search Console (recently renamed “Core Web Vitals”) is offering two new performance metrics: Cumulative Layout Shift and Largest Contentful Paint additionally to the First Input Delay. Core Web Vitals have also been deployed in tools such as Page Speed Insights or Lighthouse.&lt;br&gt;&lt;br&gt;
Google recently announced a major change to come: the search engine will release a new Page Experience signal for ranking in 2021, based on Core Web Vitals, with detailed objectives to comply with. Let’s discover about Core Web Vitals and the details of the Google’s announcement.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;span id="more-6296"&gt;&lt;/span&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Page Experience: Google’s focus on UX and Speed
&lt;/h2&gt;

&lt;p&gt;Firstly, let’s talk about the timeline: &lt;a href="https://webmasters.googleblog.com/2020/05/evaluating-page-experience.html" rel="noopener noreferrer"&gt;Google’s announcement&lt;/a&gt; was published on May 28, 2020. According to Google Webmaster Central Blog, the change won’t be in effect before 2021, and Google will provide a notice at least 6 months before rolling out the signal. &lt;/p&gt;

&lt;p&gt;So if your website does not yet comply with the guidelines, the good news is that you still have time to optimize your pages.&lt;/p&gt;

&lt;p&gt;According to &lt;a href="https://developers.google.com/search/docs/guides/page-experience" rel="noopener noreferrer"&gt;the documentation about the new “page experience&lt;/a&gt;”:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Page experience is a set of signals that measure how users perceive the experience of interacting with a web page beyond its pure information value.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2020%2F06%2Fcore-web-vitals-page-experience.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2020%2F06%2Fcore-web-vitals-page-experience.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;HTTPS usage, mobile-friendliness, interstitials and Safe Browsing were already documented Search signals. The real news is the introduction of Core Web Vitals.&lt;/p&gt;

&lt;p&gt;Core Web Vitals is a set of performance metrics that measure real-world user experience for loading performance (LCP: Largest Contentful Paint), interactivity (FID: First Input Delay), and visual stability of the page (CLS: Cumulative Layout Shift).&lt;/p&gt;
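&lt;p&gt;&lt;em&gt;As a quick illustration, here is a minimal sketch of how these three metrics can be observed in the browser with the standard PerformanceObserver API. The observeWebVitals function and its report callback are our own naming, not an official API:&lt;/em&gt;&lt;/p&gt;

```javascript
// Sketch: subscribe to the three Core Web Vitals entry types in the browser.
// `report(name, value)` is a placeholder callback for your analytics.
function observeWebVitals(report) {
  if (typeof PerformanceObserver === 'undefined') return; // non-browser context

  // LCP: candidates arrive as the page renders; the last entry is the final LCP.
  new PerformanceObserver(list => {
    const entries = list.getEntries();
    report('LCP', entries[entries.length - 1].startTime);
  }).observe({ type: 'largest-contentful-paint', buffered: true });

  // FID: delay between the first interaction and its event handler starting.
  new PerformanceObserver(list => {
    for (const entry of list.getEntries()) {
      report('FID', entry.processingStart - entry.startTime);
    }
  }).observe({ type: 'first-input', buffered: true });

  // CLS: running sum of layout shifts not caused by a recent user input.
  let cls = 0;
  new PerformanceObserver(list => {
    for (const entry of list.getEntries()) {
      if (!entry.hadRecentInput) cls += entry.value;
    }
    report('CLS', cls);
  }).observe({ type: 'layout-shift', buffered: true });
}
```

&lt;p&gt;&lt;em&gt;The buffered: true option replays entries recorded before the observer was registered, so a late-loading script doesn’t miss early events.&lt;/em&gt;&lt;/p&gt;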

&lt;h2&gt;
  
  
  Core Web Vitals: LCP, FID and CLS
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Largest Contentful Paint (LCP) – Loading performance
&lt;/h3&gt;

&lt;p&gt;The Largest Contentful Paint measures the render time of the largest content element visible within the viewport (i.e., before the user scrolls).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2020%2F06%2Flcp-google-guidelines-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2020%2F06%2Flcp-google-guidelines-1.png" alt="LCP is considered good by Google under 2.5 seconds, and poor when higher than 4 seconds."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;According to Google guidelines, your web pages should have an LCP under 2.5 seconds for at least 75% of your visitors (including both desktop and mobile traffic).&lt;/p&gt;
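&lt;p&gt;&lt;em&gt;To make the guideline concrete, here is a small sketch (our own helper names, assuming you have collected per-visit LCP samples in milliseconds) that applies Google’s thresholds to the 75th percentile:&lt;/em&gt;&lt;/p&gt;

```javascript
// Nearest-rank percentile of a list of numeric samples.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  return sorted[Math.max(0, Math.ceil((p / 100) * sorted.length) - 1)];
}

// Apply Google's LCP thresholds to the 75th percentile of real-user samples.
function classifyLcp(samplesMs) {
  const p75 = percentile(samplesMs, 75);
  if (p75 <= 2500) return 'good';              // under 2.5 s
  if (p75 <= 4000) return 'needs improvement'; // between 2.5 s and 4 s
  return 'poor';                               // above 4 s
}
```

&lt;p&gt;&lt;em&gt;For instance, classifyLcp([1200, 1800, 2100, 2400, 5000]) returns “good”: the 75th percentile is 2400 ms, even though one visit was very slow.&lt;/em&gt;&lt;/p&gt;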

&lt;p&gt;&lt;em&gt;You can measure and monitor this metric with Dareboost; read our documentation to learn more about&lt;/em&gt; &lt;a href="https://www.dareboost.com/en/doc/website-speed-test/metrics/largest-contentful-paint-lcp" rel="noopener noreferrer"&gt;&lt;em&gt;Largest Contentful Paint&lt;/em&gt;&lt;/a&gt;&lt;em&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  First Input Delay (FID) – Interactivity
&lt;/h3&gt;

&lt;p&gt;First Input Delay (FID) is the delay a user experiences when interacting with the page for the first time (for example, the delay before getting feedback from the page when clicking an element).&lt;/p&gt;

&lt;p&gt;According to Google guidelines, your web pages should have a FID under 100 ms for at least 75% of your visitors (including both desktop and mobile traffic).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2020%2F06%2Ffid-google-guidelines.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2020%2F06%2Ffid-google-guidelines.png" alt="FID is considered good by Google under 100 ms, and poor when higher than 300 ms."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;You can measure and monitor Max Potential FID with Dareboost; read our documentation to learn more about&lt;/em&gt; &lt;a href="https://www.dareboost.com/en/doc/website-speed-test/metrics/max-potential-first-input-delay-fid" rel="noopener noreferrer"&gt;&lt;em&gt;Max Potential FID&lt;/em&gt;&lt;/a&gt;&lt;em&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Total Blocking Time as an Alternative to FID
&lt;/h4&gt;

&lt;p&gt;By definition, collecting FID requires an interaction: you can’t measure FID without a real user interacting with the page. You can, however, collect the Max Potential FID, which is the maximum theoretical value FID could take, whatever the moment of the interaction.&lt;/p&gt;

&lt;p&gt;Synthetic monitoring tools like Dareboost collect Max Potential FID, but to get your exact FID results, you need to collect them from your audience (the Search Console and Chrome UX Report are a good start).&lt;/p&gt;

&lt;p&gt;Max Potential FID is the worst-case scenario. Google suggests using &lt;a href="https://www.dareboost.com/en/doc/website-speed-test/metrics/total-blocking-time-tbt" rel="noopener noreferrer"&gt;Total Blocking Time&lt;/a&gt; (also available on Dareboost) as an alternative to FID.&lt;/p&gt;
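&lt;p&gt;&lt;em&gt;As a rough illustration of what Total Blocking Time represents (our own sketch, not Dareboost’s implementation): every main-thread task longer than 50 ms is a “long task”, and only the portion beyond 50 ms counts as blocking time:&lt;/em&gt;&lt;/p&gt;

```javascript
// TBT: sum of the blocking portion (beyond 50 ms) of each long task duration.
function totalBlockingTime(taskDurationsMs) {
  const BLOCKING_THRESHOLD_MS = 50;
  return taskDurationsMs
    .map(duration => Math.max(0, duration - BLOCKING_THRESHOLD_MS))
    .reduce((sum, blocking) => sum + blocking, 0);
}
```

&lt;p&gt;&lt;em&gt;For example, three tasks of 120 ms, 70 ms and 30 ms yield a TBT of 70 + 20 + 0 = 90 ms.&lt;/em&gt;&lt;/p&gt;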

&lt;h3&gt;
  
  
  Cumulative Layout Shift (CLS) – Visual Stability
&lt;/h3&gt;

&lt;p&gt;CLS measures the total sum of all individual layout shifts that occur during the entire lifespan of the page (including after the user has started to interact with it), taking into account the size of the affected area and the distance of the shift. Expected layout shifts (those occurring within 500 ms of a user interaction) are not counted.&lt;/p&gt;
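&lt;p&gt;&lt;em&gt;The accumulation described above can be sketched as follows, using the shape of the browser’s “layout-shift” performance entries (each entry’s value is already the impact fraction multiplied by the distance fraction, and hadRecentInput flags a shift within 500 ms of a user interaction):&lt;/em&gt;&lt;/p&gt;

```javascript
// CLS: sum the scores of all unexpected layout shifts over the page lifespan.
function cumulativeLayoutShift(layoutShiftEntries) {
  return layoutShiftEntries
    .filter(entry => !entry.hadRecentInput) // expected shifts are not counted
    .reduce((sum, entry) => sum + entry.value, 0);
}
```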

&lt;p&gt;According to Google guidelines, your web pages should have a CLS under 0.1 for at least 75% of your visitors (including both desktop and mobile traffic).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2020%2F06%2Fcls-google-guidelines.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2020%2F06%2Fcls-google-guidelines.png" alt="CLS is considered good by Google under 0.1 seconds, and poor when higher than 0.25 seconds."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;a href="https://www.dareboost.com/en/doc/website-speed-test/metrics/cumulative-layout-shift-cls" rel="noopener noreferrer"&gt;Cumulative Layout Shift&lt;/a&gt; will be available soon on Dareboost. Feel free to get in touch if you are interested by the metric. &lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Search Console: Core Web Vitals replaced the Speed Report
&lt;/h2&gt;

&lt;p&gt;In November 2019, the &lt;a href="https://blog.dareboost.com/en/2019/11/search-console-speed-report/" rel="noopener noreferrer"&gt;Speed Report was officially released in the Search Console&lt;/a&gt;, offering 2 performance metrics: First Contentful Paint and First Input Delay.&lt;/p&gt;

&lt;p&gt;The “Speed Report” of the Search Console, now named “Core Web Vitals”, is based on Chrome UX Report (collecting performance and usage data of real Chrome users). &lt;/p&gt;

&lt;p&gt;FCP has been abandoned and replaced by CLS and LCP with the Core Web Vitals report release. Another, and maybe subtler, change: the FID value is no longer the 95th percentile but the 75th one, reflecting the guideline to have 75% of the traffic with a FID under 100 ms.&lt;/p&gt;

&lt;p&gt;Cumulative Layout Shift had been collected (with an “experimental” flag) since May 2019 in the Chrome UX Report. Largest Contentful Paint was introduced in September 2019.&lt;/p&gt;

&lt;p&gt;As we wrote &lt;a href="https://blog.dareboost.com/en/2019/11/search-console-speed-report/" rel="noopener noreferrer"&gt;in November 2019&lt;/a&gt;: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“&lt;em&gt;These are some strong signs that we may expect some changes in the future from the Speed Report, with probably some metrics added to the current ones (or replacing them?).&lt;/em&gt;“&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;It seems we were right about this! And we would not be surprised to see other changes before the Page Experience release.&lt;/p&gt;

&lt;p&gt;Also, we have to point out that FID, LCP and CLS may not carry equal weight in the Page Experience signal. At least, that’s the case from Lighthouse’s point of view.&lt;/p&gt;

&lt;p&gt;At the end of 2019, &lt;a href="https://blog.dareboost.com/en/2018/06/lighthouse-tool-chrome-devtools/" rel="noopener noreferrer"&gt;Lighthouse&lt;/a&gt; v6 was already announced, and LCP was to be introduced. The future usage of CLS was also mentioned, but without any detail. Lighthouse v6 is now released and includes CLS. Still, the metric is not a big part of the performance score: only 5%, whereas LCP counts for 25%.&lt;/p&gt;

&lt;h2&gt;
  
  
  Page Experience: a stronger change than the Speed Update
&lt;/h2&gt;

&lt;p&gt;For the first time, Google is not only promoting speed and announcing performance as a ranking signal but also defining precise metrics (First Input Delay, Largest Contentful Paint and Cumulative Layout Shift), detailed targets (FID &amp;lt; 100 ms, LCP &amp;lt; 2.5 seconds and CLS &amp;lt; 0.1) and a context/methodology (measured with Chrome on your real audience; at least 75% of your traffic should have a fast experience, whatever their context, mobile or desktop).&lt;/p&gt;

&lt;p&gt;In the very same announcement, Google also states that the Top Stories feature will no longer require AMP:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“&lt;em&gt;When we roll out the page experience ranking update, we will also update the eligibility criteria for the Top Stories experience. AMP will no longer be necessary for stories to be featured in Top Stories on mobile; it will be open to any page. Alongside this change, page experience will become a ranking factor in Top Stories, in addition to the many factors assessed.&lt;/em&gt;“&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The fact that both topics are announced together is very interesting. &lt;/p&gt;

&lt;p&gt;We already &lt;a href="https://blog.dareboost.com/en/2016/02/content-performance-policy-for-a-faster-web/" rel="noopener noreferrer"&gt;wrote that Top Stories being limited to AMP&lt;/a&gt; was symptomatic of Google not having a relevant way to assess speed at scale.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;“It’s difficult to determine for sure whether a website is slow or fast, because you need to take into account a lot of params. […] Highlighting AMP, Google actually simplifies the equation: websites using it are considered fast, so they are promoted. Then there’s the rest of the world…”  &lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Opening Top Stories to non-AMP pages says a lot about Google’s confidence in Core Web Vitals.&lt;/p&gt;

&lt;p&gt;Contrary to the &lt;a href="https://blog.dareboost.com/en/2018/01/google-speed-update-ranking-signal-mobile-searches/" rel="noopener noreferrer"&gt;Speed Update&lt;/a&gt; (2018), which was not a game changer, keep in mind that the Page Experience signal and Core Web Vitals define precise targets and a methodology.&lt;/p&gt;

&lt;p&gt;According to &lt;a href="https://web.dev/vitals/#core-web-vitals" rel="noopener noreferrer"&gt;another piece of Google’s documentation&lt;/a&gt; :&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;“Core Web Vitals  [… ] should be measured by all site owners”.&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Take your Performance to the Next Level with &lt;a href="https://www.dareboost.com/en" rel="noopener noreferrer"&gt;Dareboost&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webperf</category>
      <category>pagespeed</category>
      <category>seo</category>
      <category>google</category>
    </item>
    <item>
      <title>Search Console Speed Report: everything you need to know</title>
      <dc:creator>Damien Jubeau</dc:creator>
      <pubDate>Mon, 25 Nov 2019 12:18:58 +0000</pubDate>
      <link>https://dev.to/damienjubeau/search-console-speed-report-everything-you-need-to-know-c49</link>
      <guid>https://dev.to/damienjubeau/search-console-speed-report-everything-you-need-to-know-c49</guid>
      <description>&lt;p&gt;&lt;em&gt;This article &lt;a href="https://blog.dareboost.com/en/2019/11/search-console-speed-report/" rel="noopener noreferrer"&gt;was originally seen&lt;/a&gt; on Dareboost's blog.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;One year after the&lt;/em&gt; &lt;a href="https://blog.dareboost.com/en/2018/01/google-speed-update-ranking-signal-mobile-searches/" rel="noopener noreferrer"&gt;&lt;em&gt;Speed Update&lt;/em&gt;&lt;/a&gt; &lt;em&gt;was released, Google has launched a brand new Speed Report within the Search Console. The Speed Report uses Chrome UX Report data to highlight the slow pages of your website. Let’s discover how to use the Search Console Speed Report and how to interpret the related performance metrics (FID and FCP). To conclude, we’ll focus on speed as a ranking signal and the further changes we might expect within the Speed Report.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Search Console Speed Report, a direct access to Chrome UX Report (CrUX) data for your domain
&lt;/h2&gt;

&lt;p&gt;Accessing your Google Search console, you may have noticed a new item in the menu, the Speed report (within the Enhancements sub-menu):&lt;br&gt;
 &lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fspeed-report-homescreen.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fspeed-report-homescreen.png" alt="Search"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From the homescreen of the Speed report, you’ll get a quick overview of the evolution of your site speed, segmented by device type (Mobile vs Desktop). To be more specific, the charts show the number of pages Google considers slow, of “moderate” speed, or fast, over time.&lt;/p&gt;

&lt;p&gt;You may be surprised by a significantly lower number of URLs here than in your Coverage report. That makes sense; we'll get back to it shortly.&lt;br&gt;
The tool displays the data source of the Speed Report: the Chrome UX Report. We have already written about the Chrome UX Report (CrUX) on our blog, as it was already used by Google PageSpeed Insights.&lt;/p&gt;

&lt;p&gt;Chrome UX Report (CrUX) is built from data collected via Chrome, directly from its users browsing the web. So the data in the Speed Report is the performance data of your very own traffic, or at least the part of your traffic using Chrome as a web browser.&lt;br&gt;
Unfortunately, that means you need enough Chrome traffic on your web pages to have data available in your Speed Report. That may explain the previously mentioned difference in URL count between the Speed Report and the reality of your website.&lt;/p&gt;

&lt;p&gt;For small websites that don’t have enough data in the Chrome UX Report for any page, all you will get is a warning message:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fspeed-report-nodata.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fspeed-report-nodata.png" alt="Speed"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Hopefully, you’ll have enough traffic to get some data and you’ll then be able to access a detailed report for both Mobile and Desktop segments.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fspeed-report-mobile-report.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fspeed-report-mobile-report.png" alt="Speed"&gt;&lt;/a&gt;Once you have selected the report for a given device type, you can note that Google is considering both “Slow” and “Moderate” pages as being an issue. The report is very similar to what you’re already used to with the Mobile Usability report for example.&lt;/p&gt;

&lt;p&gt;The tool gives you the number of URLs related to the same issue (i.e., not considered fast enough by Google).&lt;/p&gt;

&lt;p&gt;When clicking an issue, you’ll be provided with a list of URLs. Each URL is a sample of a group of URLs sharing the same speed status, metric type, and a similar URL pattern (performance issues on similar pages are probably due to the same underlying problem, as stated in the documentation; still, the grouping mechanism based on URL patterns may not be a good fit depending on your website).&lt;/p&gt;

&lt;p&gt;You can access a sample of 20 URLs for a group, but the full list is not available.&lt;/p&gt;

&lt;p&gt;Google is focusing on 2 performance metrics to gauge the speed of the page.&lt;/p&gt;

&lt;h2&gt;
  
  
  First Contentful Paint (FCP) and First Input Delay (FID)
&lt;/h2&gt;

&lt;p&gt;You might already be familiar with these metrics, as Google is already highlighting them in &lt;a href="https://blog.dareboost.com/fr/2018/06/google-page-speed-insights/" rel="noopener noreferrer"&gt;PageSpeed Insights&lt;/a&gt;. &lt;br&gt;
In a few words:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;First Contentful Paint&lt;/strong&gt; is the delay before the page starts to render some content (text or image). We have dedicated a previous post to &lt;a href="https://blog.dareboost.com/en/2019/09/first-contentful-paint-fcp/" rel="noopener noreferrer"&gt;FCP&lt;/a&gt; if you want to know more.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;First Input Delay&lt;/strong&gt; is the delay required for the browser to respond to the first user interaction (click, tap, or key press) on the page. When your website uses JavaScript, the browser may not be able to respond promptly to a user interaction, resulting in a janky experience. That’s what the First Input Delay measures, for the user’s first interaction.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For each group of pages, the Aggregate value of the metric is provided.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;em&gt;Aggregate FCP is the time it takes for 75% of the visits to a URL in this group to reach FCP.&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Aggregate FID is the time it takes for 95% of the visits to a URL in this group to respond to the first user interaction on that page.&lt;/em&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;More precisely, it’s the 75th (or 95th) percentile of the metric:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;if you have an Aggregate FCP of 1.2 seconds, it means 75% of your visitors experience a FCP faster than 1.2 seconds for the related pages (75th percentile).&lt;/li&gt;
&lt;li&gt;if you have an Aggregate FID of 150 milliseconds, it means 95% of your visitors experience FID faster than 150 milliseconds for the related pages.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To allocate a status to a page, Google is using the following thresholds:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;/th&gt;
&lt;th&gt;Fast&lt;/th&gt;
&lt;th&gt;Moderate&lt;/th&gt;
&lt;th&gt;Slow&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;FCP&lt;/td&gt;
&lt;td&gt;&amp;lt;1s&lt;/td&gt;
&lt;td&gt;&amp;lt;3s&lt;/td&gt;
&lt;td&gt;&amp;gt;=3s&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;FID&lt;/td&gt;
&lt;td&gt;&amp;lt;100ms&lt;/td&gt;
&lt;td&gt;&amp;lt;300ms&lt;/td&gt;
&lt;td&gt;&amp;gt;=300ms&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;em&gt;(from the&lt;/em&gt; &lt;a href="https://support.google.com/webmasters/answer/9205520?hl=en&amp;amp;authuser=0" rel="noopener noreferrer"&gt;&lt;em&gt;Speed Report Documentation&lt;/em&gt;&lt;/a&gt;&lt;em&gt;)&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Note that targets are the same for both Mobile and Desktop devices!&lt;/p&gt;

&lt;p&gt;If a page has two different statuses for the two metrics, the least favorable status takes precedence. As an example, a page with a “Fast” FCP but a “Slow” FID will be considered “Slow”.&lt;/p&gt;

&lt;p&gt;Resulting status of the page according to its FCP and FID statuses:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;strong&gt;FID \ FCP&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;Fast&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;Moderate&lt;/strong&gt;&lt;/th&gt;
&lt;th&gt;&lt;strong&gt;Slow&lt;/strong&gt;&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Fast&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Fast&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Moderate&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Slow&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Moderate&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Moderate&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Moderate&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Slow&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Slow&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Slow&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Slow&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Slow&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
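&lt;p&gt;&lt;em&gt;The two tables above boil down to a small piece of logic (hypothetical helper names, thresholds from the Speed Report documentation): classify each metric, then let the least favorable status win:&lt;/em&gt;&lt;/p&gt;

```javascript
const STATUSES = ['Fast', 'Moderate', 'Slow'];

// Thresholds from the Speed Report documentation.
function fcpStatus(fcpSeconds) {
  return fcpSeconds < 1 ? 'Fast' : fcpSeconds < 3 ? 'Moderate' : 'Slow';
}

function fidStatus(fidMs) {
  return fidMs < 100 ? 'Fast' : fidMs < 300 ? 'Moderate' : 'Slow';
}

// The least favorable of the two statuses becomes the page status.
function pageStatus(fcpSeconds, fidMs) {
  const worst = Math.max(STATUSES.indexOf(fcpStatus(fcpSeconds)),
                         STATUSES.indexOf(fidStatus(fidMs)));
  return STATUSES[worst];
}
```

&lt;p&gt;&lt;em&gt;For example, pageStatus(0.8, 350) returns “Slow”: a Fast FCP cannot compensate for a Slow FID.&lt;/em&gt;&lt;/p&gt;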

&lt;p&gt;Please note that you don’t have any information about the distribution of the user experience from an Aggregate metric.&lt;/p&gt;

&lt;p&gt;Let’s take again the example of an Aggregate FID of 150 milliseconds (95th percentile). The 2 following distribution charts are compatible with this same Aggregate FID value:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Ffid-distribution.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Ffid-distribution.png" alt="FID"&gt;&lt;/a&gt;&lt;br&gt;
To conclude on the subject of aggregated values, please keep in mind that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;95% of fast sessions are required for FID to get the Fast status, but only 75% for FCP&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;this aggregation completely hides the results for your visitors with the worst experience (the slowest 5% for FID, and the slowest 25% for FCP).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;the aggregate says nothing about how the performance results are distributed in the given percentile. You can have a 75th percentile of 10 seconds &lt;strong&gt;and&lt;/strong&gt; 74.9% under 1 second (in theory, the example is a bit extreme!).&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
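&lt;p&gt;&lt;em&gt;A toy example of that last caveat (our own sketch, using the nearest-rank percentile): two radically different FID distributions can share the exact same aggregate value:&lt;/em&gt;&lt;/p&gt;

```javascript
// Nearest-rank percentile of a list of numeric samples.
function percentile(samples, p) {
  const sorted = [...samples].sort((a, b) => a - b);
  return sorted[Math.max(0, Math.ceil((p / 100) * sorted.length) - 1)];
}

// 20 sessions each: one site is fast for almost everyone, the other is not.
const mostlyFast = [...Array(18).fill(10), 150, 150];  // 18 visits at 10 ms
const mostlySlow = [...Array(18).fill(140), 150, 150]; // 18 visits at 140 ms

// Both report the same Aggregate FID (95th percentile): 150 ms.
// Yet the median experience is 10 ms on one site and 140 ms on the other.
```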

&lt;p&gt;Another limitation to note when reading this data: it’s collected from your website’s traffic (Chrome only), and your visitors are not equal. You only have 2 segments available, Desktop and Mobile, whereas your visitors use hundreds of devices and different connection types; some have already visited your pages and some have not; their browsing contexts differ (some experience ads and retargeting, others may block them), etc. Still, all of their performance data is mixed together in the Speed Report.&lt;/p&gt;

&lt;p&gt;Thus, pages that are technically identical can show very different results from the Speed Report’s perspective, just because they have different audiences (for example, your top blog post with a wide international audience compared to your other pages: as this audience may include countries with slower infrastructure or devices, the page will probably be shown as slower, meaning the overall experience on this page is slower, even if the page itself is not slower than the others).&lt;/p&gt;

&lt;p&gt;If you’re familiar with web performance, keep in mind that the Speed Report has the exact same limitations as the ones you can face using a Real User Monitoring tool.&lt;/p&gt;

&lt;h2&gt;
  
  
  Frequently Asked Questions about Speed Report
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Why doesn’t my website have Fast pages?
&lt;/h3&gt;

&lt;p&gt;Some may be disappointed to get Fast results for a page on PageSpeed Insights while the same page is tagged as Slow or Moderate in the Speed Report.&lt;/p&gt;

&lt;p&gt;Remember that, for a page to be Fast in the Speed Report, you should have 75% of experienced FCPs under 1 second AND 95% of FIDs under 100 milliseconds.&lt;br&gt;
You have to meet both conditions, whatever the typology of your traffic.&lt;/p&gt;

&lt;p&gt;As you can see from this PageSpeed Insights report, testing &lt;a href="https://www.google.com" rel="noopener noreferrer"&gt;https://www.google.com&lt;/a&gt; shows they do not meet the criteria either, even if the bars are mostly green:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fpagespeed-insights-report.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fpagespeed-insights-report.png" alt="PageSpeed"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  How can I check the status of a specific URL?
&lt;/h3&gt;

&lt;p&gt;You can’t, unless you’re lucky and the URL is used as a sample for a group. As per the tool’s documentation: “The report is not designed to find the status of a specific URL”. However, you can use PageSpeed Insights to do so.&lt;/p&gt;

&lt;h3&gt;
  
  
  How can I know if I’m fast enough for my market?
&lt;/h3&gt;

&lt;p&gt;Check your competition.&lt;br&gt;
The Speed Report uses data from CrUX to give you insights about your slow web pages.&lt;br&gt;
You can access your competitors’ data as well by using PageSpeed Insights (or BigQuery, if you’re not afraid to compose your own queries: &lt;a href="https://web.dev/chrome-ux-report-bigquery/" rel="noopener noreferrer"&gt;https://web.dev/chrome-ux-report-bigquery/&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;Dareboost is offering a dedicated tool to &lt;a href="https://www.dareboost.com/en/compare" rel="noopener noreferrer"&gt;compare your website with your competitors&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  I’ve optimized my website, why is the Speed Report not showing the improvements?
&lt;/h3&gt;

&lt;p&gt;The new Search Console generally provides very fresh data. It’s a bit different for the Speed Report, as the performance data is extracted from a rolling one-month period (&lt;a href="https://twitter.com/aaronpeters/status/1191671782597627904" rel="noopener noreferrer"&gt;not totally confirmed&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;Monitoring your website’s performance to facilitate your reporting and to make sure you’re alerted when there is a slowdown is our job at Dareboost; discover &lt;a href="https://www.dareboost.com/en/tool/website-monitoring" rel="noopener noreferrer"&gt;our monitoring features&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why does the Speed Report show URLs from my subdomains?
&lt;/h3&gt;

&lt;p&gt;Carefully select the Search Console “property” you want to get data for. For example, using a “domain” property, you’ll also get data for the subdomains sharing the same origin (i.e., selecting the “dareboost.com” property, we would get “blog.dareboost.com” as well as “www.dareboost.com” related data).&lt;/p&gt;

&lt;h2&gt;
  
  
  Slow pages in the Speed Report, a red flag for your ranking?
&lt;/h2&gt;

&lt;p&gt;Knowing about the &lt;a href="https://blog.dareboost.com/en/2018/01/google-speed-update-ranking-signal-mobile-searches/" rel="noopener noreferrer"&gt;Speed Update&lt;/a&gt; like we do, you’re probably very interested in the new Search Console Speed Report. Is it a new step from Google to highlight speed as a ranking factor, or to prepare a strengthening of that factor? Still, the definition of what a speed ranking factor would be remains opaque.&lt;/p&gt;

&lt;p&gt;According to John Mueller and Martin Splitt’s video, &lt;a href="https://www.youtube.com/watch?v=7HKYsJJrySY&amp;amp;feature=youtu.be&amp;amp;t=22" rel="noopener noreferrer"&gt;Site Speed: What SEOs Need to Know&lt;/a&gt;, Google uses both lab data (&lt;a href="https://www.dareboost.com/en/tool/website-monitoring" rel="noopener noreferrer"&gt;synthetic monitoring&lt;/a&gt;) and a data source “similar to the Chrome User Experience report data”, and is not focusing on a particular threshold, just roughly categorizing pages...&lt;/p&gt;

&lt;p&gt;Here’s the transcription of this part of the video:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Basically we are categorizing pages more or less as like really good and pretty bad. So there's not really like a threshold in between, it's just like we are more or less roughly categorizing the speed experience for users. &lt;br&gt;
And how are we actually doing that? Where do we get the data from?&lt;br&gt;
We mostly get data from two places: on the one hand we try to calculate a theoretical speed of a page using lab data, and then we also use real field data from users who've actually tried to use those pages. And that field data is similar to the Chrome User Experience report data. &lt;br&gt;
So we are having like hypothetical data and practical data, so we don't really have a threshold to give away, but basically the recommendation I would say is just make sites fast for users.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fspeed-ranking-factor.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fspeed-ranking-factor.png" alt="Screenshot"&gt;&lt;/a&gt;Screenshot from the Speed Report documentationSo definitely, keeping an eye on your Speed Report sounds like a good idea.&lt;/p&gt;

&lt;p&gt;Among the other news from Google this month, an announcement about Chrome: &lt;a href="https://blog.chromium.org/2019/11/moving-towards-faster-web.html" rel="noopener noreferrer"&gt;Moving towards a faster web&lt;/a&gt;. You may have read some headlines in the SEO news like “Brace Yourself: Chrome Is About to Shame Slow Websites with a Dedicated Loading Screen!” Reading the article carefully, we don’t see more than a declaration of intent to dig into this idea.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fchrome-badgering-slow-websites.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2019%2F11%2Fchrome-badgering-slow-websites.jpg" alt="Chrome"&gt;&lt;/a&gt;Let’s discover in the next months if the Chrome Team will conduct such an experiment and hopefully communicate about how they asses that a page load is usually slow (in a given context?)&lt;/p&gt;

&lt;h2&gt;
  
  
  New Metrics and features to be expected in the Speed Report
&lt;/h2&gt;

&lt;p&gt;During their talk “&lt;a href="https://www.youtube.com/watch?v=iaWLXf1FgI0" rel="noopener noreferrer"&gt;Speed tooling evolutions: 2019 and beyond&lt;/a&gt;” at Chrome Dev Summit 2019, Paul Irish and Elizabeth Sweeny from the Google Chrome Web Platform team explained that some new metrics will gain focus: Largest Contentful Paint (LCP), Total Blocking Time (TBT), and Cumulative Layout Shift (CLS).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.dareboost.com/en/2018/06/lighthouse-tool-chrome-devtools/" rel="noopener noreferrer"&gt;Lighthouse&lt;/a&gt; version 6 (to be released on January 2020) will use LCP and TBT metrics to compute its new Performance score. CLS has been mentioned to be also used is the future.&lt;/p&gt;

&lt;p&gt;If you’re as curious as we are and explore the CrUX data using BigQuery, you’ll note that &lt;em&gt;cumulative_layout_shift&lt;/em&gt; has been collected (with an “experimental” flag) since May 2019. Largest Contentful Paint was introduced in September.&lt;/p&gt;
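&lt;p&gt;&lt;em&gt;NB: a minimal sketch of such an exploration, in Python. The table and field names (chrome-ux-report.all.201909, experimental.cumulative_layout_shift) are assumptions based on the public CrUX dataset layout at the time; running the query requires Google Cloud credentials and the google-cloud-bigquery client, so the actual call is left commented out.&lt;/em&gt;&lt;/p&gt;

```python
# Sketch: exploring CrUX layout-shift data with BigQuery standard SQL.
# Table and field names are assumptions based on the public dataset layout.

def crux_cls_query(origin, table="chrome-ux-report.all.201909"):
    """Build a query over the (experimental) CLS histogram for one origin."""
    return f"""
    SELECT bin.start, SUM(bin.density) AS density
    FROM `{table}`,
         UNNEST(experimental.cumulative_layout_shift.histogram.bin) AS bin
    WHERE origin = '{origin}'
    GROUP BY bin.start
    ORDER BY bin.start
    """

query = crux_cls_query("https://www.example.com")
# from google.cloud import bigquery              # third-party client
# rows = bigquery.Client().query(query).result() # needs GCP credentials
print(query)
```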

&lt;p&gt;These are strong signs that we may expect some changes to the Speed Report in the future, with some metrics probably added to the current ones (or replacing them?).&lt;/p&gt;

&lt;p&gt;We expect the Speed Report to be progressively improved with additional features to explore data by connection type or by country (as already offered by TestMySite, for instance), allowing a better understanding of web performance in different contexts.&lt;/p&gt;

&lt;h2&gt;
  
  
  Take your Performance to the Next Level with Dareboost
&lt;/h2&gt;

&lt;p&gt;At Dareboost, we provide comprehensive tools to manage Web Performance easily and efficiently. &lt;a href="https://www.dareboost.com/en/features" rel="noopener noreferrer"&gt;Discover Dareboost features&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>webperf</category>
      <category>performance</category>
      <category>speed</category>
      <category>google</category>
    </item>
    <item>
      <title>Lighthouse: a powerful tool included in Chrome and DevTools</title>
      <dc:creator>Damien Jubeau</dc:creator>
      <pubDate>Mon, 30 Jul 2018 11:51:16 +0000</pubDate>
      <link>https://dev.to/damienjubeau/lighthouse-a-powerful-tool-included-in-chrome-and-devtools-a64</link>
      <guid>https://dev.to/damienjubeau/lighthouse-a-powerful-tool-included-in-chrome-and-devtools-a64</guid>
      <description>&lt;p&gt;&lt;em&gt;This article was originally seen on &lt;a href="https://blog.dareboost.com/en/2018/06/lighthouse-tool-chrome-devtools/" rel="noopener noreferrer"&gt;Dareboost's blog&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Released in 2016 as a Chrome extension, Lighthouse is now also available directly in Chrome DevTools, via the "Audits" tab. Lighthouse is a great resource for developers interested in web performance and quality.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Lighthouse, a quick overview
&lt;/h2&gt;

&lt;p&gt;Auditing a web page with Lighthouse is very simple if you are familiar with the Chrome DevTools. Browse to the page with Chrome, open DevTools (Ctrl+Shift+i or ⌥+⌘+i depending on your system) and then go to the "Audits" section. &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2018%2F06%2Flighthouse-dev-tools-audits-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2018%2F06%2Flighthouse-dev-tools-audits-1.png" alt="lighthouse dev tools audits"&gt;&lt;/a&gt; &lt;br&gt;
Clicking "Perform an audit" will then allow you to configure the level of the audit according to your interests (Performance, SEO, Accessibility, etc.). &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2018%2F06%2Flighthouse-dev-tools-audit-settings-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2018%2F06%2Flighthouse-dev-tools-audit-settings-1.png" alt="lighthouse dev tools audit settings"&gt;&lt;/a&gt; &lt;br&gt;
You will be able to see the page loading and reloading, and after a while, a new window will display your audit report. &lt;/p&gt;

&lt;p&gt;If your version of Chrome is older than 69 (the current version at the time of writing is 67), this procedure will run Lighthouse 2. You can use the Lighthouse extension available on the Chrome Web Store to test with Lighthouse 3. During this test, we used the extension. What follows, therefore, refers to Lighthouse 3. &lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2018%2F06%2Flighthouse-audit-report-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2018%2F06%2Flighthouse-audit-report-1.png" alt="lighthouse audit report"&gt;&lt;/a&gt;&lt;/p&gt;
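&lt;p&gt;&lt;em&gt;NB: besides the DevTools UI and the extension, Lighthouse also ships a command-line interface. The Python sketch below only builds a CLI invocation; the flag names are assumptions based on the Lighthouse CLI of the time, and the actual run (which requires "npm install -g lighthouse") is left commented out.&lt;/em&gt;&lt;/p&gt;

```python
# Sketch: building a Lighthouse CLI invocation (flag names assumed from the
# CLI of the time; install with `npm install -g lighthouse` to actually run it).
import subprocess  # only needed if you uncomment the run below

def lighthouse_command(url, categories=("performance",), output="json"):
    """Assemble the argument list for a headless Lighthouse audit."""
    return ["lighthouse", url,
            "--output", output,
            "--only-categories", ",".join(categories),
            "--chrome-flags=--headless"]

cmd = lighthouse_command("https://example.com")
# subprocess.run(cmd, check=True)  # uncomment to run the audit for real
print(" ".join(cmd))
```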

&lt;p&gt;&lt;em&gt;NB: During our test using &lt;strong&gt;Lighthouse&lt;/strong&gt; 3.0.0-beta.0, the screenshots were not fitting the expected viewport, probably affecting the Speed Index computation.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;When Lighthouse has completed the evaluation of your page, you’re provided with an audit report that begins with several scores (as many scores as categories chosen during the audit configuration). &lt;/p&gt;

&lt;p&gt;The Performance Score is computed from your speed test results, comparing your website’s speed against others. A score of 100 means that the tested web page is faster than 98% or more of web pages; a score of 50 means the page is faster than 75% of the web. [&lt;a href="https://developers.google.com/web/tools/lighthouse/v3/scoring#perf" rel="noopener noreferrer"&gt;Source&lt;/a&gt;] &lt;/p&gt;

&lt;p&gt;Other scores depend on the page’s compliance with the related best practices (you may note a "Best Practices" category: the name is somewhat misleading, as the other categories also offer best practices; "Miscellaneous" may have been a better fit). &lt;/p&gt;

&lt;p&gt;In our example, a question mark is displayed instead of a score. This happens when some of the related tests could not be conducted properly and are marked as "Error!". &lt;/p&gt;

&lt;p&gt;After the scores overview, you’ll find the performance results for 6 metrics, and tooltips are available for a quick explanation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;First Contentful Paint:&lt;/strong&gt; First Contentful Paint marks the time at which the first text/image is painted.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;First Meaningful Paint:&lt;/strong&gt; First Meaningful Paint measures when the primary content of a page is visible.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Speed Index:&lt;/strong&gt; Speed Index shows how quickly the contents of a page are visibly populated.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;First CPU Idle:&lt;/strong&gt; First CPU Idle marks the first time at which the page's main thread is quiet enough to handle input.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Time to Interactive:&lt;/strong&gt; Interactive marks the time at which the page is fully interactive.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Estimated Input Latency:&lt;/strong&gt; the score above is an estimate of how long your app takes to respond to user input, in milliseconds, during the busiest 5s window of page load. If your latency is higher than 50 ms, users may perceive your app as laggy.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Please note that some of these metrics are still at a very early stage. For example, as stated in its &lt;a href="https://docs.google.com/document/d/1BR94tJdZLsin5poeet0XoTW60M0SjvOJQttKT-JK8HI/view#" rel="noopener noreferrer"&gt;initial specification&lt;/a&gt;, First Meaningful Paint “[..] matches user-perceived first meaningful paint in 77% of 198 pages”. Collected metrics have significantly changed between Lighthouse v2 and v3. We will detail this in our next article; still, if you’re eager to know, you can &lt;a href="https://developers.google.com/web/updates/2018/05/lighthouse3#scoring" rel="noopener noreferrer"&gt;check the update announcement&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;In the report, you’ll next find a filmstrip: step by step images of the page loading. That’s particularly useful to make sure the page has loaded as expected. For example, during our benchmark, we got a report with discrepancies. We have been able to confirm something went wrong thanks to the filmstrip: &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2018%2F06%2Flighthouse-audit-report-error-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2018%2F06%2Flighthouse-audit-report-error-1.png" alt="lighthouse audit report error"&gt;&lt;/a&gt;&lt;br&gt;
Unfortunately, we were not able to find out more about what went wrong. One may regret the lack of detail when using Lighthouse for complex work: without access to the page load waterfall, you cannot dig deeper into what happened here. &lt;/p&gt;

&lt;p&gt;After the performance overview, you’ll be provided with the best practices for each category. Most of the tips are very technical and not extensively detailed in the report itself, but you’ll find very valuable resources under the "Learn more" links. &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2018%2F06%2Flighthouse-audit-report-tip-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2018%2F06%2Flighthouse-audit-report-tip-1.png" alt="lighthouse audit report tip"&gt;&lt;/a&gt;&lt;br&gt;
What makes Lighthouse a great audit tool is also the number of automated controls: about a hundred. Lighthouse also highlights some "Additional items to manually check" that will be precious reminders (for instance in the accessibility category "The page has a logical tab order").&lt;/p&gt;

&lt;p&gt;Note that some best practices are duplicated within several categories, for example, the control related to &lt;a href="https://blog.dareboost.com/en/2015/04/chrome-firefox-and-google-search-https-forcing-its-way#mixed-content" rel="noopener noreferrer"&gt;mixed content&lt;/a&gt; is present in the "Progressive Web App" category as well as in "Best Practices".&lt;/p&gt;

&lt;h2&gt;
  
  
  With Great Power Comes Great Responsibilities
&lt;/h2&gt;

&lt;p&gt;Lighthouse is definitely a great resource, for both the performance metrics and the quality controls it provides, always on hand as it’s available directly in Chrome. &lt;br&gt;
This last advantage may also be your worst enemy! &lt;br&gt;
When conducting a performance test from your desktop computer, you’re relying on a lot of parameters from your local environment, and it can be difficult to get stable enough results:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Internet connection:&lt;/strong&gt; are you sure you don’t have any background application consuming bandwidth? If you’re sharing the connection with others, are you sure nobody is interfering with your tests? Is your Internet Service Provider offering a stable enough connection?&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;CPU:&lt;/strong&gt; are you sure your other computer usages are not affecting your running test?&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Chrome extensions:&lt;/strong&gt; as &lt;a href="https://twitter.com/LoukilAymen/status/1001823010544898048?s=20" rel="noopener noreferrer"&gt;pointed out by Aymen Loukil&lt;/a&gt; they can deeply affect your results. Be particularly careful about extensions related to ad blocking!&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;User state:&lt;/strong&gt; are you sure that your Lighthouse test is starting from a clean state? What about your cookies, the state of your Local Storage, the open sockets (you can check on chrome://net-internals/#sockets), etc.?&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Lighthouse version:&lt;/strong&gt; Lighthouse may have been updated since your last report; have you checked the changelog? Beware that the extension auto-updates by default, and the version available through DevTools is updated when Chrome updates.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The first two points are handled by the latest Lighthouse version (3.0) and its new mode, “&lt;em&gt;Simulate throttling for performance audits (faster)&lt;/em&gt;”. Rather than relying on Chrome DevTools for traffic shaping, Lighthouse now uses a new internal auditing engine: Lantern. It aims to reduce the variability of performance metrics without losing too much accuracy. The approach is very interesting; for further details, a public &lt;a href="https://docs.google.com/presentation/d/1EsuNICCm6uhrR2PLNaI5hNkJ-q-8Mv592kwHmnf4c6U" rel="noopener noreferrer"&gt;overview&lt;/a&gt; as well as a &lt;a href="https://docs.google.com/document/d/1ktXDwJkF_7w0MPLSKFZqmKdMs5TLZqc7em45XkOObAg" rel="noopener noreferrer"&gt;detailed analysis&lt;/a&gt; are available. Let’s see in the future how reliable it can be at scale! &lt;br&gt;
Whatever Lighthouse version you’re using, keep in mind the impact of your local environment and conditions. &lt;/p&gt;

&lt;p&gt;Dareboost, as a &lt;a href="https://www.dareboost.com/en/tool/website-monitoring" rel="noopener noreferrer"&gt;synthetic monitoring solution&lt;/a&gt;, is particularly aware of these stakes. For every single test on Dareboost, we create a new clean Chrome user profile and open a new Chrome instance. Each of our test regions uses the exact same infrastructure and network conditions. We do not parallelize tests on our virtual machines, to avoid any interdependencies or bottlenecks.&lt;/p&gt;

&lt;p&gt;If you’re not convinced yet that a dedicated environment is required to run your performance tests, we have run a small experiment. &lt;/p&gt;

&lt;p&gt;We have conducted 3 Lighthouse audits on Apache.org from our office (fiber connection - average ping value to Apache.org is 40ms). Here are the median values of Lighthouse results:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Performance Score&lt;/th&gt;
&lt;th&gt;First Contentful Paint&lt;/th&gt;
&lt;th&gt;Speed Index&lt;/th&gt;
&lt;th&gt;Time to Interactive&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;85&lt;/td&gt;
&lt;td&gt;1690&lt;/td&gt;
&lt;td&gt;1730&lt;/td&gt;
&lt;td&gt;5380&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Same exercise, but throttling our connection to add 500ms of latency (using the tc Unix command):&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Performance Score&lt;/th&gt;
&lt;th&gt;First Contentful Paint&lt;/th&gt;
&lt;th&gt;Speed Index&lt;/th&gt;
&lt;th&gt;Time to Interactive&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;67&lt;/td&gt;
&lt;td&gt;2780&lt;/td&gt;
&lt;td&gt;3880&lt;/td&gt;
&lt;td&gt;7320&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;We’re getting a 21% drop in Performance Score, while the Speed Index has more than doubled in the second context. And we’re getting those results using the Lantern mode (emulated Lighthouse throttling) that actually aims to mitigate local network variability. &lt;/p&gt;
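&lt;p&gt;&lt;em&gt;NB: the figures above can be double-checked with a few lines of arithmetic:&lt;/em&gt;&lt;/p&gt;

```python
# Relative differences between the two Lighthouse runs reported above
# (median values, in ms except the score).
baseline = {"score": 85, "fcp": 1690, "speed_index": 1730, "tti": 5380}
throttled = {"score": 67, "fcp": 2780, "speed_index": 3880, "tti": 7320}

score_drop = (baseline["score"] - throttled["score"]) / baseline["score"]
si_ratio = throttled["speed_index"] / baseline["speed_index"]

print(f"Performance score drop: {score_drop:.0%}")  # ~21%
print(f"Speed Index ratio: {si_ratio:.2f}x")        # more than doubled
```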

&lt;p&gt;Your own network latency is hopefully not varying this much. But keep in mind that latency is only one of many variable parameters in your local environment! &lt;/p&gt;

&lt;p&gt;For temporary needs, Dareboost offers a free version to benefit from a stable and reliable environment, with up to 5 performance tests per month. &lt;a href="https://www.dareboost.com/en" rel="noopener noreferrer"&gt;Give it a try&lt;/a&gt;! &lt;/p&gt;

&lt;p&gt;When using Lighthouse, if you wish to compare several reports, keep in mind the volatility of your context and related bias!&lt;/p&gt;

&lt;h2&gt;
  
  
  Some extra Pro tips
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;  Lighthouse is an open source project available on a &lt;a href="https://github.com/GoogleChrome/lighthouse/issues?utf8=%E2%9C%93&amp;amp;q=is%3Aissue+is%3Aopen+capture" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt;. You can dig into the code to learn more, and the bug tracker will come in handy.&lt;/li&gt;
&lt;li&gt;  The tool is also executable with a command-line interface.&lt;/li&gt;
&lt;li&gt;  You can explore Lighthouse results at the scale of the web by using &lt;a href="https://httparchive.org/" rel="noopener noreferrer"&gt;HTTP Archive&lt;/a&gt; collecting data on thousands of pages.&lt;/li&gt;
&lt;li&gt;  If you’re using Lighthouse v2 (throttled connection rather than the Lantern emulation), keep in mind you’re using Chrome DevTools’ traffic-shaping ability. Latency is injected at the HTTP level, unlike Dareboost or WebPageTest, which inject it at the TCP level. As a result, Chrome DevTools does not add latency during TCP connections nor SSL handshakes.&lt;/li&gt;
&lt;/ul&gt;
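&lt;p&gt;&lt;em&gt;NB: the impact of the injection point can be illustrated with a toy round-trip model. The RTT counts below are simplifying assumptions (1 RTT for the TCP handshake, 2 for the TLS handshake, 1 per HTTP request/response), not a description of either tool’s actual engine.&lt;/em&gt;&lt;/p&gt;

```python
# Toy model: where extra latency is injected changes the total added delay.
# Assumptions: TCP handshake = 1 RTT, TLS handshake = 2 RTTs,
# and 1 RTT per HTTP request/response on an already-open connection.

def added_delay(extra_latency_ms, requests, level):
    """Extra delay for one new HTTPS connection serving `requests` requests."""
    handshake_rtts = 1 + 2                 # TCP connect + TLS handshake
    if level == "tcp":                     # Dareboost / WebPageTest style
        rtts = handshake_rtts + requests   # handshakes pay the latency too
    else:                                  # "http": Chrome DevTools style
        rtts = requests                    # handshakes are left untouched
    return rtts * extra_latency_ms

# Injecting 100 ms of extra latency, 10 requests on one connection:
print(added_delay(100, 10, "tcp"))   # 1300 ms
print(added_delay(100, 10, "http"))  # 1000 ms
```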




&lt;p&gt;To conclude, Lighthouse is a useful and promising tool. Like the two other Google tools previously tested, it could give clues about Google’s strategy with the Speed Update. &lt;/p&gt;

</description>
      <category>loadtime</category>
      <category>performance</category>
      <category>frontend</category>
      <category>tool</category>
    </item>
    <item>
      <title>Load time is out!</title>
      <dc:creator>Damien Jubeau</dc:creator>
      <pubDate>Fri, 24 Nov 2017 15:41:33 +0000</pubDate>
      <link>https://dev.to/damienjubeau/load-time-is-out-252</link>
      <guid>https://dev.to/damienjubeau/load-time-is-out-252</guid>
      <description>&lt;p&gt;&lt;em&gt;This article was originally seen on &lt;a href="https://blog.dareboost.com/en/2017/11/load-time-is-out/"&gt;Dareboost's blog&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Web agencies and their customers, managers and technical teams, are discussing and debating load time without always understanding each other.&lt;br&gt;
While website loading speed is an undeniable and strategic matter that every person involved in a web project should consider, relevant and consistent web performance indicators have to be selected first. You can only improve what you measure, and this measurement has to be reliable.&lt;/em&gt; &lt;/p&gt;

&lt;h2&gt;
  
  
  “Load time” just doesn’t exist!
&lt;/h2&gt;

&lt;p&gt;That controversial title highlights a reality: load time is a general concept, often used without designating one precise indicator! As a matter of fact, several tools talk about load time whereas their definitions may greatly vary from one to another:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  The average Page Load Time provided by Google Analytics (available through Behavior &amp;gt; Site Speed menu) actually measures the time until the &lt;code&gt;onload&lt;/code&gt; event is fired.&lt;/li&gt;
&lt;li&gt;  Pingdom also coins a “load time”, which is equivalent to the &lt;code&gt;onload&lt;/code&gt; event firing. However, this indicator is used as a Fully Loaded Time within their waterfall, causing some rendering issues. To figure it out, we had to check the downloadable HAR file.&lt;/li&gt;
&lt;li&gt;  Test My Site (Think with Google) in fact provides a Speed Index measure, something you can discover via a tooltip. As we’ll see later, using Speed Index is a good thing, but naming it “load time” is misleading. Lastly, this tool gives very little information about the testing context (which location? What kind of 3G connection?).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;One more example with Alexa’s tool, giving the current definition:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The load time of an individual page is how long it takes for the DOM – the structure of the page – to be loaded. This time doesn’t include the time to load all images and stylesheets, for example. Alexa.com&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;By excluding images (more than 50% of the total weight of web pages nowadays, according to HTTP Archive) and also stylesheets (whose loading blocks the visual rendering), Alexa is very far from measuring the user’s experience!&lt;/p&gt;

&lt;p&gt;Even if we should question the value of such an indicator, we have to acknowledge the transparency of the tool, giving a definition for the provided indicator.&lt;/p&gt;

&lt;p&gt;Nevertheless, as a lot of tools do, Alexa fails by giving no information about the measuring context: with a robot or a real browser? What kind of connection? From which location? All of these parameters are critical, as they strongly impact the measured values. Without this information, we can’t rely on the values, as they may lack consistency over time!&lt;/p&gt;

&lt;p&gt;You’d rather use tools that clearly state their testing methodology (which metrics are gathered and how they are measured). By doing so, you will avoid biases, misinterpretations and groundless debates.&lt;/p&gt;

&lt;h2&gt;
  
  
  Time until the end of loading
&lt;/h2&gt;

&lt;p&gt;You will also find a load time indicator on Dareboost. Yes, you read that right! Still, we are not schizophrenic: we actually measure the Fully Loaded time (an indicator you may find on some other tools).&lt;/p&gt;

&lt;p&gt;A few words about our measuring methodology: Dareboost requests the web page and then listens to the network activity. Our web browser (Chrome) waits for a significant interruption of the network traffic before considering the web page to be fully loaded. Do not hesitate to browse our documentation for more details: &lt;a href="https://www.dareboost.com/en/doc/website-speed-test/metrics/fully-loaded"&gt;Web Performance Testing &amp;gt; Load time / Fully loaded&lt;/a&gt;&lt;/p&gt;
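&lt;p&gt;&lt;em&gt;NB: the “significant interruption of the network traffic” heuristic can be sketched as follows. The 2000 ms quiet window is an arbitrary value chosen for illustration, not Dareboost’s actual threshold.&lt;/em&gt;&lt;/p&gt;

```python
# Sketch of a "fully loaded" heuristic: the page is considered fully loaded
# at the end of the last request preceding a long-enough quiet gap.
# The 2000 ms quiet window is an illustrative assumption.

def fully_loaded(request_end_times_ms, quiet_window_ms=2000):
    ends = sorted(request_end_times_ms)
    for end, nxt in zip(ends, ends[1:] + [float("inf")]):
        if nxt - end >= quiet_window_ms:
            return end  # network activity stops long enough here

# Requests ending at these times (ms): the beacon at 9000 ms arrives after
# a long quiet gap, so the page is considered fully loaded at 3100 ms.
print(fully_loaded([300, 800, 1500, 2900, 3100, 9000]))  # 3100
```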

&lt;h2&gt;
  
  
  Fully Loaded Time and User Experience
&lt;/h2&gt;

&lt;p&gt;Can we consider, though, the fully loaded time as a relevant indicator to evaluate the user experience related to a webpage’s performance? One single example will be enough to blow this hypothesis away… So let’s compare the web performance of 2 very well-known SEO news websites: Searchenginejournal.com and Searchengineland.com. Have a look at the video replay of their homepages loading…&lt;br&gt;&lt;br&gt;
Which one seems the fastest to you? &lt;a href="https://blog.dareboost.com/wp-content/uploads/2017/11/video-comparison-loading-SEJ-SELand.webm"&gt;Check the video replay&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;No dramatic gap between these 2 browsing experiences, but you may find Searchengineland.com a bit faster, as its rendering starts sooner and its main images display earlier too.&lt;br&gt;
For a more detailed analysis, you can also take a look at the filmstrip within &lt;a href="https://www.dareboost.com/en/comparison/598c525b0cf2aac95f75e668/598c525b0cf2aac95f75e669"&gt;the comparison report&lt;/a&gt; provided by our testing tool:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KrvNja22--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2017/11/filmstrip-comparison-SEJ-SELand-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KrvNja22--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2017/11/filmstrip-comparison-SEJ-SELand-1.png" alt="Filmstrip comparison SEJ vs SELand"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This confirms the narrow victory (in matter of speed) of Searchengineland.com: indeed, at 1.5s, it is 85% complete while Searchenginejournal.com only reaches 40%.&lt;br&gt;
For the most curious of you, let me point out that this test was operated via Dareboost.com, simulating a Seattle-based user on the Google Chrome desktop browser with a cable connection offering 10Mbps downstream bandwidth (28ms latency).&lt;/p&gt;

&lt;p&gt;Let’s get back to the fully loaded time values from the very same test and check what they tell us… The results are quite different! 3.95 seconds for Searchenginejournal.com against 8.75 seconds for Searchengineland.com!&lt;br&gt;
So, according to the fully loaded time analysis, we would conclude that Searchengineland.com is more than twice as slow as Searchenginejournal.com. That is &lt;strong&gt;an inconsistent conclusion&lt;/strong&gt; given the video replay analysis!&lt;/p&gt;

&lt;p&gt;Then why did Search Engine Land get such a higher fully loaded time compared to its competitor? Because this indicator takes into account all of the HTTP requests and responses triggered all along the web page’s loading, including the deferred ones, and even those that impact neither the page rendering nor its interactivity (for example: retargeting technologies).&lt;/p&gt;

&lt;p&gt;In other words, the fully loaded time absolutely does not transcribe the rendering speed for users; it is no more than a technical loading delay.&lt;/p&gt;

&lt;p&gt;Day after day, web pages get more and more complex, no longer limited to a few lines of HTML code… They are enriched with ever more media, features and external resources, as well as more asynchronous behaviours and progressive techniques intended to provide a smoother browsing experience.&lt;br&gt;
Nowadays, the fully loaded time has completely lost its relevancy for website speed as perceived by the end user; its only remaining interest is purely technical.&lt;/p&gt;

&lt;h2&gt;
  
  
  Speed Index, an indicator about UX!
&lt;/h2&gt;

&lt;p&gt;If we can no longer rely on the time until the end of loading to evaluate our web pages’ speed, what other metric should we trust? For a few years now, a new range of indicators has emerged, based on video analysis: a testing approach directly focused on the user experience related to the web page rendering!&lt;/p&gt;

&lt;p&gt;Dareboost, among others, provides 3 major indicators based on this kind of analysis:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://www.dareboost.com/en/doc/website-speed-test/metrics/start-render"&gt;Start Render&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://www.dareboost.com/en/doc/website-speed-test/metrics/visually-complete"&gt;Visually Complete&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://www.dareboost.com/en/doc/website-speed-test/metrics/speed-index"&gt;Speed Index&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Those 3 indicators are all focused on the above-the-fold part of the page (visible without having to scroll)! Start Render, Visually Complete and Speed Index are complementary, but if you’d have to choose a single one, you should go for the Speed Index. It synthesizes the whole complexity and progressiveness of the web page rendering, as perceived by a visitor, in one single value.&lt;/p&gt;

&lt;p&gt;Even if the Speed Index is computed from a smart mathematical formula (read &lt;a href="https://blog.dareboost.com/en/2015/07/start-render-visually-complete-speedindex-2/"&gt;this article&lt;/a&gt; for further information about our 3 indicators based on video analysis), it is quite easy to use: the lower it is, the faster your web page displays.&lt;/p&gt;
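&lt;p&gt;&lt;em&gt;NB: the idea behind the formula can be reproduced in a few lines: Speed Index is the area above the visual-completeness curve, i.e. the integral over time of (1 - visual completeness), so earlier rendering yields a lower value. The sample values below are made up for illustration.&lt;/em&gt;&lt;/p&gt;

```python
# Speed Index = integral over time of (1 - visual completeness), where
# completeness goes from 0.0 (blank page) to 1.0 (visually complete).
# Samples are (time_ms, completeness) pairs, e.g. taken from filmstrip frames.

def speed_index(samples):
    total = 0.0
    for (t0, c0), (t1, _) in zip(samples, samples[1:]):
        total += (t1 - t0) * (1.0 - c0)  # step-wise integration
    return total

# Both pages finish at 3000 ms, but the one rendering most content early
# gets a much lower (better) Speed Index:
early = [(0, 0.0), (500, 0.85), (3000, 1.0)]
late = [(0, 0.0), (2500, 0.4), (3000, 1.0)]
print(speed_index(early))  # 875.0
print(speed_index(late))   # 2800.0
```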

&lt;p&gt;So let’s check that with our previous comparison between Search Engine Land and Search Engine Journal:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DeOtkKvo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2017/11/Webperf-comparison-dareboost-SEJ-SELand.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DeOtkKvo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2017/11/Webperf-comparison-dareboost-SEJ-SELand.png" alt="Webperformance Dareboost Comparison"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;1396 for Searchengineland.com; 1613 for Searchenginejournal.com… With the Speed Index, we do find again the speed gap that we observed empirically by watching the pages’ loading videos!&lt;/p&gt;

&lt;p&gt;Nevertheless, the “load time” issues and limitations described throughout this post teach us one important thing: the performance of a website can hardly be measured with a single indicator! Even if the Speed Index can be considered today as the most “synthetic” one, it may not be relevant in all cases!&lt;/p&gt;

&lt;p&gt;As an example, Speed Index may be tampered by animations. This could affect the indicator’s consistency or make it irrelevant to transcribe the rendering speed.&lt;/p&gt;

&lt;p&gt;Once again though, as a complete tool to manage your web performance in all cases, Dareboost allows to handle those kind of issues! Our advanced testing parameters offer a special feature to &lt;a href="https://www.dareboost.com/en/doc/website-speed-test/settings/animations-slider"&gt;deactivate the animations&lt;/a&gt; of your web pages. A feature (still experimental) that will allow to deal with most of the cases by disabling CSS3 animations or Carrousels for example.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;To successfully manage your website performance, you first have to adopt a reliable and rigorous measuring tool – such as &lt;a href="https://www.dareboost.com"&gt;Dareboost&lt;/a&gt;, of course! Then, identify and use a combination of several performance indicators meeting your objectives. If you had to keep a single one, it should be the Speed Index!&lt;/p&gt;

&lt;p&gt;A few other metrics are emerging and are very promising, even if some still need to gain robustness. For example, First Meaningful Paint could become an excellent complement to Start Render, as it only considers the rendering of content with a significant interest…&lt;/p&gt;

</description>
      <category>loadtime</category>
      <category>webdev</category>
      <category>performance</category>
      <category>speed</category>
    </item>
    <item>
      <title>Secure your cookies to the next level with SameSite attribute</title>
      <dc:creator>Damien Jubeau</dc:creator>
      <pubDate>Sat, 01 Jul 2017 17:46:02 +0000</pubDate>
      <link>https://dev.to/damienjubeau/secure-your-cookies-to-the-next-level-with-samesite-attribute</link>
      <guid>https://dev.to/damienjubeau/secure-your-cookies-to-the-next-level-with-samesite-attribute</guid>
      <description>&lt;p&gt;&lt;em&gt;After reading our last article about &lt;a href="https://blog.dareboost.com/en/2016/12/secure-cookies-secure-httponly-flags/"&gt;how to secure your cookies&lt;/a&gt;, you may (should?) already be using Secure and HttpOnly flags. As a reminder, â€˜Secure’ allows to prevent a cookie to be sent on a non-secure web page, whereas â€˜HttpOnly’ prevents any client-side usage of a given cookie.&lt;br&gt;
It is now time to take your website security to the next level with one more attribute for your cookies! Let’s talk about SameSite instruction, allowing to prevent Cross-Site Request Forgery (CSRF) attacks and Cross-Site Script Inclusion (XSSI).&lt;/em&gt; &lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NL-LVlUU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2017/06/Cookies-security.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NL-LVlUU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2017/06/Cookies-security.jpg" alt="Cookies security"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  SameSite attribute, to manage when a cookie should or should not be sent
&lt;/h3&gt;

&lt;p&gt;The main concept behind SameSite is similar to the HttpOnly and Secure features: getting control over the cookie’s behaviour, and more precisely, defining when the cookie should not be sent. &lt;br&gt;
There are two policies for the SameSite attribute, defined by its value (case-insensitive): Strict (the default) and Lax.&lt;/p&gt;

&lt;h4&gt;
  
  
  Strict policy for Same-Site Cookie
&lt;/h4&gt;

&lt;p&gt;The defined cookie will only be sent if the request originates from the same site.&lt;/p&gt;

&lt;pre&gt;&lt;span&gt;Set-Cookie: SID=31d4d96e407aad42; SameSite=Strict&lt;/span&gt;&lt;/pre&gt;

&lt;h4&gt;
  
  
  Lax policy for Same-Site Cookie
&lt;/h4&gt;

&lt;p&gt;Lax mode adds one exception allowing the cookie to be sent outside a same-site context: the defined cookie will also be sent for requests using &lt;a href="https://tools.ietf.org/html/rfc7231#section-4.2.1"&gt;a safe method&lt;/a&gt; (GET, mostly) for top-level navigation (basically, something resulting in the URL changing in the web browser’s address bar).&lt;br&gt;
The Strict mode would prevent any session cookie from being sent when a website is reached by following an external link (from an email, from search engine results, etc.), resulting in the user not being logged in. (If we were using Strict SameSite on &lt;a href="https://www.dareboost.com/"&gt;dareboost.com&lt;/a&gt;, by clicking this link you would not be detected as logged in, whether you had an active session or not.) &lt;br&gt;
That behaviour can be confusing for the end user, so you will usually prefer the Lax mode.&lt;/p&gt;

&lt;pre&gt;&lt;span&gt;Set-Cookie: SID=31d4d96e407aad42; SameSite=Lax&lt;/span&gt;&lt;/pre&gt;

&lt;p&gt;&lt;strong&gt;WARNING:&lt;/strong&gt; Strict being the default mode, any typo in the Lax value results in the Strict behaviour.&lt;/p&gt;
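&lt;p&gt;&lt;em&gt;As a framework-agnostic sketch, the headers above can be produced with Python's standard library; note that http.cookies only understands the samesite attribute from Python 3.8 onwards.&lt;/em&gt;&lt;/p&gt;

```python
# Emitting a SameSite cookie with Python's standard library
# (the "samesite" morsel attribute requires Python 3.8+).
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["SID"] = "31d4d96e407aad42"
cookie["SID"]["samesite"] = "Lax"  # or "Strict"

header = cookie.output(header="Set-Cookie:")
print(header)  # e.g. Set-Cookie: SID=31d4d96e407aad42; SameSite=Lax
```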

&lt;h3&gt;
  
  
  Cross-Site Request Forgery, the initial problem
&lt;/h3&gt;

&lt;p&gt;As defined by the OWASP, Cross-Site Request Forgery (CSRF) is an attack that forces end users to execute unwanted actions within a web application in which they’re currently authenticated.&lt;br&gt;
 Let’s consider a predictable URL such as this one (which we can assume will update the user’s email):&lt;br&gt;
 &lt;a href="https://myapp.com/updateUserEmail?newEmail=myemail@example.com"&gt;https://myapp.com/updateUserEmail?newEmail=myemail@example.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;An attacker who succeeds in having a logged-in user execute this request, with the attacker’s email as the parameter, would gain access to the user’s account.&lt;br&gt;
Social engineering is often used to trick end users into triggering such requests, for example by including an image having this URL as its SRC attribute. (By the way, most mail clients now block images by default to &lt;a href="https://stackoverflow.com/a/3608241"&gt;limit CSRF attacks&lt;/a&gt;.) &lt;br&gt;
So far, preventing Cross-Site Request Forgery attacks was achieved by checking the Referer header and using CSRF tokens (a token provided by the server to the web browser, which the browser then sends within the protected request; if the request does not embed a valid token, it is rejected). &lt;br&gt;
Using a SameSite cookie (with Strict mode) prevents CSRF attacks, except, as pointed out by the spec, if the attacker combines the attack with an XSS one (do not forget to use &lt;a href="https://blog.dareboost.com/en/2016/08/content-security-policy-secure-your-website/"&gt;CSP&lt;/a&gt;!).&lt;/p&gt;
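&lt;p&gt;&lt;em&gt;As an illustration of the CSRF token defense mentioned above, here is a minimal Python sketch; the function names are illustrative, not taken from any particular framework.&lt;/em&gt;&lt;/p&gt;

```python
# Minimal sketch of the CSRF token defense: the server derives an
# unguessable token from the session, and rejects any state-changing
# request that does not echo a valid token back.
import hashlib
import hmac
import secrets

SERVER_SECRET = secrets.token_bytes(32)  # never leaves the server

def issue_csrf_token(session_id):
    # Tie the token to the session so it cannot be replayed across users.
    return hmac.new(SERVER_SECRET, session_id.encode(), hashlib.sha256).hexdigest()

def is_request_allowed(session_id, submitted_token):
    expected = issue_csrf_token(session_id)
    # Constant-time comparison, to avoid leaking information via timing.
    return hmac.compare_digest(expected, submitted_token)

sid = "31d4d96e407aad42"
token = issue_csrf_token(sid)
print(is_request_allowed(sid, token))           # True: legitimate request
print(is_request_allowed(sid, "forged-token"))  # False: cross-site forgery
```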

&lt;h3&gt;
  
  
  Same-Site Support
&lt;/h3&gt;

&lt;p&gt;There are other limitations keeping SameSite from being bulletproof against CSRF attacks, the first being the &lt;a href="https://caniuse.com/#search=same-site"&gt;current status of SameSite support by web browsers&lt;/a&gt;.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mF3vv5Qp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2017/06/caniuse-samesite-support.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mF3vv5Qp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2017/06/caniuse-samesite-support.png" alt="SameSite support by browsers"&gt;&lt;/a&gt; &lt;br&gt;
Even if Firefox has shown clear signs of supporting SameSite, Chrome and Opera are for now the only browsers handling the new instruction. A web browser that does not support the attribute will simply ignore it, so you don’t have to worry about any compatibility issues.&lt;/p&gt;

&lt;h3&gt;
  
  
  Great feature, but not bullet proof
&lt;/h3&gt;

&lt;p&gt;As a conclusion, SameSite is definitely something you should consider if your website offers sensitive actions to your users, or if you have significant traffic. Note that SameSite also helps mitigate XSSI (Cross-Site Script Inclusion) and timing attacks, as stated by the spec. &lt;br&gt;
Keep in mind that it’s just an extra layer of security: it can’t be your site’s only defense against those attacks. And that’s explicitly detailed in the spec:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“Developers are strongly encouraged to deploy the usual server-side defenses (CSRF tokens, ensuring that "safe" HTTP methods are idempotent, etc) to mitigate the risk more fully.”&lt;/p&gt;
&lt;/blockquote&gt;




&lt;p&gt;&lt;em&gt;This article was originally published on &lt;a href="https://www.dareboost.com/"&gt;Dareboost&lt;/a&gt;'s blog.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>cookies</category>
      <category>security</category>
      <category>samesite</category>
    </item>
    <item>
      <title>Content Performance Policy, an alternative to AMP?</title>
      <dc:creator>Damien Jubeau</dc:creator>
      <pubDate>Thu, 22 Jun 2017 12:55:09 +0000</pubDate>
      <link>https://dev.to/damienjubeau/content-performance-policy-an-alternative-to-amp</link>
      <guid>https://dev.to/damienjubeau/content-performance-policy-an-alternative-to-amp</guid>
      <description>&lt;p&gt;&lt;em&gt;This article was originally seen on &lt;a href="https://www.dareboost.com/"&gt;Dareboost&lt;/a&gt;'s blog.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;There is nothing really new about Content Performance Policy since August 2016. Still, given the last months' &lt;a href="https://twitter.com/molily/status/866697258976366592"&gt;discussions&lt;/a&gt; and statements about AMP, e.g. &lt;a href="https://www.theregister.co.uk/2017/05/19/open_source_insider_google_amp_bad_bad_bad/"&gt;Kill Google AMP before it KILLS the web&lt;/a&gt;, I think&lt;/em&gt; &lt;strong&gt;Content Performance Policy deserves some attention&lt;/strong&gt;&lt;em&gt;!&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;You may already know about &lt;a href="http://blog.dareboost.com/en/2016/08/content-security-policy-secure-your-website/"&gt;Content Security Policy&lt;/a&gt;. It’s a great feature to add more security to your website, particularly to protect your visitors from the effects of an XSS attack.&lt;br&gt;
The idea behind CSP is to let website owners declare a security policy that will then be applied by the web browser. For instance, it allows you to explicitly whitelist some JavaScript files, or to ensure HTTPS is used to request each resource within the page.&lt;br&gt;
&lt;a href="https://timkadlec.com/2016/02/a-standardized-alternative-to-amp/"&gt;Tim Kadlec&lt;/a&gt; and &lt;a href="https://blog.yoav.ws/"&gt;Yoav Weiss&lt;/a&gt; borrowed the general CSP concept and applied it to web performance, proposing a new HTTP header (Content Performance Policy) &lt;strong&gt;allowing a page to declare precisely its level of compliance with some web performance best practices&lt;/strong&gt;. The user agent would then be responsible for ensuring the effectiveness of the announced best practices.&lt;br&gt;
It’s kind of an SLA issued by the website to web browsers, one that the user agent will guarantee even if it means breaking its own default behaviour. &lt;br&gt;
Example: my website announces to the user agent (via the &lt;em&gt;no-blocking-font&lt;/em&gt; directive) that the page content can be displayed without waiting for a font file to be downloaded. If that’s not true (so the directive is inaccurate), the user agent should bypass its default behaviour and start displaying the textual content anyway, before downloading the font file. &lt;br&gt;
Within this article, I aim to bring some context about the motivations behind Content Performance Policy (through a summary of &lt;a href="https://timkadlec.com/2016/02/a-standardized-alternative-to-amp/"&gt;Tim Kadlec’s article&lt;/a&gt;), to discuss the spec proposal of course (even if it’s at a very early stage), and finally to focus on the current limitations we could face with Content Performance Policy. &lt;br&gt;
Indeed, within the Dareboost team, we work to provide a diagnostic tool as reliable as possible to check compliance with performance best practices (among others), so our point of view is not quite the usual one.&lt;/p&gt;
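&lt;p&gt;&lt;em&gt;To fix ideas, such a policy would be delivered as an HTTP response header, along these lines. This is a purely hypothetical illustration: the directive syntax of the early-stage proposal may well differ.&lt;/em&gt;&lt;/p&gt;

```http
Content-Performance-Policy: no-blocking-font; no-auto-play
```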

&lt;h2&gt;
  
  
  Why a Content Performance Policy ?
&lt;/h2&gt;

&lt;p&gt;We have not addressed this topic on this blog yet, but if you’re following us on Twitter (if not, &lt;a href="https://twitter.com/DareBoost"&gt;it’s time to do so&lt;/a&gt;!), you surely know about it: Google recently shifted the lines with the &lt;a href="https://www.ampproject.org/"&gt;Accelerated Mobile Pages Project&lt;/a&gt;. &lt;br&gt;
It’s a framework to build mobile web pages, focused on speed and user experience. Google is pushing things forward, &lt;a href="http://searchengineland.com/google-to-launch-amp-in-search-results-february-24-2016-242902"&gt;highlighting websites using this technology&lt;/a&gt; in the search results for some time now. &lt;br&gt;
Tim Kadlec had great words about it: AMP is a great technology, and so is the project. But Google promotes ONE technology, whereas AMP is not at all the only solution to build a fast experience; it’s just one way among others to do so. &lt;br&gt;
Tim continues his analysis, mentioning that Google has actually done this to make its own work easier. &lt;br&gt;
I raised this matter a few months ago, within &lt;a href="http://blog.dareboost.com/en/2015/03/red-slow-label-a-strong-sign-of-google/"&gt;this article about the Red Slow Label&lt;/a&gt;: it’s difficult to determine for sure whether a website is slow or fast, because you need to take a lot of parameters into account. &lt;br&gt;
Besides, neither the Red Slow Label nor the &lt;a href="https://blog.dareboost.com/en/2015/06/google-slow-to-load-ongoing-test/"&gt;Slow to Load&lt;/a&gt; label has reached the SERP. By highlighting AMP, Google actually simplifies the equation: websites using it are considered fast, so they are promoted. Then there’s the rest of the world…&lt;br&gt;
Here’s a quote from Tim that I find particularly clever:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;So when we look at what AMP offers that you cannot offer yourself already, it’s not a high-performing site–we’re fully capable of doing that already. It’s this verification.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Personally, I’m still very impressed by what Google achieved, by such an adoption of the technology, and it has certainly contributed to widening the circle of people caring about web performance. But perhaps for the wrong reasons?&lt;/p&gt;


&lt;p&gt;Tim Kadlec and Yoav Weiss worked to find a solution bringing a similar verification potential, &lt;span&gt;allowing fast websites with a good user experience to be promoted, without creating a dependency on a particular technology. Because we need to preserve the web in its openness and its diversity. &lt;br&gt;
That’s how &lt;a href="http://wicg.github.io/ContentPerformancePolicy/"&gt;Content Performance Policy&lt;/a&gt; was born! (Even if it seems the &lt;a href="https://timkadlec.com/2016/02/a-standardized-alternative-to-amp/#comment-2533868150"&gt;AMP team was already on a similar matter&lt;/a&gt;.) &lt;br&gt;
As the framework or technology used would no longer be the guarantee of good performance, as AMP currently is, that role would fall to the user agent. A website announces which best practices it complies with; if the site then breaks its promise, the browser enforces them.&lt;/span&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Content Performance Policy, a good solution?
&lt;/h2&gt;

&lt;p&gt;Let’s summarize:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Google imposed web performance on numerous website owners who depend on organic traffic, and who are therefore required to adopt AMP in order to benefit from the promotion of this technology in the search results&lt;/li&gt;
&lt;li&gt;  Content Performance Policy does not go against the idea of promoting speed; it probably even recognizes the benefit of the positive pressure that search engines can bring. However, the CPP idea is to offer an alternative to an AMP-only world.&lt;/li&gt;
&lt;li&gt;  AMP was a deal between Google and websites, whereas Content Performance Policy adds user agent vendors to the team, as user agents would vouch for websites’ promises&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Of course, within the Dareboost team, we’re very enthusiastic to discover such great ideas! We 100% agree with the approach and the need for something other than AMP. &lt;/p&gt;

&lt;p&gt;Google penalizing slow websites has always been a matter of concern for us, raising a lot of questions. We have automated the detection of many web performance issues, going further than most tools. We have also benchmarked large samples of websites to learn more about performance. &lt;br&gt;
That’s why we also wished to share our opinion through this article, even if we’re more used to speaking about finalized recommendations.&lt;/p&gt;

&lt;h3&gt;
  
  
  Stakeholders interdependencies
&lt;/h3&gt;

&lt;p&gt;The first difficulty worth focusing on is probably the narrow relationship between the three parties needed for CPP to be adopted: search engines, user agent vendors and website owners. &lt;br&gt;
Without an incentive from search engines, CPP would probably remain a web performance world thing: an interesting tool, but one that would not help push speed to a new level as AMP seems to achieve. &lt;br&gt;
Without web browser implementations, CPP would not be a verification signal at all, as nothing would force websites to comply with their promises. &lt;br&gt;
Without wide adoption by website owners, it would end up penalizing a lot of fast websites not using the mechanism.&lt;/p&gt;

&lt;h3&gt;
  
  
  Speed or nothing?
&lt;/h3&gt;

&lt;p&gt;Reading the &lt;a href="http://wicg.github.io/ContentPerformancePolicy/#directives"&gt;list of directives&lt;/a&gt; (as of February 25th), we can assume that understanding Content Performance Policy requires some advanced knowledge of various topics. &lt;br&gt;
That may hold up the adoption process: not only do you have to understand what it is about, but also what the stakes are, and especially what a directive violation would result in. &lt;br&gt;
Using CPP also implies a significant maintenance effort. Today, a mistake slows down your website; you can detect the slowdown, for instance using a &lt;a href="http://www.dareboost.com"&gt;web performance monitoring tool&lt;/a&gt; like Dareboost ;). Using CPP, a mistake may mean your website no longer works! (Example: my website promises to load less than 800 KB. If a heavy image is added, for instance via a CMS, and so directly in production, outside of my staging workflow, web browsers will block data to avoid an overflow, in order to keep my website compliant with its promise. What if the blocked content is an essential JS or CSS file?) &lt;br&gt;
Some of the CPP obstacles are similar to the ones we face with Content Security Policy adoption. If you provide A/B testing or tag manager solutions to your team, you expose yourself to breaking the promises at any time in production. And that might result in breaking your website, as the browser’s job is to vouch for the promise. &lt;br&gt;
It’s very interesting to be able to constrain third-party content. Nevertheless, for performance matters, I think it’s something you have to do earlier, when choosing your services. Using CPP for this, the risk of seeing some third-party content break at any time due to an update is high. &lt;br&gt;
Yoav’s answer:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“I'm not sure how it would look like, but we plan to have mechanisms that will limit the impact to certain hosts.” &lt;a href="https://github.com/WICG/ContentPerformancePolicy/pull/11/files"&gt;Proposal has since been updated&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Finding a common reference basis
&lt;/h3&gt;

&lt;p&gt;Last but not least, how can we establish the degree of importance of each directive? How can we be sure the list contains enough directives to reasonably affirm that a page is fast enough? Even if we succeed, establishing thresholds (resource-limit, max-internal-blocking-size) will be hard. Avoiding extra complexity is absolutely required if we hope for website owner adoption. So it would imply a common basis of thresholds for all the players using CPP as a verification mechanism (browsers, ad blockers, search engines, etc.). &lt;br&gt;
As pointed out by Yoav, this matter would be taken into account (&lt;a href="https://github.com/WICG/ContentPerformancePolicy/issues/10"&gt;see the GitHub issue&lt;/a&gt;). &lt;br&gt;
As the AMP team reminds us, we must not forget that &lt;a href="https://paulbakaus.com/2016/02/26/life-after-amp/"&gt;things that are slow today might not be slow tomorrow&lt;/a&gt;, so this basis should be able to evolve.&lt;br&gt;
&lt;strong&gt;The approach remains great, and it’s really nice to see people coming up with such amazing ideas.&lt;/strong&gt; &lt;br&gt;
I think CPP needs to offer more control over each directive, to be able to make a promise including or excluding some resources (in order to prevent breaking anything vital if a promise were to be broken). &lt;br&gt;
IMO, it might be nice to make a clear separation between promises enforced by user agents that have no major consequences (e.g. no-auto-play, no-blocking-font), those that can be risky, and those that can’t reasonably be enforced by browsers (e.g. max-internal-blocking-size). &lt;/p&gt;




&lt;p&gt;Maybe it's time to give CPP a new start? Feel free to &lt;a href="https://github.com/wicg/ContentPerformancePolicy/"&gt;contribute to the proposal&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>amp</category>
      <category>webperf</category>
      <category>performance</category>
    </item>
    <item>
      <title>Secure your Cookies (Secure and HttpOnly flags) </title>
      <dc:creator>Damien Jubeau</dc:creator>
      <pubDate>Mon, 20 Mar 2017 16:20:56 +0000</pubDate>
      <link>https://dev.to/damienjubeau/secure-your-cookies-secure-and-httponly-flags</link>
      <guid>https://dev.to/damienjubeau/secure-your-cookies-secure-and-httponly-flags</guid>
      <description>&lt;p&gt;&lt;em&gt;Cookies are omnipresent all over the web as they let publishers store data directly on the user's web browser. Especially used to identify the user session allowing the web server to recognize him all along his browsing, cookies usually contain sensitive data. You have to properly protect them.&lt;/em&gt; &lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--v6CG0eDr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2016/12/how-to-secure-cookies.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--v6CG0eDr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2016/12/how-to-secure-cookies.jpg" alt="How to secure Cookies"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Set-Cookie HTTP header
&lt;/h2&gt;

&lt;p&gt;As a reminder, a cookie is usually created on the web browser following a server response, in order to store a state that will be transmitted with subsequent requests. To do this, the web server uses a Set-Cookie header in an HTTP response. Here is the syntax of such a header:&lt;/p&gt;

&lt;pre&gt;Set-Cookie: name=value[; max-age=seconds][; expires=date][; domain=domain_name][; path=some_path][; secure][; HttpOnly]&lt;/pre&gt;

&lt;p&gt;The cookie is identified by a name, associated with a value. A lifetime (max-age) and/or an expiration date (expires) can be set. Note that if both attributes are set, the lifetime value (max-age) prevails. &lt;/p&gt;

&lt;p&gt;The web server can also define a path and a domain for the cookie. These domain and path attributes allow you to restrict its scope… or extend it (by allowing its usage on any subdomain, for example). As a consequence, one of the first best practices regarding cookie security consists in properly handling their scope. &lt;/p&gt;

&lt;p&gt;The last two attributes, Secure and HttpOnly, specifically deal with security. Note that they don't accept a value: their mere presence dictates the browser's behavior towards the cookie.&lt;/p&gt;
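&lt;p&gt;&lt;em&gt;As a sketch of how such a header can be produced server-side, here is a version using Python's standard library (the domain and values are illustrative):&lt;/em&gt;&lt;/p&gt;

```python
# Building a Set-Cookie header with Python's standard library.
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "opaque-random-value"      # name=value
cookie["session"]["domain"] = "example.com"    # illustrative domain
cookie["session"]["path"] = "/"
cookie["session"]["max-age"] = 3600            # lifetime, in seconds
cookie["session"]["secure"] = True             # never sent over plain HTTP
cookie["session"]["httponly"] = True           # not readable from JavaScript

header = cookie.output(header="Set-Cookie:")
print(header)
```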

&lt;h2&gt;
  
  
  Forbid client-side usage of a cookie thanks to HttpOnly
&lt;/h2&gt;

&lt;p&gt;A cookie can be set and used by a web server, but also directly on the web browser via JavaScript. &lt;/p&gt;

&lt;p&gt;In case of an XSS breach, an attacker could inject some JavaScript and potentially access the cookies which, as a reminder, often contain sensitive information. First of all, it is obviously better to prevent the XSS breaches themselves. Then you can keep such weaknesses from being exploited by defining a &lt;a href="https://blog.dareboost.com/en/2016/08/content-security-policy-secure-your-website/"&gt;Content Security Policy&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;The “HttpOnly” flag blocks cookie usage from JavaScript: if an attacker succeeds in injecting some JavaScript despite all your precautions, they won't be able to access the cookies anyway. That significantly limits the attack's range.&lt;/p&gt;

&lt;h2&gt;
  
  
  Forbid sending a cookie over plain HTTP thanks to the Secure flag
&lt;/h2&gt;

&lt;p&gt;We regularly recommend it on this blog: &lt;a href="https://blog.dareboost.com/en/2016/03/https-requirement-for-your-website/"&gt;your website should use HTTPS&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;If you have already adopted this protocol and applied our previous advice, you may think that your cookies are protected, as they are transmitted over a secure connection and can't be reached from JavaScript. Unfortunately, a noticeable issue remains. What if a user comes to your website via HTTP, simply because they typed your URL without mentioning “https://”? &lt;/p&gt;

&lt;p&gt;This could also happen if your web page contains &lt;a href="https://blog.dareboost.com/en/2015/04/chrome-firefox-and-google-search-https-forcing-its-way/#mixed-content"&gt;mixed content&lt;/a&gt;. Setting an HSTS (HTTP Strict Transport Security) header, which enforces HTTPS usage for all upcoming visits, will limit the risks related to the first scenario. But not all browsers support this header… and the first visit remains an issue anyway. As for the second scenario, Content Security Policy can prevent any mixed content risk in browsers that support the “Upgrade Insecure Requests” policy. &lt;/p&gt;
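&lt;p&gt;&lt;em&gt;For reference, an HSTS policy is a single response header; max-age is expressed in seconds, and the one-year value below is just a common choice, not a requirement:&lt;/em&gt;&lt;/p&gt;

```http
Strict-Transport-Security: max-age=31536000; includeSubDomains
```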

&lt;p&gt;Actually, only the Secure attribute lets you forbid a cookie from ever being transmitted over plain HTTP. The interest of this flag is clearly stated in the &lt;a href="https://tools.ietf.org/html/rfc6265"&gt;HTTP State Management Mechanism&lt;/a&gt; RFC:&lt;/p&gt;

&lt;pre&gt;Servers that require a higher level of security SHOULD use the Cookie and Set-Cookie headers only over a secure channel. When using cookies over a secure channel, servers SHOULD set the Secure attribute (see [Section 4.1.2.5](https://tools.ietf.org/html/rfc6265#section-4.1.2.5)) for every cookie. If a server does not set the Secure attribute, the protection provided by the secure channel will be largely moot.&lt;/pre&gt;

&lt;p&gt;Obviously, keep in mind that a cookie using the Secure flag will never be sent on the HTTP version of your website. So be careful if your website still has both HTTPS and HTTP areas. &lt;/p&gt;

&lt;p&gt;As a conclusion, feel free to give Dareboost's &lt;a href="https://www.dareboost.com/"&gt;web page analysis tool&lt;/a&gt; a try. It will let you ensure that all of your cookies are secured, by checking whether HttpOnly and Secure are properly used: &lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zd02APed--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2016/12/tip-cookie-secure-dareboost.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zd02APed--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.dareboost.com/wp-content/uploads/2016/12/tip-cookie-secure-dareboost.png" alt="Tip Cookie Secure Dareboost"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cookies</category>
      <category>security</category>
      <category>web</category>
    </item>
    <item>
      <title>Web Performance &amp; Security: how to master Third Party Content impacts</title>
      <dc:creator>Damien Jubeau</dc:creator>
      <pubDate>Thu, 16 Mar 2017 18:28:18 +0000</pubDate>
      <link>https://dev.to/damienjubeau/web-performance--security-how-to-master-third-party-content-impacts-</link>
      <guid>https://dev.to/damienjubeau/web-performance--security-how-to-master-third-party-content-impacts-</guid>
      <description>&lt;h2&gt;
  
  
  Why do we need to look after third parties?
&lt;/h2&gt;

&lt;p&gt;Third-party content consists of assets over which you basically don’t have much control, as they are provided and hosted by a third party.&lt;/p&gt;

&lt;p&gt;Social network widgets, advertising and tracking scripts, or web font providers are some widespread third parties you’re used to dealing with.&lt;/p&gt;

&lt;p&gt;Whether they are free or not, these providers allow website owners and developers to enrich web pages and web usages. Frequently, but not always, with features we just can’t provide by ourselves (even if technically feasible, the related cost might be prohibitive).&lt;/p&gt;

&lt;p&gt;Third-party assets are used more and more on our websites (take a look at the evolution of the &lt;a href="http://httparchive.org/trends.php?s=All&amp;amp;minlabel=Dec+16+2010&amp;amp;maxlabel=Dec+2+2016#numDomains&amp;amp;maxDomainReqs" rel="noopener noreferrer"&gt;average number of domains&lt;/a&gt; used by a web page), yet we do not pay due attention to this topic. Here’s some evidence: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In 2012, thousands of websites faced major slowdowns… due to Facebook’s “Like” widget (&lt;a href="http://www.forbes.com/forbes/welcome/?toURL=http://www.forbes.com/sites/ericsavitz/2012/06/01/facebook-outage-slowed-1000s-of-retail-content-sites/" rel="noopener noreferrer"&gt;Forbes’s article&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;More recently, you may remember Typekit’s font serving outage (related to the Amazon S3 outage), here again with thousands of websites affected (&lt;a href="https://blog.typekit.com/2015/08/11/well-that-was-just-awful-details-on-yesterdays-font-serving-outage/" rel="noopener noreferrer"&gt;post mortem on Typekit’s blog&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;A lighter and funnier example might be DailyMotion broadcasting a job offer last year within web browsers’ developer consoles. The job offer was broadcast for a few days on &lt;em&gt;all&lt;/em&gt; websites using their widgets!
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2017%2F03%2Fdailymotion-hiring-console-message.png" title="DailyMotion ASCII Art in console and hiring message with link related to the job offer" alt="DailyMotion ASCII Art in console and hiring message with link related to the job offer"&gt;
&lt;em&gt;(The message that was displayed within your console while browsing any website using a DailyMotion embed video widget)&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;If you don’t pay attention, a third-party can affect your website in many ways, as far as making it unavailable for your visitors!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Still, you’re probably not the only stakeholder when it comes to using a third party or not. However, you definitely have to go further than just reading and following the provider’s documentation. &lt;br&gt;
I like to consider third-party content as parasites. Friendly ones, with - most of the time - good intentions. But parasites anyway, and a growing infection may take down your whole website.&lt;/p&gt;

&lt;p&gt;In this post, I aim to share my vision of and experience with third-party content, but most of all to bring you some best practices and tools to mitigate the risks of integrating it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Speed and third-party content, a complicated story.
&lt;/h2&gt;

&lt;p&gt;In this first part, we’ll focus on web performance and the effects of third-party content on speed.&lt;br&gt;
This is an important topic for several reasons. Let’s start with the basics: an external asset is loaded from a different domain name than the one used by your website. That means the web browser has to trigger a DNS resolution and then establish a new TCP connection.&lt;/p&gt;

&lt;p&gt;At this early stage, we have to question the third-party server’s location: if the provider does not use a CDN (Content Delivery Network), the result may be high network latency (latency depends on the nature of the network, but also on the distance between the server and the web browser).&lt;/p&gt;

&lt;p&gt;If your website targets a particular country, you may not be familiar with this issue, as you’re probably hosting your website in this very same country. Using a third-party provider without a CDN might result in loading an asset hosted halfway around the world. &lt;/p&gt;

&lt;p&gt;But that’s not all. If the request benefits from a security layer thanks to HTTPS (and it should), you’ll have to add an extra delay, as HTTPS implies additional round trips to establish the secure connection.&lt;/p&gt;
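&lt;p&gt;When you do have to keep such a cross-origin asset, resource hints can pay part of this connection cost ahead of time. A minimal sketch, assuming a hypothetical provider host name, to place in your page’s &lt;code&gt;&amp;lt;head&amp;gt;&lt;/code&gt;:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;&amp;lt;!-- Resolve the DNS early (cheap, widely supported) --&amp;gt;
&amp;lt;link rel="dns-prefetch" href="//widgets.example-provider.com"&amp;gt;

&amp;lt;!-- Resolve DNS and open the TCP/TLS connection ahead of time --&amp;gt;
&amp;lt;link rel="preconnect" href="https://widgets.example-provider.com" crossorigin&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;This does not remove the dependency, but it takes the DNS, TCP and TLS delays off the critical moment when the asset is actually requested.&lt;/p&gt;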

&lt;p&gt;Finally, you’ll be dependent on the server response time and upstream bandwidth of the third-party provider.&lt;/p&gt;

&lt;p&gt;Some website speed test tools (give a try to ours, &lt;a href="https://www.dareboost.com" rel="noopener noreferrer"&gt;Dareboost&lt;/a&gt;!) might help you to understand the impact of third-party content on your website performance. For example, here’s a comparison report of cnn.com both with and without their third-party content. Guess which version is faster?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2017%2F03%2Fcnn-no-3rd-party-performance-report.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fblog.dareboost.com%2Fwp-content%2Fuploads%2F2017%2F03%2Fcnn-no-3rd-party-performance-report.png" title="CNN Performance Comparison Report, with and without third party content" alt="CNN Performance Comparison Report, with and without third party content"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As stated in the introduction, you don’t have control over the performance of third-party providers. You could try to establish an SLA (Service Level Agreement), but most of the time you can’t, unless you run a major website with big money.&lt;br&gt;
Third-party content can be a great way to enrich a website, but it comes with significant impacts. That’s why you have to limit this kind of dependency as much as possible.&lt;/p&gt;

&lt;p&gt;You don’t have to load a social network’s whole framework to display a simple share button: implementing your own is easy. Even if your implementation is more limited in some ways, it will definitely be a win in the end. &lt;/p&gt;
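&lt;p&gt;As a sketch of what “implementing your own” can look like, here is a dependency-free share link built with plain JavaScript (the page URL and text are placeholders; the social network is only contacted when the user actually clicks):&lt;/p&gt;

```javascript
// Build a share link without loading any third-party framework.
// A plain anchor pointing at this URL replaces the whole "share" widget.
function twitterShareUrl(pageUrl, text) {
  // URLSearchParams takes care of the encoding for us.
  const params = new URLSearchParams({ url: pageUrl, text: text });
  return 'https://twitter.com/intent/tweet?' + params.toString();
}

const shareUrl = twitterShareUrl('https://example.com/article', 'Worth a read');
console.log(shareUrl);
```

&lt;p&gt;No script from the social network runs in your page, so it can neither slow down your rendering nor fail on you.&lt;/p&gt;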

&lt;p&gt;Obviously, this example is a trivial one, and there may be some third-party content you just can’t do without. Then you’ll need to be very cautious in your integration. &lt;/p&gt;

&lt;h2&gt;
  
  
  Exclude third parties from the critical rendering path
&lt;/h2&gt;

&lt;p&gt;Displaying a web page requires a lot of operations from the browser: receiving the HTML source code, fetching assets, building the DOM and the CSSOM. &lt;br&gt;
&lt;strong&gt;Some of these operations are blocking&lt;/strong&gt;, meaning they need to finish before anything can be displayed. A well-known example is JavaScript called synchronously within the &lt;code&gt;&amp;lt;head&amp;gt;&lt;/code&gt; of a web page.&lt;/p&gt;

&lt;p&gt;Performance optimization best practices often focus on the critical rendering path, in other words on the elements and operations that impact the above-the-fold part of the page.&lt;/p&gt;

&lt;p&gt;Given the various delays added by third-party content, it’s really important to keep all of it out of your critical rendering path. You should aim for absolute control over the early display of the page.&lt;/p&gt;
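&lt;p&gt;In practice, the first step is usually to stop loading third-party scripts synchronously. A minimal sketch, with a hypothetical widget URL:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;&amp;lt;!-- Blocking: parsing and rendering wait for the download and execution --&amp;gt;
&amp;lt;script src="https://widgets.example-provider.com/widget.js"&amp;gt;&amp;lt;/script&amp;gt;

&amp;lt;!-- Non-blocking: fetched in parallel, executed as soon as it arrives --&amp;gt;
&amp;lt;script async src="https://widgets.example-provider.com/widget.js"&amp;gt;&amp;lt;/script&amp;gt;

&amp;lt;!-- Non-blocking: executed only once the document has been parsed --&amp;gt;
&amp;lt;script defer src="https://widgets.example-provider.com/widget.js"&amp;gt;&amp;lt;/script&amp;gt;
&lt;/code&gt;&lt;/pre&gt;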

&lt;p&gt;The Facebook and Typekit examples discussed earlier perfectly illustrate what’s at stake when third-party content blocks your critical rendering path: dependencies that slow down your page loading in the best-case scenario, and that might make your website unavailable if they were to fail.&lt;/p&gt;

&lt;p&gt;A dependency failure resulting in such a breakdown is called a Single Point of Failure, a notion that front-end developers might not be so familiar with.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“A single point of failure (SPOF) is a part of a system that, if it fails, will stop the entire system from working.”&lt;br&gt;
&lt;em&gt;Wikipedia&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;At this stage, I can often hear people telling me: “Yeah, OK, my website has some SPOFs. But that’s Google Fonts. I mean, GOOGLE. I can trust them for 100% uptime.” Short answer: they’re wrong. Pick the argument you like the most:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;There’s no bullet-proof big player. Our introduction was about Facebook and Typekit (related to an Amazon S3 failure), right? &lt;strong&gt;It will happen&lt;/strong&gt; one day. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;A SPOF is not triggered only when the third party fails&lt;/strong&gt;! 
A firewall or a network error/congestion can also leave a requested file without a response!&lt;/li&gt;
&lt;li&gt;When your website is not loading, from the visitors’ perspective it is &lt;strong&gt;your&lt;/strong&gt; fault, not Google’s. Real people have no idea what’s required to display your page! You can trust them, but you alone will face the consequences.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I may sound a little alarmist here. A SPOF is a risk, and you should weigh its probability of occurrence and the related income loss against the cost of fixing it. Running a personal blog? You can totally trust Google and keep your SPOF! &lt;/p&gt;

&lt;p&gt;Of course, if you’re using a third party, it’s because it achieves something you need, so a third-party failure will result in a missing feature. Most of the time, your website can (and should) still work without that feature. If you’re doing real business online, you need to eliminate SPOFs by all means!&lt;/p&gt;

&lt;p&gt;Unfortunately, some third-party services require a blocking behaviour. For example, some A/B testing solutions specifically advise a blocking integration: they want to avoid flickering (the original page must not be shown before the A/B test’s edits are applied). Yes, that is nothing less than slowing down the page rendering, but on purpose.&lt;/p&gt;

&lt;p&gt;Even then, you can still prevent it from becoming a SPOF. &lt;/p&gt;

&lt;p&gt;A blocking JavaScript file that is too slow to load will eventually be aborted by the browser. However, the default timeout policies of web browsers are not a good fit for non-critical resources (you would rather miss a few A/B test sessions than have customers wait 30 seconds for a page to render). Default timeouts vary from a couple of seconds to... minutes (there’s a &lt;a href="http://www.stevesouders.com/blog/2014/11/14/request-timeout/" rel="noopener noreferrer"&gt;great post&lt;/a&gt; from Steve Souders if you want to know more about it).&lt;/p&gt;

&lt;p&gt;Knowing that, you should take control over the timeout yourself if you need a blocking third-party asset. Weigh the feature’s value for you and your users, configure an appropriate timeout, and you can even add a fallback!&lt;/p&gt;
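&lt;p&gt;Here is a minimal sketch of that idea in plain JavaScript, racing a third-party loader against a timeout of your choosing (the simulated loader, the delays and the “control” fallback are illustrative placeholders, not a specific vendor API):&lt;/p&gt;

```javascript
// Race a third-party loader against our own timeout, resolving to a
// fallback value instead of trusting the browser's default timeout
// (which can be minutes).
function withTimeout(promise, ms, fallback) {
  return Promise.race([
    promise,
    new Promise((resolve) => setTimeout(() => resolve(fallback), ms)),
  ]);
}

// Simulate an A/B testing script that takes 100 ms to answer...
const loadAbTest = new Promise((resolve) =>
  setTimeout(() => resolve('variant-b'), 100)
);

// ...but only grant it a 20 ms budget before rendering the control version.
const result = withTimeout(loadAbTest, 20, 'control');
result.then((variant) => console.log(variant));
```

&lt;p&gt;The same pattern works for a real script: wrap the script-injection promise, and fall back to rendering the page as-is when the budget is exceeded.&lt;/p&gt;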

&lt;p&gt;Check &lt;a href="https://helpx.adobe.com/typekit/using/embed-codes.html" rel="noopener noreferrer"&gt;this post from&lt;/a&gt; Typekit, who updated their documentation after the 2015 outage.&lt;/p&gt;

&lt;p&gt;One last point about third parties and speed: tag managers. You have probably integrated Google Tag Manager on some of your projects, as it centralizes third-party calls and gives non-technical people a whole new level of autonomy to add new services… The good news is that GTM is loaded asynchronously by default, and so are the tags it injects. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“With great power comes great responsibility”&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Tag managers often answer the needs of people without a technical background, who won’t be vigilant about the impact of the content they inject: risks for load time, but also for security, as we will see later. Offering a tag manager means giving power to its users, and your job is probably to make them aware of the implications, or to set up a validation workflow.&lt;/p&gt;

&lt;h2&gt;
  
  
  Quality Audit for your 3rd parties
&lt;/h2&gt;

&lt;p&gt;As previously stated, in most cases we just can’t negotiate an SLA with the provider. Still, there are probably alternatives on the market for what you need, and you should add quality and performance to the factors you consider when choosing a provider.&lt;/p&gt;

&lt;p&gt;I remember considering a click-to-chat SaaS provider for our own website. While quickly checking the loaded assets, I saw an image file of about 250 KB. Optimizing this icon would have resulted in a file 100 times lighter.&lt;/p&gt;

&lt;p&gt;The chat was loaded asynchronously, so it was not a big deal. Still, it made for a slow image on low-bandwidth connections. More importantly, it showed that image optimization was not part of their workflow: the image could just as well have been 2 MB.&lt;/p&gt;

&lt;p&gt;A &lt;a href="https://blog.dareboost.com/en/2015/10/performance-budget/" rel="noopener noreferrer"&gt;Performance Budget&lt;/a&gt; is a very interesting method to make sure your website remains fast. Combined with website monitoring, it’s also a great way to keep control over third parties as they evolve over time.&lt;/p&gt;

&lt;p&gt;To audit the quality of a third party, Simon Hearne has done great work for you with a &lt;a href="https://webperf.ninja/2015/questions-to-ask-your-third-parties/" rel="noopener noreferrer"&gt;checklist&lt;/a&gt; you can use to make sure you consider all the major impacts.&lt;br&gt;
When you find a problem or a potential optimization, feel free to get in touch with the provider: getting it fixed will probably benefit thousands of websites! How the provider handles your request will also be a great insight.&lt;/p&gt;

&lt;h2&gt;
  
  
  Mitigate the risks related to 3rd parties
&lt;/h2&gt;

&lt;p&gt;Adding a third party to a web page implies a high level of trust in the provider, as you’re giving up some control. By applying the previous recommendations, you have made sure that the provider delivers decent quality, and that a failure won’t make your website unreachable. &lt;/p&gt;

&lt;p&gt;Still, the third party has gained control over your page! You can’t trust it blindly. Even a provider with the best intentions can make mistakes, or get hacked. Let’s talk about security.&lt;/p&gt;

&lt;p&gt;First of all: HTTPS is a requirement. If your website uses the secure version of the protocol, you just can’t have third parties working over HTTP (that would be &lt;a href="https://developer.mozilla.org/en-US/docs/Web/Security/Mixed_content" rel="noopener noreferrer"&gt;Mixed Content&lt;/a&gt;, something web browsers block automatically).&lt;/p&gt;

&lt;p&gt;You need to go further regarding security, by restricting the scope given to third-party content. For example, you can use the &lt;code&gt;sandbox&lt;/code&gt; attribute on iframes to restrict the actions available from within the frame (for example, forbidding JavaScript). &lt;/p&gt;
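&lt;p&gt;A minimal sketch, with a hypothetical embed URL (an empty &lt;code&gt;sandbox&lt;/code&gt; applies every restriction; each keyword re-enables one capability):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;&amp;lt;!-- Everything restricted: no scripts, no forms, no top-level navigation… --&amp;gt;
&amp;lt;iframe src="https://widgets.example-provider.com/embed" sandbox&amp;gt;&amp;lt;/iframe&amp;gt;

&amp;lt;!-- Re-enable only what the widget really needs --&amp;gt;
&amp;lt;iframe src="https://widgets.example-provider.com/embed"
        sandbox="allow-scripts allow-forms"&amp;gt;&amp;lt;/iframe&amp;gt;
&lt;/code&gt;&lt;/pre&gt;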

&lt;p&gt;Another, very powerful, tool: &lt;a href="https://blog.dareboost.com/en/2016/08/content-security-policy-secure-your-website/" rel="noopener noreferrer"&gt;Content Security Policy&lt;/a&gt;. With a simple HTTP header in your server’s response, you can explicitly define the content and behaviours you trust within your page. Whatever element is not allowed, the browser won’t request or execute it. Assuming, of course, that the web browser supports CSP.&lt;/p&gt;
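&lt;p&gt;For example, a hypothetical policy could look like this (the hosts are placeholders, and the header is wrapped here for readability):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;Content-Security-Policy: default-src 'self';
    script-src 'self' https://widgets.example-provider.com;
    img-src 'self' data:;
    frame-src https://widgets.example-provider.com
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;With such a policy, a script injected from any other origin would simply be refused by the browser.&lt;/p&gt;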

&lt;p&gt;If your website is vulnerable to Cross-Site Scripting (XSS), defining a Content Security Policy won’t fix the vulnerability. Nevertheless, it will protect your users against the effects of an XSS attack.  &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt; Through a form on your page, an attacker succeeds in injecting a JavaScript file, let’s call it malicious.js. The script tag requesting malicious.js is indeed in your page. However, web browsers won’t actually request the file; instead they will throw an error, as the request violates your Content Security Policy directives. The XSS vulnerability has been exploited, but without any effect on your visitors.&lt;/p&gt;

&lt;p&gt;That’s a good way to make sure your third parties don’t allow themselves to do more than they should with your pages. And if a provider were to be hacked, you would have mitigated the risks for your own traffic.&lt;/p&gt;

&lt;p&gt;Last but not least, &lt;a href="https://blog.dareboost.com/en/2016/12/secure-cookies-secure-httponly-flags/" rel="noopener noreferrer"&gt;protect your cookies&lt;/a&gt;! A third party injecting JavaScript into your document can read the cookies of your domain through JavaScript! By setting cookies with the HttpOnly flag, you forbid any client-side access!&lt;/p&gt;
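&lt;p&gt;A sketch of such a response header (the cookie name and value are placeholders):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;Set-Cookie: session_id=abc123; Secure; HttpOnly; SameSite=Lax
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;With HttpOnly set, reading &lt;code&gt;document.cookie&lt;/code&gt; from any script, yours or a third party’s, will no longer expose the session cookie.&lt;/p&gt;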

&lt;h2&gt;
  
  
  Your turn!
&lt;/h2&gt;

&lt;p&gt;Third-party usage keeps growing. Pay attention to the related risks and limit the number of third parties as much as possible. Audit the ones you need, and choose wisely among the available alternatives.&lt;/p&gt;

&lt;p&gt;Keep in mind that they are parasites and that they will evolve: add as many constraints as possible and monitor them! &lt;/p&gt;




&lt;p&gt;&lt;em&gt;This post was originally published on &lt;a href="https://blog.dareboost.com/en/2017/03/third-party-content-performance-security/" rel="noopener noreferrer"&gt;our blog&lt;/a&gt; about website speed and quality.&lt;/em&gt; &lt;/p&gt;

</description>
      <category>webperf</category>
      <category>performance</category>
      <category>optimization</category>
      <category>web</category>
    </item>
  </channel>
</rss>
