<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Fabrizio Fortunato</title>
    <description>The latest articles on DEV Community by Fabrizio Fortunato (@izifortune).</description>
    <link>https://dev.to/izifortune</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F87510%2Fb1556f42-f07a-478a-8c88-d360537dbde9.jpeg</url>
      <title>DEV Community: Fabrizio Fortunato</title>
      <link>https://dev.to/izifortune</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/izifortune"/>
    <language>en</language>
    <item>
      <title>Build a serverless website with SAM on AWS</title>
      <dc:creator>Fabrizio Fortunato</dc:creator>
      <pubDate>Mon, 22 Jul 2019 20:50:38 +0000</pubDate>
      <link>https://dev.to/izifortune/build-a-serverless-website-with-sam-on-aws-23i0</link>
      <guid>https://dev.to/izifortune/build-a-serverless-website-with-sam-on-aws-23i0</guid>
      <description>&lt;p&gt;A website can be considered serverless when it doesn't need, directly, any server to operate. Serverless is quickly becoming a popular choice for running your services and also your entire website. Adopting a serverless architecture enables you and your organisation to focus on delivering business value. Every application must have an inherent amount of irreducible complexity states the &lt;a href="https://en.wikipedia.org/wiki/Law_of_conservation_of_complexity" rel="noopener noreferrer"&gt;Law of conservation of complexity&lt;/a&gt;, the only question is who will have to deal with it. Using serverless we are offloading this complexity to someone else can take care of this for us.&lt;/p&gt;

&lt;p&gt;Working at RyanairLabs, I have the opportunity to work with amazing people every day and to try out a &lt;em&gt;serverless architecture&lt;/em&gt; for the busiest travel website in Europe. Multiple pages of the &lt;em&gt;Ryanair&lt;/em&gt; website are built using serverless technologies. Check out how we are using AWS serverless technologies in my talk at &lt;a href="https://www.youtube.com/watch?v=8jSu31nQi1g&amp;amp;feature=youtu.be" rel="noopener noreferrer"&gt;AWS Summit London&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Serverless services
&lt;/h2&gt;

&lt;p&gt;AWS offers a wide range of entirely managed serverless services. What we will be focusing on here are the basic building blocks to deploy a serverless website. The services are:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fdist%2Fstatic%2Fa0b9dd49dd3e05654ee8ab210d71a0b9%2F9f1f4%2Fsimple-architecture.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fdist%2Fstatic%2Fa0b9dd49dd3e05654ee8ab210d71a0b9%2F9f1f4%2Fsimple-architecture.png" alt="AWS serverless services for a website"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Cloudfront&lt;/em&gt;: a highly available CDN delivering your content worldwide through edge locations. We will use &lt;em&gt;Cloudfront&lt;/em&gt; to take advantage of edge caching for our website assets, routing each user request through the nearest edge location without needing to contact the origin every time. &lt;em&gt;Cloudfront&lt;/em&gt; also provides the first level of security for our website, by establishing a secure connection with the clients and by supporting &lt;a href="https://aws.amazon.com/waf/" rel="noopener noreferrer"&gt;WAF&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;S3&lt;/em&gt;: an object storage service designed to be easy to use and manage. A website page is just a collection of static resources glued together by HTML. When developing web applications, a popular choice is to use a single-page application (SPA) framework to reduce server-side complexity and improve interactivity. The Ryanair website, for example, is built using Angular and AngularJS. Another rising approach is JavaScript, APIs and Markup (JAM), fueled by static site generators. Both approaches share a simple trait: they produce static assets which can easily be stored in and served from an S3 bucket.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Lambda at the Edge (L@E)&lt;/em&gt;: lets you execute functions closer to the viewer, adding any additional logic between the edges and the origin. &lt;em&gt;L@E&lt;/em&gt; acts as the glue between &lt;em&gt;Cloudfront&lt;/em&gt; and the origin(s).&lt;/p&gt;

&lt;h2&gt;
  
  
  SAM
&lt;/h2&gt;

&lt;p&gt;In the next articles we will use &lt;a href="https://github.com/awslabs/aws-sam-cli" rel="noopener noreferrer"&gt;sam-cli&lt;/a&gt;, an open-source framework by AWS for building serverless applications, to reduce the boilerplate of working with Lambda @ Edge.&lt;/p&gt;

&lt;p&gt;The assumption is that we have already configured aws-cli and sam-cli. If that is not the case, you can follow the guide &lt;a href="https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Basic Infrastructure template
&lt;/h2&gt;

&lt;p&gt;Building an application with sam-cli, we will use &lt;em&gt;Cloudformation&lt;/em&gt; templates to describe our infrastructure resources. This gives us the ability to replicate, reuse and share infrastructure between projects quickly. A &lt;em&gt;Cloudformation&lt;/em&gt; template powers all the frontend projects that we run in RyanairLabs; treating your infrastructure as code enables us to understand changes quickly, version them and collaborate on the infrastructure.&lt;/p&gt;

&lt;p&gt;Working with &lt;em&gt;sam-cli&lt;/em&gt; or &lt;em&gt;Cloudformation&lt;/em&gt;, we tend to keep all the resources of a project in a single file, making them easier to manage and deploy. What we are looking to generate in our first iteration is a simple infrastructure that can serve any assets.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fdist%2Fstatic%2Fa0b9dd49dd3e05654ee8ab210d71a0b9%2F9f1f4%2Fsimple-architecture.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fdist%2Fstatic%2Fa0b9dd49dd3e05654ee8ab210d71a0b9%2F9f1f4%2Fsimple-architecture.png" alt="Basic infrastructure"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let's start by creating the template file called &lt;code&gt;template.yaml&lt;/code&gt; with some of the basic services that we will need:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;AWSTemplateFormatVersion&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;2010-09-09'&lt;/span&gt;
&lt;span class="na"&gt;Transform&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::Serverless-2016-10-31&lt;/span&gt;
&lt;span class="na"&gt;Description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="s"&gt;Serverless website&lt;/span&gt;

&lt;span class="na"&gt;Resources&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;CloudFrontOriginAccessIdentity&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;AWS::CloudFront::CloudFrontOriginAccessIdentity'&lt;/span&gt;
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;CloudFrontOriginAccessIdentityConfig&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;Comment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Serverless&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;website&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;OA'&lt;/span&gt;

  &lt;span class="na"&gt;CloudfrontDistribution&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;AWS::CloudFront::Distribution"&lt;/span&gt;
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;DistributionConfig&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;Comment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Cloudfront&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;distribution&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;for&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;serverless&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;website"&lt;/span&gt;
        &lt;span class="na"&gt;DefaultRootObject&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;index.html"&lt;/span&gt;
        &lt;span class="na"&gt;Enabled&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;
        &lt;span class="na"&gt;HttpVersion&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;http2&lt;/span&gt;
        &lt;span class="c1"&gt;# List of origins that Cloudfront will connect to&lt;/span&gt;
        &lt;span class="na"&gt;Origins&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Id&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;s3-website&lt;/span&gt;
            &lt;span class="na"&gt;DomainName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!GetAtt&lt;/span&gt; &lt;span class="s"&gt;S3Bucket.DomainName&lt;/span&gt;
            &lt;span class="na"&gt;S3OriginConfig&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
              &lt;span class="c1"&gt;# Restricting Bucket access through an origin access identity&lt;/span&gt;
              &lt;span class="na"&gt;OriginAccessIdentity&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; 
                &lt;span class="na"&gt;Fn::Sub&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;origin-access-identity/cloudfront/${CloudFrontOriginAccessIdentity}'&lt;/span&gt;
        &lt;span class="c1"&gt;# To connect the CDN to the origins you need to specify behaviours&lt;/span&gt;
        &lt;span class="na"&gt;DefaultCacheBehavior&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="c1"&gt;# Compress resources automatically ( gzip )&lt;/span&gt;
          &lt;span class="na"&gt;Compress&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;true'&lt;/span&gt;
          &lt;span class="na"&gt;AllowedMethods&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
            &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;GET&lt;/span&gt;
            &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;HEAD&lt;/span&gt;
            &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;OPTIONS&lt;/span&gt;
          &lt;span class="na"&gt;ForwardedValues&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
            &lt;span class="na"&gt;QueryString&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;
          &lt;span class="na"&gt;TargetOriginId&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;s3-website&lt;/span&gt;
          &lt;span class="na"&gt;ViewerProtocolPolicy &lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;redirect-to-https&lt;/span&gt;

  &lt;span class="na"&gt;S3Bucket&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::S3::Bucket&lt;/span&gt;
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="c1"&gt;# Change bucket name to reflect your website&lt;/span&gt;
      &lt;span class="na"&gt;BucketName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;&amp;lt;YOURSWEBSITE.COM&amp;gt;&lt;/span&gt;

  &lt;span class="na"&gt;S3BucketPolicy&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::S3::BucketPolicy&lt;/span&gt;
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;Bucket&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Ref&lt;/span&gt; &lt;span class="s"&gt;S3Bucket&lt;/span&gt;
      &lt;span class="na"&gt;PolicyDocument&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="c1"&gt;# Restricting access to cloudfront only.&lt;/span&gt;
        &lt;span class="na"&gt;Statement&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="pi"&gt;-&lt;/span&gt;
            &lt;span class="na"&gt;Effect&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Allow&lt;/span&gt;
            &lt;span class="na"&gt;Action&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;s3:GetObject'&lt;/span&gt;
            &lt;span class="na"&gt;Resource&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
              &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="kt"&gt;!Sub&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;arn:aws:s3:::${S3Bucket}/*"&lt;/span&gt;
            &lt;span class="na"&gt;Principal&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
              &lt;span class="na"&gt;AWS&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!Sub&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;arn:aws:iam::cloudfront:user/CloudFront&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Origin&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Access&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;Identity&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;${CloudFrontOriginAccessIdentity}"&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Going through the template file we just defined, we have the following resources:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;CloudFrontOriginAccessIdentity&lt;/code&gt;, used to restrict access to the &lt;em&gt;Bucket&lt;/em&gt; files to requests coming through &lt;em&gt;Cloudfront&lt;/em&gt;. By using an &lt;em&gt;OriginAccessIdentity&lt;/em&gt; we don't have to make our &lt;em&gt;Bucket&lt;/em&gt; public, nor enable website hosting. The &lt;em&gt;OriginAccessIdentity&lt;/em&gt; will be used in the &lt;em&gt;CloudfrontDistribution Origin&lt;/em&gt; and in the &lt;em&gt;S3BucketPolicy&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;CloudfrontDistribution&lt;/code&gt;, the &lt;em&gt;AWS&lt;/em&gt; &lt;em&gt;CDN&lt;/em&gt; used to deliver data and content globally with low latency. A series of attributes in the &lt;em&gt;DistributionConfig&lt;/em&gt; control the &lt;em&gt;CDN&lt;/em&gt;. I left comments in the file to explain them further; if you want a complete list of the attributes, you can follow the &lt;a href="https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-cloudfront-distribution.html" rel="noopener noreferrer"&gt;User Guide&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;S3Bucket&lt;/code&gt;, the &lt;em&gt;Bucket&lt;/em&gt; where we will store all the assets of the website. Remember to give the &lt;em&gt;BucketName&lt;/em&gt; a correct and unique identifier; for example, we can use the domain name of the website to help remember which &lt;em&gt;Bucket&lt;/em&gt; belongs to which website.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;S3BucketPolicy&lt;/code&gt;, which defines who can access the &lt;em&gt;Bucket&lt;/em&gt; and what type of operations are permitted. Using the &lt;em&gt;OriginAccessIdentity&lt;/em&gt; we restrict read access to &lt;em&gt;Cloudfront&lt;/em&gt; only.&lt;/p&gt;
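
As a small optional addition, not in the original template, an Outputs section can be appended to template.yaml to surface the AWS-generated Cloudfront domain name after deployment, instead of looking it up in the console:

```yaml
# Optional addition to template.yaml: expose the distribution's
# AWS-generated domain name in the stack outputs.
Outputs:
  CloudfrontDomainName:
    Description: Domain name of the Cloudfront distribution
    Value: !GetAtt CloudfrontDistribution.DomainName
```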

&lt;h2&gt;
  
  
  Deploying infrastructure
&lt;/h2&gt;

&lt;p&gt;We can already deploy our first serverless website with only the four resources listed above: we will host the website assets in the &lt;em&gt;S3Bucket&lt;/em&gt;, and cache and serve content through &lt;em&gt;Cloudfront&lt;/em&gt;. To deploy the infrastructure with sam-cli we first need to create an &lt;em&gt;S3 Bucket&lt;/em&gt; used to store the deployment package containing all the different inputs for your infrastructure.&lt;/p&gt;

&lt;p&gt;In the commands below, &lt;em&gt;remember&lt;/em&gt; always to substitute &amp;lt;YOURSWEBSITE.COM&amp;gt; with the actual domain name of your website.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws s3 mb s3://&amp;lt;YOURSWEBSITE.COM&amp;gt;-sam &lt;span class="nt"&gt;--region&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;us-east-1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After the Bucket is created we can package the application:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;sam package &lt;span class="nt"&gt;--output-template-file&lt;/span&gt; packaged.yaml &lt;span class="nt"&gt;--s3-bucket&lt;/span&gt; &amp;lt;YOURSWEBSITE.COM&amp;gt;-sam
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And finally deploy the application:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;sam deploy &lt;span class="nt"&gt;--template-file&lt;/span&gt; packaged.yaml &lt;span class="nt"&gt;--stack-name&lt;/span&gt; &amp;lt;yourwebsite&amp;gt; &lt;span class="nt"&gt;--capabilities&lt;/span&gt; CAPABILITY_IAM &lt;span class="nt"&gt;--region&lt;/span&gt; us-east-1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We are using &lt;em&gt;us-east-1&lt;/em&gt; as the default region for our infrastructure to avoid some of the limitations we would otherwise encounter later on while defining &lt;em&gt;Lambda @ Edge&lt;/em&gt; functions, which can only be deployed in that region.&lt;/p&gt;

&lt;p&gt;Creating the &lt;em&gt;Cloudfront&lt;/em&gt; distribution will take some time, generally 20-30 minutes the first time. Once the deployment is completed, the Distribution will have an AWS-generated domain name with the format &lt;a href="http://d3k1beyfkv9165.cloudfront.net/" rel="noopener noreferrer"&gt;&lt;code&gt;d3k1beyfkv2133.cloudfront.net&lt;/code&gt;&lt;/a&gt;, usable for making our first request to the website.&lt;/p&gt;

&lt;p&gt;To test that everything is working we can upload an &lt;code&gt;index.html&lt;/code&gt; to the root folder of the &lt;em&gt;S3Bucket&lt;/em&gt;, e.g.:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;html&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;body&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;h1&amp;gt;&lt;/span&gt;Hello serverless&lt;span class="nt"&gt;&amp;lt;/h1&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;img&lt;/span&gt; &lt;span class="na"&gt;src=&lt;/span&gt;&lt;span class="s"&gt;"https://github.com/awslabs/serverless-application-model/blob/master/aws_sam_introduction.png"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/body&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/html&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To upload the file we can use the &lt;em&gt;aws-cli&lt;/em&gt; command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws s3 &lt;span class="nb"&gt;cp &lt;/span&gt;index.html s3://&amp;lt;YOURWEBSITE.COM&amp;gt; &lt;span class="nt"&gt;--acl&lt;/span&gt; public-read
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the file is uploaded, if we navigate to the &lt;em&gt;Cloudfront DomainName&lt;/em&gt; we should be able to see:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fdist%2Fstatic%2Fad25eb5c1d41a31fa5f41c7ef663e5e9%2Ff55ca%2Fhello-serverless.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fdist%2Fstatic%2Fad25eb5c1d41a31fa5f41c7ef663e5e9%2Ff55ca%2Fhello-serverless.png" alt="Hello serverless page"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the articles to come, we will explore how to serve multiple pages, secure the website, handle rewrites, associate a domain and, finally, how to correctly deploy assets on our serverless website. This is a lot to cover in a single article, so don't forget to follow me on Twitter &lt;a href="https://twitter.com/izifortune" rel="noopener noreferrer"&gt;@izifortune&lt;/a&gt; to get updates.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;PS: AWS Cloud Development Kit&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The AWS CDK is now generally available. If you are looking for programmatic access to create AWS resources, you can have a look at the examples here: &lt;a href="https://aws.amazon.com/blogs/aws/aws-cloud-development-kit-cdk-typescript-and-python-are-now-generally-available/" rel="noopener noreferrer"&gt;https://aws.amazon.com/blogs/aws/aws-cloud-development-kit-cdk-typescript-and-python-are-now-generally-available/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>serverless</category>
      <category>aws</category>
    </item>
    <item>
      <title>Lighthouse architecture demystified</title>
      <dc:creator>Fabrizio Fortunato</dc:creator>
      <pubDate>Wed, 20 Mar 2019 07:39:24 +0000</pubDate>
      <link>https://dev.to/izifortune/lighthouse-architecture-demystified-hmb</link>
      <guid>https://dev.to/izifortune/lighthouse-architecture-demystified-hmb</guid>
      <description>&lt;h3&gt;
  
  
  Intro
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Flighthouse.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Flighthouse.jpg" alt="Lighthouse architecture demystified"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In one of my previous &lt;a href="https://dev.to/izifortune/performance-budgets-with-lighthouse---lighthouse-keeper-2475-temp-slug-6276913"&gt;articles&lt;/a&gt; I explained how we can use &lt;em&gt;Lighthouse&lt;/em&gt; to put our website on a budget and why monitoring your website performance is an important aspect of web development. In this article, we will go deeper into the &lt;em&gt;Lighthouse&lt;/em&gt; building blocks and its architecture, and learn how we can start auditing and collecting custom metrics for our web pages.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Lighthouse&lt;/em&gt; is an audit tool developed by Google which collects different metrics from your website. After collecting the metrics, it presents a series of scores for your webpage. The scoring is divided into five main auditing areas.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Faudit1-2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Faudit1-2.png" alt="Lighthouse architecture demystified"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Besides scoring your webpage, &lt;em&gt;Lighthouse&lt;/em&gt; provides more detailed information on where to focus your efforts, in what are called "opportunities": areas of your website where changes may lead to big performance improvements. As an example, the following application's execution times are quite high.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Fopportunities-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Fopportunities-1.png" alt="Lighthouse architecture demystified"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Lighthouse architecture
&lt;/h3&gt;

&lt;p&gt;The &lt;em&gt;Lighthouse&lt;/em&gt; &lt;a href="https://github.com/GoogleChrome/lighthouse/blob/master/docs/architecture.md#components--terminology" rel="noopener noreferrer"&gt;architecture&lt;/a&gt; is built around the &lt;a href="https://developer.chrome.com/devtools/docs/debugger-protocol" rel="noopener noreferrer"&gt;Chrome Debugging Protocol&lt;/a&gt;, a set of low-level APIs to interact with a Chrome instance. Lighthouse interfaces with a Chrome instance through the &lt;em&gt;Driver&lt;/em&gt;. The &lt;em&gt;Gatherers&lt;/em&gt; collect data from the page using the &lt;em&gt;Driver&lt;/em&gt;. The output of a &lt;em&gt;Gatherer&lt;/em&gt; is an &lt;em&gt;Artifact&lt;/em&gt;, a collection of grouped metrics. An &lt;em&gt;Artifact&lt;/em&gt; is then used by an &lt;em&gt;Audit&lt;/em&gt; to test for a metric: each &lt;em&gt;Audit&lt;/em&gt; asserts and assigns a score to a specific metric. The output of the &lt;em&gt;Audits&lt;/em&gt; is used to generate the Lighthouse report that we are familiar with.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Flighthouse-architcture.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Flighthouse-architcture.png" alt="Lighthouse architecture demystified"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will now take a closer look at two of the &lt;em&gt;Lighthouse&lt;/em&gt; building blocks by creating a simple audit tracking the internal rendering of a webpage. Creating an application is necessary in order to include a custom &lt;em&gt;Gatherer&lt;/em&gt; or &lt;em&gt;Audit&lt;/em&gt;, since it is not possible to add them directly in the &lt;em&gt;Chrome&lt;/em&gt; DevTools panel.&lt;/p&gt;

&lt;p&gt;Let's create our project and install lighthouse as a dependency&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mkdir custom-audit &amp;amp;&amp;amp; cd custom-audit  
npm i --save lighthouse
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To start auditing our website we will then create a new file, &lt;code&gt;scan.js&lt;/code&gt;, where we will import &lt;em&gt;Lighthouse&lt;/em&gt; and start scanning the webpage of our choice, using programmatic access to &lt;em&gt;Lighthouse&lt;/em&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const lighthouse = require('lighthouse');  
const chromeLauncher = require('chrome-launcher');

async function launchChromeAndRunLighthouse(url, opts, config = null) {  
  const chrome = await chromeLauncher.launch({chromeFlags: opts.chromeFlags});
  opts.port = chrome.port;
  const { lhr } = await lighthouse(url, opts, config);
  await chrome.kill()
  return lhr;
}

const opts = {};

// Usage:
(async () =&amp;gt; {
  try {
    const results = await launchChromeAndRunLighthouse('https://izifortune.github.io/lighthouse-custom-gatherer', opts);
    console.log(results);
  } catch (e) {
    console.log(e);
  }
})();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If we now try to run our file we should be able to see the results coming from a lighthouse scan in the console:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;node scan.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
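
The lhr object printed by the scan is large. As a sketch, here is how you might pull just the category scores out of it; the lhr sample below is a hypothetical minimal slice, and in recent Lighthouse versions category scores are in the 0-1 range:

```javascript
// Hypothetical minimal slice of a Lighthouse result (lhr), for illustration.
const lhr = {
  categories: {
    performance: { title: 'Performance', score: 0.92 },
    accessibility: { title: 'Accessibility', score: 0.88 },
  },
};

// Turn each category into a "Title: score-out-of-100" line.
const scores = Object.values(lhr.categories)
  .map((c) => `${c.title}: ${Math.round(c.score * 100)}`);
console.log(scores.join('\n'));
```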



&lt;p&gt;Now that we have a project with Lighthouse up and running, we can start looking at how a Gatherer works and how we can use it in our project. We will use a webpage that I’ve created for this &lt;a href="https://izifortune.github.io/lighthouse-custom-gatherer" rel="noopener noreferrer"&gt;demo&lt;/a&gt;. In the page, I’m fetching todo list items from an API and rendering them on the page. I’m measuring the action using the &lt;a href="https://developer.mozilla.org/en-US/docs/Web/API/Performance" rel="noopener noreferrer"&gt;Performance API&lt;/a&gt; as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const getDataFromServer = async () =&amp;gt; {  
  performance.mark('start');
  const todos = await getTodos();
  renderTodos(todos);
  performance.mark('end');
  performance.measure('Render todos', 'start', 'end');
  const measure = performance.getEntriesByName('Render todos')[0];
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Gatherer
&lt;/h4&gt;

&lt;p&gt;A Gatherer is used by Lighthouse to collect data on the page. In fact, any data that is currently needed to perform the default lighthouse audits is collected through a Gatherer. We can extend the Gatherer base class and start creating custom ones:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const { Gatherer } = require('lighthouse');

class MyGatherer extends Gatherer {  
  ...
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The class Gatherer defines three different lifecycle hooks that we can implement in our class:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;em&gt;beforePass&lt;/em&gt;&lt;/strong&gt; - called before navigation to the given URL&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;em&gt;pass&lt;/em&gt;&lt;/strong&gt; - called after the page is loaded, while the trace is being recorded&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;em&gt;afterPass&lt;/em&gt;&lt;/strong&gt; - called after the page is loaded, all the other passes have been executed and a trace is available&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A lifecycle hook is expected to return either an Artifact directly or a Promise which resolves to the desired Artifact. Depending on what data we are looking to collect from the Driver, and at what time, we can use any of the hooks just described.&lt;/p&gt;

&lt;p&gt;Let’s now create a custom Gatherer which will collect the measurements from the &lt;em&gt;Performance API&lt;/em&gt;. The Gatherer then needs to collect entries of entryType &lt;code&gt;measure&lt;/code&gt; using a PerformanceObserver. We will proceed to create the file &lt;code&gt;todos-gatherer.js&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;'use strict';

const { Gatherer } = require('lighthouse');

function performance() {  
  return new Promise((res) =&amp;gt; {
    const logger = (list) =&amp;gt; {
      const entries = list.getEntries();
      window.todosPerformance = entries[0].duration;
      res(entries[0].duration);
    };
    const observer = new PerformanceObserver(logger);
    observer.observe({ entryTypes: ['measure'], buffered: true });
  });
}

class TodosGatherer extends Gatherer {  
  beforePass(options) {
    const driver = options.driver;
    return driver.evaluateScriptOnNewDocument(`(${performance.toString()})()`)
  }

  afterPass(options) {
    const driver = options.driver;
    return driver.evaluateAsync('window.todosPerformance')
  }
}

module.exports = TodosGatherer;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Inside TodosGatherer we use both the &lt;em&gt;beforePass&lt;/em&gt; and &lt;em&gt;afterPass&lt;/em&gt; hooks to ask the Driver to execute a JavaScript function in the context of the current page, returning a promise. In &lt;em&gt;beforePass&lt;/em&gt; we register a &lt;em&gt;PerformanceObserver&lt;/em&gt; before the page loads: if the observer were registered only after the load, and were not buffered, we could run into a race condition and miss the measure. In &lt;em&gt;afterPass&lt;/em&gt; we then collect the previously recorded measure.&lt;br&gt;&lt;br&gt;
To get an idea of all the methods available on the driver object, have a look&lt;br&gt;&lt;br&gt;
&lt;a href="https://github.com/GoogleChrome/lighthouse/blob/34a542a156b271fd7725525b09a92a344d7809c0/lighthouse-core/gather/driver.js" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Now we need to include it in our &lt;code&gt;scan.js&lt;/code&gt; file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const lighthouse = require('lighthouse');  
const chromeLauncher = require('chrome-launcher');

const config = {  
  passes: [{
    passName: 'defaultPass', //Needed to run custom Gatherers/Audits in the same pass
    gatherers: [
      `todos-gatherer`,
    ],
  }],
}
...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If we try to run &lt;code&gt;scan.js&lt;/code&gt; at this point, we will receive an error saying there are no audits to run. A Gatherer on its own doesn’t produce any metrics; it outputs Artifacts that Audits consume to define them. So let’s have a look at Audits next.&lt;/p&gt;

&lt;h4&gt;
  
  
  Audit
&lt;/h4&gt;

&lt;p&gt;An Audit defines a metric or score: it takes Artifacts as input and calculates the desired score. The different audits that &lt;em&gt;Lighthouse&lt;/em&gt; performs, such as &lt;em&gt;FirstMeaningfulPaint&lt;/em&gt; or &lt;em&gt;SpeedIndex&lt;/em&gt;, are in fact all defined as audits internally.&lt;br&gt;&lt;br&gt;
To create a custom &lt;em&gt;Audit&lt;/em&gt;, similar to a &lt;em&gt;Gatherer&lt;/em&gt;, we will extend the base class &lt;code&gt;Audit&lt;/code&gt; and implement the basic methods:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const { Audit } = require('lighthouse');

class MyAudit extends Audit {  
  static get meta() {
    ...
  }

  static audit(artifacts) {
    ...
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The class &lt;em&gt;Audit&lt;/em&gt; defines two methods that need to be overridden:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;em&gt;meta&lt;/em&gt;&lt;/strong&gt; - used to define information about the audit&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;em&gt;audit&lt;/em&gt;&lt;/strong&gt; - takes the &lt;em&gt;Artifacts&lt;/em&gt; from Gatherers as input and returns the audit product containing the metric.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With this information in mind, we can now implement our custom Audit and start collecting the performance of the todo list. The custom audit file will be named &lt;code&gt;todos-audit.js&lt;/code&gt; and will contain:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;'use strict';

const Audit = require('lighthouse').Audit;

class TodosAudit extends Audit {  
  static get meta() {
    return {
      id: 'todos-audit',
      title: 'Todos are loaded and rendered',
      scoreDisplayMode: Audit.SCORING_MODES.NUMERIC,
      failureTitle: 'Todos loading is too slow.',
      description: 'Used to measure time for fetching and rendering todos list',
      requiredArtifacts: ['TodosGatherer'],
    };
  }

  static audit(artifacts) {
    const measure = artifacts.TodosGatherer;

    return {
      rawValue: measure,
      score: Math.max(1 - (measure / 1500), 0),
      displayValue: `Todos rendering is: ${measure}ms`
    };
  }
}
module.exports = TodosAudit;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Inside the &lt;code&gt;meta&lt;/code&gt; method we define information describing the &lt;em&gt;Audit&lt;/em&gt; itself, such as id, title, scoreDisplayMode, and description. We also declare the &lt;em&gt;Artifacts&lt;/em&gt; the &lt;em&gt;Audit&lt;/em&gt; needs via &lt;code&gt;requiredArtifacts&lt;/code&gt;; in this case &lt;code&gt;TodosGatherer&lt;/code&gt; is the name of the Gatherer of interest.&lt;/p&gt;
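&lt;p&gt;For clarity, the score computed in &lt;code&gt;audit&lt;/code&gt; above falls linearly from 1 at 0 ms to 0 at 1500 ms, and is clamped at 0 beyond that (1500 ms is simply the budget this audit chose). A quick runnable sketch of that mapping:&lt;/p&gt;

```javascript
// Linear scoring used by TodosAudit: 1 at 0 ms, 0 at 1500 ms or slower.
function todosScore(measureMs) {
  return Math.max(1 - measureMs / 1500, 0);
}

console.log(todosScore(0));    // 1
console.log(todosScore(750));  // 0.5
console.log(todosScore(3000)); // 0
```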

&lt;p&gt;Now we need to add it to the configuration inside &lt;code&gt;scan.js&lt;/code&gt;, similar to what we did previously for the &lt;em&gt;Gatherer&lt;/em&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const config = {  
  passes: [{
    passName: 'defaultPass',
    gatherers: [
      `todos-gatherer`,
    ],
  }],
  audits: [
    'todos-audit',
  ],
  categories: {
    todos: {
      title: 'Todos metrics',
      description: 'Performance metrics for todos',
      auditRefs: [
      // When we add more custom audits, `weight` controls how they're averaged together.
      {id: 'todos-audit', weight: 1},
    ],
    },
  },
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Launching our scan now, we can see our custom audit logged to the console.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Fconsole.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Fconsole.png" alt="Lighthouse architecture demystified"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you prefer the report in a different format, such as HTML, you can add the output option to the options object passed to the lighthouse function; &lt;em&gt;Lighthouse&lt;/em&gt; uses it to configure the report’s output format. To recap, the final &lt;code&gt;scan.js&lt;/code&gt; will look like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const lighthouse = require('lighthouse');  
const chromeLauncher = require('chrome-launcher');  
const { promisify } = require('util');  
const { writeFile } = require('fs');  
const pWriteFile = promisify(writeFile);

const config = {  
  passes: [{
    passName: 'defaultPass',
    gatherers: [
      `todos-gatherer`,
    ],
  }],
  audits: [
    'todos-audit',
  ],
  categories: {
    todos: {
      title: 'Todos metrics',
      description: 'Performance metrics for todos',
      auditRefs: [
      // When we add more custom audits, `weight` controls how they're averaged together.
      {id: 'todos-audit', weight: 1},
    ],
    },
  },
}

async function launchChromeAndRunLighthouse(url, opts, config = null) {  
  const chrome = await chromeLauncher.launch({chromeFlags: opts.chromeFlags});
  opts.port = chrome.port;
  const { lhr, report } = await lighthouse(url, opts, config);
  await chrome.kill()
  return report;
}

const opts = {  
  output: 'html'
};

// Usage:
(async () =&amp;gt; {
  try {
    const results = await launchChromeAndRunLighthouse('https://izifortune.github.io/lighthouse-custom-gatherer', opts, config);
    await pWriteFile('report.html', results)
  } catch (e) {
    console.log(e);
  }
})();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Running it now produces an HTML report in &lt;code&gt;report.html&lt;/code&gt;, which will look similar to the following:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Freport1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Freport1.png" alt="Lighthouse architecture demystified"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can also include the standard &lt;em&gt;Lighthouse&lt;/em&gt; audits alongside our custom one by adding the following key to the configuration object:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const config = {  
  extends: 'lighthouse:default', // Include Lighthouse default audits
  passes: [{
    passName: 'defaultPass',
    gatherers: [
      `todos-gatherer`,
    ],
  }],
...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Ffull-report.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Ffull-report.png" alt="Lighthouse architecture demystified"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After this introduction to the Lighthouse architecture, we can start customising the Lighthouse audits by measuring and reporting metrics that are relevant to us. We will now explore how to use Gatherers to overcome a common problem when performing scans in a CI environment.&lt;/p&gt;

&lt;h3&gt;
  
  
  Session guard pages
&lt;/h3&gt;

&lt;p&gt;At &lt;em&gt;Ryanair&lt;/em&gt;, we use &lt;em&gt;Lighthouse&lt;/em&gt; extensively to audit our webpages as part of an automated job that performs scans at regular intervals, and we then analyse the results regularly. One of the main problems we encountered when running automated scans is how to audit pages behind authentication or a user session. With manual scans we can easily generate a session before starting the audit, but when running &lt;em&gt;Lighthouse&lt;/em&gt; from a CI environment we need to generate a session programmatically and pass that information to &lt;em&gt;Lighthouse.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;A common approach to user session management in a web application is to generate a token, often a JWT, on the server after a successful login and store the resulting token in the browser. The token can live in any of the browser’s storage mechanisms, such as &lt;em&gt;LocalStorage&lt;/em&gt;, &lt;em&gt;SessionStorage&lt;/em&gt;, or &lt;em&gt;Cookies&lt;/em&gt;. I won’t debate here which is the best place to store a token; what interests us is how to write to any of these browser storages so that &lt;em&gt;Lighthouse&lt;/em&gt; can access the token and perform an audit.&lt;/p&gt;

&lt;p&gt;Using a custom &lt;em&gt;Gatherer&lt;/em&gt; we can create a user session by leveraging the &lt;em&gt;beforePass&lt;/em&gt; lifecycle hook, which triggers before navigation to the page URL. In the hook, we call an API to generate a session and then, through the &lt;em&gt;Driver&lt;/em&gt; method &lt;code&gt;evaluateScriptOnNewDocument&lt;/code&gt;, pass a function to be executed in the browser instance.&lt;/p&gt;

&lt;p&gt;For the purpose of this demo, I've created another &lt;a href="https://izifortune.github.io/lighthouse-custom-gatherer/auth" rel="noopener noreferrer"&gt;page&lt;/a&gt; that renders the todos only if the user is authenticated. To fake authentication, the page checks that a specific token is present in &lt;em&gt;LocalStorage&lt;/em&gt; and only then starts fetching and rendering todos.&lt;/p&gt;
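&lt;p&gt;The page-side guard might look roughly like the following sketch; the real page code is not shown in this post, so the &lt;code&gt;token&lt;/code&gt; key and helper name are assumptions. A Map-backed stand-in replaces &lt;em&gt;LocalStorage&lt;/em&gt; so the sketch runs outside a browser:&lt;/p&gt;

```javascript
// Hypothetical page-side guard: render todos only when a token exists.
// `storage` stands in for window.localStorage on the real page.
function isAuthenticated(storage) {
  return storage.getItem('token') !== null;
}

// Map-backed stand-in for LocalStorage, so the sketch runs in Node.
const fakeStorage = {
  data: new Map(),
  getItem(key) { return this.data.has(key) ? this.data.get(key) : null; },
  setItem(key, value) { this.data.set(key, String(value)); },
};

console.log(isAuthenticated(fakeStorage)); // false: no token yet
fakeStorage.setItem('token', 'abc');
console.log(isAuthenticated(fakeStorage)); // true: todos would render
```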

&lt;p&gt;Let's create a new &lt;em&gt;Gatherer&lt;/em&gt; called &lt;em&gt;SessionGatherer&lt;/em&gt; in the file &lt;code&gt;session-gatherer.js&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const { Gatherer } = require('lighthouse');

const TOKEN = 'iOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiYWRtaW4iOnRydWUsImp0aSI6IjkzZDU0MDBiLTQ5MzgtNDNmZS1iMjY4LTY2MDJlNDIxMjFiYiIsImlhdCI6MTU1MjkwNjc0NywiZXhwIjoxNTUyOTEwMzQ3fQ.qEJflkN2ntXrQFalBkkw4duCh55HdNBLGXZOV-dS3KQ';

function createSession(token) {  
  localStorage.setItem('token', token);
}

class SessionGatherer extends Gatherer {

  async beforePass(options) {
    const driver = options.driver;
    return driver.evaluateScriptOnNewDocument(`(${createSession.toString()})('${TOKEN}')`);
  }
}

module.exports = SessionGatherer;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once we have created the gatherer, we need to tell &lt;em&gt;Lighthouse&lt;/em&gt; to include it in the list so that it runs alongside all the other gatherers during an audit. We will create another file, &lt;code&gt;scan-auth.js&lt;/code&gt;, as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const lighthouse = require('lighthouse');  
const chromeLauncher = require('chrome-launcher');  
const { promisify } = require('util');  
const { writeFile } = require('fs');  
const pWriteFile = promisify(writeFile);

const config = {  
  extends: 'lighthouse:default',
  passes: [{
    passName: 'defaultPass',
    gatherers: [
      `session-gatherer`,
      `todos-gatherer`
    ],
  }],
  audits: [
    'todos-audit'
  ],
  categories: {
    todos: {
      title: 'Todos metrics',
      description: 'Performance metrics for todos',
      auditRefs: [
      // When we add more custom audits, `weight` controls how they're averaged together.
      {id: 'todos-audit', weight: 1},
    ],
    },
  },
}

async function launchChromeAndRunLighthouse(url, opts, config = null) {  
  const chrome = await chromeLauncher.launch({chromeFlags: opts.chromeFlags});
  opts.port = chrome.port;
  const { lhr, report } = await lighthouse(url, opts, config);
  await chrome.kill()
  return report;
}

const opts = {  
  output: 'html'
};

// Usage:
(async () =&amp;gt; {
  try {
    const results = await launchChromeAndRunLighthouse('https://izifortune.github.io/lighthouse-custom-gatherer/auth', opts, config);
    await pWriteFile('report.html', results)
  } catch (e) {
    console.log(e);
  }
})();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now we can start running our scans on the pages behind a user session and monitor their performance.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;node scan-auth.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This records the default &lt;em&gt;Lighthouse&lt;/em&gt; metrics plus the custom Todos metrics, which in this case are behind user authentication.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Ffull-report-auth.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Ffull-report-auth.png" alt="Lighthouse architecture demystified"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Inside the report, if you look at the &lt;em&gt;filmstrip&lt;/em&gt; you will notice that the todo list rendered correctly, meaning a valid token was found in &lt;em&gt;LocalStorage&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Ffilmstrip.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fizifortune.com%2Fcontent%2Fimages%2F2019%2F03%2Ffilmstrip.png" alt="Lighthouse architecture demystified"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I've collected the examples presented here in this repository &lt;a href="https://github.com/izifortune/lighthouse-custom-gatherer" rel="noopener noreferrer"&gt;https://github.com/izifortune/lighthouse-custom-gatherer&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Credits
&lt;/h3&gt;

&lt;p&gt;The initial idea came from looking at the custom audit recipe from the &lt;em&gt;Lighthouse&lt;/em&gt; team, which you can find &lt;a href="https://github.com/GoogleChrome/lighthouse/tree/master/docs/recipes/custom-audit" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;br&gt;&lt;br&gt;
The recipe served as an inspiration and a first example for creating a custom Gatherer/Audit. I would also like to thank @patrickhulce for his time and prompt answers on Gitter.&lt;/p&gt;

</description>
      <category>lighthouse</category>
      <category>performance</category>
    </item>
    <item>
      <title>Angular polyfill strategies</title>
      <dc:creator>Fabrizio Fortunato</dc:creator>
      <pubDate>Mon, 12 Nov 2018 16:42:00 +0000</pubDate>
      <link>https://dev.to/izifortune/angular-polyfill-strategies-2dmn</link>
      <guid>https://dev.to/izifortune/angular-polyfill-strategies-2dmn</guid>
      <description>

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ergwaKlm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/http://izifortune.com/content/images/2018/11/angular-polyfill-strategies.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ergwaKlm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/http://izifortune.com/content/images/2018/11/angular-polyfill-strategies.png" alt="Angular polyfill strategies"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Cross-browser compatibility is a big part of modern web development. While the majority of browsers are now aligning with the new web standards, cross-browser issues still occur for different reasons: sometimes browsers have bugs, or different levels of support for a new technology. &lt;em&gt;Angular&lt;/em&gt; supports all the most recent browsers, but supporting such a variety of browsers is challenging, especially where support for modern technologies is lacking. This is why the &lt;em&gt;Angular&lt;/em&gt; team &lt;a href="https://angular.io/guide/browser-support"&gt;recommends&lt;/a&gt; loading polyfills depending on your targets. A polyfill provides functionality that is expected to be natively available.&lt;/p&gt;

&lt;p&gt;An &lt;em&gt;Angular&lt;/em&gt; application created with the &lt;em&gt;angular-cli&lt;/em&gt; contains the file &lt;code&gt;src/polyfills.ts&lt;/code&gt;. The file highlights the different modules that might be needed for a specific browser to work properly. When creating a web application, especially a public-facing one, it is necessary to deal with cross-browser compatibility: a user might come to your web app with an older browser, or an older version of a browser, and you shouldn’t force them to upgrade. Depending on the project’s audience, you need to load the appropriate polyfills.&lt;/p&gt;

&lt;p&gt;In this post, I want to explore different approaches to load polyfills on an Angular application. Starting from a very simple scenario where we load all the possible modules to more advanced ones. The polyfills are often a neglected part of an &lt;em&gt;Angular&lt;/em&gt; application resulting in a performance loss for your web application and users.&lt;/p&gt;

&lt;h3&gt;Bundle them all&lt;/h3&gt;

&lt;p&gt;In our first case, we simply uncomment all the import statements in our polyfills file. This is a very simple approach, and I imagine that for a lot of developers starting an &lt;em&gt;Angular&lt;/em&gt; project it is the easiest way to get the application working with no extra effort, except for your users.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import 'core-js/es6/symbol';  
import 'core-js/es6/object';  
import 'core-js/es6/function';  
import 'core-js/es6/parse-int';  
import 'core-js/es6/parse-float';  
import 'core-js/es6/number';  
import 'core-js/es6/math';  
import 'core-js/es6/string';  
import 'core-js/es6/date';  
import 'core-js/es6/array';  
import 'core-js/es6/regexp';  
import 'core-js/es6/map';  
import 'core-js/es6/weak-map';  
import 'core-js/es6/set';  
import 'classlist.js'; // Run `npm install --save classlist.js`.  
import 'core-js/es6/reflect';  
import 'web-animations-js'; // Run `npm install --save web-animations-js`.  
import 'zone.js/dist/zone'; // Included with Angular CLI.
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;All browsers, even the ones that don’t need polyfills, pay the price of the oldest browser in your list of supported ones, so every user has to download the generated file in its entirety. If we look at the generated polyfill bundle size, it is double the size that a common utility library such as &lt;a href="https://ramdajs"&gt;ramda&lt;/a&gt; or &lt;a href="https://lodash.com/"&gt;lodash&lt;/a&gt; occupies, without any tree-shaking applied to them.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--kLG8CIKA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://izifortune.com/content/images/2018/11/polyfills-ramda-lodash.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--kLG8CIKA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://izifortune.com/content/images/2018/11/polyfills-ramda-lodash.png" alt="Angular polyfill strategies"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;IE aside&lt;/h3&gt;

&lt;p&gt;As you can imagine, this is not an ideal solution: it impacts all your users indiscriminately. While researching how to improve on it, I stumbled across what is currently used at &lt;a href="https://angular.io"&gt;angular.io&lt;/a&gt;. The website serves an additional &lt;code&gt;ie-polyfill&lt;/code&gt; file only to IE11/10/9 browsers and a smaller polyfill to all the rest. The solution is simple and effective: pre-generate a bundle for IE and load it via a &lt;em&gt;script&lt;/em&gt; tag that is always present in the index, marked with the &lt;code&gt;nomodule&lt;/code&gt; attribute.&lt;/p&gt;

&lt;p&gt;Create the following file &lt;code&gt;src/ie-polyfills.js&lt;/code&gt;:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/ **IE9, IE10 and IE11 requires all of the following polyfills.** /
import 'core-js/es6/symbol';  
import 'core-js/es6/object';  
import 'core-js/es6/function';  
import 'core-js/es6/parse-int';  
import 'core-js/es6/parse-float';  
import 'core-js/es6/number';  
import 'core-js/es6/math';  
import 'core-js/es6/string';  
import 'core-js/es6/date';  
import 'core-js/es6/array';  
import 'core-js/es6/regexp';  
import 'core-js/es6/map';  
import 'core-js/es6/set';  
import 'classlist.js';  
import 'web-animations-js';
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Leaving only the following inside &lt;code&gt;src/polyfill.ts&lt;/code&gt;&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import 'zone.js/dist/zone'; // Included with Angular CLI.
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;To generate the &lt;em&gt;ie-polyfills&lt;/em&gt; just add the following command inside the &lt;em&gt;script&lt;/em&gt; section in the &lt;code&gt;package.json&lt;/code&gt;:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"build-ie-polyfills": "webpack-cli src/ie-polyfills.js -o src/generated/ie-polyfills.min.js -c webpack-polyfill.config.js",
"postbuild": "cp -p src/generated/ie-polyfills.min.js dist/generated"
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;And finally, add the &lt;em&gt;script&lt;/em&gt; tag inside the head in &lt;code&gt;src/index.html&lt;/code&gt;:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;script nomodule="" src="generated/ie-polyfills.min.js"&amp;gt;&amp;lt;/script&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;nomodule&lt;/code&gt; attribute is a boolean attribute that prevents a script from being executed in user agents that support module scripts. Only browsers that do not support modules, like IE11/10/9, will download the bigger bundle; the rest will ignore it and download only the smaller file.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BuRgIJLp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://izifortune.com/content/images/2018/11/polyfill-strategies.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BuRgIJLp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://izifortune.com/content/images/2018/11/polyfill-strategies.png" alt="Angular polyfill strategies"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That’s almost a 60% improvement over the initial &lt;em&gt;bundle them all&lt;/em&gt; solution. It can easily be integrated into any application and doesn't involve any complex changes or infrastructure support; there is only one additional file served by your application. A stackblitz application already configured for the occasion is available &lt;a href="https://stackblitz.com/edit/angular-ahwqb6"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;Serverless service&lt;/h3&gt;

&lt;p&gt;The previous solution is great, and I really encourage you to test it in your project for an easy performance boost. Still, a few issues cannot be ignored: older browsers have to perform an additional request, fetching two polyfill files, and &lt;code&gt;nomodule&lt;/code&gt; can only differentiate between browsers that do or do not support modules. There are scenarios where you need to load a polyfill only for a specific browser: in one of our web applications, for example, we needed to load &lt;code&gt;core-js/es7/object&lt;/code&gt; specifically for iOS 9.&lt;/p&gt;

&lt;p&gt;A more advanced solution is to dynamically serve the polyfills by detecting the browser making the request, using the user-agent header.&lt;br&gt;&lt;br&gt;
A popular service used widely by the web development community is &lt;a href="https://polyfill.io"&gt;polyfill.io&lt;/a&gt;. The service, created by Jonathan Neal, does exactly what we described earlier. Unfortunately, &lt;em&gt;polyfill.io&lt;/em&gt; cannot be used for &lt;em&gt;Angular&lt;/em&gt; applications due to the changes it performs on global objects.&lt;/p&gt;

&lt;p&gt;The approach taken by &lt;em&gt;polyfill.io&lt;/em&gt; is very promising and reduces the total number of requests even for the oldest browsers in the group. With serverless we can recreate a similar solution that is easy to set up and maintain, and that also works with &lt;em&gt;Angular&lt;/em&gt; applications.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://serverless.com"&gt;Serverless&lt;/a&gt; framework is an open-source CLI for building and deploying serverless applications. &lt;em&gt;Serverless&lt;/em&gt; provides basically a layer of abstraction on top of different cloud providers such as &lt;em&gt;AWS&lt;/em&gt;, &lt;em&gt;Google cloud&lt;/em&gt; or &lt;em&gt;Azure&lt;/em&gt;. Let’s start by downloading and installing the &lt;em&gt;serverless-cli&lt;/em&gt; globally.&lt;/p&gt;

&lt;p&gt;The prerequisites for this section are an &lt;em&gt;AWS&lt;/em&gt; account and the &lt;em&gt;aws-cli&lt;/em&gt; installed on your machine.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm i -g serverless
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Now that the &lt;em&gt;serverless-cli&lt;/em&gt; is installed globally, we can use it to create the first project based on a nodejs template.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;serverless create --template aws-nodejs --path angular-polyfill &amp;amp;&amp;amp; cd angular-polyfill &amp;amp;&amp;amp; npm init
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;The CLI creates two files inside the project: &lt;code&gt;serverless.yml&lt;/code&gt;, which specifies the infrastructure configuration of the service, and &lt;code&gt;handler.js&lt;/code&gt;, which contains the function: a Lambda function that will be executed on managed &lt;em&gt;AWS&lt;/em&gt; infrastructure.&lt;/p&gt;
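&lt;p&gt;As a rough sketch of where this is heading (the function name, HTTP route, and runtime below are assumptions, not taken from this post), a &lt;code&gt;serverless.yml&lt;/code&gt; exposing such a handler over HTTP on &lt;em&gt;AWS&lt;/em&gt; might look like:&lt;/p&gt;

```yaml
service: angular-polyfill

provider:
  name: aws
  runtime: nodejs8.10   # a Node.js runtime available on Lambda at the time

functions:
  polyfill:             # hypothetical function name
    handler: handler.polyfill
    events:
      - http:           # exposes the Lambda through API Gateway
          path: polyfills.min.js
          method: get
```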

&lt;p&gt;Before going into serverless specifics, we are going to generate the different polyfill bundles using &lt;em&gt;webpack&lt;/em&gt; directly from the command line. To reduce their size we will also configure the &lt;em&gt;UglifyJsPlugin&lt;/em&gt;, which takes care of the necessary optimisations. Start by adding some dependencies:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm i --save-dev core-js web-animations-js classlist.js webpack zone.js webpack-cli uglifyjs-webpack-plugin
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Create a simple &lt;em&gt;webpack&lt;/em&gt; configuration in &lt;code&gt;webpack.config.js&lt;/code&gt;, and the &lt;em&gt;JavaScript&lt;/em&gt; files with the different polyfill imports:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// webpack.config.js
const UglifyJsPlugin = require('uglifyjs-webpack-plugin');

module.exports = {  
  mode: 'production',
  optimization: {
    minimizer: [new UglifyJsPlugin({
      uglifyOptions: {
        warnings: false,
        parse: {},
        compress: {},
        mangle: true, // Note `mangle.properties` is `false` by default.
        output: null,
        toplevel: false,
        nameCache: null,
        ie8: false,
        keep_fnames: false,
      }
    })]
  }
};

//polyfills.js
import 'core-js/es7/reflect'; // Used in JIT - remove if you use only AOT  
import 'zone.js/dist/zone';

//ie-polyfills.js
import 'core-js/es6/symbol';  
import 'core-js/es6/object';  
import 'core-js/es6/function';  
import 'core-js/es6/parse-int';  
import 'core-js/es6/parse-float';  
import 'core-js/es6/number';  
import 'core-js/es6/math';  
import 'core-js/es6/string';  
import 'core-js/es6/date';  
import 'core-js/es6/array';  
import 'core-js/es6/regexp';  
import 'core-js/es6/map';  
import 'core-js/es6/set';  
import 'classlist.js';  
import 'web-animations-js';

import 'core-js/es7/reflect'; // Used in JIT - remove if you use only AOT  
import 'zone.js/dist/zone';
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;And lastly, add the scripts to our &lt;code&gt;package.json&lt;/code&gt;:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"build-ie-polyfills": "webpack-cli polyfills.js ie-polyfills.js -o ./generated/ie-polyfills.min.js --mode production --optimize-minimize -c webpack.config.js",
"build-polyfills": "webpack-cli polyfills.js -o ./generated/polyfills.min.js --mode production --optimize-minimize -c webpack.config.js"

npm run build-ie-polyfills &amp;amp;&amp;amp; npm run build-polyfills
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Running the two commands generates two different polyfill files inside the &lt;code&gt;generated&lt;/code&gt; folder, to be used inside our serverless function. The &lt;em&gt;Serverless&lt;/em&gt; framework needs to include the generated files in the packaged function. Inside &lt;code&gt;serverless.yml&lt;/code&gt; specify:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;package:  
  include:
    - generated/polyfills.min.js
    - generated/ie-polyfills.min.js
  exclude:
    - ie-polyfills.js
    - polyfills.js
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;It is time to open &lt;code&gt;handler.js&lt;/code&gt; and add the logic to serve a different polyfill bundle based on the &lt;em&gt;user agent&lt;/em&gt; of the request. The idea is to detect the &lt;em&gt;user agent&lt;/em&gt; and then, based on that information, serve one or the other polyfill. Import a utility library which parses the &lt;em&gt;user agent&lt;/em&gt; string with high accuracy, using hand-tuned dedicated regular expressions for browser matching.&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm i --save useragent

//handler.js
'use strict';

const useragent = require('useragent');  
const util = require('util');  
const fs = require('fs');  
const readFile = util.promisify(fs.readFile);  
const path = require('path');

const getPolyFillName = (browser) =&amp;gt; {  
  switch (browser) {
    case 'ie':
      return path.join(__dirname, '/generated/ie-polyfills.min.js');
    default:
      return path.join(__dirname, '/generated/polyfills.min.js');
  }
};

module.exports.polyfill = async (event, context) =&amp;gt; {  
  const ua = useragent.parse(event.headers['user-agent']);
  console.log(ua);
  const body = await readFile(getPolyFillName(ua.family.toLowerCase()), 'utf8');
  return {
    statusCode: 200,
    body,
    headers: {
      'Content-Type': 'application/javascript',
      'X-Detected-UA': ua.family,
    }
  };
};
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;The serverless function needs an endpoint for our browser to reach it. While working with &lt;em&gt;AWS&lt;/em&gt; we can attach a lambda to an &lt;a href="https://aws.amazon.com/api-gateway/"&gt;API Gateway&lt;/a&gt;, and the latter will provide the integration from the function to the &lt;em&gt;REST&lt;/em&gt; endpoint. &lt;em&gt;Serverless&lt;/em&gt; simplifies the creation of all the resources needed: simply add &lt;em&gt;events&lt;/em&gt; to the function inside &lt;code&gt;serverless.yml&lt;/code&gt;&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;polyfill:
    handler: handler.polyfill
    events:
       - http:
           method: get
           path: /polyfill
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Now we are ready to deploy our function:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sls deploy
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;At the end of the creation process, Serverless outputs a URL that points to your lambda.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fZldEvts--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://izifortune.com/content/images/2018/11/serverless-output.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fZldEvts--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://izifortune.com/content/images/2018/11/serverless-output.png" alt="Angular polyfill strategies"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Test that everything is working by pasting the URL in your browser: the service should return the polyfill bundle targeted for your browser.&lt;/p&gt;

&lt;p&gt;We can now easily modify our Angular application to download the polyfills directly from the service URL and remove polyfills bundling from it.&lt;/p&gt;
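&lt;p&gt;A minimal sketch of that change (the endpoint URL below is a placeholder for the one printed by &lt;code&gt;sls deploy&lt;/code&gt;): load the bundle from the service in &lt;code&gt;index.html&lt;/code&gt; before the application scripts, and drop the polyfills bundle from the Angular build.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;!-- index.html: placeholder URL, replace with the one output by sls deploy --&amp;gt;
&amp;lt;script src="https://xxxxxx.execute-api.eu-west-1.amazonaws.com/dev/polyfill"&amp;gt;&amp;lt;/script&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;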

&lt;p&gt;This is a very simple setup to showcase the power of &lt;em&gt;serverless&lt;/em&gt;. Depending on your website traffic, there are different ways to enhance it; if you are interested, let me know in the comments below.&lt;/p&gt;

&lt;p&gt;Spending time deciding which polyfills should be included is a step that is often skipped, penalising everyone for the “sin” of a few browsers. Hopefully the solutions explored here will help you improve your website's experience for all of your users.&lt;/p&gt;


</description>
      <category>angular</category>
      <category>webdev</category>
      <category>serverless</category>
    </item>
  </channel>
</rss>
