<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Marco Aguzzi</title>
    <description>The latest articles on DEV Community by Marco Aguzzi (@maguzzi).</description>
    <link>https://dev.to/maguzzi</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1212543%2Fc8e9eddf-6e46-4a2f-b4b3-4f4932e54ea6.jpeg</url>
      <title>DEV Community: Marco Aguzzi</title>
      <link>https://dev.to/maguzzi</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/maguzzi"/>
    <language>en</language>
    <item>
      <title>Got my AWS AI practitioner certification!</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Thu, 04 Sep 2025 10:55:10 +0000</pubDate>
      <link>https://dev.to/maguzzi/got-my-aws-ai-practitioner-certification-10ap</link>
      <guid>https://dev.to/maguzzi/got-my-aws-ai-practitioner-certification-10ap</guid>
      <description>&lt;p&gt;I wanted to have a more structured overview of AI, so I figured out I’d earn this certification&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.credly.com/badges/2a1b3b0b-87a5-43fc-9116-ef00a5ec4055/public_url" rel="noopener noreferrer"&gt;View credentials here&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>practitioner</category>
      <category>ai</category>
      <category>resume</category>
    </item>
    <item>
      <title>RSS to social integration - An example using Lambda, Python, and Amazon Translate</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Mon, 21 Apr 2025 19:12:00 +0000</pubDate>
      <link>https://dev.to/maguzzi/rss-to-social-integration-an-example-using-lambda-python-and-amazon-translate-14j0</link>
      <guid>https://dev.to/maguzzi/rss-to-social-integration-an-example-using-lambda-python-and-amazon-translate-14j0</guid>
      <description>&lt;p&gt;In this article we’ll explore a RSS to social (e.g. LinkedIn) integration using AWS Lambda with Python. We’ll use Amazon Translate to provide the content of the post in Italian for the social platform. The architecture will be defined via Terraform. We’ll proceed as follows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Definition of the Lambda infrastructure in Terraform

&lt;ul&gt;
&lt;li&gt;How Terraform manages python code&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Python software components

&lt;ul&gt;
&lt;li&gt;Production code&lt;/li&gt;
&lt;li&gt;Unit and integration test code&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Integration examples&lt;/li&gt;

&lt;li&gt;Upcoming improvements&lt;/li&gt;

&lt;/ul&gt;

&lt;h1&gt;
  
  
  Terraform infrastructure
&lt;/h1&gt;

&lt;p&gt;This is the Terraform scheme generated with the &lt;code&gt;terraform graph&lt;/code&gt; command, plus some editing of the dot file to add some fancy graphics:&lt;br&gt;&lt;br&gt;
 &lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa5s2cwzff4uwuk8d5boc.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa5s2cwzff4uwuk8d5boc.webp" alt="Terraform graph with icons" width="501" height="333"&gt;&lt;/a&gt;&lt;br&gt;&lt;br&gt;
Let’s review the components:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;resource blocks (solid lines)

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;rss_to_linkedin&lt;/em&gt; main lambda function | &lt;em&gt;main.tf&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;python_deps&lt;/em&gt; lambda layer | &lt;em&gt;main.tf&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;rss_to_linkedin_role&lt;/em&gt; role for lambda permissions | &lt;em&gt;iam.tf&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;rss_to_linkedin_policy&lt;/em&gt; policy for lambda permissions. At the time of writing, only permissions to call Amazon Translate and store logs on Cloudwatch are granted | &lt;em&gt;iam.tf&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;rss_to_linkedin_policy_attachment&lt;/em&gt; link between role and policy | &lt;em&gt;iam.tf&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;data blocks (dashed lines)

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;python_src&lt;/em&gt; archive file for python source layer | &lt;em&gt;data.tf&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;python_deps&lt;/em&gt; archive file for python library | &lt;em&gt;data.tf&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Python source code in Terraform
&lt;/h2&gt;

&lt;p&gt;Terraform manages python source code in the two &lt;em&gt;archive_file&lt;/em&gt; data blocks &lt;em&gt;python_src&lt;/em&gt; and &lt;em&gt;python_deps&lt;/em&gt;. The first one is dedicated to the source of the lambda function. No particular setup is needed, just having .py files in the folder will do.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;python_src data block&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;data "archive\_file" "python\_src" {
 type = "zip"
 source\_dir = "python\_src"
 output\_path = "target/python.zip"
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The second &lt;em&gt;archive_file&lt;/em&gt; is for the python dependencies that will be used in the lambda layer.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;python_deps data block&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;data "archive\_file" "python\_deps" {
 type = "zip"
 source\_dir = "./python\_deps"
 output\_path = "./target/python\_deps.zip"
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this case, the libraries installed with &lt;em&gt;pip install&lt;/em&gt; to run the code locally must be installed into the &lt;em&gt;python_deps&lt;/em&gt; folder so that the &lt;em&gt;data&lt;/em&gt; block can zip them. The folder structure under python_deps must follow a specific layout for the archive to work as a lambda layer. That layout is reflected in the &lt;em&gt;pip install&lt;/em&gt; command shown here via the target path option (&lt;em&gt;-t&lt;/em&gt;)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install -t python\_deps\python\lib\python3.13\site-packages\ requests feedparser beautifulsoup4 boto3

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Both resources, &lt;em&gt;rss_to_linkedin&lt;/em&gt; and the &lt;em&gt;python_deps&lt;/em&gt; layer, which reference their respective &lt;em&gt;archive_file&lt;/em&gt; blocks, define the &lt;em&gt;source_code_hash&lt;/em&gt; attribute. These hashes tell Terraform when the zip files need to be rebuilt, that is, when the python code or the installed libraries have changed.&lt;/p&gt;
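&lt;p&gt;As a rough illustration of how the hash works (a sketch, not Terraform’s actual internals): the value is the base64-encoded SHA-256 of the archive, so any byte change in the zip yields a different hash and triggers a redeploy.&lt;/p&gt;

```python
# Sketch of Terraform's filebase64sha256 in Python: the value stored in
# source_code_hash is the base64-encoded SHA-256 digest of the zip file.
import base64
import hashlib

def filebase64sha256(data):
    digest = hashlib.sha256(data).digest()
    return base64.b64encode(digest).decode()

# Different archive bytes produce different hashes, so Terraform knows
# the lambda (or the layer) must be updated.
print(filebase64sha256(b"zip bytes v1"))
print(filebase64sha256(b"zip bytes v2"))
```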

&lt;h1&gt;
  
  
  Python sources
&lt;/h1&gt;

&lt;p&gt;Before describing the python sources in detail, let’s review a scheme similar to the one in the previous article (&lt;a href="https://dev.to/2025/04/20/what-ive-got-on-linkedin-oauth2/"&gt;An Oauth2 use case - Authenticating and posting articles with images via LinkedIn API (v. 202504)&lt;/a&gt;) but focused on components. The actual external calls (LinkedIn and boto3) have been omitted for simplicity.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F94lv13li575fky60xjgy.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F94lv13li575fky60xjgy.webp" alt="Sequence diagram about how components communicate" width="800" height="617"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the following paragraphs, we’ll reference the numbers in the image above.&lt;/p&gt;

&lt;h2&gt;
  
  
  Production code
&lt;/h2&gt;

&lt;h3&gt;
  
  
  main.py
&lt;/h3&gt;

&lt;p&gt;The main lambda function. (1) It uses the &lt;em&gt;feedparser&lt;/em&gt; package to parse the RSS link stored in an environment variable. It could also have been a parameter in the event payload, but this keeps things simple. The latest RSS entry is taken from the first position of the items array; this is quite crude, and it’s for sure the next improvement point. Next, the &lt;em&gt;latest_entry&lt;/em&gt; object is converted into a result object with the fields that will be used later on. The resulting object starts with an empty body and image, and the content sub-items are then scanned to populate the image and link attributes.&lt;br&gt;&lt;br&gt;
Once the post has been processed, it’s passed to the &lt;em&gt;publish_to_profile&lt;/em&gt; function (2).&lt;/p&gt;
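&lt;p&gt;A minimal sketch of this conversion step (the dict keys are assumptions modeled on feedparser’s output, not the project’s actual names):&lt;/p&gt;

```python
# Hypothetical sketch of how main.py could map the latest feed entry to
# the result object described above; all key names are assumptions.
def entry_to_post(entry):
    post = {"title": entry.get("title", ""),
            "link": entry.get("link", ""),
            "body": "",            # starts empty, filled from sub-items
            "image_url": None}     # starts empty as well
    # Scan the content sub-items to populate body and image attributes.
    for item in entry.get("content", []):
        if item.get("type") == "text/html" and not post["body"]:
            post["body"] = item.get("value", "")
    for link in entry.get("links", []):
        if link.get("type", "").startswith("image/") and not post["image_url"]:
            post["image_url"] = link.get("href")
    return post

# The "latest entry is item zero" shortcut mentioned above:
entries = [{"title": "Newest post", "link": "https://example.invalid/post",
            "content": [{"type": "text/html", "value": "(html body here)"}],
            "links": [{"type": "image/webp",
                       "href": "https://example.invalid/cover.webp"}]}]
latest_entry = entries[0]
print(entry_to_post(latest_entry))
```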
&lt;h3&gt;
  
  
  social_publish_linkedin.py
&lt;/h3&gt;

&lt;p&gt;This component is responsible for preparing the post. It hides from the caller the process of linking the image to the content: it calls &lt;em&gt;linkedin_media_share_manager&lt;/em&gt;’s &lt;em&gt;prepare_media_for_post&lt;/em&gt; (3) to retrieve (9) the URN (registered by LinkedIn) of the image to be used inside the post.&lt;br&gt;&lt;br&gt;
It then prepares the text by cleaning the html via the &lt;em&gt;beautifulsoup4&lt;/em&gt; library and translating it by calling Amazon Translate via the boto3 client (10, 11). It then calls &lt;em&gt;request_facade&lt;/em&gt;’s &lt;em&gt;publish_to_profile&lt;/em&gt; method to actually post the content (12).&lt;br&gt;&lt;br&gt;
There are two more sub-steps in the cleaning phase that are not depicted in the graph:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the text must be escaped, otherwise LinkedIn will cut off the text at the first special character, such as a parenthesis (see below for the reference)&lt;/li&gt;
&lt;li&gt;the text is placed into a template that adds a preview and a closing for the content, such as “This is a new post” and “Continues on…”&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Points 13 and 14 are simply the returns to the main caller.&lt;/p&gt;
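&lt;p&gt;The escaping and templating sub-steps can be sketched like this (the exact special-character set is an assumption based on the Stack Overflow reference linked in the Links section; &lt;em&gt;$translated_text&lt;/em&gt; mirrors the template variable shown in the integration section):&lt;/p&gt;

```python
# Hedged sketch of the two cleaning sub-steps: escaping LinkedIn's
# "little text" special characters, then wrapping the result in the
# message template. The special-character set is an assumption.
from string import Template

SPECIALS = set("\\|{}@[]()*~_#+-!")

def escape_special_chars(text):
    # Prefix each special character with a backslash so LinkedIn does
    # not truncate the post at the first parenthesis or bracket.
    return "".join("\\" + c if c in SPECIALS else c for c in text)

def apply_template(template, translated_text):
    return Template(template).substitute(translated_text=translated_text)

escaped = escape_special_chars("Amazon Nova Premier (preview)")
print(apply_template("This is a new post: $translated_text Continues on my blog...",
                     escaped))
```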
&lt;h3&gt;
  
  
  linkedin_media_share_manager.py
&lt;/h3&gt;

&lt;p&gt;This component encapsulates the logic LinkedIn requires for using media in posts. The steps masked by this component are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Asking the &lt;em&gt;request_facade&lt;/em&gt; for an upload URL and the URN to use for the media via the method &lt;em&gt;get_upload_url_urn&lt;/em&gt; (4, 5)&lt;/li&gt;
&lt;li&gt;Downloading the image referenced in the RSS link to local storage (6)&lt;/li&gt;
&lt;li&gt;Uploading the image payload by calling &lt;em&gt;request_facade&lt;/em&gt;’s &lt;em&gt;upload_image&lt;/em&gt; (7, 8)&lt;/li&gt;
&lt;li&gt;Returning the image’s URN to be used when posting the content (9)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Please note that, as depicted in the diagram, if the image is not present in the post, the whole “get the URL, then upload” flow is skipped. The publish step will instead use the link to the post to embed an article in it.&lt;/p&gt;
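&lt;p&gt;The flow above, sketched with a stand-in facade (the class and method names follow the description; the bodies are assumptions):&lt;/p&gt;

```python
# Sketch of prepare_media_for_post with the facade injected, so the
# "no image, no upload" branch is visible. The facade here is a fake;
# the real one wraps LinkedIn HTTP calls.
class FakeRequestFacade:
    def get_upload_url_urn(self):
        # (4, 5) the real facade asks LinkedIn for these two values
        return "https://upload.example.invalid", "urn:li:image:123"

    def upload_image(self, upload_url, payload):
        # (7, 8) the real facade uploads the payload to upload_url
        return True

def prepare_media_for_post(facade, image_bytes):
    if image_bytes is None:
        # No image in the post: skip the whole get-URL-then-upload flow.
        return None
    upload_url, urn = facade.get_upload_url_urn()
    facade.upload_image(upload_url, image_bytes)
    return urn  # (9) the URN to embed in the post

facade = FakeRequestFacade()
print(prepare_media_for_post(facade, b"fake image bytes"))
print(prepare_media_for_post(facade, None))
```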
&lt;h3&gt;
  
  
  requests_facade.py
&lt;/h3&gt;

&lt;p&gt;This component masks request body details from the upper logic, such as API version changes (e.g. from LinkedIn’s &lt;em&gt;ugcPost&lt;/em&gt; to the &lt;em&gt;202504&lt;/em&gt; api). It uses the python &lt;em&gt;requests&lt;/em&gt; library to make the http calls. We won’t go into the details of the calls, because they were addressed in the previous post.&lt;/p&gt;
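&lt;p&gt;A hedged sketch of the idea: the facade is the only place that knows the request body shape, so an API version change stays contained in one module (the field names below are illustrative assumptions, not a transcription of LinkedIn’s schema):&lt;/p&gt;

```python
# Sketch of a facade that hides the post body details from the upper
# logic; only this module would change when moving between API versions.
API_VERSION = "202504"

def build_post_body(author_urn, text, media_urn=None):
    body = {
        "author": author_urn,
        "commentary": text,
        "lifecycleState": "PUBLISHED",
        "visibility": "PUBLIC",
    }
    if media_urn is not None:
        body["content"] = {"media": {"id": media_urn}}
    return body

print(build_post_body("urn:li:person:42", "Hello", "urn:li:image:123"))
```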
&lt;h3&gt;
  
  
  Libraries used in python
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;feedparser - used to parse the RSS feed into a json-like object&lt;/li&gt;
&lt;li&gt;boto3 - AWS client used to connect to Amazon Translate&lt;/li&gt;
&lt;li&gt;requests - http client used to make GET, PUT, and POST requests&lt;/li&gt;
&lt;li&gt;beautifulsoup4 - used to translate html tags such as headings, line breaks, and paragraphs into newlines&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Test code
&lt;/h2&gt;

&lt;p&gt;The code contains a stub (we can call it a mock in its literal sense) of unit and integration tests. This made it easy to try the code before going live with the lambda; otherwise, troubleshooting would have been a real pain. Nicer and more polished methods were available, but these simple calls have done their job.&lt;br&gt;&lt;br&gt;
The command to run the tests is (from the root of the python source folder)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python -m unittest discover tests

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Four tests are available:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;test_remove_html_to_spaces&lt;/em&gt; and &lt;em&gt;test_remove_html_preserve_links&lt;/em&gt; - these unit tests check that the cleaning of the html works as expected. They use an excerpt taken from the blog html and assert that newlines and spaces are correctly placed.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;test_escape_special_chars&lt;/em&gt; - this unit test checks that special characters are correctly escaped&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;test_process_event&lt;/em&gt; - this is an integration test that actually calls the main entry point and runs all the code until it finally posts to LinkedIn. It might be over the top, but it offers an instant check of the code against the final result&lt;/li&gt;
&lt;/ul&gt;
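&lt;p&gt;For illustration, here is a self-contained version of the escaping unit test (the helper is a stand-in for the project’s real function, shown inline so the file runs on its own):&lt;/p&gt;

```python
# Self-contained sketch of one unit test; escape_special_chars is a
# stand-in for the project's real helper, defined inline so this file
# runs by itself with the unittest runner.
import unittest

def escape_special_chars(text):
    specials = set("()[]{}")
    return "".join("\\" + c if c in specials else c for c in text)

class TestEscapeSpecialChars(unittest.TestCase):
    def test_escape_special_chars(self):
        self.assertEqual(escape_special_chars("hello (world)"),
                         "hello \\(world\\)")

# In the project the whole suite runs with: python -m unittest discover tests
unittest.main(argv=["tests"], exit=False)
```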

&lt;h1&gt;
  
  
  Integration samples
&lt;/h1&gt;

&lt;p&gt;We can download the project to our machine and issue &lt;code&gt;terraform init&lt;/code&gt;, &lt;code&gt;terraform plan&lt;/code&gt;, and &lt;code&gt;terraform apply&lt;/code&gt; to create the lambda. Variables can be passed using the syntax &lt;code&gt;--var-file=file&lt;/code&gt;. The file should look like this (see the example on GitHub)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rss\_url = "your rss to parse"
access\_token = "access token taken from linkedin"
profile\_id = "profile id got from userInfo api"
# remember to use single quote if using powershell
message\_template = "some text before $translated\_text some text after" 

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the lambda is ready in AWS, we can invoke it from the cli:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws lambda invoke --function-name arn:aws:lambda:us-east-1:2\*\*\*\*\*\*\*\*\*\*8:function:rss\_to\_linkedin

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and its output will be&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
 "StatusCode": 200,
 "ExecutedVersion": "$LATEST"
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;What follows are two examples of actual posted content, with screenshots and links.&lt;/p&gt;

&lt;h2&gt;
  
  
  Content with link (article)
&lt;/h2&gt;

&lt;p&gt;An example of content with a link, run against the &lt;a href="https://aws.amazon.com/it/blogs/aws/amazon-nova-premier-our-most-capable-model-for-complex-tasks-and-teacher-for-model-distillation/" rel="noopener noreferrer"&gt;AWS Blog&lt;/a&gt;&lt;br&gt;&lt;br&gt;
with the RSS feed located &lt;a href="https://aws.amazon.com/blogs/aws/feed/" rel="noopener noreferrer"&gt;here&lt;/a&gt;, can be viewed on my LinkedIn profile &lt;a href="https://www.linkedin.com/posts/marcoaguzzi_amazon-nova-premier-our-most-capable-model-activity-7323696698977935361-EuWQ?utm_source=share&amp;amp;utm_medium=member_desktop&amp;amp;rcm=ACoAAARwjJsBrgvEjnn744onHRpQkGJ_PVoxtgI" rel="noopener noreferrer"&gt;here&lt;/a&gt;&lt;br&gt;&lt;br&gt;
 &lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgl6gvf43a1s1vfu1cd0p.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgl6gvf43a1s1vfu1cd0p.webp" alt="Post on linked with link" width="575" height="520"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Content with image
&lt;/h2&gt;

&lt;p&gt;An example of content with an image, run against &lt;a href="https://marcoaguzzi.it/2025/04/20/what-ive-got-on-linkedin-oauth2/" rel="noopener noreferrer"&gt;marcoaguzzi.it&lt;/a&gt;&lt;br&gt;&lt;br&gt;
with the RSS feed located &lt;a href="https://marcoaguzzi.it/atom.xml" rel="noopener noreferrer"&gt;here&lt;/a&gt;, can be viewed on my LinkedIn profile &lt;a href="https://www.linkedin.com/posts/marcoaguzzi_nuovo-articolo-su-marcoaguzziit-in-questo-activity-7321798559324876801-yTcM?utm_source=share&amp;amp;utm_medium=member_desktop&amp;amp;rcm=ACoAAARwjJsBrgvEjnn744onHRpQkGJ_PVoxtgI" rel="noopener noreferrer"&gt;here&lt;/a&gt;&lt;br&gt;&lt;br&gt;
 &lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fct662tk5dr39rbwpc83y.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fct662tk5dr39rbwpc83y.webp" alt="Post on linked with image" width="522" height="550"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Conclusions
&lt;/h1&gt;

&lt;p&gt;This post has taken the knowledge gained in the previous post and encapsulated it in an AWS Lambda function. Next steps could be:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;implement a mechanism to know whether a post has already been published on social media, backed by storage (e.g. DynamoDB)&lt;/li&gt;
&lt;li&gt;schedule the lambda to run at a certain frequency, like once or twice a day&lt;/li&gt;
&lt;li&gt;better manage the difference between posts with an image and posts with a link&lt;/li&gt;
&lt;li&gt;better unit and integration testing with proper mocks&lt;/li&gt;
&lt;li&gt;remove RSS-specific parameters from the environment variables (otherwise we would need one lambda per RSS feed)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Thanks for reading!&lt;/p&gt;

&lt;h1&gt;
  
  
  Links
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;The lambda source code can be viewed &lt;a href="https://github.com/maguzzi/terraform_rss_integration" rel="noopener noreferrer"&gt;on GitHub&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Reference to the code that prevents special characters from truncating the post: &lt;a href="https://stackoverflow.com/questions/73712703/linkedin-post-api-post-text-gets-cut-off-if-contains" rel="noopener noreferrer"&gt;stackoverflow.com/questions/73712703/linkedin-post-api-post-text-gets-cut-off-if-contains&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Icon Attributions
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://iconscout.com/free-3d-illustration/rss-reader-2950169_2447930" rel="noopener noreferrer"&gt;IconScout Rss reader icon&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.flaticon.com/free-icons/zip-format" rel="noopener noreferrer"&gt;Zip format icons created by Dimas Anom - Flaticon&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.flaticon.com/free-icons/layer" rel="noopener noreferrer"&gt;Layer icons created by itim2101 - Flaticon&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>integration</category>
      <category>aws</category>
      <category>linkedin</category>
      <category>github</category>
    </item>
    <item>
      <title>An Oauth2 use case - Authenticating and posting articles with images via LinkedIn API (v. 202504)</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Sun, 20 Apr 2025 19:12:00 +0000</pubDate>
      <link>https://dev.to/maguzzi/an-oauth2-use-case-authenticating-and-posting-articles-with-images-via-linkedin-api-v-202504-4ech</link>
      <guid>https://dev.to/maguzzi/an-oauth2-use-case-authenticating-and-posting-articles-with-images-via-linkedin-api-v-202504-4ech</guid>
      <description>&lt;p&gt;In this article we will do a review of all the steps needed for authenticating on Linkedin via its API, which rely on Oauth2 protocol. After authentication, we’ll use the API to post content with images. We’ll assume that the reader already has an active account on LinkedIn.&lt;br&gt;&lt;br&gt;
We’ll go in details will all the requests and responses needed for succeeding, and in each step we’ll provide a sequence diagram to better follow the process.&lt;br&gt;&lt;br&gt;
We’ll also refer to the last version of the LinkedIn API, which il 2025-04.&lt;br&gt;&lt;br&gt;
The content will be as follow:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Prerequisites before starting&lt;/li&gt;
&lt;li&gt;Oauth2 in Action&lt;/li&gt;
&lt;li&gt;Get the profile URN&lt;/li&gt;
&lt;li&gt;Finally posting content&lt;/li&gt;
&lt;/ul&gt;
&lt;h1&gt;
  
  
  We have a LinkedIn account, what else do we need?
&lt;/h1&gt;

&lt;p&gt;Let’s do a summary here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a company page, needed for creating an app&lt;/li&gt;
&lt;li&gt;an app on LinkedIn&lt;/li&gt;
&lt;li&gt;something capable of responding to a callback URL&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  A company page
&lt;/h2&gt;

&lt;p&gt;In order to leverage the &lt;em&gt;Oauth2&lt;/em&gt; protocol, a company page must be present on the profile we’ll use for posting the content.&lt;br&gt;&lt;br&gt;
Creating a page is quite simple, it’s free, and the mandatory information is really minimal. I’ve created mine basically for technical purposes like the one of this post.&lt;br&gt;&lt;br&gt;
We can view the page &lt;a href="https://www.linkedin.com/company/marcoaguzzi-it/about/" rel="noopener noreferrer"&gt;at this link&lt;/a&gt;.&lt;/p&gt;
&lt;h2&gt;
  
  
  An app on LinkedIn developer
&lt;/h2&gt;

&lt;p&gt;Once logged in to our profile, go to &lt;a href="https://developer.linkedin.com/" rel="noopener noreferrer"&gt;developer.linkedin.com&lt;/a&gt; to create the app that will sit between us and LinkedIn in order to grant permissions. Let’s review the app creation:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmjyroi2y1l8oebi5r58l.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmjyroi2y1l8oebi5r58l.webp" alt="New app creation" width="800" height="602"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Other than the obvious name and logo (the privacy policy is not mandatory), the important part is the company page linked to the app. We’ve put a red dot on the sentences that explain how developers, apps, and company pages are regulated. Once created, the app is listed like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjf0hz2fqvdzrkzuowaew.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjf0hz2fqvdzrkzuowaew.webp" alt="New app done" width="406" height="281"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Configuring the app
&lt;/h3&gt;

&lt;p&gt;Let’s review the app configuration for Oauth2. This is the header we’ll be presented after creating the app:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftbucq8n0jolz45136js1.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftbucq8n0jolz45136js1.webp" alt="Config header" width="572" height="128"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h4&gt;
  
  
  Settings
&lt;/h4&gt;

&lt;p&gt;It just shows the main information entered upon creation. We can skip the screenshot this time.&lt;/p&gt;
&lt;h4&gt;
  
  
  Auth
&lt;/h4&gt;

&lt;p&gt;This is the one that matters. It contains three boxes:&lt;/p&gt;
&lt;h5&gt;
  
  
  Application credentials
&lt;/h5&gt;

&lt;p&gt;It is dedicated to the Client ID and the Client Primary secret, which are used for the authentication calls. We could include a screenshot here, but it would be all greyed out 🙂&lt;/p&gt;
&lt;h5&gt;
  
  
  OAuth 2.0 settings
&lt;/h5&gt;

&lt;p&gt;Big stuff here. Let’s show the screenshot:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0wy6py7gl3c36argg2mq.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0wy6py7gl3c36argg2mq.webp" alt="Oauth2 settings" width="748" height="317"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The box shows the time to live of the token (we’ll see how to obtain one in a minute), which is two months. It then shows the redirect urls that LinkedIn will use to send its code and state for the authentication process. Before seeing the details, let’s just say that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The localhost address is something that will live on our PC, valid for testing purposes, and LinkedIn is completely fine with that.&lt;/li&gt;
&lt;li&gt;The LinkedIn address was automatically added by LinkedIn while using the procedure to manually create an authorization token via its web UI. This one too is for testing purposes, but it can come in handy.&lt;/li&gt;
&lt;/ul&gt;
&lt;h5&gt;
  
  
  OAuth 2.0 scopes
&lt;/h5&gt;

&lt;p&gt;Scopes are used by LinkedIn to set boundaries for the actions. Some of them are required for posting, others for logging in. In the free tier, we can activate all four possible options. The screenshot here explains all the cases:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faqzagqrdc8zglrlydj3m.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faqzagqrdc8zglrlydj3m.webp" alt="Oauth2 scopes" width="747" height="433"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h4&gt;
  
  
  Products
&lt;/h4&gt;

&lt;p&gt;The products define what the app will do with LinkedIn: our app needs “Share on LinkedIn” and “OpenID Connect”.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr48ixplquiuxin0htaly.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr48ixplquiuxin0htaly.webp" alt="Products" width="763" height="338"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  (A server for the) redirect URL
&lt;/h2&gt;

&lt;p&gt;As we’ve seen above, we’ve configured redirect urls in the app, so we’ll need something listening on those urls to respond.&lt;br&gt;&lt;br&gt;
For this test, I’ve scraped together some python code with AI help that just gets the job done. The code is meant to run on localhost, so there’s no need to make it available through some cloud. I’m putting the code &lt;a href="https://gist.github.com/maguzzi/1c4e14c1ba9326be9f09618dc27e9276" rel="noopener noreferrer"&gt;in this gist&lt;/a&gt; so there’s a full picture to look at, instead of only two or three lines scattered through the text of this article. The advantage of having something written from scratch is being able to see the actual calls going back and forth between LinkedIn and the local PC.&lt;/p&gt;
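&lt;p&gt;As a rough idea of what the gist does (this is a minimal stdlib stand-in, not the actual gist code):&lt;/p&gt;

```python
# Minimal sketch of a redirect-URL listener using only the standard
# library: it serves http://localhost:3000/callback and prints the code
# and state query parameters that LinkedIn sends back.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

def extract_code_and_state(path):
    params = parse_qs(urlparse(path).query)
    return params.get("code", [""])[0], params.get("state", [""])[0]

class CallbackHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        code, state = extract_code_and_state(self.path)
        print("code:", code, "state:", state)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Login received, you can close this tab.")

# To actually listen, run:
#   HTTPServer(("localhost", 3000), CallbackHandler).serve_forever()
```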
&lt;h3&gt;
  
  
  How should I run the server?
&lt;/h3&gt;

&lt;p&gt;I’ve downloaded Python 3.13 and &lt;a href="https://docs.python.org/3/library/idle.html" rel="noopener noreferrer"&gt;IDLE&lt;/a&gt;, a minimal Python shell, to run the python file on my PC. The server code requires installing some libraries, which can be done via &lt;em&gt;pip install&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Now we should have everything set up and ready to go.&lt;/p&gt;
&lt;h1&gt;
  
  
  Oauth2 in action
&lt;/h1&gt;

&lt;p&gt;Let’s start with a sequence diagram like the one below and follow the numbered steps.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcg9g35bjunk5elrajw3k.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcg9g35bjunk5elrajw3k.webp" alt="Sequence diagram of the Oauth2 process" width="736" height="403"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  (1) Authentication request
&lt;/h2&gt;

&lt;p&gt;From local PC, issue the request with the client id, the scope(s), and the callback url. We can find all this information in the app. Let’s list them one by one:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;client_id&lt;/code&gt; is taken from the app configuration&lt;/li&gt;
&lt;li&gt;the &lt;code&gt;redirect_uri&lt;/code&gt; (url encoded) &lt;code&gt;http://localhost:3000/callback&lt;/code&gt; must match exactly one that’s registered in the app&lt;/li&gt;
&lt;li&gt;the &lt;code&gt;state&lt;/code&gt; is something that can range from a static string to whatever algorithm we like, and it can be used to counter CSRF attacks&lt;/li&gt;
&lt;li&gt;the &lt;code&gt;scope&lt;/code&gt; is the list of four ones we’ve seen above (&lt;code&gt;w_member_social, profile, openid, email&lt;/code&gt;)&lt;/li&gt;
&lt;/ul&gt;
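&lt;p&gt;For example, the &lt;code&gt;state&lt;/code&gt; can be generated as a random URL-safe token and compared on the callback (a sketch of one common choice, not something LinkedIn mandates):&lt;/p&gt;

```python
# Sketch of a CSRF-resistant state: generate an unguessable token, send
# it in the authorization request, and verify that the callback echoes
# the same value back.
import secrets

def new_state():
    return secrets.token_urlsafe(24)

def state_matches(sent, received):
    # constant-time comparison to avoid timing side channels
    return secrets.compare_digest(sent, received)

state = new_state()
print(state_matches(state, state))
```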

&lt;p&gt;&lt;em&gt;Authentication request&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl --location 'https://www.linkedin.com/oauth/v2/authorization?response\_type=code&amp;amp;client\_id=&amp;lt;client\_id&amp;gt;&amp;amp;redirect\_uri=http%3A%2F%2Flocalhost%3A3000%2Fcallback&amp;amp;state=&amp;lt;state&amp;gt;&amp;amp;scope=w\_member\_social%2Cprofile%2Copenid%2Cemail' \
--header 'Content-Type: application/x-www-form-urlencoded'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  (2) HTML form
&lt;/h2&gt;

&lt;p&gt;LinkedIn will respond with an HTML page containing the form for entering our credentials. We can issue the GET request in the browser, and the HTML form contained in the response will render automatically, as shown below&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk2e19lc6ypamld1f7cll.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk2e19lc6ypamld1f7cll.webp" alt="html login form" width="656" height="474"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  (3) Submit the form
&lt;/h2&gt;

&lt;p&gt;Now we can enter our username and password and submit the authentication form&lt;/p&gt;

&lt;h2&gt;
  
  
  (4) Redirect URL in action
&lt;/h2&gt;

&lt;p&gt;If the login gets through, LinkedIn will call the redirect URL specified in call (1) and, of course, registered in the app, passing it the state and the authorization code. The code listening on the redirect URL is in charge of validating the state, checking that the call is legitimate. It’s up to us to implement this check fully or just pass it through. Let’s view the request from LinkedIn in the &lt;em&gt;IDLE&lt;/em&gt; logs&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;127.0.0.1 - - [17/Apr/2025 22:19:12] "GET /callback?code=&amp;lt;unique\_code\_from\_linkedin&amp;gt;&amp;amp;state=&amp;lt;arbitrary\_state&amp;gt; HTTP/1.1"

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
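&lt;p&gt;The handler listening on the redirect URL only has to parse that query string, compare the state with the one we generated, and keep the one-time code. A minimal Python sketch (the helper name and the sample values are ours):&lt;/p&gt;

```python
# Sketch: validate the state and extract the authorization code
# from the callback request path.
from urllib.parse import urlparse, parse_qs, urlencode

def handle_callback(request_path, expected_state):
    params = parse_qs(urlparse(request_path).query)
    if params.get("state", [None])[0] != expected_state:
        raise ValueError("state mismatch: possible CSRF, reject the request")
    return params["code"][0]  # to be exchanged for a token in step (5)

# Example with placeholder values:
path = "/callback?" + urlencode({"code": "AQTabc123", "state": "xyz"})
code = handle_callback(path, "xyz")
```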



&lt;h2&gt;
  
  
  (5) Request for the authorization token
&lt;/h2&gt;

&lt;p&gt;Once the webhook has received the code from LinkedIn, it’s ready to request the authorization token with a POST to the token URL&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Issuing access token request&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl --location 'https://www.linkedin.com/oauth/v2/accessToken' \
--data '{
 "grant\_type": "authorization\_code",
 "code": &amp;lt;unique\_code\_from\_linkedin&amp;gt;,
 "redirect\_uri": "http://localhost:3000/callback",
 "client\_id": "&amp;lt;client\_id\_from\_app&amp;gt;", 
 "client\_secret": "&amp;lt;client\_secret\_from\_app&amp;gt;",
}'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
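&lt;p&gt;In Python the same exchange boils down to building the form payload and POSTing it. A hedged sketch (the helper name is ours, and the &lt;code&gt;requests&lt;/code&gt; call is shown commented out):&lt;/p&gt;

```python
# Sketch: the form-encoded payload for the accessToken endpoint.
def build_token_payload(code, client_id, client_secret,
                        redirect_uri="http://localhost:3000/callback"):
    return {
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": redirect_uri,
        "client_id": client_id,
        "client_secret": client_secret,
    }

# e.g., with the third-party requests library (not executed here):
# resp = requests.post("https://www.linkedin.com/oauth/v2/accessToken",
#                      data=build_token_payload(code, cid, secret))
```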



&lt;h2&gt;
  
  
  (6) Response with authorization token
&lt;/h2&gt;

&lt;p&gt;If successful, the response contains a token that can be used in subsequent calls to actually get stuff done. The token is valid for two months; once expired, it must be renewed by repeating this same procedure. LinkedIn always requires a “manual” login before the API can be called: the same behavior can be seen in enterprise integration tools, where we’re periodically asked to enter our credentials to renew the token&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Response with access token&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
"access\_token"': "[redacted]",
"expires\_in": 5183999,
"scope": "email,openid,profile,w\_member\_social",
"token\_type": "Bearer",
"id\_token": "[redacted]"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The response contains: the access token, in the &lt;code&gt;access_token&lt;/code&gt; field; the expiration time (60 days, expressed in seconds); the scopes we requested earlier; the token type, indicating that the token is sent in the header as &lt;code&gt;Bearer &amp;lt;access_token&amp;gt;&lt;/code&gt;; and a technical OpenID Connect id. From now on we’ll refer to the access token as &lt;code&gt;&amp;lt;authorization_token&amp;gt;&lt;/code&gt;&lt;/p&gt;
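&lt;p&gt;As a quick sanity check, 5183999 seconds is indeed about 60 days; the renewal deadline can be computed once and stored. A small sketch (the helper name is ours):&lt;/p&gt;

```python
# Sketch: turn the relative expires_in into an absolute renewal deadline.
from datetime import datetime, timedelta, timezone

def expiry_time(expires_in_seconds):
    return datetime.now(timezone.utc) + timedelta(seconds=expires_in_seconds)

days = 5183999 / 86400  # roughly 60 days
deadline = expiry_time(5183999)
```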

&lt;h2&gt;
  
  
  (7) Calling post API
&lt;/h2&gt;

&lt;p&gt;There are quite a few options for posting content on LinkedIn through the API. The first choice is whether to post as a person profile or as a company profile. We’ll explore the first option here.&lt;br&gt;&lt;br&gt;
Once the target has been chosen, there are four possibilities, each corresponding to the media option in the request&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Share a text only&lt;/li&gt;
&lt;li&gt;Share a text with (multi) image&lt;/li&gt;
&lt;li&gt;Share a text with link to an article&lt;/li&gt;
&lt;li&gt;Share a text with video&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;While the text-only and link cases resolve in a single call, for images (and media in general) things get quirkier. We’ll explore the image option. But before being able to proceed, we’ll have to fetch one last piece of information:&lt;/p&gt;
&lt;h1&gt;
  
  
  Who am I?
&lt;/h1&gt;

&lt;p&gt;We need to know the URN of our profile in order to insert it in the subsequent calls; we’ll refer to it as &lt;code&gt;&amp;lt;profile_urn&amp;gt;&lt;/code&gt;. Given the authorization token, retrieving it is a matter of a single call, governed by the &lt;code&gt;openid&lt;/code&gt; scope, which is why that scope had to be present in the oauth2 requests.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmye8thkrrfp540k1qjwt.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmye8thkrrfp540k1qjwt.webp" alt="Sequence diagram for getting the profile urn" width="349" height="202"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  (1) &lt;strong&gt;userinfo&lt;/strong&gt; request
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;Issuing userinfo request&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl --location 'https://api.linkedin.com/v2/userinfo' \
--header 'Content-Type: application/json' \
--header 'Authorization: &amp;lt;authorization\_token&amp;gt;' \

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  (2) &lt;strong&gt;userinfo&lt;/strong&gt; response
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;Response with the profile urn&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
"sub": "---",
"email\_verified": true,
"name": "Name Surname",
"locale": {
"country": "US",
"language": "en"
},
"given\_name": "Name",
"family\_name": "Surname",
"email": "email",
"picture": "https://media.licdn.com/dms/image/v2/---"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The information we want to parse from this JSON is the &lt;code&gt;sub&lt;/code&gt; attribute of the body, which has to be embedded into a URN like &lt;code&gt;urn:li:person:---&lt;/code&gt;. This will be the &lt;code&gt;&amp;lt;profile_urn&amp;gt;&lt;/code&gt;&lt;/p&gt;
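&lt;p&gt;In code this is a one-liner; a sketch (the helper name is ours):&lt;/p&gt;

```python
# Sketch: build the profile URN from the userinfo response.
def profile_urn(userinfo):
    return "urn:li:person:" + userinfo["sub"]
```

&lt;p&gt;For example, &lt;code&gt;profile_urn({"sub": "AbC123"})&lt;/code&gt; returns &lt;code&gt;urn:li:person:AbC123&lt;/code&gt;.&lt;/p&gt;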

&lt;h1&gt;
  
  
  Posting with images
&lt;/h1&gt;

&lt;p&gt;&lt;strong&gt;Please note:&lt;/strong&gt; if we search the internet about posting with the LinkedIn API, a lot of content will refer to the ugcPost API, which is deprecated and whose JSON bodies and responses are more convoluted.&lt;br&gt;&lt;br&gt;
Refer to &lt;a href="https://learn.microsoft.com/en-us/linkedin/marketing/overview?view=li-lms-2025-04" rel="noopener noreferrer"&gt;this page&lt;/a&gt; instead. At the time of writing, the latest version is April 2025&lt;/p&gt;

&lt;p&gt;Let’s review a sequence diagram here:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6swvj8xw2py4x6hlklub.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6swvj8xw2py4x6hlklub.webp" alt="Sequence diagram of sharing a post with an image" width="425" height="363"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  (1) Asking for the url to upload the image post
&lt;/h2&gt;

&lt;p&gt;In this request we’re asking LinkedIn for the URL to use to upload our image and for the URN that should be used when sharing the post at the end of the process&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Asking the url for uploading the image&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl --location 'https://api.linkedin.com/rest/images?action=initializeUpload' \
--header 'Authorization: Bearer &amp;lt;authorization\_token&amp;gt;' \
--header 'LinkedIn-Version: 202504' \
--header 'X-RestLi-Protocol-Version: 2.0.0' \
--header 'Content-Type: application/json' \
--data '{
 "initializeUploadRequest": {
 "owner": "&amp;lt;profile\_urn&amp;gt;"
 }
}'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  (2) The response with the upload url and the media urn
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;Response with both the information&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
"value": {
"uploadUrlExpiresAt": 1745092254555,
"uploadUrl": "https://www.linkedin.com/dms-uploads/sp/v2/---/uploaded-image/---/0?ca=vector\_ads&amp;amp;cn=uploads&amp;amp;iri=B01-77&amp;amp;sync=0&amp;amp;v=beta&amp;amp;ut=---",
"image": "urn:li:image:---"
}
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The relevant information is contained in two fields. We’ve redacted the unique codes generated by LinkedIn to &lt;code&gt;---&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The upload url is in &lt;code&gt;response["value"]["uploadUrl"]&lt;/code&gt;, that is&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://www.linkedin.com/dms-uploads/sp/v2/---/uploaded-image/---/0?ca=vector\_ads&amp;amp;cn=uploads&amp;amp;iri=B01-77&amp;amp;sync=0&amp;amp;v=beta&amp;amp;ut=---
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and the asset URN is in &lt;code&gt;response["value"]["image"]&lt;/code&gt;, that is &lt;code&gt;urn:li:image:---&lt;/code&gt;. We’ll refer to this element as &lt;code&gt;&amp;lt;media_urn&amp;gt;&lt;/code&gt; in subsequent API calls.&lt;/p&gt;
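&lt;p&gt;Extracting both values from the JSON response is straightforward; a sketch with placeholder values in place of the redacted ones:&lt;/p&gt;

```python
# Sketch: pull the upload URL and the media URN out of the
# initializeUpload response (sample values are placeholders).
import json

response = json.loads("""
{
  "value": {
    "uploadUrlExpiresAt": 1745092254555,
    "uploadUrl": "https://www.linkedin.com/dms-uploads/sp/v2/example",
    "image": "urn:li:image:example"
  }
}
""")

upload_url = response["value"]["uploadUrl"]  # where to PUT the binary image
media_urn = response["value"]["image"]       # the id used in the posts call
```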

&lt;h2&gt;
  
  
  (3) Upload the image
&lt;/h2&gt;

&lt;p&gt;With this information we can issue the upload request with the actual image payload&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Actual PUT with the image&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl --location --request PUT 'https://www.linkedin.com/dms-uploads/sp/v2/---/uploaded-image/---/0?ca=vector\_ads&amp;amp;cn=uploads&amp;amp;iri=B01-77&amp;amp;sync=0&amp;amp;v=beta&amp;amp;ut=---' \
--header 'Authorization: &amp;lt;authorization\_token&amp;gt;' \
--header 'Content-Type: image/webp' \
--data-binary '@kx\_V7l07P/file.webp'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  (4) Upload image response
&lt;/h2&gt;

&lt;p&gt;The response is just an acknowledgement of the upload (201 Created); we only need to check for HTTP errors&lt;/p&gt;

&lt;h2&gt;
  
  
  (5) Share the post
&lt;/h2&gt;

&lt;p&gt;We’re finally ready to create the content, putting the &lt;code&gt;authorization_token&lt;/code&gt;, &lt;code&gt;profile_urn&lt;/code&gt;, and &lt;code&gt;media_urn&lt;/code&gt; in the body of the &lt;code&gt;posts&lt;/code&gt; API call&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Finally creating content!&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl --location 'https://api.linkedin.com/rest/posts' \
--header 'Authorization: Bearer &amp;lt;authorization\_token&amp;gt;' \
--header 'X-Restli-Protocol-Version: 2.0.0' \
--header 'LinkedIn-Version: 202504' \
--header 'Content-Type: application/json' \
--data '{
  "author": "&amp;lt;profile_urn&amp;gt;",
  "commentary": "test",
  "visibility": "PUBLIC",
  "distribution": {
    "feedDistribution": "MAIN_FEED",
    "targetEntities": [],
    "thirdPartyDistributionChannels": []
  },
  "lifecycleState": "PUBLISHED",
  "isReshareDisabledByAuthor": false,
  "content": {
    "media": {
      "id": "&amp;lt;media_urn&amp;gt;",
      "altText": "alt tags"
    }
  }
}'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
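&lt;p&gt;The JSON body above can also be assembled from the pieces collected so far. A Python sketch (the helper name is ours):&lt;/p&gt;

```python
# Sketch: build the body of the posts call from the profile URN,
# the media URN, and the post text.
def build_post_body(profile_urn, media_urn, commentary, alt_text="alt tags"):
    return {
        "author": profile_urn,
        "commentary": commentary,
        "visibility": "PUBLIC",
        "distribution": {
            "feedDistribution": "MAIN_FEED",
            "targetEntities": [],
            "thirdPartyDistributionChannels": [],
        },
        "lifecycleState": "PUBLISHED",
        "isReshareDisabledByAuthor": False,
        "content": {
            "media": {"id": media_urn, "altText": alt_text},
        },
    }
```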



&lt;h2&gt;
  
  
  (6) 201 Created
&lt;/h2&gt;

&lt;p&gt;Once the post is published, the image will be displayed below the text, while a shared link would appear in a preview box below the text. Bear in mind that any link present in the text of the share will be automatically shortened by LinkedIn and left in its original position.&lt;/p&gt;

&lt;h1&gt;
  
  
  Conclusions and next steps
&lt;/h1&gt;

&lt;p&gt;The process involves a lot of interactions, and there could be a lot more to talk about. To keep things simple, one option is to have the authorization token generated by the web UI; another is to use Postman, in order to avoid writing the code that issues the requests.&lt;br&gt;&lt;br&gt;
Next steps could be automatically writing the link in the first comment of the post, so that it doesn’t get shortened, and introducing some automation in the process.&lt;br&gt;&lt;br&gt;
Thanks for reading this far!&lt;/p&gt;

</description>
      <category>integration</category>
      <category>linkedin</category>
      <category>python</category>
      <category>oauth2</category>
    </item>
    <item>
      <title>Got my AWS solution architect certification!</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Sat, 05 Apr 2025 14:58:34 +0000</pubDate>
      <link>https://dev.to/maguzzi/got-my-aws-solution-architect-certification-2hn0</link>
      <guid>https://dev.to/maguzzi/got-my-aws-solution-architect-certification-2hn0</guid>
      <description>&lt;p&gt;It has been a while since my last certification, so I felt the natural next step was earning this one:&lt;br&gt;&lt;br&gt;
&lt;a href="https://www.credly.com/badges/6c40eec8-574b-4457-b21d-69fd488bbf11/public_url" rel="noopener noreferrer"&gt;View credentials here&lt;/a&gt;&lt;/p&gt;

</description>
      <category>resume</category>
      <category>certification</category>
      <category>aws</category>
      <category>associate</category>
    </item>
    <item>
      <title>Got my Terraform certification!</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Fri, 20 Sep 2024 19:00:00 +0000</pubDate>
      <link>https://dev.to/maguzzi/got-my-terraform-certification-474c</link>
      <guid>https://dev.to/maguzzi/got-my-terraform-certification-474c</guid>
      <description>&lt;p&gt;Since I’m working a lot with Terraform in my day-to-day job, I thought it was useful to earn a certification!&lt;br&gt;&lt;br&gt;
 &lt;a href="https://www.credly.com/badges/31516106-1658-4192-9a64-68d8c52bc4fd/public_url" rel="noopener noreferrer"&gt;View credentials here&lt;/a&gt;&lt;/p&gt;

</description>
      <category>resume</category>
      <category>terraform</category>
      <category>associate</category>
      <category>certification</category>
    </item>
    <item>
      <title>Domesticate AWS nested stacks in Java: doing the chores Cloudformation doesn't do (w/ code samples)</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Tue, 07 May 2024 20:49:02 +0000</pubDate>
      <link>https://dev.to/maguzzi/domesticate-aws-nested-stacks-in-java-doing-the-chores-cloudformation-doesnt-do-w-code-samples-20ch</link>
      <guid>https://dev.to/maguzzi/domesticate-aws-nested-stacks-in-java-doing-the-chores-cloudformation-doesnt-do-w-code-samples-20ch</guid>
      <description>&lt;p&gt;In this article we’ll navigate through the creation of a Nested Stack in Cloudformation using the Java SDK. The child stack will be a lambda function, and the code will be uploaded with a zip archive.&lt;/p&gt;

&lt;h1&gt;
  
  
  What’s Cloudformation, and what’s a nested stack?
&lt;/h1&gt;

&lt;p&gt;Cloudformation is the AWS offering for &lt;em&gt;infrastructure as code&lt;/em&gt;. Instead of navigating the web UI to add and configure resources, Cloudformation can read a user-supplied file (either JSON or YAML) containing the list of resources and their relationships, and create them as the code states.&lt;br&gt;&lt;br&gt;
These resources must be grouped in &lt;em&gt;Stacks&lt;/em&gt;, the parentmost objects that Cloudformation can process.&lt;br&gt;&lt;br&gt;
Things get interesting when stacks reference &lt;em&gt;other&lt;/em&gt; stacks, of course :-)&lt;/p&gt;
&lt;h1&gt;
  
  
  A nested stack example
&lt;/h1&gt;

&lt;p&gt;When doing this process with the SDK, there are a couple of things that are not done automatically (or as easily) as with the AWS CLI. Let’s go through them&lt;/p&gt;
&lt;h2&gt;
  
  
  Parent stack
&lt;/h2&gt;

&lt;p&gt;Below is the JSON for the parent stack. Nothing in the template marks it as the parent (it’s written in the &lt;em&gt;Description&lt;/em&gt;, but that’s arbitrary), because the command that creates the stack will first ask for the name of the stack to create (or update).&lt;br&gt;&lt;br&gt;
On the contrary, the child stack is explicitly declared as one of the resources, with an &lt;em&gt;AWS::CloudFormation::Stack&lt;/em&gt; type (line 7). Within the &lt;em&gt;ChildStack&lt;/em&gt; element, the most important property is &lt;em&gt;TemplateURL&lt;/em&gt; (line 9), which points to the child template file (YAML in this case). The path is relative to the root template location.&lt;br&gt;&lt;br&gt;
Here’s the code:&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Parent stack&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
"AWSTemplateFormatVersion": "2010-09-09",
"Description": "Root stack",
 ...
"Resources": {
"ChildStack":{
"Type" : "AWS::CloudFormation::Stack",
"Properties": {
"TemplateURL":"child-folder/child-template.yaml",
 ...
}
},
"OtherResource": {
"Type": "ResourceType",
 ...
}
}
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let’s view the child stack:&lt;/p&gt;

&lt;h2&gt;
  
  
  Child stack
&lt;/h2&gt;

&lt;p&gt;Also in this case there’s no particular reference to the fact that this is a child stack. The parent stack can pass parameters to the child stack via the parameter section, but that’s it.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Child stack&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;AWSTemplateFormatVersion: "2010-09-09"
Description: child stack
Parameters:
...
Resources:
LambdaEdge:
Properties:
Runtime: nodejs20.x
Handler: index.handler
Code:
S3Bucket:
...
S3Key: "example value"
Role: !GetAtt LambdaRoleForCF.Arn
Type: "AWS::Lambda::Function"
Outputs:
...

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  The lambda reference
&lt;/h3&gt;

&lt;p&gt;On line 13, the template file references the S3 location where the zip file with the code must be found. So the zip file must be uploaded to S3 before the creation of the parent stack is issued, otherwise the reference won’t work. As noted in the next section, the AWS CLI &lt;em&gt;package&lt;/em&gt; command can resolve references like this, but it’s not present in the SDK.&lt;br&gt;&lt;br&gt;
Once the zip file with the code has been uploaded to S3, the creation of the stack can proceed.&lt;/p&gt;
&lt;h1&gt;
  
  
  Creating the stack
&lt;/h1&gt;

&lt;p&gt;Before creating the stack, the template file must be present on S3 so that it can be pointed at by the create-stack command.&lt;br&gt;&lt;br&gt;
This is fine, but the template contains a child stack file path that references the local machine, which will never work from S3.&lt;br&gt;&lt;br&gt;
The Cloudformation solution for this is packaging the stack.&lt;/p&gt;
&lt;h1&gt;
  
  
  Packaging the stack
&lt;/h1&gt;
&lt;h2&gt;
  
  
  Using the CLI
&lt;/h2&gt;

&lt;p&gt;Via the CLI it’s quite easy: there’s a &lt;em&gt;package&lt;/em&gt; command that&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;detects the substack reference&lt;/li&gt;
&lt;li&gt;searches for the referenced file&lt;/li&gt;
&lt;li&gt;uploads it to S3&lt;/li&gt;
&lt;li&gt;changes the reference in the parent file&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Java SDK
&lt;/h2&gt;

&lt;p&gt;Unfortunately, the AWS CLI &lt;em&gt;package&lt;/em&gt; command is not available in the Java SDK, so a manual approach must be taken.&lt;br&gt;&lt;br&gt;
Here’s the code from the &lt;a href="https://github.com/maguzzi/s3_static_website_gradle/blob/main/src/main/java/it/marcoaguzzi/staticwebsite/commands/misc/PackageTemplateCommand.java"&gt;S3 static website project&lt;/a&gt; (around line 44).&lt;br&gt;&lt;br&gt;
The code does the same things the CLI does:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;templateSrcPath&lt;/em&gt; (line 3) contains the path to the local parent template&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;s3PathToReplace&lt;/em&gt; (line 9) contains the path to the child stack template already uploaded to S3&lt;/li&gt;
&lt;li&gt;lines 4 to 11 cycle through the parent template,

&lt;ul&gt;
&lt;li&gt;searching for a resource with type &lt;em&gt;AWS::CloudFormation::Stack&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;replacing the &lt;em&gt;TemplateURL&lt;/em&gt; attribute with the path of the template uploaded to S3&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Then the file with the substitution is persisted to a temporary folder.
With the SDK, the template can then be referenced from disk.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Template packaging&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;App.screenMessage("PACKAGE TEMPLATE START");
ObjectMapper objectMapper = new ObjectMapper();
JsonNode root = objectMapper.readTree(Utils.readFileContent(templateSrcPath)); 
Iterator&amp;lt;JsonNode&amp;gt; elements = root.get("Resources").elements();
while(elements.hasNext()) {
 JsonNode node = elements.next(); 
 if ("AWS::CloudFormation::Stack".equals(node.get("Type").asText())) {
JsonNode properties = node.get("Properties");
 ((ObjectNode)properties).put("TemplateURL", s3PathToReplace);
 }
};
File file = File.createTempFile(new SimpleDateFormat("yyyyMMddHHmmss").format(new Date()),"\_compiled\_template.json");
objectMapper.writeValue(file, root);
App.screenMessage("PACKAGE TEMPLATE END");

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the template on disk has all of its references ready, the &lt;em&gt;CreateStack&lt;/em&gt; API can be called and the stack will be created along with all of its resources.&lt;/p&gt;

&lt;h1&gt;
  
  
  Improvements and what’s next
&lt;/h1&gt;

&lt;p&gt;The actual CLI package command not only resolves nested stack references, but also lambda code references (as has been shown) and a bunch of other things (as stated here).&lt;br&gt;&lt;br&gt;
There are two major improvements that could be made to the package section of the code&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;also discover other types of references (like lambda code)&lt;/li&gt;
&lt;li&gt;automatically upload the referenced path to S3 instead of asking for it in the input parameters&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you’d like to browse the full code, it’s &lt;a href="https://github.com/maguzzi/s3_static_website_gradle"&gt;here&lt;/a&gt;; please give it a go!&lt;/p&gt;

</description>
      <category>website</category>
      <category>aws</category>
      <category>cloudformation</category>
      <category>java</category>
    </item>
    <item>
      <title>Please stop publishing AWS S3 buckets as static websites! Read here for a secure, fast, and free-ish approach [1st episode]</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Mon, 06 May 2024 09:30:32 +0000</pubDate>
      <link>https://dev.to/maguzzi/please-stop-publishing-aws-s3-buckets-as-static-websites-read-here-for-a-secure-fast-and-free-ish-approach-1st-episode-4968</link>
      <guid>https://dev.to/maguzzi/please-stop-publishing-aws-s3-buckets-as-static-websites-read-here-for-a-secure-fast-and-free-ish-approach-1st-episode-4968</guid>
      <description>&lt;p&gt;I promise this is not yet another tutorial on how to publish a static website using AWS S3, or at least not solely smashing the S3 content onto the web. I’d like to show you a GitHub project that uses Java to orchestrate Cloudformation when deploying the architecture of a static website.&lt;/p&gt;

&lt;p&gt;The main purpose of this tool is to go beyond the out-of-the-box S3 website functionality, that is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Make the S3 bucket private (so, &lt;strong&gt;secure&lt;/strong&gt; )&lt;/li&gt;
&lt;li&gt;Provide HTTPS certificates ( &lt;strong&gt;secure,&lt;/strong&gt; again)&lt;/li&gt;
&lt;li&gt;Serve the content via cloudfront cache (so, &lt;strong&gt;fast&lt;/strong&gt; )&lt;/li&gt;
&lt;li&gt;Hide the complexities of working with Cloudformation&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  I’m in for &lt;em&gt;fast&lt;/em&gt; and &lt;em&gt;secure&lt;/em&gt;, but free…&lt;em&gt;ish?&lt;/em&gt;
&lt;/h1&gt;

&lt;p&gt;Not all the resources that need to be fired up for this architecture are within the AWS free tier, especially the domain. Nevertheless, all the costs that I’ve seen after this website was published were &lt;em&gt;live&lt;/em&gt; costs only. Let’s review them from the most to the least expensive:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The domain: 20€ / year if hosted on Route53 (as &lt;em&gt;marcoaguzzi.it&lt;/em&gt;) but you can host it elsewhere (on &lt;a href="https://www.cloudns.net/"&gt;cloudns&lt;/a&gt;, it’s free)&lt;/li&gt;
&lt;li&gt;Route53 and Codepipeline: 1€ / month each, one for the hosted zone and one for the pipeline. The pipeline comes with a good amount of free build minutes&lt;/li&gt;
&lt;li&gt;Secret manager: less than 0.5€ / month (there’s a grace period when started)&lt;/li&gt;
&lt;li&gt;Cloudfront and S3: 0.01€ / month each&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Of course these are starting costs; they can go a lot higher as usage increases, but that should be a welcome problem, I suppose&lt;/p&gt;

&lt;h1&gt;
  
  
  Hide the complexities of Cloudformation
&lt;/h1&gt;

&lt;p&gt;Cloudformation might be a burden to use, especially within the web UI. These are the main issues I addressed in the project:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Have a self - contained architecture&lt;/li&gt;
&lt;li&gt;Be repeatable. Could it deploy the same architecture on another domain?&lt;/li&gt;
&lt;li&gt;Ease the deploy process, especially when the domain is not hosted on Route53&lt;/li&gt;
&lt;li&gt;The nested stacks are not automatically resolved by Cloudformation&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  The Java tool to the rescue
&lt;/h1&gt;

&lt;p&gt;While experimenting with Java and Gradle, I wondered if I could use Java to mitigate the problems listed above by orchestrating the instructions that Cloudformation needs in order to deploy the website. This turned out as a Github project: &lt;a href="https://github.com/maguzzi/s3_static_website_gradle"&gt;https://github.com/maguzzi/s3_static_website_gradle&lt;/a&gt;. The Gradle build creates a distributable archive with all the needed jars.&lt;/p&gt;

&lt;h1&gt;
  
  
  How to use the project
&lt;/h1&gt;

&lt;p&gt;After the packaged app has been downloaded, what’s needed to run it?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Java 8+&lt;/li&gt;
&lt;li&gt;An AWS account, with authentication in place. As of now, I’ve tested it having ACCESS_KEY and ACCESS_SECRET as environment variables locally. If those are not found, the tool stops.&lt;/li&gt;
&lt;li&gt;A dns domain. Let’s use a free service: &lt;a href="//s3staticwebsitetest.cloudns.ch"&gt;s3staticwebsitetest.cloudns.ch&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;A config file named &lt;em&gt;website.properties&lt;/em&gt; containing the website information, like this:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name = Fast, secure, free-ish S3 static website
environment = dev
domain = dev.s3staticwebsitetest.cloudns.ch
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  Tool commands
&lt;/h1&gt;

&lt;p&gt;Let’s set the LOG_LEVEL to INFO in order not to clog the shell, and check the existing stack in our AWS account:&lt;/p&gt;

&lt;h2&gt;
  
  
  DISTRIBUTION
&lt;/h2&gt;

&lt;p&gt;Assuming the AWS account does not yet contain this infrastructure, this is the first command to run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;java -jar s3\_static\_website\_gradle-all.jar DISTRIBUTION

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here’s the output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2024-05-01T14:14:16 [main] INFO - AWS\_REGION: us-east-1
2024-05-01T14:14:16 [main] INFO - AWS\_ACCESS\_KEY\_ID: AKIA\*\*\*\*
2024-05-01T14:14:16 [main] INFO - AWS\_SECRET: \*\*\*\*\*
2024-05-01T14:14:16 [main] INFO - AWS setup done.
2024-05-01T14:14:17 [main] INFO - reading content from file: /home/maguzzi/demo/./website.properties
2024-05-01T14:14:17 [main] INFO - Command: DISTRIBUTION environment: dev
2024-05-01T14:14:17 [main] WARN - .websitesetup file does not exists. Creating
2024-05-01T14:14:17 [main] INFO - reading content from file: /home/maguzzi/demo/./.websitesetup
2024-05-01T14:14:17 [main] INFO - Setup new pseudoRandomTimestampString to 20240501141417491
2024-05-01T14:14:17 [main] INFO - Setup new zipDate to 20240501
2024-05-01T14:14:17 [main] INFO - 
2024-05-01T14:14:17 [main] INFO - -- s3-static-website-bootstrap-stack - dev CREATION START --
2024-05-01T14:14:17 [main] INFO - 
2024-05-01T14:14:17 [main] INFO - reading content from jar: jar:file:/home/maguzzi/demo/s3\_static\_website\_gradle-all.jar!/bootstrap/bootstrap.json
2024-05-01T14:14:18 [main] INFO - Stack s3-static-website-bootstrap-stack-dev not yet completed, wait
2024-05-01T14:14:23 [main] INFO - Stack s3-static-website-bootstrap-stack-dev not yet completed, wait
...

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It states:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The environment variables for the AWS configuration are in place&lt;/li&gt;
&lt;li&gt;The &lt;em&gt;website.properties&lt;/em&gt; file is in place&lt;/li&gt;
&lt;li&gt;The &lt;em&gt;.websitesetup&lt;/em&gt; file does not exist, and is created at run time.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;pseudoRandomTimestampString&lt;/em&gt; is used to give the S3 buckets unique names, while &lt;em&gt;zipDate&lt;/em&gt; identifies the artifact for the lambda.&lt;br&gt;&lt;br&gt;
Now the &lt;em&gt;bootstrap&lt;/em&gt; stack is being created, so that the S3 buckets will be in place when needed for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;packaging the templates&lt;/li&gt;
&lt;li&gt;uploading the lambda artifact&lt;/li&gt;
&lt;li&gt;hosting the website content&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The last two lines state that the tool is waiting for the completion of the &lt;em&gt;bootstrap&lt;/em&gt; stack.&lt;/p&gt;

&lt;p&gt;Once the first stack has been successfully created, the tool says so and outputs the export keys for the S3 buckets that will be needed in the &lt;em&gt;distribution&lt;/em&gt; stacks:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Stack creation for stack id arn:aws:cloudformation:us-east-1:\*\*\*\*:stack/s3-static-website-bootstrap-stack-dev/\*\*\*\*\* terminated.
2024-05-01T14:14:54 [main] INFO - 
2024-05-01T14:14:54 [main] INFO - -- s3-static-website-bootstrap-stack - dev CREATION END --
2024-05-01T14:14:54 [main] INFO - 
2024-05-01T14:14:54 [main] INFO - ArtifactS3Bucket -&amp;gt; s3-static-website-lambda-artifact-dev-20240501141417491 (s3-static-website-bootstrap-stack-dev-LambdaArtifactBucket-Export-dev)
2024-05-01T14:14:54 [main] INFO - CompiledTemplateBucket -&amp;gt; s3-static-website-compiled-template-dev-20240501141417491 (s3-static-website-bootstrap-stack-dev-CompiledTemplateBucket-Export-dev)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It then continues to prepare the files that will be needed for the distribution stack (let’s view a trimmed version of the log):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2024-05-01T14:14:54 [main] INFO - -- ZIP ARTIFACT START --
...
2024-05-01T14:14:54 [main] INFO - -- ZIP ARTIFACT END --
2024-05-01T14:14:54 [main] INFO - 
2024-05-01T14:14:54 [main] INFO - ARTIFACT\_COMPRESSED\_PATH -&amp;gt; /tmp/cloudformation\_tmp10381003167251864437/lambda-edge-dev-20240501.zip (-)
...
2024-05-01T14:14:54 [main] INFO - -- UPLOAD FILE TO BUCKET START --
...
2024-05-01T14:14:55 [main] INFO - URL: https://s3-static-website-lambda-artifact-dev-20240501141417491.s3.amazonaws.com/lambda-edge-dev-20240501.zip
2024-05-01T14:14:55 [main] INFO - -- UPLOAD FILE TO BUCKET END --
...
2024-05-01T14:14:56 [main] INFO - -- PACKAGE TEMPLATE START --
2024-05-01T14:14:56 [main] INFO - reading content from jar: jar:file:/home/maguzzi/demo/s3_static_website_gradle-all.jar!/distribution/website-distribution.json
2024-05-01T14:14:56 [main] INFO - -- PACKAGE TEMPLATE END --
2024-05-01T14:14:56 [main] INFO - 
...
2024-05-01T14:14:56 [main] INFO - -- s3-static-website-distribution-stack - dev CREATION START --
2024-05-01T14:14:56 [main] INFO - 
2024-05-01T14:14:56 [main] INFO - reading content from file: /tmp/202405011414567687239912532179902_compiled_template.json
2024-05-01T14:14:56 [main] INFO - 
2024-05-01T14:14:56 [main] INFO - -- s3-static-website-distribution-stack - dev CREATION END --
2024-05-01T14:14:56 [main] INFO - 

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the log it can be seen that the zip artifact is uploaded to S3, and then the template with the sub-stack reference is packaged and written to a temporary folder before the create-stack command is issued.&lt;/p&gt;

&lt;h2&gt;
  
  
  DNS_INFO
&lt;/h2&gt;

&lt;p&gt;Since we’re using a free domain, the tool stops here: cloudformation can’t complete the creation until the domain provider (cloudns in this case) has been configured. So let’s check the DNS information that has to be provided to cloudns:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;java -jar s3\_static\_website\_gradle-all.jar DNS\_INFO dns-info.txt

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and the output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2024-05-01T14:15:39 [main] INFO - AWS\_REGION: us-east-1
2024-05-01T14:15:39 [main] INFO - AWS\_ACCESS\_KEY\_ID: AK\*\*\*\*\*\*\*\*
2024-05-01T14:15:39 [main] INFO - AWS\_SECRET: \*\*\*\*\*
2024-05-01T14:15:39 [main] INFO - AWS setup done.
2024-05-01T14:15:40 [main] INFO - reading content from file: /home/maguzzi/demo/./website.properties
2024-05-01T14:15:40 [main] INFO - Command: DNS_INFO environment: dev
2024-05-01T14:15:40 [main] INFO - reading content from file: /home/maguzzi/demo/./.websitesetup
2024-05-01T14:15:40 [main] INFO - PseudoRandomTimestampString already set to 20240501141417491
2024-05-01T14:15:40 [main] INFO - ZipDate already set to 20240501
2024-05-01T14:15:40 [main] INFO - 
2024-05-01T14:15:40 [main] INFO - -- ROUTE 53 INFO START --
2024-05-01T14:15:40 [main] INFO - 
2024-05-01T14:15:41 [main] INFO - Got hosted zone Id Optional[Z******] for stack s3-static-website-distribution-stack-dev
2024-05-01T14:15:41 [main] INFO - hostedZoneId: Optional[Z*****]
2024-05-01T14:15:41 [main] INFO - Name: dev.s3staticwebsitetest.cloudns.ch. TTL: 172800
2024-05-01T14:15:41 [main] INFO - ns-1333.awsdns-38.org.
2024-05-01T14:15:41 [main] INFO - ns-1572.awsdns-04.co.uk.
2024-05-01T14:15:41 [main] INFO - ns-844.awsdns-41.net.
2024-05-01T14:15:41 [main] INFO - ns-458.awsdns-57.com.
2024-05-01T14:15:41 [main] INFO - 
2024-05-01T14:15:41 [main] INFO - -- ROUTE 53 INFO END --
2024-05-01T14:15:41 [main] INFO - 

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The tool conveniently outputs the DNS information (the last 4 lines) in the file &lt;em&gt;dns-info.txt&lt;/em&gt; (as specified on the command line).&lt;br&gt;&lt;br&gt;
It can be uploaded as-is via the cloudns web UI.&lt;/p&gt;
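
&lt;p&gt;Before moving on, you can optionally verify that the NS records have propagated, for instance with &lt;em&gt;dig&lt;/em&gt; (assuming it is available on your machine); the answer should list the four awsdns name servers shown above:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;dig NS dev.s3staticwebsitetest.cloudns.ch +short

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
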
&lt;h2&gt;
  
  
  CHECK
&lt;/h2&gt;

&lt;p&gt;Once the DNS provider has propagated the DNS records provided by AWS, the tool can be run again to check whether the distribution stack has completed.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;java -jar s3\_static\_website\_gradle-all.jar CHECK

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The output is pretty straightforward:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2024-05-01T14:24:11 [main] INFO - AWS\_REGION: us-east-1
2024-05-01T14:24:11 [main] INFO - AWS\_ACCESS\_KEY\_ID: AKIA\*\*\*\*\*\*\*\*
2024-05-01T14:24:11 [main] INFO - AWS\_SECRET: \*\*\*\*\*
2024-05-01T14:24:11 [main] INFO - AWS setup done.
2024-05-01T14:24:12 [main] INFO - reading content from file: /home/maguzzi/demo/./website.properties
2024-05-01T14:24:12 [main] INFO - Command: CHECK environment: dev
2024-05-01T14:24:13 [main] INFO - Stack s3-static-website-distribution-stack-dev not yet completed, wait
...
2024-05-01T14:26:32 [main] INFO - Stack s3-static-website-distribution-stack-dev not yet completed, wait
2024-05-01T14:26:42 [main] INFO - Stack creation for stack id arn:aws:cloudformation:us-east-1:****:stack/s3-static-website-distribution-stack-dev/****** terminated.

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
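

&lt;p&gt;As a side note (this is not part of the tool), an equivalent wait can be performed with the plain AWS CLI, which blocks until the stack reaches CREATE_COMPLETE or fails:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation wait stack-create-complete --stack-name s3-static-website-distribution-stack-dev

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
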



&lt;h2&gt;
  
  
  LIST
&lt;/h2&gt;

&lt;p&gt;Now we can list the stacks that have been created, along with their tags. As you can see, the random string that was set up at the beginning has been propagated as a tag to all the stacks, the S3 buckets, and every other taggable resource.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;java -jar s3\_static\_website\_gradle-all.jar LIST

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here’s the output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2024-05-01T14:29:59 [main] INFO - AWS\_REGION: us-east-1
2024-05-01T14:29:59 [main] INFO - AWS\_ACCESS\_KEY\_ID: AKIA\*\*\*\*\*
2024-05-01T14:29:59 [main] INFO - AWS\_SECRET: \*\*\*\*\*
2024-05-01T14:29:59 [main] INFO - AWS setup done.
2024-05-01T14:30:00 [main] INFO - reading content from file: /home/maguzzi/demo/./website.properties
2024-05-01T14:30:00 [main] INFO - Command: LIST environment: dev
2024-05-01T14:30:00 [main] INFO - reading content from file: /home/maguzzi/demo/./.websitesetup
2024-05-01T14:30:00 [main] INFO - PseudoRandomTimestampString already set to 20240501141417491
2024-05-01T14:30:00 [main] INFO - ZipDate already set to 20240501
2024-05-01T14:30:00 [main] INFO - 
2024-05-01T14:30:00 [main] INFO - -- LIST STACK START --
2024-05-01T14:30:00 [main] INFO - 
2024-05-01T14:30:01 [main] INFO - s3-static-website-distribution-stack-dev-LambdaEdgeCloudFrontStack-FZ****** (CREATE_COMPLETE) - [Tag(Key=s3_static_website_environment, Value=dev), Tag(Key=s3_static_website, Value=S3 static website test), Tag(Key=s3_static_website_timestamp_tag, Value=20240501141417491)]
2024-05-01T14:30:01 [main] INFO - s3-static-website-distribution-stack-dev (CREATE_COMPLETE) - [Tag(Key=s3_static_website_environment, Value=dev), Tag(Key=s3_static_website, Value=S3 static website test), Tag(Key=s3_static_website_timestamp_tag, Value=20240501141417491)]
2024-05-01T14:30:01 [main] INFO - s3-static-website-bootstrap-stack-dev (CREATE_COMPLETE) - [Tag(Key=s3_static_website_environment, Value=dev), Tag(Key=s3_static_website, Value=S3 static website test), Tag(Key=s3_static_website_timestamp_tag, Value=20240501141417491)]
2024-05-01T14:30:02 [main] INFO - 
2024-05-01T14:30:02 [main] INFO - -- LIST STACK END --
2024-05-01T14:30:02 [main] INFO - 

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;At this point, the website has no content. The only thing left to do is to upload a sample &lt;em&gt;index.html&lt;/em&gt; to the S3 bucket to verify that the whole process worked. I’ve left this step out of the tool because the website content is intended to be created and uploaded by a pipeline.&lt;br&gt;&lt;br&gt;
You can find the random string of the S3 bucket name in the log:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws s3 cp index.html s3 bucket

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
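

&lt;p&gt;If you don’t have the log at hand, a quick way to spot the bucket with the random suffix is to list your buckets and filter by the website prefix (assuming the AWS CLI is configured):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws s3 ls | grep s3-static-website

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
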



&lt;p&gt;And then you can point the browser to &lt;a href="http://dev.s3staticwebsitetest.cloudns.ch/"&gt;http://dev.s3staticwebsitetest.cloudns.ch&lt;/a&gt; and see that it worked!&lt;/p&gt;

&lt;h1&gt;
  
  
  What’s next?
&lt;/h1&gt;

&lt;p&gt;In the next post, we’ll integrate the CICD pipeline that uploads the website content, along with some considerations about how to delete the stack without using the UI. Cloudformation and Cloudfront enforce some constraints on how the resources should be deleted, so it’s worth spending some time on it.&lt;/p&gt;

</description>
      <category>website</category>
      <category>aws</category>
      <category>cloudformation</category>
      <category>java</category>
    </item>
    <item>
      <title>Update Github token in Codepipeline with Cloudformation</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Mon, 25 Mar 2024 18:12:32 +0000</pubDate>
      <link>https://dev.to/maguzzi/update-github-token-in-codepipeline-with-cloudformation-hbe</link>
      <guid>https://dev.to/maguzzi/update-github-token-in-codepipeline-with-cloudformation-hbe</guid>
      <description>&lt;h2&gt;
  
  
  The use case
&lt;/h2&gt;

&lt;p&gt;This post exists because the token used by Codepipeline to connect to Github and download the source code of the website has expired. As a result, the “push and update the website” automation no longer works. Here’s the error:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nAtbJlAD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/codepipeline-insufficient-permission.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nAtbJlAD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/codepipeline-insufficient-permission.png" alt="Error in pipeline" title="Error in pipeline" width="667" height="305"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let’s look at how the secret is stored in cloudformation, and how codepipeline uses it to connect.&lt;/p&gt;

&lt;h2&gt;
  
  
  The secret stack
&lt;/h2&gt;

&lt;p&gt;The cloudformation stack is quite simple. It has no hard dependencies on other stacks, and it’s used to download code for both the dev and prod websites.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
 "AWSTemplateFormatVersion": "2010-09-09",
 "Parameters": {
 "GithubOAuthTokenParameter": {
 "Description": "Github OAuth Token",
 "NoEcho": "true",
 "Type": "String"
 }
 },
 "Resources": {
 "GithubOAuthToken": {
 "Properties": {
 "Name": "GithubOAuthToken",
 "SecretString": {
 "Ref": "GithubOAuthTokenParameter"
 }
 },
 "Type": "AWS::SecretsManager::Secret"
 }
 }
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The next part of the post is dedicated to creating and using this cloudformation template.&lt;/p&gt;

&lt;h3&gt;
  
  
  Create stack
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation create-stack --stack-name secrets-stack `
--template-body file://secrets-stack.json `
--parameters file://secret-token-sample.json

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;em&gt;secrets-stack.json&lt;/em&gt; file contains the template just shown, and the &lt;em&gt;secret-token-sample.json&lt;/em&gt; file uses the parameter syntax. Here’s a sample:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[
 {
 "ParameterKey": "GithubOAuthTokenParameter",
 "ParameterValue": "sample value",
 "UsePreviousValue": false
 }
]

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Check stack resources
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation list-stack-resources `
--stack-name secrets-stack `
--query 'StackResourceSummaries[*].[LogicalResourceId,ResourceType,ResourceStatus]'

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The output is quite straightforward (the stack was already created when I re-ran the command)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[
 [
 "GithubOAuthToken",
 "AWS::SecretsManager::Secret",
 "UPDATE\_COMPLETE"
 ]
]

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is how the resource appears in the AWS UI once created (the output of the CLI command is just the ARN of the stack).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--W9ymHNVJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/secret-aws.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--W9ymHNVJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/secret-aws.png" alt="Secret AWS" title="Secret AWS" width="800" height="255"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The actual (now expired) value of the token can be viewed by clicking on the retrieve secret value button. Time to move to Github and generate a new token.&lt;/p&gt;
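
&lt;p&gt;As a side note, the same value can also be retrieved from the CLI (for credentials with the proper Secrets Manager permissions):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws secretsmanager get-secret-value --secret-id GithubOAuthToken `
--query SecretString --output text

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
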

&lt;h2&gt;
  
  
  Github account setting
&lt;/h2&gt;

&lt;p&gt;While logged in to the Github account that hosts the repository with the code, go to the settings page, and then move to the developer settings. Click on “Fine-grained tokens”:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--H2EAEyLG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/secret-setting-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--H2EAEyLG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/secret-setting-1.png" alt="Fine grained tokens" title="Fine grained tokens" width="800" height="174"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Clicking on &lt;em&gt;AWS CP&lt;/em&gt; shows the token settings. It can be seen that it grants read-only access to one repository and no user permissions. Of course the actual token can no longer be viewed; it’s only possible to regenerate it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UK86vhzp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/secret-setting-2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UK86vhzp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/secret-setting-2.png" alt="Token grants" title="Token grants" width="800" height="580"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on regenerate token&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bJN3JJ8I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/regenerate-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bJN3JJ8I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/regenerate-1.png" alt="Regenerate step 1" title="Regenerate step 1" width="800" height="240"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Confirm the token regeneration. The token will be visible only once, so copy it right away.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fnzrxpUJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/regenerate-2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fnzrxpUJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/regenerate-2.png" alt="Regenerate step 2" title="Regenerate step 2" width="800" height="287"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Cloudformation update
&lt;/h2&gt;

&lt;p&gt;Now that the token is in our hands, we can update the cloudformation stack by replacing the “sample value” shown before with the token and then issuing:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation update-stack --stack-name secrets-stack `
--template-body file://secrets-stack.json `
--parameters file://secret-token-sample.json

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let’s check the stack after the update&lt;/p&gt;

&lt;h3&gt;
  
  
  Check stack
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation describe-stacks --stack-name secrets-stack `
--query '[Stacks[0].[StackName,StackStatus,Parameters],Stacks[0].Outputs]'

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Output
&lt;/h4&gt;

&lt;p&gt;The &lt;em&gt;null&lt;/em&gt; value is there because there are no outputs to be used by other stacks. Please note that in the template file &lt;em&gt;GithubOAuthTokenParameter&lt;/em&gt; is declared with &lt;em&gt;“NoEcho”: “true”&lt;/em&gt;. This way the real token is shown neither in output logs nor in the Cloudformation UI.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[
 [
 "secrets-stack",
 "UPDATE\_COMPLETE",
 [
 {
 "ParameterKey": "GithubOAuthTokenParameter",
 "ParameterValue": "\*\*\*\*"
 }
 ]
 ],
 null
]

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The token is then actually used in the pipeline:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HKnjNdIo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/secret-aws-codepipeline.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HKnjNdIo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/secret-aws-codepipeline.png" alt="Github secret in codepipeline" title="Github secret in codepipeline" width="800" height="134"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And the cloudformation code hosting the reference to the token is:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Configuration": {
 "Branch": { "Ref": "RepoBranchParameter" },
 "OAuthToken": "{{resolve:secretsmanager:GithubOAuthToken}}",
 "Owner": {"Ref":"RepoOwnerParameter" },
 ...
},

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Refresh and restart the pipeline
&lt;/h2&gt;

&lt;p&gt;Navigate to Codepipeline in the AWS UI, select the pipeline, and click edit pipeline:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rczoQJFX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/edit_pipeline.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rczoQJFX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/edit_pipeline.png" alt="Edit pipeline" title="Edit pipeline" width="800" height="88"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Search for the button “Edit stage” on the source stage:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gqICnhZp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/edit_stage.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gqICnhZp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/edit_stage.png" alt="Edit stage" title="Edit stage" width="800" height="97"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click it, and then search for the pencil in the “Github download” box:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2BhEA5Hw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/edit_stage_pencil.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2BhEA5Hw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/edit_stage_pencil.png" alt="Edit Github" title="Edit Github" width="800" height="173"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Even if the repository and branch are already selected, click on “Connect to Github”:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--devePbsj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/connect_to_github.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--devePbsj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/connect_to_github.png" alt="Connect to Github" title="Connect to Github" width="332" height="158"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A pop-up will appear asking for the OAuth confirmation; confirm it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--kTXGyBj0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/oauth_request.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--kTXGyBj0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/oauth_request.png" alt="OAuth request" title="OAuth request" width="388" height="177"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Insert the repository and branch in their respective fields (a popup should appear, letting you select them).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WyPDqCrz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/repo_and_branch.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WyPDqCrz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/repo_and_branch.png" alt="Respository and branch" title="Respository and branch" width="371" height="169"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Scroll to the bottom of the page and click done to save (actually refresh) the changes. Now you can trigger a new pipeline run, and it should connect to the repo and download the code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0b7QJ0SR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/codepipeline-ok.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0b7QJ0SR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/secret-github/codepipeline-ok.png" alt="Codepipeline ok" title="Codepipeline ok" width="480" height="347"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next time I’ll check how to automate the pipeline refresh via the cloudformation CLI.&lt;/p&gt;

&lt;p&gt;Thanks for reading it all!&lt;/p&gt;

</description>
      <category>website</category>
      <category>aws</category>
      <category>cloudformation</category>
      <category>codepipeline</category>
    </item>
    <item>
      <title>AI to revamp your resume: is it a paid tool worth?</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Mon, 05 Feb 2024 18:14:25 +0000</pubDate>
      <link>https://dev.to/maguzzi/ai-to-revamp-your-resume-is-it-a-paid-tool-worth-4djo</link>
      <guid>https://dev.to/maguzzi/ai-to-revamp-your-resume-is-it-a-paid-tool-worth-4djo</guid>
      <description>&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;p&gt;After reading a Linkedin &lt;em&gt;Top Voice&lt;/em&gt; post, I got curious about her suggestion of using an AI tool to help revamp the resume. The website is called &lt;a href="https://resumeworded.com/"&gt;https://resumeworded.com/&lt;/a&gt;.&lt;br&gt;&lt;br&gt;
The website offers three main services:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Resume check&lt;/em&gt; - It analyzes the CV and produces a score and recommendations about it&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;LinkedIn profile check&lt;/em&gt; - Same as the resume, but with the LinkedIn profile&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Resume targeting&lt;/em&gt; - Given a job description, it tells how far the resume is from it&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This post is about my experience with it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Free vs. paid versions
&lt;/h2&gt;

&lt;p&gt;For each service, there is a free and a paid version. The former offers the most basic checks, while the latter dives a lot deeper into the analysis.&lt;br&gt;&lt;br&gt;
I first played around with the free offer, and then decided to buy a one-month subscription to the full version, to give it a proper try (full details on costs in the conclusions).&lt;/p&gt;

&lt;h2&gt;
  
  
  Check deep dive
&lt;/h2&gt;

&lt;p&gt;Let’s dive into all the available checks.&lt;br&gt;&lt;br&gt;
If you’re only interested in the conclusions you can skip this section, but you’ll miss all the differences between the paid and free versions. To give an order of magnitude, I’ve marked where the paid-only checks begin in each list.&lt;/p&gt;


&lt;h3&gt;
  
  
  Resume check
&lt;/h3&gt;

&lt;p&gt;Along with uploading the resume, the website asks for the seniority level to check against: Junior, Mid, or Senior. From what I’ve experienced, this affects how the different sections of the resume (or linkedin profile) are weighted relative to each other.&lt;/p&gt;

&lt;h4&gt;
  
  
  Impact
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Quantifying impact&lt;/em&gt; - It basically searches for any number that can be put into the resume, in order to favour a quantitative over a qualitative approach.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Repetition&lt;/em&gt; - This step encourages the applicant to remove repeated words. (The following 4 checks are paid-only.)&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Weak verbs&lt;/em&gt; - Checks for phrases like “responsible for” or “assisted” that reduce the impact.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Verb tenses&lt;/em&gt; - Checks if verbs are in the correct tense and person: first person instead of second or third, and simple past instead of present continuous.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Responsibilities&lt;/em&gt; - Checks whether the candidate’s wording focuses more on responsibilities than accomplishments (e.g. &lt;em&gt;managed x people&lt;/em&gt; is better than &lt;em&gt;responsible for&lt;/em&gt;).&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Spelling and consistency&lt;/em&gt; - Checks for spelling errors or inconsistencies like always using British or American English.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Brevity
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Length&lt;/em&gt; - Checks the resume length. For a senior profile, 1-2 pages are fine.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Use of bullets&lt;/em&gt; - Bullet lists are deemed more effective than paragraphs.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Total bullets&lt;/em&gt; - For senior positions, 12-32 bullet points seem to be an effective count. (The following 2 checks are paid-only.)&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Bullet length&lt;/em&gt; - Each bullet point should be between 10 and 30 words.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Filler words&lt;/em&gt; - Adjectives and adverbs (filler words) should be replaced by numbers.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Style
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Buzzword&lt;/em&gt; - Checks for words that don’t add any value, such as “good team player”, “strong leadership”. &lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Dates&lt;/em&gt; - Checks if the resume displays all the dates consistently.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Contact and personal details&lt;/em&gt; - Checks if the resume is not disclosing any unwanted information, such as date of birth or race.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Readability&lt;/em&gt; - Checks that sentences are not convoluted, too long, or in passive voice. Sections should be easy to find with the proper keywords. (The following 3 checks are paid-only.)&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Personal Pronouns&lt;/em&gt; - Checks for personal pronouns, which should be avoided.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Active voice&lt;/em&gt; - Checks for use of passive voice, whereas active voice should be preferred.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Consistency&lt;/em&gt; - Consistency checks. For instance, it checks whether bullet points consistently end with a period or not.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Sections
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Summary&lt;/em&gt; - Checks for the presence of a summary section. If found, it checks for correct length. In the paid version, it also shows buzzword and effectiveness checks.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Education&lt;/em&gt; - Checks for the presence of an education section. If found, it checks if it is relevant to the candidate’s seniority level.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Unnecessary section&lt;/em&gt; - Checks for references and objective sections, which are deemed irrelevant. For a senior-level resume, hobbies and interests should also be removed.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Skills&lt;/em&gt; - Checks for the presence of a skills section. Soft-skill checks are reserved for paying users.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Skills
&lt;/h4&gt;

&lt;h5&gt;
  
  
  Soft skills
&lt;/h5&gt;

&lt;p&gt;The following 5 soft-skills checks are paid-only.&lt;/p&gt;

&lt;p&gt;These checks are all about finding sentences about what the candidate did in their past experiences.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Communication&lt;/em&gt; - Presented something or lectured about something.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Leadership&lt;/em&gt; - Led or coached a team, took initiative.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Analytical&lt;/em&gt; - Worked with numbers, designed new processes.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Teamwork&lt;/em&gt; - Coordinated with different departments, worked in team.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Drive&lt;/em&gt; - More or less the same checks as &lt;em&gt;Leadership&lt;/em&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;h5&gt;
  
  
  Hard skills
&lt;/h5&gt;

&lt;p&gt;This section searches for the presence of keywords that should be on the resume, for instance &lt;em&gt;Java&lt;/em&gt; on mine.&lt;/p&gt;

&lt;h4&gt;
  
  
  Tools
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Line Analysis&lt;/em&gt; - Each line is examined with the criteria above, and it can be rewritten as a bullet point.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Magic write&lt;/em&gt; - This asks the AI to rewrite the sentences. It can also be used in a “freestyle” mode, where the suggestion is editable in a free text box. It is subject to a credit scheme. There are also “Samples from top resumes”, but it’s a hard task to merge superhero sentences into a standard resume.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Sample bullets&lt;/em&gt; - Like Magic write, but more oriented to bullets.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Action verbs&lt;/em&gt; - A list of synonyms to choose from for higher impact.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Linkedin profile check
&lt;/h3&gt;

&lt;p&gt;In this case I’m instructed to export my linkedin profile as a PDF file (using the standard export function that Linkedin offers), and then upload it to the tool. Here’s a list of what gets checked:&lt;/p&gt;

&lt;h4&gt;
  
  
  Headline
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Headline length&lt;/em&gt; - This is a word count. It seems that 20 words and 168 characters are a good fit.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Redundant words&lt;/em&gt; - No repeated words.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Hard skills&lt;/em&gt; - Keyword match against their database of skills to use in the profile.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The following 6 checks are paid only:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Job titles&lt;/em&gt; - Job title should be present. In my case &lt;em&gt;Java Developer&lt;/em&gt; was deemed fine.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Language overuse&lt;/em&gt; - Checks for repetitions of job titles.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Boastful language&lt;/em&gt; - Checks for overconfident terms.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Special characters&lt;/em&gt; - Checks if emojis have been overused.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Buzzwords and clichés&lt;/em&gt; - Checks for vaporware words such as: “motivated”, “team player” or “hardworking”.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Spelling&lt;/em&gt; - This check does not affect the resume score; otherwise it would be a nightmare, especially in tech resumes full of languages, protocols, and whatnot.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Summary
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Summary length&lt;/em&gt; - This is a word count. It seems that 200 - 300 words are a good fit.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Use of special characters&lt;/em&gt; - Icons and emojis are fun, but should not be overused. This is what the check is about.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Hard skills&lt;/em&gt; - Same checks as in the hard skills section in the headline.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The following 9 checks are paid only:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Call to action&lt;/em&gt; - Checks if a CTA is present, such as “Feel free to reach out to me on…” &lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Readability&lt;/em&gt; - Checks for complex sentences, which might affect the engagement.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Buzzwords and clichés&lt;/em&gt; - Checks for &lt;em&gt;vaporware&lt;/em&gt; words such as: “motivated”, “team player” or “hardworking”.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Use of metrics&lt;/em&gt; - Checks for measurable items, such as “10+ years” or “data throughput 100K/day”.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Active voice&lt;/em&gt; - Checks for passive voice sentences, which affect readability.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Sentiment analysis&lt;/em&gt; - Checks for a positive tone in the summary.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Tense&lt;/em&gt; - Checks for sentences in the 2nd or 3rd person, which should be avoided in favour of 1st person sentences.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Spelling&lt;/em&gt; - Checks the spelling as in the headline.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Special characters&lt;/em&gt; - Checks if emojis have been overused.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Experience
&lt;/h4&gt;

&lt;p&gt;For each experience that shows up in the profile, these checks are performed:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Hard skills&lt;/em&gt; - Same checks as in the hard skills section in the headline.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Specific job title&lt;/em&gt; - This one can be tricky, because titles such as “Senior consultant” or “Java developer” are considered too vague. Two examples of how I’ve pleased the algorithm:

&lt;ul&gt;
&lt;li&gt;Senior Java backend web developer | Unicredit | SGSS Italia (this was some years ago in consultancy)&lt;/li&gt;
&lt;li&gt;Java / Scala backend application and integration developer (this is my current position)&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The following 8 checks are paid only:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Work description details&lt;/em&gt; - Checks for keywords and whether the work description is 50+ words long.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Recruiter red flag and unemployment indicators&lt;/em&gt; - Checks if the experience is, for example, a “planned career break”&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Quantified impact&lt;/em&gt; - Checks for measurable items, such as “10+ years” or “data throughput 100K/day”.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Weak language&lt;/em&gt; - Checks for phrases like “responsible for” or “assisted” that reduce the impact.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Spelling&lt;/em&gt; - This check does not affect the resume score; otherwise it would be a nightmare, especially in tech resumes full of languages, protocols, and whatnot.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Active voice&lt;/em&gt; - Checks for passive voice sentences, which affect readability.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Special characters&lt;/em&gt; - Checks if emojis have been overused.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Buzzwords &amp;amp; clichés&lt;/em&gt; - Checks for &lt;em&gt;vaporware&lt;/em&gt; words such as: “motivated”, “team player” or “hardworking”.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Education
&lt;/h4&gt;

&lt;p&gt;This section checks whether the education entries match the experience through keywords, whether the dates are reported (unless they are too far in the past, 15+ years), and whether there’s some reference to joining an alumni group.&lt;/p&gt;

&lt;h3&gt;
  
  
  Other
&lt;/h3&gt;

&lt;p&gt;Checks about other nice-to-haves:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Custom profile URL&lt;/li&gt;
&lt;li&gt;Honors and awards&lt;/li&gt;
&lt;li&gt;Location is present&lt;/li&gt;
&lt;li&gt;Certification section&lt;/li&gt;
&lt;li&gt;Professional profile photo&lt;/li&gt;
&lt;li&gt;Skills section&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Keywords to Add
&lt;/h4&gt;

&lt;p&gt;This section suggests keywords that can be added to the LinkedIn profile. The free version offers far fewer suggestions than the paid one.&lt;/p&gt;

&lt;h4&gt;
  
  
  Keyword analysis
&lt;/h4&gt;

&lt;p&gt;Paying users only - This tool ranks the keywords found in the profile based on their placement (headline, summary…).&lt;/p&gt;

&lt;h4&gt;
  
  
  Networking
&lt;/h4&gt;

&lt;p&gt;Paying users only - This lists some examples of messages a user might want to send on LinkedIn for networking purposes.&lt;/p&gt;

&lt;h3&gt;
  
  
  Resume targeting
&lt;/h3&gt;

&lt;p&gt;Given a job description, it tells you how much your CV matches it based on found keywords.&lt;/p&gt;
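
&lt;p&gt;As an aside, here is a naive sketch of how such a keyword-overlap match could be computed. This is purely my own assumption about the mechanism; the tool’s actual algorithm is not public:&lt;/p&gt;

```python
import re

def keyword_match_score(job_description, resume):
    """Naive keyword-overlap score: the fraction of job-description words
    (4 letters or more, to skip filler words) that also appear in the resume."""
    def tokenize(text):
        return {w for w in re.findall(r"[a-z0-9+#]+", text.lower()) if len(w) >= 4}
    wanted = tokenize(job_description)
    found = wanted.intersection(tokenize(resume))
    return len(found) / len(wanted) if wanted else 0.0

score = keyword_match_score(
    "Senior Java developer with Spring and Kafka experience",
    "Java / Scala backend developer, daily work with Kafka pipelines",
)
# 4 of the 7 extracted job-description words appear in the resume
```

&lt;p&gt;Exact-match scoring like this explains the rigidity noted below: “Kafka pipelines” scores, while “event streaming” would not.&lt;/p&gt;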

&lt;h2&gt;
  
  
  What I liked
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;I’ve been given a lot of ideas and suggestions while navigating through the various sections of the website. I had some doubts about my resume and those have been cleared up (spoiler: if you think that’s a bad idea, yes, it is).&lt;/li&gt;
&lt;li&gt;The checks are repeated through the various sections, and that also helps the overall consistency.&lt;/li&gt;
&lt;li&gt;Going through keywords and metrics can be exhausting, but it forces the resume writer to squeeze the experiences and present them in the most effective way. I’m also stating this point in the “What I did not like” section below because I think it’s the other side of the coin.&lt;/li&gt;
&lt;li&gt;I think that the LinkedIn profile analysis is the section that delivers the most value. Probably having a fixed structure to start with helps the tool a lot in categorizing the information.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  What I did not like
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;There are no multiple resume versions (not even for paying users). It would have been nice if more than one version were allowed, for example a longer and a shorter one.&lt;/li&gt;
&lt;li&gt;The free version only allows two uploads of the resume. Basically, the resume should be fully adjusted on the first attempt.&lt;/li&gt;
&lt;li&gt;A lot of the flagged typos were actually tech keywords.&lt;/li&gt;
&lt;li&gt;The keyword-based checks are too rigid; sometimes it seems that only an exact match will score the point in the analysis. This is especially true in the resume targeting section. Also, the call to action check only passed when I used the keywords found in the corresponding suggestion section.&lt;/li&gt;
&lt;li&gt;The tool craves numbers too much. Software development metrics are often different from business ones, and it’s difficult to directly link a feature release to revenue. Most of the time these are matters for different divisions.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusions
&lt;/h2&gt;

&lt;p&gt;I think there is a lot of difference between the paid and the free versions, and $40 a month (it gets cheaper if invoiced quarterly or yearly) is indeed quite expensive. Nowadays ATSs scan resumes, and it’s important to give the algorithm what the algorithm wants; tools like this provide great help in tuning the CV for that. It’s always interesting to compare what you have done with a benchmark of some kind, be it a colleague or an automated system. As of today, I paid for a one-month subscription and I’m keeping it only till the end of the month.&lt;br&gt;&lt;br&gt;
I recommend everyone give the free version a try. If interested, go for a one-month subscription to leverage all the possible checks.&lt;br&gt;&lt;br&gt;
I’ll be more than happy to hear your thoughts!&lt;/p&gt;

</description>
      <category>tools</category>
      <category>cv</category>
      <category>resume</category>
      <category>ai</category>
    </item>
    <item>
      <title>A new AWS account: leave ROOT user and look out for expenses</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Tue, 16 Jan 2024 09:15:02 +0000</pubDate>
      <link>https://dev.to/maguzzi/a-new-aws-account-leave-root-user-and-look-out-for-expenses-4f8d</link>
      <guid>https://dev.to/maguzzi/a-new-aws-account-leave-root-user-and-look-out-for-expenses-4f8d</guid>
      <description>&lt;p&gt;Congrats! you’ve just opened a brand new AWS account. What now? Beside getting rid of the root account, the second most wise action to do before doing anything is setting some control for bills.&lt;br&gt;&lt;br&gt;
I’m writing this post because some months ago I incurred in a 20 - something dollar bill from AWS for one of the accounts I opened in order to do some exercises. The account hadn’t much going on, but I left a disconnected elastic IP on for about a week… thus the mishap.&lt;br&gt;&lt;br&gt;
So let’s see what I’d love to have done in that situation, of course along with the respective Cloudformation templates.&lt;/p&gt;
&lt;h2&gt;
  
  
  Activate cost explorer
&lt;/h2&gt;

&lt;p&gt;While still on the root account, you might want to turn Cost Explorer on. You can do it from two places: in the main UI you should see a “Cost and usage” box, with a “Turn on cost explorer” button at its center.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NDIj9tcr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/turn_on_cost_explorer_root.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NDIj9tcr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/turn_on_cost_explorer_root.png" alt="Wise root usage: activate cost explorer" width="636" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Or you can go to the upper left corner of the webpage, open the account dropdown by clicking on it, and go to “Billing and cost management” from there.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HpBLf4jt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/cost_management_other_access.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HpBLf4jt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/cost_management_other_access.png" alt="Activate cost explorer: an alternate path" width="322" height="339"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  A little wait
&lt;/h3&gt;

&lt;p&gt;AWS Cost Explorer needs roughly 24 hours to set itself up and start collecting spending data. Because of this, even with the root account you might see an “access denied” message on the Cost Explorer UI right after activation.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Bbw-d27I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/access_denied.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Bbw-d27I--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/access_denied.png" alt="Wait for cost explorer to set up" width="800" height="240"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While waiting for AWS to set up its data, you can create a new admin user that you will use to manage the account and view the bills.&lt;/p&gt;
&lt;h3&gt;
  
  
  Create a new IAM user
&lt;/h3&gt;

&lt;p&gt;I’ll go quickly over this procedure, since there are a lot of step-by-step tutorials out there.&lt;br&gt;&lt;br&gt;
Log in to the root account and, using the search bar on the top left of the webpage, search for IAM. There will be a bunch of warnings such as “your root account does not have MFA enabled”; let’s skip those. Search for “Users” and, on the page you’ll be directed to, click on “Create user”. You’ll get to a page like this one:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Nm35GDgw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/create-user-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Nm35GDgw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/create-user-1.png" alt="Create user - Name and password" width="800" height="253"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Be sure to flag “Provide user access to the AWS Management Console - &lt;em&gt;optional&lt;/em&gt;“. This will trigger some other warnings about how users are created. Since this user will be an administrator and will also be used for creating CloudFormation stacks later, we’ll skip those. I’m specifying the password on creation and unflagging “ask password on login” for simplicity.&lt;br&gt;&lt;br&gt;
Next, let’s give our newly created user administrator powers. To do so, AWS already provides a policy that can be attached directly to the user and basically states “can do anything on any resource”. Below is the policy choice:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BEw8R2Kr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/create-user-2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BEw8R2Kr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/create-user-2.png" alt="Create user - Administrator policy" width="800" height="265"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After attaching the policy to the user, we now have an &lt;em&gt;admin&lt;/em&gt; account that has all the power we need without being the root account. However, if we log in with this user and search for Cost Explorer, we’ll still see “access denied”, as shown below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_DmuBzY6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/iam-access-denied.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_DmuBzY6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/iam-access-denied.png" alt="Still no access to cost explorer :-(" width="653" height="226"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So, it’s time to log back in with the root account (for the last time) and enable IAM cost control.&lt;/p&gt;
&lt;h2&gt;
  
  
  Enable IAM cost control
&lt;/h2&gt;

&lt;p&gt;While logged in as the root user, search for “account” settings in the user menu on the top right of the page:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HvyLGuvF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/activate-iam-billing-1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HvyLGuvF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/activate-iam-billing-1.png" alt="Activate iam billing - Account settings" width="331" height="379"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Scroll down on the page and search for “IAM user and role access to Billing information”:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jnapFGi4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/activate-iam-billing-2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jnapFGi4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/activate-iam-billing-2.png" alt="Activate iam billing - setting" width="800" height="106"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click on the edit button and activate it. Now you can get back to the admin user and, if you were logged in from another browser, hit refresh: you should see Cost Explorer enabled, stating a reassuring 0 USD in expenses.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cJuoT1oC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/cost_and_usage_iam.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cJuoT1oC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/cost_and_usage_iam.png" alt="Cost and usage from the admin account" width="644" height="463"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can now go to billing and cost management and search for the cost monitor. As expected, both the budget and the monitor require setup. Below is what you should see before configuring them. Please note two things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Depending on when you create the account, you may see a default cost monitor already set up, with 100 USD and 40% usage thresholds&lt;/li&gt;
&lt;li&gt;You should still see an “access denied” below the “Total forecasted month costs”, but that’s fine; it’s only because the account has just been created.
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--uiCx84ej--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/cost_monitor_before.png" alt="Budget and monitor before setup" width="370" height="331"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;After some time, the cost explorer preview in the home page should look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--qlUZr79N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/cost_and_usage_after_some_time.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--qlUZr79N--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/cost_and_usage_after_some_time.png" alt="Cost explorer preview after some time" width="627" height="442"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As you can see, the costs are split by service and by month, and it should be clear which service is spending the most money. Now let’s get to CloudFormation.&lt;/p&gt;
&lt;h2&gt;
  
  
  Create Budget and Monitor
&lt;/h2&gt;

&lt;p&gt;The CloudFormation file that is going to set up the budget and the monitor is quite simple. I’m showing the whole file first, and then I’ll get to the highlights.&lt;/p&gt;



&lt;p&gt;Here is what to expect after running the CloudFormation template:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--p_Kv4pGD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/cost-monitor-after-cloudformation.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--p_Kv4pGD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/cost-monitor-after-cloudformation.png" alt="Budget and monitor created" width="376" height="173"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Parameter section
&lt;/h3&gt;

&lt;p&gt;This stack has one input parameter: the email address where we want to send the notifications when the budget or the monitor eventually hits its thresholds. That goes from line 3 to 8. The same email address is used both for the budget and the monitor.&lt;/p&gt;
&lt;h3&gt;
  
  
  The budget
&lt;/h3&gt;

&lt;p&gt;The whole section goes from line 11 to line 41. The resource type is &lt;em&gt;AWS::Budgets::Budget&lt;/em&gt;. Some info about the properties:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Budget&lt;/em&gt;: You’ll have the BudgetLimit, with the amount and the unit. Even after switching my billing to EUR, the only accepted value seems to be USD. Anyway, it’s stating that the expenses should not exceed 6 bucks&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;TimeUnit&lt;/em&gt;: Here we’re saying 6 bucks a month, max&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;BudgetType&lt;/em&gt;: Just &lt;em&gt;COST&lt;/em&gt; here. Other values would have pointed to usage, reserved instance usage, or savings plans. We’re not doing anything that fancy here&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;NotificationsWithSubscribers&lt;/em&gt;: Where to send the email when the threshold gets hit. In this case it’s stating that if the forecast is greater than 80% of the threshold (6 dollars), the email will be sent.&lt;/li&gt;
&lt;/ul&gt;
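
&lt;p&gt;Since the gist with the full file isn’t embedded here, this is a minimal sketch of what the budget resource described above could look like. It is my reconstruction from the values stated in the bullets, not the exact file; the parameter name comes from the CLI commands later in the post.&lt;/p&gt;

```yaml
Resources:
  MonthlyCostBudget:
    Type: AWS::Budgets::Budget
    Properties:
      Budget:
        BudgetName: monthly-cost-budget   # hypothetical name
        BudgetLimit:
          Amount: 6          # six bucks...
          Unit: USD          # USD seems to be the only accepted unit
        TimeUnit: MONTHLY    # ...a month, max
        BudgetType: COST
      NotificationsWithSubscribers:
        - Notification:
            NotificationType: FORECASTED   # fire on the forecast...
            ComparisonOperator: GREATER_THAN
            Threshold: 80                  # ...at 80% of the 6 USD limit
          Subscribers:
            - SubscriptionType: EMAIL
              Address: !Ref EmailAddressForNotificationParameter
```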

&lt;p&gt;The budget can also be set up to take actions such as “run stuff if costs are too high”, but the architecture for this website isn’t a good fit for an actual example.&lt;br&gt;&lt;br&gt;
Let’s see how a budget line shows up after the stack has been created:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9AE_urOR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/budget1_line.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9AE_urOR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/budget1_line.png" alt="A budget line" width="800" height="109"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Clicking on the line, you can see the configuration of the budget and if any alarm has been fired:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mvjuA8ib--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/budget_detail.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mvjuA8ib--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/budget_detail.png" alt="Budget and alarms" width="800" height="379"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  The monitor
&lt;/h3&gt;

&lt;p&gt;This section goes from line 42 to the end of the file. Also in this case the instructions are for sending an email if the thresholds are hit, but in a different flavor. Lines 45 to 47 state how the costs are tracked, and lines 50 to the end state that the monitor will scan the costs every day (&lt;em&gt;Frequency:Daily&lt;/em&gt;, line 55), checking if any of the services in use is spending more than 10 USD. If so, the email will be sent.&lt;br&gt;&lt;br&gt;
In this file the monitor specifications are kept as simple as possible, thus &lt;em&gt;SERVICE&lt;/em&gt; is a mandatory value, and the type is &lt;em&gt;DIMENSIONAL&lt;/em&gt;.&lt;br&gt;&lt;br&gt;
The monitor could also split the costs at a finer grain (e.g. by tags on resources), but (luckily or not) my costs are too low and the resources involved wouldn’t fit such an analysis.&lt;br&gt;&lt;br&gt;
Here’s the page of the monitor details, showing the history of fired anomalies (luckily, none so far).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YwGlbar7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/cost_anomaly_detection_summary.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YwGlbar7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cost_explorer/cost_anomaly_detection_summary.png" alt="Cost anomaly detection summary" width="800" height="274"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Cloudformation CLI setup
&lt;/h2&gt;

&lt;p&gt;I’m splitting the CLI instructions over multiple lines in order to minimize scrolling.&lt;br&gt;&lt;br&gt;
There’s no dev or prod environment in this case, since both are under the same AWS account.&lt;/p&gt;
&lt;h3&gt;
  
  
  Create stack
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation create-stack --stack-name cost-control-stack \
--template-body file://cost-control.json \
--parameters ParameterKey=EmailAddressForNotificationParameter, \
ParameterValue=&amp;lt;email\_address&amp;gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h3&gt;
  
  
  Update stack
&lt;/h3&gt;

&lt;p&gt;The only relevant point here is to use &lt;em&gt;UsePreviousValue=true&lt;/em&gt; in order to leave the email notification parameter untouched.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation update-stack --stack-name cost-control-stack \
--template-body file://cost-control.json \
--parameters ParameterKey=EmailAddressForNotificationParameter, \
UsePreviousValue=true

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-budgets-budget.html"&gt;https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-budgets-budget.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-ce-anomalymonitor.html"&gt;https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-ce-anomalymonitor.html&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>website</category>
      <category>aws</category>
      <category>cloudformation</category>
      <category>gist</category>
    </item>
    <item>
      <title>Cloudformation templates for Cloudfront automatic cache invalidation using Lambda within CodePipeline</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Wed, 03 Jan 2024 18:00:00 +0000</pubDate>
      <link>https://dev.to/maguzzi/cloudformation-templates-for-cloudfront-automatic-cache-invalidation-using-lambda-within-codepipeline-39ko</link>
      <guid>https://dev.to/maguzzi/cloudformation-templates-for-cloudfront-automatic-cache-invalidation-using-lambda-within-codepipeline-39ko</guid>
      <description>&lt;p&gt;(original post: &lt;a href="https://marcoaguzzi.it/2024/01/03/lambda-invalidation-cloudformation/"&gt;https://marcoaguzzi.it/2024/01/03/lambda-invalidation-cloudformation/&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;In this post I’m going to show how I triggered an automatic cache invalidation for the CloudFront distribution that serves this website. As in the previous posts, all the resources will be provisioned via CloudFormation.&lt;br&gt;&lt;br&gt;
At the end of the post, the CLI commands to create and / or update the resources are shown.&lt;/p&gt;
&lt;h2&gt;
  
  
  The manual procedure
&lt;/h2&gt;

&lt;p&gt;Once the markdown file for a post is written and a local compilation / rendering has been made, the markdown source can be pushed to the git repo. That triggers the AWS CodePipeline, which will download the source, render the markdown into HTML, and push the result to the S3 bucket served by CloudFront.&lt;br&gt;&lt;br&gt;
Since CloudFront is serving the S3 bucket, caching is in place. Newly pushed content won’t be visible until the cache expires, which is not acceptable. So, after a successful compilation and push to S3, I manually go to the CloudFront distribution invalidations and fire a new invalidation. This way I’m sure that subsequent requests to the website will get the newly updated content.&lt;br&gt;&lt;br&gt;
The images below show the steps for manual invalidation:&lt;/p&gt;

&lt;p&gt;Go to CloudFront / Distributions, and search for “Invalidations” tab&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GzL7WX5V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cache-invalidation-cloudformation/cloudfront_invalidation_1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GzL7WX5V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cache-invalidation-cloudformation/cloudfront_invalidation_1.png" alt="Cloudfront invalidation manual step 1" width="744" height="185"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then select the last successful invalidation (shown below on the very left) and click “Copy to new” (upper right)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Fohxo2wz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cache-invalidation-cloudformation/cloudfront_invalidation_2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Fohxo2wz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cache-invalidation-cloudformation/cloudfront_invalidation_2.png" alt="Cloudfront invalidation manual step 2" width="800" height="174"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And then confirm the copy of the invalidation with the last path (the /* path is fine, since AWS charges per invalidation, regardless of how deep it goes)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KwrXvuav--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cache-invalidation-cloudformation/cloudfront_invalidation_3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KwrXvuav--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cache-invalidation-cloudformation/cloudfront_invalidation_3.png" alt="Cloudfront invalidation manual step 3" width="800" height="364"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The invalidation takes a few minutes to complete, and then the website is good to go. This is a mundane and easy-to-forget task, so I’d better automate it.&lt;/p&gt;
&lt;h2&gt;
  
  
  Automation setup
&lt;/h2&gt;

&lt;p&gt;There is no “invalidate cache” action that can be called directly from CodePipeline. A Lambda that actually creates the invalidation is needed, and it must be called as an action in the CodePipeline structure.&lt;br&gt;&lt;br&gt;
Let’s see the two resources in detail:&lt;/p&gt;
&lt;h3&gt;
  
  
  The Lambda function
&lt;/h3&gt;

&lt;p&gt;The Lambda function will leverage the boto3 Python library to create the invalidation and notify the pipeline of the outcome (credits to the website in the references section).&lt;br&gt;&lt;br&gt;
Let’s see some highlights. At the end of this section, a link to the Gist with the full source is provided.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;Read cloudformation ID from environment&lt;/em&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;Caller notifications&lt;/em&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;Invalidation specifications&lt;/em&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
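&lt;p&gt;Putting those highlights together, a minimal sketch of such a handler could look like this (the handler shape, the injectable clients, and the return strings are my own illustrative assumptions; the environment variable, the /* path, and the boto3 calls come from the article, and the real source is in the gist):&lt;/p&gt;

```python
# Sketch only: names other than CLOUDFRONT_DISTRIBUTION_ID and the boto3
# calls are illustrative assumptions, not the author's exact code.
import os
import time


def handler(event, context, cloudfront=None, codepipeline=None):
    """Create a CloudFront invalidation and report the outcome to CodePipeline.

    The AWS clients are injectable for testing; in Lambda they default to boto3.
    """
    if cloudfront is None or codepipeline is None:
        import boto3  # available in the Lambda runtime
        cloudfront = cloudfront or boto3.client("cloudfront")
        codepipeline = codepipeline or boto3.client("codepipeline")

    # CodePipeline passes the job id in the invocation event.
    job_id = event["CodePipeline.job"]["id"]
    try:
        # Read the cloudfront distribution ID from the environment (set by the stack).
        distribution_id = os.environ["CLOUDFRONT_DISTRIBUTION_ID"]
        # Invalidation specification: a single /* path covers everything.
        cloudfront.create_invalidation(
            DistributionId=distribution_id,
            InvalidationBatch={
                "Paths": {"Quantity": 1, "Items": ["/*"]},
                "CallerReference": str(time.time()),  # must be unique per request
            },
        )
        # Caller notification: tell the pipeline the action succeeded.
        codepipeline.put_job_success_result(jobId=job_id)
        return "invalidation created"
    except Exception as exc:
        # Caller notification: mark the pipeline action as failed.
        codepipeline.put_job_failure_result(
            jobId=job_id,
            failureDetails={"type": "JobFailed", "message": str(exc)},
        )
        return "invalidation failed"
```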
&lt;h1&gt;
  
  
  Click to view the Gist with the Lambda Python code
&lt;/h1&gt;
&lt;h3&gt;
  
  
  Lambda Cloudformation stack and how to reference it
&lt;/h3&gt;

&lt;p&gt;The Lambda cloudformation stack is similar to the one presented for the 301 redirects and URL rewriting in edge locations (&lt;a href="https://dev.to/2023/12/10/redirect-301-with-lambda/"&gt;here’s the post&lt;/a&gt;); there are a few differences, though (at the end of the paragraph there is the gist with the full code):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the lambda’s resources, an environment variable is declared, and its value is read from the cloudformation stack that contains the cloudfront distribution (referenced in the python code as &lt;em&gt;CLOUDFRONT_DISTRIBUTION_ID&lt;/em&gt;).
To be able to read it from here, the cloudformation stack that contains the cloudfront distribution has to list the variable as an output and flag it for export:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Cloudfront distribution environment variable reference&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Parameters:
CloudformationExportVar:
Type: String
...
Environment:
Variables:
CLOUDFRONT\_DISTRIBUTION\_ID:
Fn::ImportValue:
!Ref CloudformationExportVar

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And here’s the export in the stack hosting the cloudfront distribution&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Exported value from cloudfront distribution resource&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Outputs": {
"CloudFrontDistributionId": {
"Description": "ID of the CloudFront distribution",
"Value": { "Fn::GetAtt": ["CloudFrontDistribution", "Id"] },
"Export": {
"Name": { "Fn::Join": ["-", [{ "Fn::Sub": "${AWS::StackName}-CloudFrontDistributionId" }, { "Ref": "Stage" }]]}
 }
 }
 }

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
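&lt;p&gt;With the stack name &lt;em&gt;marcoaguzzi-website-prod&lt;/em&gt; and &lt;em&gt;Stage&lt;/em&gt; set to &lt;em&gt;prod&lt;/em&gt;, that &lt;em&gt;Fn::Join&lt;/em&gt;/&lt;em&gt;Fn::Sub&lt;/em&gt; combination resolves to the export name that is later passed to the lambda stack as &lt;em&gt;CloudformationExportVar&lt;/em&gt;. A quick sketch of the string assembly (plain Python, just for illustration):&lt;/p&gt;

```python
# Illustration only: how the "Export"/"Name" above is assembled by CloudFormation.
def export_name(stack_name: str, stage: str) -> str:
    # "Fn::Sub": "${AWS::StackName}-CloudFrontDistributionId"
    sub_result = f"{stack_name}-CloudFrontDistributionId"
    # "Fn::Join": ["-", [sub_result, stage]]
    return "-".join([sub_result, stage])
```

&lt;p&gt;For the stacks in this article, that yields &lt;em&gt;marcoaguzzi-website-prod-CloudFrontDistributionId-prod&lt;/em&gt;, the exact value used later in the &lt;em&gt;create-stack&lt;/em&gt; command.&lt;/p&gt;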



&lt;ul&gt;
&lt;li&gt;There is no need for the edge permission (only &lt;em&gt;lambda.amazonaws.com&lt;/em&gt; is needed)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;services for the AssumeRole action&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;AssumeRolePolicyDocument:
Version: "2012-10-17"
Statement:
- Effect: Allow
Principal:
Service:
- lambda.amazonaws.com
Action: "sts:AssumeRole"

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;In addition to the &lt;em&gt;BasicExecutionRole&lt;/em&gt;, three other permissions must be granted for the creation of the invalidation and the notification back to CodePipeline

&lt;ul&gt;
&lt;li&gt;cloudfront:CreateInvalidation&lt;/li&gt;
&lt;li&gt;codepipeline:PutJobFailureResult&lt;/li&gt;
&lt;li&gt;codepipeline:PutJobSuccessResult&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Additional permissions for basic lambda role&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Policies:
- PolicyName: InvalidateCloudfrontDistributionPolicy
PolicyDocument: 
Version: "2012-10-17"
Statement:
- Effect: Allow
Action:
- cloudfront:CreateInvalidation
- codepipeline:PutJobFailureResult
- codepipeline:PutJobSuccessResult
Resource: "\*"

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That should put the lambda in place for the purpose.&lt;br&gt;&lt;br&gt;
Below you can view the full gists of the cloudformation stacks of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the lambda&lt;/li&gt;
&lt;li&gt;the cloudfront distribution (which is the parent stack, as shown in the older articles)&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;
  
  
  CodePipeline stage
&lt;/h3&gt;

&lt;p&gt;The action can be added to CodePipeline as a new stage, and from there the lambda can be referenced. Here’s how the new stage looks after the cloudformation template has been deployed (you can see that the action refers to the lambda, with no runs shown yet):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Oh1md5E---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cache-invalidation-cloudformation/cloudfront_invalidation_4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Oh1md5E---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://marcoaguzzi.it/img/cache-invalidation-cloudformation/cloudfront_invalidation_4.png" alt="Codepipeline stage invalidation new" width="650" height="260"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Codepipeline Cloudformation stack
&lt;/h3&gt;

&lt;p&gt;Let’s start with the addition of the new stage in CloudFormation. We can see:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the parameter that references the exported value from the lambda stack; its value contains the name of the exported variable, which resolves to the lambda function name&lt;/li&gt;
&lt;li&gt;the action&lt;/li&gt;
&lt;li&gt;the updated permissions that allow the pipeline to invoke the lambda function:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;New stage in Codepipeline&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Parameters": {
...
"InvalidationLambdaExported": {
"Description": "Lambda function performing the cache invalidation",
"Type": "String"
 }
}
... stages ...
{
"Actions": [
 {
"ActionTypeId": {
"Category": "Invoke",
"Owner": "AWS",
"Provider": "Lambda",
"Version": "1"
 },
"Configuration": {
"FunctionName": {"Fn::ImportValue":{"Ref":"InvalidationLambdaExported"}}
 },
"Name": "InvalidateCloudFrontCacheAction"
 }
 ],
"Name": "InvalidateCloudFrontCacheStage"
}
...
"Policies": [
 {
"PolicyDocument": {
"Statement": [
 {
"Action": [
...
"lambda:invokeFunction"
 ],
"Effect": "Allow",
"Resource": "\*"
 }
 ],
"Version": "2012-10-17"
 },
"PolicyName": "MyCodePipelineRolePolicy"
 }
]


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The full reference to the cloudformation template can be found in the gist below:&lt;/p&gt;
&lt;h1&gt;
  
  
  Click to view the Gist cloudformation for the Cloudfront distribution
&lt;/h1&gt;

&lt;script src="https://gist.github.com/maguzzi/4899697488d40105dd51ce2c37c1e327.js"&gt;&lt;/script&gt;

&lt;p&gt;Below is how the new CodePipeline stage should turn out if everything was successful:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://marcoaguzzi.it/img/cache-invalidation-cloudformation/cloudfront_invalidation_5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://marcoaguzzi.it/img/cache-invalidation-cloudformation/cloudfront_invalidation_5.png" alt="Codepipeline stage invalidation succeeded"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And that should do for adding a new stage with a new action calling the lambda and invalidating the cache.&lt;/p&gt;
&lt;h2&gt;
  
  
  Cloudformation CLI commands
&lt;/h2&gt;

&lt;p&gt;Here are the AWS CLI commands (legit ones!) that were run to create and/or update the cloudformation stacks (and the lambda, of course):&lt;/p&gt;

&lt;p&gt;&lt;em&gt;CloudFront stack update&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation package --template-file marcoaguzzi.json --s3-bucket cf-templates-e5ht2sji9no7-us-east-1 --output-template-file target\packaged-template.yaml
aws cloudformation deploy --template-file target\packaged-template.yaml --stack-name marcoaguzzi-website-prod --capabilities CAPABILITY_NAMED_IAM --parameter-overrides Stage=prod
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
It should appear the exported output variable:  
 ![Export from CloudFront](https://marcoaguzzi.it/img/cache-invalidation-cloudformation/cloudfront_output.png)

_New lambda creation, passing the export from above_

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cd lambda-cloudfront
Compress-Archive .\index.py .\lambda-cloudfront-invalidate-prod-20240101.zip
aws s3 cp lambda-cloudfront-invalidate-prod-20240101.zip s3://lambda-artifacts-bucket-maguzzi/
aws cloudformation create-stack --stack-name lambda-invalidate-cloudfront-prod --template-body file://lambda-invalidate.yaml --capabilities CAPABILITY_NAMED_IAM --parameters ParameterKey=ZipDate,ParameterValue=20240101 ParameterKey=Stage,ParameterValue=prod ParameterKey=CloudformationExportVar,ParameterValue=marcoaguzzi-website-prod-CloudFrontDistributionId-prod
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
The new stack is created and the lambda name is exported in outputs:  
 ![Export from Lambda](https://marcoaguzzi.it/img/cache-invalidation-cloudformation/lambda_output.png)

_CodePipeline update referencing the lambda exported variable in parameters_

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws cloudformation update-stack --stack-name marcoaguzzi-stack-codepipeline-prod --template-body file://marcoaguzzi-codepipeline.json --parameters file://parameters-codepipeline-prod.json --capabilities CAPABILITY_IAM
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;


And now the Codepipeline should have the last stage as shown in the pictures above, ready to invalidate the cache after the website deploy to S3 :-)

## References

- [AWS: Creating a CloudFront Invalidation in CodePipeline using Lambda Actions](https://medium.com/fullstackai/aws-creating-a-cloudfront-invalidation-in-codepipeline-using-lambda-actions-49c1fd3a3c31)
- [AWS Cloudformation user guide](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/Welcome.html)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

</description>
      <category>website</category>
      <category>aws</category>
      <category>cloudformation</category>
      <category>cloudfront</category>
    </item>
    <item>
      <title>SEO optimizations with Cloudformation</title>
      <dc:creator>Marco Aguzzi</dc:creator>
      <pubDate>Fri, 15 Dec 2023 17:49:53 +0000</pubDate>
      <link>https://dev.to/maguzzi/seo-optimizations-with-cloudformation-44bk</link>
      <guid>https://dev.to/maguzzi/seo-optimizations-with-cloudformation-44bk</guid>
      <description>&lt;p&gt;(Original post: &lt;a href="https://marcoaguzzi.it/2023/12/15/SEO-optimization-cloudformation/"&gt;https://marcoaguzzi.it/2023/12/15/SEO-optimization-cloudformation/&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;Looking (again) at SEO metrics, I wanted to fix two misbehaviors of the website: compression and error pages.&lt;br&gt;&lt;br&gt;
Let’s go through the process:&lt;/p&gt;
&lt;h1&gt;
  
  
  HTTP compression
&lt;/h1&gt;

&lt;p&gt;This has been an easy one. The SEO tool wanted the site to accept compression, so moving from requesting this (localhost:4000 is the local hexo server, where the html rendering is immediately visible):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;GET / HTTP/1.1Host: localhost:4000Accept-Encoding: gzip, deflate, br
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and getting no matching compression in response, to asking for this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;GET / HTTP/1.1Host: marcoaguzzi.itAccept-Encoding: gzip, deflate, br
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and being answered with&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Content-Encoding: br
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;which is the confirmation that Brotli compression is enabled.&lt;/p&gt;
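&lt;p&gt;To make the same check repeatable outside a browser, a tiny helper (my own sketch, not part of the original post) can inspect the response headers and report which encoding, if any, was negotiated:&lt;/p&gt;

```python
# Sketch of a response-header check; header names are matched
# case-insensitively, as HTTP requires.
def negotiated_encoding(headers, accepted=("br", "gzip", "deflate")):
    """Return the negotiated Content-Encoding if it is one we asked for, else None."""
    normalized = {name.lower(): value for name, value in headers.items()}
    encoding = normalized.get("content-encoding")
    return encoding if encoding in accepted else None
```

&lt;p&gt;Feeding it the headers from the two requests above would yield None for the local hexo server and “br” for the CloudFront-served site.&lt;/p&gt;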

&lt;h2&gt;
  
  
  Cloudformation snippet
&lt;/h2&gt;

&lt;p&gt;The configuration is done in the cloudfront distribution serving the site hosted on S3; namely, in the &lt;em&gt;DefaultCacheBehavior&lt;/em&gt;, by adding &lt;em&gt;Compress: true&lt;/em&gt; to the yaml. The other settings the cache requires (gzip and brotli enabled) are already covered by referencing the pre-made &lt;em&gt;CachingOptimized&lt;/em&gt; &lt;em&gt;CachePolicy&lt;/em&gt;. Here’s the snippet: for clarity, the &lt;em&gt;CachePolicyId&lt;/em&gt; is referenced in a map,&lt;br&gt;&lt;br&gt;
and, in this case, its value is &lt;em&gt;658327ea-f89d-4fab-a63d-7e88639e58f6&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"DefaultCacheBehavior": { "Compress": true, ..., "CachePolicyId": { "Fn::FindInMap": ["CacheMapping", "Global", "CachingOptimized"] }, ...}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/cloudfront/latest/APIReference/API_DefaultCacheBehavior.html"&gt;https://docs.aws.amazon.com/cloudfront/latest/APIReference/API_DefaultCacheBehavior.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/ServingCompressedFiles.html"&gt;https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/ServingCompressedFiles.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/using-managed-cache-policies.html"&gt;https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/using-managed-cache-policies.html&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Custom error pages
&lt;/h1&gt;

&lt;p&gt;The plugin managing the pagination of the website renders links to page “0” and to “last page + 1” (they’re hidden, though). When a crawler follows those links, cloudfront replies with a 403, because the pages are not mapped in the S3 bucket. Instead of exposing the 403 error with an awkward cloudfront xml, it is better to serve a page styled like the rest of the website, possibly returning an HTTP 200 code: the human user will see a courtesy page, and the http client won’t know that it hit a wrong path.&lt;/p&gt;

&lt;p&gt;This can be achieved by instructing cloudfront to serve custom error pages when a HTTP error is raised. The path of the custom error pages must be present in the served S3 bucket.&lt;/p&gt;

&lt;h2&gt;
  
  
  Edit the Cloudformation template
&lt;/h2&gt;

&lt;p&gt;In the &lt;em&gt;DistributionConfig&lt;/em&gt; property section of the &lt;em&gt;CloudfrontDistribution&lt;/em&gt; resource, adding a &lt;em&gt;CustomErrorResponses&lt;/em&gt; array will do the job. In this case, only the HTTP 403 error is fired for unknown paths (per the S3 bucket configuration, there should be no 404 error),&lt;br&gt;&lt;br&gt;
so one entry in the array is enough. The object is quite self-explanatory: &lt;em&gt;ErrorCode&lt;/em&gt; is the error fired, &lt;em&gt;ResponseCode&lt;/em&gt; is the code to be returned to the client, and &lt;em&gt;ResponsePagePath&lt;/em&gt; is the page to be shown:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"CloudFrontDistribution": { "Properties": { "DistributionConfig": { "CustomErrorResponses":[{ "ErrorCode" : 403, "ResponseCode" : 200, "ResponsePagePath" : "/errors/403.html" }], } }}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
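&lt;p&gt;As a mental model of what this configuration does (a sketch of mine, not CloudFront’s actual implementation), the rule lookup behaves like this:&lt;/p&gt;

```python
# Model of a CustomErrorResponses lookup: the first rule matching the origin's
# error code rewrites the status and serves the configured page.
def resolve_response(error_code, custom_error_responses):
    """Return (status_code, page_path) after applying the custom error rules."""
    for rule in custom_error_responses:
        if rule["ErrorCode"] == error_code:
            return rule["ResponseCode"], rule["ResponsePagePath"]
    return error_code, None  # no matching rule: the original error passes through

# With the single rule from the article, a 403 from S3 becomes a friendly 200.
rules = [{"ErrorCode": 403, "ResponseCode": 200, "ResponsePagePath": "/errors/403.html"}]
```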



&lt;h2&gt;
  
  
  Hexo page
&lt;/h2&gt;

&lt;p&gt;The error page has been created like any other page of the site accessible from the topbar menu:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;hexo new page errors
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and a folder &lt;em&gt;errors&lt;/em&gt; with an &lt;em&gt;index.md&lt;/em&gt; file is created.&lt;br&gt;&lt;br&gt;
Next, the &lt;em&gt;index.md&lt;/em&gt; file has been renamed to &lt;em&gt;403.md&lt;/em&gt; so that it is rendered as &lt;em&gt;403.html&lt;/em&gt; (by the &lt;em&gt;hexo generate&lt;/em&gt; command), as the &lt;em&gt;ResponsePagePath&lt;/em&gt; value states. The rendered html can then be pushed to git and deployed like any other article.&lt;br&gt;&lt;br&gt;
For reference, the actual error page is &lt;a href="https://marcoaguzzi.it/errors/403.html"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  The full Gist
&lt;/h1&gt;

&lt;p&gt;The full parent cloudformation template can be found in this gist:&lt;/p&gt;

&lt;h1&gt;
  
  
  Click to view cloudformation template on gist
&lt;/h1&gt;

</description>
      <category>website</category>
      <category>aws</category>
      <category>cloudformation</category>
      <category>cloudfront</category>
    </item>
  </channel>
</rss>
