<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: John McGuin</title>
    <description>The latest articles on DEV Community by John McGuin (@johnmcguin).</description>
    <link>https://dev.to/johnmcguin</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F259447%2F91f38b31-24f5-40e5-a3ea-e654562dbefd.jpeg</url>
      <title>DEV Community: John McGuin</title>
      <link>https://dev.to/johnmcguin</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/johnmcguin"/>
    <language>en</language>
    <item>
      <title>Github Actions Dynamic Matrix</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Fri, 23 Aug 2024 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/github-actions-dynamic-matrix-509c</link>
      <guid>https://dev.to/johnmcguin/github-actions-dynamic-matrix-509c</guid>
      <description>&lt;p&gt;From the jump I should call out that I'm not a DevOps professional. There are quite possibly dragons in this solution; I may have done things weirdly. But, it worked well for me, and I'd like to share it for posterity and my own future reference. If you just want to consult the end solution, you can find that &lt;a href="https://github.com/johnmcguin/unicorn/blob/main/.github/workflows/publish.yml" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Problem
&lt;/h2&gt;

&lt;p&gt;I am working on a template repo called &lt;a href="https://github.com/johnmcguin/unicorn" rel="noopener noreferrer"&gt;unicorn&lt;/a&gt; that is designed to make it easier to deploy websites alongside job applications. It is a monorepo that houses many &lt;a href="https://astro.build" rel="noopener noreferrer"&gt;Astro&lt;/a&gt; websites sharing the same template and UI. The &lt;a href="https://github.com/johnmcguin/unicorn/commit/26386fefa8458b09174e226e26ded63fcb9dfb2e#diff-551d1fcf87f78cc3bc18a7b332a4dc5d8773a512062df881c5aba28a6f5c48d7" rel="noopener noreferrer"&gt;initial iteration&lt;/a&gt; of the workflow simply used a static matrix for the deployment jobs. This works perfectly well, but in practice, after using the template in my own job hunt for just a couple of weeks, I already had a fairly large number of sites deployed. In that iteration, every website is redeployed on every push to main. These are fairly quick jobs, but most of them don't need to run. Better to save compute and energy and deploy only the sites that changed.&lt;/p&gt;

&lt;h2&gt;
  
  
  High level solution
&lt;/h2&gt;

&lt;p&gt;At a high level, the solution is to maintain a mapping of monorepo packages (the name of the directory found in &lt;code&gt;sites/&amp;lt;site&amp;gt;&lt;/code&gt;) to Pages projects. The workflow needs to reference this configuration, whether it is expressed in the &lt;code&gt;yml&lt;/code&gt; file itself or in a separate configuration file. The workflow also needs to determine what has changed. If just one site was changed, it should deploy only that site; if the common UI package has changed, then all sites should be deployed. The workflow now has an additional job called "prepare", which builds the matrix, and the "publish" job depends on its outputs. So the workflow should: check changes -&amp;gt; compare changes against the full site configuration -&amp;gt; deploy the relevant sites.&lt;/p&gt;
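&lt;p&gt;The overall shape might look something like the following sketch (job, step, and output names here are illustrative, not the exact workflow):&lt;/p&gt;

```yaml
jobs:
  prepare:
    runs-on: ubuntu-latest
    # Expose the computed matrix and a continue flag to downstream jobs
    outputs:
      matrix: ${{ steps.build.outputs.matrix }}
      continue: ${{ steps.build.outputs.continue }}
    steps:
      - uses: actions/checkout@v4
      # ... read sites.json, detect changes, build the matrix (id: build) ...

  publish:
    needs: prepare
    if: needs.prepare.outputs.continue == 'true'
    runs-on: ubuntu-latest
    strategy:
      matrix: ${{ fromJSON(needs.prepare.outputs.matrix) }}
    steps:
      - run: echo "Deploying ${{ matrix.site }} to ${{ matrix.project }}"
```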

&lt;h2&gt;
  
  
  Configuration
&lt;/h2&gt;

&lt;p&gt;This is the area where I am least sure I avoided a weird solution. The job needs a mapping of package names (the directory) to the name of the Cloudflare Pages project. These could differ and should not be tied together, but at the end of the day, we need to know how to map a given directory to a Pages project. I am unsure if there is a more native way to do this within the workflow syntax, but I could not find one. I ended up using a JSON file, &lt;code&gt;sites.json&lt;/code&gt;, for users to configure this mapping. This is accomplished by &lt;a href="https://github.com/johnmcguin/unicorn/blob/main/.github/workflows/publish.yml#L16" rel="noopener noreferrer"&gt;adding a step&lt;/a&gt; in the "prepare" job that reads the &lt;a href="https://github.com/johnmcguin/unicorn/blob/main/sites.json" rel="noopener noreferrer"&gt;sites.json&lt;/a&gt; file and writes it to &lt;code&gt;$GITHUB_ENV&lt;/code&gt; as an environment variable. In &lt;code&gt;sites.json&lt;/code&gt;, the keys are the directory names and the values are the Pages project names.&lt;/p&gt;
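&lt;p&gt;A minimal sketch of what such a step's shell could do (the variable name and sample mapping are illustrative; in a real workflow the runner provides &lt;code&gt;$GITHUB_ENV&lt;/code&gt;):&lt;/p&gt;

```shell
# In a real workflow the runner provides $GITHUB_ENV; default it for local runs.
GITHUB_ENV="${GITHUB_ENV:-/tmp/github_env}"

# Sample directory-to-Pages-project mapping; the real file lives at the repo root.
printf '%s\n' '{"acme": "acme-pages", "globex": "globex-pages"}' > sites.json

# Flatten the JSON to a single line and export it for later steps.
SITES_CONFIG="$(cat sites.json | tr -d '\n')"
echo "SITES_CONFIG=$SITES_CONFIG" >> "$GITHUB_ENV"
```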

&lt;h2&gt;
  
  
  Determine changes
&lt;/h2&gt;

&lt;p&gt;You could determine the changes in a number of ways, but the method I landed on was to use an existing action called &lt;a href="https://github.com/dorny/paths-filter" rel="noopener noreferrer"&gt;paths-filter&lt;/a&gt; from GitHub user &lt;a href="https://github.com/dorny" rel="noopener noreferrer"&gt;dorny&lt;/a&gt;. Thanks dorny! It has a nice API, and it's nice to have less code to parse and grok. I won't describe the action in detail here, but the &lt;a href="https://github.com/johnmcguin/unicorn/blob/main/.github/workflows/publish.yml#L18-L30" rel="noopener noreferrer"&gt;new step&lt;/a&gt; matches git changes against user-specified filters, whose results can then be queried to drive logic in your pipelines. Here, I am querying for dependency changes and core UI package changes, either of which requires re-deploying all sites. Otherwise, there will be zero to many deployments depending on which sites have changed.&lt;/p&gt;
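&lt;p&gt;For flavor, a paths-filter step might be configured roughly like this (the filter names and globs here are illustrative):&lt;/p&gt;

```yaml
- uses: dorny/paths-filter@v3
  id: filter
  with:
    # Return the matching files for each filter as a JSON array
    list-files: json
    filters: |
      deps:
        - 'package-lock.json'
      core:
        - 'packages/**'
      sites:
        - 'sites/**'

# Each filter is then queryable, e.g.:
#   steps.filter.outputs.sites        -> 'true' / 'false'
#   steps.filter.outputs.sites_files  -> JSON list of changed files
```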

&lt;p&gt;There was one feature I found myself wishing for in paths-filter: a way to specify what shape of data to return for each filter. For the sites filter, for example, I am only really concerned with the immediate children of &lt;code&gt;sites/&lt;/code&gt;. I don't &lt;em&gt;think&lt;/em&gt; there is a way to do this, so I wrote a &lt;a href="https://github.com/johnmcguin/unicorn/blob/main/.github/workflows/publish.yml#L32-L43" rel="noopener noreferrer"&gt;follow up step&lt;/a&gt; which takes the outputs from paths-filter and writes a new environment variable to &lt;code&gt;$GITHUB_ENV&lt;/code&gt; containing the unique changed sites (just each site's directory name).&lt;/p&gt;
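&lt;p&gt;The idea, sketched in shell with sample data (the file list and site names are illustrative):&lt;/p&gt;

```shell
# Changed files as a filter might report them (illustrative sample)
changed_files='sites/acme/src/pages/index.astro
sites/acme/package.json
sites/globex/src/pages/index.astro'

# Keep only the first path segment under sites/ and de-duplicate
changed_sites="$(printf '%s\n' "$changed_files" | sed -n 's|^sites/\([^/]*\)/.*|\1|p' | sort -u)"
echo "$changed_sites"
```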

&lt;h2&gt;
  
  
  Script building the matrix
&lt;/h2&gt;

&lt;p&gt;To recap, we now have the user-configured sites to deploy, the unique sites that have changed from a git perspective, and a nice workflow API to check whether sites, dependencies, or core packages have changed. With all this in place, we can write a script that builds the matrix of sites to deploy. I am not very good with bash scripting, so I wrote a node script instead. The &lt;a href="https://github.com/johnmcguin/unicorn/blob/main/.github/workflows/publish.yml#L45-L57" rel="noopener noreferrer"&gt;build dynamic matrix&lt;/a&gt; step checks the output of paths-filter and calls the node script to build the matrix for all sites if deps OR core packages have changed. If only sites have changed, it builds a matrix for just the target sites. If neither is true, it skips the script and simply sets "continue" to false. "Continue" is an output of the job for the publish job to consult. As I understand it, there is no native way in the workflow syntax to say "this was successful, but don't continue the rest of the pipeline"; hence the explicit "continue" output. The script that builds the matrix is &lt;a href="https://github.com/johnmcguin/unicorn/blob/main/scripts/build_matrix.mjs" rel="noopener noreferrer"&gt;here&lt;/a&gt;, but the gist is that it needs to return the matrix data for the publish job. It leverages the &lt;a href="https://docs.github.com/en/actions/writing-workflows/workflow-syntax-for-github-actions#jobsjob_idstrategymatrixinclude" rel="noopener noreferrer"&gt;include syntax&lt;/a&gt; for a matrix job.&lt;/p&gt;
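&lt;p&gt;A rough shell sketch of what the step produces (the actual project does this in a node script that reads the mapping from &lt;code&gt;sites.json&lt;/code&gt;; the site and project names below are illustrative):&lt;/p&gt;

```shell
# Illustrative input: the unique changed site directories
changed_sites='acme
globex'

# Build {"include":[{"site":...,"project":...}]} for the publish job's matrix.
# Unquoted expansion is deliberate: it splits the list on newlines.
entries=""
for site in $changed_sites; do
  entry="{\"site\":\"$site\",\"project\":\"$site-pages\"}"
  if [ -z "$entries" ]; then entries="$entry"; else entries="$entries,$entry"; fi
done
matrix="{\"include\":[$entries]}"

# In a real workflow the runner provides $GITHUB_OUTPUT; default it for local runs.
GITHUB_OUTPUT="${GITHUB_OUTPUT:-/tmp/github_output}"
echo "matrix=$matrix" >> "$GITHUB_OUTPUT"
echo "continue=true" >> "$GITHUB_OUTPUT"
echo "$matrix"
```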

&lt;h2&gt;
  
  
  Outcome
&lt;/h2&gt;

&lt;p&gt;What we now have is a workflow that publishes only the sites that need to be deployed. If nothing needs to deploy, the workflow exits successfully after the prepare job and skips the publish job. This is exactly what we were hoping for. I found this to be a really good little project for expanding my knowledge of GitHub Actions. Hopefully you've found it a little helpful in your own Actions adventures as well.&lt;/p&gt;

</description>
      <category>git</category>
      <category>github</category>
      <category>githubactions</category>
      <category>devops</category>
    </item>
    <item>
      <title>Managing Multiple Apps On Fly.io</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Mon, 12 Aug 2024 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/managing-multiple-apps-on-flyio-454a</link>
      <guid>https://dev.to/johnmcguin/managing-multiple-apps-on-flyio-454a</guid>
      <description>&lt;p&gt;&lt;strong&gt;TLDR;&lt;/strong&gt; remove the &lt;code&gt;app&lt;/code&gt; key in &lt;code&gt;fly.toml&lt;/code&gt; to get prompted to select the application when running fly commands that depend on an application.&lt;/p&gt;

&lt;p&gt;There is a time and a place for running your own VPS or VPC, but when I am in the mode of wanting to focus on writing and shipping application code as a solo developer, I don't think there is anything wrong with using a PaaS. The platform I've been reaching for recently is &lt;a href="https://fly.io" rel="noopener noreferrer"&gt;fly.io&lt;/a&gt;. I was initially drawn to check out fly because they're an Elixir shop (😍), and I was impressed by the offering. While I don't use their database services, for deploying applications they've struck a really nice balance of simplicity and power. Further, by deploying with Docker-based Elixir / BEAM releases, I should be able to port this to a different cloud environment or VPS without too much fuss.&lt;/p&gt;

&lt;p&gt;As with any tech you are new to, the surface area of the product and APIs can at times be overwhelming. As a new user with many apps on Fly, I was bitten a few times by accidentally running commands against the wrong application, leading to unwanted releases, unwanted scaling, or at worst, broken releases. While the ability to deploy, provision, etc., directly from the command line is a powerful tool for the solo developer, it's not without its drawbacks. You have to be careful. There are numerous commands for which I wish there were a &lt;code&gt;--dry-run&lt;/code&gt; flag, or a confirmation step before execution.&lt;/p&gt;

&lt;p&gt;My first workaround was to write bash scripts for my projects that wrapped the &lt;code&gt;fly&lt;/code&gt; commands I wanted to be extra careful with. These scripts would require an environment name or application name, for example. This worked fine, but it had to be written in the first place, and it required me, and any future developers on the project, to use the script for deployments. Maybe not a big deal once integrated into CD automations, but still more open to error than is desirable. There was a simpler, more error-proof, more blessed path that I didn't initially know about: simply remove the &lt;code&gt;app&lt;/code&gt; key from the &lt;code&gt;fly.toml&lt;/code&gt; file. Now, when you execute &lt;code&gt;fly&lt;/code&gt; commands that require an application to run against, the CLI will prompt you for the application before it executes.&lt;/p&gt;
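&lt;p&gt;For illustration, a minimal &lt;code&gt;fly.toml&lt;/code&gt; might look like this (the region and port are placeholders); removing the &lt;code&gt;app&lt;/code&gt; line is what triggers the prompt:&lt;/p&gt;

```toml
# app = "my-app-staging"   # remove this line to be prompted for the target app
primary_region = "ord"

[http_service]
  internal_port = 4000
  force_https = true
```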

</description>
    </item>
    <item>
      <title>Configuring Ngrok With A Static Domain</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Fri, 09 Aug 2024 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/configuring-ngrok-with-a-static-domain-2l73</link>
      <guid>https://dev.to/johnmcguin/configuring-ngrok-with-a-static-domain-2l73</guid>
      <description>&lt;p&gt;I have been working on a Shopify app called &lt;a href="https://filevaultpro.co" rel="noopener noreferrer"&gt;File Vault Pro&lt;/a&gt; to offer a better experience selling digital goods on Shopify. For this app, I used the Shopify CLI to create a Remix BFF, while writing the core application server in Elixir / Phoenix. The Shopify CLI automatically configures the Shopify application environment and handles a dynamically named https tunnel (via Cloudflare tunnels) really well, but I also needed an https tunnel to the backend service (I am using ngrok here), which could not be dynamically updated. This would require running the backend server with ngrok, updating the environment variable for the API endpoint with the dynamically generated URL in the Remix project and &lt;em&gt;then&lt;/em&gt; running the Remix server. Not the end of the world, but also not the most enjoyable workflow. Then I found that you can use custom static domains with ngrok.&lt;/p&gt;

&lt;h2&gt;
  
  
  Create Custom Domain
&lt;/h2&gt;

&lt;p&gt;Ngrok now offers &lt;a href="https://ngrok.com/blog-post/free-static-domains-ngrok-users" rel="noopener noreferrer"&gt;one free static domain&lt;/a&gt;. First, create a custom domain in ngrok by following the steps outlined in the linked blog post.&lt;/p&gt;

&lt;h2&gt;
  
  
  Update Ngrok Configuration File
&lt;/h2&gt;

&lt;p&gt;With the new static domain name in hand, we next need to update or create the ngrok configuration file. Running &lt;code&gt;ngrok config check&lt;/code&gt; should print the file path to the ngrok configuration file; open that in your editor of choice. Alternatively, running &lt;code&gt;ngrok config edit&lt;/code&gt; should open (or create and open) the configuration file in your system's default editor. The ngrok configuration file is a &lt;code&gt;.yml&lt;/code&gt; file. Full documentation for ngrok configuration options can be found in their &lt;a href="https://ngrok.com/docs/agent/config/" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;. For example's sake, here is a basic version that achieves our desired outcome:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: '2'
authtoken: '*********************'
tunnels:
  shopify_backend:
    proto: http
    addr: 4000
    domain: &amp;lt;your-domain&amp;gt;.ngrok-free.app

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can obtain your auth token via the ngrok dashboard. Give the tunnel whatever name you would like; I called mine 'shopify_backend'. With the static domain provisioned and the configuration file saved, you should now be able to start your tunnel by name like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ngrok start shopify_backend

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, you don't have to worry about starting your services in the right order, or updating the environment variable that references your ngrok service every time you start the dependent service.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Unicorn</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Wed, 07 Aug 2024 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/unicorn-4a41</link>
      <guid>https://dev.to/johnmcguin/unicorn-4a41</guid>
      <description>&lt;p&gt;Introducing &lt;a href="https://github.com/johnmcguin/unicorn" rel="noopener noreferrer"&gt;Unicorn&lt;/a&gt;, a monorepo template built on GitHub Actions and Cloudflare to help you stand out in your job applications by deploying a static site for each job application.&lt;/p&gt;

&lt;h2&gt;
  
  
  Create repo
&lt;/h2&gt;

&lt;p&gt;First, you'll need to create your own working repository from the template repository.&lt;/p&gt;

&lt;h2&gt;
  
  
  Clone into repo
&lt;/h2&gt;

&lt;p&gt;Next, clone your new repository to your local dev machine and install dependencies.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Create a site
&lt;/h2&gt;

&lt;p&gt;The easiest way to create a site is to use the &lt;code&gt;npm run new&lt;/code&gt; command, which creates a new site based on the template site that lives at &lt;code&gt;sites/template&lt;/code&gt;. You can consult the help docs by running &lt;code&gt;npm run new -- --help&lt;/code&gt;. Take a minute to read the output.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$ &lt;/span&gt;npm run new &lt;span class="nt"&gt;--&lt;/span&gt; &lt;span class="nt"&gt;--help&lt;/span&gt;
&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; unicorn@0.1.0 new
&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; scripts/new_site.mjs new &lt;span class="nt"&gt;--help&lt;/span&gt;

Usage: Unicorn CLI new &lt;span class="o"&gt;[&lt;/span&gt;options]

Create a new website

Options:
  &lt;span class="nt"&gt;-d&lt;/span&gt;, &lt;span class="nt"&gt;--dir&lt;/span&gt; &amp;lt;name&amp;gt;      name of the directory
  &lt;span class="nt"&gt;-j&lt;/span&gt;, &lt;span class="nt"&gt;--job&lt;/span&gt; &amp;lt;title&amp;gt;     job title you are applying &lt;span class="k"&gt;for&lt;/span&gt;
  &lt;span class="nt"&gt;-c&lt;/span&gt;, &lt;span class="nt"&gt;--company&lt;/span&gt; &amp;lt;name&amp;gt;  name of the company. This will be used &lt;span class="k"&gt;in &lt;/span&gt;website copy.
  &lt;span class="nt"&gt;-h&lt;/span&gt;, &lt;span class="nt"&gt;--help&lt;/span&gt;            display &lt;span class="nb"&gt;help &lt;/span&gt;&lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="nb"&gt;command&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The options themselves should be fairly straightforward. Behind the scenes, the directory flag makes a copy of the template site and creates a new site at &lt;code&gt;sites/&lt;/code&gt;. The job and company flags are used for text substitutions throughout the website’s copy. This should inform the copy that you write for your own version of the template site. The template site is meant to be customized! It is recommended to fully write out your template site to minimize the amount of work required after running the script. When customizing the template, the script makes one assumption: it substitutes all instances of the string “Company” with the value passed to the &lt;code&gt;--company&lt;/code&gt; flag, and all instances of the string “Job Title” with the &lt;code&gt;--job&lt;/code&gt; flag. Write your template accordingly! Alternatively, you can customize the script to suit your exact needs. This is not meant to be a silver bullet but more of a starting point for your own automations of this process.&lt;/p&gt;

&lt;p&gt;When you are done with the &lt;code&gt;npm run new&lt;/code&gt; script, you have a new site! The output states exactly how to run your new site. The only remaining manual steps are to:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a Cloudflare Pages project&lt;/li&gt;
&lt;li&gt;Add an entry to &lt;code&gt;sites.json&lt;/code&gt; where the key is the directory name and the value is the name of the Cloudflare Pages project.&lt;/li&gt;
&lt;li&gt;Customize the site (resume, copy, etc) to the specific job application.&lt;/li&gt;
&lt;li&gt;Push and monitor the action to make sure the deployment was successful. If everything went right, you should now have a new site live.&lt;/li&gt;
&lt;/ol&gt;
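&lt;p&gt;For example, a &lt;code&gt;sites.json&lt;/code&gt; might look like this (the directory and project names are illustrative):&lt;/p&gt;

```json
{
  "acme": "acme-application-site",
  "globex": "globex-application-site"
}
```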

&lt;h2&gt;
  
  
  Deployment
&lt;/h2&gt;

&lt;p&gt;You've developed your website and are ready to deploy it. First, create or log in to your Cloudflare account. In your Cloudflare account, navigate to "Profile" -&amp;gt; "API Tokens" -&amp;gt; "Create Token". Under the "Custom token" section, click to create a custom token. Give the token a name and grant it edit permission for Cloudflare Pages. Copy this token and navigate to your GitHub repository. Add a GitHub secret named 'DEPLOY_TOKEN', pasting the token from the previous step as the value. Add another secret called 'CLOUDFLARE_ACCOUNT_ID'. Obtain its value from the Cloudflare dashboard: navigate to the "Workers and Pages" section in your Cloudflare account, and you should be able to copy the account id from the sidebar of that page. The workflow will use these secrets to deploy your websites to your Cloudflare account.&lt;/p&gt;

&lt;p&gt;While in the Cloudflare dashboard, create your first site. Navigate to "Workers and Pages". Select "create application". Select the "Pages" tab. Select "Create using direct upload". Select "Upload assets", although we will skip the actual upload of assets (the GH Action workflow will do this for us; we just need to create the Pages "project" at this point). Give the project a name, and select "Create project" without uploading assets. Take note of the project name you provided. Update the GitHub action's matrix property with this value; you should update the 'project' key.&lt;/p&gt;

&lt;p&gt;At this point, everything should be set up for a deployment to succeed. To test this out, push your working repository to the GitHub remote and watch the action run.&lt;/p&gt;

&lt;h2&gt;
  
  
  DNS
&lt;/h2&gt;

&lt;p&gt;DNS is out of scope for this post. For my own job applications, I use Cloudflare name servers and deploy each website as a subdomain of a personal domain, which works quite well.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;If you have any issues following this post, please open an issue on GitHub.&lt;/em&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Environment Variables In Shopify Checkout Ui Extension</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Thu, 27 Jun 2024 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/environment-variables-in-shopify-checkout-ui-extension-2i35</link>
      <guid>https://dev.to/johnmcguin/environment-variables-in-shopify-checkout-ui-extension-2i35</guid>
      <description>&lt;p&gt;This post refers&lt;a href="https://shopify.dev/docs/api/checkout-ui-extensions"&gt;Shopify Checkout UI extensions&lt;/a&gt;. See&lt;a href="https://dev.to/blog/authenticated-requests-from-shopify-ui-extensions/"&gt;this post&lt;/a&gt; for a brief introduction to UI extensions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Problem
&lt;/h2&gt;

&lt;p&gt;One big limitation of the Shopify UI extensions is that there is no native way to use environment variables, an &lt;a href="https://12factor.net/config"&gt;important principle&lt;/a&gt; when making apps. My use case, which I imagine is a very common one, is providing my frontend extension code with an api endpoint that will vary across development, staging, and production environments.&lt;/p&gt;

&lt;h2&gt;
  
  
  Solution
&lt;/h2&gt;

&lt;p&gt;The solution I landed on is to hack environment variables into the code at build time. This works by:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;writing a template file containing all of your environment variables&lt;/li&gt;
&lt;li&gt;depending on a generated configuration file in your code which holds your environment variables&lt;/li&gt;
&lt;li&gt;writing a script that generates that runtime file from the template&lt;/li&gt;
&lt;li&gt;calling that script at build time, with your environment variables set for the script to consume&lt;/li&gt;
&lt;li&gt;adding the generated runtime file to your &lt;code&gt;.gitignore&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Template File
&lt;/h3&gt;

&lt;p&gt;First, create a template file. Place this in your extension's &lt;code&gt;src&lt;/code&gt; directory. Mine lives at &lt;code&gt;src/config/config.template.txt&lt;/code&gt;. The contents are: &lt;code&gt;export const API_URL = "$API_URL";&lt;/code&gt;. That's it.&lt;/p&gt;

&lt;h3&gt;
  
  
  Depend on Config File (Source Code)
&lt;/h3&gt;

&lt;p&gt;In your source code which depends on the environment variable, pull in your exported variable(s) from a dependency we'll call &lt;code&gt;config.ts&lt;/code&gt;. This file will live right next to the &lt;code&gt;config.template.txt&lt;/code&gt; file. Don't create this file yourself, though; the script will. My import looks like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { API_URL } from './config/config';

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Script It
&lt;/h3&gt;

&lt;p&gt;Now that the runtime dependency is in place, we need to write a bash script that creates it and populates it with our environment variables. I put my project scripts in a &lt;code&gt;scripts&lt;/code&gt; directory at the root of the project. I gave my script the verbose name &lt;code&gt;setup_ui_extension_environment.sh&lt;/code&gt;. It looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/usr/bin/env bash
# template file
template="extensions/&amp;lt;your-extensions-name-here&amp;gt;/src/config/config.template.txt"
# generated file
config="extensions/&amp;lt;your-extensions-name-here&amp;gt;/src/config/config.ts"
# make sure that the API_URL environment variable is set
if [ -z "${API_URL}" ]; then
    echo -e "\033[0;31mMust set API_URL environment variable\033[0m"
    exit 1
fi

# generate the runtime file based on the template, using envsubst
envsubst &amp;lt; "$template" &amp;gt; "$config"
# output what you did
echo -e "\033[0;32mWrote $API_URL as API_URL from $template to $config\033[0m"

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;API_URL is my only environment variable at the moment, but you can imagine extending this to support many others.&lt;/p&gt;
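&lt;p&gt;One way to extend the guard to several variables is a small helper like this sketch (the function name and any variable beyond API_URL are hypothetical):&lt;/p&gt;

```shell
# Verify each named environment variable is set before templating.
check_env() {
  missing=0
  for var in "$@"; do
    eval "val=\${$var:-}"      # portable indirect variable lookup
    if [ -z "$val" ]; then
      echo "Must set $var environment variable"
      missing=1
    fi
  done
  return "$missing"
}

# Example usage with an illustrative value:
API_URL="https://api.example.com"
check_env API_URL
```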

&lt;h3&gt;
  
  
  Invoke The Script
&lt;/h3&gt;

&lt;p&gt;This guide will only cover development, but the concepts apply across environments and deployments. Make sure to invoke this script when you are building your extension code. My new &lt;code&gt;package.json&lt;/code&gt; &lt;code&gt;dev&lt;/code&gt; command looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"scripts": {
    "dev": "API_URL=https://my-app.ngrok-free.app npm run env:setup &amp;amp;&amp;amp; npm run config:use dev &amp;amp;&amp;amp; shopify app dev",
    "env:setup": "./scripts/setup_ui_extension_environment.sh"
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Don't mind the extra config instructions; the important bits are to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;set the environment variable (i.e. &lt;code&gt;API_URL=https://my-app.ngrok-free.app&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;invoke the script (i.e. &lt;code&gt;npm run env:setup&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;do both before running the Shopify dev environment (i.e. &lt;code&gt;shopify app dev&lt;/code&gt;)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  .gitignore it
&lt;/h3&gt;

&lt;p&gt;It's generally prudent to &lt;code&gt;.gitignore&lt;/code&gt; generated files. We don't want to commit this file because it will vary across environments, and maybe even across developers on your team. It's a build-time concern only.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Authenticated Requests From Shopify UI Extensions</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Sun, 16 Jun 2024 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/authenticated-requests-from-shopify-ui-extensions-1gp1</link>
      <guid>https://dev.to/johnmcguin/authenticated-requests-from-shopify-ui-extensions-1gp1</guid>
      <description>&lt;p&gt;Shopify has a notion of&lt;a href="https://shopify.dev/docs/api/checkout-ui-extensions"&gt;Checkout UI extensions&lt;/a&gt;. These provide hooks, or targets, to write custom user experiences during the checkout flow. For my Shopify app,&lt;a href="https://filevaultpro.co/"&gt;File Vault Pro&lt;/a&gt;, users are able to associate files with specific products and variants. I needed to leverage UI extensions to provide download access during the checkout experience. There would be two targets for my use case, the "Thank You" and "Order Status" pages. This guide will focus on how to make an authorized request from an extension directly to your API. Creating and configuring an extension is beyond the scope of this guide, but there is plenty of documentation to do so. This guide also assumes an app created with the shopify remix template.&lt;/p&gt;

&lt;h2&gt;
  
  
  Configuration
&lt;/h2&gt;

&lt;p&gt;The extensions run in a sandboxed environment and need to be given a couple of permissions. First, from within your partners dashboard, select the app you are developing. From there, select the "API Access" tab. Scroll down and make sure to enable the "Allow network access in checkout and account UI extensions" section. Within your &lt;code&gt;shopify.extension.toml&lt;/code&gt; file, make sure to uncomment the line marked &lt;code&gt;network_access = true&lt;/code&gt;. This gives your extension the "capability" to make network requests.&lt;/p&gt;

&lt;h2&gt;
  
  
  Obtain Token
&lt;/h2&gt;

&lt;p&gt;From within the code of your extension, grab the session token with the hook &lt;code&gt;useSessionToken&lt;/code&gt;. Make sure that the import path is correct. For example, for the "Order Status" page, the import path is &lt;code&gt;@shopify/ui-extensions-react/customer-account&lt;/code&gt;. Consult the documentation to ensure you are using the correct import path for your target. Once you have verified that you can get the session token on the client side, it's time to move to the server.&lt;/p&gt;

&lt;h2&gt;
  
  
  Server
&lt;/h2&gt;

&lt;p&gt;In my stack, the Remix app serves as a Backend for Frontend (BFF) with the main application code running in a separate service. Instead of proxying the request through the Remix app, I opted to go directly to the backend API. This involves a couple of steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;From the partner dashboard, select the app and grab the client secret. Add this as an environment variable to your server. This will be used to verify the incoming JWT passed from the extension.&lt;/li&gt;
&lt;li&gt;This will vary depending on your server, but the gist is that you will want to verify the JWT for any of the routes that you've created for the extension to consume. This will likely take the form of some sort of middleware. Follow &lt;a href="https://shopify.dev/docs/apps/build/authentication-authorization/session-tokens/set-up-session-tokens#verify-the-session-tokens-signature"&gt;the instructions&lt;/a&gt; to validate the JWT. It is highly recommended to use a JWT validation library for this process. I am using Elixir's &lt;a href="https://hexdocs.pm/joken/readme.html"&gt;Joken&lt;/a&gt; library, which greatly simplifies it. Consult &lt;a href="https://jwt.io/libraries"&gt;JWT.io&lt;/a&gt; to find a library for your language.&lt;/li&gt;
&lt;li&gt;CORS - you'll also need to enable CORS for the routes you expose to the extension. Requests come from a dynamic origin, so you cannot rely on a fixed origin in your CORS configuration.&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Integration
&lt;/h2&gt;

&lt;p&gt;Test. At this point, if everything is configured correctly, you should be able to make an authenticated request to your API from the extension. Within the extension code, make a &lt;code&gt;fetch&lt;/code&gt; request, passing the session token as an authorization header.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let res = await fetch(`URL`, {
    headers: {
        Authorization: `Bearer ${token}`,
    },
    mode: 'cors',
});

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Alternatives
&lt;/h2&gt;

&lt;p&gt;I haven't explored these options in depth, but you should be able to call the Remix backend in a similar manner; my guess, unverified, is that an app created with the CLI is configured to verify the JWT automatically. An &lt;a href="https://shopify.dev/docs/api/shopify-app-remix/v2/authenticate/public/app-proxy"&gt;App Proxy&lt;/a&gt; could also be a viable option here.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Docker Dadbod Ui</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Fri, 14 Jun 2024 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/docker-dadbod-ui-4dnb</link>
      <guid>https://dev.to/johnmcguin/docker-dadbod-ui-4dnb</guid>
<description>&lt;p&gt;In normal mode, invoke &lt;code&gt;:DBUI&lt;/code&gt; and establish a new connection with &lt;code&gt;A&lt;/code&gt;. When prompted to enter a connection string, follow the &lt;a href="https://www.postgresql.org/docs/current/libpq-connect.html#LIBPQ-CONNSTRING-URIS"&gt;PG spec&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;e.g. &lt;code&gt;postgresql://&amp;lt;db-user&amp;gt;:&amp;lt;db-password&amp;gt;@localhost:5432/&amp;lt;db-name&amp;gt;&lt;/code&gt;&lt;/p&gt;
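&lt;p&gt;As a quick sketch of how the pieces fit, assuming the container was started with the usual &lt;code&gt;POSTGRES_*&lt;/code&gt; variables (all values below are made up), the connection string assembles like so:&lt;/p&gt;

```shell
# Hypothetical credentials; use whatever your container was started
# with (e.g. POSTGRES_USER / POSTGRES_PASSWORD / POSTGRES_DB).
DB_USER=postgres
DB_PASSWORD=secret
DB_NAME=app_dev

# Assemble the URI that :DBUI prompts for.
CONN="postgresql://${DB_USER}:${DB_PASSWORD}@localhost:5432/${DB_NAME}"
echo "$CONN"
```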

</description>
    </item>
    <item>
      <title>Shell into Containerized PostgreSQL Database</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Fri, 14 Jun 2024 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/shell-into-containerized-postgresql-database-2mig</link>
      <guid>https://dev.to/johnmcguin/shell-into-containerized-postgresql-database-2mig</guid>
      <description>&lt;p&gt;Shell into the running container:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker exec -it &amp;lt;container-id&amp;gt; /bin/bash

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run psql to connect to the database:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;psql -d &amp;lt;db-name&amp;gt; -U &amp;lt;user&amp;gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
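&lt;p&gt;The two steps can also be collapsed into a single command. A sketch with placeholder values (substitute your own container, database, and user):&lt;/p&gt;

```shell
# Placeholder values for illustration only.
CONTAINER=my-postgres
DB_NAME=app_dev
DB_USER=postgres

# -it attaches an interactive psql session directly, skipping bash.
CMD="docker exec -it ${CONTAINER} psql -d ${DB_NAME} -U ${DB_USER}"
echo "$CMD"
```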



</description>
    </item>
    <item>
      <title>Time Travel Directories</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Sun, 14 Jan 2024 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/time-travel-directories-3c90</link>
      <guid>https://dev.to/johnmcguin/time-travel-directories-3c90</guid>
<description>&lt;p&gt;The first time I saw somebody use &lt;code&gt;cd -&lt;/code&gt; my mind was blown. If it's new to you, &lt;code&gt;cd -&lt;/code&gt; will change to your previous directory. The shell also tracks a stack of recent directories, which can be inspected via the builtin &lt;code&gt;dirs&lt;/code&gt; command, and you can interface with the stack directly via &lt;code&gt;pushd&lt;/code&gt; and &lt;code&gt;popd&lt;/code&gt;. This is so cool. With that knowledge, I created a small zsh function to help me time travel to any directory on the stack, instead of just the most recent.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;function travel() {
    local options=()
    while read -r dir; do
        options+=("$dir")
    done &amp;lt; &amp;lt;(dirs -p)

    select dir in "${options[@]}"; do test -n "$dir" &amp;amp;&amp;amp; break; echo "Invalid Selection"; done
    eval cd "$dir"
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When the function is invoked, you're prompted to select one of the directories on the stack, and the shell navigates back to it.&lt;/p&gt;
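&lt;p&gt;To preview the list the function will offer, you can build a stack with &lt;code&gt;pushd&lt;/code&gt; and inspect it with &lt;code&gt;dirs -p&lt;/code&gt;; the paths here are arbitrary:&lt;/p&gt;

```shell
# Build up a small directory stack; travel() presents exactly this list.
cd /tmp
pushd /usr > /dev/null
pushd /etc > /dev/null

# One entry per line, current directory first.
dirs -p
```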

</description>
    </item>
    <item>
      <title>Better Navigation Experience With ZSH cdpath Variable</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Wed, 10 Jan 2024 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/better-navigation-experience-with-zsh-cdpath-variable-2hfo</link>
      <guid>https://dev.to/johnmcguin/better-navigation-experience-with-zsh-cdpath-variable-2hfo</guid>
<description>&lt;p&gt;In *nix systems, the &lt;code&gt;cd&lt;/code&gt; command lets us navigate the filesystem, but there are a few places I visit far more often than others. For me, those are my &lt;code&gt;Projects&lt;/code&gt; directory, my &lt;code&gt;Brain&lt;/code&gt; directory (Obsidian), a &lt;code&gt;Playground&lt;/code&gt; directory where I throw experiments and throwaway code, and my &lt;code&gt;$HOME&lt;/code&gt;. You can experiment with the changes below and add them to your &lt;code&gt;.zshrc&lt;/code&gt; or &lt;code&gt;.bashrc&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Aliases
&lt;/h2&gt;

&lt;p&gt;I use these aliases as navigation helpers to go backward in the filesystem.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;alias ..="cd .."
alias ..2="cd ../.."
alias ..3="cd ../../.."
alias ..4="cd ../../../.."

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These are pretty straightforward: each one goes backward &lt;code&gt;..&lt;/code&gt; n levels. The depth is hard coded, but I rarely need more than that. In combination with the cdpath modification, system navigation becomes beautiful 🤌.&lt;/p&gt;

&lt;h2&gt;
  
  
  cdpath OR $CDPATH
&lt;/h2&gt;

&lt;p&gt;When a &lt;code&gt;cd&lt;/code&gt; is executed, the shell consults the $CDPATH environment variable, which is similar to $PATH in structure and behavior. If it's not set explicitly, &lt;code&gt;cd&lt;/code&gt; simply operates from the current directory. However, you can customize it for much prosperity, good fortune, and good times. I use ZSH, so my example uses ZSH's &lt;code&gt;cdpath&lt;/code&gt;, which takes a list instead of a colon-separated string, but it can easily be adapted to bash or your shell of choice.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cdpath=(. $HOME $HOME/Projects $HOME/Brain $HOME/Playground)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This extends the cdpath to include each of these directories. In practice, that means normal &lt;code&gt;cd&lt;/code&gt; behavior from the current directory still works, but I can also &lt;code&gt;cd&lt;/code&gt; directly into anything under any of the listed paths. For example, if I have a project at &lt;code&gt;$HOME/Projects/blog&lt;/code&gt;, then from anywhere on the system I can &lt;code&gt;cd blog&lt;/code&gt;, or &lt;code&gt;cd Projects/blog&lt;/code&gt;.&lt;/p&gt;
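&lt;p&gt;To see the effect concretely, here's a bash sketch using the string-valued &lt;code&gt;$CDPATH&lt;/code&gt; (zsh's &lt;code&gt;cdpath&lt;/code&gt; array feeds the same mechanism); the directories are invented for the demo:&lt;/p&gt;

```shell
# Stand up a fake project tree.
mkdir -p /tmp/cdpath-demo/Projects/blog

# "." first preserves normal cd behavior; the rest become searchable roots.
export CDPATH=".:/tmp/cdpath-demo:/tmp/cdpath-demo/Projects"

# From anywhere, "blog" now resolves through CDPATH.
cd blog
pwd
```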

</description>
    </item>
    <item>
      <title>How To Use Virtual Environment In Jupyter Notebooks</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Fri, 06 Oct 2023 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/how-to-use-virtual-environment-in-jupyter-notebooks-48o7</link>
      <guid>https://dev.to/johnmcguin/how-to-use-virtual-environment-in-jupyter-notebooks-48o7</guid>
<description>&lt;p&gt;I've written almost no Python but have recently been dabbling a little to explore some of the AI tools such as LangChain. I recently needed to write a web scraper and thought it would be a good time to practice a little Python, since it seems to be a great choice for that domain. My goal was a simple script that would take a base URL, extract some data, paginate, and run the same extraction function until the last page, writing all results to a CSV. After creating the CSV, I wanted to explore the data in a Jupyter Notebook. It took me a while to figure out how to isolate my notebook dependencies to the virtual environment; maybe this post will be found by other beginners trying to do the same.&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating the virtual env
&lt;/h2&gt;

&lt;p&gt;First, I created the virtual env with &lt;code&gt;venv&lt;/code&gt; by running&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python3 -m venv env

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This provided me with a virtual environment for my scraper project. I activated the virtual env by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;source /env/bin/activate

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If these steps were successful so far, you should see the name of your virtual environment in your shell prompt. From here, you should have &lt;code&gt;python&lt;/code&gt; and &lt;code&gt;pip&lt;/code&gt; in your virtual environment's path. Now you can install your dependencies for your project.&lt;/p&gt;
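&lt;p&gt;A quick sanity check that activation worked: the virtual environment's &lt;code&gt;bin&lt;/code&gt; directory should now lead your &lt;code&gt;PATH&lt;/code&gt;. A sketch against a throwaway env (&lt;code&gt;--without-pip&lt;/code&gt; just keeps the demo fast; omit it for real projects):&lt;/p&gt;

```shell
# Create and activate a demo venv at a throwaway path.
python3 -m venv --without-pip /tmp/demo-env
source /tmp/demo-env/bin/activate

# "python" should now resolve inside the venv, not to the system install.
command -v python
```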

&lt;h2&gt;
  
  
  Creating a custom Jupyter kernel
&lt;/h2&gt;

&lt;p&gt;Fast forward to the point where you have a working Python project. You've packaged up your dependencies and now want to run this virtual environment from within a notebook. This is the part that had me stumped for a little while. You likely need to install &lt;code&gt;ipykernel&lt;/code&gt;, a package for creating Python kernels for Jupyter notebooks. Make sure your &lt;code&gt;venv&lt;/code&gt; is activated, then install &lt;code&gt;ipykernel&lt;/code&gt; in that shell&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python -m pip install ipykernel

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once the dependency is installed, you can now proceed to create the custom kernel.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python -m ipykernel install --user --name=env --display-name="&amp;lt;some-better-name-you-will-recognize&amp;gt;"

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Selecting the custom kernel
&lt;/h2&gt;

&lt;p&gt;After launching &lt;code&gt;jupyter notebook&lt;/code&gt;, select "Kernel" -&amp;gt; "Change Kernel"&lt;/p&gt;

&lt;p&gt;If you successfully created your kernel, you should see it listed here under the display name you gave it (in my case, shopify-scraper-env).&lt;/p&gt;

&lt;p&gt;Once you've selected your custom kernel, you should be able to import project dependencies without &lt;code&gt;pip install&lt;/code&gt;ing them globally. If you can't import your dependency, something has likely gone wrong.&lt;/p&gt;

&lt;h2&gt;
  
  
  Debugging
&lt;/h2&gt;

&lt;p&gt;In my case, I had built the kernel outside of the &lt;code&gt;venv&lt;/code&gt;; the creation was successful, but I didn't have access to my packages. For starters, verify that your kernel was created:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;jupyter kernelspec list

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If your kernel was created, it will be listed here along with a path. Navigate to that base path and run &lt;code&gt;ls&lt;/code&gt;; there should be a &lt;code&gt;kernel.json&lt;/code&gt; file. Looking into &lt;code&gt;kernel.json&lt;/code&gt;, you can see which Python executable the kernel uses. It should be the one from your virtual environment; if it's not, repeat the steps to create the kernel from within your &lt;em&gt;activated&lt;/em&gt; &lt;code&gt;venv&lt;/code&gt;. The &lt;code&gt;kernel.json&lt;/code&gt; &lt;em&gt;should&lt;/em&gt; look something like this&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "argv": [
    "&amp;lt;path&amp;gt;/&amp;lt;to&amp;gt;/&amp;lt;your&amp;gt;/&amp;lt;project&amp;gt;/env/bin/python",
    "-m",
    "ipykernel_launcher",
    "-f",
    "{connection_file}"
  ],
  "display_name": "shopify-scraper-env",
  "language": "python",
  "metadata": {
    "debugger": true
  }
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Updating dependencies
&lt;/h2&gt;

&lt;p&gt;If you're successfully importing your virtual environment's dependencies from your notebook, congrats! Now suppose you're tinkering and need to add a new library or two for the investigatory work you're doing in the notebook. No worries: repeat the process as before. After installing the new dependencies, re-run the command that generates the kernel. In my experience, that was enough to be good to go; if it doesn't work automatically for you, try restarting the kernel through the notebook GUI.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Deploy Astro To Github Pages</title>
      <dc:creator>John McGuin</dc:creator>
      <pubDate>Thu, 28 Sep 2023 00:00:00 +0000</pubDate>
      <link>https://dev.to/johnmcguin/deploy-astro-to-github-pages-4mgm</link>
      <guid>https://dev.to/johnmcguin/deploy-astro-to-github-pages-4mgm</guid>
<description>&lt;p&gt;If you are using GitHub to host a public repository, deploying your static Astro site to GitHub Pages is a quick and easy way to get your site online.&lt;/p&gt;

&lt;h2&gt;
  
  
  Create DNS Entry
&lt;/h2&gt;

&lt;p&gt;First, you'll want to add a DNS entry for your subdomain. Navigate to the DNS management dashboard for your domain and add a new &lt;code&gt;CNAME&lt;/code&gt; record whose host name is &lt;code&gt;&amp;lt;your-subdomain&amp;gt;&lt;/code&gt;, mapping it to the canonical name &lt;code&gt;&amp;lt;your-github-username&amp;gt;.github.io&lt;/code&gt;. From the DNS side of things, you should be good to go.&lt;/p&gt;

&lt;h2&gt;
  
  
  Astro Configuration
&lt;/h2&gt;

&lt;p&gt;Next, moving into the Astro project, open your &lt;code&gt;astro.config.ts&lt;/code&gt; file and add a &lt;code&gt;site&lt;/code&gt; property to your config object such that you have&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export default defineConfig({
    site: 'https://&amp;lt;your-full-domain&amp;gt;'
})

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  GitHub Configuration
&lt;/h2&gt;

&lt;p&gt;To get the deployment ready from the GitHub side of things, navigate to your repository -&amp;gt; 'Settings' -&amp;gt; 'Pages'. Under the "Build and deployment" section, make sure to select "GitHub Actions". Under the "Custom domain" section, enter your full domain without the scheme: &lt;code&gt;&amp;lt;your-sub&amp;gt;.&amp;lt;your-second-level&amp;gt;.&amp;lt;your-top-level&amp;gt;&lt;/code&gt; rather than &lt;code&gt;https://&amp;lt;your-sub&amp;gt;.&amp;lt;your-second-level&amp;gt;.&amp;lt;your-top-level&amp;gt;&lt;/code&gt;. GitHub will run a DNS check, but since we have already created the CNAME record, expect this to pass. If it does not, double check the previous steps.&lt;/p&gt;

&lt;h2&gt;
  
  
  Deployment
&lt;/h2&gt;

&lt;p&gt;The final step is to configure a GitHub Action to deploy the project. GitHub Actions is a powerful CI/CD tool built directly into GitHub. It is a deep subject unto itself, but at a high level, specific repo &lt;code&gt;Events&lt;/code&gt; can trigger &lt;code&gt;Workflows&lt;/code&gt;, and a &lt;code&gt;Workflow&lt;/code&gt; is a process composed of one or many &lt;code&gt;Jobs&lt;/code&gt;. In this case, we want to kick off a Pages deployment. GitHub Actions workflows are defined in the &lt;code&gt;.github/workflows&lt;/code&gt; directory, so if it doesn't exist, create it now. Then create a new file in this directory called &lt;code&gt;deploy.yml&lt;/code&gt; and paste in the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: Deploy to GitHub Pages

on:
  push:
    branches: [main]
  workflow_dispatch:

permissions:
  contents: read
  pages: write
  id-token: write

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repo
        uses: actions/checkout@v3
      - name: Install, build, and upload your site
        uses: withastro/action@v0
        with:
          node-version: 18

  deploy:
    needs: build
    runs-on: ubuntu-latest
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    steps:
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v1

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This runs a deployment workflow on each &lt;code&gt;push&lt;/code&gt; event to the &lt;code&gt;main&lt;/code&gt; branch. It composes existing actions to build the project and upload it to GitHub Pages. From here, you should be all set: pushes to &lt;code&gt;main&lt;/code&gt; will deploy your Astro website to your custom subdomain.&lt;/p&gt;

</description>
      <category>howto</category>
      <category>astro</category>
    </item>
  </channel>
</rss>
