<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Kagunda JM</title>
    <description>The latest articles on DEV Community by Kagunda JM (@kagundajm).</description>
    <link>https://dev.to/kagundajm</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F163385%2F94f40f4a-441b-4c8e-90c0-0e34e6c2bc9d.png</url>
      <title>DEV Community: Kagunda JM</title>
      <link>https://dev.to/kagundajm</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/kagundajm"/>
    <language>en</language>
    <item>
      <title>ODK Central Domain Change: A Painless Backup and Restore Tutorial</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Sat, 21 Oct 2023 20:00:00 +0000</pubDate>
      <link>https://dev.to/kagundajm/odk-central-domain-change-a-painless-backup-and-restore-tutorial-4gpk</link>
      <guid>https://dev.to/kagundajm/odk-central-domain-change-a-painless-backup-and-restore-tutorial-4gpk</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;ODK Central is a powerful tool for collecting, managing, and analyzing data using mobile devices. However, there may be situations where you need to move your ODK deployment from one server to another, such as changing hosting providers, upgrading hardware, or switching domains.&lt;/p&gt;

&lt;p&gt;This tutorial will walk you through the steps of creating a direct backup of your ODK Central database, copying the backup file and other essential data to the destination server, and restoring the backup file on the new server. The tutorial will also show you how to update the domain names in the Enketo Redis databases and config files to avoid errors when previewing forms on the new server.&lt;/p&gt;

&lt;p&gt;By following this tutorial, you will be able to migrate your ODK deployment from one server to another with minimal downtime and data loss.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before you begin, ensure that you have the following prerequisites:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;A running instance of ODK Central on the source and destination servers.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Both source and destination servers are running the same version of ODK Central.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Access to the command line interface (CLI) of both source and destination servers.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Sufficient disk space on both servers to store the backup files and other essential data.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you have met these requirements, you will be able to follow this tutorial easily and avoid errors or issues that may arise during the backup and restore process.&lt;/p&gt;
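&lt;p&gt;As a quick pre-flight check, you can confirm the free disk space on each server and compare the ODK Central versions before starting. The commands below are a minimal sketch: the &lt;em&gt;/version.txt&lt;/em&gt; path assumes a standard ODK Central installation that publishes its build information at the web root, and &lt;strong&gt;&lt;code&gt;your-odk-domain-name&lt;/code&gt;&lt;/strong&gt; is a placeholder to replace with your own domain.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Free disk space in the home folder where the backup files will be stored
df -h ~

# ODK Central build information (run against both servers and compare)
curl -s https://your-odk-domain-name/version.txt
&lt;/code&gt;&lt;/pre&gt;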

&lt;h2&gt;
  
  
  Backup ODK Central Main Data on the Source Server
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Log in to your ODK Central source server.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;From the Terminal window, create a folder &lt;em&gt;odk-backups&lt;/em&gt; for the ODK Central backup files in the logged-in user's home folder. You can use a different name that follows your folder naming conventions.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;mkdir&lt;/span&gt; ~/odk-backups
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Run the command below to perform a direct backup of your ODK Central database. Replace &lt;strong&gt;&lt;code&gt;backup-password&lt;/code&gt;&lt;/strong&gt;, &lt;strong&gt;&lt;code&gt;your-odk-username@your-odk-domain-name&lt;/code&gt;&lt;/strong&gt;, &lt;strong&gt;&lt;code&gt;your-odk-domain-name&lt;/code&gt;&lt;/strong&gt;, and &lt;strong&gt;&lt;code&gt;backup-filename.zip&lt;/code&gt;&lt;/strong&gt; placeholder values with actual values for your ODK Central installation.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;  curl &lt;span class="nt"&gt;-X&lt;/span&gt; POST &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Content-Type: application/json"&lt;/span&gt; &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="s1"&gt;'{"passphrase": "backup-password"}'&lt;/span&gt; &lt;span class="nt"&gt;-su&lt;/span&gt; your-odk-username@your-odk-domain-name https://your-odk-domain-name/v1/backup &lt;span class="nt"&gt;--output&lt;/span&gt; ~/odk-backups/backup-filename.zip
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;&lt;strong&gt;&lt;code&gt;{"passphrase": "backup-password"}&lt;/code&gt;&lt;/strong&gt;: password that will be used to encrypt the backup file&lt;br&gt;
&lt;strong&gt;&lt;code&gt;your-odk-username@your-odk-domain-name&lt;/code&gt;&lt;/strong&gt;: Username that you use to log into the ODK Central web interface&lt;br&gt;
&lt;strong&gt;&lt;code&gt;your-odk-domain-name&lt;/code&gt;&lt;/strong&gt;: Registered domain name for your ODK Central installation&lt;br&gt;
&lt;strong&gt;&lt;code&gt;backup-filename.zip&lt;/code&gt;&lt;/strong&gt;: The name that will be assigned to the backup file.&lt;/p&gt;

&lt;p&gt;You will be asked to provide your user password before the command can run. The password you type is the password for the user that you use to log into the ODK Central web interface.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Wait for the backup process to run to completion.&lt;/p&gt;

&lt;p&gt;Below is an example of commands contained in the previous steps.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;root:~#
root:~# &lt;span class="nb"&gt;mkdir &lt;/span&gt;odk-backups
root:~#
root:~# curl &lt;span class="nt"&gt;-X&lt;/span&gt; POST &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Content-Type: application/json"&lt;/span&gt; &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="s1"&gt;'{"passphrase": "bk202309"}'&lt;/span&gt; &lt;span class="nt"&gt;-k&lt;/span&gt;  &lt;span class="nt"&gt;-u&lt;/span&gt; testapp@example.com https://odk-src.kagundajm.codes/v1/backup &lt;span class="nt"&gt;--output&lt;/span&gt; ~/odk-backups/odk-bk20230913-08AM.zip
Enter host password &lt;span class="k"&gt;for &lt;/span&gt;user &lt;span class="s1"&gt;'testapp@example.com'&lt;/span&gt;:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                Dload  Upload   Total   Spent    Left  Speed
100  481M    0  481M    0    26  4819k      0 &lt;span class="nt"&gt;--&lt;/span&gt;:--:--  0:01:42 &lt;span class="nt"&gt;--&lt;/span&gt;:--:-- 10.4M
root:~#
root:~# &lt;span class="nb"&gt;cd &lt;/span&gt;odk-backups/
root:~#
root:~/odk-backups# &lt;span class="nb"&gt;ls&lt;/span&gt; &lt;span class="nt"&gt;-la&lt;/span&gt;
total 493356
drwxr-xr-x  2 root root      4096 Sep 12 05:17 &lt;span class="nb"&gt;.&lt;/span&gt;
drwx------ 10 root root      4096 Sep 12 05:17 ..
&lt;span class="nt"&gt;-rw-r--r--&lt;/span&gt;  1 root root 505183986 Sep 12 05:19 odk-bk20230913-08AM.zip
root:~/odk-backups#
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  List ODK Central Docker Containers
&lt;/h2&gt;

&lt;p&gt;This step is not mandatory, but it helps you understand the source of Docker container names used in this tutorial.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;On the source server, change directory to the ODK Central installation folder&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd &lt;/span&gt;central
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;List the names and states of the ODK Central Docker containers using the following command:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker ps &lt;span class="nt"&gt;--format&lt;/span&gt; &lt;span class="s1"&gt;'table {{.Names}}\t{{.State}}'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;The &lt;strong&gt;&lt;code&gt;--format&lt;/code&gt;&lt;/strong&gt; option limits the output to the required fields and their headers. &lt;strong&gt;&lt;code&gt;\t&lt;/code&gt;&lt;/strong&gt; is a tab character that separates the fields.&lt;/p&gt;

&lt;p&gt;Running the command will output a list similar to the following:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;NAMES                          STATE
central-nginx-1                running
central-service-1              running
central-enketo-1               running
central-pyxform-1              running
central-enketo_redis_cache-1   running
central-mail-1                 running
central-postgres14-1           running
central-enketo_redis_main-1    running
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Backup Enketo Redis Data and Secrets
&lt;/h2&gt;

&lt;p&gt;Redis stores data in memory (RAM), which allows for fast data access and retrieval. The ODK Central installation includes two Redis database instances: &lt;strong&gt;&lt;code&gt;central-enketo_redis_main-1&lt;/code&gt;&lt;/strong&gt; and &lt;strong&gt;&lt;code&gt;central-enketo_redis_cache-1&lt;/code&gt;&lt;/strong&gt;. To back up the Redis databases, you run commands that flush the in-memory data to disk.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Log in to your ODK Central source server if you had logged out.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Change directory to the ODK Central installation folder&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd&lt;/span&gt; ~/central
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Flush in-memory data to disk on the main instance of Enketo Redis database by running the following command.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;exec &lt;/span&gt;central-enketo_redis_main-1 redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6379 save
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;The command instructs Docker to execute the Redis Command Line Interface (CLI) tool (&lt;strong&gt;&lt;code&gt;redis-cli&lt;/code&gt;&lt;/strong&gt;) within the Docker container named &lt;strong&gt;central-enketo_redis_main-1&lt;/strong&gt;, which is listening on port 6379. &lt;strong&gt;&lt;code&gt;save&lt;/code&gt;&lt;/strong&gt; creates a snapshot of the current state of the Redis database and writes it to disk.&lt;/p&gt;

&lt;p&gt;The server should display &lt;strong&gt;OK&lt;/strong&gt; after the command completes and a backup file named &lt;em&gt;enketo-main.rdb&lt;/em&gt; will be created in the &lt;em&gt;/data/&lt;/em&gt; folder of the &lt;strong&gt;&lt;code&gt;central-enketo_redis_main-1&lt;/code&gt;&lt;/strong&gt; container.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Copy the &lt;em&gt;enketo-main.rdb&lt;/em&gt; file from the Docker container to the backup folder you created.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;cp  &lt;/span&gt;central-enketo_redis_main-1:/data/enketo-main.rdb ~/odk-backups/enketo-main.rdb
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Flush in-memory data to disk on the cache instance of Enketo Redis database.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;exec &lt;/span&gt;central-enketo_redis_cache-1 redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6380 save
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;The server should display &lt;strong&gt;OK&lt;/strong&gt; after the command completes and a backup file named &lt;em&gt;enketo-cache.rdb&lt;/em&gt; will be created in the &lt;em&gt;/data/&lt;/em&gt; folder of the &lt;strong&gt;&lt;code&gt;central-enketo_redis_cache-1&lt;/code&gt;&lt;/strong&gt; container.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Copy the &lt;em&gt;enketo-cache.rdb&lt;/em&gt; backup file from the Docker container to the backup folder you created.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;cp  &lt;/span&gt;central-enketo_redis_cache-1:/data/enketo-cache.rdb ~/odk-backups/enketo-cache.rdb
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The &lt;strong&gt;&lt;code&gt;central-service-1&lt;/code&gt;&lt;/strong&gt; container has a &lt;em&gt;/etc/secrets&lt;/em&gt; folder which contains files used to store secret keys for the Enketo server. These keys are used to encrypt and decrypt data. Copy the &lt;em&gt;secrets&lt;/em&gt; folder to the &lt;em&gt;odk-backups&lt;/em&gt; folder.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;cp &lt;/span&gt;central-service-1:/etc/secrets ~/odk-backups
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;The following is an example sequence of the commands and outputs for the above steps.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;  root:~/odk-backups# &lt;span class="nb"&gt;cd&lt;/span&gt; ~/central
  root:~/central#
  root:~/central# docker &lt;span class="nb"&gt;exec &lt;/span&gt;central-enketo_redis_main-1 redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6379 save
  OK
  root:~/central# docker &lt;span class="nb"&gt;cp  &lt;/span&gt;central-enketo_redis_main-1:/data/enketo-main.rdb ~/odk-backups/enketo-main.rdb
  Successfully copied 39.9kB to /root/odk-backups/enketo-main.rdb
  root:~/central#
  root:~/central# docker &lt;span class="nb"&gt;exec &lt;/span&gt;central-enketo_redis_cache-1 redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6380 save
  OK
  root:~/central#
  root:~/central# docker &lt;span class="nb"&gt;cp  &lt;/span&gt;central-enketo_redis_cache-1:/data/enketo-cache.rdb ~/odk-backups/enketo-cache.rdb
  Successfully copied 2.05kB to /root/odk-backups/enketo-cache.rdb
  root:~/central#
  root:~/central# docker &lt;span class="nb"&gt;cp &lt;/span&gt;central-service-1:/etc/secrets ~/odk-backups
  Successfully copied 4.61kB to /root/odk-backups
  root:~/central#
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Copy Enketo config file from Docker container to a text file in the backup folder.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;cp &lt;/span&gt;central-enketo-1:/srv/src/enketo_express/config/config.json ~/odk-backups/enketo-config-keys.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Open the text file using the &lt;strong&gt;nano&lt;/strong&gt; editor.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;nano  ~/odk-backups/enketo-config-keys.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Using the &lt;strong&gt;Control&lt;/strong&gt; + &lt;strong&gt;K&lt;/strong&gt; (&lt;strong&gt;^K&lt;/strong&gt;) keys, remove all key/value pairs in the text file except "encryption key", "less secure encryption key", and "api key". After you have removed all other key/value pairs, your text file should contain entries similar to the following:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="s2"&gt;"encryption key"&lt;/span&gt;: &lt;span class="s2"&gt;"JAUzy4moYjA284rORjMcd2FIQCEdniSx9rmYZrqZSdxwzmG1WY6Apx6QAphgAFc9"&lt;/span&gt;,
&lt;span class="s2"&gt;"less secure encryption key"&lt;/span&gt;: &lt;span class="s2"&gt;"KgyIvGRFtnctP99qGzi0ii1GVg1j453U"&lt;/span&gt;,
&lt;span class="s2"&gt;"api key"&lt;/span&gt;: &lt;span class="s2"&gt;"YJXZRqiOirbSijVROqnh3HGCvuhEwhmKFrkyDkwl4HHIYo0Z3YNitfQH1McNp7srlf1RcvCfXX8vzCVoINW7UgZAIT6jbjR8gxaYJR3wzE8dNGZgdEitwl7cs2abkWUA"&lt;/span&gt;,
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Press &lt;strong&gt;Control&lt;/strong&gt; + &lt;strong&gt;O&lt;/strong&gt; (&lt;strong&gt;^O&lt;/strong&gt;) to save the changes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Press the &lt;strong&gt;ENTER&lt;/strong&gt; key to confirm the file name.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Press &lt;strong&gt;Control&lt;/strong&gt; + &lt;strong&gt;X&lt;/strong&gt; (&lt;strong&gt;^X&lt;/strong&gt;) to close the editor.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Log out of the source server.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
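&lt;p&gt;At this point the backup folder should hold everything needed on the destination server. Listing its contents is a quick way to confirm nothing is missing; the file names below are the ones used in this tutorial, so substitute your own if you chose different names.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Expect the backup zip, enketo-main.rdb, enketo-cache.rdb,
# enketo-config-keys.txt and the secrets/ folder
ls -la ~/odk-backups
&lt;/code&gt;&lt;/pre&gt;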

&lt;h2&gt;
  
  
  Copy Backups From Source Server To Destination Server
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Log in to your ODK Central destination server.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;If you are using a username and password to connect to the source server, run the command below to copy the &lt;em&gt;~/odk-backups&lt;/em&gt; folder to the destination server. You will be prompted to type the user's password before the &lt;code&gt;rsync&lt;/code&gt; command starts the syncing process.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;rsync &lt;span class="nt"&gt;-avz&lt;/span&gt; user@remote_server:~/odk-backups  ~
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;Use the below &lt;code&gt;rsync&lt;/code&gt; command if you use SSH keys to log in to your source server.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;rsync &lt;span class="nt"&gt;-avz&lt;/span&gt; &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="s2"&gt;"ssh -i ssh-key-file"&lt;/span&gt;  user@remote_server:~/odk-backups  ~
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;&lt;strong&gt;&lt;code&gt;-a&lt;/code&gt;&lt;/strong&gt; : Archive mode, which preserves file attributes and permissions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;-v&lt;/code&gt;&lt;/strong&gt; : Verbose output, so you can see the progress.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;-z&lt;/code&gt;&lt;/strong&gt; : Compresses data during transfer, reducing the network bandwidth used.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;-e "ssh -i ssh-key-file"&lt;/code&gt;&lt;/strong&gt; : Tells &lt;code&gt;rsync&lt;/code&gt; to use SSH for connecting to the remote server using  &lt;code&gt;-i ssh-key-file&lt;/code&gt; SSH private key file for authentication.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;code&gt;user@remote_server:~/odk-backups/&lt;/code&gt;&lt;/strong&gt; : &lt;strong&gt;&lt;code&gt;user&lt;/code&gt;&lt;/strong&gt; is the username for the remote server, while &lt;strong&gt;&lt;code&gt;remote_server&lt;/code&gt;&lt;/strong&gt; is the IP address or registered hostname for the remote server. &lt;em&gt;odk-backups&lt;/em&gt; refers to the folder containing the backup files on the remote server.&lt;/p&gt;

&lt;p&gt;Replace &lt;strong&gt;&lt;code&gt;user&lt;/code&gt;&lt;/strong&gt;, &lt;strong&gt;&lt;code&gt;remote_server&lt;/code&gt;&lt;/strong&gt;, and &lt;strong&gt;&lt;code&gt;ssh-key-file&lt;/code&gt;&lt;/strong&gt;  with actual values for your environment.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;A message and prompt similar to the one below may be displayed if SSH does not have a record of the remote server's public key. Make sure you trust the remote server before typing ‘yes’ to continue connecting.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;  The authenticity of host &lt;span class="s1"&gt;'kagundajm.codes (93.184.216.34)'&lt;/span&gt; can&lt;span class="s1"&gt;'t be established.
  ED25519 key fingerprint is SHA256:7dvpverJGLBDWqd8uxsLX9kWgBiZnYcXL07B+w3eIds.
  This key is not known by any other names
  Are you sure you want to continue connecting (yes/no/[fingerprint])? yes
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
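&lt;p&gt;If you want to verify the fingerprint before typing &lt;strong&gt;yes&lt;/strong&gt;, you can print the source server's host key fingerprint from a session you already trust and compare it with the one shown in the prompt. The path below assumes the default ED25519 host key location; adjust it if your server uses a different key type.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Run on the source server; compare the SHA256 value with the prompt
ssh-keygen -lf /etc/ssh/ssh_host_ed25519_key.pub
&lt;/code&gt;&lt;/pre&gt;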



&lt;h2&gt;
  
  
  Restore ODK Central Direct Backup File
&lt;/h2&gt;

&lt;p&gt;The first time ODK Central runs, it creates the &lt;em&gt;/data/transfer&lt;/em&gt; folder on the host server and on the &lt;strong&gt;&lt;code&gt;central-service-1&lt;/code&gt;&lt;/strong&gt; Docker container. The purpose of the folder is to allow the exchange of data between the host server and the Docker container. Any changes made to &lt;em&gt;/data/transfer&lt;/em&gt; in the &lt;strong&gt;&lt;code&gt;central-service-1&lt;/code&gt;&lt;/strong&gt; container will be reflected in &lt;em&gt;/data/transfer&lt;/em&gt; on your host server, and vice versa. To restore the Direct Backup file, you need to place the file in this folder.&lt;/p&gt;
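&lt;p&gt;You can confirm that the folder is shared before moving the backup file. The commands below list &lt;em&gt;/data/transfer&lt;/em&gt; on the host and inside the container; both should show the same contents. Run the second command from the ODK Central installation folder so &lt;code&gt;docker compose&lt;/code&gt; can find its configuration.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# On the host
ls -la /data/transfer
# Inside the central-service-1 container
docker compose exec service ls -la /data/transfer
&lt;/code&gt;&lt;/pre&gt;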

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Log in to your ODK Central destination server if you have logged out.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Move the Direct Backup file to the  &lt;em&gt;/data/transfer&lt;/em&gt;  folder using the command below. Replace &lt;strong&gt;&lt;code&gt;{backup-filename.zip}&lt;/code&gt;&lt;/strong&gt; with the actual name that you used during the backup process.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;mv&lt;/span&gt; ~/odk-backups/&lt;span class="o"&gt;{&lt;/span&gt;backup-filename.zip&lt;span class="o"&gt;}&lt;/span&gt; /data/transfer/
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;If you encounter a &lt;strong&gt;"cannot create regular file: Permission denied"&lt;/strong&gt; error, prefix the command with &lt;strong&gt;&lt;code&gt;sudo&lt;/code&gt;&lt;/strong&gt;.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo mv&lt;/span&gt; ~/odk-backups/&lt;span class="o"&gt;{&lt;/span&gt;backup-filename.zip&lt;span class="o"&gt;}&lt;/span&gt; /data/transfer/
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Change directory to ODK Central installation folder by running the following commands.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nb"&gt;cd &lt;/span&gt;central
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Before running the command below, be aware that the command will replace all data on the destination server with the backup snapshot data. Once you are certain you want to replace the existing data, restore the backup using the following command:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose &lt;span class="nb"&gt;exec &lt;/span&gt;service node /usr/odk/lib/bin/restore.js /data/transfer/&lt;span class="o"&gt;{&lt;/span&gt;backup-filename.zip&lt;span class="o"&gt;}&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt; backup-password&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;The command tells Docker to go into the running container named &lt;strong&gt;&lt;code&gt;service&lt;/code&gt;&lt;/strong&gt; and execute a Node.js script (&lt;em&gt;restore.js&lt;/em&gt;), which expects a backup file (&lt;em&gt;backup-filename.zip&lt;/em&gt;) and a backup password (&lt;em&gt;backup-password&lt;/em&gt;).&lt;/p&gt;

&lt;p&gt;Replace &lt;strong&gt;&lt;code&gt;{backup-filename.zip}&lt;/code&gt;&lt;/strong&gt; with the file name you moved to the &lt;em&gt;/data/transfer/&lt;/em&gt; folder and &lt;strong&gt;&lt;code&gt;{backup-password}&lt;/code&gt;&lt;/strong&gt; with the password used during the backup process.&lt;/p&gt;

&lt;p&gt;If you did not set a backup password during the backup process, press the &lt;strong&gt;ENTER&lt;/strong&gt; key immediately after typing the backup filename. Avoid trailing spaces after the filename &lt;strong&gt;&lt;code&gt;backup-filename.zip&lt;/code&gt;&lt;/strong&gt;.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose &lt;span class="nb"&gt;exec &lt;/span&gt;service node /usr/odk/lib/bin/restore.js /data/transfer/&lt;span class="o"&gt;{&lt;/span&gt;backup-filename.zip&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;If the server displays "&lt;strong&gt;no configuration file provided: not found&lt;/strong&gt;", you are running the command outside the ODK Central installation folder.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Wait for the backup restoration process to complete. The wait time will depend on the size of your backup file. After the restore process runs to completion, a success message will be displayed, including tasks that you need to revisit and verify.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;kagunda:~&lt;span class="err"&gt;$&lt;/span&gt;
kagunda:~&lt;span class="nv"&gt;$ &lt;/span&gt;&lt;span class="nb"&gt;cd&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nb"&gt;cd &lt;/span&gt;central
kagunda:~/central&lt;span class="nv"&gt;$ &lt;/span&gt;docker compose &lt;span class="nb"&gt;exec &lt;/span&gt;service node /usr/odk/lib/bin/restore.js /data/transfer/odk-bk20230913-08AM.zip bk202309
Success. You will have to log out of the site and log back &lt;span class="k"&gt;in&lt;/span&gt;&lt;span class="nb"&gt;.&lt;/span&gt;
    IMPORTANT: EVERYTHING has been restored to the way things were at the &lt;span class="nb"&gt;time &lt;/span&gt;of backup, including:
    &lt;span class="k"&gt;*&lt;/span&gt; all passwords and email addresses.
    &lt;span class="k"&gt;*&lt;/span&gt; anything deleted since the backup was made now exists again.
    &lt;span class="k"&gt;*&lt;/span&gt; your backup settings.
    Please revisit all of these and make sure they are okay.
&lt;span class="s1"&gt;'{"success":true}'&lt;/span&gt;
kagunda:~/central&lt;span class="err"&gt;$&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Restore Enketo Redis Data and Secrets
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Make sure you are still logged in to the destination server and that you are working from the ODK Central installation folder.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Copy the main Enketo Redis database (&lt;em&gt;enketo-main.rdb&lt;/em&gt;) to the Docker container using the following commands:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker stop central-enketo_redis_main-1
docker &lt;span class="nb"&gt;cp&lt;/span&gt; ~/odk-backups/enketo-main.rdb central-enketo_redis_main-1:/data/enketo-main.rdb&lt;span class="p"&gt;;&lt;/span&gt;
docker start central-enketo_redis_main-1&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;Example commands and output:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;kagunda:~/central&lt;span class="err"&gt;$&lt;/span&gt;
kagunda:~/central&lt;span class="nv"&gt;$ &lt;/span&gt;docker stop central-enketo_redis_main-1
central-enketo_redis_main-1
kagunda:~/central&lt;span class="nv"&gt;$ &lt;/span&gt;docker &lt;span class="nb"&gt;cp&lt;/span&gt; ~/odk-backups/enketo-main.rdb central-enketo_redis_main-1:/data/enketo-main.rdb&lt;span class="p"&gt;;&lt;/span&gt;
Successfully copied 39.9kB to central-enketo_redis_main-1:/data/enketo-main.rdb
kagunda:~/central&lt;span class="err"&gt;$&lt;/span&gt;
kagunda:~/central&lt;span class="nv"&gt;$ &lt;/span&gt;docker start central-enketo_redis_main-1&lt;span class="p"&gt;;&lt;/span&gt;
central-enketo_redis_main-1
kagunda:~/central&lt;span class="err"&gt;$&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Copy the Enketo Redis cache database (&lt;em&gt;enketo-cache.rdb&lt;/em&gt;) to the Docker container:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker stop central-enketo_redis_cache-1
docker &lt;span class="nb"&gt;cp&lt;/span&gt; ~/odk-backups/enketo-cache.rdb central-enketo_redis_cache-1:/data/enketo-cache.rdb&lt;span class="p"&gt;;&lt;/span&gt;
docker start central-enketo_redis_cache-1&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Copy the &lt;em&gt;secrets&lt;/em&gt; folder to the &lt;strong&gt;&lt;code&gt;central-service-1&lt;/code&gt;&lt;/strong&gt; Docker container:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker stop central-service-1
docker &lt;span class="nb"&gt;cp&lt;/span&gt; ~/odk-backups/secrets central-service-1:/etc&lt;span class="p"&gt;;&lt;/span&gt;
docker start central-service-1&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;
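&lt;p&gt;To confirm that the restored Redis data was picked up after the containers restarted, you can ask each instance for the number of keys it now holds. A non-zero count on the main instance suggests the Enketo records came across; the exact numbers will vary with your installation.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker exec central-enketo_redis_main-1 redis-cli -p 6379 dbsize
docker exec central-enketo_redis_cache-1 redis-cli -p 6380 dbsize
&lt;/code&gt;&lt;/pre&gt;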

&lt;h2&gt;
  
  
  Update Destination Server Enketo Config File
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Copy the config keys text file to the &lt;em&gt;config&lt;/em&gt; folder in the Enketo Docker container.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;cp&lt;/span&gt;  ~/odk-backups/enketo-config-keys.txt central-enketo-1:/srv/src/enketo_express/config/
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Open a Bash shell in the Enketo Docker container.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;  docker &lt;span class="nb"&gt;exec&lt;/span&gt; &lt;span class="nt"&gt;-it&lt;/span&gt; central-enketo-1 /bin/bash
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Install &lt;strong&gt;nano&lt;/strong&gt; editor.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;apt-get &lt;span class="nb"&gt;install &lt;/span&gt;nano &lt;span class="nt"&gt;-y&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;The &lt;strong&gt;&lt;code&gt;-y&lt;/code&gt;&lt;/strong&gt; option assumes "yes" as the answer to any prompts that may occur during the installation process.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Open the Enketo config file with the &lt;strong&gt;nano&lt;/strong&gt; editor.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;nano config/config.json&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Prefix "encryption key", "less secure encryption key", and "api key" keys with &lt;strong&gt;#&lt;/strong&gt; (or any other character). This will help you differentiate between the existing keys from the new keys when updating.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;  &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"app name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Enketo"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"base path"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"-"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="err"&gt;#&lt;/span&gt;&lt;span class="nl"&gt;"encryption key"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"contents of enketo-secret"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"id length"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;31&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="err"&gt;#&lt;/span&gt;&lt;span class="nl"&gt;"less secure encryption key"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"contents of enketo-less-secret"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"linked form and data server"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="err"&gt;#&lt;/span&gt;&lt;span class="nl"&gt;"api key"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"contents of enketo-api-key"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="err"&gt;...&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Move to the beginning of the file and press the &lt;strong&gt;ENTER&lt;/strong&gt; key after the opening curly brace (&lt;strong&gt;{&lt;/strong&gt;) to create a blank line.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Press &lt;strong&gt;Control + R&lt;/strong&gt; (&lt;strong&gt;^R&lt;/strong&gt;) keys. The editor will display a &lt;strong&gt;"File to insert [from ./]:"&lt;/strong&gt; prompt. Type &lt;em&gt;config/enketo-config-keys.txt&lt;/em&gt; next to the prompt and press the &lt;strong&gt;ENTER&lt;/strong&gt; key to insert the contents of the file at the cursor location.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Move the cursor to the "encryption key" line of the data that has been inserted to the file.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Press  &lt;strong&gt;Control&lt;/strong&gt; + &lt;strong&gt;K&lt;/strong&gt; (&lt;strong&gt;^K&lt;/strong&gt;) keys to cut the line.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Move the cursor to the beginning of "encryption key" prefixed with a &lt;strong&gt;#&lt;/strong&gt; character.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Press  &lt;strong&gt;Control&lt;/strong&gt; + &lt;strong&gt;U&lt;/strong&gt; (&lt;strong&gt;^U&lt;/strong&gt;) keys to paste.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Repeat the previous three steps to update "less secure encryption key" and "api key"  values.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Remove the lines prefixed with &lt;strong&gt;#&lt;/strong&gt; by moving to each line and pressing &lt;strong&gt;Control&lt;/strong&gt; + &lt;strong&gt;K&lt;/strong&gt; (&lt;strong&gt;^K&lt;/strong&gt;).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Save the changes and exit the &lt;strong&gt;nano&lt;/strong&gt; editor.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The &lt;strong&gt;nano&lt;/strong&gt; editor was installed temporarily. Uninstall it and exit the Enketo container using the following commands.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;apt-get remove nano &lt;span class="nt"&gt;-y&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="nb"&gt;exit&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Restart the Enketo Docker container to effect the changes in the configuration file.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker restart central-enketo-1
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Optionally, display the state of the Docker containers.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker ps &lt;span class="nt"&gt;--format&lt;/span&gt; &lt;span class="s1"&gt;'table {{.Names}}\t{{.State}}'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Change Domain Names
&lt;/h2&gt;

&lt;p&gt;If the ODK Central domain names on the source and destination servers are different, you will encounter the following error when you attempt to preview restored forms on the destination server.&lt;/p&gt;

&lt;blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;This form does not exist or you no longer have access to it. Please check the URL for any missing characters.&lt;br&gt;
If the form existed previously, it may have been archived, disabled or deleted. If this is unexpected, please contact the person who asked you to fill the form.&lt;br&gt;
(Attempted to access form with ID: )&lt;/p&gt;
&lt;/blockquote&gt;


&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyt1c2xm9ii2eouvc7l5q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyt1c2xm9ii2eouvc7l5q.png" alt="ODK Central form does not exist error."&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To resolve the error, you must replace the old domain values with the new domain values in the Redis databases.&lt;/p&gt;

&lt;h3&gt;
  
  
  Create Domain Update Script Files
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;On the destination server, create &lt;em&gt;~/main-odk-domains.sh&lt;/em&gt; script file.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;nano ~/main-odk-domains.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Update the file with the following:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/sh&lt;/span&gt;

&lt;span class="nv"&gt;OLD_DOMAIN&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"old-domain.com"&lt;/span&gt;
&lt;span class="nv"&gt;NEW_DOMAIN&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"new-domain.com"&lt;/span&gt;

&lt;span class="nv"&gt;keys&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6379 KEYS  &lt;span class="s2"&gt;"*"&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;for &lt;/span&gt;key &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="nv"&gt;$keys&lt;/span&gt;
&lt;span class="k"&gt;do
  if&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6379 &lt;span class="nb"&gt;type&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"string"&lt;/span&gt; &lt;span class="o"&gt;]&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
    if &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; | &lt;span class="nb"&gt;grep&lt;/span&gt; &lt;span class="nt"&gt;-q&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$OLD_DOMAIN&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
      &lt;/span&gt;&lt;span class="nv"&gt;new_key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; | &lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="s2"&gt;"s/&lt;/span&gt;&lt;span class="nv"&gt;$OLD_DOMAIN&lt;/span&gt;&lt;span class="s2"&gt;/&lt;/span&gt;&lt;span class="nv"&gt;$NEW_DOMAIN&lt;/span&gt;&lt;span class="s2"&gt;/g"&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;
      redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6379  RENAME &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$new_key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    &lt;span class="k"&gt;fi
  elif&lt;/span&gt; &lt;span class="o"&gt;[&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6379 &lt;span class="nb"&gt;type&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"hash"&lt;/span&gt; &lt;span class="o"&gt;]&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
    &lt;/span&gt;&lt;span class="nv"&gt;existingRosaServer&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6379  HGET &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="s2"&gt;"openRosaServer"&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$existingRosaServer&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; | &lt;span class="nb"&gt;grep&lt;/span&gt; &lt;span class="nt"&gt;-q&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$OLD_DOMAIN&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
      &lt;/span&gt;&lt;span class="nv"&gt;newRosaServer&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$existingRosaServer&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; | &lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="s2"&gt;"s/&lt;/span&gt;&lt;span class="nv"&gt;$OLD_DOMAIN&lt;/span&gt;&lt;span class="s2"&gt;/&lt;/span&gt;&lt;span class="nv"&gt;$NEW_DOMAIN&lt;/span&gt;&lt;span class="s2"&gt;/g"&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;
      redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6379 HSET &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="s2"&gt;"openRosaServer"&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$newRosaServer&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    &lt;span class="k"&gt;fi

    if &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; | &lt;span class="nb"&gt;grep&lt;/span&gt; &lt;span class="nt"&gt;-q&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$OLD_DOMAIN&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;then
      &lt;/span&gt;&lt;span class="nv"&gt;new_key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; | &lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="s2"&gt;"s/&lt;/span&gt;&lt;span class="nv"&gt;$OLD_DOMAIN&lt;/span&gt;&lt;span class="s2"&gt;/&lt;/span&gt;&lt;span class="nv"&gt;$NEW_DOMAIN&lt;/span&gt;&lt;span class="s2"&gt;/g"&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;
      redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6379  RENAME &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$new_key&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
    &lt;span class="k"&gt;fi
  fi
done&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;The script fetches all keys from the Redis database and iterates over them. For each string key, it substitutes any occurrence of the old domain with the new domain and renames the key in the database. Where the key type is &lt;strong&gt;hash&lt;/strong&gt;, the value of the &lt;strong&gt;openRosaServer&lt;/strong&gt; field is also updated to the new domain before the key itself is renamed.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;sed&lt;/strong&gt; command is used to substitute old domain values with new domain values. The &lt;strong&gt;&lt;code&gt;g&lt;/code&gt;&lt;/strong&gt; at the end of the command tells &lt;strong&gt;sed&lt;/strong&gt; to replace all occurrences of OLD_DOMAIN in the key.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;&lt;code&gt;grep -q&lt;/code&gt;&lt;/strong&gt; command searches the key for the old domain value and exits with a success or failure status. The &lt;strong&gt;&lt;code&gt;-q&lt;/code&gt;&lt;/strong&gt; option suppresses the output.&lt;/p&gt;
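&lt;p&gt;In isolation, the test-and-substitute logic works like this (the key value below is a hypothetical example, not taken from the database)&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;key="or:https://old-domain.com/abcd"
if echo "$key" | grep -q "old-domain.com"; then
  new_key=$(echo "$key" | sed "s/old-domain.com/new-domain.com/g")
  echo "$new_key"   # prints or:https://new-domain.com/abcd
fi
&lt;/code&gt;&lt;/pre&gt;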
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Save the file and exit the &lt;strong&gt;nano&lt;/strong&gt; editor.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Make the script file executable&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;chmod&lt;/span&gt; +x  ~/main-odk-domains.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Copy &lt;em&gt;main-odk-domains.sh&lt;/em&gt; to &lt;em&gt;cache-odk-domains.sh&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cp&lt;/span&gt; ~/main-odk-domains.sh ~/cache-odk-domains.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open &lt;em&gt;cache-odk-domains.sh&lt;/em&gt; with a text editor and replace any occurrences of &lt;strong&gt;&lt;code&gt;redis-cli -p 6379&lt;/code&gt;&lt;/strong&gt; with  &lt;strong&gt;&lt;code&gt;redis-cli -p 6380&lt;/code&gt;&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Save the changes and exit the editor.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Update Enketo Redis Main Database
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Copy the &lt;em&gt;main-odk-domains.sh&lt;/em&gt; script file to the main Enketo Redis database Docker container.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;cp&lt;/span&gt; ~/main-odk-domains.sh central-enketo_redis_main-1:/root
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Open a Bourne shell in the main Enketo Redis database container.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;exec&lt;/span&gt; &lt;span class="nt"&gt;-it&lt;/span&gt; central-enketo_redis_main-1 sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Run the script file&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;/root/main-odk-domains.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Optionally, confirm the update was successful by listing keys that contain the new domain (replace &lt;em&gt;new-domain.com&lt;/em&gt; with your actual domain)&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;redis-cli &lt;span class="nt"&gt;-p&lt;/span&gt; 6379 KEYS &lt;span class="s2"&gt;"*new-domain.com*"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Exit the Docker container&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;exit&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Update Enketo Redis Cache Database
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Copy the &lt;em&gt;cache-odk-domains.sh&lt;/em&gt; script file to the cache Enketo Redis database Docker container.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt; docker &lt;span class="nb"&gt;cp&lt;/span&gt; ~/cache-odk-domains.sh central-enketo_redis_cache-1:/root
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Open a Bourne shell in the cache Enketo Redis database container&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker &lt;span class="nb"&gt;exec&lt;/span&gt; &lt;span class="nt"&gt;-it&lt;/span&gt; central-enketo_redis_cache-1 sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Run the script file&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;/root/cache-odk-domains.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Exit the Docker container&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;exit&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This tutorial has shown you how to back up and restore ODK Central data. You have learned how to create a direct backup of your ODK Central database, copy the backup file and other essential data to the destination server, and restore the backup on the new server. You have also learned how to update the domain names in the Enketo Redis databases and config files to avoid errors when previewing forms on the new server.&lt;/p&gt;

&lt;p&gt;If you found this article useful, please share it with your friends and colleagues who might benefit from it.&lt;/p&gt;

&lt;p&gt;Thank you for reading!&lt;/p&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://forum.getodk.org/t/central-backup-restore/33047/10" rel="noopener noreferrer"&gt;Central Backup Restore&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://docs.getodk.org/central-backup/#restoring-a-backup" rel="noopener noreferrer"&gt;Restoring a backup&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://forum.getodk.org/t/backup-api-call-fails-after-several-minutes-with-err-stream-premature-close/40924/2" rel="noopener noreferrer"&gt;Backup api call fails after several minutes with ERR_STREAM_PREMATURE_CLOSE&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://forum.getodk.org/t/failed-to-restore-odk-central-using-a-downloaded-backup-file/40785/5" rel="noopener noreferrer"&gt;Failed to restore ODK central using a downloaded backup file&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://forum.getodk.org/t/trying-to-migrate-my-odk-deployment-and-its-data-from-aws-to-digital-ocean/39404/2" rel="noopener noreferrer"&gt;Trying to migrate my ODK deployment and its data from AWS to Digital Ocean&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://forum.getodk.org/t/im-running-central-on-ubuntu-and-want-to-migrate-to-another-server/40212/2" rel="noopener noreferrer"&gt;I’m running Central on Ubuntu and want to migrate to another server&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>docker</category>
      <category>odk</category>
      <category>backup</category>
      <category>bash</category>
    </item>
    <item>
      <title>How to Transform SQL Queries to Crosstabs in PostgreSQL</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Wed, 08 Mar 2023 18:59:34 +0000</pubDate>
      <link>https://dev.to/kagundajm/how-to-transform-sql-queries-to-crosstabs-in-postgresql-1ma6</link>
      <guid>https://dev.to/kagundajm/how-to-transform-sql-queries-to-crosstabs-in-postgresql-1ma6</guid>
      <description>&lt;p&gt;A SELECT SQL query retrieves data from a database in a tabular form. The first row, or header row, has the column names, and all the other rows have the data that was retrieved.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fted00nndri7ivfz0g1d8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fted00nndri7ivfz0g1d8.png" alt="Listing of salesmen monthly sales data"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In a crosstab, the data is condensed and the names of one or more columns are rotated. You can add row and column totals to a crosstab. For example, a list of salesmen's monthly sales may include the months in the column headers. By rotating the data, it is easier to read and understand how the facts relate to one another.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs31qo6xzt67yr7c98a48.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs31qo6xzt67yr7c98a48.png" alt="Sample crosstab"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Other names you might come across for crosstabs are matrix reports, PIVOT for SQL Server databases, and pivot tables in spreadsheets such as Microsoft Excel, Google Sheets, and LibreOffice.&lt;/p&gt;

&lt;p&gt;The following topics are covered in this post:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Crosstabs using CASE conditional expression&lt;/li&gt;
&lt;li&gt;Using Common Table Expressions (CTE) for a crosstab&lt;/li&gt;
&lt;li&gt;Crosstabs using aggregate FILTER clause&lt;/li&gt;
&lt;li&gt;Using the &lt;code&gt;crosstab()&lt;/code&gt; function in PostgreSQL&lt;/li&gt;
&lt;li&gt;Using a PostgreSQL &lt;code&gt;crosstab()&lt;/code&gt; function with more than three columns&lt;/li&gt;
&lt;li&gt;Using ARRAY data type to re-arrange extra crosstab columns&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This post uses queries from a database populated with data from the &lt;a href="https://github.com/pthom/northwind_psql" rel="noopener noreferrer"&gt;Northwind database for Postgres&lt;/a&gt;.  The queries are run using &lt;a href="https://dbeaver.io/" rel="noopener noreferrer"&gt;DBeaver&lt;/a&gt; SQL client.&lt;/p&gt;

&lt;h2&gt;
  
  
  Crosstabs Using CASE Conditional Expression &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;The SQL CASE expression enables you to choose a value depending on a condition, much like an &lt;strong&gt;if-then-else&lt;/strong&gt; conditional statement. The syntax for the CASE expression is as follows:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;

&lt;span class="k"&gt;CASE&lt;/span&gt; &lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="n"&gt;condition&lt;/span&gt; &lt;span class="k"&gt;THEN&lt;/span&gt; &lt;span class="k"&gt;result&lt;/span&gt;
     &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="k"&gt;WHEN&lt;/span&gt; &lt;span class="p"&gt;...]&lt;/span&gt;
     &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="k"&gt;ELSE&lt;/span&gt; &lt;span class="k"&gt;result&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="k"&gt;END&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;If a &lt;code&gt;condition&lt;/code&gt; evaluates to &lt;code&gt;true&lt;/code&gt;, the corresponding &lt;code&gt;result&lt;/code&gt; value is chosen; otherwise, the &lt;code&gt;result&lt;/code&gt; value in the optional &lt;strong&gt;ELSE&lt;/strong&gt; expression is chosen.&lt;/p&gt;

&lt;p&gt;In the example below, the SQL CASE statement is used to make a crosstab of monthly total sales by employees for the first four months of 1997.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
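&lt;p&gt;A query of the following shape produces such a crosstab. It is a sketch against the &lt;code&gt;orders&lt;/code&gt;, &lt;code&gt;order_details&lt;/code&gt;, and &lt;code&gt;employees&lt;/code&gt; tables of the Northwind for Postgres schema, not necessarily the exact query used for the screenshots.&lt;/p&gt;

&lt;pre class="highlight sql"&gt;&lt;code&gt;SELECT
    e.first_name || ' ' || e.last_name AS salesman,
    SUM(CASE WHEN EXTRACT(MONTH FROM o.order_date) = 1
             THEN od.unit_price * od.quantity ELSE 0 END) AS "Jan",
    SUM(CASE WHEN EXTRACT(MONTH FROM o.order_date) = 2
             THEN od.unit_price * od.quantity ELSE 0 END) AS "Feb",
    SUM(CASE WHEN EXTRACT(MONTH FROM o.order_date) = 3
             THEN od.unit_price * od.quantity ELSE 0 END) AS "Mar",
    SUM(CASE WHEN EXTRACT(MONTH FROM o.order_date) = 4
             THEN od.unit_price * od.quantity ELSE 0 END) AS "Apr"
FROM orders o
JOIN order_details od ON od.order_id = o.order_id
JOIN employees e ON e.employee_id = o.employee_id
WHERE o.order_date &gt;= '1997-01-01' AND o.order_date &lt; '1997-05-01'
GROUP BY salesman
ORDER BY salesman;
&lt;/code&gt;&lt;/pre&gt;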



&lt;p&gt;If you run the query above, you will get a crosstab like the one below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5yrk210yfxsk7d1yqdm7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5yrk210yfxsk7d1yqdm7.png" alt="employees monthly total sales crosstab during the first four months of 1997"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can combine the previous query with a &lt;strong&gt;UNION ALL&lt;/strong&gt; operator and the following &lt;strong&gt;SQL SELECT&lt;/strong&gt; query to include the total for each month in the cross-tab.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyy8d35ql3j8qnv5lcels.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyy8d35ql3j8qnv5lcels.png" alt="employees monthly total sales crosstab during the first four months 1997 including month totals using SQL CASE expression"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The SQL CASE expression is supported by most database systems.&lt;/p&gt;

&lt;h2&gt;
  
  
  Using Common Table Expressions (CTE) For A Crosstab &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;Common Table Expressions (CTEs) in SQL let you define a named temporary result set within a SELECT SQL statement. This named result set can then be referenced in subsequent SELECT, INSERT, UPDATE, or DELETE SQL statements. CTEs simplify SQL queries by breaking them down into smaller, more manageable parts that are easier to read and understand. In PostgreSQL, CTEs are called &lt;a href="https://www.postgresql.org/docs/current/queries-with.html" rel="noopener noreferrer"&gt;WITH queries&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;You can rewrite the previous CASE crosstab query using a CTE as follows:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
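&lt;p&gt;A sketch of such a CTE rewrite, assuming the &lt;code&gt;orders&lt;/code&gt;, &lt;code&gt;order_details&lt;/code&gt;, and &lt;code&gt;employees&lt;/code&gt; tables of the Northwind for Postgres schema: the WITH query gathers the raw monthly amounts, and the outer SELECT pivots them.&lt;/p&gt;

&lt;pre class="highlight sql"&gt;&lt;code&gt;WITH monthly_sales AS (
    SELECT
        e.first_name || ' ' || e.last_name AS salesman,
        EXTRACT(MONTH FROM o.order_date)   AS order_month,
        od.unit_price * od.quantity        AS amount
    FROM orders o
    JOIN order_details od ON od.order_id = o.order_id
    JOIN employees e ON e.employee_id = o.employee_id
    WHERE o.order_date &gt;= '1997-01-01' AND o.order_date &lt; '1997-05-01'
)
SELECT
    salesman,
    SUM(CASE WHEN order_month = 1 THEN amount ELSE 0 END) AS "Jan",
    SUM(CASE WHEN order_month = 2 THEN amount ELSE 0 END) AS "Feb",
    SUM(CASE WHEN order_month = 3 THEN amount ELSE 0 END) AS "Mar",
    SUM(CASE WHEN order_month = 4 THEN amount ELSE 0 END) AS "Apr"
FROM monthly_sales
GROUP BY salesman
ORDER BY salesman;
&lt;/code&gt;&lt;/pre&gt;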


&lt;h2&gt;
  
  
  Crosstabs Using Aggregate FILTER Clause &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;From PostgreSQL 9.4, you can use the &lt;a href="https://www.postgresql.org/docs/current/sql-expressions.html#SYNTAX-AGGREGATES" rel="noopener noreferrer"&gt;FILTER clause&lt;/a&gt; to perform  &lt;a href="https://www.postgresql.org/docs/15/tutorial-agg.html" rel="noopener noreferrer"&gt;aggregate functions&lt;/a&gt; on specific records. The FILTER clause is less wordy and has a cleaner syntax than the CASE statement.&lt;/p&gt;

&lt;p&gt;The following SQL query uses the FILTER clause to generate a crosstab.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
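&lt;p&gt;A sketch of such a FILTER query, again assuming the &lt;code&gt;orders&lt;/code&gt;, &lt;code&gt;order_details&lt;/code&gt;, and &lt;code&gt;employees&lt;/code&gt; tables of the Northwind for Postgres schema:&lt;/p&gt;

&lt;pre class="highlight sql"&gt;&lt;code&gt;SELECT
    e.first_name || ' ' || e.last_name AS salesman,
    COALESCE(SUM(od.unit_price * od.quantity)
             FILTER (WHERE EXTRACT(MONTH FROM o.order_date) = 1), 0) AS "Jan",
    COALESCE(SUM(od.unit_price * od.quantity)
             FILTER (WHERE EXTRACT(MONTH FROM o.order_date) = 2), 0) AS "Feb",
    COALESCE(SUM(od.unit_price * od.quantity)
             FILTER (WHERE EXTRACT(MONTH FROM o.order_date) = 3), 0) AS "Mar",
    COALESCE(SUM(od.unit_price * od.quantity)
             FILTER (WHERE EXTRACT(MONTH FROM o.order_date) = 4), 0) AS "Apr"
FROM orders o
JOIN order_details od ON od.order_id = o.order_id
JOIN employees e ON e.employee_id = o.employee_id
WHERE o.order_date &gt;= '1997-01-01' AND o.order_date &lt; '1997-05-01'
GROUP BY salesman
ORDER BY salesman;
&lt;/code&gt;&lt;/pre&gt;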


&lt;p&gt;The SQL &lt;strong&gt;COALESCE&lt;/strong&gt; function replaces any null values in the crosstab with zero (0) values.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftjbqz51iux46skpx44rg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftjbqz51iux46skpx44rg.png" alt="employees monthly total sales crosstab during the first four months of 1997 including month totals using aggregate FILTER clause"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Using PostgreSQL &lt;code&gt;crosstab()&lt;/code&gt; Function &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;The &lt;code&gt;crosstab()&lt;/code&gt; function is part of the optional &lt;a href="https://www.postgresql.org/docs/current/tablefunc.html" rel="noopener noreferrer"&gt;&lt;code&gt;tablefunc&lt;/code&gt;&lt;/a&gt; module.&lt;/p&gt;

&lt;p&gt;You can run the &lt;code&gt;SELECT COUNT(*) FROM pg_extension WHERE extname='tablefunc';&lt;/code&gt; query to check whether the &lt;code&gt;tablefunc&lt;/code&gt; extension is installed on the database you are using. If the result of the query is 0, install and activate the &lt;code&gt;tablefunc&lt;/code&gt; extension using the &lt;code&gt;CREATE EXTENSION IF NOT EXISTS tablefunc;&lt;/code&gt; SQL command. The &lt;code&gt;tablefunc&lt;/code&gt; module can be installed by non-superusers who have the &lt;code&gt;CREATE&lt;/code&gt; privilege.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;crosstab&lt;/code&gt; function comes in several forms: &lt;code&gt;crosstab(sql text)&lt;/code&gt;, &lt;code&gt;crosstabN(sql text)&lt;/code&gt;, and &lt;code&gt;crosstab(source_sql text, category_sql text)&lt;/code&gt;. Examples in this post use the &lt;code&gt;crosstab(source_sql text, category_sql text)&lt;/code&gt; form.&lt;/p&gt;

&lt;p&gt;The first parameter (&lt;code&gt;source_sql&lt;/code&gt;) of the &lt;code&gt;crosstab(source_sql text, category_sql text)&lt;/code&gt; function is the source SQL &lt;strong&gt;SELECT&lt;/strong&gt; statement and must return at least three (3) columns. The first column (&lt;code&gt;row_name&lt;/code&gt;) contains the values used as row identifiers in the final result; the second column (&lt;code&gt;category&lt;/code&gt;) contains the category values that are rotated into the column headers of the pivot table, and the third column (&lt;code&gt;value&lt;/code&gt;) contains the data assigned to each cell of the final crosstab. The second parameter (&lt;code&gt;category_sql&lt;/code&gt;) is a query that returns the list of categories to use as columns.&lt;/p&gt;

&lt;p&gt;The crosstab function returns a set of records with unknown data types. Therefore, you must alias the returned columns with column names and types using the &lt;code&gt;AS (col1 type, col2 type, ...)&lt;/code&gt; clause. Failing to alias the columns will cause &lt;strong&gt;a column definition list is required for functions returning "record"&lt;/strong&gt; error.&lt;/p&gt;

&lt;p&gt;The SQL query below shows how to use the &lt;code&gt;crosstab(source_sql text, category_sql text)&lt;/code&gt; function. A dollar-quoted string constant (&lt;code&gt;$$&lt;/code&gt;) has been used to maintain formatting consistency and remove the need to escape single quotes (') by doubling them.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
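&lt;p&gt;A sketch of such a &lt;code&gt;crosstab()&lt;/code&gt; call, assuming the &lt;code&gt;orders&lt;/code&gt;, &lt;code&gt;order_details&lt;/code&gt;, and &lt;code&gt;employees&lt;/code&gt; tables of the Northwind for Postgres schema; &lt;code&gt;generate_series(1, 4)&lt;/code&gt; supplies the month categories, and the &lt;code&gt;AS ct (...)&lt;/code&gt; clause provides the required column definition list:&lt;/p&gt;

&lt;pre class="highlight sql"&gt;&lt;code&gt;SELECT *
FROM crosstab(
    $$
    SELECT
        e.first_name || ' ' || e.last_name AS salesman,
        EXTRACT(MONTH FROM o.order_date)   AS order_month,
        SUM(od.unit_price * od.quantity)   AS total_sales
    FROM orders o
    JOIN order_details od ON od.order_id = o.order_id
    JOIN employees e ON e.employee_id = o.employee_id
    WHERE o.order_date &gt;= '1997-01-01' AND o.order_date &lt; '1997-05-01'
    GROUP BY salesman, order_month
    ORDER BY salesman, order_month
    $$,
    $$ SELECT m FROM generate_series(1, 4) AS m $$
) AS ct (salesman text, "Jan" numeric, "Feb" numeric, "Mar" numeric, "Apr" numeric);
&lt;/code&gt;&lt;/pre&gt;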


&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffbn1oy8u9gr4ha72enc4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffbn1oy8u9gr4ha72enc4.png" alt="employees monthly total sales crosstab during the first four months of 1997 using the crosstab() function"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Using A PostgreSQL &lt;code&gt;crosstab()&lt;/code&gt; Function With More Than Three Columns &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;If your &lt;code&gt;source_sql&lt;/code&gt; SQL query returns more than three columns, the extra columns must be placed between the &lt;code&gt;row_name&lt;/code&gt; and &lt;code&gt;category&lt;/code&gt; columns. In the previous crosstab SQL query, any extra columns would be placed between the &lt;code&gt;salesman&lt;/code&gt; and &lt;code&gt;order_month&lt;/code&gt; columns.&lt;/p&gt;

&lt;p&gt;PostgreSQL &lt;a href="https://www.postgresql.org/docs/current/sql-syntax-lexical.html" rel="noopener noreferrer"&gt;converts identifiers/column names to lowercase&lt;/a&gt; by default. To capitalize the crosstab column headers, enclose them within double quotes.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9kqrf9t96ms4zjh61tqj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9kqrf9t96ms4zjh61tqj.png" alt="employees monthly total sales crosstab during the first four months of 1997 using crosstab() function with more than 3 columns"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Using ARRAY Data Type To Re-Arrange Extra Crosstab Columns &lt;a&gt;&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;A crosstab with more than three columns limits where the additional columns can be placed in the final result. However, you can re-arrange the crosstab columns by packing the extra columns into an &lt;a href="https://www.postgresql.org/docs/current/arrays.html" rel="noopener noreferrer"&gt;array&lt;/a&gt;.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F53dyjty9dxaj14jesysn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F53dyjty9dxaj14jesysn.png" alt="employees monthly total sales crosstab during the first four months 1997 using crosstab() function with re-arranged columns"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Crosstabs are a powerful way to summarize and analyze data from a database. By presenting data in a condensed and organized format, crosstabs make it easier to analyze relationships between different variables.&lt;/p&gt;

&lt;p&gt;The CASE conditional expression, aggregate FILTER clause, and the PostgreSQL crosstab() function are some of the methods for creating crosstabs. If the crosstab() function uses more than three columns to summarize data, the ARRAY data type may be used to re-arrange the extra columns into the correct positions. Using crosstabs, PostgreSQL users can create reports and dashboards that help them make informed business decisions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;a href="https://www.postgresonline.com/article_pfriendly/14.html" rel="noopener noreferrer"&gt;CrossTab Queries in PostgreSQL using tablefunc contrib&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://postgresql.verite.pro/blog/2018/06/19/crosstab-pivot.html" rel="noopener noreferrer"&gt;Static and dynamic pivots&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;

</description>
      <category>postgres</category>
      <category>crosstab</category>
      <category>sql</category>
      <category>database</category>
    </item>
    <item>
      <title>How To Install ODK Central on Ubuntu 20.04</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Thu, 02 Dec 2021 12:22:54 +0000</pubDate>
      <link>https://dev.to/kagundajm/how-to-install-odk-central-on-ubuntu-2004-4pom</link>
      <guid>https://dev.to/kagundajm/how-to-install-odk-central-on-ubuntu-2004-4pom</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Open Data Kit (ODK) is a suite of free, open-source software components for collecting data using mobile devices. &lt;a href="https://docs.getodk.org/central-intro/"&gt;ODK Central&lt;/a&gt;, or simply Central, is the server component. Without the server, the data collection process cannot proceed.&lt;/p&gt;

&lt;p&gt;Installing ODK Central on a DigitalOcean server is the recommended method and is &lt;a href="https://docs.getodk.org/central-install-digital-ocean/"&gt;well documented&lt;/a&gt;. Where situations demand installing Central on a different cloud provider, documentation of the process is either scanty or altogether lacking.&lt;/p&gt;

&lt;p&gt;This post steps through the process of installing Central on a custom Linux server not hosted by DigitalOcean and contains the following sections:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Prerequisites&lt;/li&gt;
&lt;li&gt;Checking DNS Propagation&lt;/li&gt;
&lt;li&gt;Checking for Available Disk Space&lt;/li&gt;
&lt;li&gt;Installation of the Docker Engine&lt;/li&gt;
&lt;li&gt;Customizing Docker Engine Data Root Folder&lt;/li&gt;
&lt;li&gt;Configuring Docker Engine to run as a non-root user&lt;/li&gt;
&lt;li&gt;Installing Docker Compose&lt;/li&gt;
&lt;li&gt;Installing ODK Central&lt;/li&gt;
&lt;li&gt;Starting Up ODK Central&lt;/li&gt;
&lt;li&gt;Create ODK Central Administrator&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;You can SSH to your server using a user who has &lt;code&gt;sudo&lt;/code&gt; rights.&lt;/li&gt;
&lt;li&gt;You have purchased a domain name. In this post, I will be using &lt;strong&gt;odk-central.example.com&lt;/strong&gt; as the domain name. Remember to substitute the domain name with your domain name.&lt;/li&gt;
&lt;li&gt;You have created the domain (odk-central.example.com) and sub-domain (&lt;a href="http://www.odk-central.example.com"&gt;www.odk-central.example.com&lt;/a&gt;) names on your cloud provider. If the www sub-domain is not created, &lt;a href="https://letsencrypt.org/"&gt;Let’s Encrypt&lt;/a&gt; fails with &lt;strong&gt;Challenge failed for domain odk-central.example.com&lt;/strong&gt; when validating requests for SSL certificates for your domain.&lt;/li&gt;
&lt;li&gt;You have updated the A/AAAA records for your domain and sub-domain to point to your server. An A-record links your domain name or sub-domain name to an IP address for your server.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Checking DNS Propagation
&lt;/h2&gt;

&lt;p&gt;Using the &lt;strong&gt;ping&lt;/strong&gt; command, we will test whether our domain name is reachable from our local computer.&lt;/p&gt;

&lt;p&gt;Open a terminal or command window and run &lt;code&gt;ping odk-central.example.com -c4&lt;/code&gt;, replacing odk-central.example.com with your domain name. The &lt;code&gt;-c4&lt;/code&gt; option specifies the number of requests; it is not required on Windows, which defaults to four ping requests.&lt;/p&gt;

&lt;p&gt;If the response you get is similar to &lt;strong&gt;"ping: cannot resolve odk-central.example.com: Unknown host"&lt;/strong&gt; or &lt;strong&gt;"Ping request could not find host odk-central.example.com. Please check the name and try again."&lt;/strong&gt;, then your server is not reachable.&lt;/p&gt;

&lt;p&gt;Changes you make to the A-records could take several hours to propagate. Online tools such as &lt;a href="https://dnspropagation.net/"&gt;DNSPropagation&lt;/a&gt;, &lt;a href="https://dns.google/"&gt;DNS Google&lt;/a&gt;, and &lt;a href="https://www.dnstester.net/"&gt;DNS Tester&lt;/a&gt; can help you check the reachability of your domains and sub-domains.&lt;/p&gt;

&lt;p&gt;Do not forget to check whether you can also ping the &lt;strong&gt;www&lt;/strong&gt; sub-domain.&lt;/p&gt;
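The checks above can also be scripted. The following is a minimal sketch, assuming a Linux machine with `getent` available; the commented domains are placeholders for your own domain and its www sub-domain:

```shell
# Minimal DNS reachability check (a sketch; substitute your own domain
# and its www sub-domain for the placeholders in the comments below).
check_dns() {
  if getent hosts "$1" > /dev/null; then
    echo "$1 resolves"
  else
    echo "$1 does not resolve yet"
  fi
}

# On the server you would check, for example:
#   check_dns odk-central.example.com
#   check_dns www.odk-central.example.com
check_dns localhost
```

Unlike `ping`, this only exercises name resolution, so it works even when a host blocks ICMP echo requests.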
&lt;h2&gt;
  
  
  Checking for Available Disk Space
&lt;/h2&gt;

&lt;p&gt;ODK Central uses multiple Docker containers. During the installation process, the Docker images required by the ODK Central services are downloaded to the server. Before starting the installation, verify that your server has adequate space, and establish whether the available space is on one hard disk drive or spread over multiple drives.&lt;/p&gt;

&lt;p&gt;To display the drives and space used, run &lt;code&gt;df --human-readable&lt;/code&gt; or &lt;code&gt;df -h&lt;/code&gt; command from a terminal window.&lt;/p&gt;

&lt;p&gt;Running the &lt;code&gt;df&lt;/code&gt; command on a server provisioned on DigitalOcean may display an output similar to the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Filesystem      Size  Used Avail Use% Mounted on
udev            3.9G     0  3.9G   0% /dev
tmpfs           797M  1.6M  795M   1% /run
/dev/vda1       155G   11G  145G   7% /
tmpfs           3.9G     0  3.9G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The sample output shows that the server has one partition, &lt;code&gt;/dev/vda1&lt;/code&gt;, mounted on the root (&lt;code&gt;/&lt;/code&gt;) directory with 145 GB of free space.&lt;/p&gt;

&lt;p&gt;However, running the &lt;code&gt;df&lt;/code&gt; command on my custom server displays a different output as shown below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;Filesystem                   Size  Used Avail Use% Mounted on
/dev/root                     29G  2.1G   27G   7% /
devtmpfs                     3.9G     0  3.9G   0% /dev
tmpfs                        3.9G     0  3.9G   0% /dev/shm
tmpfs                        796M  1.1M  795M   1% /run
/dev/sdb1                    503G   73M  478G   1% /data
...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;From the output, observe that the custom server has two disk partitions: &lt;code&gt;/dev/root&lt;/code&gt;, mounted on the root (&lt;code&gt;/&lt;/code&gt;) directory with 27 GB of free space, and &lt;code&gt;/dev/sdb1&lt;/code&gt;, mounted on the &lt;code&gt;/data&lt;/code&gt; directory with 478 GB of free space. To avoid running out of disk space on the root directory, we need to configure the Docker Engine to store data on the data directory; likewise, we will install ODK Central on the data directory.&lt;/p&gt;
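The manual inspection above can be scripted. A minimal sketch, assuming GNU coreutils `df` (the 20 GB threshold is an arbitrary example value, not an ODK requirement):

```shell
# Report available space (in GB) on a mount point and warn when it
# falls below a threshold. The threshold is an arbitrary example value.
mount_point="${1:-/}"
threshold_gb=20
# -BG scales sizes to gigabytes; --output=avail prints only the Avail column.
avail_gb=$(df -BG --output=avail "$mount_point" | tail -n 1 | tr -dc '0-9')
echo "Available on ${mount_point}: ${avail_gb} GB"
if [ "${avail_gb}" -lt "${threshold_gb}" ]; then
  echo "Warning: less than ${threshold_gb} GB free on ${mount_point}"
fi
```

Run it against each mount point you plan to use, for example `/` and `/data` on the custom server above.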

&lt;h2&gt;
  
  
  Install Docker Engine
&lt;/h2&gt;

&lt;p&gt;A Docker Engine installation on Ubuntu requires the server to be running a 64-bit version of Ubuntu 18.04 or higher.&lt;/p&gt;

&lt;p&gt;You can check the installed version of Ubuntu on your server by running &lt;code&gt;lsb_release -a&lt;/code&gt; or &lt;code&gt;cat /etc/lsb-release&lt;/code&gt; command from a terminal window.&lt;/p&gt;

&lt;p&gt;Use the following steps to install Docker:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Update the server's local repository package information by running the &lt;code&gt;sudo apt update&lt;/code&gt; command. Running this command is required every time before you install or upgrade software.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Install packages required to install Docker using &lt;code&gt;sudo apt install ca-certificates curl gnupg lsb-release&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;ca-certificates&lt;/code&gt; - a set of common CA certificates used to verify secure connections to other servers&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;curl&lt;/code&gt; - a command-line tool for transferring data to and from servers&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;gnupg&lt;/code&gt; - a command-line tool that encrypts and signs files; it is also known as &lt;code&gt;gpg&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;lsb-release&lt;/code&gt; - a utility that displays Linux Standard Base (LSB) information about a Linux distribution. To display your distribution information, run the &lt;code&gt;lsb_release -a&lt;/code&gt; or &lt;code&gt;cat /etc/lsb-release&lt;/code&gt; command from a terminal window&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Download Docker's official GPG key specific to Ubuntu and save the key to &lt;em&gt;/usr/share/keyrings/docker-archive-keyring.gpg&lt;/em&gt; file on our server.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;-fsSL&lt;/span&gt; https://download.docker.com/linux/ubuntu/gpg | &lt;span class="nb"&gt;sudo &lt;/span&gt;gpg &lt;span class="nt"&gt;--dearmor&lt;/span&gt; &lt;span class="nt"&gt;-o&lt;/span&gt; /usr/share/keyrings/docker-archive-keyring.gpg
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Add Docker's stable repository to our server's local packages repository.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="s2"&gt;"deb [arch=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;dpkg &lt;span class="nt"&gt;--print-architecture&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt; signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu &lt;/span&gt;&lt;span class="se"&gt;\&lt;/span&gt;&lt;span class="s2"&gt;
&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;lsb_release &lt;span class="nt"&gt;-cs&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt; stable"&lt;/span&gt; | &lt;span class="nb"&gt;sudo tee&lt;/span&gt; /etc/apt/sources.list.d/docker.list &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; /dev/null
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;The following is an explanation of the sub-commands used in the previous command:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;dpkg --print-architecture&lt;/code&gt; - determines the system's architecture such as &lt;code&gt;amd64&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;lsb_release -cs&lt;/code&gt; - outputs the Linux distribution release codename, such as &lt;code&gt;bionic&lt;/code&gt; for Ubuntu 18.04 or &lt;code&gt;focal&lt;/code&gt; for Ubuntu 20.04&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;tee&lt;/code&gt; will read input from &lt;code&gt;echo&lt;/code&gt; command and write the output to &lt;em&gt;/etc/apt/sources.list.d/docker.list&lt;/em&gt; file&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Update packages in our server's repository - &lt;code&gt;sudo apt update&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Install the latest version of Docker Engine &lt;code&gt;sudo apt install docker-ce docker-ce-cli containerd.io&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Verify the Docker installation using the &lt;code&gt;sudo docker -v&lt;/code&gt; command, which should output the version number of the installed Docker Engine.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The &lt;strong&gt;docker&lt;/strong&gt; service starts automatically on Ubuntu; however, you can run the &lt;code&gt;sudo systemctl enable docker&lt;/code&gt; command to ensure that the service starts on every boot.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
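The verification in step 7 can be wrapped in a small snippet that degrades gracefully when Docker is missing (a sketch; it only reports, it changes nothing):

```shell
# Report whether the Docker CLI is installed and, if so, its version.
if command -v docker > /dev/null 2>&1; then
  docker --version
else
  echo "docker not found on PATH"
fi
```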

&lt;h2&gt;
  
  
  Configure Docker Data Root Folder
&lt;/h2&gt;

&lt;p&gt;By default, Docker stores all its data within the &lt;em&gt;/var/lib/docker&lt;/em&gt; directory. You can verify the location by running &lt;code&gt;docker info&lt;/code&gt; and looking for &lt;strong&gt;Docker Root Dir:&lt;/strong&gt; entry from the output.&lt;/p&gt;

&lt;p&gt;Our custom server has limited space on the root directory. As such, we need to configure Docker to save its data within our data directory.&lt;/p&gt;

&lt;p&gt;The preferred way to configure Docker is through the &lt;em&gt;/etc/docker/daemon.json&lt;/em&gt; JSON settings file. However, Docker does not create this file automatically.&lt;/p&gt;

&lt;p&gt;Stop the Docker service by running the &lt;code&gt;sudo service docker stop&lt;/code&gt; command.&lt;/p&gt;

&lt;p&gt;Run the &lt;code&gt;sudo nano /etc/docker/daemon.json&lt;/code&gt; command to create the file.&lt;/p&gt;

&lt;p&gt;Update the file with the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"data-root"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"/data/docker-data"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To save and close the &lt;em&gt;/etc/docker/daemon.json&lt;/em&gt; file, press &lt;strong&gt;CTRL+X&lt;/strong&gt; keys together, then &lt;strong&gt;Y&lt;/strong&gt; to confirm that you want to save the modified file and finally press the &lt;strong&gt;ENTER&lt;/strong&gt; key to confirm the file name.&lt;/p&gt;

&lt;p&gt;Restart the Docker daemon by running the &lt;code&gt;sudo service docker start&lt;/code&gt; command.&lt;/p&gt;

&lt;p&gt;Confirm that Docker's storage location has been updated to match the entry in our &lt;em&gt;/etc/docker/daemon.json&lt;/em&gt; file by running the &lt;code&gt;docker info | grep "Docker Root Dir"&lt;/code&gt; command. The output should reflect the changes we made in our configuration file - &lt;strong&gt;Docker Root Dir: /data/docker-data&lt;/strong&gt;&lt;/p&gt;
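The same file can also be written non-interactively. A sketch that targets a scratch directory so it is safe to experiment with; on the real server you would write to /etc/docker/daemon.json with sudo:

```shell
# Write the Docker daemon configuration non-interactively. A scratch
# directory stands in for /etc/docker here; on the server, write to
# /etc/docker/daemon.json with sudo and restart the docker service.
conf_dir="$(mktemp -d)"
cat > "${conf_dir}/daemon.json" <<'EOF'
{
  "data-root": "/data/docker-data"
}
EOF
# Display the file; a malformed daemon.json prevents Docker from starting,
# so double-check the contents before restarting the service.
cat "${conf_dir}/daemon.json"
```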

&lt;h2&gt;
  
  
  Use Docker As Non-Root User
&lt;/h2&gt;

&lt;p&gt;Every time a non-root user runs the &lt;code&gt;docker&lt;/code&gt; command, the user must prefix the command with &lt;code&gt;sudo&lt;/code&gt;. To avoid typing &lt;code&gt;sudo docker&lt;/code&gt;,  add the connected user to the &lt;strong&gt;docker&lt;/strong&gt; group.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create the &lt;strong&gt;docker&lt;/strong&gt; group by running the &lt;code&gt;sudo groupadd docker&lt;/code&gt; command.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Add connected user to the &lt;strong&gt;docker&lt;/strong&gt; group by running the &lt;code&gt;sudo usermod -aG docker $USER&lt;/code&gt; command. You can replace &lt;strong&gt;$USER&lt;/strong&gt; with your preferred user  - &lt;code&gt;sudo usermod -aG docker kagunda&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Activate the changes to groups by running the &lt;code&gt;newgrp docker&lt;/code&gt; command or log out and log back in again if running the command does not work.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Verify that you can access and download Docker images without &lt;code&gt;sudo&lt;/code&gt; by running the &lt;code&gt;docker run hello-world&lt;/code&gt; command.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
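Before dropping `sudo`, you can confirm that the group change took effect for your session; a small sketch:

```shell
# Check whether the current user is a member of the docker group.
user="${USER:-$(id -un)}"
if id -nG "$user" | grep -qw docker; then
  echo "${user} is in the docker group"
else
  echo "${user} is NOT in the docker group yet; log out and back in"
fi
```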

&lt;h2&gt;
  
  
  Install Docker Compose
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Run the following command to download the current stable release of Docker Compose:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo &lt;/span&gt;curl &lt;span class="nt"&gt;-L&lt;/span&gt; &lt;span class="s2"&gt;"https://github.com/docker/compose/releases/download/1.29.2/docker-compose-&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;uname&lt;/span&gt; &lt;span class="nt"&gt;-s&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;-&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;uname&lt;/span&gt; &lt;span class="nt"&gt;-m&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="nt"&gt;-o&lt;/span&gt; /usr/local/bin/docker-compose
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Assign execute permissions to the &lt;strong&gt;docker-compose&lt;/strong&gt; binary:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo chmod&lt;/span&gt; +x /usr/local/bin/docker-compose
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Optionally, create a symbolic link to &lt;em&gt;/usr/bin&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo ln&lt;/span&gt; &lt;span class="nt"&gt;-s&lt;/span&gt; /usr/local/bin/docker-compose /usr/bin/docker-compose
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Test the &lt;strong&gt;docker-compose&lt;/strong&gt; installation by running the &lt;code&gt;docker-compose --version&lt;/code&gt; command.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Install ODK Central
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;UFW (Uncomplicated Firewall) comes pre-installed on Ubuntu but is disabled by default. Run the &lt;code&gt;sudo ufw status&lt;/code&gt; command to confirm that the firewall is inactive; if it is active, disable it with &lt;code&gt;sudo ufw disable&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;On our custom server, we want to install ODK Central on the data directory. To change the directory, run the &lt;code&gt;cd /data&lt;/code&gt; command.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Download the ODK Central software using the &lt;code&gt;sudo git clone https://github.com/getodk/central&lt;/code&gt; command. If you are connected to the server as the root user, omit the &lt;code&gt;sudo&lt;/code&gt; prefix - &lt;code&gt;git clone https://github.com/getodk/central&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;root@odk-server:/data#
root@odk-server:/data# git clone https://github.com/getodk/central
Cloning into &lt;span class="s1"&gt;'central'&lt;/span&gt;...
remote: Enumerating objects: 1015, &lt;span class="k"&gt;done&lt;/span&gt;&lt;span class="nb"&gt;.&lt;/span&gt;
remote: Counting objects: 100% &lt;span class="o"&gt;(&lt;/span&gt;1014/1014&lt;span class="o"&gt;)&lt;/span&gt;, &lt;span class="k"&gt;done&lt;/span&gt;&lt;span class="nb"&gt;.&lt;/span&gt;
remote: Compressing objects: 100% &lt;span class="o"&gt;(&lt;/span&gt;499/499&lt;span class="o"&gt;)&lt;/span&gt;, &lt;span class="k"&gt;done&lt;/span&gt;&lt;span class="nb"&gt;.&lt;/span&gt;
remote: Total 1015 &lt;span class="o"&gt;(&lt;/span&gt;delta 550&lt;span class="o"&gt;)&lt;/span&gt;, reused 912 &lt;span class="o"&gt;(&lt;/span&gt;delta 490&lt;span class="o"&gt;)&lt;/span&gt;, pack-reused 1
Receiving objects: 100% &lt;span class="o"&gt;(&lt;/span&gt;1015/1015&lt;span class="o"&gt;)&lt;/span&gt;, 191.22 KiB | 5.03 MiB/s, &lt;span class="k"&gt;done&lt;/span&gt;&lt;span class="nb"&gt;.&lt;/span&gt;
Resolving deltas: 100% &lt;span class="o"&gt;(&lt;/span&gt;550/550&lt;span class="o"&gt;)&lt;/span&gt;, &lt;span class="k"&gt;done&lt;/span&gt;&lt;span class="nb"&gt;.&lt;/span&gt;
root@odk-server:/data#
root@odk-server:/data#
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After the above command completes, type the &lt;code&gt;cd central&lt;/code&gt; command to move to the ODK Central installation folder.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Download missing ODK Central components by running the &lt;code&gt;git submodule update -i&lt;/code&gt; command.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;root@odk-server:/data#
root@odk-server:/data# &lt;span class="nb"&gt;cd &lt;/span&gt;central/
root@odk-server:/data/central# git submodule update &lt;span class="nt"&gt;-i&lt;/span&gt;
Submodule &lt;span class="s1"&gt;'client'&lt;/span&gt; &lt;span class="o"&gt;(&lt;/span&gt;https://github.com/getodk/central-frontend.git&lt;span class="o"&gt;)&lt;/span&gt; registered &lt;span class="k"&gt;for &lt;/span&gt;path &lt;span class="s1"&gt;'client'&lt;/span&gt;
Submodule &lt;span class="s1"&gt;'server'&lt;/span&gt; &lt;span class="o"&gt;(&lt;/span&gt;https://github.com/getodk/central-backend.git&lt;span class="o"&gt;)&lt;/span&gt; registered &lt;span class="k"&gt;for &lt;/span&gt;path &lt;span class="s1"&gt;'server'&lt;/span&gt;
Cloning into &lt;span class="s1"&gt;'/data/central/client'&lt;/span&gt;...
Cloning into &lt;span class="s1"&gt;'/data/central/server'&lt;/span&gt;...
Submodule path &lt;span class="s1"&gt;'client'&lt;/span&gt;: checked out &lt;span class="s1"&gt;'5cc6fd79d112ce36d6298c61bb8817689c4c323b'&lt;/span&gt;
Submodule path &lt;span class="s1"&gt;'server'&lt;/span&gt;: checked out &lt;span class="s1"&gt;'1d1a3a59969e61383da74119e405e67778b7a170'&lt;/span&gt;
root@odk-server:/data/central#
root@odk-server:/data/central#
root@odk-server:/data/central#
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create the ODK Central settings file from its template by running the &lt;code&gt;mv .env.template .env&lt;/code&gt; command.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Edit the settings file by running the &lt;code&gt;nano .env&lt;/code&gt; command. This will launch the &lt;strong&gt;nano&lt;/strong&gt; editor and display the contents of the &lt;em&gt;.env&lt;/em&gt; settings file.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Change the value of &lt;strong&gt;DOMAIN&lt;/strong&gt; to use your domain name. In my case, the value will be &lt;strong&gt;DOMAIN=odk-central.example.com&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Change the value of &lt;strong&gt;SYSADMIN_EMAIL&lt;/strong&gt; to a valid email address. The Let's Encrypt service will use this address to notify you if something goes wrong when issuing your security certificates.&lt;/li&gt;
&lt;li&gt;Press &lt;strong&gt;CTRL+X&lt;/strong&gt; keys together, then &lt;strong&gt;Y&lt;/strong&gt; to confirm that you want to save the modified file and finally press &lt;strong&gt;ENTER&lt;/strong&gt;  key to confirm the file name.
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--T_RMGe2f--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/spjjghcpwbwrqnlamrsq.png" alt='"modified odk central .env settings file"' width="600" height="356"&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Type &lt;code&gt;docker-compose build&lt;/code&gt; to build the stack of applications required to run the ODK Central service. The process downloads all required ODK Central Docker images and will take some time. After it completes, you should see a success message, and the terminal prompt gets displayed.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Run the &lt;code&gt;docker-compose up --no-start&lt;/code&gt; command. This command tells Docker to create all containers required by ODK Central without starting these containers.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
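If you prefer a non-interactive edit over nano, the two values in step 7 can be set with sed. A sketch against a stand-in file (the DOMAIN and SYSADMIN_EMAIL keys match the template; the values shown are placeholders for your own domain and email):

```shell
# Set DOMAIN and SYSADMIN_EMAIL in .env non-interactively.
# A stand-in file is used here for illustration; on the server, run the
# sed commands against the real .env inside the central directory.
env_file="$(mktemp)"
printf 'DOMAIN=\nSYSADMIN_EMAIL=\n' > "${env_file}"
sed -i 's/^DOMAIN=.*/DOMAIN=odk-central.example.com/' "${env_file}"
sed -i 's/^SYSADMIN_EMAIL=.*/SYSADMIN_EMAIL=admin@example.com/' "${env_file}"
cat "${env_file}"
```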

&lt;h2&gt;
  
  
  Starting Up ODK Central
&lt;/h2&gt;

&lt;p&gt;Make sure you are still in the &lt;em&gt;central&lt;/em&gt; directory and run the &lt;code&gt;docker-compose up -d&lt;/code&gt; command. The &lt;code&gt;-d&lt;/code&gt; option starts the ODK Central containers in the background and leaves them running.&lt;/p&gt;

&lt;p&gt;To check whether all ODK Central containers have finished loading, run the &lt;code&gt;docker-compose ps&lt;/code&gt; command.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;nginx&lt;/code&gt; is the web server container for ODK Central. It should display &lt;strong&gt;Up&lt;/strong&gt; or  &lt;strong&gt;Up (healthy)&lt;/strong&gt; under the &lt;strong&gt;State&lt;/strong&gt; column. If the state column displays &lt;strong&gt;Up (health: starting)&lt;/strong&gt;, give it more time to complete starting up. A state of &lt;strong&gt;Exit 0&lt;/strong&gt;  for the &lt;strong&gt;secrets&lt;/strong&gt; container means everything is fine.&lt;/p&gt;
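Rather than re-running `docker-compose ps` by hand while the containers start, you can poll until a readiness command succeeds. A generic sketch; on the server the polled command might be `curl -fsS` against your domain:

```shell
# Poll a command until it succeeds or the attempts run out. A generic
# helper; on the server you might poll, for example:
#   wait_for 30 curl -fsS https://odk-central.example.com
wait_for() {
  attempts="$1"; shift
  i=1
  while [ "$i" -le "$attempts" ]; do
    if "$@" > /dev/null 2>&1; then
      echo "ready after ${i} attempt(s)"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "timed out after ${attempts} attempt(s)" >&2
  return 1
}

wait_for 5 true
```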

&lt;p&gt;After the web server has started successfully, visit your domain in a web browser. If everything worked out as expected, the ODK Central login page should be displayed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EAOlDkze--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hcx0lsebal6g6eb8ap8d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EAOlDkze--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hcx0lsebal6g6eb8ap8d.png" alt='"ODK Central user login page"' width="640" height="321"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Create ODK Central Administrator
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Make sure you are on the &lt;em&gt;central&lt;/em&gt; folder. If you are in any other folder, move to the &lt;em&gt;central&lt;/em&gt; folder by running the &lt;code&gt;cd /data/central&lt;/code&gt; command. Substitute &lt;em&gt;/data/central&lt;/em&gt; with the location where you installed ODK Central.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Run the following command to create an account within ODK Central.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker-compose &lt;span class="nb"&gt;exec &lt;/span&gt;service odk-cmd &lt;span class="nt"&gt;--email&lt;/span&gt; YOUREMAIL@ADDRESSHERE.com user-create
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;Substitute &lt;strong&gt;&lt;a href="mailto:YOUREMAIL@ADDRESSHERE.com"&gt;YOUREMAIL@ADDRESSHERE.com&lt;/a&gt;&lt;/strong&gt; with the email for the user you wish to create. You will be requested to provide a password for the new account.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Promote the account to an administrator using the following command:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker-compose &lt;span class="nb"&gt;exec &lt;/span&gt;service odk-cmd &lt;span class="nt"&gt;--email&lt;/span&gt; YOUREMAIL@ADDRESSHERE.com user-promote
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To create other users, navigate to your domain in a web browser, log in using the administrator account you have created, and create new users from the ODK Central interface.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If you ever forget or lose the administrator's password, you can reset the password by running the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker-compose &lt;span class="nb"&gt;exec &lt;/span&gt;service odk-cmd &lt;span class="nt"&gt;--email&lt;/span&gt; YOUREMAIL@ADDRESSHERE.com user-set-password
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;ODK Central is the server-side software used for data collection using mobile devices. When installing ODK Central on a cloud provider other than DigitalOcean, you are largely on your own.&lt;/p&gt;

&lt;p&gt;In this post, I installed ODK Central on a server with two hard drives and limited space on the root directory. I started by checking DNS propagation for the domain names, installed Docker, configured Docker to save its data in a custom directory, cloned the ODK Central repository, and finally configured ODK Central and verified that the application was up and running.&lt;/p&gt;

&lt;p&gt;I hope this post is helpful to anyone wishing to install ODK Central on a cloud provider other than DigitalOcean.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/engine/install/ubuntu/"&gt;Install Docker Engine on Ubuntu&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/engine/install/linux-postinstall/"&gt;Post-installation steps for Linux&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/config/daemon/#configure-the-docker-daemon"&gt;Configure the Docker daemon&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.freecodecamp.org/news/where-are-docker-images-stored-docker-container-paths-explained/"&gt;Where are Docker Images Stored? Docker Container Paths Explained&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/moby/moby/issues/7667"&gt;Proposal: Docker Engine Keys for Docker Remote API Authentication and Authorization #7667&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.docker.com/compose/install/"&gt;Install Docker Compose&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.getodk.org/central-install-digital-ocean/#central-install-digital-ocean"&gt;Installing Central on DigitalOcean&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>odk</category>
      <category>productivity</category>
      <category>docker</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>How to Drop Multiple Databases in PostgreSQL</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Thu, 28 Oct 2021 16:42:20 +0000</pubDate>
      <link>https://dev.to/kagundajm/how-to-drop-multiple-databases-in-postgresql-2pno</link>
      <guid>https://dev.to/kagundajm/how-to-drop-multiple-databases-in-postgresql-2pno</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--V_GsuJkx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e1ja0mov84ai1v6qjal8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--V_GsuJkx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/e1ja0mov84ai1v6qjal8.png" alt='alt="drop multiple databases in postgresql"' width="880" height="307"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;While performing database integration tests, things went south, and I ended up with more than 15 (fifteen) temporary PostgreSQL test databases.&lt;/p&gt;

&lt;p&gt;I use &lt;a href="https://github.com/OmniDB/OmniDB"&gt;OmniDB&lt;/a&gt;, an open-source application for managing databases. Version (3.0.2b) of OmniDB does not have an option for selecting and dropping multiple databases. To drop a database in OmniDB, you right-click on a database, select &lt;strong&gt;Drop Database&lt;/strong&gt; from the context menu, run the query to drop the database, and finally close the query window. The thought of repeating the four steps on all those test databases led me to think of other efficient alternatives.&lt;/p&gt;

&lt;p&gt;The following three alternatives for dropping multiple databases in PostgreSQL came to mind:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Anonymous Code Block&lt;/li&gt;
&lt;li&gt;Interactive Terminal&lt;/li&gt;
&lt;li&gt;Using a shell script&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;My test databases are prefixed with &lt;strong&gt;test&lt;/strong&gt;. I will extract their names from the &lt;a href="https://www.postgresql.org/docs/current/catalog-pg-database.html"&gt;&lt;code&gt;pg_database&lt;/code&gt;&lt;/a&gt; system catalog using the SQL query &lt;code&gt;SELECT datname FROM pg_database WHERE datname LIKE 'test%' AND datistemplate=false&lt;/code&gt;. The WHERE clause restricts the results to my test databases and excludes template databases.&lt;/p&gt;

&lt;p&gt;In the following sections, I explore each of these three options for dropping multiple databases in PostgreSQL.&lt;/p&gt;

&lt;h2&gt;
  
  
  Anonymous Code Block
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://www.postgresql.org/docs/current/sql-do.html"&gt;&lt;strong&gt;DO&lt;/strong&gt;&lt;/a&gt; statement in PostgreSQL executes an anonymous code block. In the past, I have used an anonymous code block to &lt;a href="https://kags.me.ke/post/postgresql-how-to-drop-all-tables/"&gt;drop tables from a database&lt;/a&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;DO&lt;/span&gt; &lt;span class="err"&gt;$$&lt;/span&gt;
  &lt;span class="k"&gt;DECLARE&lt;/span&gt;
      &lt;span class="n"&gt;r&lt;/span&gt; &lt;span class="n"&gt;RECORD&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;BEGIN&lt;/span&gt;
    &lt;span class="k"&gt;FOR&lt;/span&gt; &lt;span class="n"&gt;r&lt;/span&gt; &lt;span class="k"&gt;IN&lt;/span&gt;
        &lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;datname&lt;/span&gt;
        &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;pg_database&lt;/span&gt;
        &lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;datname&lt;/span&gt; &lt;span class="k"&gt;LIKE&lt;/span&gt; &lt;span class="s1"&gt;'test%'&lt;/span&gt; &lt;span class="k"&gt;AND&lt;/span&gt; &lt;span class="n"&gt;datistemplate&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="k"&gt;false&lt;/span&gt;
    &lt;span class="n"&gt;LOOP&lt;/span&gt;
        &lt;span class="k"&gt;EXECUTE&lt;/span&gt; &lt;span class="s1"&gt;'DROP DATABASE '&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="n"&gt;quote_ident&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;datname&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="s1"&gt;';'&lt;/span&gt; &lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="k"&gt;END&lt;/span&gt; &lt;span class="n"&gt;LOOP&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;END&lt;/span&gt; &lt;span class="err"&gt;$$&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This time, however, the anonymous code block throws an exception:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;DROP DATABASE cannot be executed from a &lt;span class="k"&gt;function
&lt;/span&gt;CONTEXT: SQL statement &lt;span class="s2"&gt;"DROP DATABASE test_01024da6a3;"&lt;/span&gt;
PL/pgSQL &lt;span class="k"&gt;function &lt;/span&gt;inline_code_block line 12 at EXECUTE
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--svvf-P-n--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vgq2ic48uvb4oum1j9dl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--svvf-P-n--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vgq2ic48uvb4oum1j9dl.png" alt='"Drop multiple databases in anonymous code block"' width="880" height="607"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As it turns out, you &lt;a href="https://www.postgresql.org/docs/current/sql-dropdatabase.html"&gt;cannot execute DROP DATABASE statements inside a transaction block&lt;/a&gt;, and &lt;a href="https://www.postgresql.org/docs/current/plpgsql-transactions.html"&gt;anonymous code blocks, like functions, run inside a transaction&lt;/a&gt;. This is the reason for the exception.&lt;/p&gt;

&lt;h2&gt;
  
  
  Interactive Terminal
&lt;/h2&gt;

&lt;p&gt;The PostgreSQL interactive terminal (&lt;strong&gt;&lt;a href="https://www.postgresql.org/docs/current/app-psql.html"&gt;psql&lt;/a&gt;&lt;/strong&gt;) allows you to enter, edit, and execute commands and view the results of SQL queries. In addition to SQL queries, &lt;strong&gt;psql&lt;/strong&gt; provides meta-commands, which are commands processed by &lt;strong&gt;psql&lt;/strong&gt; itself without being sent to the PostgreSQL server; all meta-commands have a backslash (&lt;code&gt;\&lt;/code&gt;) prefix. In the following paragraphs, I explain how to dynamically create a &lt;code&gt;DROP DATABASE&lt;/code&gt; SQL query and run it using the &lt;code&gt;\gexec&lt;/code&gt; meta-command.&lt;/p&gt;

&lt;p&gt;Before you can execute SQL commands, you have to connect to a PostgreSQL server. To connect, run the &lt;code&gt;psql -d postgres&lt;/code&gt; command from a terminal window. If you don't specify a user name, PostgreSQL defaults to the current operating system user, which typically has permission to create and drop databases. You can connect as a different user by adding the &lt;strong&gt;-U&lt;/strong&gt; command-line option to the &lt;code&gt;psql&lt;/code&gt; command (&lt;code&gt;psql -U username -d postgres&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;Upon successful connection, the &lt;code&gt;psql&lt;/code&gt; prompt will be either &lt;code&gt;postgres=&amp;gt;&lt;/code&gt; or, if you are a database superuser, &lt;code&gt;postgres=#&lt;/code&gt;. At the prompt, type your SQL query to construct &lt;code&gt;DROP DATABASE&lt;/code&gt; statements for the databases you want to drop: &lt;code&gt;SELECT 'DROP DATABASE ' || quote_ident(datname) || ';' FROM pg_database WHERE datname LIKE 'test%' AND datistemplate=false&lt;/code&gt;. Do not terminate the query with a semicolon (;); if you do, the query will run as soon as you press the &lt;strong&gt;RETURN&lt;/strong&gt; key.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;\gexec&lt;/code&gt; meta-command sends the current query buffer to the server and treats each column of each row of the query's output as an SQL statement to be executed. In our case, the query outputs a &lt;strong&gt;DROP DATABASE&lt;/strong&gt; statement for each test database, and the &lt;code&gt;\gexec&lt;/code&gt; meta-command then runs each statement from the output.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;postgres&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="c"&gt;#&lt;/span&gt;
&lt;span class="nv"&gt;postgres&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="c"&gt;# SELECT 'DROP DATABASE ' || quote_ident(datname) || ';'&lt;/span&gt;
&lt;span class="nv"&gt;postgres&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="c"&gt;# FROM pg_database&lt;/span&gt;
&lt;span class="nv"&gt;postgres&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="c"&gt;# WHERE datname LIKE 'test%' AND datistemplate=false&lt;/span&gt;
&lt;span class="nv"&gt;postgres&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="c"&gt;#&lt;/span&gt;
&lt;span class="nv"&gt;postgres&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="c"&gt;# \gexec&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Using a Shell Script
&lt;/h2&gt;

&lt;p&gt;Scripts help automate repetitive tasks. You create a text file, type commands into the file, make the file executable, and any time you want to repeat the commands, you execute the script file instead of retyping them.&lt;/p&gt;

&lt;p&gt;Creating and executing a shell script involves the following steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Use a text editor to create your script file.&lt;/li&gt;
&lt;li&gt;Insert commands in the file.&lt;/li&gt;
&lt;li&gt;Save and close the file.&lt;/li&gt;
&lt;li&gt;Make the script executable.&lt;/li&gt;
&lt;li&gt;Execute the script.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Create a &lt;em&gt;drop-test-dbs.sh&lt;/em&gt; text file and add the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;

&lt;span class="nv"&gt;test_databases_file&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;~/projects/test_dbs.txt
psql &lt;span class="nt"&gt;-d&lt;/span&gt; postgres &lt;span class="nt"&gt;-c&lt;/span&gt; &lt;span class="s2"&gt;"COPY (SELECT datname FROM pg_database WHERE datname LIKE 'test%' AND datistemplate=false) TO '&lt;/span&gt;&lt;span class="nv"&gt;$test_databases_file&lt;/span&gt;&lt;span class="s2"&gt;'"&lt;/span&gt;

&lt;span class="k"&gt;while &lt;/span&gt;&lt;span class="nb"&gt;read &lt;/span&gt;dbname
&lt;span class="k"&gt;do
  &lt;/span&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"dropping DB &lt;/span&gt;&lt;span class="nv"&gt;$dbname&lt;/span&gt;&lt;span class="s2"&gt;..."&lt;/span&gt;
  dropdb &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$dbname&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
&lt;span class="k"&gt;done&lt;/span&gt; &amp;lt; &lt;span class="nv"&gt;$test_databases_file&lt;/span&gt;

&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"removing &lt;/span&gt;&lt;span class="nv"&gt;$test_databases_file&lt;/span&gt;&lt;span class="s2"&gt; file"&lt;/span&gt;
&lt;span class="nb"&gt;rm&lt;/span&gt; &lt;span class="nv"&gt;$test_databases_file&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;#!/bin/bash&lt;/code&gt; is called a &lt;strong&gt;shebang&lt;/strong&gt; or a &lt;strong&gt;bang&lt;/strong&gt; line. It specifies the interpreter that will execute the commands.&lt;/p&gt;

&lt;p&gt;The script uses &lt;strong&gt;psql&lt;/strong&gt; to connect to a PostgreSQL server, runs an SQL query to fetch the names of the test databases, and &lt;a href="https://www.postgresql.org/docs/current/sql-copy.html"&gt;copies&lt;/a&gt; the database names to a &lt;em&gt;test_dbs.txt&lt;/em&gt; file. We then read the created file, and for each database name, we drop the database using the &lt;a href="https://www.postgresql.org/docs/current/app-dropdb.html"&gt;dropdb&lt;/a&gt; utility program. Finally, we remove the file containing the database names.&lt;/p&gt;

&lt;p&gt;After creating the script file, open a console or terminal window and assign the file execute permission using the &lt;strong&gt;chmod&lt;/strong&gt; command (&lt;code&gt;chmod +x drop-test-dbs.sh&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;Finally, to drop the databases, run the script from a console or terminal window (&lt;code&gt;./drop-test-dbs.sh&lt;/code&gt;). If the script file is not in the current folder, remember to include the file path (&lt;code&gt;file-location/drop-test-dbs.sh&lt;/code&gt;).&lt;/p&gt;
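&lt;p&gt;The permission and execution steps can be sketched end to end with a throwaway stand-in script (the &lt;code&gt;/tmp&lt;/code&gt; path and the script body here are hypothetical; a real run would use the &lt;em&gt;drop-test-dbs.sh&lt;/em&gt; file created earlier):&lt;/p&gt;

```shell
# Create a throwaway stand-in script (hypothetical path and body).
printf '#!/bin/bash\necho "dropping test databases..."\n' > /tmp/drop-test-dbs.sh

chmod +x /tmp/drop-test-dbs.sh   # assign the execute permission
/tmp/drop-test-dbs.sh            # execute the script
```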

&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;There are various methods for dropping multiple databases in PostgreSQL.&lt;/p&gt;

&lt;p&gt;You can use desktop or web-based database management tools to drop databases. Some database management tools allow dropping only one database at a time; others can drop multiple databases.&lt;/p&gt;

&lt;p&gt;You cannot drop PostgreSQL databases within a transaction; PostgreSQL functions (including anonymous code blocks) are transactional.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;psql&lt;/code&gt; is a powerful command-line tool that ships with PostgreSQL. You can use &lt;code&gt;psql&lt;/code&gt; to connect to a server and interact with databases. Using &lt;code&gt;psql&lt;/code&gt; to query the &lt;code&gt;pg_database&lt;/code&gt; catalog for the required databases, together with the &lt;code&gt;\gexec&lt;/code&gt; meta-command, is another option for dropping multiple databases in PostgreSQL. Be careful, however: failing to filter for the required databases will generate statements that drop every database on your server.&lt;/p&gt;

&lt;p&gt;If the task of dropping multiple databases is repetitive, you can consolidate the &lt;code&gt;psql&lt;/code&gt; commands into a shell script file and execute the script file.&lt;/p&gt;

&lt;p&gt;If you would like to create multiple databases for testing the methods described in this post, you can create and run a script file with the following commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;#!/bin/bash&lt;/span&gt;

&lt;span class="k"&gt;for &lt;/span&gt;n &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;1..15&lt;span class="o"&gt;}&lt;/span&gt;
&lt;span class="k"&gt;do&lt;/span&gt;
  &lt;span class="c"&gt;# Use Bash command substitution to store the&lt;/span&gt;
  &lt;span class="c"&gt;# output (GUID) from uuidgen command&lt;/span&gt;
  &lt;span class="nv"&gt;DB_ID&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;uuidgen&lt;span class="si"&gt;)&lt;/span&gt;

  &lt;span class="c"&gt;# Use Bash Substring expansion to extract&lt;/span&gt;
  &lt;span class="c"&gt;# 8 characters of the GUID starting from index 0&lt;/span&gt;
  &lt;span class="nv"&gt;db&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"test_db_&lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;DB_ID&lt;/span&gt;:0:8&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;

  createdb &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$db&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
  &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"DB &lt;/span&gt;&lt;span class="nv"&gt;$db&lt;/span&gt;&lt;span class="s2"&gt; created"&lt;/span&gt;
&lt;span class="k"&gt;done&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
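&lt;p&gt;If you want to see what the name-building lines produce before creating any databases, you can run them on their own (the fallback to &lt;code&gt;/proc/sys/kernel/random/uuid&lt;/code&gt; is an assumption for Linux systems without the uuidgen command):&lt;/p&gt;

```shell
# Generate a GUID; fall back to the kernel's UUID source when the
# uuidgen command is not available.
DB_ID=$(uuidgen 2>/dev/null || cat /proc/sys/kernel/random/uuid)

# Bash substring expansion: 8 characters of the GUID starting at
# index 0, giving names such as test_db_1a2b3c4d.
db="test_db_${DB_ID:0:8}"
echo "$db"
```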



</description>
      <category>postgres</category>
      <category>database</category>
      <category>sql</category>
    </item>
    <item>
      <title>How To Export Data From Database Query To XML</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Wed, 30 Jun 2021 02:55:51 +0000</pubDate>
      <link>https://dev.to/kagundajm/how-to-export-data-from-database-query-to-xml-l3i</link>
      <guid>https://dev.to/kagundajm/how-to-export-data-from-database-query-to-xml-l3i</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://en.wikipedia.org/wiki/XML"&gt;Extensible Markup Language (XML)&lt;/a&gt; has been in existence for more than two decades. XML is a markup language for encoding documents in a format that is both human and machine readable. In XML, you define your own custom tags, elements and attributes to meet your specific needs. XML is case sensitive, allows comments and hierarchy is important. An XML file is a text file and can be opened with any text editor. The following is a sample of an XML document:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="c"&gt;&amp;lt;!-- sample comment --&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;Customers&amp;gt;&lt;/span&gt; &lt;span class="c"&gt;&amp;lt;!-- root element --&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;Customer&amp;gt;&lt;/span&gt;  &lt;span class="c"&gt;&amp;lt;!-- nested element --&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;FirstName&amp;gt;&lt;/span&gt;Helena&lt;span class="nt"&gt;&amp;lt;/FirstName&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;/Customer&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;Customer&amp;gt;&lt;/span&gt;
  ...
  &lt;span class="nt"&gt;&amp;lt;/Customer&amp;gt;&lt;/span&gt;
  ...
&lt;span class="nt"&gt;&amp;lt;/Customers&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When we run a query against a relational database table, the resulting data is usually presented in tabular format: each row represents a record, while each column represents an attribute of the record.&lt;/p&gt;

&lt;p&gt;In this post, I explore how MS SQL Server and PostgreSQL databases may be queried to output data in XML format instead of tabular format. Each of these databases contains inbuilt functionality to output XML. You can also manually write queries that combine XML elements with data from the databases to create XML output. Creating XML output manually is tedious, and the result will be missing a root element and will therefore not be a &lt;a href="https://www.w3.org/TR/REC-xml/#sec-well-formed"&gt;well-formed XML document&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Create XML Manually
&lt;/h2&gt;

&lt;p&gt;The sample data used within this post will be from a hypothetical &lt;strong&gt;customers&lt;/strong&gt;  table  with the following structure and sample data.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;id&lt;/th&gt;
&lt;th&gt;first_name&lt;/th&gt;
&lt;th&gt;last_name&lt;/th&gt;
&lt;th&gt;city&lt;/th&gt;
&lt;th&gt;state&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;François&lt;/td&gt;
&lt;td&gt;Tremblay&lt;/td&gt;
&lt;td&gt;Montreal&lt;/td&gt;
&lt;td&gt;QC&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;Bjørn&lt;/td&gt;
&lt;td&gt;Hansen&lt;/td&gt;
&lt;td&gt;Oslo&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;Helena&lt;/td&gt;
&lt;td&gt;Holý&lt;/td&gt;
&lt;td&gt;Prague&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4&lt;/td&gt;
&lt;td&gt;Fernanda&lt;/td&gt;
&lt;td&gt;Ramos&lt;/td&gt;
&lt;td&gt;Brasília&lt;/td&gt;
&lt;td&gt;DF&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Create a file named &lt;em&gt;customers.xml&lt;/em&gt; and open it with a text editor of your choice.&lt;/p&gt;

&lt;p&gt;Create a root element so that the XML file will be a well-formed XML document by inserting the following content into the file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;Customers&amp;gt;&lt;/span&gt;

&lt;span class="nt"&gt;&amp;lt;/Customers&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Compose your query by enclosing your data columns within XML elements. Use the &lt;strong&gt;CONCAT()&lt;/strong&gt; function to combine the XML elements and the data; &lt;strong&gt;CONCAT()&lt;/strong&gt; is supported by both MS SQL Server and PostgreSQL. Use the &lt;strong&gt;COALESCE()&lt;/strong&gt; function to replace any null values with an empty string.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;CONCAT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
   &lt;span class="s1"&gt;'&amp;lt;Customer&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="s1"&gt;'&amp;lt;Id&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&amp;lt;/Id&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="s1"&gt;'&amp;lt;FirstName&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;first_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&amp;lt;/FirstName&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="s1"&gt;'&amp;lt;LastName&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;last_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&amp;lt;/LastName&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="s1"&gt;'&amp;lt;City&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;COALESCE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;city&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="s1"&gt;'&amp;lt;/City&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="s1"&gt;'&amp;lt;State&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;COALESCE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;state&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="s1"&gt;'&amp;lt;/State&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
   &lt;span class="s1"&gt;'&amp;lt;/Customer&amp;gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="n"&gt;customers_xml&lt;/span&gt;
 &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;customers&lt;/span&gt; &lt;span class="k"&gt;LIMIT&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run the query, select the results, copy and paste them between the opening and closing root element in the XML file we created.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;Customers&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;Customer&amp;gt;&amp;lt;Id&amp;gt;&lt;/span&gt;1&lt;span class="nt"&gt;&amp;lt;/Id&amp;gt;&amp;lt;FirstName&amp;gt;&lt;/span&gt;François&lt;span class="nt"&gt;&amp;lt;/FirstName&amp;gt;&amp;lt;LastName&amp;gt;&lt;/span&gt;Tremblay&lt;span class="nt"&gt;&amp;lt;/LastName&amp;gt;&amp;lt;City&amp;gt;&lt;/span&gt;Montreal&lt;span class="nt"&gt;&amp;lt;/City&amp;gt;&amp;lt;State&amp;gt;&lt;/span&gt;QC&lt;span class="nt"&gt;&amp;lt;/State&amp;gt;&amp;lt;/Customer&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;Customer&amp;gt;&amp;lt;Id&amp;gt;&lt;/span&gt;2&lt;span class="nt"&gt;&amp;lt;/Id&amp;gt;&amp;lt;FirstName&amp;gt;&lt;/span&gt;Bjørn&lt;span class="nt"&gt;&amp;lt;/FirstName&amp;gt;&amp;lt;LastName&amp;gt;&lt;/span&gt;Hansen&lt;span class="nt"&gt;&amp;lt;/LastName&amp;gt;&amp;lt;City&amp;gt;&lt;/span&gt;Oslo&lt;span class="nt"&gt;&amp;lt;/City&amp;gt;&amp;lt;State&amp;gt;&amp;lt;/State&amp;gt;&amp;lt;/Customer&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/Customers&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Save your XML file.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dbfiddle.uk/?rdbms=postgres_13&amp;amp;fiddle=f386322015711406bed8038dff284003"&gt;SQL Fiddle&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Output XML Using Database Inbuilt Functions
&lt;/h2&gt;

&lt;h3&gt;
  
  
  PostgreSQL
&lt;/h3&gt;

&lt;p&gt;PostgreSQL provides the &lt;code&gt;query_to_xml(query text, nulls boolean, tableforest boolean, targetns text)&lt;/code&gt; function for outputting data in XML format. The &lt;code&gt;query_to_xml&lt;/code&gt; function takes four arguments:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;code&gt;query&lt;/code&gt; - The actual SQL query as text.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;nulls&lt;/code&gt; - Whether columns with null values should be included. When the value is false, columns with null values are omitted from the generated XML. When the value is true, a null column is output as a self-closing element with an &lt;code&gt;xsi:nil="true"&lt;/code&gt; attribute (&lt;code&gt;&amp;lt;columnname xsi:nil="true"/&amp;gt;&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;tableforest&lt;/code&gt; - When true, each row is output as a separate XML document; when false, the entire result is wrapped in a single &lt;strong&gt;table&lt;/strong&gt; root element, with each data row wrapped in a &lt;strong&gt;row&lt;/strong&gt; element.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;targetns&lt;/code&gt; - The XML namespace to put the result in. Pass an empty string if you do not need a namespace.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;After running the query, the &lt;strong&gt;table&lt;/strong&gt; tag appears as the root element, while a &lt;strong&gt;row&lt;/strong&gt; tag wraps each row in the data. You can then replace the table and row tags with your required tag names.&lt;/p&gt;

&lt;p&gt;Using our sample table, we generate XML using the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;query_to_xml&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="s1"&gt;'SELECT t.id "Id"
     , t.first_name "FirstName"
     , t.last_name "LastName"
     , t.city "City"
     , t.state "State"
    FROM customers t LIMIT 2'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;true&lt;/span&gt; &lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The output after running the function will be as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;table&lt;/span&gt; &lt;span class="na"&gt;xmlns:xsi=&lt;/span&gt;&lt;span class="s"&gt;"http://www.w3.org/2001/XMLSchema-instance"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;row&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;Id&amp;gt;&lt;/span&gt;1&lt;span class="nt"&gt;&amp;lt;/Id&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;FirstName&amp;gt;&lt;/span&gt;François&lt;span class="nt"&gt;&amp;lt;/FirstName&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;LastName&amp;gt;&lt;/span&gt;Tremblay&lt;span class="nt"&gt;&amp;lt;/LastName&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;City&amp;gt;&lt;/span&gt;Montreal&lt;span class="nt"&gt;&amp;lt;/City&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;State&amp;gt;&lt;/span&gt;QC&lt;span class="nt"&gt;&amp;lt;/State&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;/row&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;row&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;Id&amp;gt;&lt;/span&gt;2&lt;span class="nt"&gt;&amp;lt;/Id&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;FirstName&amp;gt;&lt;/span&gt;Bjørn&lt;span class="nt"&gt;&amp;lt;/FirstName&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;LastName&amp;gt;&lt;/span&gt;Hansen&lt;span class="nt"&gt;&amp;lt;/LastName&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;City&amp;gt;&lt;/span&gt;Oslo&lt;span class="nt"&gt;&amp;lt;/City&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;State&lt;/span&gt; &lt;span class="na"&gt;xsi:nil=&lt;/span&gt;&lt;span class="s"&gt;"true"&lt;/span&gt;&lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;/row&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/table&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://dbfiddle.uk/?rdbms=postgres_13&amp;amp;fiddle=16909c7c0600b8966b471dccd52b2dfd"&gt;SQL Fiddle&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select and copy the XML output.&lt;/p&gt;

&lt;p&gt;Create a file named &lt;em&gt;customers.xml&lt;/em&gt; and open the file with a text editor.&lt;/p&gt;

&lt;p&gt;Paste the contents you copied above into the file, replace &lt;strong&gt;table&lt;/strong&gt; with &lt;strong&gt;Customers&lt;/strong&gt; and &lt;strong&gt;row&lt;/strong&gt; with &lt;strong&gt;Customer&lt;/strong&gt;, and save the file.&lt;/p&gt;

&lt;p&gt;If you want your XML document to include the XML Schema, use the &lt;code&gt;query_to_xml_and_xmlschema&lt;/code&gt; function instead of the &lt;code&gt;query_to_xml&lt;/code&gt; function.&lt;/p&gt;

&lt;p&gt;Using the &lt;strong&gt;&lt;a href="https://wiki.postgresql.org/wiki/COPY"&gt;COPY&lt;/a&gt;&lt;/strong&gt; command, you can write the XML output directly to a file using the following syntax:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;COPY&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;query_to_xml&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
      &lt;span class="s1"&gt;'SELECT t.id "Id"
      , t.first_name "FirstName"
      , t.last_name "LastName"
      , t.city "City"
      , t.state "State"
      FROM customers t'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;true&lt;/span&gt; &lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;TO&lt;/span&gt; &lt;span class="s1"&gt;'~/tmp/customers.xml'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note that the &lt;strong&gt;&lt;a href="https://www.postgresql.org/docs/current/sql-copy.html"&gt;COPY&lt;/a&gt;&lt;/strong&gt; command writes files on the server and is therefore restricted: the user running the command must be either a &lt;strong&gt;superuser&lt;/strong&gt; or a member of the &lt;strong&gt;pg_write_server_files&lt;/strong&gt; role. Take great care when granting a user either of these roles, as they allow access to any file on the server file system.&lt;/p&gt;

&lt;h3&gt;
  
  
  SQL Server
&lt;/h3&gt;

&lt;p&gt;SQL Server uses a &lt;strong&gt;FOR XML&lt;/strong&gt; clause in a SELECT query to transform the query results into XML output.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;TOP&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;
  &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="n"&gt;Id&lt;/span&gt;
  &lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;first_name&lt;/span&gt; &lt;span class="n"&gt;FirstName&lt;/span&gt;
  &lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;last_name&lt;/span&gt; &lt;span class="n"&gt;LastName&lt;/span&gt;
  &lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;city&lt;/span&gt; &lt;span class="n"&gt;City&lt;/span&gt;
  &lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;COALESCE&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;state&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;''&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;State&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;customers&lt;/span&gt;
&lt;span class="k"&gt;FOR&lt;/span&gt; &lt;span class="n"&gt;XML&lt;/span&gt; &lt;span class="n"&gt;PATH&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'Customer'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;ROOT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'Customers'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Remember to enclose any nullable columns within the &lt;strong&gt;COALESCE()&lt;/strong&gt; function to prevent them from being omitted from the XML output when they don't have values.&lt;/p&gt;

&lt;p&gt;SQL Server has the advantage of allowing customization of the root element and row elements, depending on the &lt;strong&gt;FOR XML&lt;/strong&gt; mode selected. The following will be the output after running the above query:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;Customers&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;Customer&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;Id&amp;gt;&lt;/span&gt;1&lt;span class="nt"&gt;&amp;lt;/Id&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;FirstName&amp;gt;&lt;/span&gt;François&lt;span class="nt"&gt;&amp;lt;/FirstName&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;LastName&amp;gt;&lt;/span&gt;Tremblay&lt;span class="nt"&gt;&amp;lt;/LastName&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;City&amp;gt;&lt;/span&gt;Montreal&lt;span class="nt"&gt;&amp;lt;/City&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;State&amp;gt;&lt;/span&gt;QC&lt;span class="nt"&gt;&amp;lt;/State&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;/Customer&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;Customer&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;Id&amp;gt;&lt;/span&gt;2&lt;span class="nt"&gt;&amp;lt;/Id&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;FirstName&amp;gt;&lt;/span&gt;Bjørn&lt;span class="nt"&gt;&amp;lt;/FirstName&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;LastName&amp;gt;&lt;/span&gt;Hansen&lt;span class="nt"&gt;&amp;lt;/LastName&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;City&amp;gt;&lt;/span&gt;Oslo&lt;span class="nt"&gt;&amp;lt;/City&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;State&amp;gt;&amp;lt;/State&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;/Customer&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/Customers&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://dbfiddle.uk/?rdbms=sqlserver_2019&amp;amp;fiddle=e3ff65a7e577e6d987be34ee5b74d962"&gt;SQL Fiddle&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select and copy the XML output.&lt;/p&gt;

&lt;p&gt;Create a file named &lt;em&gt;customers.xml&lt;/em&gt; and open the file with a text editor.&lt;/p&gt;

&lt;p&gt;Paste the contents you copied above into the file and save it.&lt;/p&gt;

&lt;p&gt;You can also create the XML file in a single step by using the &lt;a href="https://docs.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-ver15"&gt;bulk copy program utility (&lt;strong&gt;bcp&lt;/strong&gt;)&lt;/a&gt;. The &lt;strong&gt;bcp&lt;/strong&gt; utility can be used for importing data from and exporting data to a data file.&lt;/p&gt;

&lt;p&gt;The following command uses the &lt;strong&gt;bcp&lt;/strong&gt; utility to create an XML file from the sample data we have been using.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;bcp &lt;span class="s2"&gt;"SELECT TOP 2
  id Id
  ,first_name FirstName
  ,last_name LastName
  ,city City
  ,COALESCE(state, '') State
FROM customers
FOR XML PATH('Customer'), ROOT('Customers')"&lt;/span&gt; queryout ~/Documents/customers.xml &lt;span class="nt"&gt;-S&lt;/span&gt; localhost &lt;span class="nt"&gt;-d&lt;/span&gt; testdb  &lt;span class="nt"&gt;-c&lt;/span&gt; &lt;span class="nt"&gt;-U&lt;/span&gt; sa
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;-S&lt;/code&gt; specifies the SQL Server instance to connect to&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-d&lt;/code&gt; specifies the database to connect to&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-c&lt;/code&gt; performs the operation using a character data type without prompting for each field&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-U&lt;/code&gt; specifies the login ID used to connect to SQL Server&lt;/li&gt;
&lt;/ul&gt;
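&lt;p&gt;If you consume the exported &lt;em&gt;customers.xml&lt;/em&gt; elsewhere, the round trip can be sketched in Python (a minimal sketch using only the standard library; the element names mirror the query above, and the sample rows are the two from the earlier output):&lt;/p&gt;

```python
import xml.etree.ElementTree as ET

# Build the same structure the FOR XML PATH query emits, then read it
# back the way a consumer of customers.xml would. A NULL state arrives
# as '' because of the COALESCE() in the query.
rows = [
    ("1", "Fran\u00e7ois", "Tremblay", "Montreal", "QC"),
    ("2", "Bj\u00f8rn", "Hansen", "Oslo", ""),
]
root = ET.Element("Customers")
for row in rows:
    customer = ET.SubElement(root, "Customer")
    for tag, value in zip(("Id", "FirstName", "LastName", "City", "State"), row):
        ET.SubElement(customer, tag).text = value

xml_text = ET.tostring(root, encoding="unicode")

# Empty elements parse back with text of None, hence the `or ""` guard.
customers = [
    {child.tag: (child.text or "") for child in customer}
    for customer in ET.fromstring(xml_text)
]
print(customers)
```

&lt;p&gt;The &lt;code&gt;or ""&lt;/code&gt; guard matters: an empty &lt;code&gt;State&lt;/code&gt; element round-trips as &lt;code&gt;None&lt;/code&gt; text in ElementTree, which is easy to trip over when the column was NULL in the database.&lt;/p&gt;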

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;In this post, we looked at the various options available to create XML data from PostgreSQL and MS SQL Server database queries. One option is to generate the XML data manually; another is to use the built-in functions these databases provide. It is also possible to output XML data files using the PostgreSQL &lt;strong&gt;COPY&lt;/strong&gt; command or the SQL Server &lt;strong&gt;bcp&lt;/strong&gt; utility.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.postgresql.org/docs/current/functions-xml.html"&gt;XML Functions&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/sql/relational-databases/xml/for-xml-sql-server?view=sql-server-ver15"&gt;FOR XML (SQL Server)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>database</category>
      <category>xml</category>
      <category>postgres</category>
      <category>sqlserver</category>
    </item>
    <item>
      <title>Installing MS SQL Server 2019 on Parrot OS</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Tue, 22 Jun 2021 11:13:55 +0000</pubDate>
      <link>https://dev.to/kagundajm/installing-ms-sql-server-2019-on-parrot-os-4h3b</link>
      <guid>https://dev.to/kagundajm/installing-ms-sql-server-2019-on-parrot-os-4h3b</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Parrot OS is a lightweight Linux distribution based on Debian. Microsoft SQL Server, or SQL Server, is a relational database management system (RDBMS) developed by Microsoft. Starting with SQL Server 2017, Microsoft added Linux support, making all newer versions cross-platform.&lt;/p&gt;

&lt;p&gt;This post steps through the process of SQL Server 2019 Developer edition installation on Parrot OS and contains the following topics:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;downloading and configuring MS SQL Server 2019 package repository&lt;/li&gt;
&lt;li&gt;running the SQL Server setup process&lt;/li&gt;
&lt;li&gt;installing SQL Server command-line tools&lt;/li&gt;
&lt;li&gt;installing Azure Data Studio&lt;/li&gt;
&lt;li&gt;running queries in Azure Data Studio&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The commands used in this post will also work on any Debian-based distribution such as Ubuntu, MX Linux, Linux Mint, and Knoppix, among many others.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Parrot OS has very minimal memory requirements (256 MB) and can run directly from a USB flash drive. However, to run MS SQL Server 2019, Parrot OS must be installed on a hard disk. The computer running Parrot OS must also meet the following requirements:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;have a minimum of 2 GB of memory&lt;/li&gt;
&lt;li&gt;have 6 GB or more of free hard disk space&lt;/li&gt;
&lt;li&gt;have an x64 processor with 2 or more cores&lt;/li&gt;
&lt;/ul&gt;
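&lt;p&gt;A quick way to check these requirements from a Terminal window (a sketch; the exact output varies by system):&lt;/p&gt;

```shell
free -h --total   # total memory; look for at least 2 GB
df -h /           # free space on the root partition; 6 GB or more
nproc             # number of processor cores; 2 or more
uname -m          # x86_64 indicates an x64 processor
```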

&lt;h2&gt;
  
  
  Configure SQL Server 2019 Repository
&lt;/h2&gt;

&lt;p&gt;The repository is used to acquire the database engine package, mssql-server, and related SQL Server packages.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Open the Terminal window&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Register the Microsoft Ubuntu repository by running the &lt;code&gt;curl https://packages.microsoft.com/config/ubuntu/20.04/mssql-server-2019.list | sudo tee /etc/apt/sources.list.d/mssql-server-2019.list&lt;/code&gt; command and type the password for the superuser. The &lt;code&gt;tee&lt;/code&gt; command writes the downloaded file to the &lt;em&gt;/etc/apt/sources.list.d&lt;/em&gt; folder.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Run &lt;code&gt;ls /etc/apt/sources.list.d&lt;/code&gt; and confirm that the &lt;em&gt;mssql-server-2019.list&lt;/em&gt; file appears among the files in the &lt;em&gt;/etc/apt/sources.list.d&lt;/em&gt; folder&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Run &lt;code&gt;sudo apt-get update&lt;/code&gt; to update the system&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If you receive the error '&lt;strong&gt;The following signatures couldn't be verified because the public key is not available: NO_PUBKEY EB3E94ADBE1229CF&lt;/strong&gt;', add the key to the trusted keys using &lt;code&gt;gpg --keyserver keyserver.ubuntu.com --recv-keys EB3E94ADBE1229CF&lt;/code&gt; and re-run the &lt;code&gt;sudo apt-get update&lt;/code&gt; command.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
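&lt;p&gt;Put together, the repository configuration above amounts to the following shell session (a sketch; it requires network access and sudo rights, and the &lt;code&gt;gpg&lt;/code&gt; step is only needed if &lt;code&gt;apt-get update&lt;/code&gt; reports the NO_PUBKEY error):&lt;/p&gt;

```shell
# Register the Microsoft Ubuntu 20.04 repository for SQL Server 2019.
curl https://packages.microsoft.com/config/ubuntu/20.04/mssql-server-2019.list \
  | sudo tee /etc/apt/sources.list.d/mssql-server-2019.list

# Confirm the list file is in place.
ls /etc/apt/sources.list.d

# Import the signing key if apt reported NO_PUBKEY EB3E94ADBE1229CF,
# then refresh the package index.
gpg --keyserver keyserver.ubuntu.com --recv-keys EB3E94ADBE1229CF
sudo apt-get update
```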

&lt;h2&gt;
  
  
  Install SQL Server Developer Edition
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Run &lt;code&gt;sudo apt-get install -y mssql-server&lt;/code&gt; command to install SQL Server&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After the package installation completes, run &lt;code&gt;sudo /opt/mssql/bin/mssql-conf setup&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Type &lt;strong&gt;2&lt;/strong&gt; to select &lt;strong&gt;Developer&lt;/strong&gt; edition and press the &lt;strong&gt;return&lt;/strong&gt; key&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Type &lt;strong&gt;Y&lt;/strong&gt; to accept the license terms and press &lt;strong&gt;return&lt;/strong&gt; key&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Type and confirm the system administrator password and press &lt;strong&gt;return&lt;/strong&gt; key. The password should have a minimum of eight characters&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Once the configuration is complete, type &lt;code&gt;systemctl status mssql-server --no-pager&lt;/code&gt; to verify that the service is up and running. &lt;code&gt;--no-pager&lt;/code&gt; will force any long lines to wrap.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Install SQL Server Command Line Tools
&lt;/h2&gt;

&lt;p&gt;Command-line tools are used for running SQL commands from a Terminal window. Various command-line tools are available, among them &lt;a href="https://en.wikipedia.org/wiki/Sqsh"&gt;&lt;strong&gt;sqsh&lt;/strong&gt;&lt;/a&gt; (or &lt;strong&gt;SQSHELL&lt;/strong&gt;), &lt;a href="https://docs.microsoft.com/en-us/sql/tools/sqlcmd-utility?view=sql-server-ver15"&gt;&lt;strong&gt;sqlcmd&lt;/strong&gt;&lt;/a&gt; and &lt;a href="https://github.com/dbcli/mssql-cli/"&gt;&lt;strong&gt;mssql-cli&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;sqsh&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Parrot OS comes bundled with the &lt;strong&gt;sqsh&lt;/strong&gt; command-line tool, so no installation is required.&lt;/p&gt;

&lt;p&gt;To use &lt;code&gt;sqsh&lt;/code&gt;, open a Terminal window, run the &lt;code&gt;sqsh -S localhost -U sa -C 'SELECT @@VERSION'&lt;/code&gt; command, and type the SQL Server administrator password you set during setup. This displays the SQL Server version and the Linux OS version, after which the Terminal prompt is re-displayed.&lt;/p&gt;

&lt;p&gt;To log in to the &lt;strong&gt;sqsh&lt;/strong&gt; interactive terminal, run &lt;code&gt;sqsh -S localhost -U sa&lt;/code&gt;. You will be prompted for the password and, if the connection is successful, you will get the &lt;code&gt;sqsh&lt;/code&gt; command prompt '&lt;strong&gt;1&amp;gt;&lt;/strong&gt;'. You exit the &lt;code&gt;sqsh&lt;/code&gt; interactive session by typing &lt;code&gt;exit&lt;/code&gt; or &lt;code&gt;quit&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;sqlcmd&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;sqlcmd&lt;/strong&gt; utility in SQL Server lets you submit T-SQL statements or batches to a local or remote instance of SQL Server. Before using it, install it as follows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Register the Microsoft Ubuntu repository by running the &lt;code&gt;curl https://packages.microsoft.com/config/ubuntu/20.04/prod.list | sudo tee /etc/apt/sources.list.d/msprod.list&lt;/code&gt; command&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Run &lt;code&gt;sudo apt-get update&lt;/code&gt; followed by &lt;code&gt;sudo apt-get install mssql-tools unixodbc-dev&lt;/code&gt; to update the sources list and install the tools.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Add &lt;em&gt;/opt/mssql-tools/bin/&lt;/em&gt; to your &lt;strong&gt;PATH&lt;/strong&gt; environment variable by running &lt;code&gt;echo 'export PATH="$PATH:/opt/mssql-tools/bin"' &amp;gt;&amp;gt; ~/.bashrc&lt;/code&gt; and &lt;code&gt;source ~/.bashrc&lt;/code&gt; commands&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To test connectivity, run &lt;code&gt;sqlcmd -S localhost -U sa -Q 'SELECT @@VERSION'&lt;/code&gt; and enter the password.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
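&lt;p&gt;The four steps above can be condensed into one shell session (a sketch; it assumes sudo rights and a running local SQL Server instance):&lt;/p&gt;

```shell
# Register the Microsoft repository and install the command-line tools.
curl https://packages.microsoft.com/config/ubuntu/20.04/prod.list \
  | sudo tee /etc/apt/sources.list.d/msprod.list
sudo apt-get update
sudo apt-get install -y mssql-tools unixodbc-dev

# Make sqlcmd (and bcp) available on the PATH.
echo 'export PATH="$PATH:/opt/mssql-tools/bin"' >> ~/.bashrc
source ~/.bashrc

# Test connectivity; enter the sa password when prompted.
sqlcmd -S localhost -U sa -Q 'SELECT @@VERSION'
```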

&lt;p&gt;To open an &lt;strong&gt;sqlcmd&lt;/strong&gt; interactive session, use the same command as for the &lt;strong&gt;sqsh&lt;/strong&gt; command-line tool but replace &lt;code&gt;sqsh&lt;/code&gt; with &lt;code&gt;sqlcmd&lt;/code&gt;: &lt;code&gt;sqlcmd -S localhost -U sa&lt;/code&gt;. You also use &lt;strong&gt;exit&lt;/strong&gt; or &lt;strong&gt;quit&lt;/strong&gt; to end the session.&lt;/p&gt;

&lt;p&gt;While &lt;strong&gt;sqsh&lt;/strong&gt; and &lt;strong&gt;sqlcmd&lt;/strong&gt; are similar, &lt;strong&gt;sqlcmd&lt;/strong&gt; provides better feedback and output formatting than &lt;strong&gt;sqsh&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;mssql-cli&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;mssql-cli&lt;/strong&gt; is another cross-platform open source interactive command line query tool for SQL Server. &lt;strong&gt;Auto-completion&lt;/strong&gt;, &lt;strong&gt;Syntax highlighting&lt;/strong&gt; and &lt;strong&gt;Multi-line queries&lt;/strong&gt; are some of the features that make &lt;strong&gt;mssql-cli&lt;/strong&gt; more appealing to use compared to &lt;strong&gt;sqsh&lt;/strong&gt; and &lt;strong&gt;sqlcmd&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;However, at the time of this post, there is no support for &lt;a href="https://github.com/dbcli/mssql-cli/pull/505"&gt;Ubuntu 20.04 and Debian 10&lt;/a&gt;, on which Parrot OS 4.11 is based.&lt;/p&gt;

&lt;h2&gt;
  
  
  Install and Launch Azure Data Studio
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mm673DdW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2uqwm6cit0ofn67vaoj8.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mm673DdW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2uqwm6cit0ofn67vaoj8.jpeg" alt='Alt "azure data studio overview screen"'&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Some people prefer working with GUI tools instead of command-line tools. If you have installed SQL Server Management Studio (SSMS),&lt;br&gt;
you can connect to an MS SQL Server database installed on Linux through SSMS. As SSMS is only available on Windows operating systems, you can use &lt;a href="https://github.com/Microsoft/azuredatastudio"&gt;Azure Data Studio&lt;/a&gt; instead of SSMS. Azure Data Studio is free and cross-platform.&lt;/p&gt;

&lt;p&gt;In the following sections, I step through the process of installing, connecting, and running queries in Azure Data Studio.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://go.microsoft.com/fwlink/?linkid=2163436"&gt;Download Azure Data Studio&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To install the package, run &lt;code&gt;sudo dpkg -i ./Downloads/azuredatastudio-linux-&amp;lt;version string&amp;gt;.deb&lt;/code&gt;. Replace &lt;code&gt;&amp;lt;version string&amp;gt;&lt;/code&gt; with the version of the downloaded file. The version I downloaded is &lt;strong&gt;1.29.0&lt;/strong&gt;, so the command to run is &lt;code&gt;sudo dpkg -i ./Downloads/azuredatastudio-linux-1.29.0.deb&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Creating a Connection In Azure Data Studio
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;To launch Azure Data Studio, run &lt;code&gt;azuredatastudio&lt;/code&gt; from a Terminal window. You can also launch it by clicking on the &lt;strong&gt;Application Launcher&lt;/strong&gt;, typing &lt;strong&gt;Azure&lt;/strong&gt; in the search area, and selecting &lt;strong&gt;Azure Data Studio&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click on &lt;strong&gt;Add Connection&lt;/strong&gt; on the left sidebar below &lt;strong&gt;SERVERS&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Under &lt;strong&gt;Connection Details&lt;/strong&gt;, fill in the fields as follows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Server Name: &lt;strong&gt;localhost&lt;/strong&gt; or just type a &lt;strong&gt;.&lt;/strong&gt; (dot).&lt;/li&gt;
&lt;li&gt;User name: &lt;strong&gt;sa&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Password: the system administrator (&lt;strong&gt;sa&lt;/strong&gt;) password you set during SQL Server setup&lt;/li&gt;
&lt;li&gt;Name (optional): optionally type a preferred name or leave blank to use &lt;strong&gt;localhost&lt;/strong&gt; as the connection name.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click on &lt;strong&gt;Connect&lt;/strong&gt; button&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Run Query In Azure Data Studio
&lt;/h2&gt;

&lt;p&gt;In Azure Data Studio, commands are run through the query window.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Launch &lt;strong&gt;Azure Data Studio&lt;/strong&gt; if not already launched&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Right click on &lt;strong&gt;localhost&lt;/strong&gt; or the name of the connection you created&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;strong&gt;New Query&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Paste or type your query statements. For our test, type &lt;code&gt;SELECT @@VERSION&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click on  &lt;strong&gt;Run&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;In this post, we went through the process of installing and setting up MS SQL Server 2019 Developer Edition on Parrot OS. The same steps also apply to any Debian-based Linux distribution. After the SQL Server setup was complete, we looked at working with SQL Server using command-line tools and Azure Data Studio.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/sql/linux/sql-server-linux-setup?view=sql-server-ver15"&gt;Installation guidance for SQL Server on Linux&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/sql/linux/quickstart-install-connect-ubuntu?view=sql-server-linux-ver15&amp;amp;preserve-view=true"&gt;Quickstart: Install SQL Server and create a database on Ubuntu&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://manpages.ubuntu.com/manpages/focal/man1/sqsh.1.html"&gt;sqsh Man Page&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/sql/azure-data-studio/what-is-azure-data-studio?view=sql-server-linux-ver15"&gt;What is Azure Data Studio?&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.microsoft.com/en-us/sql/azure-data-studio/quickstart-sql-server?view=sql-server-linux-ver15"&gt;Quickstart: Use Azure Data Studio to connect and query SQL Server&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>linux</category>
      <category>database</category>
      <category>sql</category>
      <category>sqlserver</category>
    </item>
    <item>
      <title>Add Snowpack to ASP.NET Core Web App</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Fri, 19 Mar 2021 01:42:19 +0000</pubDate>
      <link>https://dev.to/kagundajm/add-snowpack-to-asp-net-core-web-app-1pb6</link>
      <guid>https://dev.to/kagundajm/add-snowpack-to-asp-net-core-web-app-1pb6</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;I stumbled on &lt;a href="https://www.snowpack.dev/" rel="noopener noreferrer"&gt;Snowpack&lt;/a&gt; while searching for a solution to resolve a &lt;a href="https://github.com/parcel-bundler/parcel" rel="noopener noreferrer"&gt;ParcelJS&lt;/a&gt; error.&lt;/p&gt;

&lt;p&gt;Building my application assets with ParcelJS worked without errors but when running the application, none of the JS scripts were running.  Upon checking the browser console window, I was getting the following error: &lt;strong&gt;Uncaught (in promise) Error: Cannot find module&lt;/strong&gt;. I never managed to resolve the error but Snowpack appeared in one of the searches.&lt;/p&gt;

&lt;p&gt;After a quick look at the following two paragraphs on Snowpack's landing page, I decided to replace ParcelJS with Snowpack.&lt;/p&gt;

&lt;blockquote&gt;
&lt;blockquote&gt;
&lt;p&gt;Snowpack is a lightning-fast frontend build tool, designed for the modern web. It is an alternative to heavier, more complex bundlers like webpack or Parcel in your development workflow. ...&lt;/p&gt;

&lt;p&gt;Once you try it, it's impossible to go back to anything else.&lt;/p&gt;
&lt;/blockquote&gt;


&lt;/blockquote&gt;

&lt;p&gt;Snowpack builds individual files and caches the built files. Whenever any cached file changes, Snowpack rebuilds that single file alone. Snowpack will also rebuild and cache any new files added to a project. Snowpack reads the application dependencies in your &lt;em&gt;package.json&lt;/em&gt; file, builds and converts them to JavaScript ESM modules where necessary, and places these modules in the build output folder. File bundling is opt-in via Snowpack's plugins.&lt;/p&gt;

&lt;p&gt;In this post, I go through the steps of adding Snowpack to the build and publish process of an ASP.NET Core web application. I create a new application but the process is also applicable to an existing project.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Make sure you have downloaded and installed the &lt;a href="https://dotnet.microsoft.com/download/dotnet-core" rel="noopener noreferrer"&gt;.NET SDK&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;You can verify that .NET is installed by opening a new terminal window and running the command &lt;code&gt;dotnet&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Snowpack v3.0 introduced &lt;a href="https://www.snowpack.dev/guides/streaming-imports" rel="noopener noreferrer"&gt;&lt;strong&gt;Streaming Imports&lt;/strong&gt;&lt;/a&gt; which fetches imported packages on-demand thereby  eliminating  dependency on package managers like npm or yarn. In this post, I will stick with the traditional way of managing packages therefore make sure you have &lt;strong&gt;Node.js&lt;/strong&gt; and &lt;strong&gt;npm&lt;/strong&gt; or &lt;strong&gt;yarn&lt;/strong&gt; installed.&lt;/p&gt;

&lt;h2&gt;
  
  
  Create Web Application
&lt;/h2&gt;

&lt;p&gt;If you have an existing ASP.NET Core web application, you can skip this step. Also replace any mention of  &lt;strong&gt;SnowpackTest&lt;/strong&gt; project with the name of your  project.&lt;/p&gt;

&lt;p&gt;Open a terminal window and run the following command to create a new web application: &lt;code&gt;dotnet new webApp -o SnowpackTest&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Navigate to the project folder by running &lt;code&gt;cd SnowpackTest&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Open the project using your preferred source code editor. If you are using Visual Studio Code (what I will use in this post) then run &lt;code&gt;code .&lt;/code&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Create Snowpack project root
&lt;/h2&gt;

&lt;p&gt;Create a &lt;em&gt;Client&lt;/em&gt; folder (or use your preferred name) in the root of the project. This will serve as the project root for Snowpack.&lt;/p&gt;

&lt;p&gt;Move &lt;em&gt;css&lt;/em&gt; and &lt;em&gt;js&lt;/em&gt; folders from &lt;em&gt;wwwroot&lt;/em&gt; to &lt;em&gt;Client&lt;/em&gt; folder.&lt;/p&gt;

&lt;p&gt;Move &lt;em&gt;favicon.ico&lt;/em&gt; to the root of &lt;em&gt;Client&lt;/em&gt; folder.&lt;/p&gt;

&lt;p&gt;Open VS Code terminal (select &lt;strong&gt;View&lt;/strong&gt;, &lt;strong&gt;Terminal&lt;/strong&gt; from menu). This will open at the project's root folder.&lt;/p&gt;

&lt;p&gt;Change directory to the &lt;em&gt;Client&lt;/em&gt; folder by running &lt;code&gt;cd Client&lt;/code&gt; command.&lt;/p&gt;

&lt;p&gt;Create a default &lt;em&gt;package.json&lt;/em&gt; file by running &lt;code&gt;npm init --yes&lt;/code&gt; from the terminal. &lt;code&gt;--yes&lt;/code&gt; allows the command to generate the file without prompting you for any required data.&lt;/p&gt;

&lt;p&gt;Install dependencies contained within the &lt;em&gt;lib&lt;/em&gt; folder  under &lt;em&gt;wwwroot&lt;/em&gt; by running &lt;code&gt;npm i bootstrap popper.js jquery jquery-validation jquery-validation-unobtrusive&lt;/code&gt; command&lt;/p&gt;

&lt;p&gt;Install Snowpack as a dev dependency &lt;code&gt;npm install --save-dev snowpack@3.0.13&lt;/code&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Re-Create Files in &lt;em&gt;wwwroot/lib&lt;/em&gt; Folder
&lt;/h2&gt;

&lt;p&gt;We will re-create the files contained within the &lt;em&gt;wwwroot/lib&lt;/em&gt; folder under the &lt;em&gt;Client&lt;/em&gt; folder. The files we create are going to reference files located within the &lt;em&gt;node_modules&lt;/em&gt; folder.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create &lt;em&gt;bootstrap.css&lt;/em&gt; file under &lt;em&gt;Client/css&lt;/em&gt; folder. Update the file with the following &lt;code&gt;@import "bootstrap/dist/css/bootstrap.css";&lt;/code&gt; statement.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create &lt;em&gt;bootstrap.js&lt;/em&gt; file under &lt;em&gt;Client/js&lt;/em&gt; folder and insert &lt;code&gt;import bootstrap from 'bootstrap';&lt;/code&gt; to the file.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create &lt;em&gt;jquery.js&lt;/em&gt; file under &lt;em&gt;Client/js&lt;/em&gt; folder and insert &lt;code&gt;import jquery from 'jquery';&lt;/code&gt; to the file.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;I will combine the jQuery validation files but you can create individual separate files. Create &lt;em&gt;jquery-validation.js&lt;/em&gt; file under &lt;em&gt;Client/js&lt;/em&gt; folder. Insert the following two statements to the file &lt;code&gt;import 'jquery-validation';&lt;/code&gt; and &lt;code&gt;import 'jquery-validation-unobtrusive';&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The structure of your &lt;em&gt;Client&lt;/em&gt; folder after completing above steps should resemble the following:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fujka0ggbvbipd4y73o3q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fujka0ggbvbipd4y73o3q.png" alt="Alt "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After the above steps, you can delete &lt;em&gt;wwwroot&lt;/em&gt; folder. We will configure Snowpack to re-create the folder for us during the build process.&lt;/p&gt;

&lt;h2&gt;
  
  
  Configure Snowpack
&lt;/h2&gt;

&lt;p&gt;Snowpack supports the following configuration files:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;package.json&lt;/em&gt; - the file should contain a Snowpack config object (&lt;code&gt;"snowpack": {...}&lt;/code&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;snowpack.config.js&lt;/em&gt; - this is a Javascript file exporting a Snowpack config object (&lt;code&gt;module.exports ={ ... }&lt;/code&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;snowpack.config.json&lt;/em&gt; - A JSON file containing config (&lt;code&gt;{ ... }&lt;/code&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
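&lt;p&gt;As an illustration of the second option, the same Snowpack settings this tutorial adds to &lt;em&gt;package.json&lt;/em&gt; could live in a standalone &lt;em&gt;snowpack.config.js&lt;/em&gt; file (a sketch using the same &lt;code&gt;exclude&lt;/code&gt; and &lt;code&gt;out&lt;/code&gt; values):&lt;/p&gt;

```javascript
// snowpack.config.js -- equivalent of the "snowpack" object in package.json
module.exports = {
  // Keep config files in the Snowpack project root out of the build.
  exclude: ["*.js", "*.json"],
  buildOptions: {
    // Write the final build into the ASP.NET Core static files folder.
    out: "../wwwroot/dist",
  },
};
```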

&lt;p&gt;You can also use a command-line interface (CLI) flag to provide a different config file (&lt;code&gt;snowpack --config ./path/to/snowpack.deploy.json&lt;/code&gt;). CLI flags take precedence over other configuration file settings.&lt;/p&gt;

&lt;p&gt;Open &lt;em&gt;package.json&lt;/em&gt; located within &lt;em&gt;Client&lt;/em&gt; folder and add a Snowpack config object above the &lt;code&gt;dependencies&lt;/code&gt; config object or anywhere within the file.&lt;/p&gt;

&lt;p&gt;Update the config object with the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="nl"&gt;"scripts"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"build-snowpack"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"snowpack build"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="err"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nl"&gt;"snowpack"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"exclude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"*.js"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="s2"&gt;"*.json"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"buildOptions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"out"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"../wwwroot/dist"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="err"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We want to exclude any configuration files located within the Snowpack project root from the build process by using the &lt;code&gt;exclude&lt;/code&gt; directive. By default, Snowpack excludes files within &lt;em&gt;node_modules&lt;/em&gt; folder.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;out&lt;/code&gt; is the local directory where Snowpack will output our final build.&lt;/p&gt;

&lt;h2&gt;
  
  
  Test Snowpack Build
&lt;/h2&gt;

&lt;p&gt;Open VS Code terminal if not already open.&lt;/p&gt;

&lt;p&gt;If the terminal opens within the web application root folder, run the &lt;code&gt;npm run build-snowpack --prefix ./Client&lt;/code&gt; command. &lt;code&gt;--prefix&lt;/code&gt; points &lt;strong&gt;npm&lt;/strong&gt; to the &lt;em&gt;package.json&lt;/em&gt; file location.&lt;/p&gt;

&lt;p&gt;If the terminal is in the &lt;em&gt;Client&lt;/em&gt; folder, run the &lt;code&gt;npm run build-snowpack&lt;/code&gt; command.&lt;/p&gt;

&lt;p&gt;After the command completes, you get an output similar to the following:  &lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[snowpack] ! building source files...    
[snowpack] ✔ build complete [0.08s]    
[snowpack] ! building dependencies...    
[snowpack] ✔ dependencies ready! [2.83s]    
[snowpack] ! verifying build...    
[snowpack] ✔ verification complete [0.00s]    
[snowpack] ! writing build to disk...    
[snowpack] watching for changes...    
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The &lt;em&gt;wwwroot&lt;/em&gt; folder will be re-created at the root of the project.  CSS and JS files within &lt;em&gt;Client&lt;/em&gt; folder will be copied to the &lt;em&gt;wwwroot/dist&lt;/em&gt; folder under respective folders.&lt;/p&gt;

&lt;p&gt;Snowpack will also create a &lt;em&gt;_snowpack&lt;/em&gt; folder within &lt;em&gt;wwwroot/dist&lt;/em&gt; folder. This is the folder for Snowpack metadata. You can set a different name for this folder by setting the value under &lt;a href="https://www.snowpack.dev/reference/configuration#buildoptions.metaurlpath" rel="noopener noreferrer"&gt;buildOptions.metaUrlPath&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7ly0yfovbvoyas8ymff.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7ly0yfovbvoyas8ymff.png" alt="Alt "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Any CSS and JS files within the &lt;em&gt;wwwroot/dist&lt;/em&gt; folder that reference packages from &lt;em&gt;node_modules&lt;/em&gt; get updated during the build process to point to the &lt;em&gt;_snowpack&lt;/em&gt; folder. For example, &lt;em&gt;Client/css/bootstrap.css&lt;/em&gt; contains &lt;code&gt;@import "bootstrap/dist/css/bootstrap.css";&lt;/code&gt;. After the build process, &lt;em&gt;wwwroot/dist/css/bootstrap.css&lt;/em&gt; will contain &lt;code&gt;import "../_snowpack/pkg/bootstrap/dist/css/bootstrap.css";&lt;/code&gt;.&lt;/p&gt;
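&lt;p&gt;To illustrate the idea, here is a toy sketch of the rewriting rule (not Snowpack's actual implementation): bare package specifiers are redirected into the &lt;em&gt;_snowpack/pkg&lt;/em&gt; folder, while relative imports are left untouched.&lt;/p&gt;

```javascript
// Toy sketch of the import rewriting Snowpack performs during the build:
// bare package specifiers become relative paths into the _snowpack folder.
function rewriteImport(specifier) {
  const isBare = !specifier.startsWith('.') && !specifier.startsWith('/');
  return isBare ? `../_snowpack/pkg/${specifier}` : specifier;
}

console.log(rewriteImport('bootstrap/dist/css/bootstrap.css'));
// → "../_snowpack/pkg/bootstrap/dist/css/bootstrap.css"
console.log(rewriteImport('./site.css')); // relative imports are left alone
```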

&lt;h2&gt;
  
  
  Update JS and CSS File References
&lt;/h2&gt;

&lt;p&gt;Open the &lt;em&gt;_Layout.cshtml&lt;/em&gt; file located within the &lt;em&gt;Pages/Shared&lt;/em&gt; folder. Replace the CSS link references with the following:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;link rel="shortcut icon" href="~/dist/favicon.ico"&amp;gt;
&amp;lt;link rel="stylesheet" href="~/dist/css/bootstrap.css" /&amp;gt;
&amp;lt;link rel="stylesheet" href="~/dist/css/site.css" /&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;At the bottom of the file, update the script references with the following:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;script type="module" src="~/dist/js/jquery.js"&amp;gt; &amp;lt;/script&amp;gt;
&amp;lt;script type="module" src="~/dist/js/bootstrap.js"&amp;gt;&amp;lt;/script&amp;gt;
&amp;lt;script type="module" src="~/dist/js/site.js"&amp;gt;&amp;lt;/script&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Open &lt;code&gt;_ValidationScriptsPartial.cshtml&lt;/code&gt; within the &lt;em&gt;Pages/Shared&lt;/em&gt; folder and replace the contents with &lt;code&gt;&amp;lt;script type="module" src="~/dist/js/jquery-validation.js"&amp;gt;&amp;lt;/script&amp;gt;&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;By declaring &lt;code&gt;type="module"&lt;/code&gt; on our script tags, we tell the browser that we are using JavaScript's ES Modules (ESM). Failing to add &lt;code&gt;type="module"&lt;/code&gt; to the scripts will produce an &lt;strong&gt;Uncaught SyntaxError: Cannot use import statement outside a module&lt;/strong&gt; error in the browser console when the application runs.&lt;/p&gt;
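&lt;p&gt;You can reproduce the parser behaviour behind that error in plain JavaScript: a static &lt;code&gt;import&lt;/code&gt; declaration is only valid in a module, so a classic-script parser (simulated here with the &lt;code&gt;Function&lt;/code&gt; constructor) rejects it.&lt;/p&gt;

```javascript
// A static import declaration is module-only syntax. Parsing it as a
// classic script (which is what the Function constructor does) throws
// "SyntaxError: Cannot use import statement outside a module".
let parsedAsClassicScript = true;
try {
  new Function('import "./dist/js/site.js";');
} catch (err) {
  parsedAsClassicScript = false; // the SyntaxError lands here
}
console.log(parsedAsClassicScript); // false
```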

&lt;h2&gt;
  
  
  Configure Application Build and Changes Watch
&lt;/h2&gt;

&lt;p&gt;Before we start watching the Snowpack build process, let's add a watch command to our &lt;em&gt;package.json&lt;/em&gt; file. Open the &lt;em&gt;./Client/package.json&lt;/em&gt; file and append &lt;code&gt;"watch-snowpack": "snowpack build --watch"&lt;/code&gt; to the scripts object. &lt;code&gt;--watch&lt;/code&gt; ensures that the build process does not complete and exit but continues watching for any file changes that Snowpack needs to build.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="nl"&gt;"scripts"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"build-snowpack"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"snowpack build"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
   &lt;/span&gt;&lt;span class="nl"&gt;"watch-snowpack"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"snowpack build --watch"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="err"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Open the VS Code integrated terminal if it is not already open. Make sure you are at the root of the project. If you are in the &lt;em&gt;Client&lt;/em&gt; folder, run the &lt;code&gt;cd ..&lt;/code&gt; command to return to the project root folder.&lt;/p&gt;

&lt;p&gt;Run &lt;code&gt;dotnet watch run&lt;/code&gt; command. This will build the project and continue watching for any changes to the files.&lt;/p&gt;

&lt;p&gt;Split the terminal window by right-clicking the terminal window and selecting &lt;strong&gt;Split&lt;/strong&gt; from the context menu.&lt;/p&gt;

&lt;p&gt;On the new terminal tab, run &lt;code&gt;npm run watch-snowpack --prefix ./Client&lt;/code&gt; command. Snowpack will build all the files and continue monitoring files for any changes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxv8ah5lfadf9fzsdtepd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxv8ah5lfadf9fzsdtepd.png" alt="Alt "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Test Snowpack Build and Watch
&lt;/h2&gt;

&lt;p&gt;In a web browser, navigate to &lt;strong&gt;&lt;a href="https://localhost:5001/" rel="noopener noreferrer"&gt;https://localhost:5001/&lt;/a&gt;&lt;/strong&gt; to display the default welcome page.&lt;/p&gt;

&lt;p&gt;Open &lt;em&gt;Pages/Index.cshtml&lt;/em&gt; in the code editor and replace &lt;code&gt;&amp;lt;p&amp;gt;Learn about &amp;lt;a href="https://docs.microsoft.com/aspnet/core"&amp;gt;building Web apps with ASP.NET Core&amp;lt;/a&amp;gt;.&amp;lt;/p&amp;gt;&lt;/code&gt; with the following&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;div&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"card w-50 mx-auto"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
   &lt;span class="nt"&gt;&amp;lt;div&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"card-body"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="nt"&gt;&amp;lt;h5&lt;/span&gt; &lt;span class="na"&gt;id=&lt;/span&gt;&lt;span class="s"&gt;"name"&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"card-title"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;Name&lt;span class="nt"&gt;&amp;lt;/h5&amp;gt;&lt;/span&gt;
        &lt;span class="nt"&gt;&amp;lt;div&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"card-text"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;Email: &lt;span class="nt"&gt;&amp;lt;span&lt;/span&gt; &lt;span class="na"&gt;id=&lt;/span&gt;&lt;span class="s"&gt;"email"&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"small font-weight-bold"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&amp;lt;/span&amp;gt;&amp;lt;/div&amp;gt;&lt;/span&gt;
        &lt;span class="nt"&gt;&amp;lt;div&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"card-text"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;phone: &lt;span class="nt"&gt;&amp;lt;span&lt;/span&gt; &lt;span class="na"&gt;id=&lt;/span&gt;&lt;span class="s"&gt;"phone"&lt;/span&gt;  &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"small font-weight-bold"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&amp;lt;/span&amp;gt;&amp;lt;/div&amp;gt;&lt;/span&gt;
        &lt;span class="nt"&gt;&amp;lt;div&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"card-text"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;Website: &lt;span class="nt"&gt;&amp;lt;span&lt;/span&gt; &lt;span class="na"&gt;id=&lt;/span&gt;&lt;span class="s"&gt;"website"&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"small font-weight-bold"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&amp;lt;/span&amp;gt;&amp;lt;/div&amp;gt;&lt;/span&gt;
        &lt;span class="nt"&gt;&amp;lt;button&lt;/span&gt; &lt;span class="na"&gt;id=&lt;/span&gt;&lt;span class="s"&gt;"fetchUser"&lt;/span&gt; &lt;span class="na"&gt;class=&lt;/span&gt;&lt;span class="s"&gt;"btn btn-primary mt-5"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;Fetch User&lt;span class="nt"&gt;&amp;lt;/button&amp;gt;&lt;/span&gt;
      &lt;span class="nt"&gt;&amp;lt;/div&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/div&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Save the file and .NET should rebuild the project for you.&lt;/p&gt;

&lt;p&gt;We are going to fetch dummy random users from &lt;strong&gt;&lt;a href="https://jsonplaceholder.typicode.com" rel="noopener noreferrer"&gt;https://jsonplaceholder.typicode.com&lt;/a&gt;&lt;/strong&gt; and display them on the homepage.&lt;/p&gt;

&lt;p&gt;Open &lt;em&gt;Client/site.js&lt;/em&gt; and update the file with the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;fetchUser&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getElementById&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;fetchUser&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
 &lt;span class="nx"&gt;fetchUser&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nf"&gt;addEventListener&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;click&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;userId&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;floor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;random&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`https://jsonplaceholder.typicode.com/users/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;userId&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ok&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;user&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
      &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;user&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getElementById&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;name&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nx"&gt;innerHTML&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;user&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getElementById&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;email&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nx"&gt;innerHTML&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;user&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;email&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getElementById&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;phone&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nx"&gt;innerHTML&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;user&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;phone&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getElementById&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;website&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nx"&gt;innerHTML&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;user&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;website&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
 &lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When you save the changes, you will observe that Snowpack displays a &lt;code&gt;[snowpack] File changed...&lt;/code&gt; message in the terminal.&lt;/p&gt;

&lt;p&gt;Refresh the page in the browser and the default content is replaced with the user card showing blank user data.&lt;/p&gt;

&lt;p&gt;If you click on the &lt;strong&gt;Fetch User&lt;/strong&gt; button, the homepage should now display data for a random user.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fekhm5hgwa7gjvp2nr4zm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fekhm5hgwa7gjvp2nr4zm.png" alt="Alt "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Creating a new JS or CSS file within the &lt;em&gt;Client&lt;/em&gt; folder should trigger a build, with the new file added to the &lt;em&gt;wwwroot&lt;/em&gt; folder. Renaming a file will, however, leave the previous file within the &lt;em&gt;wwwroot&lt;/em&gt; folder. This should not be an issue since we will perform a clean build when publishing the project.&lt;/p&gt;

&lt;p&gt;Kill all terminal tabs by right-clicking on each open terminal tab and selecting &lt;strong&gt;Kill Terminal&lt;/strong&gt; from the context menu.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prepare Project for Publishing
&lt;/h2&gt;

&lt;p&gt;The project publishing process will involve the following three tasks:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Embed the Snowpack build process in the .NET publishing process&lt;/li&gt;
&lt;li&gt;Minify our CSS and JS files&lt;/li&gt;
&lt;li&gt;Remove unused CSS classes&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  1. Embed the Snowpack Build Process in the .NET Publishing Process
&lt;/h3&gt;

&lt;p&gt;Open the project file &lt;em&gt;SnowpackTest.csproj&lt;/em&gt; at the root of the project and update as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;Project&lt;/span&gt; &lt;span class="na"&gt;Sdk=&lt;/span&gt;&lt;span class="s"&gt;"Microsoft.NET.Sdk.Web"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;

  &lt;span class="nt"&gt;&amp;lt;PropertyGroup&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;TargetFramework&amp;gt;&lt;/span&gt;net5.0&lt;span class="nt"&gt;&amp;lt;/TargetFramework&amp;gt;&lt;/span&gt;

    &lt;span class="nt"&gt;&amp;lt;ClientRoot&amp;gt;&lt;/span&gt;$(ProjectDir)Client\&lt;span class="nt"&gt;&amp;lt;/ClientRoot&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;DefaultItemExcludes&amp;gt;&lt;/span&gt;$(DefaultItemExcludes);$(ClientRoot)/*.*;$(ProjectDir)*.Development.json&lt;span class="nt"&gt;&amp;lt;/DefaultItemExcludes&amp;gt;&lt;/span&gt;

  &lt;span class="nt"&gt;&amp;lt;/PropertyGroup&amp;gt;&lt;/span&gt;

  &lt;span class="nt"&gt;&amp;lt;Target&lt;/span&gt; &lt;span class="na"&gt;Name=&lt;/span&gt;&lt;span class="s"&gt;"NpmRunPublish"&lt;/span&gt; &lt;span class="na"&gt;BeforeTargets=&lt;/span&gt;&lt;span class="s"&gt;"Publish"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;Exec&lt;/span&gt; &lt;span class="na"&gt;WorkingDirectory=&lt;/span&gt;&lt;span class="s"&gt;"$(ProjectDir)"&lt;/span&gt; &lt;span class="na"&gt;Command=&lt;/span&gt;&lt;span class="s"&gt;"npm run publish --prefix ./Client"&lt;/span&gt; &lt;span class="nt"&gt;/&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;/Target&amp;gt;&lt;/span&gt;

&lt;span class="nt"&gt;&amp;lt;/Project&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;&amp;lt;ClientRoot&amp;gt;$(ProjectDir)Client\&amp;lt;/ClientRoot&amp;gt;&lt;/code&gt; defines a property and value that will be referenced later in the project file.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$(ProjectDir)&lt;/code&gt; and &lt;code&gt;$(DefaultItemExcludes)&lt;/code&gt; are &lt;a href="https://docs.microsoft.com/en-us/cpp/build/reference/common-macros-for-build-commands-and-properties" rel="noopener noreferrer"&gt;predefined MSBuild properties&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;DefaultItemExcludes&lt;/code&gt; defines patterns for files that MSBuild should exclude from the build process.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$(ClientRoot)&lt;/code&gt; references the property created earlier within the project file.&lt;/p&gt;

&lt;p&gt;Within &lt;em&gt;Client/package.json&lt;/em&gt;, add a &lt;code&gt;publish&lt;/code&gt; command under the &lt;strong&gt;scripts&lt;/strong&gt; object.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="nl"&gt;"scripts"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"build-snowpack"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"snowpack build"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"watch-snowpack"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"snowpack build --watch"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"publish"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"NODE_ENV=production snowpack build --config snowpack.publish.json"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="err"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;publish&lt;/code&gt; uses the &lt;code&gt;--config&lt;/code&gt; CLI flag to tell Snowpack which configuration file (&lt;em&gt;snowpack.publish.json&lt;/em&gt;) to use.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;NODE_ENV=production&lt;/code&gt; signals to Node.js tooling that we are running in a production rather than a development environment, which is the default.&lt;/p&gt;
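&lt;p&gt;A minimal sketch of the convention: &lt;code&gt;NODE_ENV&lt;/code&gt; is an ordinary environment variable read via &lt;code&gt;process.env&lt;/code&gt;, and tooling typically treats any value other than &lt;code&gt;production&lt;/code&gt; as development.&lt;/p&gt;

```javascript
// NODE_ENV is a plain environment variable; tools conventionally treat
// anything other than 'production' as development.
function isProduction(env) {
  return env === 'production';
}

console.log(isProduction('production')); // true
console.log(isProduction(undefined));    // false (development by default)
```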

&lt;h3&gt;
  
  
  2. Minify CSS and JS Files
&lt;/h3&gt;

&lt;p&gt;Create &lt;em&gt;Client/snowpack.publish.json&lt;/em&gt; file and update with the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"exclude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"*.js"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"*.json"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"plugins"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"@snowpack/plugin-optimize"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"target"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"es2020"&lt;/span&gt;&lt;span class="p"&gt;}]&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"buildOptions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"out"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"../publish/wwwroot/dist"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can customize the Snowpack build process using &lt;a href="https://www.snowpack.dev/plugins" rel="noopener noreferrer"&gt;build plugins&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Open the integrated terminal and install the Snowpack optimize plugin using the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;--save-dev&lt;/span&gt; &lt;span class="nt"&gt;--prefix&lt;/span&gt; ./Client @snowpack/plugin-optimize
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The optimize plugin will minify the CSS and JS files.&lt;/p&gt;
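&lt;p&gt;To give a sense of what minification involves, here is a toy CSS-minifying sketch; the optimize plugin is far more thorough, so treat this as an illustration only.&lt;/p&gt;

```javascript
// Toy illustration of CSS minification (the optimize plugin does much more):
// strip comments, collapse whitespace, and trim around punctuation.
function minifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')   // drop comments
    .replace(/\s+/g, ' ')               // collapse whitespace runs
    .replace(/\s*([{}:;,])\s*/g, '$1')  // trim around punctuation
    .trim();
}

console.log(minifyCss('/* site styles */ .card { color : red ; }'));
// → ".card{color:red;}"
```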

&lt;h3&gt;
  
  
  3. Remove Unused CSS Classes
&lt;/h3&gt;

&lt;p&gt;Removing unused CSS class names will reduce the size of our CSS files.&lt;/p&gt;

&lt;p&gt;Run the following command from the integrated terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;--save-dev&lt;/span&gt; &lt;span class="nt"&gt;--prefix&lt;/span&gt; ./Client @snowpack/plugin-postcss postcss postcss-cli @fullhuman/postcss-purgecss
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you are running the command from the &lt;em&gt;Client&lt;/em&gt; folder, omit &lt;code&gt;--prefix ./Client&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Update the &lt;em&gt;Client/snowpack.publish.json&lt;/em&gt; file to include the Snowpack PostCSS plugin.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"exclude"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"*.js"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"*.json"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"plugins"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"@snowpack/plugin-postcss"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"@snowpack/plugin-optimize"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"target"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"es2020"&lt;/span&gt;&lt;span class="p"&gt;}]&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"buildOptions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"out"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"../publish/wwwroot/dist"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Configure PostCSS to remove any unused CSS classes from our CSS files.&lt;/p&gt;

&lt;p&gt;Create &lt;em&gt;Client/postcss.config.js&lt;/em&gt; file and update with the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;purgecss&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@fullhuman/postcss-purgecss&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)({&lt;/span&gt;
  &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;../Pages/**/*.cshtml&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;./**/*.js&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="na"&gt;css&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;../publish/**/*.css&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="nx"&gt;module&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;exports&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;plugins&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="p"&gt;...&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;NODE_ENV&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;production&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;purgecss&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
  &lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;content&lt;/code&gt; refers to the location of files containing CSS class names while &lt;code&gt;css&lt;/code&gt; is the location of the CSS files. We have also configured PurgeCSS to run when the process environment is set to production.&lt;/p&gt;
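&lt;p&gt;Conceptually, PurgeCSS scans the content files for class names and drops CSS rules whose classes never appear. A toy sketch of that idea (not the actual PurgeCSS algorithm):&lt;/p&gt;

```javascript
// Toy sketch of the PurgeCSS idea: keep a class rule only if its class
// name occurs somewhere in the scanned content. Not the real algorithm.
function purge(rules, contentText) {
  return rules.filter(({ selector }) => {
    const match = selector.match(/^\.([\w-]+)/);     // leading class selector
    return !match || contentText.includes(match[1]); // keep non-class rules
  });
}

const rules = [
  { selector: '.card', body: 'border:1px solid' },
  { selector: '.never-used', body: 'color:red' },
];
const content = '<div class="card w-50 mx-auto">...</div>';
console.log(purge(rules, content).map(r => r.selector)); // → [ '.card' ]
```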

&lt;h2&gt;
  
  
  Publish .NET Project
&lt;/h2&gt;

&lt;p&gt;To publish the application, run &lt;code&gt;dotnet publish -c Release -o ./publish&lt;/code&gt; from the terminal window. Make sure you are running the command from the project root folder.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foci0mb594d4vc51xynhf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foci0mb594d4vc51xynhf.png" alt="Alt "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you open &lt;em&gt;site.css&lt;/em&gt; and &lt;em&gt;site.js&lt;/em&gt; within the &lt;em&gt;publish&lt;/em&gt; folder, you will observe that the files are now minified.&lt;/p&gt;

&lt;p&gt;Try adding some unused CSS classes within &lt;em&gt;Client/css/site.css&lt;/em&gt; and then publishing the .NET project. You will observe that those unused CSS classes are missing from the published CSS file.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This post addressed one way of adding the Snowpack build tool to an ASP.NET Core project. I relocated the files within the default &lt;em&gt;wwwroot&lt;/em&gt; folder to a different folder and let Snowpack re-create the &lt;em&gt;wwwroot&lt;/em&gt; folder during the build process. Using Snowpack plugins, I minified the JS and CSS files and removed unused CSS classes. Modifying the .NET project file made it possible to embed Snowpack's build process in the .NET project publishing process.&lt;/p&gt;

&lt;h2&gt;
  
  
  Snowpack's Origin
&lt;/h2&gt;

&lt;p&gt;You can watch &lt;a href="https://www.youtube.com/watch?v=ZQZmPooJ4cA" rel="noopener noreferrer"&gt;Snowpack with Fred K. Schott&lt;/a&gt; or listen to the &lt;a href="https://www.contributor.fyi/snowpack" rel="noopener noreferrer"&gt;podcast&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;That's all for this post.&lt;/p&gt;

</description>
      <category>snowpack</category>
      <category>aspnetcore</category>
      <category>webdev</category>
      <category>tooling</category>
    </item>
    <item>
      <title>The Data Directory Contains an Old postmaster.pid File</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Wed, 27 Jan 2021 17:39:51 +0000</pubDate>
      <link>https://dev.to/kagundajm/the-data-directory-contains-an-old-postmaster-pid-file-16a6</link>
      <guid>https://dev.to/kagundajm/the-data-directory-contains-an-old-postmaster-pid-file-16a6</guid>
      <description>&lt;h2&gt;
  
  
  PostgreSQL Connection Failure
&lt;/h2&gt;

&lt;p&gt;Sometimes, computers have a life of their own. You shut down your laptop and on the next boot, you are unable to connect to a PostgreSQL 12 database.&lt;/p&gt;

&lt;p&gt;Opening the installed &lt;a href="https://github.com/PostgresApp/PostgresApp"&gt;Postgres.app&lt;/a&gt;, I notice it displays a &lt;code&gt;stale postmaster.pid file&lt;/code&gt; error.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HVnX-y-q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/3j2o4vv64y02bcwa748x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HVnX-y-q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/3j2o4vv64y02bcwa748x.png" alt='Alt="stale postmaster-pid file"'&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Postmaster.pid
&lt;/h2&gt;

&lt;p&gt;What the heck is the &lt;em&gt;postmaster.pid&lt;/em&gt; file? Turns out that the &lt;em&gt;postmaster.pid&lt;/em&gt; is a &lt;a href="https://www.postgresql.org/docs/current/server-start.html"&gt;lock file used to prevent two instances of the same PostgreSQL server from running on the same data-directory&lt;/a&gt;.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;While the server is running, its PID is stored in the file postmaster.pid in the data directory. This is used to prevent multiple server instances from running in the same data directory and can also be used for shutting down the server.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This is the first time I am encountering such an error. I click on the &lt;strong&gt;Start&lt;/strong&gt; button to force a restart and get back a dialog with the following message: &lt;strong&gt;The data directory contains a postmaster.pid file, which usually means that the server is already running. When the server crashes or is killed, you have to remove this file before you can restart the server. Make sure that the database process is definitely not running anymore, otherwise your data directory will be corrupted.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3bmP_ULY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/y8z40p1tz5bqq3xgannm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3bmP_ULY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/y8z40p1tz5bqq3xgannm.png" alt='alt="The data directory contains old postmaster.pid  file"'&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Click the &lt;strong&gt;OK&lt;/strong&gt; button to close the dialog window.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resolving the Problem
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Open &lt;strong&gt;Postgres.app&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Click on &lt;strong&gt;Server Settings...&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Click the &lt;strong&gt;Show&lt;/strong&gt; button next to the &lt;strong&gt;Data Directory&lt;/strong&gt;. This should open the data directory of your PostgreSQL installation.
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--A5D7v8Sf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/h108dlf203gggzv9vksg.png" alt='alt="postgres.app server settings"'&gt;
&lt;/li&gt;
&lt;li&gt;Locate the &lt;em&gt;postmaster.pid&lt;/em&gt; file
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2-YnRu23--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/iwo0dt2kpvwp2ei3rqgp.png" alt='alt="postgresql data directory folder"'&gt;
&lt;/li&gt;
&lt;li&gt;Delete the &lt;em&gt;postmaster.pid&lt;/em&gt; file. Right click on the file and select &lt;strong&gt;Move to Bin&lt;/strong&gt;
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EYJLM-CD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/kuavf8ujo24mijmwrk6h.png" alt='alt="postgresql delete postmaster.pid file"'&gt;
&lt;/li&gt;
&lt;li&gt;After deleting the file, opening the Postgres GUI app window or giving it focus will change the error message from &lt;code&gt;stale postmaster.pid file&lt;/code&gt; to &lt;code&gt;Not running&lt;/code&gt;. Click the &lt;strong&gt;Start&lt;/strong&gt; button in the Postgres GUI app to start the PostgreSQL server.
&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RDpTFGJA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/matkqku1jw161n452dey.png" alt='alt="postgresql server  not running error"'&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Way Forward
&lt;/h2&gt;

&lt;p&gt;Why not let PostgreSQL automatically remove a stale &lt;em&gt;postmaster.pid&lt;/em&gt; file? This would let a user manage a PostgreSQL database without knowing its internal workings. However, there are pros and cons to &lt;a href="https://github.com/PostgresApp/PostgresApp/issues/573"&gt;auto-removing a stale &lt;em&gt;postmaster.pid&lt;/em&gt;&lt;/a&gt;, and balancing the two is not easy. Therefore, as of this writing, automatic removal of the file remains an open issue and the user's responsibility.&lt;/p&gt;

&lt;h2&gt;
  
  
  Further Reading
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/PostgresApp/PostgresApp/issues/395"&gt;What's a data directory and postmaster.pid issue #395&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://postgresapp.com/documentation/troubleshooting.html"&gt;Troubleshooting &amp;amp; Support&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://rhaas.blogspot.com/2020/05/dont-manually-modify-postgresql-data.html"&gt;Don't Manually Modify The PostgreSQL Data Directory!&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.postgresql.org/docs/current/server-start.html"&gt;Starting the Database Server&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>postgres</category>
      <category>database</category>
      <category>errors</category>
    </item>
    <item>
      <title>How To Alter a Column Used By A View or Rule</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Wed, 13 Jan 2021 13:03:53 +0000</pubDate>
      <link>https://dev.to/kagundajm/how-to-alter-a-column-used-by-a-view-or-rule-58bp</link>
      <guid>https://dev.to/kagundajm/how-to-alter-a-column-used-by-a-view-or-rule-58bp</guid>
      <description>&lt;p&gt;In PostgreSQL, assume you have a table and view with the following definitions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;boq_items&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;character&lt;/span&gt; &lt;span class="nb"&gt;varying&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;22&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;item_no&lt;/span&gt; &lt;span class="nb"&gt;character&lt;/span&gt; &lt;span class="nb"&gt;varying&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;activity_name&lt;/span&gt; &lt;span class="nb"&gt;character&lt;/span&gt; &lt;span class="nb"&gt;varying&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;page_no&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;qty&lt;/span&gt; &lt;span class="nb"&gt;numeric&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;14&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;rate&lt;/span&gt; &lt;span class="nb"&gt;numeric&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;14&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;bq_amt&lt;/span&gt; &lt;span class="nb"&gt;numeric&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;14&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;VIEW&lt;/span&gt; &lt;span class="n"&gt;vw_boq_item_names&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;activity_name&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;boq_items&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Attempting to change the definition of the &lt;code&gt;activity_name&lt;/code&gt; column using &lt;code&gt;ALTER TABLE boq_items ALTER activity_name TYPE text, ALTER activity_name SET NOT NULL;&lt;/code&gt; will return a &lt;code&gt;cannot alter type of a column used by a view or rule. DETAIL: rule _RETURN on view vw_boq_item_names depends on column "activity_name"&lt;/code&gt; error. PostgreSQL will throw the same error if you attempt to change any column definition of the table.&lt;/p&gt;

&lt;p&gt;PostgreSQL allows running &lt;a href="https://wiki.postgresql.org/wiki/Transactional_DDL_in_PostgreSQL:_A_Competitive_Analysis"&gt;DDL statements in a transaction&lt;/a&gt;. To resolve the error, we need to drop the view, run the ALTER statement and recreate the view, enclosing these statements in a transaction.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;BEGIN&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="k"&gt;DROP&lt;/span&gt; &lt;span class="k"&gt;VIEW&lt;/span&gt; &lt;span class="n"&gt;vw_boq_item_names&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;boq_items&lt;/span&gt;
    &lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="n"&gt;activity_name&lt;/span&gt; &lt;span class="k"&gt;TYPE&lt;/span&gt; &lt;span class="nb"&gt;text&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="k"&gt;ALTER&lt;/span&gt; &lt;span class="n"&gt;activity_name&lt;/span&gt; &lt;span class="k"&gt;SET&lt;/span&gt; &lt;span class="k"&gt;NOT&lt;/span&gt; &lt;span class="k"&gt;NULL&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;VIEW&lt;/span&gt; &lt;span class="n"&gt;vw_boq_item_names&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt;
    &lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;activity_name&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;boq_items&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;COMMIT&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you are using the &lt;a href="https://fluentmigrator.github.io/"&gt;FluentMigrator&lt;/a&gt; database migration framework, exclude the &lt;code&gt;BEGIN;&lt;/code&gt; and &lt;code&gt;COMMIT;&lt;/code&gt; statements; otherwise the migration will fail with an error.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;override&lt;/span&gt; &lt;span class="k"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;Up&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;sql&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"DROP VIEW vw_boq_item_names;"&lt;/span&gt;  &lt;span class="p"&gt;+&lt;/span&gt;

            &lt;span class="s"&gt;"ALTER TABLE boq_items "&lt;/span&gt; &lt;span class="p"&gt;+&lt;/span&gt;
              &lt;span class="s"&gt;"ALTER activity_name TYPE text, "&lt;/span&gt; &lt;span class="p"&gt;+&lt;/span&gt;
              &lt;span class="s"&gt;"ALTER activity_name SET NOT NULL;"&lt;/span&gt; &lt;span class="p"&gt;+&lt;/span&gt;

            &lt;span class="s"&gt;"CREATE VIEW vw_boq_item_names AS "&lt;/span&gt; &lt;span class="p"&gt;+&lt;/span&gt;
                &lt;span class="s"&gt;"SELECT activity_name FROM boq_items;"&lt;/span&gt;

    &lt;span class="n"&gt;Execute&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Sql&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sql&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In situations where a table has a lot of dependencies, or an object has cascading dependencies, &lt;a href="https://stackoverflow.com/questions/3243863/problem-with-postgres-alter-table/49000321#answer-49000321"&gt;a solution&lt;/a&gt; would be to create two functions - one to save the dependencies and the other to restore these dependencies. Each of these functions will require parameters for the schema and the object in the schema which has dependencies. Before changing the table and column definitions, call the function to save and drop the dependencies, make the definition changes and finally call the function to restore the dependencies.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="n"&gt;util&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;deps_save_and_drop_dependencies&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'mdm'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'global_item_master_swap'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;alter&lt;/span&gt; &lt;span class="k"&gt;table&lt;/span&gt; &lt;span class="n"&gt;mdm&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;global_item_master_swap&lt;/span&gt;
&lt;span class="k"&gt;alter&lt;/span&gt; &lt;span class="k"&gt;column&lt;/span&gt; &lt;span class="n"&gt;prod_id&lt;/span&gt; &lt;span class="k"&gt;type&lt;/span&gt; &lt;span class="nb"&gt;varchar&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;128&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
&lt;span class="k"&gt;alter&lt;/span&gt; &lt;span class="k"&gt;column&lt;/span&gt; &lt;span class="n"&gt;prod_nme&lt;/span&gt; &lt;span class="k"&gt;type&lt;/span&gt; &lt;span class="nb"&gt;varchar&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;512&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;select&lt;/span&gt; &lt;span class="n"&gt;util&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;deps_restore_dependencies&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s1"&gt;'mdm'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'global_item_master_swap'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://pretius.com/postgresql-stop-worrying-about-table-and-view-dependencies/"&gt;Stop worrying about table and view dependencies in PostgreSQL&lt;/a&gt; details how these functions work. The post includes links to an &lt;a href="http://sqlfiddle.com/#!15/e1e32/1"&gt;sql fiddle&lt;/a&gt; and a up to date &lt;a href="https://gist.github.com/mateuszwenus/11187288"&gt;gist&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I hope this will be helpful to anyone encountering the &lt;code&gt;cannot alter type of a column used by a view or rule&lt;/code&gt; error in PostgreSQL while updating table column definitions.&lt;/p&gt;

</description>
      <category>postgres</category>
      <category>database</category>
      <category>csharp</category>
      <category>fluentmigrator</category>
    </item>
    <item>
      <title>How to Drop All Tables in PostgreSQL Database</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Mon, 19 Oct 2020 20:24:36 +0000</pubDate>
      <link>https://dev.to/kagundajm/how-to-drop-all-tables-in-postgresql-database-2fbc</link>
      <guid>https://dev.to/kagundajm/how-to-drop-all-tables-in-postgresql-database-2fbc</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;The &lt;code&gt;DROP TABLE&lt;/code&gt; command in PostgreSQL removes a table definition, all data and indexes of a table from a database. &lt;code&gt;DROP TABLE&lt;/code&gt; will fail if the table has other objects that depend on it, like views and foreign key definitions. The command will also fail and display a &lt;strong&gt;table does not exist&lt;/strong&gt; message if the table being dropped does not exist. PostgreSQL does not have a drop-all-tables command, so you have to define your own way of performing this task.&lt;/p&gt;

&lt;p&gt;In this post, I explain how to drop one or multiple tables from a PostgreSQL database. Finally, I end the post with one approach that  can be used to drop all tables from a database.&lt;/p&gt;

&lt;h2&gt;
  
  
  Drop Table Command Syntax
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.postgresql.org/docs/current/sql-droptable.html"&gt;&lt;code&gt;DROP TABLE [ IF EXISTS ] name [, ...] [ CASCADE | RESTRICT ]&lt;/code&gt;&lt;/a&gt;  is the formal syntax for deleting a table from a PostgreSQL database. The name of the table is the only required parameter for the drop table command. To run the drop table command successfully, the user running the command must either be the table owner, schema owner or a superuser.&lt;/p&gt;

&lt;p&gt;When dropping a table or tables, you may optionally prefix the table names with the schema. A schema is a namespace that contains database objects (except roles and tablespaces) but, unlike namespaces in normal programming usage or directories and folders in operating systems lingo, namespaces in PostgreSQL cannot be nested. Omitting a schema in object names automatically assumes a schema named &lt;strong&gt;public&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;You can run the command or commands from a PostgreSQL interactive terminal (&lt;strong&gt;psql&lt;/strong&gt;)  or as a query if you are using  graphical user interface administration tools such as &lt;a href="https://www.pgadmin.org/"&gt;pgAdmin&lt;/a&gt;, &lt;a href="https://www.beekeeperstudio.io/"&gt;Beekeeper Studio&lt;/a&gt;, &lt;a href="https://www.adminer.org/"&gt;Adminer&lt;/a&gt; or any other GUI tool that will allow you to connect to a PostgreSQL database.&lt;/p&gt;

&lt;h2&gt;
  
  
  Drop Table Command Explained
&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;DROP TABLE public.table_name;&lt;/code&gt; will drop the table &lt;code&gt;table_name&lt;/code&gt; from the database. &lt;code&gt;public&lt;/code&gt; is the schema that owns the table and may be omitted. Using &lt;code&gt;DROP TABLE table_name;&lt;/code&gt; without the schema will also work. To delete more than one table, you can include the tables within the command but separate the table names with commas - &lt;code&gt;DROP TABLE table01, public.table2, myschema.table01;&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;IF EXISTS&lt;/code&gt; prevents PostgreSQL from throwing an error if the table being dropped does not exist. Running &lt;code&gt;DROP TABLE IF EXISTS table_name;&lt;/code&gt; from an interactive terminal will return a &lt;strong&gt;NOTICE:  table "table_name" does not exist, skipping&lt;/strong&gt; while Beekeeper Studio responds with &lt;strong&gt;Query Executed Successfully: No Results&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Adding &lt;code&gt;CASCADE&lt;/code&gt; to the drop table command also drops any objects that depend on the table (such as views or foreign key references). &lt;code&gt;RESTRICT&lt;/code&gt; prevents dropping a table if any objects depend on it. If &lt;code&gt;CASCADE&lt;/code&gt; is not specified, the drop table command defaults to &lt;code&gt;RESTRICT&lt;/code&gt;. Therefore, &lt;code&gt;DROP TABLE table_name;&lt;/code&gt; is equivalent to &lt;code&gt;DROP TABLE table_name RESTRICT;&lt;/code&gt;.&lt;/p&gt;
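As an illustration, consider a hypothetical table with a dependent view (the names `orders` and `vw_orders` are made up for this sketch):

```sql
CREATE TABLE orders (id int);
CREATE VIEW vw_orders AS SELECT id FROM orders;

-- Fails: RESTRICT is the default and vw_orders depends on orders
DROP TABLE orders;

-- Succeeds: drops the table together with the dependent view vw_orders
DROP TABLE orders CASCADE;
```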

&lt;h2&gt;
  
  
  Dropping All Tables
&lt;/h2&gt;

&lt;p&gt;PostgreSQL does not have a specific command to drop all tables from a database. To drop all tables, we will create an anonymous code block and execute it. To accomplish the task, we select all table names for a schema from &lt;code&gt;information_schema.tables&lt;/code&gt; and store the names in a &lt;code&gt;RECORD&lt;/code&gt; type variable. We then loop through these table names and execute the drop table command for each one.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
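The anonymous code block described above can be sketched as follows (a minimal version; the `table_type = 'BASE TABLE'` filter, which keeps views out of the result set, is my addition):

```sql
DO $$
DECLARE
    r RECORD;  -- holds one table name per loop iteration
BEGIN
    FOR r IN
        SELECT table_name
        FROM information_schema.tables
        WHERE table_schema = current_schema()
          AND table_type = 'BASE TABLE'
    LOOP
        -- quote_ident guards table names that require quoting
        EXECUTE 'DROP TABLE IF EXISTS ' || quote_ident(r.table_name) || ' CASCADE';
    END LOOP;
END;
$$ LANGUAGE plpgsql;
```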


&lt;p&gt;&lt;a href="https://www.postgresql.org/docs/13/sql-do.html"&gt;&lt;code&gt;DO&lt;/code&gt;&lt;/a&gt; executes an anonymous code block in a procedural language. The code block is treated as though it were the body of a function with no parameters and returning no rows. The &lt;code&gt;LANGUAGE&lt;/code&gt; is optional and may be written either  before or after the code block. When language is omitted, it defaults to &lt;code&gt;plpgsql&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://www.postgresql.org/docs/current/sql-syntax-lexical.html#SQL-SYNTAX-DOLLAR-QUOTING"&gt;&lt;code&gt;$$&lt;/code&gt;&lt;/a&gt; is a dollar-quoted string constant (dollar quoting) and allow use of single quotes (&lt;code&gt;'&lt;/code&gt;) or backslashes (&lt;code&gt;\&lt;/code&gt;) without having to escape them by doubling (&lt;code&gt;''&lt;/code&gt; or &lt;code&gt;\\&lt;/code&gt;).  Dollar quoting can be used to replace single quotes anywhere in SQL scripts and are used at the beginning and ending of SQL blocks. If you use more than one set in a block, you can put a token (label) between the &lt;code&gt;$$&lt;/code&gt; to make them unique (&lt;code&gt;$SomeTag$Dianne's horse$SomeTag$&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;&lt;code&gt;DECLARE r RECORD;&lt;/code&gt; declares a variable of type &lt;a href="https://www.postgresql.org/docs/13/plpgsql-declarations.html#PLPGSQL-DECLARATION-RECORDS"&gt;&lt;code&gt;RECORD&lt;/code&gt;&lt;/a&gt; to hold the table names.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.postgresql.org/docs/current/functions-info.html"&gt;&lt;code&gt;current_schema()&lt;/code&gt;&lt;/a&gt; returns the name of the schema that is first in the search path. The &lt;code&gt;current_schema()&lt;/code&gt; can be replaced by the actual schema name &lt;code&gt;WHERE table_schema='public'&lt;/code&gt;. To drop tables for another schema, replace &lt;code&gt;public&lt;/code&gt; with the name of the schema. Using the schema name can be useful to reduce the scope of deletion to only the tables owned by the  schema.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;BEGIN&lt;/code&gt; and &lt;code&gt;END&lt;/code&gt; statements are used for wrapping multiple lines of SQL code into a statement block.&lt;/p&gt;

&lt;p&gt;As the drop table SQL statements are generated dynamically, we &lt;a href="https://www.postgresql.org/docs/13/plpgsql-statements.html#PLPGSQL-STATEMENTS-EXECUTING-DYN"&gt;quote the table names&lt;/a&gt; using &lt;code&gt;quote_ident&lt;/code&gt; function.&lt;/p&gt;

&lt;p&gt;You run the anonymous code block in the same manner that you run a drop table command.&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;In this post, we looked at the &lt;code&gt;DROP TABLE&lt;/code&gt; command to drop a single table and multiple tables from a database. By enclosing the command in an anonymous code block, we extended the command to drop all tables from a PostgreSQL database.&lt;/p&gt;

&lt;p&gt;That brings us to the end of the post. I hope the post was helpful in one way or the other.&lt;/p&gt;

</description>
      <category>postgres</category>
      <category>database</category>
      <category>sql</category>
    </item>
    <item>
      <title>How To Integrate Asp.NET Core Project with Bootstrap, TailwindCSS and ParcelJS</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Fri, 02 Oct 2020 20:11:09 +0000</pubDate>
      <link>https://dev.to/kagundajm/how-to-integrate-asp-net-core-project-with-bootstrap-tailwindcss-and-parceljs-4p5p</link>
      <guid>https://dev.to/kagundajm/how-to-integrate-asp-net-core-project-with-bootstrap-tailwindcss-and-parceljs-4p5p</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;When you create a new ASP.NET Core Razor Pages project, the new project comes bundled with the Bootstrap CSS framework and the jQuery JavaScript library. Files for both Bootstrap and jQuery are placed in a &lt;em&gt;wwwroot/lib&lt;/em&gt; folder. The &lt;em&gt;lib&lt;/em&gt; folder contains both compressed and uncompressed versions of all files your project may require.&lt;/p&gt;

&lt;p&gt;During development, you add more CSS to customize the design of your final application. At times, the CSS you add may end up duplicating formatting under different class names. Coming up with these CSS class names can also prove to be a challenge. This is where Tailwind CSS comes in handy. &lt;a href="https://tailwindcss.com/"&gt;Tailwind CSS is a highly customizable, low-level CSS framework that gives you all of the building blocks you need to build bespoke designs&lt;/a&gt;. Each Tailwind CSS class (&lt;code&gt;w-16&lt;/code&gt;) maps to a single CSS declaration (&lt;code&gt;width: 4rem;&lt;/code&gt;). By combining these classes, you can freely apply any CSS declarations to a page without ever thinking about coming up with CSS class names. Tailwind does not contain any predefined components, and this is where Bootstrap comes in handy. You let Bootstrap provide the components like buttons, cards, and more while enhancing these components using Tailwind CSS.&lt;/p&gt;

&lt;p&gt;Combining both Bootstrap and Tailwind CSS will, however, result in a huge increase in the size of the CSS files. &lt;a href="https://parceljs.org/"&gt;ParcelJS&lt;/a&gt; comes to our rescue. Parcel is a web application bundler which offers blazing fast performance and requires zero configuration. Using Parcel will allow us to compress and combine the CSS files into a single file. Now we get the best from both Bootstrap and Tailwind CSS and end up with manageable, minimized CSS files.&lt;/p&gt;

&lt;p&gt;In this post, we create an ASP.NET Core razor web application, add Tailwind CSS to the project, bundle the project asset files with Parcel and finally add the bundling process to our project build process.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Your development environment should already be set up to create ASP.NET Core applications and have Node.js installed.&lt;/p&gt;

&lt;p&gt;If however you are starting from a clean slate, you can run the following installations to follow along:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Install &lt;a href="https://code.visualstudio.com/"&gt;Visual Studio Code&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Install &lt;a href="https://dotnet.microsoft.com/download"&gt;.NET Core SDK&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Install &lt;a href="https://marketplace.visualstudio.com/items?itemName=ms-vscode.csharp"&gt;C# for Visual Studio Code&lt;/a&gt; extension&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Install &lt;a href="https://nodejs.org/en/download/"&gt;Node.js&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Create ASP.NET Core Razor Project
&lt;/h2&gt;

&lt;p&gt;Open a terminal or console window, run the command &lt;code&gt;dotnet new razor -o ParcelBootstrapTailwind&lt;/code&gt; and wait for the project creation process to complete. &lt;code&gt;razor&lt;/code&gt; is the template of the project. To see a list of available templates, run the &lt;code&gt;dotnet new --help&lt;/code&gt; command. &lt;code&gt;-o&lt;/code&gt; creates a folder for our project.&lt;/p&gt;

&lt;p&gt;Open the project by running &lt;code&gt;code ParcelBootstrapTailwind&lt;/code&gt; command.&lt;/p&gt;

&lt;p&gt;Click &lt;strong&gt;Yes&lt;/strong&gt; when the &lt;strong&gt;Required assets to build and debug are missing from 'ParcelBootstrapTailwind'. Add them?&lt;/strong&gt; dialog is displayed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--73rcFJi_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ffxhc5d9gkqbgbgfnfo8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--73rcFJi_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ffxhc5d9gkqbgbgfnfo8.png" alt='alt="required vscode debug and build assets"'&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If the dialog closes before you press the &lt;strong&gt;Yes&lt;/strong&gt; button, select &lt;strong&gt;View&lt;/strong&gt; &amp;gt; &lt;strong&gt;Command Palette...&lt;/strong&gt;, type &lt;strong&gt;.NET&lt;/strong&gt; and click on &lt;strong&gt;.NET: Generate Assets for Build and Debug&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xTM-SUdz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/xk6axhenvks2la13vqte.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xTM-SUdz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/xk6axhenvks2la13vqte.png" alt='alt="generate required vscode debug and build assets"'&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From the main menu, select &lt;strong&gt;Run&lt;/strong&gt; &amp;gt; &lt;strong&gt;Run Without Debugging&lt;/strong&gt; and a browser page should open with a &lt;strong&gt;Welcome&lt;/strong&gt; message.&lt;/p&gt;

&lt;p&gt;Now that our project runs using the bundled CSS library, we can move to the next phase. Back in the text editor, stop the running project by selecting &lt;strong&gt;Run&lt;/strong&gt; &amp;gt; &lt;strong&gt;Stop Debugging&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prepare Project To Use Parcel
&lt;/h2&gt;

&lt;p&gt;We need to do a bit of re-organization before using ParcelJS to bundle our asset files.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create an &lt;em&gt;assets&lt;/em&gt; folder in the root of the project. You are free to name the folder differently.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Within the &lt;em&gt;assets&lt;/em&gt; folder, create a file and name it &lt;em&gt;main.js&lt;/em&gt;. You can use a  different name for the file. This file will be the  entry point for Parcel.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Copy or move &lt;em&gt;js&lt;/em&gt; and &lt;em&gt;css&lt;/em&gt; folders from &lt;em&gt;wwwroot&lt;/em&gt; to the &lt;em&gt;assets&lt;/em&gt; folder&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Delete the &lt;em&gt;wwwroot&lt;/em&gt; folder&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open the &lt;em&gt;Pages/Shared/_Layout.cshtml&lt;/em&gt; file and replace &lt;code&gt;&amp;lt;link rel="stylesheet" href="~/lib/bootstrap/dist/css/bootstrap.min.css" /&amp;gt;&lt;/code&gt; and &lt;code&gt;&amp;lt;link rel="stylesheet" href="~/css/site.css" /&amp;gt;&lt;/code&gt; with &lt;code&gt;&amp;lt;link rel="stylesheet" href="~/dist/css/main.css" /&amp;gt;&lt;/code&gt;. Scroll down the file and replace &lt;code&gt;&amp;lt;script src="~/lib/jquery/dist/jquery.min.js"&amp;gt;&amp;lt;/script&amp;gt;&lt;/code&gt;, &lt;code&gt;&amp;lt;script src="~/lib/bootstrap/dist/js/bootstrap.bundle.min.js"&amp;gt;&amp;lt;/script&amp;gt;&lt;/code&gt; and &lt;code&gt;&amp;lt;script src="~/js/site.js" asp-append-version="true"&amp;gt;&amp;lt;/script&amp;gt;&lt;/code&gt; with &lt;code&gt;&amp;lt;script src="~/dist/js/main.js"&amp;gt;&amp;lt;/script&amp;gt;&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;From the main menu, select &lt;strong&gt;View&lt;/strong&gt; &amp;gt; &lt;strong&gt;Terminal&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We need to create a &lt;em&gt;package.json&lt;/em&gt; file at the root of the project. Run the &lt;code&gt;npm init -y&lt;/code&gt; command from the Terminal window. &lt;code&gt;-y&lt;/code&gt; makes the command run with the default options without prompting us. If you want to supply values for any of the options, omit the &lt;code&gt;-y&lt;/code&gt; from the command.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Install Bootstrap and jQuery. Run &lt;code&gt;npm install bootstrap jquery jquery-validation jquery-validation-unobtrusive popper.js&lt;/code&gt; from the Terminal.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We will include the jQuery validation JavaScript files in our build process. Therefore, the &lt;em&gt;Pages/Shared/_ValidationScriptsPartial.cshtml&lt;/em&gt; file is no longer needed and can be deleted.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Run the &lt;code&gt;npm i -D parcel-bundler parcel-plugin-purgecss parcel-plugin-custom-dist-structure&lt;/code&gt; command to install Parcel and the plugins we need in our build process. Both the &lt;code&gt;parcel-plugin-purgecss&lt;/code&gt; and &lt;code&gt;parcel-plugin-custom-dist-structure&lt;/code&gt; plugins are optional and may be omitted. However, we will configure &lt;code&gt;parcel-plugin-purgecss&lt;/code&gt; to remove any unused CSS classes. Parcel outputs all built resources to the same folder, so we will use &lt;code&gt;parcel-plugin-custom-dist-structure&lt;/code&gt; to move files into custom folders within the output folder.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Open &lt;em&gt;main.js&lt;/em&gt; and add code to import Bootstrap and jQuery as follows: &lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Open &lt;em&gt;package.json&lt;/em&gt; file at the root of the project and add scripts to build Parcel. Replace the &lt;code&gt;scripts&lt;/code&gt; section with the following:&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
 Scroll down and add the following code to move files into custom folders. Add it below the &lt;code&gt;"devDependencies"&lt;/code&gt; section (or at any other top-level location):
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;To remove unused CSS classes during deployment, create a &lt;em&gt;purgecss.config.js&lt;/em&gt; file. We will update this file with the content files that contain CSS class names and the location of our CSS files.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;From the Terminal window, let us test that the Parcel build works as expected. Run &lt;code&gt;npm run build&lt;/code&gt; and &lt;code&gt;npm run publish&lt;/code&gt;. If the builds run to completion, you should see output similar to the following:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--X8HCIrVf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ssrmawwy3y6b0yccx6at.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--X8HCIrVf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ssrmawwy3y6b0yccx6at.png" alt="parcel build and publish test output"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To complete this section, let's run the application. Select &lt;strong&gt;Run&lt;/strong&gt; &amp;gt; &lt;strong&gt;Run Without Debugging&lt;/strong&gt; from the main menu and a browser page should open with a &lt;strong&gt;Welcome&lt;/strong&gt; message, just as it did before we used Parcel to bundle our CSS and JavaScript files.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
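&lt;p&gt;As a rough sketch only (assuming the folder layout from the steps above; adjust paths and package names to your project), the &lt;em&gt;main.js&lt;/em&gt; entry point could look like this:&lt;/p&gt;

```javascript
// assets/main.js - Parcel entry point (illustrative sketch, not the exact original gist)

// Styles: Bootstrap first, then our site styles
import 'bootstrap/dist/css/bootstrap.min.css';
import './css/site.css';

// Scripts
import $ from 'jquery';
import 'bootstrap';

// The jQuery validation plugins attach themselves to the global jQuery object,
// so expose it before loading them. require() runs in statement order, whereas
// import statements are hoisted, which is why require() is used here.
window.jQuery = window.$ = $;
require('jquery-validation');
require('jquery-validation-unobtrusive');

require('./js/site');
```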
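&lt;p&gt;A sketch of the &lt;em&gt;package.json&lt;/em&gt; additions described above. The &lt;code&gt;build&lt;/code&gt; and &lt;code&gt;publish&lt;/code&gt; script names match the commands used later, and the &lt;code&gt;customDistStructure&lt;/code&gt; shape follows the &lt;code&gt;parcel-plugin-custom-dist-structure&lt;/code&gt; plugin's folder-to-extensions format; treat both as assumptions to adapt:&lt;/p&gt;

```json
{
  "scripts": {
    "build": "parcel build assets/main.js --out-dir wwwroot/dist --no-minify",
    "publish": "parcel build assets/main.js --out-dir wwwroot/dist"
  },
  "customDistStructure": {
    "config": {
      "js": [".js", ".js.map"],
      "css": [".css", ".css.map"]
    }
  }
}
```

&lt;p&gt;The &lt;em&gt;wwwroot/dist&lt;/em&gt; output folder matches the &lt;code&gt;~/dist/css/main.css&lt;/code&gt; and &lt;code&gt;~/dist/js/main.js&lt;/code&gt; links added to &lt;em&gt;_Layout.cshtml&lt;/em&gt; earlier.&lt;/p&gt;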
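&lt;p&gt;For the &lt;em&gt;purgecss.config.js&lt;/em&gt; file, a minimal sketch pointing PurgeCSS at the Razor pages (the glob paths are assumptions based on the default project layout):&lt;/p&gt;

```javascript
// purgecss.config.js - consumed by parcel-plugin-purgecss
module.exports = {
  // Files scanned for CSS class names in use
  content: ['./Pages/**/*.cshtml'],
  // Stylesheets to purge
  css: ['./assets/css/**/*.css']
};
```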

&lt;h2&gt;
  
  
  Add Tailwind CSS
&lt;/h2&gt;

&lt;p&gt;Open the Terminal window and run &lt;code&gt;npm install tailwindcss&lt;/code&gt;. This command will add the Tailwind CSS package to our project.&lt;/p&gt;

&lt;p&gt;Create an &lt;em&gt;assets/css/tailwind.css&lt;/em&gt; file and add the following:&lt;br&gt;
  &lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
Make sure &lt;code&gt;@tailwind base;&lt;/code&gt; is commented out to avoid conflicting with Bootstrap.
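&lt;p&gt;As a sketch, &lt;em&gt;assets/css/tailwind.css&lt;/em&gt; would then contain the standard Tailwind directives, with the base styles commented out:&lt;/p&gt;

```css
/* assets/css/tailwind.css */

/* @tailwind base;  -- left commented out so Tailwind's reset does not clash with Bootstrap */
@tailwind components;
@tailwind utilities;
```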

&lt;p&gt;Some CSS classes are common to both Bootstrap and Tailwind. We will therefore use a prefix for Tailwind CSS classes to differentiate them from Bootstrap classes. Create a &lt;em&gt;tailwind.config.js&lt;/em&gt; file in the root of the project by running the &lt;code&gt;npx tailwindcss init&lt;/code&gt; command from the terminal window. Open the file and add &lt;code&gt;prefix: 'tw-'&lt;/code&gt; so that Tailwind classes use the &lt;code&gt;tw-&lt;/code&gt; prefix. Also, since we are using the Parcel &lt;code&gt;parcel-plugin-purgecss&lt;/code&gt; plugin to purge unused classes, add &lt;code&gt;purge: false&lt;/code&gt; to disable the purge feature within Tailwind. After these changes, your &lt;em&gt;tailwind.config.js&lt;/em&gt; file should look similar to the following:&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
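&lt;p&gt;With those two changes applied, a &lt;em&gt;tailwind.config.js&lt;/em&gt; generated by &lt;code&gt;npx tailwindcss init&lt;/code&gt; would look roughly like this (Tailwind 1.x layout assumed):&lt;/p&gt;

```javascript
// tailwind.config.js
module.exports = {
  prefix: 'tw-',   // all Tailwind utilities become tw-*, avoiding clashes with Bootstrap
  purge: false,    // purging is handled by parcel-plugin-purgecss instead
  theme: {
    extend: {}
  },
  variants: {},
  plugins: []
};
```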
 

&lt;p&gt;Add an import &lt;code&gt;import './css/tailwind.css'&lt;/code&gt; statement for the &lt;em&gt;tailwind.css&lt;/em&gt; within &lt;em&gt;main.js&lt;/em&gt; just before the &lt;code&gt;import './css/site.css'&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The final step in adding Tailwind to our project is adding a PostCSS plugin to our Parcel build chain. Create a &lt;em&gt;postcss.config.js&lt;/em&gt; file at the root of the project.&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
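&lt;p&gt;A minimal sketch of &lt;em&gt;postcss.config.js&lt;/em&gt;, registering Tailwind in the PostCSS pipeline that Parcel runs:&lt;/p&gt;

```javascript
// postcss.config.js
module.exports = {
  plugins: [
    // tailwindcss picks up tailwind.config.js from the project root automatically
    require('tailwindcss')
  ]
};
```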


&lt;p&gt;If we now run &lt;code&gt;npm run build&lt;/code&gt;, we should see an increase in the size of &lt;em&gt;main.css&lt;/em&gt;. The size, however, drops drastically when we run &lt;code&gt;npm run publish&lt;/code&gt;.&lt;br&gt;
  &lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Yi54MzWA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ws8f31chpuu56nz6b13q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Yi54MzWA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ws8f31chpuu56nz6b13q.png" alt="parcel build and publish test output"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To test whether the Tailwind CSS classes are being used, open &lt;em&gt;Pages/Shared/_Layout.cshtml&lt;/em&gt; and change the background color of the &lt;strong&gt;body&lt;/strong&gt; tag by adding a &lt;code&gt;class="tw-bg-gray-200"&lt;/code&gt; class. Also add a gradient to the nav bar by adding the &lt;code&gt;tw-bg-gradient-to-r tw-from-teal-400 tw-to-blue-500&lt;/code&gt; CSS classes to the &lt;strong&gt;nav&lt;/strong&gt; tag. Run the &lt;code&gt;npm run build&lt;/code&gt; command and select &lt;strong&gt;Run&lt;/strong&gt; &amp;gt; &lt;strong&gt;Run Without Debugging&lt;/strong&gt; from the main menu. You should observe that the home page and the nav bar now display our new changes.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--r91rISyh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ovvnu6noonrh6fl132k4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--r91rISyh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ovvnu6noonrh6fl132k4.png" alt="homepage updated with tailwind css classes"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When working with Tailwind CSS classes in Visual Studio Code, the Tailwind CSS IntelliSense extension will come in handy. This extension provides features such as autocomplete, syntax highlighting, and linting. To install the extension, open your Extensions view and search for Tailwind CSS IntelliSense.&lt;/p&gt;

&lt;p&gt;Also, if you have some CSS code and cannot remember its equivalent class name in Tailwind CSS, &lt;a href="https://tailwind.spacet.me/"&gt;Tailwind CSS class search&lt;/a&gt; will be there to assist. Type a CSS style name and the website will display both the Tailwind CSS class name and its style properties for you.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--tHKhogTq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ofpnrmn8obmzzbn91h7v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tHKhogTq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/ofpnrmn8obmzzbn91h7v.png" alt="tailwind css class search website"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Add Parcel To The Project MSBuild Process
&lt;/h2&gt;

&lt;p&gt;To integrate the Parcel build process into the project build process, we make some modifications to our project file. This ensures that the &lt;code&gt;npm run&lt;/code&gt; commands run before the project build and publish steps.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://docs.microsoft.com/en-us/visualstudio/msbuild/msbuild"&gt;Microsoft Build Engine (MSBuild)&lt;/a&gt; tool uses properties, items, tasks, and targets to build a project. Properties are key-value pairs which direct MSBuild what to do. You define properties using &lt;code&gt;&amp;lt;PropertyName&amp;gt;Value&amp;lt;/PropertyName&amp;gt;&lt;/code&gt; elements. MSBuild has its own reserved properties. You reference property values using &lt;code&gt;$(PropertyName)&lt;/code&gt; syntax. Items are inputs into the build system while tasks are commands which are executed in order to complete a Target. Targets group tasks together.&lt;/p&gt;

&lt;p&gt;In our project, we require properties for the &lt;em&gt;assets&lt;/em&gt;, &lt;em&gt;node_modules&lt;/em&gt;, and &lt;em&gt;wwwroot&lt;/em&gt; folders. Open &lt;em&gt;ParcelBootstrapTailwind.csproj&lt;/em&gt; and add these properties within &lt;code&gt;&amp;lt;PropertyGroup&amp;gt;&lt;/code&gt; as follows:&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
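&lt;p&gt;As an illustrative sketch (the property names here are assumptions, not necessarily the original ones), the folder properties could be declared like this:&lt;/p&gt;

```xml
&lt;PropertyGroup&gt;
  &lt;!-- Folder locations used by the Parcel build targets --&gt;
  &lt;AssetsFolder&gt;assets&lt;/AssetsFolder&gt;
  &lt;NodeModulesFolder&gt;node_modules&lt;/NodeModulesFolder&gt;
  &lt;WwwrootFolder&gt;wwwroot&lt;/WwwrootFolder&gt;
&lt;/PropertyGroup&gt;
```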


&lt;p&gt;Next we create our targets. The first target will check whether NodeJS is installed and that the &lt;em&gt;node_modules&lt;/em&gt; folder exists. If NodeJS is not installed, it will not be possible to proceed with the build, and we ask the user to install NodeJS first. If the &lt;em&gt;node_modules&lt;/em&gt; folder does not exist, we start by re-installing the NPM packages before proceeding with the build. The other two targets will be called depending on whether we are doing a normal build or publishing our application. If we are running a normal build, the build process will invoke &lt;code&gt;npm run build&lt;/code&gt;; otherwise &lt;code&gt;npm run publish&lt;/code&gt; will be invoked. The updated &lt;em&gt;ParcelBootstrapTailwind.csproj&lt;/em&gt; should be similar to the following:&lt;br&gt;
 &lt;/p&gt;
&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
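&lt;p&gt;The targets described above might be sketched as follows. The target names and the &lt;code&gt;PrepareForPublish&lt;/code&gt; hook are illustrative choices, and the sketch assumes the folder properties defined earlier:&lt;/p&gt;

```xml
&lt;Target Name="EnsureNode" BeforeTargets="Build"&gt;
  &lt;!-- Fail fast when NodeJS is not installed --&gt;
  &lt;Exec Command="node --version" ContinueOnError="true"&gt;
    &lt;Output TaskParameter="ExitCode" PropertyName="NodeExitCode" /&gt;
  &lt;/Exec&gt;
  &lt;Error Condition="'$(NodeExitCode)' != '0'"
         Text="NodeJS is required to build this project. Please install NodeJS first." /&gt;
  &lt;!-- Re-install NPM packages when node_modules is missing --&gt;
  &lt;Exec Command="npm install" Condition="!Exists('$(NodeModulesFolder)')" /&gt;
&lt;/Target&gt;

&lt;Target Name="ParcelBuild" AfterTargets="EnsureNode" Condition="'$(Configuration)' == 'Debug'"&gt;
  &lt;Exec Command="npm run build" /&gt;
&lt;/Target&gt;

&lt;Target Name="ParcelPublish" BeforeTargets="PrepareForPublish"&gt;
  &lt;Exec Command="npm run publish" /&gt;
&lt;/Target&gt;
```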


&lt;p&gt;From the main menu, select &lt;strong&gt;Run&lt;/strong&gt; &amp;gt; &lt;strong&gt;Run Without Debugging&lt;/strong&gt; and, if we typed everything without errors, a browser page should open with a &lt;strong&gt;Welcome&lt;/strong&gt; message. To test project publishing, run &lt;code&gt;dotnet publish -c Release -o ./publish&lt;/code&gt; from the terminal. You will see the project build and create a &lt;em&gt;publish&lt;/em&gt; folder at the root of the project. The CSS file at &lt;em&gt;publish/wwwroot/dist/css/main.css&lt;/em&gt; will be much smaller than the one at &lt;em&gt;wwwroot/dist/css/main.css&lt;/em&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Not all of us are good at CSS and designing websites. By using CSS frameworks such as Bootstrap, we can use existing predefined components to create good-looking websites. Using the low-level classes provided by Tailwind CSS enhances the look and feel of our websites without having to create custom classes. And when our websites are ready for deployment, we can bundle our website assets with no configuration using Parcel.&lt;/p&gt;

&lt;p&gt;In this post, we set up a new project in Visual Studio Code, added Bootstrap and Tailwind and finally bundled our asset files using Parcel. As a final step, we automated the assets bundling commands into the project build process thereby ending up with minified asset files ready for deployment. &lt;/p&gt;

&lt;p&gt;Thank you for reading this far and I hope you picked something along the way that can be of use in your current or next project.&lt;/p&gt;

&lt;p&gt;Happy Bootstrapping and Tailwinding with Parcel!&lt;/p&gt;

</description>
      <category>development</category>
      <category>bootstrap</category>
      <category>tailwindcss</category>
      <category>parceljs</category>
    </item>
    <item>
      <title>Git - Not Something We Can Merge</title>
      <dc:creator>Kagunda JM</dc:creator>
      <pubDate>Sun, 30 Aug 2020 19:48:34 +0000</pubDate>
      <link>https://dev.to/kagundajm/git-not-something-we-can-merge-f3f</link>
      <guid>https://dev.to/kagundajm/git-not-something-we-can-merge-f3f</guid>
      <description>&lt;p&gt;I use Git for version control. My workflow is simple; create branch from the main branch, make changes on the new branch, commit the changes and merge the changes to the main branch. I have aliased the &lt;code&gt;git&lt;/code&gt; command to &lt;code&gt;g&lt;/code&gt; and combine the commands in a single line in the form &lt;code&gt;g checkout master;g merge branch-name;&lt;/code&gt; and it works.&lt;/p&gt;

&lt;p&gt;It was therefore a surprise when I ran &lt;code&gt;g checkout master;g merge add_progress;g push;g checkout add_progress&lt;/code&gt; and Git responded with &lt;strong&gt;merge: add_progress - not something we can merge&lt;/strong&gt;. Being a moderate Git user, my first thought was that I had used a restricted word in the branch name. Could I have mistyped the branch name? I rechecked the branch name and re-ran the command--same result.&lt;/p&gt;

&lt;p&gt;Before searching for the cause of the error, I replaced the branch name with the commit ID in the merge command. That worked without raising the error. So, what was wrong with my branch name? Does Git have any restrictions on branch names?&lt;/p&gt;

&lt;h3&gt;
  
  
  It Could Be a Silly Typo
&lt;/h3&gt;

&lt;p&gt;Turns out, the &lt;a href="https://stackoverflow.com/questions/16862933/how-to-resolve-gits-not-something-we-can-merge-error/16862934#answer-16862934"&gt;error can arise from a typo in the branch name&lt;/a&gt;. Taking a deeper look at my Git command, I realized I had typed an underscore (_) in the branch name (&lt;code&gt;add_progress&lt;/code&gt;) instead of a dash (&lt;code&gt;add-progress&lt;/code&gt;), damn it! &lt;/p&gt;

&lt;p&gt;I could not figure out why the Git team decided against a simpler error like &lt;strong&gt;merge: add_progress - invalid branch name&lt;/strong&gt;. This begged the question: are there any restrictions or guidelines on branch naming?&lt;/p&gt;

&lt;h3&gt;
  
  
  Git Branch Naming Conventions
&lt;/h3&gt;

&lt;p&gt;While conventions are good for consistency, no single common convention exists for branch naming. Different teams adopt different styles for naming their Git branches. &lt;/p&gt;

&lt;p&gt;Some teams prefer &lt;a href="https://dev.to/g_abud/advanced-git-reference-1o9j"&gt;prefixing branch names with the developer's first and last names, followed by a forward slash and a hyphen-separated branch description&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Others break down features into subtasks and name branches based on &lt;a href="https://blog.kowsheek.com/git-branch-naming-conventions-for-the-real-world/"&gt;feature-subtasks&lt;/a&gt;. Under this style, a &lt;code&gt;billing-module&lt;/code&gt; feature would have branch names such as &lt;code&gt;billing-module-setup&lt;/code&gt;, &lt;code&gt;billing-ui-changes&lt;/code&gt; and &lt;code&gt;billing-database-migration&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href=""&gt;Change types&lt;/a&gt; as a prefix for naming branches (&lt;code&gt;new-feature/twitter-feed&lt;/code&gt;, &lt;code&gt;refactor/homepage/carousel&lt;/code&gt;, &lt;code&gt;fix/twitter-feed/number-of-tweets&lt;/code&gt;) is adopted by other teams. Using this style and separating the change type and feature with a slash (/) makes Git create separate folders in &lt;code&gt;git/refs/heads/&lt;/code&gt; directory of your repository.&lt;/p&gt;

&lt;p&gt;&lt;a href="http://getbem.com/"&gt;BEM (Block Element Modifier)&lt;/a&gt; is another convention used in naming Git branches. This style takes &lt;a href="https://codeburst.io/let-the-branch-name-do-all-the-talking-in-git-e614ff85aa30"&gt;&lt;code&gt;reason__details--tag&lt;/code&gt;&lt;/a&gt; format.  Reason is the change type, details provide a short description of the change while the tag is optional. When the tag is available, it may refer to an external tracker or other extra details.&lt;/p&gt;

&lt;h3&gt;
  
  
  Branch Naming Restrictions
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://wincent.com/wiki/Legal_Git_branch_names"&gt;Restrictions on naming&lt;/a&gt; branches borrows from &lt;a href="https://git-scm.com/docs/git-check-ref-format"&gt;restrictions in reference names&lt;/a&gt;. Names cannot: &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;begin with a dot (&lt;code&gt;.&lt;/code&gt;) or end with the sequence &lt;code&gt;.lock&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;have two consecutive dots &lt;code&gt;..&lt;/code&gt; anywhere.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;have ASCII control characters, space, tilde (&lt;code&gt;~&lt;/code&gt;), caret (&lt;code&gt;^&lt;/code&gt;), or colon (&lt;code&gt;:&lt;/code&gt;) anywhere.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;have question-mark (&lt;code&gt;?&lt;/code&gt;), asterisk (&lt;code&gt;*&lt;/code&gt;), or open bracket (&lt;code&gt;[&lt;/code&gt;) anywhere. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;begin or end with a slash (&lt;code&gt;/&lt;/code&gt;) or contain multiple consecutive slashes &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;end with a dot (&lt;code&gt;.&lt;/code&gt;).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;contain the sequence &lt;code&gt;@{&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
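&lt;p&gt;You can test a candidate name against these rules without creating a branch by using Git's &lt;code&gt;check-ref-format&lt;/code&gt; plumbing command, which exits with a non-zero status for an illegal reference name:&lt;/p&gt;

```shell
# A legal name: exits 0
git check-ref-format refs/heads/fix/twitter-feed
echo $?   # 0

# Illegal names: exit 1
git check-ref-format refs/heads/bad..name    # two consecutive dots
echo $?   # 1
git check-ref-format refs/heads/bad.lock     # ends with .lock
echo $?   # 1
git check-ref-format "refs/heads/bad name"   # contains a space
echo $?   # 1
```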

&lt;p&gt;Git ignores a backslash (&lt;code&gt;\&lt;/code&gt;) in a name. For example &lt;code&gt;g checkout -b po\le&lt;/code&gt; will create a branch  named &lt;code&gt;pole&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;An opening parenthesis &lt;code&gt;(&lt;/code&gt; in a name produces an incomplete command prompt (&lt;code&gt;&amp;gt;&lt;/code&gt;). You have to press &lt;strong&gt;Ctrl+C&lt;/strong&gt; to get back to your normal terminal prompt. The &lt;code&gt;(&lt;/code&gt; is special to the shell rather than to Git: it opens a subshell, so the shell keeps waiting for the closing parenthesis.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DDFbj8uS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/j4tzvc7a1okfo26md81a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DDFbj8uS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/j4tzvc7a1okfo26md81a.png" alt='Alt "opening bracket in branch name produces incomplete command prompt"'&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Every cloud has a silver lining. Without the error, I would not have made the effort to learn how others handle Git branching conventions and naming.&lt;/p&gt;

&lt;h3&gt;
  
  
  Additional Resources
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://dev.to/couchcamote/git-branching-name-convention-cch"&gt;Git Branch Naming Convention&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/agis/git-style-guide"&gt;Git Style Guide&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://dev.to/hardkoded/how-to-organize-your-git-branches-4dci"&gt;How to organize your git branches&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://guides.github.com/introduction/flow/"&gt;Understanding the GitHub flow&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://allenan.com/git-branch-naming-conventions/"&gt;Git Branch Naming Conventions&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>development</category>
      <category>git</category>
      <category>github</category>
      <category>branching</category>
    </item>
  </channel>
</rss>
