<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Lester Sim</title>
    <description>The latest articles on DEV Community by Lester Sim (@lestersimjj).</description>
    <link>https://dev.to/lestersimjj</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F727350%2Ff85c8b8d-ea75-49a9-9379-6a1aaf13821c.jpeg</url>
      <title>DEV Community: Lester Sim</title>
      <link>https://dev.to/lestersimjj</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/lestersimjj"/>
    <language>en</language>
    <item>
      <title>AWS Certified Database - Specialty Notes</title>
      <dc:creator>Lester Sim</dc:creator>
      <pubDate>Sun, 13 Aug 2023 15:28:56 +0000</pubDate>
      <link>https://dev.to/lestersimjj/aws-certified-database-specialty-notes-2b6f</link>
      <guid>https://dev.to/lestersimjj/aws-certified-database-specialty-notes-2b6f</guid>
      <description>&lt;h1&gt;
  
  
  AWS Database
&lt;/h1&gt;

&lt;p&gt;&lt;em&gt;Disclaimer: The opinions expressed here are my own and I'm not writing on behalf of AWS or Amazon.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Here are some quick notes I've gathered to prepare for the certification:&lt;/p&gt;

&lt;h2&gt;
  
  
  Amazon RDS
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Benefits of Managed Database
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Automated provisioning&lt;/li&gt;
&lt;li&gt;Continuous backups and the ability to restore to a specific timestamp&lt;/li&gt;
&lt;li&gt;Monitoring dashboards&lt;/li&gt;
&lt;li&gt;Read replicas for improved read performance&lt;/li&gt;
&lt;li&gt;Multi-AZ setup for disaster recovery&lt;/li&gt;
&lt;li&gt;Maintenance windows for OS patching and version upgrades&lt;/li&gt;
&lt;li&gt;Scaling capability (vertical and horizontal)&lt;/li&gt;
&lt;li&gt;Storage backed by EBS (gp2 or io1). Storage auto scaling can be enabled.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Pricing Model
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Pay-as-you-go pricing model&lt;/li&gt;
&lt;li&gt;Instance types

&lt;ul&gt;
&lt;li&gt;On-demand (Pay for compute capacity per hour)&lt;/li&gt;
&lt;li&gt;Reserved (deeply discounted, 1-year or 3-year term contract)&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;Storage (GB/month) / Backups / Snapshot Export to S3&lt;/li&gt;
&lt;li&gt;I/O (per million requests)&lt;/li&gt;
&lt;li&gt;Data transfer&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  RDS Instance Types
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Standard (general-purpose workloads)&lt;/li&gt;
&lt;li&gt;Memory-optimized (memory-intensive, high performance workloads)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  RDS Storage Types
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;General Purpose Storage&lt;/strong&gt;: General Purpose SSD volumes offer cost-effective storage that is ideal for a broad range of workloads running on medium-sized DB instances. General Purpose storage is best suited for development and testing environments.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Provisioned IOPS&lt;/strong&gt;: Provisioned IOPS storage is designed to meet the needs of I/O-intensive workloads, particularly database workloads, that require low I/O latency and consistent I/O throughput. Provisioned IOPS storage is best suited for production environments.&lt;/li&gt;
&lt;li&gt;RDS Storage Auto Scaling: Storage is scaled up automatically when the utilization nears the provisioned capacity. Triggers:

&lt;ul&gt;
&lt;li&gt;Free available space is less than 10% of the allocated storage.&lt;/li&gt;
&lt;li&gt;The low-storage condition lasts at least five minutes.&lt;/li&gt;
&lt;li&gt;At least 6 hours have passed since the last storage modification.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;The additional storage is in increments of whichever of the following is greater:

&lt;ul&gt;
&lt;li&gt;5 GiB&lt;/li&gt;
&lt;li&gt;10% of currently allocated storage&lt;/li&gt;
&lt;li&gt;Storage growth prediction for 7 hours based on the FreeStorageSpace metrics change in the past hour.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;
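&lt;p&gt;The trigger conditions and increment sizing above can be sketched as a small Python model (illustrative only, not an AWS API):&lt;/p&gt;

```python
def should_autoscale(free_gib, allocated_gib, low_minutes, hours_since_last_change):
    # All three documented conditions must hold before RDS scales storage.
    return (0.10 * allocated_gib > free_gib   # free space below 10% of allocated
            and low_minutes >= 5              # low-storage condition for 5+ minutes
            and hours_since_last_change >= 6) # 6+ hours since last modification

def autoscaling_increment(allocated_gib, predicted_growth_gib):
    # Storage grows by the greatest of: 5 GiB, 10% of currently
    # allocated storage, or the 7-hour growth prediction.
    return max(5, 0.10 * allocated_gib, predicted_growth_gib)
```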

&lt;h3&gt;
  
  
  RDS Parameter Groups
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;A configuration of engine settings applied to a database instance&lt;/li&gt;
&lt;li&gt;The default parameter group cannot be edited. To make config changes, you must create a new parameter group&lt;/li&gt;
&lt;li&gt;Changes to dynamic parameters are always applied immediately (irrespective of the Apply Immediately setting)&lt;/li&gt;
&lt;li&gt;Changes to static parameters require a manual reboot&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  RDS Option Groups
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;For configuration of optional features offered by DB engines (not covered by parameter groups)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  RDS Security
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;A traditional username and password can be used to log in to the database&lt;/li&gt;
&lt;li&gt;IAM-based authentication can be used to log in to RDS MySQL &amp;amp; PostgreSQL.&lt;/li&gt;
&lt;li&gt;You cannot SSH into an RDS DB instance.&lt;/li&gt;
&lt;li&gt;You can map multiple IAM users or roles to the same database user account&lt;/li&gt;
&lt;li&gt;Rotating RDS DB Credentials: Use AWS Secrets Manager. Supports automatic rotation of secrets. Secrets Manager provides a Lambda rotation function and populates it automatically with the ARN in the secret.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  RDS Backups
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;RDS supports automatic backups, which capture transaction logs in real time&lt;/li&gt;
&lt;li&gt;Enabled by default with a 7-day retention period when the DB instance is created via the Console (0-35 days retention, 0 = disable automatic backups). The default backup retention period is one day if you create the DB instance using the Amazon RDS API or the AWS CLI.&lt;/li&gt;
&lt;li&gt;Disabling automatic backups for a DB instance deletes all existing automated backups for the instance&lt;/li&gt;
&lt;li&gt;Automated backups are deleted when the DB instance is deleted. Only manually created DB Snapshots are retained after the DB Instance is deleted.&lt;/li&gt;
&lt;li&gt;The manual snapshot limit (100 per region) does not apply to automated backups.&lt;/li&gt;
&lt;li&gt;The first automatic backup is a full backup. Subsequent backups are incremental.&lt;/li&gt;
&lt;li&gt;Backup data is stored in an S3 bucket (owned and managed by the RDS service; you won’t see it in your S3 console)&lt;/li&gt;
&lt;li&gt;You can share manual DB snapshots with up to 20 AWS accounts. Automated Amazon RDS snapshots cannot be shared directly with other AWS accounts. Manual snapshots can also be copied across regions.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Multi-AZ Deployments and Read Replicas
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Configuring and managing a Multi-AZ deployment: &lt;a href="https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Concepts.MultiAZ.html"&gt;https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Concepts.MultiAZ.html&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Working with Read Replicas: &lt;a href="https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ReadRepl.html"&gt;https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ReadRepl.html&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;To create read replicas, you need to enable automatic backups on the source RDS DB instance.&lt;/li&gt;
&lt;li&gt;Multi-AZ follows synchronous replication and spans at least two Availability Zones within a single region. Read Replicas follow asynchronous replication and can be within an Availability Zone, Cross-AZ, or Cross-Region.&lt;/li&gt;
&lt;li&gt;Amazon RDS for MySQL, MariaDB and PostgreSQL allow you to add up to 15 read replicas to each DB Instance. Amazon RDS for Oracle and SQL Server allow you to add up to 5 read replicas to each DB Instance.&lt;/li&gt;
&lt;li&gt;For managing multiple read replicas, you may add each read replica endpoint to a Route 53 record set and configure weighted routing to distribute traffic across different read replicas.&lt;/li&gt;
&lt;/ul&gt;
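&lt;p&gt;The weighted-routing idea can be sketched in plain Python (hypothetical endpoint names; Route 53 performs the equivalent selection on the DNS side):&lt;/p&gt;

```python
import random

def pick_replica(weighted_endpoints, rng=random):
    # Weighted records: each endpoint is returned in proportion to its
    # weight relative to the total of all weights in the record set.
    total = sum(weight for _, weight in weighted_endpoints)
    roll = rng.uniform(0, total)
    running = 0
    for endpoint, weight in weighted_endpoints:
        running += weight
        if running > roll:
            return endpoint
    return weighted_endpoints[-1][0]

# Hypothetical replica endpoints: replica-2 should receive ~3x the traffic.
replicas = [("replica-1.example.rds.amazonaws.com", 1),
            ("replica-2.example.rds.amazonaws.com", 3)]
```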

&lt;h3&gt;
  
  
  RDS Monitoring
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;In RDS Console/CloudWatch: CPU, Memory, DatabaseConnections, IOPS, disk space consumption, etc&lt;/li&gt;
&lt;li&gt;RDS Recommendations: Automated suggestions for DB instances, read replicas, etc&lt;/li&gt;
&lt;li&gt;RDS Enhanced Monitoring: Get real-time OS-level metrics (CPU, memory). An agent is automatically installed on the DB server to collect metrics. Metrics are pushed to CloudWatch as well.&lt;/li&gt;
&lt;li&gt;RDS Performance Insights: Dashboard for performance tuning and analysis, e.g. which SQL query generates the highest load. Automatically publishes metrics to CloudWatch.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Amazon Aurora
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vsl9k2uO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/udpxq15k78d6nu9oz21x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vsl9k2uO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/udpxq15k78d6nu9oz21x.png" alt="Aurora Architecture" width="640" height="359"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Differences with RDS
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Multi-AZ deployments for RDS MySQL follow synchronous replication whereas Multi-AZ deployments for Aurora MySQL follow asynchronous replication&lt;/li&gt;
&lt;li&gt;Read Replicas can be manually promoted to a standalone database instance for RDS MySQL whereas Read Replicas for Aurora MySQL can be promoted to the primary instance&lt;/li&gt;
&lt;li&gt;The primary and standby DB instances are upgraded at the same time for RDS MySQL Multi-AZ. All instances are upgraded at the same time for Aurora MySQL&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Aurora Backtracking
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Restoring a DB cluster to a point in time launches a new DB cluster and restores it from backup data or a DB cluster snapshot, which can take hours. Backtracking a DB cluster doesn't require a new DB cluster and rewinds the DB cluster in minutes.&lt;/li&gt;
&lt;li&gt;The limit for a backtrack window is 72 hours.&lt;/li&gt;
&lt;li&gt;Backtracking affects the entire DB cluster. For example, you can't selectively backtrack a single table or a single data update.&lt;/li&gt;
&lt;/ul&gt;
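&lt;p&gt;The backtrack-window rule can be sketched as a simple check (illustrative model only, not the RDS API):&lt;/p&gt;

```python
from datetime import datetime, timedelta

def can_backtrack(now, target, window_hours=72):
    # Backtracking rewinds the whole cluster to a past point in time;
    # the target must lie within the backtrack window (72 hours max).
    age = now - target
    return age >= timedelta(0) and timedelta(hours=window_hours) >= age

now = datetime(2023, 8, 13, 12, 0)
```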

&lt;h3&gt;
  
  
  Aurora Cloning
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Aurora cloning works at the storage layer of an Aurora DB cluster. Uses a copy-on-write protocol.&lt;/li&gt;
&lt;li&gt;Aurora cloning is especially useful for quickly setting up test environments using your production data, without risking data corruption.&lt;/li&gt;
&lt;li&gt;With copy-on-write, data is copied only at the time it changes, either on the source database or the clone. Cloning is much faster than restoring a snapshot of the DB cluster.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Failover
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;A read replica is automatically promoted; failover happens automatically&lt;/li&gt;
&lt;li&gt;The primary instance that failed becomes a read replica when it comes back online&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Aurora Global Database
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;1 Primary Region (R/W), up to 5 secondary regions (Read only). Underlying cluster storage volume replicated to another region.&lt;/li&gt;
&lt;li&gt;If 1 region goes down, can promote another region to be the primary region.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Aurora Serverless
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Amazon Aurora Serverless is an on-demand, autoscaling configuration for Amazon Aurora. It automatically starts up, shuts down, and scales capacity up or down based on your application's needs. You can run your database in the cloud without managing any database instances.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Amazon DynamoDB
&lt;/h2&gt;

&lt;p&gt;Fully managed, serverless, key-value NoSQL database.&lt;/p&gt;

&lt;h3&gt;
  
  
  Consistency
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Eventually consistent is the default read consistency model for all read operations. When issuing eventually consistent reads to a DynamoDB table or an index, the responses may not reflect the results of a recently completed write operation. If you repeat your read request after a short time, the response should eventually return the more recent item. Eventually consistent reads are supported on tables, local secondary indexes, and global secondary indexes.&lt;/li&gt;
&lt;li&gt;Read operations such as GetItem, Query, and Scan provide an optional ConsistentRead parameter. If you set ConsistentRead to true, DynamoDB returns a response with the most up-to-date data, reflecting the updates from all prior write operations that were successful. Strongly consistent reads are only supported on tables and local secondary indexes. &lt;/li&gt;
&lt;/ul&gt;
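&lt;p&gt;The difference between the two read models can be illustrated with a toy two-copy store (purely illustrative, not the DynamoDB API):&lt;/p&gt;

```python
class MiniTable:
    """Toy model: a leader copy plus a lagging replica copy."""

    def __init__(self):
        self.leader = {}
        self.replica = {}

    def put_item(self, key, value):
        self.leader[key] = value          # the write lands on the leader first

    def replicate(self):
        self.replica.update(self.leader)  # replication catches up "eventually"

    def get_item(self, key, consistent_read=False):
        # consistent_read=True reads the up-to-date copy; the default
        # (eventually consistent) read may return stale or missing data.
        store = self.leader if consistent_read else self.replica
        return store.get(key)
```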

&lt;h3&gt;
  
  
  Scan vs Query Operation
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Scan&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The Scan operation returns one or more items and item attributes by accessing every item in a table or a secondary index. To have DynamoDB return fewer items, you can provide a FilterExpression operation.&lt;/li&gt;
&lt;li&gt;Eventual/Strong Consistency&lt;/li&gt;
&lt;li&gt;Prefer Query over Scan when possible.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Query&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Find items based on primary key values (partition key/sort key). Returns all items with that partition key.&lt;/li&gt;
&lt;li&gt;Eventual/Strong Consistency&lt;/li&gt;
&lt;li&gt;Faster than Scan because it only reads the specified partition&lt;/li&gt;
&lt;/ul&gt;
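&lt;p&gt;A toy model of the access pattern makes the cost difference concrete (illustrative only; the real operations take expression parameters, not Python callables):&lt;/p&gt;

```python
# Toy table: items grouped by partition key, as DynamoDB stores them.
table = {
    "user#1": [{"sk": 1, "msg": "a"}, {"sk": 2, "msg": "b"}],
    "user#2": [{"sk": 1, "msg": "c"}],
}

def scan(table, filter_fn):
    # Scan touches every item in every partition, then filters.
    return [item for part in table.values() for item in part if filter_fn(item)]

def query(table, partition_key, sort_fn=lambda sk: True):
    # Query jumps straight to one partition, optionally ranging on the sort key.
    return [item for item in table.get(partition_key, []) if sort_fn(item["sk"])]
```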

&lt;h3&gt;
  
  
  Primary Key
&lt;/h3&gt;

&lt;p&gt;Simple Primary Key: Just 1 partition key&lt;br&gt;
Composite Primary Key: Composed of 1 partition key and 1 sort key&lt;/p&gt;

&lt;p&gt;Partition Key: Used for partition selection via DynamoDB internal hash function&lt;br&gt;
Sort Key: Range select or to order results. Sort keys may not be used on their own.&lt;/p&gt;
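&lt;p&gt;The partition-selection idea can be sketched with a stand-in hash (DynamoDB's internal hash function is not public; MD5 here is purely illustrative):&lt;/p&gt;

```python
import hashlib

def choose_partition(partition_key, num_partitions):
    # DynamoDB hashes the partition key to pick the physical partition.
    # The same key always maps to the same partition.
    digest = hashlib.md5(partition_key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_partitions
```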

&lt;h3&gt;
  
  
  Local Secondary Indexes
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Up to 5 LSIs&lt;/li&gt;
&lt;li&gt;Has same partition key as the primary index of the table but has different sort key than the primary index of the table. A local secondary index is "local" in the sense that every partition of a local secondary index is scoped to a base table partition that has the same partition key value.&lt;/li&gt;
&lt;li&gt;Can only be created at the time of creating the table and cannot be deleted later&lt;/li&gt;
&lt;li&gt;Support eventual / strong / transactional consistency&lt;/li&gt;
&lt;li&gt;Use Case:

&lt;ul&gt;
&lt;li&gt;When the application needs the same partition key as the table&lt;/li&gt;
&lt;li&gt;When the application needs strongly consistent index reads&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Global Secondary Indexes
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Up to 20 GSIs&lt;/li&gt;
&lt;li&gt;Can have same or different partition key than the table’s primary index&lt;/li&gt;
&lt;li&gt;Can have same or different sort key than the table’s primary index. Optional to have sort key.&lt;/li&gt;
&lt;li&gt;A global secondary index is considered "global" because queries on the index can span all of the data in the base table, across all partitions.&lt;/li&gt;
&lt;li&gt;Can have different schema from base table. Cannot fetch attributes from the base table other than the base table’s primary key attributes.&lt;/li&gt;
&lt;li&gt;Supports only eventual consistency&lt;/li&gt;
&lt;li&gt;Can be created or deleted any time&lt;/li&gt;
&lt;li&gt;Has its own provisioned throughput. If the writes are throttled on the GSI, then the main table will be throttled too.&lt;/li&gt;
&lt;li&gt;Use Case:

&lt;ul&gt;
&lt;li&gt;When the application needs a different (or the same) partition key as the table&lt;/li&gt;
&lt;li&gt;When the application needs finer throughput control&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  DynamoDB Accelerator (DAX)
&lt;/h3&gt;

&lt;p&gt;Amazon DynamoDB Accelerator (DAX) is a fully managed, highly available, in-memory cache for Amazon DynamoDB that delivers up to a 10 times performance improvement.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;DynamoDB response times: Single-digit milliseconds&lt;/li&gt;
&lt;li&gt;DynamoDB with DAX response times: Microseconds&lt;/li&gt;
&lt;li&gt;Reduce read load on DynamoDB&lt;/li&gt;
&lt;li&gt;Supports only eventual consistency&lt;/li&gt;
&lt;li&gt;Redirect your DynamoDB API requests to the DAX endpoint instead of the DynamoDB endpoint&lt;/li&gt;
&lt;/ul&gt;
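&lt;p&gt;DAX behaves like a read-through item cache, which a toy model captures (illustrative only; the real client is the amazondax SDK, not this class):&lt;/p&gt;

```python
class ReadThroughCache:
    """Toy DAX-style item cache in front of a backing table."""

    def __init__(self, backing_table):
        self.backing = backing_table
        self.cache = {}
        self.hits = 0

    def get_item(self, key):
        if key in self.cache:
            self.hits += 1                # served from memory: microseconds
            return self.cache[key]
        value = self.backing.get(key)     # miss: fetch from the table
        self.cache[key] = value
        return value
```

Note the cached value can lag behind the table, which is why DAX supports only eventually consistent reads.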

&lt;p&gt;This is only a brief summary of the core topics I found to be important and not exhaustive. There are more database-related services covered in the certification. Please refer to &lt;a href="https://aws.amazon.com/certification/certified-database-specialty/"&gt;https://aws.amazon.com/certification/certified-database-specialty/&lt;/a&gt; for the full set of topics to prepare.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>database</category>
      <category>cloud</category>
      <category>certification</category>
    </item>
    <item>
      <title>Deploy a Python Flask App on Amazon EC2 - Part 1</title>
      <dc:creator>Lester Sim</dc:creator>
      <pubDate>Sun, 14 May 2023 07:42:20 +0000</pubDate>
      <link>https://dev.to/lestersimjj/deploy-a-python-flask-app-on-amazon-ec2-part-1-375g</link>
      <guid>https://dev.to/lestersimjj/deploy-a-python-flask-app-on-amazon-ec2-part-1-375g</guid>
      <description>&lt;p&gt;In this series of blogs, we'll be deploying a Python Flask application on Amazon EC2 instances.&lt;/p&gt;

&lt;p&gt;During the development stages of writing a Python Flask app, we commonly run it on our local computer as the server, entering e.g. &lt;code&gt;localhost:5000&lt;/code&gt; into our browser to see our changes. Now that the app is ready for deployment, how do we deploy it to the cloud?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;End Goal&lt;/strong&gt;&lt;br&gt;
&amp;lt;&amp;lt; Insert Gif &amp;gt;&amp;gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Agenda&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Part 1: What is Flask, WSGI, Nginx?&lt;/strong&gt;&lt;br&gt;
Part 2: Create an EC2 instance and accessing it&lt;br&gt;
Part 3: Create Flask App and install Nginx on EC2 instance&lt;br&gt;
Part 4: Configure Auto Scaling Group on AWS&lt;br&gt;
Part 5: Making changes to our Flask App&lt;/p&gt;

</description>
    </item>
    <item>
      <title>AWS Machine Learning Certification: Exam Notes</title>
      <dc:creator>Lester Sim</dc:creator>
      <pubDate>Mon, 10 Oct 2022 15:29:29 +0000</pubDate>
      <link>https://dev.to/lestersimjj/aws-machine-learning-certification-exam-notes-n9</link>
      <guid>https://dev.to/lestersimjj/aws-machine-learning-certification-exam-notes-n9</guid>
      <description>&lt;p&gt;&lt;em&gt;Disclaimer: The opinions expressed here are my own and I'm not writing on behalf of AWS or Amazon.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The AWS Machine Learning - Specialty Certification covers a wide spectrum of topics from data engineering to exploratory data analysis to model training and deployment. Here are some quick notes I've gathered to prepare for the certification:&lt;/p&gt;

&lt;h2&gt;
  
  
  AWS AI Services
&lt;/h2&gt;

&lt;p&gt;Beneficial for developers who want to add AI into their applications through API calls instead of developing and training their own ML models from scratch.&lt;/p&gt;

&lt;h3&gt;
  
  
  Amazon Textract
&lt;/h3&gt;

&lt;p&gt;Extract text from scanned documents using Optical Character Recognition (OCR).&lt;/p&gt;

&lt;h4&gt;
  
  
  Documents
&lt;/h4&gt;

&lt;p&gt;Returns text, forms, tables and query responses.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zZBmQf21--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ueuh5nsj0ambzv2o4spj.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zZBmQf21--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ueuh5nsj0ambzv2o4spj.gif" alt="Image description" width="880" height="374"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Expenses
&lt;/h4&gt;

&lt;p&gt;Extracts data from invoices/receipts eg. vendor name, invoice/receipt date, invoice/receipt number, item name, item price, item quantity, total amount.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yppFi8Nv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g31xbeoq220xn3fzfu7n.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yppFi8Nv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g31xbeoq220xn3fzfu7n.gif" alt="Image description" width="880" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Amazon Comprehend
&lt;/h3&gt;

&lt;p&gt;Extract entities, key phrases, language, personal identifiable information (PII), and sentiments from text.&lt;/p&gt;

&lt;h4&gt;
  
  
  Entities
&lt;/h4&gt;

&lt;p&gt;Extract entities from text documents eg. people, places, locations.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Using AWS Console:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bz9ExG5u--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mmuc49n23narxoi4y1gr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bz9ExG5u--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mmuc49n23narxoi4y1gr.png" alt="Image description" width="813" height="814"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Using AWS API:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Yy9hXTOY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/evr0sqonrlh6qjnfo4kt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Yy9hXTOY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/evr0sqonrlh6qjnfo4kt.png" alt="Image description" width="880" height="373"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Key Phrases
&lt;/h4&gt;

&lt;p&gt;Extract the key phrases (one or more words) from text documents.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Using AWS Console:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BBQfjrA7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zy8brljktr2y7ea19wsg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BBQfjrA7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zy8brljktr2y7ea19wsg.png" alt="Image description" width="817" height="819"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Using AWS API:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TBrbFEJy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j7qdjvjbbp7vtpny4e8p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TBrbFEJy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j7qdjvjbbp7vtpny4e8p.png" alt="Image description" width="880" height="374"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Sentiment
&lt;/h4&gt;

&lt;p&gt;Predict the overall sentiment of the text - positive, negative, neutral, mixed.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Using AWS Console:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bYQZK3iq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xj9snwmuzefb9bct2ezv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bYQZK3iq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xj9snwmuzefb9bct2ezv.png" alt="Image description" width="818" height="478"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Using AWS API:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ts3jdOks--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n5ilgw5djhaihtrl22qy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ts3jdOks--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n5ilgw5djhaihtrl22qy.png" alt="Image description" width="880" height="375"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Language
&lt;/h4&gt;

&lt;p&gt;Predict the dominant language of the entire text. Amazon Comprehend can recognize 100 languages.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Using AWS Console:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2bmTY6OF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kxln6azfpjbd5z8o45y3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2bmTY6OF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kxln6azfpjbd5z8o45y3.png" alt="Image description" width="816" height="481"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Using AWS API:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SP7vWuAo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fxjukvhgyba9kvgg9f31.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SP7vWuAo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fxjukvhgyba9kvgg9f31.png" alt="Image description" width="880" height="370"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Personally Identifiable Information (PII)
&lt;/h4&gt;

&lt;p&gt;Lists entities in your input text that contain personal information, e.g. address, bank account number, or phone number.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Using AWS Console:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2-16EI5k--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/46ik6q0czbtzibn9h99w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2-16EI5k--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/46ik6q0czbtzibn9h99w.png" alt="Image description" width="683" height="750"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Using AWS API:&lt;/em&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jJav4zit--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ct5sa1323aneb1gmkx4k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jJav4zit--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ct5sa1323aneb1gmkx4k.png" alt="Image description" width="880" height="374"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Vision
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Amazon Rekognition
&lt;/h3&gt;

&lt;p&gt;Analyze images and videos to identify objects, people, text, scenes, and activities.&lt;/p&gt;

&lt;h4&gt;
  
  
  Label Detection
&lt;/h4&gt;

&lt;p&gt;Extract labels of objects, concepts, scenes, and actions in your images.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KWIjpDLG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ed9yezh7vwz4jo5bmdpl.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KWIjpDLG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ed9yezh7vwz4jo5bmdpl.gif" alt="Image description" width="880" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Facial Analysis
&lt;/h4&gt;

&lt;p&gt;Detect faces and retrieve facial attributes in an image eg. facial expressions, accessories, facial features, etc.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NGCWCh-H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vhqulpx52nwyllwdr1ei.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NGCWCh-H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vhqulpx52nwyllwdr1ei.gif" alt="Image description" width="880" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Face Comparison
&lt;/h4&gt;

&lt;p&gt;Compare faces within a set of images with multiple faces in them. Compares the largest face in the source image (reference face) with up to 100 faces detected in the target image (comparison faces), and generates a similarity score.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TDp9htYL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1lvwr5h5ebh7jsa1zn11.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TDp9htYL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1lvwr5h5ebh7jsa1zn11.gif" alt="Image description" width="880" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Other AWS AI Services
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Amazon Lex: Build conversational interfaces using voice/text as input&lt;/li&gt;
&lt;li&gt;Amazon Polly: Text to speech&lt;/li&gt;
&lt;li&gt;Amazon Transcribe: Speech to text&lt;/li&gt;
&lt;li&gt;Amazon Translate: Translate text between languages&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Domain 1: Data Engineering
&lt;/h2&gt;

&lt;h3&gt;
  
  
  AWS Glue
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/glue/latest/dg/what-is-glue.html"&gt;https://docs.aws.amazon.com/glue/latest/dg/what-is-glue.html&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Serverless data integration service that makes it easy for analytics users to discover, prepare, move, and integrate data from multiple sources.&lt;/li&gt;
&lt;li&gt;Data Sources: S3, RDS, JDBC, DynamoDB, Kinesis Data Streams, Apache Kafka&lt;/li&gt;
&lt;li&gt;Data Targets: S3, RDS, JDBC&lt;/li&gt;
&lt;li&gt;Crawlers: Automatically infer database and table schema from your source data, storing the associated metadata in the AWS Glue Data Catalog.&lt;/li&gt;
&lt;li&gt;ETL Programming Languages: PySpark (Python), Scala&lt;/li&gt;
&lt;li&gt;FindMatches Transform: Use this machine learning transform to identify duplicate or matching records, e.g. matching customers or products, improving fraud detection, etc.&lt;/li&gt;
&lt;/ul&gt;
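
&lt;p&gt;As a rough illustration of what a record-matching transform does (a local sketch only, not the actual FindMatches API), flag record pairs whose text similarity clears a threshold:&lt;/p&gt;

```python
# Conceptual sketch ONLY -- not the Glue FindMatches API.
# Idea: flag pairs of records whose text similarity clears a threshold.
from difflib import SequenceMatcher

def find_matches(records, threshold=0.85):
    """Return pairs of indices whose records look like duplicates."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            score = SequenceMatcher(None, records[i].lower(), records[j].lower()).ratio()
            if score >= threshold:
                pairs.append((i, j))
    return pairs

customers = ["Jon A. Smith", "John A. Smith", "Mary Lee"]
print(find_matches(customers))  # the two Smith records match
```

A real FindMatches transform learns what "duplicate" means from labeled examples instead of a fixed string-similarity threshold.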

&lt;h3&gt;
  
  
  Amazon Athena
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/athena/latest/ug/what-is.html"&gt;https://docs.aws.amazon.com/athena/latest/ug/what-is.html&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Serverless, interactive query service to analyze big data directly in Amazon S3 using standard SQL.&lt;/li&gt;
&lt;li&gt;Integration with AWS Glue: AWS Glue crawlers automatically infer database and table schema from data in S3 and store the associated metadata in AWS Glue Data Catalog. This catalog lets the Athena query engine know how to find, read, and process the data you want to query.&lt;/li&gt;
&lt;li&gt;When to use Amazon Athena vs Redshift vs EMR: &lt;a href="https://docs.aws.amazon.com/athena/latest/ug/when-should-i-use-ate.html"&gt;https://docs.aws.amazon.com/athena/latest/ug/when-should-i-use-ate.html&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Amazon Kinesis
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/kinesis/index.html"&gt;https://docs.aws.amazon.com/kinesis/index.html&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Kinesis Video Stream
&lt;/h4&gt;

&lt;p&gt;Stream live video data, optionally store it, and make the data available for consumption both in real time and on a batch or ad hoc basis.&lt;/p&gt;

&lt;h4&gt;
  
  
  Kinesis Data Stream
&lt;/h4&gt;

&lt;p&gt;Collect and process large streams of data records in real time.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Reading from Data Streams (Consumers): Using Kinesis Data Analytics, Kinesis Data Firehose, Lambda, EC2&lt;/li&gt;
&lt;/ul&gt;
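
&lt;p&gt;A record is routed to a shard by taking the MD5 hash of its partition key as a 128-bit integer and matching it against each shard's hash key range. A minimal sketch, assuming four shards splitting the default key space evenly:&lt;/p&gt;

```python
# How Kinesis Data Streams routes a record to a shard: the partition key's
# MD5 hash (a 128-bit integer) is matched against each shard's hash key range.
# Assumption for illustration: 4 shards splitting the key space evenly.
import hashlib

NUM_SHARDS = 4
KEY_SPACE = 2 ** 128

def shard_for(partition_key):
    hash_int = int(hashlib.md5(partition_key.encode()).hexdigest(), 16)
    return hash_int * NUM_SHARDS // KEY_SPACE  # index of the owning shard

# Records with the same partition key always land on the same shard,
# which preserves per-key ordering.
print(shard_for("device-42"), shard_for("device-7"))
```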

&lt;h4&gt;
  
  
  Kinesis Data Firehose
&lt;/h4&gt;

&lt;p&gt;ETL service that captures, transforms, and delivers streaming data to data lakes, data stores, and analytics services. It buffers incoming streaming data to a certain size or for a certain period of time before delivering it to its destinations.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use Lambda to transform each buffered batch or convert its file format. Eg. Apache Parquet is more efficient to query than JSON.&lt;/li&gt;
&lt;li&gt;Delivery Stream Destination: S3, Redshift, Elasticsearch, Splunk, HTTP endpoint, etc&lt;/li&gt;
&lt;/ul&gt;
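
&lt;p&gt;A transformation Lambda receives a batch of base64-encoded records and must return each one with a &lt;code&gt;recordId&lt;/code&gt;, a &lt;code&gt;result&lt;/code&gt;, and the re-encoded &lt;code&gt;data&lt;/code&gt;. A sketch of that contract (the JSON-to-CSV payload shape here is an assumption for illustration):&lt;/p&gt;

```python
# Sketch of a Firehose transformation Lambda: decode each buffered record,
# convert it (here: a tiny JSON payload to a CSV line -- an invented shape),
# and return it re-encoded. The recordId / result / data response fields
# follow the documented Firehose-to-Lambda contract.
import base64
import json

def handler(event, context=None):
    out = []
    for rec in event["records"]:
        payload = json.loads(base64.b64decode(rec["data"]))
        line = "{},{}\n".format(payload["id"], payload["value"])
        out.append({
            "recordId": rec["recordId"],
            "result": "Ok",  # or "Dropped" / "ProcessingFailed"
            "data": base64.b64encode(line.encode()).decode(),
        })
    return {"records": out}

sample = {"records": [{"recordId": "1",
                       "data": base64.b64encode(json.dumps({"id": 7, "value": 3.5}).encode()).decode()}]}
print(handler(sample))
```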

&lt;h4&gt;
  
  
  Kinesis Data Analytics
&lt;/h4&gt;

&lt;p&gt;Continuously read and analyze data from a connected streaming source in real-time. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Source: Kinesis Data Stream, Kinesis Data Firehose&lt;/li&gt;
&lt;li&gt;Destination: 1/ Kinesis Data Stream, 2/ Kinesis Data Firehose, 3/ Lambda&lt;/li&gt;
&lt;li&gt;Runtime: SQL, Apache Flink&lt;/li&gt;
&lt;li&gt;Aggregate/Analytical Functions: Hotspots, Random Cut Forest, etc&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Domain 2: Exploratory Data Analysis
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Data Labelling: Amazon SageMaker Ground Truth (Data labeling service using human annotators from Amazon Mechanical Turk or your own private workforce)&lt;/li&gt;
&lt;li&gt;Feature Engineering: one-hot encoding, binning, outliers, normalization, dimensionality reduction with PCA. For text: TF-IDF, Bag of Words, N-Gram.&lt;/li&gt;
&lt;li&gt;Know the different types of data visualization: Histogram, scatter plot, box plot, correlation heatmap, hierarchical plot, etc.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Domain 3: Modelling
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/sagemaker/latest/dg/algos.html"&gt;https://docs.aws.amazon.com/sagemaker/latest/dg/algos.html&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Supervised Learning Algos: XGBoost, k-NN, Linear Learner, DeepAR Forecasting, Object2Vec&lt;/li&gt;
&lt;li&gt;Unsupervised Learning Algos: K-Means, PCA, Random Cut Forest&lt;/li&gt;
&lt;li&gt;Text Analysis Algos: BlazingText, Sequence-to-Sequence, LDA, Neural Topic Model (NTM)&lt;/li&gt;
&lt;li&gt;Image Processing Algos: Image Classification (MXNet or TensorFlow), Object Detection, Semantic Segmentation (pixel level)&lt;/li&gt;
&lt;li&gt;Evaluation of ML Models: Confusion Matrix, AUC-ROC, Accuracy, Precision, Recall, F1 Score, RMSE&lt;/li&gt;
&lt;li&gt;Overfitting Solutions: 1/ Use fewer features, 2/ Decrease n-grams size, 3/ Increase amount of regularization used, 4/ Increase amount of training data examples&lt;/li&gt;
&lt;li&gt;Underfitting Solutions: 1/ Add new domain-specific features, 2/ Add more Cartesian products, 3/ Increase n-grams size, 4/ Decrease amount of regularization used, 5/ Increase amount of training data examples&lt;/li&gt;
&lt;li&gt;Hyperparameter Tuning: Random Search, Bayesian Search&lt;/li&gt;
&lt;li&gt;How SageMaker Studio works: &lt;a href="https://aws.amazon.com/blogs/machine-learning/dive-deep-into-amazon-sagemaker-studio-notebook-architecture/"&gt;https://aws.amazon.com/blogs/machine-learning/dive-deep-into-amazon-sagemaker-studio-notebook-architecture/&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;SageMaker Studio Notebooks vs SageMaker Notebook Instances: &lt;a href="https://docs.aws.amazon.com/sagemaker/latest/dg/notebooks-comparison.html"&gt;https://docs.aws.amazon.com/sagemaker/latest/dg/notebooks-comparison.html&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
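
&lt;p&gt;Several of the evaluation metrics above follow directly from a binary confusion matrix:&lt;/p&gt;

```python
# Accuracy, precision, recall, and F1 computed from a binary confusion matrix
# (tp = true positives, fp = false positives, fn = false negatives, tn = true negatives).
def metrics(tp, fp, fn, tn):
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)   # of predicted positives, how many were right
    recall = tp / (tp + fn)      # of actual positives, how many were found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return accuracy, precision, recall, f1

acc, p, r, f1 = metrics(tp=80, fp=20, fn=10, tn=90)
print(round(acc, 2), round(p, 3), round(r, 4), round(f1, 3))
```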

&lt;h2&gt;
  
  
  Domain 4: Machine Learning Implementation &amp;amp; Operations
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Real-time Inference: Create an HTTPS endpoint if you require a persistent endpoint for apps to call to get inferences&lt;/li&gt;
&lt;li&gt;Batch Transform: Preprocess datasets and run inference on large datasets; does not require a persistent endpoint.&lt;/li&gt;
&lt;li&gt;SageMaker Neo: Automatically optimizes machine learning models for inference on cloud instances and edge devices to run faster with no loss in accuracy.&lt;/li&gt;
&lt;li&gt;SageMaker Elastic Inference (EI): Speed up the throughput and decrease the latency of getting real-time inferences from your deep learning models that are deployed as SageMaker hosted models, but at a fraction of the cost of using a GPU instance for your endpoint&lt;/li&gt;
&lt;li&gt;Track and monitor SageMaker metrics using: 1/ AWS Console, 2/ CloudWatch, 3/ SageMaker Python SDK APIs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;This is only a brief summary of the core topics I found to be important and definitely not exhaustive. Please refer to &lt;a href="https://aws.amazon.com/certification/certified-machine-learning-specialty/"&gt;https://aws.amazon.com/certification/certified-machine-learning-specialty/&lt;/a&gt; for the full set of topics to prepare.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>machinelearning</category>
      <category>ai</category>
    </item>
    <item>
      <title>Intro to AWS AI Services: Textract, Comprehend, Rekognition</title>
      <dc:creator>Lester Sim</dc:creator>
      <pubDate>Sun, 25 Sep 2022 16:03:13 +0000</pubDate>
      <link>https://dev.to/lestersimjj/intro-to-aws-ai-services-textract-comprehend-rekognition-56hj</link>
      <guid>https://dev.to/lestersimjj/intro-to-aws-ai-services-textract-comprehend-rekognition-56hj</guid>
      <description>&lt;p&gt;Sharing these demos and compiled docs of AWS AI Services to help you visualize how these services work. These pre-built models will be beneficial for developers with no prior experience to machine learning as the models have been pre-trained. You will only need to call the API with the inputs to get the predicted results.&lt;/p&gt;

&lt;h2&gt;
  
  
  Text &amp;amp; Documents
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Amazon Textract
&lt;/h3&gt;

&lt;p&gt;Extract text from scanned documents using Optical Character Recognition (OCR).&lt;/p&gt;

&lt;h4&gt;
  
  
  Documents
&lt;/h4&gt;

&lt;p&gt;Returns text, forms, tables and query responses.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zZBmQf21--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ueuh5nsj0ambzv2o4spj.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zZBmQf21--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ueuh5nsj0ambzv2o4spj.gif" alt="Image description" width="880" height="374"&gt;&lt;/a&gt;&lt;/p&gt;
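
&lt;p&gt;The response is a flat list of &lt;code&gt;Blocks&lt;/code&gt; (pages, lines, words, key-value sets), so pulling out the detected lines is a simple filter. The response below is a hand-made, trimmed sample for illustration:&lt;/p&gt;

```python
# A Textract text-detection response is a flat list of "Blocks".
# This sample response is hand-made and trimmed for illustration; in practice
# it comes back from the Textract API.
sample_response = {
    "Blocks": [
        {"BlockType": "PAGE"},
        {"BlockType": "LINE", "Text": "INVOICE #1234", "Confidence": 99.1},
        {"BlockType": "LINE", "Text": "Total: $56.00", "Confidence": 98.7},
        {"BlockType": "WORD", "Text": "INVOICE", "Confidence": 99.2},
    ]
}

def lines(response):
    """Collect the text of every detected LINE block."""
    return [b["Text"] for b in response["Blocks"] if b["BlockType"] == "LINE"]

print(lines(sample_response))
```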

&lt;h4&gt;
  
  
  Expenses
&lt;/h4&gt;

&lt;p&gt;Extracts data from invoices/receipts eg. vendor name, invoice/receipt date, invoice/receipt number, item name, item price, item quantity, total amount&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--yppFi8Nv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g31xbeoq220xn3fzfu7n.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--yppFi8Nv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g31xbeoq220xn3fzfu7n.gif" alt="Image description" width="880" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Amazon Comprehend
&lt;/h3&gt;

&lt;p&gt;Extract entities, key phrases, language, personal identifiable information (PII), and sentiments from text.&lt;/p&gt;

&lt;h4&gt;
  
  
  Entities
&lt;/h4&gt;

&lt;p&gt;Extract entities from text documents eg. people, places, locations.&lt;/p&gt;

&lt;p&gt;Using AWS Console:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bz9ExG5u--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mmuc49n23narxoi4y1gr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bz9ExG5u--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mmuc49n23narxoi4y1gr.png" alt="Image description" width="813" height="814"&gt;&lt;/a&gt;&lt;br&gt;
Using AWS API:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Yy9hXTOY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/evr0sqonrlh6qjnfo4kt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Yy9hXTOY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/evr0sqonrlh6qjnfo4kt.png" alt="Image description" width="880" height="373"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Key Phrases
&lt;/h4&gt;

&lt;p&gt;Extract the key phrases (one or more words) from text documents.&lt;/p&gt;

&lt;p&gt;Using AWS Console:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BBQfjrA7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zy8brljktr2y7ea19wsg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BBQfjrA7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zy8brljktr2y7ea19wsg.png" alt="Image description" width="817" height="819"&gt;&lt;/a&gt;&lt;br&gt;
Using AWS API:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TBrbFEJy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j7qdjvjbbp7vtpny4e8p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TBrbFEJy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/j7qdjvjbbp7vtpny4e8p.png" alt="Image description" width="880" height="374"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Sentiment
&lt;/h4&gt;

&lt;p&gt;Predict the overall sentiment of the text - positive, negative, neutral, mixed.&lt;/p&gt;

&lt;p&gt;Using AWS Console:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--bYQZK3iq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xj9snwmuzefb9bct2ezv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--bYQZK3iq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xj9snwmuzefb9bct2ezv.png" alt="Image description" width="818" height="478"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Using AWS API:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ts3jdOks--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n5ilgw5djhaihtrl22qy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ts3jdOks--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n5ilgw5djhaihtrl22qy.png" alt="Image description" width="880" height="375"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Language
&lt;/h4&gt;

&lt;p&gt;Predict the dominant language of the entire text. Amazon Comprehend can recognize 100 languages.&lt;/p&gt;

&lt;p&gt;Using AWS Console:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2bmTY6OF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kxln6azfpjbd5z8o45y3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2bmTY6OF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kxln6azfpjbd5z8o45y3.png" alt="Image description" width="816" height="481"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Using AWS API:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--SP7vWuAo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fxjukvhgyba9kvgg9f31.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SP7vWuAo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fxjukvhgyba9kvgg9f31.png" alt="Image description" width="880" height="370"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Personally Identifiable Information (PII)
&lt;/h4&gt;

&lt;p&gt;List out entities in your input text that contain personal information eg. address, bank account number, or phone number.&lt;/p&gt;

&lt;p&gt;Using AWS Console:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2-16EI5k--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/46ik6q0czbtzibn9h99w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2-16EI5k--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/46ik6q0czbtzibn9h99w.png" alt="Image description" width="683" height="750"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Using AWS API:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jJav4zit--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ct5sa1323aneb1gmkx4k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jJav4zit--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ct5sa1323aneb1gmkx4k.png" alt="Image description" width="880" height="374"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Vision
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Amazon Rekognition
&lt;/h3&gt;

&lt;p&gt;Analyze images and videos to identify objects, people, text, scenes, and activities.&lt;/p&gt;

&lt;h4&gt;
  
  
  Label Detection
&lt;/h4&gt;

&lt;p&gt;Extract labels of objects, concepts, scenes, and actions in your images.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KWIjpDLG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ed9yezh7vwz4jo5bmdpl.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KWIjpDLG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ed9yezh7vwz4jo5bmdpl.gif" alt="Image description" width="880" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Facial Analysis
&lt;/h4&gt;

&lt;p&gt;Detect faces and retrieve facial attributes in an image eg. facial expressions, accessories, facial features, etc.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NGCWCh-H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vhqulpx52nwyllwdr1ei.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NGCWCh-H--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vhqulpx52nwyllwdr1ei.gif" alt="Image description" width="880" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Face Comparison
&lt;/h4&gt;

&lt;p&gt;Compare faces across two images that may each contain multiple faces. Rekognition compares the largest face in the source image (the reference face) with up to 100 faces detected in the target image (the comparison faces), and generates a similarity score for each match.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TDp9htYL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1lvwr5h5ebh7jsa1zn11.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TDp9htYL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1lvwr5h5ebh7jsa1zn11.gif" alt="Image description" width="880" height="378"&gt;&lt;/a&gt;&lt;/p&gt;
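
&lt;p&gt;The result lists one entry per matched target face, each with a &lt;code&gt;Similarity&lt;/code&gt; percentage. A sketch of filtering strong matches from a hand-made sample response:&lt;/p&gt;

```python
# Shape of a face-comparison result: one FaceMatches entry per target face
# that matched the reference face, each with a Similarity percentage.
# Hand-made sample response for illustration.
sample = {
    "FaceMatches": [
        {"Similarity": 99.4, "Face": {"Confidence": 99.9}},
        {"Similarity": 81.0, "Face": {"Confidence": 99.8}},
    ],
    "UnmatchedFaces": [{"Confidence": 99.7}],
}

def strong_matches(resp, threshold=90.0):
    """Similarity scores of matches at or above the threshold."""
    return [m["Similarity"] for m in resp["FaceMatches"] if m["Similarity"] >= threshold]

print(strong_matches(sample))
```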

&lt;h2&gt;
  
  
  Sources:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;For other AWS AI services, refer to: 
&lt;a href="https://aws.amazon.com/machine-learning/ai-services/"&gt;https://aws.amazon.com/machine-learning/ai-services/&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/rekognition/latest/dg/what-is.html"&gt;https://docs.aws.amazon.com/rekognition/latest/dg/what-is.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/comprehend/latest/dg/what-is.html"&gt;https://docs.aws.amazon.com/comprehend/latest/dg/what-is.html&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Disclaimer: The opinions expressed here are my own and I'm not writing on behalf of AWS or Amazon.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>ai</category>
      <category>cloud</category>
    </item>
    <item>
      <title>Deploy a Django Web App to Azure with connection to Azure PostgreSQL  </title>
      <dc:creator>Lester Sim</dc:creator>
      <pubDate>Tue, 22 Feb 2022 02:54:37 +0000</pubDate>
      <link>https://dev.to/lestersimjj/deploy-a-django-web-app-to-azure-with-connection-to-azure-postgresql-hl3</link>
      <guid>https://dev.to/lestersimjj/deploy-a-django-web-app-to-azure-with-connection-to-azure-postgresql-hl3</guid>
      <description>&lt;p&gt;In this tutorial, we’ll deploy a ready-made Django web app to Azure Web Service and connect the back-end to Azure PostgreSQL for the app to write data to the database.&lt;/p&gt;

&lt;h3&gt;
  
  
  End Goal
&lt;/h3&gt;

&lt;p&gt;The sample Django repo is a polls web application where a user picks a question and selects one of the options as their answer. The app records the votes in the PostgreSQL database and displays them to the user. A demo of the end state is shown below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw6mff9vtdmrpxvvrgjax.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw6mff9vtdmrpxvvrgjax.gif" alt="demo"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;An Azure subscription is required. If you don't have an Azure subscription, create a free Azure account before you begin.&lt;/li&gt;
&lt;li&gt;Basic knowledge of Django, PostgreSQL, Git commands&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  Agenda
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Fork and clone the sample repo of Django Web App&lt;/li&gt;
&lt;li&gt;Provision PostgreSQL Flexible Server in Azure&lt;/li&gt;
&lt;li&gt;Connect to PostgreSQL Flexible Server and create a database&lt;/li&gt;
&lt;li&gt;Provision the web app using Azure App Services&lt;/li&gt;
&lt;li&gt;Connect the app to Azure PostgreSQL&lt;/li&gt;
&lt;li&gt;Deploy app code from Github to Azure App Services&lt;/li&gt;
&lt;li&gt;Run Django database migrations to establish schema on Azure PostgreSQL database&lt;/li&gt;
&lt;li&gt;Test the app by creating a poll question&lt;/li&gt;
&lt;li&gt;Make changes to the app codes and redeploy&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  Fork and Clone Sample Repo
&lt;/h3&gt;

&lt;p&gt;Navigate to &lt;a href="https://github.com/Azure-Samples/djangoapp" rel="noopener noreferrer"&gt;https://github.com/Azure-Samples/djangoapp&lt;/a&gt; and fork the repository into your own GitHub account.&lt;/p&gt;

&lt;h3&gt;
  
  
  Provision PostgreSQL Flexible Server in Azure
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open the &lt;a href="https://portal.azure.com/" rel="noopener noreferrer"&gt;Azure portal&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Search for and select Azure Database for PostgreSQL&lt;/li&gt;
&lt;li&gt;Select Create &amp;gt; Flexible Server &amp;gt; Create.&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;On the Flexible Server page, enter the following information:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Field&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Subscription&lt;/td&gt;
&lt;td&gt;Select your desired Azure Subscription&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Resource Group&lt;/td&gt;
&lt;td&gt;Create a new resource group eg. DjangoPostgres-Tutorial-rg&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Server Name&lt;/td&gt;
&lt;td&gt;A name for the database server that's unique across all of Azure&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Region&lt;/td&gt;
&lt;td&gt;Select a location near you&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Workload Type&lt;/td&gt;
&lt;td&gt;Development&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Compute + Storage&lt;/td&gt;
&lt;td&gt;Leave as default. Burstable, B1ms&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Availability Zone, High Availability, PostgreSQL version&lt;/td&gt;
&lt;td&gt;Leave as default&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Admin Username, Password&lt;/td&gt;
&lt;td&gt;Enter credentials for an administrator account on the database server. Record these credentials as you'll need them later in this tutorial. Note: do not use the $ character in the username or password as it will conflict with web app environmental variables later on&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select Next: Networking &amp;gt;, and on that page set Connectivity method to Public access, and then under Firewall rules check the box for Allow public access from any Azure service within Azure to this server.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select Review + Create, then Create. Azure takes a few minutes to provision the database server.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;After provisioning is complete, select Go to resource to open the overview page for the database server.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;




&lt;h3&gt;
  
  
  Connect to Azure Postgres and Create a Database
&lt;/h3&gt;

&lt;p&gt;Open the Azure Cloud Shell from the Azure portal by selecting the Cloud Shell icon at the top of the window. &lt;/p&gt;

&lt;p&gt;In the Cloud Shell, run the following command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

psql &lt;span class="nt"&gt;--host&lt;/span&gt;&lt;span class="o"&gt;=[&lt;/span&gt;server-url].postgres.database.azure.com &lt;span class="nt"&gt;--port&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;5432 &lt;span class="nt"&gt;--username&lt;/span&gt;&lt;span class="o"&gt;=[&lt;/span&gt;user-name] &lt;span class="nt"&gt;--dbname&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;postgres


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Replace [server-name] and [user-name] with the names used in the previous section.&lt;/p&gt;

&lt;p&gt;When the shell connects successfully, you should see the prompt postgres=&amp;gt;. This prompt indicates that you're connected to the default administrative database named "postgres". (The "postgres" database isn't intended for app usage.)&lt;/p&gt;

&lt;p&gt;Run the command below in postgres to create a database called &lt;em&gt;pollsdb&lt;/em&gt; that your app will connect to later:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;

&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;DATABASE&lt;/span&gt; &lt;span class="n"&gt;pollsdb&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;If the database is created successfully, the command should display CREATE DATABASE. To verify that the database was created, run:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;

&lt;span class="err"&gt;\&lt;/span&gt;&lt;span class="k"&gt;c&lt;/span&gt; &lt;span class="n"&gt;pollsdb&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Exit postgres by entering the command &lt;code&gt;exit&lt;/code&gt;.&lt;/p&gt;




&lt;h3&gt;
  
  
  Provision the Web App Infrastructure in Azure
&lt;/h3&gt;

&lt;p&gt;Open Azure Portal &amp;gt; Search for App Services &amp;gt; Create&lt;/p&gt;

&lt;p&gt;Enter the following information:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Field&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Subscription&lt;/td&gt;
&lt;td&gt;Select the subscription you want to use if different from the default.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Resource group&lt;/td&gt;
&lt;td&gt;Select the "DjangoPostgres-Tutorial-rg" group you created in the previous section.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;App name&lt;/td&gt;
&lt;td&gt;A name for your web app that's unique across all of Azure.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Publish&lt;/td&gt;
&lt;td&gt;Select Code.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Runtime stack&lt;/td&gt;
&lt;td&gt;Select Python 3.8 from the drop-down list.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Region&lt;/td&gt;
&lt;td&gt;Select a location near you.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Linux Plan&lt;/td&gt;
&lt;td&gt;The portal will populate this field with an App Service Plan name based on your resource group. If you want to change the name, select Create new.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Sku and size&lt;/td&gt;
&lt;td&gt;Select Change size &amp;gt; Dev/Test &amp;gt; Select B1 &amp;gt; Apply. You can scale the plan later for better performance.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Select Review + Create, then select Create. Azure takes a few minutes to provision the infrastructure for the web app. Once done, select Go To Resource.&lt;/p&gt;




&lt;h3&gt;
  
  
  Connect Azure App Service to Azure Postgres
&lt;/h3&gt;

&lt;p&gt;Configure the environment variables on Azure App Service to connect to Azure Postgres. On the App Service page, scroll down and select Configuration &amp;gt; Application Settings.&lt;/p&gt;

&lt;p&gt;Click New Application Setting with the following:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Setting name&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;DBHOST&lt;/td&gt;
&lt;td&gt;The URL of the database server from the previous section; that is, [server-name].postgres.database.azure.com.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;DBNAME&lt;/td&gt;
&lt;td&gt;pollsdb&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;DBUSER&lt;/td&gt;
&lt;td&gt;The administrator user name used when you provisioned the database.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;DBPASS&lt;/td&gt;
&lt;td&gt;The administrator password you created earlier.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Save your settings.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F91dw0bnf4n6cmh2thgur.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F91dw0bnf4n6cmh2thgur.png" alt="app-env-variables"&gt;&lt;/a&gt;&lt;/p&gt;
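
&lt;p&gt;Inside the app, these four settings typically surface in &lt;code&gt;settings.py&lt;/code&gt; roughly like the sketch below. The variable names match the table above; the exact settings in the sample repo may differ:&lt;/p&gt;

```python
# Sketch of how the DBHOST / DBNAME / DBUSER / DBPASS application settings
# feed the Django database configuration. Variable names match the table
# above; the surrounding settings are assumptions for illustration.
def database_settings(env):
    """Build the Django DATABASES['default'] entry from App Service settings."""
    return {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": env["DBNAME"],
        "HOST": env["DBHOST"],  # full host, eg. myserver.postgres.database.azure.com
        "USER": env["DBUSER"],
        "PASSWORD": env["DBPASS"],
        "PORT": "5432",
    }

# In settings.py this would be wired up as:
#   import os
#   DATABASES = {"default": database_settings(os.environ)}
```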




&lt;h3&gt;
  
  
  Deploy App to Azure App Service from Github Repo
&lt;/h3&gt;

&lt;p&gt;With the database and connection settings in place, you can now configure the Azure App to deploy code directly from a GitHub repository.&lt;/p&gt;

&lt;p&gt;On the same Azure App Service &amp;gt; Scroll down to Deployment section &amp;gt; Select Deployment Center.&lt;/p&gt;

&lt;p&gt;In the Source control, select GitHub. Follow the sign-in prompts to use your current GitHub login.&lt;/p&gt;

&lt;p&gt;In the GitHub section, select the following values:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Field&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Organization&lt;/td&gt;
&lt;td&gt;The GitHub account to which you forked the sample repository.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Repository&lt;/td&gt;
&lt;td&gt;djangoapp&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Branch&lt;/td&gt;
&lt;td&gt;Select flexible-server branch&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Workflow Option&lt;/td&gt;
&lt;td&gt;Add a workflow&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Select Save to trigger the build and deployment workflow. Go to Actions in your GitHub repository for djangoapp to monitor progress. Azure will deploy the code and start the app.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn4ujdneb6txenbzp4fez.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn4ujdneb6txenbzp4fez.gif" alt="github-deploy"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Move to the next steps once the build is complete.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjo7of5bow6kngix48kvg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjo7of5bow6kngix48kvg.png" alt="github-actions"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  Run Django Database Migrations
&lt;/h3&gt;

&lt;p&gt;What remains is to establish the necessary schema in the database itself. Do this by "migrating" the data models in the Django app to the database.&lt;/p&gt;

&lt;p&gt;On the same App Service page, scroll down to Development Tools on the left panel &amp;gt; select SSH. Since the app server is hosted on Azure, you are now connected to it via SSH. Enter the &lt;code&gt;ls&lt;/code&gt; command to list the app's files.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6c1dgp4k52cdrl7ajucb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6c1dgp4k52cdrl7ajucb.png" alt="app-ssh-enter"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc0svtn4xb4sv43waekuv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc0svtn4xb4sv43waekuv.png" alt="app-ssh"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the console, run database migrations:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

python manage.py migrate


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
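&lt;p&gt;Behind the scenes, &lt;code&gt;migrate&lt;/code&gt; translates each model class in the app into SQL DDL and applies it to the connected PostgreSQL database. The sketch below is only an illustration: the model is the polls tutorial's &lt;code&gt;Question&lt;/code&gt;, the generated SQL is approximated, and SQLite stands in for PostgreSQL so the sketch runs anywhere.&lt;/p&gt;

```python
# Sketch: what "python manage.py migrate" does conceptually.
# A Django model like the polls app's Question:
#
#   class Question(models.Model):
#       question_text = models.CharField(max_length=200)
#       pub_date = models.DateTimeField("date published")
#
# is turned into DDL roughly like the statement below. SQLite is used
# here only so the sketch runs without a PostgreSQL server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE polls_question (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        question_text VARCHAR(200) NOT NULL,
        pub_date TIMESTAMP NOT NULL
    )
    """
)
conn.execute(
    "INSERT INTO polls_question (question_text, pub_date) "
    "VALUES ('What is your favourite colour?', '2021-10-19 13:46:04')"
)
rows = conn.execute("SELECT id, question_text FROM polls_question").fetchall()
print(rows)  # [(1, 'What is your favourite colour?')]
```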

&lt;p&gt;Create an administrator login for the app.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

python manage.py createsuperuser


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The createsuperuser command prompts you for Django superuser (or admin) credentials, which are used within the web app. For the purposes of this tutorial, use the default username root, press Enter for the email address to leave it blank, and enter &lt;code&gt;password321&lt;/code&gt; for the password.&lt;/p&gt;




&lt;h3&gt;
  
  
  Create a poll question in the App
&lt;/h3&gt;

&lt;p&gt;You're now ready to run a quick test of the app to demonstrate that it is working with the PostgreSQL database.&lt;/p&gt;

&lt;p&gt;In the browser window or tab for the web app, return to the Overview page, then click on the URL for the web app.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq45t5b1ia3qsgm1r360r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq45t5b1ia3qsgm1r360r.png" alt="app-url"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The app should display the message "Polls app" and "No polls are available" because there are no specific polls yet in the database.&lt;/p&gt;

&lt;p&gt;Sign in using the Django superuser credentials from the previous section (&lt;code&gt;root&lt;/code&gt; and &lt;code&gt;password321&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;Under Polls, select Add next to Questions and create a poll question with some choices.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F865h3kkjl4t6msxtjrdm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F865h3kkjl4t6msxtjrdm.png" alt="app-add-poll"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Browse back to the home page to confirm that the questions are now presented to the user. Test out the app, vote for different options, and you will see the database is being updated.&lt;/p&gt;




&lt;h2&gt;
  
  
  Update the App Code and Redeploy Automatically
&lt;/h2&gt;

&lt;p&gt;Azure automatically redeploys your app code whenever you commit changes to the GitHub repository. However, if you change the Django app's data models, you must migrate those changes to the database:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Connect to the web app again via SSH as before&lt;/li&gt;
&lt;li&gt;Run the migrations again with &lt;code&gt;python manage.py migrate&lt;/code&gt;
&lt;/li&gt;
&lt;/ol&gt;




&lt;h3&gt;
  
  
  Resources
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.microsoft.com/en-us/azure/developer/python/tutorial-python-postgresql-app-portal?pivots=postgres-flexible-server" rel="noopener noreferrer"&gt;Full Tutorial&lt;/a&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  About the Sample Django Web App
&lt;/h3&gt;

&lt;p&gt;The sample web app is created by following Django documentation tutorial &lt;a href="https://docs.djangoproject.com/en/3.2/intro/tutorial01/" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>django</category>
      <category>postgres</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Create a PostgreSQL Database on Azure in 5mins</title>
      <dc:creator>Lester Sim</dc:creator>
      <pubDate>Tue, 19 Oct 2021 13:46:04 +0000</pubDate>
      <link>https://dev.to/lestersimjj/create-a-postgresql-database-on-azure-in-10mins-7nm</link>
      <guid>https://dev.to/lestersimjj/create-a-postgresql-database-on-azure-in-10mins-7nm</guid>
      <description>&lt;h2&gt;
  
  
  About Azure Database for PostgreSQL
&lt;/h2&gt;

&lt;p&gt;Azure Database for PostgreSQL is a fully managed Database as a Service (DBaaS) offering. Compared to running it on-premises, Azure automatically handles the fundamental database management tasks so you can focus on developing your applications instead of spending time managing databases. Here's a quick summary of what DBaaS refers to:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flyfxdobmh2j24s8upe1q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flyfxdobmh2j24s8upe1q.png" alt="DBaaS"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;An Azure subscription is required. If you don't have an Azure subscription, create a free &lt;a href="https://azure.microsoft.com/free/" rel="noopener noreferrer"&gt;Azure account&lt;/a&gt; before you begin.&lt;/li&gt;
&lt;li&gt;Basic knowledge of pgAdmin/psql commands&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Agenda
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Creating an Azure Database for PostgreSQL (Flexible Server) using Azure Portal&lt;/li&gt;
&lt;li&gt;Configuring the Firewall&lt;/li&gt;
&lt;li&gt;Connecting/Querying the Database using pgAdmin/psql&lt;/li&gt;
&lt;/ul&gt;


&lt;h2&gt;
  
  
  Create an Azure Database for PostgreSQL
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Go to the &lt;a href="https://portal.azure.com/" rel="noopener noreferrer"&gt;Azure Portal&lt;/a&gt; to create an Azure Database for PostgreSQL flexible server. Search for and select &lt;em&gt;Azure Database for PostgreSQL servers&lt;/em&gt;.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpbm4qjvwuv7tv2crnrsv.png" alt="Azure PostgreSQL"&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click on &lt;em&gt;Create&lt;/em&gt; &amp;gt; select &lt;em&gt;Flexible Server&lt;/em&gt;.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffhd0rc4dtnah4kanleds.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffhd0rc4dtnah4kanleds.png" alt="Flexible Server"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Enter the Basics form with the following information:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Setting&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Subscription&lt;/td&gt;
&lt;td&gt;Select your desired Azure Subscription&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Resource Group&lt;/td&gt;
&lt;td&gt;Create a new resource group eg. postgres-tutorial&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Server Name&lt;/td&gt;
&lt;td&gt;Create a globally unique name eg. postgres-tutorial-server-22102021&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Region&lt;/td&gt;
&lt;td&gt;Select a location closest to you&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Workload Type&lt;/td&gt;
&lt;td&gt;Development&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Compute + Storage&lt;/td&gt;
&lt;td&gt;Use the default: Burstable, 1 vCore, 2GiB RAM, 32 GiB Storage&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Availability Zone&lt;/td&gt;
&lt;td&gt;No preference&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;PostgreSQL Version&lt;/td&gt;
&lt;td&gt;13&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;High Availability&lt;/td&gt;
&lt;td&gt;Leave as unchecked&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Admin username&lt;/td&gt;
&lt;td&gt;Set your admin username eg. postgresadmin&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Password&lt;/td&gt;
&lt;td&gt;Set your password&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click &lt;em&gt;Next: Networking&lt;/em&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;By default, the server that you create is not publicly accessible. If you are querying from your computer, you'll need to allow access from your IP address. In Connectivity Method &amp;gt; select &lt;em&gt;Public Access (allowed IP addresses)&lt;/em&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Scroll down to Firewall Rules &amp;gt; select &lt;em&gt;Add current client IP address&lt;/em&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Famg000s9107xpztv1jn2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Famg000s9107xpztv1jn2.png" alt="Networking"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;em&gt;Review + Create&lt;/em&gt; &amp;gt; &lt;em&gt;Create&lt;/em&gt; to provision the server. This operation might take a few minutes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;And you're done! PostgreSQL database is successfully created, click &lt;em&gt;Go to Resource&lt;/em&gt; to view the database created.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhfm1o2od4abde9u0tawz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhfm1o2od4abde9u0tawz.png" alt="successful-creation"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;


&lt;h2&gt;
  
  
  Connecting to the Database with pgAdmin
&lt;/h2&gt;

&lt;p&gt;Two common methods of connecting to a PostgreSQL database are pgAdmin and the psql command line.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Click Overview in Azure Portal to copy the server name and username for our connection later on.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0p5tixw40map7hot66uj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0p5tixw40map7hot66uj.png" alt="overview"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Open pgAdmin &amp;gt; right click on Server &amp;gt; Create Server&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F80ony3t0xyoftzf0wjeo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F80ony3t0xyoftzf0wjeo.png" alt="pgadmin"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a name for this connection eg. azure-postgres-tutorial&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Click the Connection tab and enter the hostname and username copied from the Overview page on Azure portal. In my example, it will be the following: &lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Settings&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Host Name&lt;/td&gt;
&lt;td&gt;postgres-tutorial-server-22102021.postgres.database.azure.com&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Port&lt;/td&gt;
&lt;td&gt;5432&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Maintenance Database&lt;/td&gt;
&lt;td&gt;postgres&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Username&lt;/td&gt;
&lt;td&gt;postgresadmin&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Password&lt;/td&gt;
&lt;td&gt;Your password&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click &lt;em&gt;Save&lt;/em&gt;. Once connected successfully, expand the server you have just connected to and you’ll see three databases already created by Azure. Do not delete these databases. &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feyipqwk9jtc67xekd2jo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feyipqwk9jtc67xekd2jo.png" alt="default-databases"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Perfect! Now you are connected to the PostgreSQL database running on Azure and ready to start adding data into your database!&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Right click on Database &amp;gt; Create &amp;gt; Database. Enter any name as your new database name eg. mypgsqldb.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flspa2fno53z1izn7as0i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flspa2fno53z1izn7as0i.png" alt="pgadmin-create-database"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click on mypgsqldb and the Query Tool icon at the top window.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy6pzxhusak1gqybyn04a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy6pzxhusak1gqybyn04a.png" alt="pgadmin-query"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Copy and paste the commands below into the Query Editor and select Execute Query.&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;inventory&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;serial&lt;/span&gt; &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
    &lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; 
    &lt;span class="n"&gt;quantity&lt;/span&gt; &lt;span class="nb"&gt;INTEGER&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;INSERT&lt;/span&gt; &lt;span class="k"&gt;INTO&lt;/span&gt; &lt;span class="n"&gt;inventory&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;quantity&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;VALUES&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'banana'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;150&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; 
&lt;span class="k"&gt;INSERT&lt;/span&gt; &lt;span class="k"&gt;INTO&lt;/span&gt; &lt;span class="n"&gt;inventory&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;quantity&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;VALUES&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'orange'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;154&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;inventory&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;The results from the inventory table are shown below:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5bcnmhxk8smqvu89ebvd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5bcnmhxk8smqvu89ebvd.png" alt="pgadmin-results"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Connecting to the Database with psql
&lt;/h2&gt;

&lt;p&gt;Alternatively, you can use psql to connect to and query the database.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open Terminal/Powershell&lt;/li&gt;
&lt;li&gt;Run the following command to connect to the default database &lt;em&gt;postgres&lt;/em&gt;. Replace the hostname and username with the actual server name and admin username you set in the previous steps. These can also be found on the &lt;em&gt;Overview&lt;/em&gt; tab.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0p5tixw40map7hot66uj.png" alt="overview"&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In my example, the command will be:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;psql &lt;span class="nt"&gt;--host&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;postgres-tutorial-server-22102021.postgres.database.azure.com &lt;span class="nt"&gt;--port&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;5432 &lt;span class="nt"&gt;--username&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;postgresadmin &lt;span class="nt"&gt;--dbname&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;postgres
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
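&lt;p&gt;The same parameters can be collapsed into a single libpq-style connection URI, which psql and most client libraries also accept. A minimal sketch follows; &lt;code&gt;make_pg_uri&lt;/code&gt; is a hypothetical helper written for illustration, not part of any library.&lt;/p&gt;

```python
# Sketch: build a libpq-style connection URI from the same parameters
# passed to psql above. make_pg_uri is a hypothetical helper, shown
# only to illustrate how the pieces fit together.
from urllib.parse import quote

def make_pg_uri(host: str, user: str, dbname: str, port: int = 5432) -> str:
    # Percent-encode the username as a precaution; some hosted-database
    # username formats include characters like '@'.
    return f"postgresql://{quote(user)}@{host}:{port}/{dbname}"

uri = make_pg_uri(
    host="postgres-tutorial-server-22102021.postgres.database.azure.com",
    user="postgresadmin",
    dbname="postgres",
)
print(uri)
# postgresql://postgresadmin@postgres-tutorial-server-22102021.postgres.database.azure.com:5432/postgres
```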



&lt;p&gt;Enter the database password created previously when prompted. Now, you're connected to the database with the following screen displayed:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;psql &lt;span class="nt"&gt;--host&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;postgres-tutorial-server-22102021.postgres.database.azure.com &lt;span class="nt"&gt;--port&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;5432 &lt;span class="nt"&gt;--username&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;postgresadmin &lt;span class="nt"&gt;--dbname&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;postgres
Password &lt;span class="k"&gt;for &lt;/span&gt;user postgresadmin: 
psql &lt;span class="o"&gt;(&lt;/span&gt;13.1, server 13.4&lt;span class="o"&gt;)&lt;/span&gt;
SSL connection &lt;span class="o"&gt;(&lt;/span&gt;protocol: TLSv1.3, cipher: xxxxxxxx, bits: 256, compression: off&lt;span class="o"&gt;)&lt;/span&gt;
Type &lt;span class="s2"&gt;"help"&lt;/span&gt; &lt;span class="k"&gt;for &lt;/span&gt;help.

&lt;span class="nv"&gt;postgres&lt;/span&gt;&lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a database called mypgsqldb2&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;DATABASE&lt;/span&gt; &lt;span class="n"&gt;mypgsqldb2&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Switch the connection to this newly created database instead of the default &lt;em&gt;postgres&lt;/em&gt; database that we initially connected to.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="err"&gt;\&lt;/span&gt;&lt;span class="k"&gt;c&lt;/span&gt; &lt;span class="n"&gt;mypgsqldb2&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a table in this database&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;inventory&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;id&lt;/span&gt; &lt;span class="nb"&gt;serial&lt;/span&gt; &lt;span class="k"&gt;PRIMARY&lt;/span&gt; &lt;span class="k"&gt;KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
    &lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="nb"&gt;VARCHAR&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; 
    &lt;span class="n"&gt;quantity&lt;/span&gt; &lt;span class="nb"&gt;INTEGER&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Insert some data into the Inventory table&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;INSERT&lt;/span&gt; &lt;span class="k"&gt;INTO&lt;/span&gt; &lt;span class="n"&gt;inventory&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;quantity&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;VALUES&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'banana'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;150&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; 
&lt;span class="k"&gt;INSERT&lt;/span&gt; &lt;span class="k"&gt;INTO&lt;/span&gt; &lt;span class="n"&gt;inventory&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;quantity&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;VALUES&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'orange'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;154&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Query the results from the Inventory table&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;inventory&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
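&lt;p&gt;If you'd like to sanity-check these statements without a live server, the same sequence can be replayed against SQLite. Note this is only an approximation: SQLite has no &lt;code&gt;serial&lt;/code&gt; type, so the sketch substitutes &lt;code&gt;INTEGER PRIMARY KEY&lt;/code&gt;; the rest of the SQL is unchanged.&lt;/p&gt;

```python
# Sanity-check the inventory statements locally with SQLite.
# SQLite stands in for PostgreSQL here: "serial" is replaced by
# INTEGER PRIMARY KEY, everything else is identical to the psql steps.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE inventory (
        id INTEGER PRIMARY KEY,
        name VARCHAR(50),
        quantity INTEGER
    )
    """
)
conn.execute("INSERT INTO inventory (id, name, quantity) VALUES (1, 'banana', 150)")
conn.execute("INSERT INTO inventory (id, name, quantity) VALUES (2, 'orange', 154)")
rows = conn.execute("SELECT * FROM inventory").fetchall()
print(rows)  # [(1, 'banana', 150), (2, 'orange', 154)]
```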



&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;And you're done! You've successfully created a PostgreSQL database on Azure and connected to it using pgAdmin/psql!&lt;/p&gt;

&lt;h2&gt;
  
  
  What's Next
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Use Azure CLI to create PostgreSQL databases&lt;/li&gt;
&lt;li&gt;Explore the different Compute + Storage options&lt;/li&gt;
&lt;li&gt;Create a Flask App connected to Azure PostgreSQL&lt;/li&gt;
&lt;li&gt;Deploy a Flask App to Azure App Service with Azure PostgreSQL&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Cleaning Up
&lt;/h2&gt;

&lt;p&gt;If you don't expect to use this database in the future, search for &lt;em&gt;Resource Groups&lt;/em&gt; in the top search bar of the Azure Portal. Select the resource group name (eg. postgres-tutorial) &amp;gt; &lt;em&gt;Delete Resource Group&lt;/em&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://docs.microsoft.com/en-us/azure/postgresql/flexible-server/quickstart-create-server-portal" rel="noopener noreferrer"&gt;https://docs.microsoft.com/en-us/azure/postgresql/flexible-server/quickstart-create-server-portal&lt;/a&gt;&lt;br&gt;
&lt;a href="https://docs.microsoft.com/en-us/azure/postgresql/tutorial-design-database-using-azure-portal" rel="noopener noreferrer"&gt;https://docs.microsoft.com/en-us/azure/postgresql/tutorial-design-database-using-azure-portal&lt;/a&gt; &lt;/p&gt;

</description>
      <category>azure</category>
      <category>postgres</category>
      <category>cloud</category>
    </item>
  </channel>
</rss>
