<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: La Tasha "L."  Pollard</title>
    <description>The latest articles on DEV Community by La Tasha "L."  Pollard (@ljpeg).</description>
    <link>https://dev.to/ljpeg</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2688873%2F6bdd46f4-949f-47fd-b2a7-c57b67a4adb5.jpeg</url>
      <title>DEV Community: La Tasha "L."  Pollard</title>
      <link>https://dev.to/ljpeg</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ljpeg"/>
    <language>en</language>
    <item>
      <title>An NBA Player Data Lake automation script, leveraging AWS S3, Glue, and Athena.</title>
      <dc:creator>La Tasha "L."  Pollard</dc:creator>
      <pubDate>Sat, 08 Feb 2025 22:21:25 +0000</pubDate>
      <link>https://dev.to/ljpeg/a-nba-data-lake-leveraging-aws-s3-glue-and-athena-27cc</link>
      <guid>https://dev.to/ljpeg/a-nba-data-lake-leveraging-aws-s3-glue-and-athena-27cc</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;This is the Week 1 Day 3 project of the 30-Day DevOps challenge I am participating in. &lt;a href="https://www.linkedin.com/posts/deshae-lyda_cloudengineering-devops-aws-activity-7265757561625186304-3Q_T/?utm_source=social_share_send&amp;amp;utm_medium=member_desktop_web" rel="noopener noreferrer"&gt;Learn more about the challenge and its creators.&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Project Overview
&lt;/h2&gt;

&lt;p&gt;In today's project, we are implementing a data lake to ingest, store, and manage the high-volume, complex data that we will retrieve from our call to the NBA API. &lt;/p&gt;

&lt;h3&gt;
  
  
  What's a Data Lake?
&lt;/h3&gt;

&lt;p&gt;A data lake is similar to a database in that both store large amounts of data. The difference is that, whereas a database is structured (tables, schemas, relationships), a data lake can handle large volumes of data in any format (structured, semi-structured, unstructured), and the data doesn't need to be organized first. &lt;/p&gt;

&lt;p&gt;We can think of a database like an organized filing cabinet: when we add to it, we add the file to the correct location based on the schema of the cabinet. &lt;/p&gt;

&lt;p&gt;A data lake, by contrast, is more like a storage warehouse where we can add any kind of data. We don't have to organize it or put it in the correct location; we can just dump it in there. &lt;/p&gt;

&lt;p&gt;But what good is a whole bunch of unprocessed, unorganized data to us? So in this project we gather all this data, store it in a data lake backed by an S3 bucket, and then use other AWS services (e.g. Glue and Athena) to make the data queryable. &lt;/p&gt;




&lt;h2&gt;
  
  
  Key Features
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Fetches NBA player data using SportsDataIO's NBA API.&lt;/li&gt;
&lt;li&gt;Stores raw and processed data in AWS S3 buckets.&lt;/li&gt;
&lt;li&gt;Uses AWS Glue Crawlers to infer schema from raw data in S3.&lt;/li&gt;
&lt;li&gt;Stores metadata in AWS Glue Data Catalog for structured querying.&lt;/li&gt;
&lt;li&gt;Allows SQL-based querying of structured data stored in S3.&lt;/li&gt;
&lt;li&gt;Implements IAM roles and policies to control access to S3, Glue, and Athena.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Technical Architecture
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F075t8df0zglo0291tm27.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F075t8df0zglo0291tm27.png" alt="Flow chart depicting the interaction and data flow of the architecture." width="800" height="636"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Technologies
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Cloud Provider: AWS&lt;/li&gt;
&lt;li&gt;Core Services: S3, Glue, Athena&lt;/li&gt;
&lt;li&gt;External API: NBA Game API (SportsData.io)&lt;/li&gt;
&lt;li&gt;Programming Language: Python 3.x&lt;/li&gt;
&lt;li&gt;IAM Security: Least privilege policies&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  NBA API
&lt;/h3&gt;

&lt;p&gt;We are again making a request to the NBA API, this time for data about NBA players. This is how I coded the request:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;nba_endpoint&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://api.sportsdata.io/v3/nba/scores/json/Players?key=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;nba_endpoint&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;raise_for_status&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;  
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Fetched NBA data successfully.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;  
&lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Error fetching NBA data: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The response looks something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="o"&gt;[&lt;/span&gt;
  &lt;span class="o"&gt;{&lt;/span&gt;
    &lt;span class="s2"&gt;"PlayerID"&lt;/span&gt;: 20000441,
    &lt;span class="s2"&gt;"SportsDataID"&lt;/span&gt;: &lt;span class="s2"&gt;""&lt;/span&gt;,
    &lt;span class="s2"&gt;"Status"&lt;/span&gt;: &lt;span class="s2"&gt;"Active"&lt;/span&gt;,
    &lt;span class="s2"&gt;"TeamID"&lt;/span&gt;: 29,
    &lt;span class="s2"&gt;"Team"&lt;/span&gt;: &lt;span class="s2"&gt;"PHO"&lt;/span&gt;,
    &lt;span class="s2"&gt;"Jersey"&lt;/span&gt;: 3,
    &lt;span class="s2"&gt;"PositionCategory"&lt;/span&gt;: &lt;span class="s2"&gt;"G"&lt;/span&gt;,
    &lt;span class="s2"&gt;"Position"&lt;/span&gt;: &lt;span class="s2"&gt;"SG"&lt;/span&gt;,
    &lt;span class="s2"&gt;"FirstName"&lt;/span&gt;: &lt;span class="s2"&gt;"Bradley"&lt;/span&gt;,
    &lt;span class="s2"&gt;"LastName"&lt;/span&gt;: &lt;span class="s2"&gt;"Beal"&lt;/span&gt;,
    &lt;span class="s2"&gt;"Height"&lt;/span&gt;: 76,
    &lt;span class="s2"&gt;"Weight"&lt;/span&gt;: 207,
    &lt;span class="s2"&gt;"BirthDate"&lt;/span&gt;: &lt;span class="s2"&gt;"1993-06-28T00:00:00"&lt;/span&gt;,
    &lt;span class="s2"&gt;"BirthCity"&lt;/span&gt;: &lt;span class="s2"&gt;"St. Louis"&lt;/span&gt;,
    &lt;span class="s2"&gt;"BirthState"&lt;/span&gt;: &lt;span class="s2"&gt;"MO"&lt;/span&gt;,
    &lt;span class="s2"&gt;"BirthCountry"&lt;/span&gt;: &lt;span class="s2"&gt;"USA"&lt;/span&gt;,
    &lt;span class="s2"&gt;"HighSchool"&lt;/span&gt;: null,
    &lt;span class="s2"&gt;"College"&lt;/span&gt;: &lt;span class="s2"&gt;"Florida"&lt;/span&gt;,
    &lt;span class="s2"&gt;"Salary"&lt;/span&gt;: 50203930,
    &lt;span class="s2"&gt;"PhotoUrl"&lt;/span&gt;: &lt;span class="s2"&gt;"https://s3-us-west-2.amazonaws.com/static.fantasydata.com/headshots/nba/low-res/0.png"&lt;/span&gt;,
    &lt;span class="s2"&gt;"Experience"&lt;/span&gt;: 12,
    &lt;span class="s2"&gt;"SportRadarPlayerID"&lt;/span&gt;: &lt;span class="s2"&gt;"ff461754-ad20-4eeb-af02-2b46cc980b24"&lt;/span&gt;,
    &lt;span class="s2"&gt;"RotoworldPlayerID"&lt;/span&gt;: 1966,
    &lt;span class="s2"&gt;"RotoWirePlayerID"&lt;/span&gt;: 3303,
    &lt;span class="s2"&gt;"FantasyAlarmPlayerID"&lt;/span&gt;: 200464,
    &lt;span class="s2"&gt;"StatsPlayerID"&lt;/span&gt;: 606912,
    &lt;span class="s2"&gt;"SportsDirectPlayerID"&lt;/span&gt;: 750970,
    &lt;span class="s2"&gt;"XmlTeamPlayerID"&lt;/span&gt;: 3395,
    &lt;span class="s2"&gt;"InjuryStatus"&lt;/span&gt;: &lt;span class="s2"&gt;"Scrambled"&lt;/span&gt;,
    &lt;span class="s2"&gt;"InjuryBodyPart"&lt;/span&gt;: &lt;span class="s2"&gt;"Scrambled"&lt;/span&gt;,
    &lt;span class="s2"&gt;"InjuryStartDate"&lt;/span&gt;: &lt;span class="s2"&gt;"2025-02-06T00:00:00"&lt;/span&gt;,
    &lt;span class="s2"&gt;"InjuryNotes"&lt;/span&gt;: &lt;span class="s2"&gt;"Scrambled"&lt;/span&gt;,
    &lt;span class="s2"&gt;"FanDuelPlayerID"&lt;/span&gt;: 15595,
    &lt;span class="s2"&gt;"DraftKingsPlayerID"&lt;/span&gt;: 606912,
    &lt;span class="s2"&gt;"YahooPlayerID"&lt;/span&gt;: 5009,
    &lt;span class="s2"&gt;"FanDuelName"&lt;/span&gt;: &lt;span class="s2"&gt;"Bradley Beal"&lt;/span&gt;,
    &lt;span class="s2"&gt;"DraftKingsName"&lt;/span&gt;: &lt;span class="s2"&gt;"Bradley Beal"&lt;/span&gt;,
    &lt;span class="s2"&gt;"YahooName"&lt;/span&gt;: &lt;span class="s2"&gt;"Bradley Beal"&lt;/span&gt;,
    &lt;span class="s2"&gt;"DepthChartPosition"&lt;/span&gt;: &lt;span class="s2"&gt;"SG"&lt;/span&gt;,
    &lt;span class="s2"&gt;"DepthChartOrder"&lt;/span&gt;: 6,
    &lt;span class="s2"&gt;"GlobalTeamID"&lt;/span&gt;: 20000029,
    &lt;span class="s2"&gt;"FantasyDraftName"&lt;/span&gt;: &lt;span class="s2"&gt;"Bradley Beal"&lt;/span&gt;,
    &lt;span class="s2"&gt;"FantasyDraftPlayerID"&lt;/span&gt;: 606912,
    &lt;span class="s2"&gt;"UsaTodayPlayerID"&lt;/span&gt;: 8315651,
    &lt;span class="s2"&gt;"UsaTodayHeadshotUrl"&lt;/span&gt;: &lt;span class="s2"&gt;"http://cdn.usatsimg.com/api/download/?imageID=24445236"&lt;/span&gt;,
    &lt;span class="s2"&gt;"UsaTodayHeadshotNoBackgroundUrl"&lt;/span&gt;: &lt;span class="s2"&gt;"http://cdn.usatsimg.com/api/download/?imageID=24445234"&lt;/span&gt;,
    &lt;span class="s2"&gt;"UsaTodayHeadshotUpdated"&lt;/span&gt;: &lt;span class="s2"&gt;"2024-10-09T14:17:10"&lt;/span&gt;,
    &lt;span class="s2"&gt;"UsaTodayHeadshotNoBackgroundUpdated"&lt;/span&gt;: &lt;span class="s2"&gt;"2024-10-09T14:17:04"&lt;/span&gt;,
    &lt;span class="s2"&gt;"NbaDotComPlayerID"&lt;/span&gt;: 203078
  &lt;span class="o"&gt;}&lt;/span&gt;,
...
&lt;span class="o"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's like 50 lines of data for one player! &lt;em&gt;..Googles the number of current NBA players, multiplies by 50..&lt;/em&gt; Yeah, this is a good introductory example of a large data set.   &lt;/p&gt;

&lt;h3&gt;
  
  
  S3
&lt;/h3&gt;

&lt;p&gt;AWS S3 - Simple Storage Service - will be our storage layer. The raw data retrieved from the API call will be stored in our S3 bucket. The processed data (after being processed by Glue) is also stored in an S3 bucket.&lt;/p&gt;
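&lt;p&gt;As a rough sketch of what that storage step might look like with &lt;code&gt;boto3&lt;/code&gt; - the client is passed in so the logic can be exercised without AWS, and the bucket name and key layout here are illustrative assumptions, not the project's exact values:&lt;/p&gt;

```python
import json

def upload_raw_data(s3_client, bucket_name, players):
    """Write the raw API payload under a raw-data/ prefix in the bucket.

    The key layout is an illustrative assumption; the real script may
    name its objects differently.
    """
    key = "raw-data/nba_player_data.json"
    s3_client.put_object(
        Bucket=bucket_name,
        Key=key,
        Body=json.dumps(players),
    )
    return key
```

&lt;p&gt;Keeping raw and processed data under separate prefixes (or separate buckets) keeps the two layers of the lake distinct, which matters later when Glue crawls only the raw prefix.&lt;/p&gt;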

&lt;h3&gt;
  
  
  Glue
&lt;/h3&gt;

&lt;p&gt;AWS Glue is a serverless data integration service that includes tools for ETL (Extract, Transform, Load) jobs and a Data Catalog for metadata management. We will use Glue Crawler to process the raw data we are storing in our S3 bucket and Glue Data Catalog to store and manage metadata about the structured data. This will make the data accessible for querying using AWS Athena.  &lt;/p&gt;
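&lt;p&gt;A minimal sketch of how such a crawler might be configured - all names and the role ARN are placeholder assumptions, and the resulting dictionary would be passed to &lt;code&gt;glue_client.create_crawler(**config)&lt;/code&gt;:&lt;/p&gt;

```python
def build_crawler_config(database_name, bucket_name, role_arn):
    """Assemble keyword arguments for glue_client.create_crawler().

    The crawler scans the raw-data/ prefix in S3, infers a schema, and
    records it as a table in the named Glue Data Catalog database.
    """
    return {
        "Name": f"{database_name}-crawler",
        "Role": role_arn,
        "DatabaseName": database_name,
        "Targets": {
            "S3Targets": [{"Path": f"s3://{bucket_name}/raw-data/"}]
        },
    }
```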

&lt;h3&gt;
  
  
  Athena
&lt;/h3&gt;

&lt;p&gt;AWS Athena is a serverless query service used to analyze data. We will use it to run SQL queries on the data in our S3 bucket after it's been processed by AWS Glue. &lt;/p&gt;
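&lt;p&gt;Athena queries are asynchronous: you start an execution and fetch the results later from the S3 output location. A sketch of kicking one off - the client is injected so the logic can be exercised without AWS, and the names are illustrative:&lt;/p&gt;

```python
def run_athena_query(athena_client, query, database, output_location):
    """Start an Athena query; results are written to the given S3 path."""
    response = athena_client.start_query_execution(
        QueryString=query,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_location},
    )
    # The execution id is used to poll status and retrieve results.
    return response["QueryExecutionId"]
```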

&lt;h3&gt;
  
  
  A helpful analogy:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;S3 = The Library &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Where all the raw and processed data is stored.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Glue Data Catalog = The Library's Catalog &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A system that organizes and records information about the books/data stored in S3.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Athena = A Reader &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Who looks at the catalog (Glue Data Catalog) to find and understand the structure of the books (data) before reading them directly from the library (S3).&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  Python Script
&lt;/h3&gt;

&lt;p&gt;These resources could be created manually through the AWS Console; however, this project automates the process with Python code that uses the boto3 library. The script is copy-and-pasted into the AWS CloudShell terminal and executed to provision the necessary AWS resources: fetching data from the API, creating the S3 bucket, configuring the Glue Crawler, updating the Glue Data Catalog, and setting up Athena for querying the stored data.&lt;/p&gt;
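&lt;p&gt;One &lt;code&gt;boto3&lt;/code&gt; wrinkle worth knowing when automating bucket creation: &lt;code&gt;create_bucket&lt;/code&gt; rejects an explicit &lt;code&gt;LocationConstraint&lt;/code&gt; of &lt;code&gt;us-east-1&lt;/code&gt;, so region handling needs a branch. A sketch, with the client injected so the branch can be tested without AWS:&lt;/p&gt;

```python
def create_bucket(s3_client, bucket_name, region):
    """Create the data lake bucket, handling the us-east-1 special case.

    S3 treats us-east-1 as the default region and rejects it as an
    explicit LocationConstraint, so it must be omitted there.
    """
    if region == "us-east-1":
        s3_client.create_bucket(Bucket=bucket_name)
    else:
        s3_client.create_bucket(
            Bucket=bucket_name,
            CreateBucketConfiguration={"LocationConstraint": region},
        )
```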




&lt;h2&gt;
  
  
  Project Structure
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;nba-data-lake/
├── .env                             &lt;span class="c"&gt;# holds environment variables&lt;/span&gt;
├── policies/        
│   └── IAM_role.json                &lt;span class="c"&gt;# json for policy permissions&lt;/span&gt;
├── src/
│   ├── setup_nba_data_lake.py       &lt;span class="c"&gt;# main script&lt;/span&gt;
│   ├── delete_nba_data_lake.py      &lt;span class="c"&gt;# script to delete resources&lt;/span&gt;
├── README.md                        &lt;span class="c"&gt;# documentation&lt;/span&gt;
└── requirements.txt                 &lt;span class="c"&gt;# dependencies &lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Setup Instructions
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;AWS Account with the following permissions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;S3: CreateBucket, PutObject, DeleteBucket, ListBucket&lt;/li&gt;
&lt;li&gt;Glue: CreateDatabase, CreateTable, DeleteDatabase, DeleteTable&lt;/li&gt;
&lt;li&gt;Athena: StartQueryExecution, GetQueryResults&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;NBA API key &lt;/p&gt;&lt;/li&gt;

&lt;/ul&gt;

&lt;h3&gt;
  
  
  1. Clone the Repo
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone &amp;lt;url&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  2. Go to the AWS Console, and click the CloudShell icon to open the terminal.
&lt;/h3&gt;

&lt;h3&gt;
  
  
  3. Type &lt;code&gt;nano setup_nba_data_lake.py&lt;/code&gt; into the shell console and press enter.
&lt;/h3&gt;

&lt;h3&gt;
  
  
  4. Copy the code from the repo file &lt;code&gt;src/setup_nba_data_lake.py&lt;/code&gt; into the shell console.
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Update the api_key variable to your actual key&lt;/li&gt;
&lt;li&gt;Update the nba_endpoint variable.&lt;/li&gt;
&lt;li&gt;Update the bucket_name variable.&lt;/li&gt;
&lt;li&gt;Update the region variable (if necessary).&lt;/li&gt;
&lt;li&gt;Hit ctrl + x to exit, then y to save.&lt;/li&gt;
&lt;li&gt;Hit enter to confirm the name.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  5. Type &lt;code&gt;python setup_nba_data_lake.py&lt;/code&gt; into the console and hit enter to run the code.
&lt;/h3&gt;

&lt;h3&gt;
  
  
  6. Manually check for the resources.
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Go to S3 and you should see the new bucket with 3 objects inside of it.&lt;/li&gt;
&lt;li&gt;Click on raw-data and open the file inside of it. &lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  7. Query the data with Athena.
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Go to Athena and paste the sample query:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;FirstName&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;LastName&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;Position&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Team&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;nba_players&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="k"&gt;Position&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'PG'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Click run and you should see output under "Query Results".&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  8. Delete Resources
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Navigate to the CloudShell console.&lt;/li&gt;
&lt;li&gt;Type &lt;code&gt;nano delete_nba_data_lake.py&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Copy-and-paste the content from the file &lt;code&gt;src/delete_nba_data_lake.py&lt;/code&gt; into the console.&lt;/li&gt;
&lt;li&gt;Update the bucket_name variable to match the bucket created. &lt;/li&gt;
&lt;li&gt;Hit ctrl + x to exit, then y to save.&lt;/li&gt;
&lt;li&gt;Hit enter to confirm the name.&lt;/li&gt;
&lt;li&gt;Type &lt;code&gt;python delete_nba_data_lake.py&lt;/code&gt; into the console and hit enter.&lt;/li&gt;
&lt;li&gt;Manually confirm resources have been deleted.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Today I learned...
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;What data lakes are and how they differ from databases&lt;/li&gt;
&lt;li&gt;How to create a Python script to automate the provisioning of AWS resources&lt;/li&gt;
&lt;li&gt;What the services AWS Glue and AWS Athena are&lt;/li&gt;
&lt;li&gt;How to use Glue Crawler and Glue Data Catalog to process data&lt;/li&gt;
&lt;li&gt;How to use Athena to query data&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Future Enhancements
&lt;/h2&gt;

&lt;p&gt;I am writing this after reaching the initial goal of this challenge: to automate the creation of a data lake using AWS services.&lt;/p&gt;

&lt;p&gt;If I have the capacity I will enhance this app by integrating a data visualization dashboard that connects to AWS Athena. I could also add logic to include NFL player data.&lt;/p&gt;




&lt;p&gt;And if you've made it this far, thanks for reading! Feel free to check out my &lt;a href="https://github.com/Ljpeg" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;. I'd love to connect on &lt;a href="https://www.linkedin.com/in/latashapollard/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;. &lt;/p&gt;

</description>
      <category>cloud</category>
      <category>devops</category>
      <category>aws</category>
      <category>automation</category>
    </item>
    <item>
      <title>The VSCode one looks awesome, I’m going to try it!</title>
      <dc:creator>La Tasha "L."  Pollard</dc:creator>
      <pubDate>Fri, 07 Feb 2025 02:00:24 +0000</pubDate>
      <link>https://dev.to/ljpeg/the-vscode-one-looks-awesome-im-going-to-try-it-1k81</link>
      <guid>https://dev.to/ljpeg/the-vscode-one-looks-awesome-im-going-to-try-it-1k81</guid>
      <description>&lt;div class="ltag__link"&gt;
  &lt;a href="/justin3go" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F852049%2F50bb3a45-5e79-4c37-bfeb-9d3c1d097568.jpeg" alt="justin3go"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="https://dev.to/justin3go/10-creative-open-source-portfolio-templates-o9n" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;10 Creative &amp;amp; Open Source Portfolio Templates&lt;/h2&gt;
      &lt;h3&gt;Justin3go ・ Feb 5&lt;/h3&gt;
      &lt;div class="ltag__link__taglist"&gt;
        &lt;span class="ltag__link__tag"&gt;#webdev&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#opensource&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#portfolio&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#github&lt;/span&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;


</description>
      <category>webdev</category>
      <category>opensource</category>
      <category>portfolio</category>
      <category>github</category>
    </item>
    <item>
      <title>A Cloud-based NBA Game Day Alert System using Python, AWS Lambda, SNS, and EventBridge.</title>
      <dc:creator>La Tasha "L."  Pollard</dc:creator>
      <pubDate>Wed, 05 Feb 2025 18:14:17 +0000</pubDate>
      <link>https://dev.to/ljpeg/a-cloud-based-nba-game-day-alert-system-using-python-aws-lambda-sns-and-eventbridge-3g7o</link>
      <guid>https://dev.to/ljpeg/a-cloud-based-nba-game-day-alert-system-using-python-aws-lambda-sns-and-eventbridge-3g7o</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;This is the Week 1 Day 2 project of the 30-Day DevOps challenge that I am participating in. Learn more about the challenge and its creators, s/o to them, &lt;a href="https://www.linkedin.com/posts/deshae-lyda_cloudengineering-devops-aws-activity-7265757561625186304-3Q_T?utm_source=social_share_send&amp;amp;utm_medium=member_desktop_web" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  Project Overview
&lt;/h2&gt;

&lt;p&gt;This project creates a cloud-based NBA game day alert system leveraging Python, AWS Lambda, Amazon EventBridge, Amazon SNS, and an NBA API. NBA fans can subscribe to this service to receive real-time notifications of game scores via email or text message. &lt;/p&gt;




&lt;h2&gt;
  
  
  Key Features
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Fetches NBA game scores using SportsDataIO's NBA API.&lt;/li&gt;
&lt;li&gt;Sends score updates to subscribers via email or text message using Amazon SNS.&lt;/li&gt;
&lt;li&gt;Automates sending of notifications using Amazon EventBridge.&lt;/li&gt;
&lt;li&gt;Follows principle of least privilege for IAM roles. &lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Technologies
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Cloud Provider: AWS&lt;/li&gt;
&lt;li&gt;Core Services: SNS, Lambda, EventBridge&lt;/li&gt;
&lt;li&gt;External API: NBA Game API (SportsData.io)&lt;/li&gt;
&lt;li&gt;Programming Language: Python 3.x&lt;/li&gt;
&lt;li&gt;IAM Security: Least privilege policies&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  NBA API
&lt;/h3&gt;

&lt;p&gt;&lt;small&gt;(The more I interact with, research, and write about APIs, the better I understand them. Have you heard the notion that you only really know something if you can explain it in the simplest of terms? Well, I'm gradually feeling more confident in my ability to explain APIs and the HTTP request-response cycle, and I am going to attempt to do so in a forthcoming post that I'll link here once it's up.)&lt;/small&gt; &lt;/p&gt;

&lt;p&gt;In this project, we make a request to the NBA API for data about games for the current day. The request looks something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;api_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://api.sportsdata.io/v3/nba/scores/json/GamesByDate/&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;todays_date&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;?key=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;your_api_key&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; 
  &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;api_url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;raise_for_status&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
  &lt;span class="n"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
  &lt;span class="nf"&gt;print &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;indent&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; 
  &lt;span class="nf"&gt;print &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;An error occurred: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With the call wrapped in a try-except block, the NBA API responds to our request with either an error message or the data we requested. The raw response is hard to decipher, so we decode and format it using methods like &lt;code&gt;.json()&lt;/code&gt; and &lt;code&gt;json.dumps()&lt;/code&gt;.  &lt;/p&gt;

&lt;p&gt;The formatted response looks something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="o"&gt;[&lt;/span&gt;
    &lt;span class="o"&gt;{&lt;/span&gt;
        &lt;span class="s2"&gt;"GameID"&lt;/span&gt;: 21678,
        &lt;span class="s2"&gt;"Season"&lt;/span&gt;: 2025,
        &lt;span class="s2"&gt;"SeasonType"&lt;/span&gt;: 1,
        &lt;span class="s2"&gt;"Status"&lt;/span&gt;: &lt;span class="s2"&gt;"Scheduled"&lt;/span&gt;,
        &lt;span class="s2"&gt;"Day"&lt;/span&gt;: &lt;span class="s2"&gt;"2025-02-05T00:00:00"&lt;/span&gt;,
        &lt;span class="s2"&gt;"DateTime"&lt;/span&gt;: &lt;span class="s2"&gt;"2025-02-05T19:00:00"&lt;/span&gt;,
        &lt;span class="s2"&gt;"AwayTeam"&lt;/span&gt;: &lt;span class="s2"&gt;"MIL"&lt;/span&gt;,
        &lt;span class="s2"&gt;"HomeTeam"&lt;/span&gt;: &lt;span class="s2"&gt;"CHA"&lt;/span&gt;,
        &lt;span class="s2"&gt;"AwayTeamID"&lt;/span&gt;: 15,
        &lt;span class="s2"&gt;"HomeTeamID"&lt;/span&gt;: 2,
        &lt;span class="s2"&gt;"StadiumID"&lt;/span&gt;: 2,
        &lt;span class="s2"&gt;"Channel"&lt;/span&gt;: &lt;span class="s2"&gt;"FDSS"&lt;/span&gt;,
        &lt;span class="s2"&gt;"Attendance"&lt;/span&gt;: null,
        &lt;span class="s2"&gt;"AwayTeamScore"&lt;/span&gt;: null,
        &lt;span class="s2"&gt;"HomeTeamScore"&lt;/span&gt;: null,
        ...
    &lt;span class="o"&gt;}&lt;/span&gt;,
   ... 
&lt;span class="o"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note that each game's information is held in a dictionary, so we can extract what we want from the decoded data using the &lt;code&gt;.get()&lt;/code&gt; method. For example, we can extract the home team with a line like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;home_team&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;game&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;HomeTeam&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Unknown&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This creates a variable &lt;code&gt;home_team&lt;/code&gt; and sets it to the value associated with the key "HomeTeam"; if that key doesn't exist, &lt;code&gt;home_team&lt;/code&gt; is set to "Unknown".&lt;/p&gt;

&lt;p&gt;Thanks NBA API!  &lt;/p&gt;
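&lt;p&gt;Putting that pattern together, here is a small sketch of turning one game dictionary into a human-readable line - the message format is my own illustrative choice, not the project's exact output:&lt;/p&gt;

```python
def format_game_summary(game):
    """Build a one-line summary from a game dict, with safe defaults."""
    away = game.get("AwayTeam", "Unknown")
    home = game.get("HomeTeam", "Unknown")
    away_score = game.get("AwayTeamScore")
    home_score = game.get("HomeTeamScore")
    # Scores are null until a game starts, so fall back to the status.
    if away_score is None or home_score is None:
        return f"{away} at {home}: {game.get('Status', 'Unknown')}"
    return f"{away} {away_score} at {home} {home_score}"
```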

&lt;h3&gt;
  
  
  Event-driven architecture
&lt;/h3&gt;

&lt;p&gt;Today I learned... about event-driven architecture. This project is an example of it: we set up a &lt;code&gt;cron&lt;/code&gt; schedule in EventBridge that invokes the code we house in Lambda. The code in Lambda makes the API call, transforms the data, and sends it to SNS, which then sends out the notifications. The event, here the schedule we set in EventBridge, automatically kicks off the rest of the workflow.&lt;/p&gt;

&lt;h3&gt;
  
  
  EventBridge
&lt;/h3&gt;

&lt;p&gt;EventBridge helps route events from different AWS services and custom applications. In this case, we use EventBridge to define a scheduled rule that triggers our Lambda function at specific intervals. This eliminates the need for manual execution and ensures the workflow runs consistently.&lt;/p&gt;
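&lt;p&gt;A sketch of what such a rule might look like when built with &lt;code&gt;boto3&lt;/code&gt; - the rule name and schedule are illustrative assumptions, and the dictionary would be passed to &lt;code&gt;events_client.put_rule(**rule)&lt;/code&gt;:&lt;/p&gt;

```python
def build_schedule_rule(rule_name, cron_fields):
    """Assemble keyword arguments for events_client.put_rule().

    EventBridge cron expressions have six fields (minutes, hours,
    day-of-month, month, day-of-week, year); e.g. "0 9-23/2 * * ? *"
    fires every two hours between 9:00 and 23:00 UTC.
    """
    return {
        "Name": rule_name,
        "ScheduleExpression": f"cron({cron_fields})",
        "State": "ENABLED",
    }
```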

&lt;h3&gt;
  
  
  Lambda
&lt;/h3&gt;

&lt;p&gt;Lambda runs the code we write to fetch data from the NBA API, process it, and send it out, in this case to an SNS topic.&lt;/p&gt;
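&lt;p&gt;A minimal sketch of that processing-and-publishing step - the SNS client is passed in so the logic can be exercised without AWS, and the message format is an illustrative assumption rather than the project's exact code:&lt;/p&gt;

```python
def publish_game_updates(sns_client, topic_arn, games):
    """Format each game into a line and publish one SNS message."""
    lines = []
    for game in games:
        away = game.get("AwayTeam", "Unknown")
        home = game.get("HomeTeam", "Unknown")
        status = game.get("Status", "Unknown")
        lines.append(f"{away} at {home} ({status})")
    message = "\n".join(lines)
    sns_client.publish(
        TopicArn=topic_arn,
        Message=message,
        Subject="NBA Game Day Update",
    )
    return message
```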

&lt;h3&gt;
  
  
  SNS
&lt;/h3&gt;

&lt;p&gt;SNS, Simple Notification Service, is a pub-sub service that sends notifications to its subscribers. People can subscribe to a topic and receive notifications by text and/or email. &lt;/p&gt;




&lt;h2&gt;
  
  
  Project Structure
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;game-day-notifications/
├── .env.                            # holds environment variables
├── policies/        
│   └── gd_sns_policy.json           # json for the sns policy
├── src/
│   ├── game_day_notifs.py           # main logic
├── README.md                        # documentation
└── requirements.txt                 # dependencies 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Setup Instructions
&lt;/h2&gt;

&lt;p&gt;Follow these steps to create the workflow for this app.&lt;/p&gt;

&lt;h3&gt;
  
  
  Clone the Repo
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/LilLoveBytes/game-day-notifications.git
&lt;span class="nb"&gt;cd &lt;/span&gt;game-day-notifications
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Create an SNS Topic
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open the AWS management console.&lt;/li&gt;
&lt;li&gt;Navigate to the SNS service.&lt;/li&gt;
&lt;li&gt;Click Create Topic and select Standard as the topic type.&lt;/li&gt;
&lt;li&gt;Name the topic (e.g. GameDay).&lt;/li&gt;
&lt;li&gt;Click Create Topic, then note the topic's ARN for later steps.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Add Subscriptions to the SNS Topic
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;After creating the topic, click on the topic name from the list.&lt;/li&gt;
&lt;li&gt;Navigate to the Subscriptions tab and click Create subscription&lt;/li&gt;
&lt;li&gt;Select a Protocol:

&lt;ul&gt;
&lt;li&gt;Email: Choose Email and enter a valid email address&lt;/li&gt;
&lt;li&gt;SMS: Choose SMS and enter a valid phone number in international E.164 format (e.g. +12135555555)&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Click Create Subscription.&lt;/li&gt;
&lt;li&gt;Confirm the subscription

&lt;ul&gt;
&lt;li&gt;Email: Check the inbox of this account and confirm the subscription by clicking the confirmation link.&lt;/li&gt;
&lt;li&gt;SMS: The subscription is active immediately after creation; no further confirmation is needed.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Create the SNS Publish Policy
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open the IAM service in the AWS Management Console.&lt;/li&gt;
&lt;li&gt;Navigate to Policies -&amp;gt; Create Policy&lt;/li&gt;
&lt;li&gt;Click JSON and paste the JSON policy from the gd_sns_policy.json file in the repo&lt;/li&gt;
&lt;li&gt;Update the value of the "Resource" key to the ARN noted earlier.&lt;/li&gt;
&lt;li&gt;Click Next: Tags (you can skip adding tags).&lt;/li&gt;
&lt;li&gt;Click Next: Review.&lt;/li&gt;
&lt;li&gt;Enter a name for the policy.&lt;/li&gt;
&lt;li&gt;Review and click Create Policy.&lt;/li&gt;
&lt;/ol&gt;
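&lt;p&gt;For reference, a least-privilege publish policy like the one in gd_sns_policy.json looks roughly like this (the topic ARN below is a placeholder; substitute the ARN you noted earlier):&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sns:Publish",
      "Resource": "arn:aws:sns:us-east-1:123456789012:GameDay"
    }
  ]
}
```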

&lt;h3&gt;
  
  
  Create an IAM Role for Lambda
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open the IAM service in the AWS Management Console.&lt;/li&gt;
&lt;li&gt;Click Roles -&amp;gt; Create Role.&lt;/li&gt;
&lt;li&gt;Select AWS Service and choose Lambda.&lt;/li&gt;
&lt;li&gt;Attach the following policies

&lt;ul&gt;
&lt;li&gt;SNS Publish Policy (created in the previous step).&lt;/li&gt;
&lt;li&gt;Lambda Basic Execution Role (AWSLambdaBasicExecutionRole) (an AWS managed policy).&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Click Next: Tags (you can skip adding tags).&lt;/li&gt;
&lt;li&gt;Click Next: Review.&lt;/li&gt;
&lt;li&gt;Enter a name for the role.&lt;/li&gt;
&lt;li&gt;Review and click Create Role.&lt;/li&gt;
&lt;li&gt;Copy and save the ARN of the role for use in the Lambda function.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Deploy the Lambda Function
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open the AWS Management Console and navigate to the Lambda service.&lt;/li&gt;
&lt;li&gt;Click Create Function.&lt;/li&gt;
&lt;li&gt;Select Author from Scratch.&lt;/li&gt;
&lt;li&gt;Enter a function name.&lt;/li&gt;
&lt;li&gt;Choose Python 3.x as the runtime.&lt;/li&gt;
&lt;li&gt;Assign the IAM role created earlier to the function, then click Create Function.&lt;/li&gt;
&lt;li&gt;Under the Code section:

&lt;ul&gt;
&lt;li&gt;Copy the contents of the src/game_day_notifs.py file from the repository.&lt;/li&gt;
&lt;li&gt;Paste it into the inline code editor and click Deploy.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Under Configuration -&amp;gt; Environment Variables, add the following:

&lt;ul&gt;
&lt;li&gt;NBA_API_KEY: your NBA API key.&lt;/li&gt;
&lt;li&gt;SNS_TOPIC_ARN: the ARN of the SNS topic created earlier.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
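&lt;p&gt;Inside the function code, those variables are read from the Lambda environment with &lt;code&gt;os.environ&lt;/code&gt;. A minimal sketch (the variable names match the ones set above; the helper itself is my own illustration, not the repo's code):&lt;/p&gt;

```python
import os

def get_config() -> dict:
    # Read required settings from the Lambda environment and fail
    # loudly if one was not set in the console.
    config = {}
    for name in ("NBA_API_KEY", "SNS_TOPIC_ARN"):
        value = os.environ.get(name)
        if value is None:
            raise RuntimeError(f"Missing environment variable: {name}")
        config[name] = value
    return config
```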

&lt;h3&gt;
  
  
  Set up Automation with EventBridge
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Navigate to the EventBridge service in the AWS Management Console.&lt;/li&gt;
&lt;li&gt;Go to Rules -&amp;gt; Create Rule.&lt;/li&gt;
&lt;li&gt;Select Rule Type: Schedule.&lt;/li&gt;
&lt;li&gt;Select Continue in EventBridge Scheduler.&lt;/li&gt;
&lt;li&gt;Name your schedule and select Recurring Schedule.&lt;/li&gt;
&lt;li&gt;Set the cron schedule for when you want updates.&lt;/li&gt;
&lt;li&gt;Under Targets, select the Lambda function you just created.&lt;/li&gt;
&lt;li&gt;Select Create a new role and hit next.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Test the System
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open the Lambda function in the AWS Management Console.&lt;/li&gt;
&lt;li&gt;Create a test event to simulate execution.&lt;/li&gt;
&lt;li&gt;Run the function and check CloudWatch Logs for errors.&lt;/li&gt;
&lt;li&gt;Verify that notifications are sent to the subscribed users. &lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Conclusions
&lt;/h2&gt;

&lt;p&gt;Today I learned...&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;what event-driven architecture is.&lt;/li&gt;
&lt;li&gt;what AWS Lambda, SNS, and EventBridge are.&lt;/li&gt;
&lt;li&gt;how to design a notification system with SNS and Lambda.&lt;/li&gt;
&lt;li&gt;how to automate workflows with EventBridge and Lambda.&lt;/li&gt;
&lt;li&gt;how to secure AWS services with least-privilege IAM policies.&lt;/li&gt;
&lt;li&gt;how to format a cron expression to define scheduled execution times.&lt;/li&gt;
&lt;li&gt;how to integrate an external API into a cloud-based workflow.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Future Enhancements
&lt;/h2&gt;

&lt;p&gt;I am writing this after reaching the initial goal of this challenge: to create a simple notification system that fetches NBA game data from an API, formats the data, and sends it to subscribers. &lt;/p&gt;

&lt;p&gt;If I have the capacity, I will enhance this app by creating a webpage/user interface for user interaction and data visualization. I could also add logic to include NFL game data notifications. &lt;/p&gt;




&lt;p&gt;And if you've made it this far, thanks for reading! Feel free to checkout my &lt;a href="https://github.com/Ljpeg" rel="noopener noreferrer"&gt;Github&lt;/a&gt;. I'd love to connect on &lt;a href="https://www.linkedin.com/in/latashapollard/" rel="noopener noreferrer"&gt;Linkedin&lt;/a&gt;. &lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>eventdriven</category>
      <category>lambda</category>
    </item>
    <item>
      <title>Hello, world! (Can you tell I'm new here?)</title>
      <dc:creator>La Tasha "L."  Pollard</dc:creator>
      <pubDate>Tue, 04 Feb 2025 22:22:30 +0000</pubDate>
      <link>https://dev.to/ljpeg/hello-world-can-you-tell-im-new-here-2lca</link>
      <guid>https://dev.to/ljpeg/hello-world-can-you-tell-im-new-here-2lca</guid>
      <description>&lt;p&gt;Hi, hello there! I'm new here (this is &lt;em&gt;one of&lt;/em&gt; my first post, &lt;em&gt;excitement&lt;/em&gt;). Instead of over analyzing every single word -- and ultimately feeling like I'll never find the perfect combination of words-- I am going to treat this as what it is, a blog. I'm here to document my learning, but the educator in me hopes this aids in your learning process in some way too.&lt;/p&gt;

&lt;h2&gt;
  
  
  About Me
&lt;/h2&gt;

&lt;p&gt;I'm a cough-cough year old, career-changing, aspiring Software/DevOps Engineer. I come from a background in education and social justice work. I completed a coding bootcamp with Ada Developers Academy in 2024 and worked briefly for a FAANG+ company before being affected by company-wide layoffs -- I was impacted by layoffs five months into my first full-time role, sad face. Since then I've been taking my time to upskill, build out my portfolio, and practice DSAs (sadder face). &lt;/p&gt;

&lt;h2&gt;
  
  
  Why this blog?
&lt;/h2&gt;

&lt;p&gt;&lt;small&gt; "Have you ever played have you ever?" - Kendrick Lamar &lt;/small&gt;&lt;/p&gt;

&lt;p&gt;Have you ever read a book or watched a show and walked away feeling like it was one of the best things ever, and then maybe a month or so later, a friend starts said media from the beginning, wants to converse with you about it, begins to tell you things, and you're like... "Oh yea.. that &lt;em&gt;did&lt;/em&gt; happen!"? &lt;/p&gt;

&lt;p&gt;That's analogous to my journey in tech: I finished a coding bootcamp, completed a Software Engineering internship with a FAANG+ company, was hired full time by the company... But when it came to reflectively talking about the work I was doing, I felt like Spongebob when he couldn't express what he learned in boating school... &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6vtae6pwjj6q75coakqt.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6vtae6pwjj6q75coakqt.jpeg" alt="A screenshot of Spongebob stressed writing on top and below a screenshot of a completely blank paper in his hands" width="179" height="282"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I found myself in interviews and other conversations drawing a blank on what it is I've actually done. I was learning enough to complete assigned tasks but weeks later I was left with only the gist of what I had done. And when it came time to talk in depth about said tasks, like in an interviewwwwugh, the details were even further removed from my memory. &lt;/p&gt;

&lt;p&gt;Ayo, why am I not remembering everything I've ever done, scratches head?? It's partially because our memories don't work like that and partially because I wasn't documenting my process enough.&lt;/p&gt;

&lt;p&gt;Now, I make notes about everything I do, so later down the line, when it's not fresh in memory, I can review my notes and be reminded, "Oh yea... I &lt;em&gt;did&lt;/em&gt; do... &lt;em&gt;all of that&lt;/em&gt;".&lt;/p&gt;

&lt;p&gt;(When I say document, I just mean utilizing some note-taking method:  summarize my task, make bullet points highlighting the action plan, screenshot error messages, note what did and didn't resolve the issues, etc..) &lt;/p&gt;

&lt;p&gt;Note-taking, documenting, and reviewing were underrated, but now are essential, intentional parts of my learning process. Translating my notes/documentation into a blog helps all these new concepts sink in. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So I'm creating this blog as part of my learning process and to document my learning journey.&lt;/strong&gt; It's a way for me to process each new thing I learn, each project I work on, document it, and later review and reflect as necessary.  &lt;/p&gt;

&lt;p&gt;Maybe this blog will showcase my understanding of these concepts to potential employers. Maybe it'll prompt discussions that will help my understanding and thought processes evolve. Or maybe this is just a sounding board and I am posting to the void. &lt;/p&gt;

&lt;p&gt;Either way, I'll be here :)&lt;/p&gt;

&lt;p&gt;And if you've read this far along, thanks for sticking around! My name's La Tasha and you can find me on &lt;a href="https://www.linkedin.com/in/latashapollard/" rel="noopener noreferrer"&gt;Linkedin&lt;/a&gt; and &lt;a href="https://github.com/Ljpeg/" rel="noopener noreferrer"&gt;Github&lt;/a&gt;. Let's connect!&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>learning</category>
      <category>codenewbie</category>
      <category>devjournal</category>
    </item>
    <item>
      <title>A weather data collection system using AWS S3 and OpenWeather API.</title>
      <dc:creator>La Tasha "L."  Pollard</dc:creator>
      <pubDate>Mon, 03 Feb 2025 23:39:57 +0000</pubDate>
      <link>https://dev.to/ljpeg/a-weather-data-collection-system-using-aws-s3-and-openweather-api-5f5b</link>
      <guid>https://dev.to/ljpeg/a-weather-data-collection-system-using-aws-s3-and-openweather-api-5f5b</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Me:
&lt;/h3&gt;

&lt;p&gt;Hi, there! I'm new here (this is my first post, &lt;em&gt;excitement&lt;/em&gt;). Instead of over-analyzing every single word -- and ultimately feeling like I'll never find the perfect combination of words -- I am going to treat this as what it is: a blog. I'm here to document my learning, but the educator in me hopes this aids in your learning process in some way too. &lt;/p&gt;

&lt;p&gt;I have experience in full-stack development and have gotten my toes wet with some DevOps tools during my internship at Microsoft. To expound upon that and gain more familiarity with cloud computing, I recently joined a 30-Day DevOps coding challenge. Our Week 1 Day 1 challenge is to create &lt;b&gt; a weather data collection system using AWS S3 and OpenWeather API.&lt;/b&gt;  This blog will cover the steps I took to create this app.&lt;/p&gt;

&lt;h2&gt;
  
  
  Project Overview
&lt;/h2&gt;

&lt;p&gt;The weather dashboard app is a Python application that uses the OpenWeather API to fetch real-time weather data for multiple cities, displays that data in a readable format in the terminal, and automatically stores the data in an AWS S3 bucket. Storing our real-time weather insights in scalable cloud storage makes the data more accessible and persistent.&lt;/p&gt;

&lt;h2&gt;
  
  
  Key Features
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Fetches real-time weather data for specified locations using the OpenWeather API.&lt;/li&gt;
&lt;li&gt;Displays temperature, humidity, and weather conditions.&lt;/li&gt;
&lt;li&gt;Automatically stores weather data in AWS S3.&lt;/li&gt;
&lt;li&gt;Uses environment variables to securely manage API keys.&lt;/li&gt;
&lt;li&gt;Timestamps all data for historical tracking.&lt;/li&gt;
&lt;/ul&gt;
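&lt;p&gt;The timestamping can be as simple as baking a UTC timestamp into each S3 object key, so every fetch is stored as a new object instead of overwriting the last one. A sketch (the key layout here is my own assumption, not necessarily the repo's exact format):&lt;/p&gt;

```python
from datetime import datetime, timezone

def make_object_key(city: str, now=None) -> str:
    # Build a timestamped S3 key so each fetch lands in a new object.
    now = now or datetime.now(timezone.utc)
    return f"weather-data/{city.lower()}-{now:%Y%m%d-%H%M%S}.json"
```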

&lt;h3&gt;
  
  
  OpenWeather API
&lt;/h3&gt;

&lt;p&gt;To dynamically fetch data for a web page or web app, I need to make an API call to get the data from another system or service.&lt;/p&gt;

&lt;p&gt;Since my project needs to provide real-time feedback about the weather, I used the OpenWeather API, which returns current weather data, forecasts, and other climate-related information for a specified location.&lt;/p&gt;

&lt;p&gt;Here's what it looks like to make a request:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;base_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://api.openweathermap.org/data/2.5/weather&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="n"&gt;params&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;q&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;city-name&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;appid&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;your-api-key&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;units&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;imperial&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;base_url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;params&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;params&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;raise_for_status&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;exceptions&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RequestExceptions&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Error fetching weather data: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I wrap my request in a try-except block for error handling. If the request fails for any reason, rather than the program simply exiting (or failing silently), I'll get a printed message with details about the failure that will aid in debugging. &lt;/p&gt;

&lt;h3&gt;
  
  
  Data Storage with AWS S3
&lt;/h3&gt;

&lt;p&gt;This project was my first time using AWS. To get started, I had to set up my AWS account (using the free tier), install the AWS CLI, configure the CLI to use SSO (single sign-on, a means of authentication), learn what Amazon S3 is, and learn how to use boto3, a Python library, to create and save files to an S3 bucket from a Python script. Whew. &lt;/p&gt;

&lt;p&gt;Briefly, S3 (Simple Storage Service) is Amazon's cloud object storage; we can save data (files, images, etc.) there. SSO (Single Sign-On) is a means of authentication for accessing AWS accounts. Boto3 is the Python library we use to interact with AWS services and manage resources, like S3, from Python code. &lt;/p&gt;

&lt;p&gt;Using boto3 to create an S3 bucket and save files to it looks something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;boto3&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;

&lt;span class="n"&gt;s3_client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;boto3&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Session&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;profile_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;pro-name&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;region_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;your-region&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;client&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;s3&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;s3_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create_bucket&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Bucket&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;bucket_name&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;CreateBucketConfiguration&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;LocationConstraint&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;your-region&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;


&lt;span class="n"&gt;s3_client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;put_object&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                &lt;span class="n"&gt;Bucket&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;bucket_name&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="n"&gt;Key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;file_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;Body&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;weather_data&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
                &lt;span class="n"&gt;ContentType&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;application/json&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;
            &lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Project Structure
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;weather-dashboard/
  src/
    __init__.py            &lt;span class="c"&gt;# marks src as a Python package&lt;/span&gt;
    weather_dashboard.py   &lt;span class="c"&gt;# Defines functions/logic&lt;/span&gt;
  .env                     &lt;span class="c"&gt;# Holds environment variables&lt;/span&gt;
  requirements.txt         &lt;span class="c"&gt;# Dependencies&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Set up Instructions
&lt;/h2&gt;

&lt;p&gt;These are the steps one could take to run my version of the app.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Clone the Repository
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/LilLoveBytes/weather-dashboard.git
&lt;span class="nb"&gt;cd &lt;/span&gt;weather-dashboard
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  2. Install Dependencies
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-r&lt;/span&gt; requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  3. Configure Environment Variables
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;OPENWEATHER_API_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;your_api_key
&lt;span class="nv"&gt;AWS_BUCKET_NAME&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;your_bucket_name
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4. Configure AWS Credentials
&lt;/h3&gt;

&lt;p&gt;Make sure you have AWS CLI installed and run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws configure
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  5. Run the application
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python src/weather_dashboard.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;insert spongebob meme&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0svfod290xyhru6tviuo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0svfod290xyhru6tviuo.png" alt="Meme image showing Ms. Puff's hands holding Spongbob's paper titled " width="631" height="469"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;What I learned today is...&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;how to set up an AWS account using the free tier&lt;/li&gt;
&lt;li&gt;what Amazon S3 is&lt;/li&gt;
&lt;li&gt;how to use boto3/python to manage S3 resources&lt;/li&gt;
&lt;li&gt;how to make API calls to OpenWeather API&lt;/li&gt;
&lt;li&gt;how to set up AWS CLI to use SSO&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Further Enhancements
&lt;/h2&gt;

&lt;p&gt;I am writing this after reaching the initial goal of this challenge: to create a simple app that fetches weather data, displays it to the terminal, and stores it to cloud storage. &lt;/p&gt;

&lt;p&gt;If I have the capacity, I will enhance this app by creating a webpage/user interface for user interaction and data visualization. I will also use AWS Lambda to schedule automated data collection. &lt;/p&gt;

&lt;h3&gt;
  
  
  Let's Connect
&lt;/h3&gt;

&lt;p&gt;You can view this project on my &lt;a href="https://github.com/Ljpeg/weather-dashboard/tree/main" rel="noopener noreferrer"&gt;Github&lt;/a&gt;.&lt;br&gt;
Also feel free to connect with me on &lt;a href="https://www.linkedin.com/in/latashapollard/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>devops</category>
      <category>python</category>
      <category>aws</category>
      <category>cloud</category>
    </item>
  </channel>
</rss>
