<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Badrinarayanan Ravi</title>
    <description>The latest articles on DEV Community by Badrinarayanan Ravi (@badriravi).</description>
    <link>https://dev.to/badriravi</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1125422%2F172ee9ba-98a0-4c51-8f3b-b81acda60802.jpg</url>
      <title>DEV Community: Badrinarayanan Ravi</title>
      <link>https://dev.to/badriravi</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/badriravi"/>
    <language>en</language>
    <item>
      <title>Book Review: Building a Second Brain by Tiago Forte</title>
      <dc:creator>Badrinarayanan Ravi</dc:creator>
      <pubDate>Tue, 15 Aug 2023 08:12:06 +0000</pubDate>
      <link>https://dev.to/badriravi/book-review-building-a-second-brain-by-tiago-forte-4c4o</link>
      <guid>https://dev.to/badriravi/book-review-building-a-second-brain-by-tiago-forte-4c4o</guid>
      <description>&lt;p&gt;I am a note taking enthusiast, I like to try different note taking apps. Some people upgrade their iPhones every year others buy the latest sneakers. I try to keep up with the shiniest note taking tool. This led to a huge problem, I was great at consuming and collecting information but failed to derive any value from it.  I have documented hundreds of code snippets and solution to unique problems related to my work, however I was not able to retrieve them, if I faced any similar problem. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Building a Second Brain by Tiago Forte&lt;/strong&gt; is a great book on organising the information we capture and unlocking our creative potential. The author has superbly laid out effective strategies for taking notes and deriving value from them. In today's world we are bombarded by information: we bookmark important articles, research papers and videos but don't have time to reflect upon them. When we invest in mutual funds or stocks, we are making our money work for us, right? That is the most common phrase we hear from financial advisors: "Let your money work while you sleep". In the same manner, the author explains, the information we capture should open us to new ideas, reveal connections between different areas and help us become more productive. He also gives motivating examples from his own life and shows how effective note-taking made him the go-to guy at his workplace. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;CODE: Capture, Organize, Distil, Express.&lt;/strong&gt; CODE is an effective note-taking strategy in which the author goes into great detail on what information to capture, how to organise it so it can be retrieved easily later, how to extract the essence of an article, blog post or video, and how to express our ideas to others. &lt;/p&gt;

&lt;p&gt;I had a great time reading this book, and it gave me important insights into capturing and organizing information. I recommend it to anyone who has a habit of hoarding information but finds it difficult to organize and extract real value from it. &lt;/p&gt;

</description>
      <category>bookreview</category>
      <category>buildingasecondbrain</category>
      <category>notetaking</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Over-engineering analytics for my personal website</title>
      <dc:creator>Badrinarayanan Ravi</dc:creator>
      <pubDate>Wed, 26 Jul 2023 15:21:50 +0000</pubDate>
      <link>https://dev.to/badriravi/over-engineering-analytics-for-my-personal-website-37nc</link>
      <guid>https://dev.to/badriravi/over-engineering-analytics-for-my-personal-website-37nc</guid>
      <description>&lt;p&gt;Working with AWS can be really overwhelming, especially when they have gazillion services. As a developer, I like to try out various services and check if they can be useful to my organization. It also feels nice adding those swanky keywords in  resume hoping the ATS picks your resume when applying for a new job.&lt;/p&gt;

&lt;p&gt;I have been programming professionally since 2014 and, to be honest, never felt the need for a personal landing page or a blog. I occupy myself with other hobbies, but I felt it was about time I had some kind of presence on the Internet apart from my social media profiles. &lt;/p&gt;

&lt;p&gt;While working on my landing page, I was wondering what to use for analytics. I need some kind of data on how many users are visiting my site and where they are from. I didn’t want to integrate Google Analytics, and I surely was not going to look for a paid solution. Then I had an epiphany: why not just glue together a couple of AWS services and make it work? I don’t need a ton of functionality, just a user count and the users’ locations. &lt;/p&gt;

&lt;p&gt;So, here’s the gist: I decided to host the site on CloudFront. CloudFront stores access logs in an S3 bucket. Access logs contain IP addresses, and I can get the location from the IP address. &lt;/p&gt;

&lt;p&gt;Let's Build. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3rxRcGgA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lt6rshg61b4ggoluvs0p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3rxRcGgA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lt6rshg61b4ggoluvs0p.png" alt="Image description" width="800" height="494"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the above architecture diagram, we get a glimpse of the AWS services used and how the system actually works. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS services used:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;CloudFront:&lt;/strong&gt; The landing page is hosted on CloudFront.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;S3:&lt;/strong&gt; To store CloudFront access logs and the custom JSON files we create via code. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Athena:&lt;/strong&gt; To query the log files and JSON files.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;EventBridge:&lt;/strong&gt; To trigger the Step Function at a particular time. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Step Functions:&lt;/strong&gt; To run the serverless workflow. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Lambda:&lt;/strong&gt; To query Athena and put the result back in an S3 bucket. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DynamoDB:&lt;/strong&gt; Not shown in the diagram above; I use it to store the aggregate result from Athena. &lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  How does the system work?
&lt;/h2&gt;

&lt;p&gt;When a user visits &lt;a href="https://www.badriravi.com/"&gt;https://www.badriravi.com/&lt;/a&gt;, the request is served by CloudFront, which writes the access log to an S3 bucket configured at the time of creating the CloudFront distribution. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; CloudFront does not write access logs by default. The setting has to be enabled. &lt;/p&gt;

&lt;p&gt;Now that access logs have been configured, we need to create an Athena table to query those logs. The schema for the table can be found &lt;a href="https://docs.aws.amazon.com/athena/latest/ug/cloudfront-logs.html"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Once the Athena table has been created, the logs can be queried. The site visitor's location can be fetched from the "ip_address" field by using a third-party service; I am using &lt;a href="http://ip-api.com/json/"&gt;http://ip-api.com/json/&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Let's write code to query Athena and get the location of each site visitor. Athena queries are not synchronous; they take some time to return the result, so we need to "wait" for the query to complete. This can be achieved with the "long polling" technique in code, but I decided to be creative and use Step Functions. I created two Lambda functions: the first runs the query in Athena and the second parses the result. &lt;br&gt;
Step Functions orchestrates the serverless workflow, and we can introduce a "wait" state to wait for the Athena query to complete. &lt;/p&gt;
&lt;h3&gt;
  
  
  Step Functions JSON
&lt;/h3&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Comment": "&amp;lt;comment&amp;gt;",
  "StartAt": "Run Athena Query",
  "States": {
    "Run Athena Query": {
      "Type": "Task",
      "Resource": "arn:aws:states:::lambda:invoke",
      "Parameters": {
        "Payload.$": "$",
        "FunctionName": "&amp;lt;arn of Lambda function&amp;gt;"
      },
      "Retry": [
        {
          "ErrorEquals": [
            "Lambda.ServiceException",
            "Lambda.AWSLambdaException",
            "Lambda.SdkClientException",
            "Lambda.TooManyRequestsException"
          ],
          "IntervalSeconds": 2,
          "MaxAttempts": 6,
          "BackoffRate": 2
        }
      ],
      "Next": "Wait"
    },
    "Wait": {
      "Type": "Wait",
      "Seconds": 3,
      "Next": "Write to DB"
    },
    "Write to DB": {
      "Type": "Task",
      "Resource": "arn:aws:states:::lambda:invoke",
      "Parameters": {
        "Payload.$": "$.Payload",
        "FunctionName": "&amp;lt;arn of Lambda function&amp;gt;"
      },
      "Retry": [
        {
          "ErrorEquals": [
            "Lambda.ServiceException",
            "Lambda.AWSLambdaException",
            "Lambda.SdkClientException",
            "Lambda.TooManyRequestsException"
          ],
          "IntervalSeconds": 2,
          "MaxAttempts": 6,
          "BackoffRate": 2
        }
      ],
      "End": true
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;In our Step Function config, we have 3 states:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Run Athena Query&lt;/li&gt;
&lt;li&gt;Wait &lt;/li&gt;
&lt;li&gt;Write to DB&lt;/li&gt;
&lt;/ol&gt;
&lt;h4&gt;
  
  
  Run Athena Query
&lt;/h4&gt;

&lt;p&gt;The first state is a Lambda function, which runs the Athena query. I used the Go SDK to do this.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;package main

import (
    "context"
    "fmt"
    "github.com/aws/aws-lambda-go/lambda"
    "github.com/aws/aws-sdk-go-v2/aws"
    "github.com/aws/aws-sdk-go-v2/config"
    "github.com/aws/aws-sdk-go-v2/service/athena"
    "github.com/aws/aws-sdk-go-v2/service/athena/types"
)

type MyEvent struct {
    QueryDateStart string `json:"queryDateStart"`
    QueryDateEnd string `json:"queryDateEnd"`
}

type QueryExecution struct {
    QueryExecutionId string `json:"queryId"`
}

const REGION = "&amp;lt;region&amp;gt;"
const OUTPUT_BUCKET = "&amp;lt;bucketname&amp;gt;"

func handler(ctx context.Context, event MyEvent) (QueryExecution, error) {

    QUERY := "select * from tablename where method = 'GET' and status = 200 and uri = '/' and date &amp;lt; DATE('"+event.QueryDateEnd+"') and date &amp;gt;= DATE('"+event.QueryDateStart+"')  order by date desc limit 1000"
    cfg, err := config.LoadDefaultConfig(context.TODO(), func(o *config.LoadOptions) error {
        o.Region = REGION
        return nil
    })

    if err != nil {
        fmt.Println(err)
        return QueryExecution{}, err
    }

    client := athena.NewFromConfig(cfg)

    resultConfig := &amp;amp;types.ResultConfiguration{
        OutputLocation: aws.String(OUTPUT_BUCKET),
    }

    executeParams := &amp;amp;athena.StartQueryExecutionInput{
        QueryString:         aws.String(QUERY),
        ResultConfiguration: resultConfig,
    }

    // Start Query Execution
    athenaExecution, err := client.StartQueryExecution(context.TODO(), executeParams)

    if err != nil {
        fmt.Println(err)
        return QueryExecution{}, err
    }
    executionId := *athenaExecution.QueryExecutionId

    return QueryExecution{
        QueryExecutionId: executionId,
    }, nil
}

func main() {
    lambda.Start(handler)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When the Step Function is triggered, I pass today's date as the query start and tomorrow's date as the query end, so the half-open date range covers exactly today's logs. Once the query execution starts, the 'QueryExecutionId' is returned by the function and passed to the subsequent state. &lt;/p&gt;

&lt;h4&gt;
  
  
  Wait State
&lt;/h4&gt;

&lt;p&gt;Now the Step Function waits for the time specified in the config before proceeding. &lt;/p&gt;

&lt;h4&gt;
  
  
  Write to DB
&lt;/h4&gt;

&lt;p&gt;The final state is another Lambda function, which retrieves the query result from the S3 bucket, transforms the data into a JSON object and writes the JSON object to another S3 bucket. &lt;br&gt;
Here's the code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;package main

import (
    "bytes"
    "context"
    "encoding/csv"
    "encoding/json"
    "io"
    "net/http"

    "fmt"
    "github.com/aws/aws-lambda-go/lambda"
    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3"
)

type QueryExecution struct {
    QueryExecutionId string `json:"queryId"`
}

type IpInfo struct {
    Query       string  `json:"query"`
    Status      string  `json:"status"`
    Country     string  `json:"country"`
    CountryCode string  `json:"countryCode"`
    Region      string  `json:"region"`
    RegionName  string  `json:"regionName"`
    City        string  `json:"city"`
    Zip         string  `json:"zip"`
    Lat         float64 `json:"lat"`
    Lon         float64 `json:"lon"`
    Timezone    string  `json:"timezone"`
    Isp         string  `json:"isp"`
    Org         string  `json:"org"`
    As          string  `json:"as"`
    Uri         string  `json:"uri"`
    RequestDate string  `json:"requestDate"`
    RequestTime string  `json:"requestTime"`
    UserAgent   string  `json:"userAgent"`
    Referer     string  `json:"referer"`
}

func handler(ctx context.Context, event QueryExecution) {
    sess, err := session.NewSession(&amp;amp;aws.Config{
        Region: aws.String(""), // replace with your desired region
    })
    if err != nil {
        fmt.Println("Error creating session:", err)
        return
    }
    // Create a new S3 service client
    svc := s3.New(sess)

    // Set the parameters for the object to retrieve
    params := &amp;amp;s3.GetObjectInput{
        Bucket: aws.String(""), // replace with your S3 bucket name
        Key:    aws.String(event.QueryExecutionId + ".csv"),     // replace with your S3 object key
    }

    resp, err := svc.GetObject(params)
    if err != nil {
        fmt.Println("Error retrieving object:", err)
        return
    }
    reader := csv.NewReader(resp.Body)

    record, err := reader.ReadAll()
    if err != nil {
        fmt.Println("Error reading CSV:", err)
        return
    }
    for i := 1; i &amp;lt; len(record); i++ {
        ipInfo := &amp;amp;IpInfo{}

        // Fetch the visitor location; skip the row on error instead of
        // dereferencing a nil response.
        resp, err := http.Get("http://ip-api.com/json/" + record[i][4])
        if err != nil {
            fmt.Println(err)
            continue
        }
        body, _ := io.ReadAll(resp.Body)
        resp.Body.Close() // close inside the loop rather than deferring
        err = json.Unmarshal(body, &amp;amp;ipInfo)
        if err != nil {
            fmt.Println(err)
            return
        }
        ipInfo.Uri = record[i][7]
        ipInfo.UserAgent = record[i][10]
        ipInfo.Referer = record[i][9]
        ipInfo.RequestDate = record[i][0]
        ipInfo.RequestTime = record[i][1]
        jsonData, err := json.Marshal(ipInfo)
        if err != nil {
            fmt.Println(err)
            return
        }
        _, err = svc.PutObject(&amp;amp;s3.PutObjectInput{
            Body:   aws.ReadSeekCloser(bytes.NewReader(jsonData)),
            Bucket: aws.String("bucketname"),
            Key:    aws.String("refined-logs-" + record[i][0] + record[i][1] + ".json"),
        })
        if err != nil {
            fmt.Println(err)
            return
        }
    }
    resp.Body.Close()
}

func main() {
    lambda.Start(handler)
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, the Athena executionId is passed via the Step Function state and processed. We call the third-party API to fetch the location for each IP address, form a JSON object and write it back to S3. &lt;/p&gt;

&lt;p&gt;Now another Lambda function is needed, which will be triggered by EventBridge at a particular time. This Lambda function triggers the Step Function and passes today's and tomorrow's dates along with it. Here's the code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;package main

import (
    "encoding/json"
    "github.com/aws/aws-lambda-go/lambda"
    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/sfn"
    "time"
)

const REGION = "&amp;lt;region&amp;gt;"

func handler() error {

    mySession := session.Must(session.NewSession())
    client := sfn.New(mySession, aws.NewConfig().WithRegion(REGION))
    layout := "2006-01-02"
    now := time.Now().UTC()
    tmw := now.AddDate(0, 0, 1)
    reqBody, err := json.Marshal(map[string]interface{}{
        "queryDateStart": now.Format(layout),
        "queryDateEnd":   tmw.Format(layout),
    })
    if err != nil {
        return err
    }
    input := string(reqBody)
    machineArn := "&amp;lt;arn-of-step-function&amp;gt;"
    _, err = client.StartExecution(&amp;amp;sfn.StartExecutionInput{
        Input:           &amp;amp;input,
        StateMachineArn: &amp;amp;machineArn,
    })
    if err != nil {
        return err
    }
    return nil
}

func main() {
    lambda.Start(handler)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now what to do with the JSON files in S3? Simple: create another Athena table and query the records you want. &lt;/p&gt;

&lt;p&gt;Here's the CREATE TABLE query:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CREATE EXTERNAL TABLE `&amp;lt;tablename&amp;gt;`(
  `query` string COMMENT 'from deserializer', 
  `country` string COMMENT 'from deserializer', 
  `countrycode` string COMMENT 'from deserializer', 
  `city` string COMMENT 'from deserializer', 
  `uri` string COMMENT 'from deserializer', 
  `requestdate` date COMMENT 'from deserializer', 
  `requesttime` string COMMENT 'from deserializer', 
  `useragent` string COMMENT 'from deserializer', 
  `referer` string COMMENT 'from deserializer')
ROW FORMAT SERDE 
  'org.openx.data.jsonserde.JsonSerDe' 
WITH SERDEPROPERTIES ( 
  'case.insensitive'='TRUE', 
  'dots.in.keys'='FALSE', 
  'ignore.malformed.json'='FALSE', 
  'mapping'='TRUE') 
STORED AS INPUTFORMAT 
  'org.apache.hadoop.mapred.TextInputFormat' 
OUTPUTFORMAT 
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'
LOCATION
  's3://&amp;lt;bucket-name&amp;gt;/'
TBLPROPERTIES (
  'classification'='json', 
  'transient_lastDdlTime'='1683369051')
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now I can query the site data by visitor country or city and get the total visitor count. I wrote another Lambda function and added a schedule to EventBridge. This Lambda function queries the newly created Athena table, runs the queries and puts the result in a DynamoDB table. &lt;br&gt;
Here is the code I use to fetch total users and users by country.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;package main

import (
    "context"
    "encoding/json"
    "github.com/aws/aws-lambda-go/lambda"
    "github.com/aws/aws-sdk-go-v2/aws"
    "github.com/aws/aws-sdk-go-v2/config"
    "github.com/aws/aws-sdk-go-v2/feature/dynamodb/attributevalue"
    "github.com/aws/aws-sdk-go-v2/feature/dynamodb/expression"
    "github.com/aws/aws-sdk-go-v2/service/athena"
    "github.com/aws/aws-sdk-go-v2/service/athena/types"
    "github.com/aws/aws-sdk-go-v2/service/dynamodb"
    "log"
    "strconv"
    "time"
)

const REGION = "&amp;lt;region&amp;gt;"
const OUTPUT_BUCKET = "s3://&amp;lt;bucket-name&amp;gt;/"
const COUNTRY_USERS_QUERY = "SELECT country, count(*) as hits FROM &amp;lt;tablename&amp;gt;  group by country order by hits desc"
const REQUEST_DATE_QUERY = "select requestdate, count(*) as users from &amp;lt;tablename&amp;gt;  where requestdate &amp;gt; current_date - interval '30' day group by requestdate order by requestdate asc"

func extractData(athenaExecution *athena.StartQueryExecutionOutput,
    client *athena.Client,
    dynamodbClient *dynamodb.Client,
    calculateTotalUsers bool) {
    executionId := *athenaExecution.QueryExecutionId
    var qrop *athena.GetQueryExecutionOutput
    err := error(nil)
    for {
        qrop, err = client.GetQueryExecution(context.TODO(), &amp;amp;athena.GetQueryExecutionInput{
            QueryExecutionId: aws.String(executionId),
        })
        if err != nil {
            log.Print(err)
            return
        }
        if qrop.QueryExecution.Status.State != "RUNNING" &amp;amp;&amp;amp; qrop.QueryExecution.Status.State != "QUEUED" {
            break
        }
        time.Sleep(2 * time.Second)
    }
    if qrop.QueryExecution.Status.State == "SUCCEEDED" {
        data, err := client.GetQueryResults(context.TODO(), &amp;amp;athena.GetQueryResultsInput{
            QueryExecutionId: aws.String(executionId),
        })
        if err != nil {
            log.Print(err)
            return
        }
        athenaData := make(map[string]string, 0)
        totalUsers := 0
        for _, item := range data.ResultSet.Rows[1:] {
            athenaData[*item.Data[0].VarCharValue] = *item.Data[1].VarCharValue
            userCountInt, _ := strconv.Atoi(*item.Data[1].VarCharValue)
            totalUsers += userCountInt
        }
        tableName := "&amp;lt;tablename&amp;gt;"
        key, err := attributevalue.MarshalMap(map[string]string{
            "_id": tableName,
        })
        if err != nil {
            log.Print(err)
        }
        jsonData, err := json.Marshal(athenaData)
        if err != nil {
            log.Print(err)
        }
        var upd expression.UpdateBuilder
        location, _ := time.LoadLocation("&amp;lt;timezone&amp;gt;")
        now := time.Now().In(location)
        layout := "2006-01-02 15:04:05"
        if calculateTotalUsers {
            upd = expression.Set(expression.Name("totalUsers"), expression.Value(totalUsers)).
                Set(expression.Name("usersByCountry"), expression.Value(string(jsonData))).
                Set(expression.Name("last_updated"), expression.Value(now.Format(layout)))
        } else {
            upd = expression.Set(expression.Name("monthlyUsers"), expression.Value(string(jsonData))).
                Set(expression.Name("last_updated"), expression.Value(now.Format(layout)))
        }
        expr, err := expression.NewBuilder().WithUpdate(upd).Build()
        if err != nil {
            log.Print(err)
        }
        _, err = dynamodbClient.UpdateItem(context.TODO(), &amp;amp;dynamodb.UpdateItemInput{
            Key:                       key,
            TableName:                 aws.String(tableName),
            ExpressionAttributeNames:  expr.Names(),
            ExpressionAttributeValues: expr.Values(),
            UpdateExpression:          expr.Update(),
        })
        if err != nil {
            log.Print(err)
        }
        return
    } else {
        log.Print(qrop.QueryExecution.Status.State)
    }
}

func handler() {
    cfg, err := config.LoadDefaultConfig(context.TODO(), func(o *config.LoadOptions) error {
        o.Region = REGION
        return nil
    })
    if err != nil {
        log.Print(err)
        return
    }
    client := athena.NewFromConfig(cfg)
    dynamodbClient := dynamodb.NewFromConfig(cfg)

    resultConfig := &amp;amp;types.ResultConfiguration{
        OutputLocation: aws.String(OUTPUT_BUCKET),
    }

    executeCountryUsersParams := &amp;amp;athena.StartQueryExecutionInput{
        QueryString:         aws.String(COUNTRY_USERS_QUERY),
        ResultConfiguration: resultConfig,
    }

    executeMonthlyUsersParams := &amp;amp;athena.StartQueryExecutionInput{
        QueryString:         aws.String(REQUEST_DATE_QUERY),
        ResultConfiguration: resultConfig,
    }

    // Start Query Execution
    athenaCountryUsersExecution, err := client.StartQueryExecution(context.TODO(), executeCountryUsersParams)
    if err != nil {
        log.Print(err)
        return
    }
    athenaMonthlyUsersExecution, err := client.StartQueryExecution(context.TODO(), executeMonthlyUsersParams)

    if err != nil {
        log.Print(err)
        return
    }
    extractData(athenaCountryUsersExecution, client, dynamodbClient, true)
    extractData(athenaMonthlyUsersExecution, client, dynamodbClient, false)

}

func main() {
    lambda.Start(handler)
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now that the aggregate result is in DynamoDB, I plan to create a simple dashboard using NextJS and Tailwind to display the data. &lt;br&gt;
In this little endeavour I used 7 AWS services. Overkill? Maybe, but it is a good way to learn about AWS services. &lt;/p&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>webdev</category>
    </item>
  </channel>
</rss>
