Auction House Hunter (amplify awschallenge entry)

This is a submission for the AWS Amplify Fullstack TypeScript Challenge

(Important note: AWS just approved my Simple Email Service (SES) account for production use. You can now save records and receive emails with CSV attachments at any user's email address.)

What I Built

Inspiration for this application came from local small business owners who told me they wanted real-time data from auction sites. I built a web-scraping application that tracks current listings on popular auction sites; it currently works for one site (hibid.com). This is a business-oriented, real-time data-tracking application that leans heavily on Lambdas, DynamoDB streams, S3 buckets, and serverless functions. Simply go to your auction website of choice, search for whatever you want with whatever filters you want, and copy and paste the URL of the search results page you're viewing into Auction House Hunter (the name of my site). A DynamoDB stream handler function immediately performs Axios requests against that URL through my own custom API configuration and proxy, sorts the data for every product currently available into CSV-ready format, and saves it to an S3 bucket. After the CSV file is saved to the S3 bucket, an email containing the CSV of the scraped data is sent to the email address on file for the Cognito user.
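
To make that flow concrete, here's a minimal sketch of the row shape the scraper builds and the CSV conversion. I'm assuming json2csv's parse for the conversion, and the example rows are made up:

import { parse } from "json2csv";

// Shape of one scraped listing; the fields mirror the Product model shown later
interface ProductRow {
  href: string;        // full link to the product
  title: string;       // product title
  currentBid?: string; // current bid price, if available
  timeLeft?: string;   // time left before the auction ends, if available
}

// Two made-up rows converted into CSV-ready text
const rows: ProductRow[] = [
  { href: "https://hibid.com/lot/1", title: "WW2 Medal", currentBid: "$25.00", timeLeft: "2d 4h" },
  { href: "https://hibid.com/lot/2", title: "WW2 Helmet", timeLeft: "1d 12h" },
];
console.log(parse(rows)); // "href","title","currentBid","timeLeft" ...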

Demo and Code

This is a private repo, but significant amounts of the code logic are included throughout this submission; I would just prefer to keep the repository as a whole private for future business use. Here is a link to the website I made:

https://storage.d12npvtctq2ov6.amplifyapp.com/

Before going to my website, go to hibid.com, perform a search, and copy the resulting URL. Here's an example: https://hibid.com/lots?q=ww2%20medal&status=OPEN

Just create your account using Amplify UI's default login/Cognito functionality and sign in.

[Screenshot]

Land on the home page

[Screenshot]

Enter a search URL, click Submit, and wait a moment for the input field to disappear and your saved record to appear with a delete button.

[Screenshot]

(Currently only visible to the AWS admin.) A DynamoDB stream watching the "PastedUrl" table is triggered by the record change and runs a Lambda that performs an Axios request, sorts/parses the data, and saves it to an S3 bucket. The saved CSV file is then available to manipulate using Amplify or other AWS services, or to download directly from the AWS Console.

[Screenshot]

After the CSV file is saved to S3, another Lambda is triggered that uses AWS SES to send an email to your account's email address containing the URL you searched and a copy of the CSV file generated in S3.

[Screenshot]

Integrations

Data

Data is an essential part of how this site works. A schema was created for two tables: the PastedUrl table and the Product table (Product table functionality will be added at a later date). The Product table has a many-to-one relationship with the PastedUrl table, and only the Cognito-authenticated user that created the records is able to view them. Here is a code sample of data/resource.ts:

import { type ClientSchema, a, defineData } from "@aws-amplify/backend";

const schema = a.schema({
  Product: a.model({
    href: a.url().required(), // full link to product
    title: a.string().required(), // product title
    currentBid: a.string(), // current bid price is optional as it may not be available
    timeLeft: a.string(), // time left for auction to end is optional as it may not be available
    productId: a.id(),
    pastedUrl: a.belongsTo('PastedUrl', 'productId')
  })
  .authorization((allow) => [allow.owner()]),

  PastedUrl: a.model({
    id: a.id().required(),
    url: a.url().required(), // url of search results provided by user
    userEmail: a.email().required(), // email of user
    products: a.hasMany('Product', 'productId')
  })
  .authorization((allow) => [allow.owner()]),
});

export type Schema = ClientSchema<typeof schema>;

export const data = defineData({
  schema,
  authorizationModes: { defaultAuthorizationMode: "userPool" }, // owner-based auth uses the Cognito user pool
});
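
As a quick illustration of how that many-to-one relationship can be traversed once Product records exist, here's a sketch using the generated Data client's standard lazy-loading API (not code from the repo; the import path is an assumption about project layout):

import { generateClient } from "aws-amplify/data";
import type { Schema } from "@/amplify/data/resource";

const client = generateClient<Schema>();

// List the signed-in user's saved URLs, then lazy-load each one's related products
async function listSavedSearches() {
  const { data: urls } = await client.models.PastedUrl.list();
  for (const saved of urls) {
    const { data: products } = await saved.products();
    console.log(`${saved.url} has ${products.length} scraped products`);
  }
}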

Here is sample code from the entrypoint (page.tsx) file that creates records, deletes records, and updates the records visible to the authenticated user:

"use client";

// Standard Amplify Gen 2 + Next.js client setup (import paths may differ by project layout)
import { useEffect, useState } from "react";
import { generateClient } from "aws-amplify/data";
import type { Schema } from "@/amplify/data/resource";

const client = generateClient<Schema>();

// Inside the page component:
  const [pastedUrl, setPastedUrl] = useState<Schema["PastedUrl"]["type"][]>([]);
  const [inputUrl, setInputUrl] = useState<string>('');

  // Function to create a new record in the PastedUrl table
  const createUrl = async (userEmail: string) => {
    await client.models.PastedUrl.create({
      url: inputUrl,
      userEmail: userEmail
    });
    // Fetch the records from the PastedUrl table
    const { data } = await client.models.PastedUrl.list();
    setPastedUrl(data);
  };

  // Set the initial state of the pastedUrl array to the records in the PastedUrl table
  useEffect(() => {
    const fetchData = async () => {
      const { data } = await client.models.PastedUrl.list();
      setPastedUrl(data);
    };

    fetchData();
  }, []);

  // Delete a record from the PastedUrl table using the current PastedUrl ID
  async function deleteRecord() {
    // Ensure the pastedUrl array is not empty
    if (pastedUrl.length === 0) {
      console.error('No records to delete');
      return;
    }

    // Get the id of the first record to be deleted
    const toBeDeletedPastedUrl = {
      id: pastedUrl[0].id
    };

    try {
      // Call the delete method on the PastedUrl model
      const { data: deletedPastedUrl, errors } = await client.models.PastedUrl.delete(toBeDeletedPastedUrl);

      // Log the result
      console.log('Deleted Record:', deletedPastedUrl);

      // Handle any errors
      if (errors) {
        console.error('Errors occurred while deleting the record:', errors);
      }
      else {
        // Remove only the deleted record from local state so the UI stays in sync
        setPastedUrl((prev) => prev.filter((record) => record.id !== toBeDeletedPastedUrl.id));
      }
    } catch (error) {
      // Handle any unexpected errors
      console.error('An error occurred:', error);
    }
  }
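
A possible refinement (not in my current code) is to let Amplify keep the list in sync automatically with the Data client's observeQuery instead of manually refetching after each create/delete:

  // Inside the page component: subscribe once and let Amplify push updates
  useEffect(() => {
    const sub = client.models.PastedUrl.observeQuery().subscribe({
      next: ({ items }) => setPastedUrl([...items]),
    });
    return () => sub.unsubscribe(); // clean up on unmount
  }, []);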

Authentication

Authentication was pretty straightforward: mostly the default Cognito functionality for creating, verifying, and logging in/out of accounts. I just altered the subject and body of the verification email the user receives.

import { defineAuth } from "@aws-amplify/backend";

/**
 * Define and configure your auth resource
 * @see https://docs.amplify.aws/gen2/build-a-backend/auth
 */
export const auth = defineAuth({
  loginWith: {
    email: {
      verificationEmailStyle: "CODE",
      verificationEmailSubject: "Welcome to Auction House Hunter!",
      verificationEmailBody: (createCode) => `Use this code to confirm your account: ${createCode()}. Now let's get to hunting some deals!`,
    },
  },
});
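
Since createUrl in page.tsx needs the signed-in user's email, it can be looked up with the standard aws-amplify/auth API. A minimal sketch (the handleSubmit wrapper is hypothetical):

import { fetchUserAttributes } from "aws-amplify/auth";

// Look up the signed-in Cognito user's email, then save the record with it
async function handleSubmit() {
  const { email } = await fetchUserAttributes();
  if (email) {
    await createUrl(email);
  }
}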

Serverless Functions

Serverless functions are the bread and butter of this application. Here is a sample of the resource.ts for the DynamoDB stream function that runs whenever a record is created (a URL is saved). An environment secret holds the API key for my proxy service, for security purposes.

import { defineFunction, secret } from "@aws-amplify/backend";

export const dynamodbPerformSearch = defineFunction({
  name: "dynamodbPerformSearch",
  timeoutSeconds: 30, // allow extra time for the scrape, CSV generation, and upload
  environment: {
    scrapeops: secret('scrapeops') //secrets must be defined in the function's resource.ts file
  }
});
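
For context, the stream itself has to be wired to the function in backend.ts with a bit of CDK. This sketch follows the pattern in the Amplify Gen 2 docs; the functions/ import path and the policy/mapping IDs are just my assumptions about naming:

import { defineBackend } from "@aws-amplify/backend";
import { Stack } from "aws-cdk-lib";
import { Policy, PolicyStatement, Effect } from "aws-cdk-lib/aws-iam";
import { StartingPosition, EventSourceMapping } from "aws-cdk-lib/aws-lambda";
import { auth } from "./auth/resource";
import { data } from "./data/resource";
import { dynamodbPerformSearch } from "./functions/dynamodbPerformSearch/resource";

const backend = defineBackend({ auth, data, dynamodbPerformSearch });

// Let the function read from the PastedUrl table's stream
const table = backend.data.resources.tables["PastedUrl"];
const streamPolicy = new Policy(Stack.of(table), "PastedUrlStreamPolicy", {
  statements: [
    new PolicyStatement({
      effect: Effect.ALLOW,
      actions: [
        "dynamodb:DescribeStream",
        "dynamodb:GetRecords",
        "dynamodb:GetShardIterator",
        "dynamodb:ListStreams",
      ],
      resources: ["*"],
    }),
  ],
});
backend.dynamodbPerformSearch.resources.lambda.role?.attachInlinePolicy(streamPolicy);

// Trigger the function on new stream records
const mapping = new EventSourceMapping(Stack.of(table), "PastedUrlStreamMapping", {
  target: backend.dynamodbPerformSearch.resources.lambda,
  eventSourceArn: table.tableStreamArn,
  startingPosition: StartingPosition.LATEST,
});
mapping.node.addDependency(streamPolicy);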

Here is a sample of the handler.ts file containing the DynamoDBStreamHandler (Lambda) function that runs whenever a new record is saved to the PastedUrl DynamoDB table:

import type { DynamoDBStreamHandler } from "aws-lambda";
import axios from "axios";
import { Logger } from "@aws-lambda-powertools/logger";
// parse is assumed to come from json2csv; searchData and logError are helpers defined elsewhere in this file
import { parse } from "json2csv";
import { env } from "$amplify/env/dynamodbPerformSearch";

const logger = new Logger({ serviceName: "dynamodbPerformSearch" });

export const handler: DynamoDBStreamHandler = async (event) => {
  try {
    for (const record of event.Records) {
      logger.info(`Processing record: ${record.eventID}`);
      logger.info(`Event Type: ${record.eventName}`);

      if (record.eventName === "INSERT") {
        const newImage = record.dynamodb?.NewImage;
        if (newImage) {
          const userEmail = newImage.userEmail?.S || '';
          const url = newImage.url?.S ? encodeURIComponent(newImage.url.S + '&country=us') : undefined; //encode the URL
          const productId = newImage.id?.S || '';

          logger.info(`New Image: ${JSON.stringify(newImage)}`);
          logger.info(`User Email: ${userEmail}`);
          logger.info(`URL: ${url}`);

          if (url) {
            const proxyAndUrl = `${env.scrapeops}${url}`; //combine the proxy and URL
            try {
              const response = await axios.get(proxyAndUrl);
              // Get the final data from the response and convert it to a CSV format using parse
              const finalData = await searchData(response.data, productId);
              const finalDataCSV = parse(finalData);
              logger.info(`Axios response successful: ${response.status}`);
              logger.info(`Response final data: ${finalDataCSV}`);
              // Upload the final data to S3
              await uploadToS3(userEmail, url, finalDataCSV);
            } catch (error) {
              logError(error);
              logger.error(`Axios GET request failed.`);
              // Continue to the next record instead of failing the whole batch
              continue;
            }
          }
        }
      }
    }
  } catch (error) {
    // Log the error for retrieving the secret
    logError(error);
    logger.error(`Error processing event.`);
  }

  return {
    batchItemFailures: [],
  };
};
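
searchData isn't shown above; as a rough idea of what it does, here's a hypothetical sketch that pulls the Product fields out of the scraped HTML. The cheerio selectors are made up for illustration and are not hibid.com's real markup:

import * as cheerio from "cheerio";

// Hypothetical parser: extract one row per listing from the scraped page
async function searchData(html: string, productId: string) {
  const $ = cheerio.load(html);
  return $(".lot-tile").toArray().map((el) => ({
    href: $(el).find("a").attr("href") ?? "",
    title: $(el).find(".lot-title").text().trim(),
    currentBid: $(el).find(".lot-bid").text().trim() || undefined,
    timeLeft: $(el).find(".lot-time").text().trim() || undefined,
    productId, // link each row back to the PastedUrl record
  }));
}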

The stream handler then calls a helper function that generates the CSV file and saves it to the S3 bucket:

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3Client = new S3Client(); // access to S3 is granted through the function's IAM role (see File Storage below)

// This is an asynchronous function that uploads a file to S3.
async function uploadToS3(userEmail: string, url: string, searchResults: string) {
  // Convert searchResults to a Buffer
  const buffer = Buffer.from(searchResults, 'utf-8');
  const safeEmail = userEmail.replace(/[@]/g, '_at_').replace(/[.]/g, '_dot_'); //sanitize the email to use as path
  const command = new PutObjectCommand({
    Bucket: "ahhreports", //bucket name should go here
    Key: `searchResults/${safeEmail}/${url}.csv`,
    Body: buffer, // Use searchResults as the content of the file
    ContentType: 'text/csv', // Set the content type of the file
  });

  try {
    // Send the command to the S3 client. This uploads the file.
    await s3Client.send(command);
    // If the upload is successful, log a success message.
    logger.info('Upload successful');
    // Now send the email with the same buffer
    await sendEmailWithAttachment(userEmail, url, buffer);
  } catch (error) {
    // If there's an error during the upload, catch it and log an error message.
    logger.error('Error uploading to S3', error as Error);
  }
}
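
The stored reports could also be served back to users later on. Here's a hypothetical follow-up (not in the current app) that builds a time-limited download link with the standard S3 presigner, assuming the same bucket, key layout, and s3Client from above:

import { GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

// Generate a time-limited download link for a stored report
async function getReportDownloadUrl(safeEmail: string, url: string): Promise<string> {
  const command = new GetObjectCommand({
    Bucket: "ahhreports",
    Key: `searchResults/${safeEmail}/${url}.csv`,
  });
  return getSignedUrl(s3Client, command, { expiresIn: 3600 }); // link valid for one hour
}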

And finally, uploadToS3 calls one more helper that emails a copy of the CSV file to the Cognito user's email address:

import { SESClient, SendRawEmailCommand } from "@aws-sdk/client-ses";

// createRawEmail builds the raw MIME message with the CSV attached (a sketch of it follows below)
async function sendEmailWithAttachment(toAddress: string, url: string, attachmentBuffer: Buffer) {
  const params = {
    Source: "auctionhousehunter@gmail.com", // verified Gmail address
    Destinations: [toAddress],
    RawMessage: {
      Data: new Uint8Array(Buffer.from(createRawEmail(toAddress, url, attachmentBuffer))),
    },
  };

  try {
    // Send the email
    await new SESClient().send(new SendRawEmailCommand(params));
    logger.info('Email sent successfully');
  } catch (error) {
    logger.error('Error sending email', error as Error);
  }
}
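
createRawEmail isn't shown above; building the raw MIME message by hand looks roughly like this. This is a hedged sketch: the subject line, attachment filename, and boundary string are placeholders:

// Hypothetical sketch of createRawEmail: assembles a multipart MIME message with a CSV attachment
function createRawEmail(toAddress: string, url: string, attachmentBuffer: Buffer): string {
  const boundary = "----=_AuctionHouseHunter"; // arbitrary MIME part separator
  return [
    `From: auctionhousehunter@gmail.com`,
    `To: ${toAddress}`,
    `Subject: Your Auction House Hunter results`,
    `MIME-Version: 1.0`,
    `Content-Type: multipart/mixed; boundary="${boundary}"`,
    ``,
    `--${boundary}`,
    `Content-Type: text/plain; charset=utf-8`,
    ``,
    `Attached are the scraped results for: ${url}`,
    ``,
    `--${boundary}`,
    `Content-Type: text/csv; name="results.csv"`,
    `Content-Disposition: attachment; filename="results.csv"`,
    `Content-Transfer-Encoding: base64`,
    ``,
    attachmentBuffer.toString("base64").replace(/(.{76})/g, "$1\r\n"), // wrap per MIME convention
    `--${boundary}--`,
  ].join("\r\n");
}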

File Storage

File storage was complicated with Amplify Gen 2 on a Next.js stack. I wanted to use Amplify Gen 2's built-in dynamic S3 bucket generation, but that functionality is currently broken: Amplify should automatically generate bucket names and pass them as environment variables to Lambda functions, and it does not. Here's the issue ticket in the amplify-backend GitHub repo: "Amplify deploy fails when using function resolver with env by TypeScript error" (https://github.com/aws-amplify/amplify-backend/issues/1374).

There are some attempted workarounds in that ticket, but none of them worked for me. But life finds a way: after a few days of trying, I managed to make it work via an alternate route that is not fully fleshed out in the Amplify documentation.

I manually created a new S3 bucket for this website. Then I opened Amplify in the console, located the Lambda function for my deployed branch, and navigated to its role in the IAM Roles section of the console. There I added permissions for it to access S3 and SES (for sending emails). The S3 permission would have been handled automatically if Amplify Gen 2's environment variable generation and distribution to Lambda functions worked properly with Next.js.
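
For reference, once the environment variable plumbing is fixed, the same permissions could be granted in code from backend.ts instead of the console. This is a sketch using standard CDK calls, with the bucket name matching the one used in uploadToS3:

import { PolicyStatement } from "aws-cdk-lib/aws-iam";

// Grant the scraping function write access to the report bucket and permission to send via SES
backend.dynamodbPerformSearch.resources.lambda.addToRolePolicy(
  new PolicyStatement({
    actions: ["s3:PutObject"],
    resources: ["arn:aws:s3:::ahhreports/*"],
  })
);
backend.dynamodbPerformSearch.resources.lambda.addToRolePolicy(
  new PolicyStatement({
    actions: ["ses:SendRawEmail"],
    resources: ["*"],
  })
);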

[Screenshot]

And here is a CloudWatch log showing the whole chain of functions running successfully after a record is created:

[Screenshot]

The UI components were essentially all taken from the Amplify Dev Center UI Library. They work seamlessly with each other, with the GraphQL/AppSync API connections, and with Cognito user authentication. I decided to take full advantage of the massive time savings that come from using Amplify's preset UI library for React/Next.js.

This project includes all four integrations to qualify for the additional prize categories.

And as I mentioned at the beginning of this post, please be understanding if the website did not send you an email when you saved a record before SES production access was approved (unless I had added your email to the sandbox). AWS is VERY thorough about who gets access to a Simple Email Service production environment that can send to any email address that has not been manually verified. It can take days or even weeks for them to grant access, and this contest simply did not give them enough time before submission, but you can see in the screenshots that everything works properly with my personal verified SES email address.
