Philip Essel

Building a Serverless Language Translation API with AWS & Terraform

Project Objective

This project builds hands-on expertise in cloud automation and Infrastructure as Code (IaC) by leveraging AWS services and Terraform to deploy a serverless language translation application. Through it, I set out to develop key DevOps and cloud skills, including:

  • Writing and managing Terraform configurations for cloud infrastructure deployment.
  • Implementing AWS Lambda functions for serverless computing.
  • Securing cloud resources with IAM roles and policies.
  • Using API Gateway to manage HTTP requests.
  • Automating workflows and deployments to improve system efficiency.

Beyond technical skills, this project contributes to my portfolio, showcasing my ability to design and deploy real-world cloud solutions. It is a stepping stone toward securing a competitive role in Cloud Engineering or DevOps.

Project Overview

The AWS Language Translation Application is a fully serverless solution that enables users to translate English text into German, French, or Spanish. Users can submit text in two ways: via a JSON API request or by uploading text, PDF, or Word documents. The application processes the input using a cloud-based translation service and stores the results for retrieval. The entire infrastructure is deployed using Infrastructure as Code (IaC), ensuring automation and scalability.

Technologies Used & Project Architecture

Technologies Utilized:

  1. API Gateway: Manages HTTP requests.
  2. Lambda Function: Handles text processing and translation.
  3. S3 Bucket: Stores input text and translated results.
  4. IAM Roles & Policies: Controls access to AWS resources.
  5. CloudWatch: Monitors logs and system performance.
  6. Terraform: Automates AWS resource deployment.

Architecture Flow:

  1. API Gateway receives translation requests.
  2. Lambda function processes the request:
    • Stores text in an S3 input bucket.
    • Calls AWS Translate for conversion.
    • Saves the translated result in an S3 output bucket.
  3. Users retrieve translated text from the output bucket via API Gateway.
  4. IAM policies secure communication between services.
  5. CloudWatch logs and monitors performance.

Below is the architecture diagram for this project:

[Architecture diagram: API Gateway, Lambda, AWS Translate, S3 input/output buckets, IAM, and CloudWatch]

Steps To Deploy The Project

The following steps were used to deploy the project:

  1. Project Planning and Architecture Design

    • The project was planned based on its objectives and scope.
    • An architectural diagram was created to visually map out the key AWS services and their interactions, providing a clear understanding of the infrastructure.
    • This planning phase helped identify potential challenges early, streamline resource allocation, and ensure a well-structured deployment process.
  2. Setting Up the Project Environment

    • Installed Terraform to automate AWS resource provisioning.
    • Installed Python for developing the Lambda function.
    • Installed AWS CLI to interact with AWS services from the command line.
    • Configured AWS CLI with the necessary credentials using the command below:
```bash
aws configure
```
    • Connected Terraform to AWS by setting up provider configurations in Terraform files.
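Below is a minimal sketch of what that provider configuration can look like; the version constraint and region are assumptions, not the project's exact values:

```hcl
# provider.tf (sketch) -- declares the AWS provider and pins its version.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0" # assumed version constraint
    }
  }
}

provider "aws" {
  region = var.aws_region # e.g. "us-east-1", supplied via terraform.tfvars
}
```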

  3. Infrastructure as Code: Defining AWS Resources with Terraform

    • The following is the folder structure for the AWS Language Translation project. Each file serves a specific purpose in defining and managing the infrastructure using Terraform:

```text
aws-language-translation-application/
│── docs/
│── api_gateway.tf
│── lambda.tf
│── s3.tf
│── iam.tf
│── variables.tf
│── outputs.tf
│── provider.tf
│── terraform.tfvars
│── translation_request.py
│── translation_request.zip
│── README.md
```

  • Key Terraform Files (a sketch of a few of these follows the list):
   1. `provider.tf` – Configures AWS as the cloud provider, specifying the region and authentication methods.

   2. `variables.tf` – Defines reusable variables such as S3 bucket names, Lambda function names, and IAM role ARNs.

   3. `terraform.tfvars` – Stores values for variables, keeping Terraform configuration flexible.

   4. `s3.tf` – Declares the creation of S3 buckets for storing text input and translated output.

   5. `iam.tf` – Defines the Lambda execution role and grants it access to S3 and CloudWatch; it also allows API Gateway to invoke the function.

   6. `lambda.tf` – Deploys the Lambda function and ensures the correct execution role is assigned.

   7. `api_gateway.tf` – Defines API Gateway to handle translation requests by configuring routes for POST (submitting text for translation) and GET (retrieving translated results). It also integrates API Gateway with Lambda, ensuring seamless request processing and response handling.

   8. `outputs.tf` – Specifies key outputs, such as the API Gateway URL and S3 bucket names, for reference after deployment.
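To make a few of these concrete, here is a hedged sketch of the kind of resources s3.tf and lambda.tf declare. The variable names, handler, and runtime are assumptions rather than the project's exact values:

```hcl
# s3.tf (sketch) -- input and output buckets; names come from variables.tf.
resource "aws_s3_bucket" "input" {
  bucket = var.input_bucket_name
}

resource "aws_s3_bucket" "output" {
  bucket = var.output_bucket_name
}

# lambda.tf (sketch) -- deploys the packaged function with its execution role.
resource "aws_lambda_function" "translate" {
  function_name    = var.lambda_function_name
  role             = aws_iam_role.lambda_exec.arn            # defined in iam.tf
  handler          = "translation_request.lambda_handler"    # assumed handler name
  runtime          = "python3.12"                            # assumed runtime
  filename         = "translation_request.zip"
  source_code_hash = filebase64sha256("translation_request.zip")

  environment {
    variables = {
      INPUT_BUCKET  = aws_s3_bucket.input.bucket
      OUTPUT_BUCKET = aws_s3_bucket.output.bucket
    }
  }
}

# Lets API Gateway invoke the function (the routes live in api_gateway.tf).
resource "aws_lambda_permission" "apigw" {
  statement_id  = "AllowAPIGatewayInvoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.translate.function_name
  principal     = "apigateway.amazonaws.com"
}
```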
  • Writing the Python Script for our Lambda Function.

    • The translation_request.py file in the project folder contains the Python script that powers the Lambda function responsible for processing translation requests. It implements the following logic (a simplified sketch of the handler appears after the packaging step below):
       1. Stores the submitted text in the S3 input bucket.
       2. Immediately translates the text using AWS Translate.
       3. Stores the translated result in the S3 output bucket.
       4. Returns the translated result in the API response.
       5. Retrieves the translated text from the S3 output bucket based on the request ID.
       6. Handles both JSON data and file uploads (such as .txt files).
    • To package the translation_request.py file and its dependencies, we use the following PowerShell command:

```powershell
Compress-Archive -Path translation_request.py -DestinationPath translation_request.zip
```

    • This command packages translation_request.py into translation_request.zip, as shown in the project folder structure above.
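Here is a simplified sketch of the handler logic listed above. The environment variable names, event shape, and object key scheme are assumptions, and the file-upload path is omitted for brevity:

```python
# translation_request.py (simplified sketch, not the exact project script)
import json
import os
import uuid

import boto3

s3 = boto3.client("s3")
translate = boto3.client("translate")

# Assumed environment variables set on the function by Terraform.
INPUT_BUCKET = os.environ["INPUT_BUCKET"]
OUTPUT_BUCKET = os.environ["OUTPUT_BUCKET"]


def lambda_handler(event, context):
    if event.get("httpMethod") == "GET":
        # Retrieve a previously translated result by its request ID.
        request_id = event["queryStringParameters"]["request_id"]
        obj = s3.get_object(Bucket=OUTPUT_BUCKET, Key=f"{request_id}.txt")
        return {"statusCode": 200, "body": obj["Body"].read().decode("utf-8")}

    # POST: accept JSON such as {"text": "...", "target_language": "de"}.
    body = json.loads(event.get("body") or "{}")
    text = body["text"]
    target = body.get("target_language", "de")  # de, fr, or es
    request_id = str(uuid.uuid4())

    # 1. Store the submitted text in the S3 input bucket.
    s3.put_object(Bucket=INPUT_BUCKET, Key=f"{request_id}.txt", Body=text)

    # 2. Translate immediately with AWS Translate.
    result = translate.translate_text(
        Text=text, SourceLanguageCode="en", TargetLanguageCode=target
    )
    translated = result["TranslatedText"]

    # 3. Store the translated result in the S3 output bucket.
    s3.put_object(Bucket=OUTPUT_BUCKET, Key=f"{request_id}.txt", Body=translated)

    # 4. Return the translation and its request ID in the API response.
    return {
        "statusCode": 200,
        "body": json.dumps({"request_id": request_id, "translated_text": translated}),
    }
```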


The architecture for this project follows a serverless design to ensure scalability and cost efficiency:

  1. AWS Lambda handles incoming translation requests and processes them using Amazon Translate.

  2. Amazon S3 stores input files and translated outputs securely.

  3. CloudWatch monitors Lambda execution and logs operational data.

  4. IAM Roles control granular access to AWS resources for enhanced security.

Data flows in from input files stored in S3, is processed by a Lambda function that invokes Amazon Translate, and the translated output is written securely back to S3.
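For example, once deployed, interacting with the API might look like this; the endpoint and route here are placeholders, since the real URL is emitted by outputs.tf after `terraform apply`:

```bash
# Submit text for translation (hypothetical endpoint and route).
curl -X POST "https://<api-id>.execute-api.<region>.amazonaws.com/translate" \
  -H "Content-Type: application/json" \
  -d '{"text": "Hello, world", "target_language": "fr"}'

# Retrieve a stored result later by its request ID.
curl "https://<api-id>.execute-api.<region>.amazonaws.com/translate?request_id=<id>"
```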

Project details (including detailed steps, Terraform configuration files, and code) can be found in my GitHub repository.

The Beginning: Diving into the AWS World

When I decided to deploy a translation service using AWS, I knew it would be both exciting and challenging. AWS is a powerhouse of cloud services, and Terraform adds immense flexibility for deploying infrastructure as code. However, as I dived deeper into the project, I quickly encountered hurdles that tested my problem-solving skills. Here are the key challenges I faced and how I tackled them.

1. Organizing Configuration Files for Clarity and Maintainability

At the start, I chose to declare all resources in a single Terraform configuration file. While this worked initially, it became messy and difficult to manage as the project grew.

To improve this, I reorganized the resources into separate configuration files:

  • s3.tf for S3 resources

  • iam.tf for IAM role configurations

  • lambda.tf for Lambda function deployment

This approach made the code much cleaner and easier to maintain.

2. Avoiding the Exposure of Sensitive Information

During my initial attempt to push the project to GitHub, I made a critical mistake by including both sensitive information and large dependency files. This not only exposed sensitive and private details but also resulted in a bloated repository. For instance, files like .env, terraform.tfvars, *.tfstate, and *.tfstate.backup often store sensitive data, while directories such as .terraform/, along with .zip and .log files, can be large and unnecessarily inflate the repository size.

Realizing this oversight, I researched best practices and implemented a .gitignore file. This simple yet effective solution ensured that sensitive information remained protected and large files were excluded, keeping my repository both secure and efficient.
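The entries themselves were straightforward, covering exactly the files listed above:

```gitignore
# Local state and secrets
.env
terraform.tfvars
*.tfstate
*.tfstate.backup

# Large or regenerable artifacts
.terraform/
*.zip
*.log
```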

3. Packaging Lambda Functions the Right Way

When it was time to deploy the Lambda function, I faced yet another hurdle. I initially uploaded the function without packaging it properly into a zip file. Terraform couldn’t deploy the function successfully, as Lambda requires packaged files with all necessary code and dependencies.

I sought help and initially tried this Linux/MacOS-friendly command from ChatGPT:

```bash
zip lambda_function.zip lambda_function.py
```

Unfortunately, this command didn't work on my Windows machine. After further prompting, I found the solution:

```powershell
Compress-Archive -Path lambda_function.py -DestinationPath lambda_function.zip
```

This Windows-specific command worked perfectly, allowing me to package the Lambda code correctly and meet Terraform's deployment requirements.
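In hindsight, Terraform's built-in archive_file data source (from the hashicorp/archive provider) could sidestep the OS-specific commands entirely. A sketch of that alternative, which I did not use in this project:

```hcl
# Alternative: let Terraform build the zip itself, cross-platform.
data "archive_file" "lambda_zip" {
  type        = "zip"
  source_file = "${path.module}/lambda_function.py"
  output_path = "${path.module}/lambda_function.zip"
}
```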

4. Troubleshooting IAM Role Permissions and Lambda Deployment

Finally, the deployment process hit another barrier when the Lambda function failed due to permission errors. The logs indicated that the function lacked the necessary IAM role to interact with AWS Translate and write logs to CloudWatch.

After carefully reviewing the error logs and IAM policies, I updated the permissions to grant the Lambda function the required access. This troubleshooting process was crucial in getting the deployment back on track.
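The fix amounted to attaching a policy along these lines; the role and policy names are assumptions, but the actions match the errors described (S3 access is granted similarly in iam.tf):

```hcl
# iam.tf (sketch) -- execution role plus Translate and CloudWatch Logs access.
resource "aws_iam_role" "lambda_exec" {
  name = "translation-lambda-role" # assumed name
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
      Action    = "sts:AssumeRole"
    }]
  })
}

resource "aws_iam_role_policy" "lambda_permissions" {
  name = "lambda-translate-permissions" # assumed name
  role = aws_iam_role.lambda_exec.id

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect   = "Allow"
        Action   = ["translate:TranslateText"]
        Resource = "*"
      },
      {
        Effect = "Allow"
        Action = [
          "logs:CreateLogGroup",
          "logs:CreateLogStream",
          "logs:PutLogEvents"
        ]
        Resource = "arn:aws:logs:*:*:*"
      }
    ]
  })
}
```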

Key Takeaways

Through this phase of the project, I learned the importance of maintaining clean, organized code, securing sensitive information, packaging functions correctly, and thoroughly reviewing IAM permissions to ensure granular access control. Each challenge was a valuable learning experience that made me more confident in working with AWS and Terraform.

Next Steps

I'm excited to take these lessons forward and continue exploring cloud-based solutions. If you're new to AWS, I hope my story helps you avoid some of these pitfalls. For those with more experience, I would love to hear about your best practices, insights, and any suggestions for further refining this project. Let's connect, share knowledge, and learn from each other as we continue building in the cloud.
