Patrick for Zentered • Originally published at zentered.co

Deploying a Swift API on AWS Elastic Container Service with CodePipeline

Motivation

After getting started with Swift on Server and deploying a Swift on Server API on Google Cloud Run with Google Cloud Build, we wanted to see how to get the Swift API running on Amazon Web Services (AWS). AWS offers a lot more products and flexibility, which comes at the cost of simplicity, so we created a new Terraform project to consistently re-create the environment. Here's an overview of the puzzle pieces roughly in order:

  • CodeCommit as a mirror of the GitHub repo
  • CodeBuild to build the Docker image and push it to the registry
  • CodeDeploy to orchestrate the deployment and traffic allocation to ECS
  • CodePipeline to glue Commit, Build and Deploy together in a nice deployment pipeline
  • Elastic Container Registry (ECR) to store the Docker images
  • Elastic Container Service (ECS) to run Docker containers
  • Application Load Balancer (ALB) to manage the traffic between instances and allow "Blue/Green Deployment"
  • and a few more 3-letter acronyms and other AWS products like KMS, IAM, S3, CloudWatch...

*Diagram showing the AWS products mentioned above*

Please note the /api folder: for prototyping we've decided to work in a single repo, with the server bits stored in the /api subfolder (which makes things a little more difficult, as this path needs to be specified in various places).

TL;DR: Mistakes made 🤦 and lessons learned

You can find our Terraform project on GitHub

  • for ECS Blue/Green deployments, the "image definition" file is not imagedefinitions.json but imageDetail.json: printf '{"ImageURI":"%s"}' ${REPOSITORY_URI}:latest > imageDetail.json
  • to deploy from CodePipeline to ECS, the "blue/green" strategy is required
  • CodePipeline uses S3 to pass build artifacts from one step to the next
  • a taskdef.json and appspec.yaml are necessary build artifacts passed from the build step to the deploy step (see the sketch after this list)
  • it's appspec.yaml (y**a**ml, with the 'a'), not appspec.yml
  • the Docker container port (i.e. 8080) is used everywhere except for the Load Balancer Listener
  • Fargate requires network mode awsvpc and public IPs, or private IPs and a lot more configuration with firewalls, networks, etc.
  • IAM permissions are a mess
  • snake_case should be used for all Terraform names
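
For reference, here's a minimal sketch of the appspec.yaml that CodeDeploy expects for an ECS Blue/Green deployment. The container name and port are assumptions from our setup; `<TASK_DEFINITION>` is a literal placeholder that the pipeline substitutes with the new task definition ARN at deploy time:

```yaml
version: 0.0
Resources:
  - TargetService:
      Type: AWS::ECS::SERVICE
      Properties:
        # the pipeline replaces this placeholder with the new task definition ARN
        TaskDefinition: <TASK_DEFINITION>
        LoadBalancerInfo:
          # must match the container name and port in taskdef.json
          ContainerName: "api"
          ContainerPort: 8080
```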

Getting down to the nitty gritty

Setting up this project was anything but smooth. There are a lot of little details, and a single wrong number can cost you hours of debugging. The AWS error messages are misleading, the user/developer experience is a mess, and each of the products uses a different UI; all in all, very chaotic. At least with Terraform we're able to write "Infrastructure as Code" and provide a working Terraform project that you can use as a starting point. Some product-specific information first:

AWS CodeCommit

CodeBuild's GitHub integration uses OAuth, which gives Amazon access to all your public and private repos. We're a little paranoid, so that's not something we wanted to do. Instead, we created a CodeCommit repo and push our changes there, until AWS switches to the new GitHub App experience, which lets you choose individual repos.

In order to push to CodeCommit, you need to add your SSH public key to your IAM user and add a CodeCommit remote to your git config.
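
Something along these lines should work; the region, repo name and SSH key ID are placeholders for your own values:

```sh
# map the IAM SSH key ID to CodeCommit in ~/.ssh/config:
#   Host git-codecommit.*.amazonaws.com
#     User <your-iam-ssh-key-id>
#     IdentityFile ~/.ssh/id_rsa

# then add CodeCommit as an additional remote and push
git remote add codecommit ssh://git-codecommit.us-east-1.amazonaws.com/v1/repos/<repo-name>
git push codecommit main
```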

You can probably skip this with the AWS Connector for GitHub where you can connect a GitHub Repo as Source in the Pipeline. We're still working that out with Terraform.

AWS CodeBuild

CodeBuild was one of the easier tools to use: just add your buildspec.yml. The important part is the build artifacts, which are exported to S3 and used by the deploy step.
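
Here's a rough sketch of what such a buildspec.yml can look like for this setup (not our exact file): it builds the image from the /api subfolder and exports the three artifacts the deploy step needs. We assume $AWS_ACCOUNT_ID and $REPOSITORY_URI are set as environment variables on the CodeBuild project, and that taskdef.json and appspec.yaml are checked in under /api:

```yaml
version: 0.2

phases:
  pre_build:
    commands:
      # log in to ECR; AWS_DEFAULT_REGION is provided by CodeBuild
      - aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com
  build:
    commands:
      # the Dockerfile lives in the /api subfolder
      - docker build -t $REPOSITORY_URI:latest ./api
      - docker push $REPOSITORY_URI:latest
  post_build:
    commands:
      # Blue/Green deployments expect imageDetail.json, not imagedefinitions.json
      - printf '{"ImageURI":"%s"}' $REPOSITORY_URI:latest > imageDetail.json

artifacts:
  files:
    - imageDetail.json
    - api/taskdef.json
    - api/appspec.yaml
```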

Elastic Container Registry & Elastic Container Service (ECS)

To get a head start for your builds, you can push your latest image to ECR:

```sh
aws ecr get-login-password --region <region> | docker login --username AWS --password-stdin <account_id>.dkr.ecr.<region>.amazonaws.com
docker push <account_id>.dkr.ecr.<region>.amazonaws.com/<image>:latest
```

ECS was the worst part to get up and running, since there are a lot of different components playing together. There are containers, tasks, services, task definitions, image definitions and more. If you encounter any issues, it's worth looking at the service first, then its tasks, to see whether the container boots up correctly.
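
A minimal sketch of that drill-down with the AWS CLI (the cluster and service names here are made-up examples):

```sh
# check the service events and deployment status first
aws ecs describe-services --cluster swift-api-cluster --services swift-api-service

# list the running (or recently stopped) tasks of the service
aws ecs list-tasks --cluster swift-api-cluster --service-name swift-api-service

# inspect a task's lastStatus and stoppedReason
aws ecs describe-tasks --cluster swift-api-cluster --tasks <task-arn>
```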

Let's Deploy

We've published the entire Terraform project here: Terraform ECS Blue/Green Deployments with CodePipeline. You need to download the Terraform CLI. If you're new to AWS and you don't have the AWS CLI installed, you can find instructions here.

```sh
terraform init
terraform validate
terraform plan
terraform apply
```

Terraform outputs the URL of the load balancer; you should be able to navigate to that URL and get the 'hello world' response.
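
To verify from the terminal, something like this should work (we're assuming the output is called alb_url here; check the project's outputs.tf for the actual name):

```sh
curl "http://$(terraform output -raw alb_url)"
```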

Clean up:

```sh
terraform destroy
```

Feel free to fork/clone the repo and adjust it to your needs.

Summary

AWS comes with a lot of benefits and great features, but the UI is inconsistent across products, and there are dozens of configuration options, permissions and other details to take care of, so it's easy to miss something important. Setting up the project in Terraform makes things a little easier, but it still takes time to write down all the configs. There are some pre-built Terraform modules for AWS, which we didn't use, as we wanted to learn how things work under the hood. In comparison, deploying to GCP was definitely a lot easier and faster to set up. On GCP, however, there are a lot of "beta" products, and things can change or sometimes don't work as expected.

Special thanks and further reading ...

Special thanks to "vinycoolguy2015", "snow-dev" and Capital One for their articles and resources on the topic. Here's a list of links to articles and repos that helped put together this project:

Thanks for reading! You can check out all the code on GitHub and read more about our "Swift on Server" series:

If you have any questions or comments, please reach out on Twitter or start a discussion on GitHub.
