
Ben Curtis for Honeybadger

Posted on • Originally published at honeybadger.io

Configure Your App with SSM Parameter Store

Configuring your Rails app via environment variables works well, but sometimes you want to be able to update your configuration on the fly. Here's a way to update your app's environment using SSM Parameter Store.

Why would you want to do this? Well, say you deploy your Rails app to an EC2 instance that's part of an autoscaling group. To get the fastest boot times, you should create a custom AMI with your code already on it (a.k.a. a golden image) that the autoscaling group can use when it's time to boot a new instance. Unfortunately, if you store your configuration on the image (and use something like dotenv to load it), you'll need to create a new AMI every time you have a configuration change. You can work around this by storing your configuration in SSM Parameter Store and letting your app fetch it at boot time.

Putting data into Parameter Store is easy enough -- you can use the CLI or the AWS console to edit the variables. But how do you get the data back out for your app to use? One way to do it is to fetch these parameters into your ENV via an initializer, like so:

# Copy each parameter under /honeybadger/<env>/ into ENV, using the last
# path segment as the variable name (results are paginated, so walk each page)
Aws::SSM::Client.new.get_parameters_by_path(path: "/honeybadger/#{Rails.env}/", recursive: true, with_decryption: true).each_page do |page|
  page.parameters.each { |param| ENV[param.name.split("/").last] = param.value }
end

Assuming you have parameters with names like /honeybadger/production/MY_API_KEY, this snippet will result in ENV["MY_API_KEY"] having whatever value you supplied for that parameter.
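
If you'd rather script the writes as well, the same kind of parameter can be created with the SDK's put_parameter call. Here's a minimal sketch -- the name and value are only examples:

# Create (or update) a SecureString parameter under the same path convention.
# The name and value below are illustrative.
ssm = Aws::SSM::Client.new
ssm.put_parameter(
  name: "/honeybadger/production/MY_API_KEY",
  value: "super-secret-value",
  type: "SecureString", # encrypted with your account's default SSM KMS key
  overwrite: true       # replace the value if the parameter already exists
)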

But what if loading the environment variable values in an initializer is too late in the Rails boot process? What if you store settings like DATABASE_URL in SSM and need to use them before your app loads? For that, you can save the variables to a .env file and let dotenv handle the loading. But first, let's set up the database and store our DATABASE_URL value.

Here's a terraform snippet that creates an RDS instance and stores the connection in SSM Parameter Store:

resource "aws_db_instance" "db" {
  allocated_storage      = 20
  storage_type           = "gp2"
  engine                 = "postgres"
  engine_version         = "11.4"
  password               = "${var.database_password}"
  name                   = "honeybadger"
  username               = "honeybadger"
}

resource "aws_ssm_parameter" "database_url" {
  name  = "/honeybadger/${var.environment}/DATABASE_URL"
  type  = "SecureString"
  value = "postgres://${aws_db_instance.db.username}:${var.database_password}@${aws_db_instance.db.endpoint}/${aws_db_instance.db.name}"
}

With the following shell command, you can grab all the parameters (just like we did with the Ruby snippet above) and get them ready for use as environment variables:

aws ssm get-parameters-by-path --path /honeybadger/production/ \
  --recursive --with-decryption --output text \
  --query "Parameters[].[Name,Value]" |
  sed -E 's#/honeybadger/production/([^[:space:]]*)[[:space:]]*#export \1=#' \
  > /home/honeybadger/shared/.env.production.local
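
The resulting file is just a list of export lines, one per parameter. With the DATABASE_URL parameter from the Terraform snippet above, it would look something like this (values are illustrative):

export DATABASE_URL=postgres://honeybadger:...@your-db-endpoint:5432/honeybadger
export MY_API_KEY=...

Note that the sed command doesn't quote the values, so this works best for values without spaces or other shell-special characters.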

Now you have a .env.production.local file that dotenv can load -- assuming it's symlinked into your current release directory at deploy time if you're using Capistrano. As a bonus, you can also source that env file to have variables like $DATABASE_URL defined for you in any shell scripts you want to run.
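
If you're using dotenv-rails, .env.production.local is one of the files it loads by convention, so there's nothing more to do. If you're loading dotenv yourself, here's a minimal sketch you could run early in boot, before anything reads ENV -- the file location is an assumption about your app layout:

# e.g. near the top of config/application.rb, before Rails reads database.yml
require "dotenv"
Dotenv.load(File.expand_path("../.env.#{ENV.fetch('RAILS_ENV', 'development')}.local", __dir__))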

We put that shell command in our deployment script (which is triggered by our CI/CD pipeline), so any new code that goes to production will pick up any changes made to our parameters in SSM. Now we get to have our golden image, and we don't have to build a new one for every configuration change. 😎
