<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Charlie DiGiovanna</title>
    <description>The latest articles on DEV Community by Charlie DiGiovanna (@cd17822).</description>
    <link>https://dev.to/cd17822</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F128694%2Fe9bdfc08-c17c-4d34-aeb7-bf0b99965ffd.jpg</url>
      <title>DEV Community: Charlie DiGiovanna</title>
      <link>https://dev.to/cd17822</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/cd17822"/>
    <language>en</language>
    <item>
      <title>Using Docker on Lambda for Postgres to S3 Backups</title>
      <dc:creator>Charlie DiGiovanna</dc:creator>
      <pubDate>Thu, 07 Jan 2021 23:22:48 +0000</pubDate>
      <link>https://dev.to/cd17822/using-docker-on-lambda-for-postgres-to-s3-backups-28gn</link>
      <guid>https://dev.to/cd17822/using-docker-on-lambda-for-postgres-to-s3-backups-28gn</guid>
      <description>&lt;h3&gt;
  
  
  Intro
&lt;/h3&gt;

&lt;p&gt;If you don't care about any context you can just skip down to the Ship it section.&lt;/p&gt;

&lt;p&gt;I've never written a tech blog post before, but I figured out how to leverage AWS Lambda's &lt;a href="https://aws.amazon.com/blogs/aws/new-for-aws-lambda-container-image-support/"&gt;newly announced container image support (Dec 1, 2020)&lt;/a&gt; to back up a database I'm maintaining and it wasn't as straightforward as I'd hoped, so I figured I'd write something on it.&lt;/p&gt;

&lt;p&gt;For context, as a cost-cutting measure™, I have just a single EC2 instance with some docker containers running my application, and that same EC2 instance houses my database! Sorry if you hate it!&lt;/p&gt;

&lt;p&gt;If you &lt;em&gt;also&lt;/em&gt; didn't feel like dealing with the costs and complexities of RDS &amp;amp; snapshotting and just want a nice way to back up your data, you're in the right place!&lt;/p&gt;

&lt;h3&gt;
  
  
  My reasons
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;I have 2 (count 'em, 2) users for an app I made, and I want to make sure their data is less ephemeral than a database-on-an-EC2-instance setup would suggest. I imagine this scales to databases much larger than mine, though I haven't tested that personally; you're ultimately up against the max timeout (15 minutes) and the max memory capacity (10GB) of a Lambda execution.&lt;/li&gt;
&lt;li&gt;It's super cheap: for S3, all you're paying for is storage, since data transfer &lt;em&gt;into&lt;/em&gt; S3 is free and you shouldn't be exporting from S3 very often. And Lambda itself is very cheap; I've calculated that for my setup, backing up an admittedly very small amount of data once every hour, it will cost approximately $0.05/mo.&lt;/li&gt;
&lt;li&gt;It's super configurable: you can snapshot as frequently or infrequently as you'd like, and you could even back up only certain tables if you wanted. I'm not super familiar with RDS' snapshotting capabilities; maybe they're similar.&lt;/li&gt;
&lt;li&gt;I can pull my production data down for local testing/manipulating very easily! It's just a single docker command!&lt;/li&gt;
&lt;/ol&gt;
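&lt;p&gt;That cost claim is easy to sanity-check with back-of-the-envelope math. Here's a sketch: the per-GB-second and per-request prices are the published on-demand Lambda rates at the time of writing, while the duration and memory figures are made-up assumptions for a tiny database, not measurements from my setup:&lt;/p&gt;

```python
# Rough monthly Lambda cost for one backup per hour.
GB_SECOND_PRICE = 0.0000166667  # $ per GB-second of compute (on-demand rate)
REQUEST_PRICE = 0.0000002       # $ per invocation ($0.20 per million)

invocations = 24 * 30   # hourly, for a month
duration_s = 10         # assumed dump duration for a small database
memory_gb = 0.5         # assumed 512 MB function

compute_cost = invocations * duration_s * memory_gb * GB_SECOND_PRICE
request_cost = invocations * REQUEST_PRICE
print(f"~${compute_cost + request_cost:.2f}/mo")  # → ~$0.06/mo
```

&lt;p&gt;Pennies per month, and that's before the Lambda free tier (which covers a workload this small entirely) even enters the picture.&lt;/p&gt;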

&lt;h3&gt;
  
  
  High-Level
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;For the actual database backups I adapted a Docker-based script I found on GitHub &lt;a href="https://github.com/schickling/dockerfiles/tree/master/postgres-backup-s3"&gt;here&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;For running that script I'm using AWS Lambda's new "Container Image" option. I borrowed the Dockerfile from the &lt;strong&gt;Building a Custom Image for Python&lt;/strong&gt; section of &lt;a href="https://aws.amazon.com/blogs/aws/new-for-aws-lambda-container-image-support/"&gt;Amazon's announcement&lt;/a&gt; to help configure things in a way that'd make sense.&lt;/li&gt;
&lt;li&gt;For triggering the Lambda on a cron I'm using &lt;a href="https://console.aws.amazon.com/events/home"&gt;Amazon EventBridge&lt;/a&gt;. I've never heard of it before but it was really easy to create a rule that just says, "run this Lambda once an hour," so I'm recommending it.&lt;/li&gt;
&lt;li&gt;I'm storing the SQL dump files in a private S3 bucket with a retention policy of 14 days.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Let's go lower now
&lt;/h3&gt;

&lt;p&gt;So in my head I was like, "Oh cool I can just chuck this Dockerfile on a Lambda and run a command on an image with some environment variables once an hour and we're golden."&lt;/p&gt;

&lt;p&gt;It doesn't really work that way though.&lt;/p&gt;

&lt;p&gt;Lambda requires that your Docker image's entrypoint be a function, in some programming language, that gets called when triggered.&lt;/p&gt;

&lt;p&gt;So rather than just being able to trigger a docker run command (which could in my case run a script, &lt;code&gt;backup.sh&lt;/code&gt;) like so:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker run schickling/postgres-backup-s3
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It's more that you're setting up a Docker environment for a program (in my case a Python program), that'll have an entrypoint function, that'll run whatever you need to run (again, in my case &lt;code&gt;backup.sh&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;What's that entrypoint function look like? Pretty simple:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;json&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;subprocess&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Received event: "&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;indent&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="n"&gt;subprocess&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;run&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"sh backup.sh"&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;" "&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Process complete."&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Mangling the example Dockerfile from the &lt;strong&gt;Building a Custom Image for Python&lt;/strong&gt; section of &lt;a href="https://aws.amazon.com/blogs/aws/new-for-aws-lambda-container-image-support/"&gt;Amazon's announcement&lt;/a&gt; to look more like &lt;a href="https://github.com/schickling/dockerfiles/blob/master/postgres-backup-s3/Dockerfile"&gt;the Dockerfile of the Postgres to S3 backup script&lt;/a&gt; was the more complicated part. You can take a look at what the final Dockerfile looks like &lt;a href="https://github.com/cd17822/lambda-s3-pg-backup/blob/main/db-backup.Dockerfile"&gt;here&lt;/a&gt;.&lt;/p&gt;
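&lt;p&gt;One small thing worth noting about that handler: &lt;code&gt;subprocess.run&lt;/code&gt; without &lt;code&gt;check=True&lt;/code&gt; swallows a non-zero exit code, so a failed dump would still report success to Lambda. A hardened variant could look like the sketch below. This is not the code in the repo; the &lt;code&gt;cmd&lt;/code&gt; parameter exists only so it can be exercised locally, and the env-var cleanup anticipates a gotcha covered later in this post:&lt;/p&gt;

```python
import json
import os
import subprocess

def handler(event, context, cmd="sh backup.sh"):
    print("Received event: " + json.dumps(event, indent=2))
    # Drop the credentials Lambda injects automatically; they trip up
    # the aws CLI inside backup.sh (see the gotchas section).
    for var in ("AWS_SECURITY_TOKEN", "AWS_SESSION_TOKEN"):
        os.environ.pop(var, None)
    # check=True raises CalledProcessError on failure, so the invocation
    # is marked as errored instead of quietly returning 0.
    subprocess.run(cmd.split(" "), check=True)
    print("Process complete.")
    return 0
```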

&lt;h3&gt;
  
  
  Some other gotchas
&lt;/h3&gt;

&lt;h4&gt;
  
  
  AWS Environment variables
&lt;/h4&gt;

&lt;p&gt;By far the most annoying part of this whole thing was finding some &lt;a href="https://github.com/jschneier/django-storages/issues/606#issuecomment-426136721"&gt;GitHub issue comment&lt;/a&gt; mentioning that Lambda automatically sets the &lt;code&gt;AWS_SESSION_TOKEN&lt;/code&gt; and &lt;code&gt;AWS_SECURITY_TOKEN&lt;/code&gt; environment variables. It turns out they were causing a hard-to-track-down error in the backup script's invocation of the aws CLI, along the lines of:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;An error occurred (InvalidToken) when calling the PutObject operation: The provided token is malformed or otherwise invalid.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If all this article accomplishes is that someone stumbles upon this section after Googling that error, I will consider it astoundingly successful.&lt;/p&gt;

&lt;p&gt;Anyway, I just had to edit the &lt;code&gt;backup.sh&lt;/code&gt; file to add these two lines and the complaining stopped:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;unset &lt;/span&gt;AWS_SECURITY_TOKEN
&lt;span class="nb"&gt;unset &lt;/span&gt;AWS_SESSION_TOKEN
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Writing to files
&lt;/h4&gt;

&lt;p&gt;Lambda didn't like that the &lt;code&gt;backup.sh&lt;/code&gt; script was writing to a file, presumably because the function's filesystem is read-only at runtime (apart from &lt;code&gt;/tmp&lt;/code&gt;). After a couple minutes of researching with no luck, I decided to change:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Creating dump of &lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;POSTGRES_DATABASE&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; database from &lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;POSTGRES_HOST&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;..."&lt;/span&gt;
pg_dump &lt;span class="nv"&gt;$POSTGRES_HOST_OPTS&lt;/span&gt; &lt;span class="nv"&gt;$POSTGRES_DATABASE&lt;/span&gt; | &lt;span class="nb"&gt;gzip&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; dump.sql.gz

&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Uploading dump to &lt;/span&gt;&lt;span class="nv"&gt;$S3_BUCKET&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
&lt;span class="nb"&gt;cat &lt;/span&gt;dump.sql.gz | aws &lt;span class="nv"&gt;$AWS_ARGS&lt;/span&gt; s3 &lt;span class="nb"&gt;cp&lt;/span&gt; - s3://&lt;span class="nv"&gt;$S3_BUCKET&lt;/span&gt;/&lt;span class="nv"&gt;$S3_PREFIX&lt;/span&gt;/&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;POSTGRES_DATABASE&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;_&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;date&lt;/span&gt; +&lt;span class="s2"&gt;"%Y-%m-%dT%H:%M:%SZ"&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;.sql.gz &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="nb"&gt;exit &lt;/span&gt;2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;to:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"Creating dump of &lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;POSTGRES_DATABASE&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; database from &lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;POSTGRES_HOST&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; and uploading dump to &lt;/span&gt;&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;S3_BUCKET&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;..."&lt;/span&gt;

pg_dump &lt;span class="nv"&gt;$POSTGRES_HOST_OPTS&lt;/span&gt; &lt;span class="nv"&gt;$POSTGRES_DATABASE&lt;/span&gt; | &lt;span class="nb"&gt;gzip&lt;/span&gt; | aws &lt;span class="nv"&gt;$AWS_ARGS&lt;/span&gt; s3 &lt;span class="nb"&gt;cp&lt;/span&gt; - s3://&lt;span class="nv"&gt;$S3_BUCKET&lt;/span&gt;/&lt;span class="nv"&gt;$S3_PREFIX&lt;/span&gt;/&lt;span class="k"&gt;${&lt;/span&gt;&lt;span class="nv"&gt;POSTGRES_DATABASE&lt;/span&gt;&lt;span class="k"&gt;}&lt;/span&gt;_&lt;span class="si"&gt;$(&lt;/span&gt;&lt;span class="nb"&gt;date&lt;/span&gt; +&lt;span class="s2"&gt;"%Y-%m-%dT%H:%M:%SZ"&lt;/span&gt;&lt;span class="si"&gt;)&lt;/span&gt;.sql.gz &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="nb"&gt;exit &lt;/span&gt;2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;There might be a better way around this but I couldn't find one, so here we are, just piping away.&lt;/p&gt;
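&lt;p&gt;If you ever pull the piping out of the shell script and into the handler itself, the same no-intermediate-file idea is easy to express with &lt;code&gt;subprocess&lt;/code&gt;. A sketch with placeholder commands (the real thing would run &lt;code&gt;pg_dump&lt;/code&gt; as the producer and &lt;code&gt;aws s3 cp -&lt;/code&gt; as the consumer):&lt;/p&gt;

```python
import subprocess

def pipe(producer, consumer):
    # Chain the producer's stdout into the consumer's stdin without ever
    # touching disk, mirroring the pg_dump | gzip | aws s3 cp - pipeline.
    src = subprocess.Popen(producer, stdout=subprocess.PIPE)
    dst = subprocess.run(consumer, stdin=src.stdout)
    src.stdout.close()
    src.wait()
    return src.returncode, dst.returncode
```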

&lt;h4&gt;
  
  
  Lambda timeouts
&lt;/h4&gt;

&lt;p&gt;The Lambda I had was timing out by default after 3 seconds. Make sure you jack that up in the function configuration's &lt;strong&gt;General Configuration&lt;/strong&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Test it out locally
&lt;/h3&gt;

&lt;p&gt;Testing this out locally is really easy because the example Dockerfile that Amazon provided in their announcement has "Lambda Runtime Interface Emulator" support &lt;a href="https://github.com/cd17822/lambda-s3-pg-backup/blob/main/db-backup.Dockerfile#L55-L57"&gt;built in&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Build the image:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker build &lt;span class="nt"&gt;-t&lt;/span&gt; db-backup &lt;span class="nt"&gt;-f&lt;/span&gt; db-backup.Dockerfile &lt;span class="nb"&gt;.&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run it in Terminal Window 1:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker run &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;POSTGRES_DATABASE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;POSTGRES_DATABASE&amp;gt;ms
    &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;POSTGRES_HOST&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;POSTGRES_HOST&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;POSTGRES_PASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;POSTGRES_PASSWORD&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;POSTGRES_USER&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;POSTGRES_USER&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;S3_ACCESS_KEY_ID&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;S3_ACCESS_KEY_ID&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;S3_BUCKET&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;S3_BUCKET&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;S3_REGION&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;S3_REGION&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;S3_PREFIX&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;S3_PREFIX&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;S3_SECRET_ACCESS_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&amp;lt;S3_SECRET_ACCESS_KEY&amp;gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;-p&lt;/span&gt; 9000:8080 &lt;span class="se"&gt;\&lt;/span&gt;
    db-backup:latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Trigger it in Terminal Window 2:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;-XPOST&lt;/span&gt; &lt;span class="s2"&gt;"http://localhost:9000/2015-03-31/functions/function/invocations"&lt;/span&gt; &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="s1"&gt;'{}'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
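&lt;p&gt;If you'd rather script the smoke test than remember the curl incantation, the same invocation is a few lines of standard-library Python. A sketch; it assumes the emulator container from Terminal Window 1 is listening on port 9000:&lt;/p&gt;

```python
import json
import urllib.request

def invocation_url(host="localhost", port=9000):
    # The Runtime Interface Emulator exposes the standard Invoke API path.
    return f"http://{host}:{port}/2015-03-31/functions/function/invocations"

def invoke_local(event=None):
    req = urllib.request.Request(
        invocation_url(),
        data=json.dumps(event or {}).encode(),
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

# invoke_local()  # run this while the emulator container is up
```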



&lt;h3&gt;
  
  
  Ship it
&lt;/h3&gt;

&lt;p&gt;Sure, the right™ way to ship it can be debated by whoever, but the simplest way is probably:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Clone the code:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/cd17822/lambda-s3-pg-backup.git
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Build the Docker image:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker build &lt;span class="nt"&gt;-t&lt;/span&gt; db-backup &lt;span class="nt"&gt;-f&lt;/span&gt; db-backup.Dockerfile
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Tag the built image to be pushed to a private &lt;a href="https://aws.amazon.com/ecr/"&gt;ECR&lt;/a&gt; repository:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker tag db-backup &amp;lt;AWS_ACCOUNT_ID&amp;gt;.dkr.ecr.&amp;lt;ECR_REGION&amp;gt;.amazonaws.com/&amp;lt;ECR_REPO_NAME&amp;gt;:latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Push the image up to ECR:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker push &amp;lt;AWS_ACCOUNT_ID&amp;gt;.dkr.ecr.&amp;lt;ECR_REGION&amp;gt;.amazonaws.com/&amp;lt;ECR_REPO_NAME&amp;gt;:latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a private S3 bucket that we'll be storing the backups in (I'd also recommend setting up a retention policy unless you want to keep around these files forever).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a Lambda function by selecting &lt;code&gt;Create Function&lt;/code&gt; in the &lt;a href="https://console.aws.amazon.com/lambda/home"&gt;Lambda Console&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;code&gt;Container Image&lt;/code&gt;, name it whatever you want, find the Docker image in ECR in &lt;code&gt;Browse Images&lt;/code&gt;, leave everything else as default and finally select &lt;code&gt;Create Function&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Scroll down to &lt;strong&gt;Environment variables&lt;/strong&gt; and set values for the following environment variables:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;POSTGRES_DATABASE
POSTGRES_HOST
POSTGRES_PASSWORD
POSTGRES_USER
S3_ACCESS_KEY_ID
S3_BUCKET
S3_PREFIX
S3_SECRET_ACCESS_KEY
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Scroll down further and make sure you edit the &lt;strong&gt;General Configuration&lt;/strong&gt; such that the Timeout is bumped up to something like 5 minutes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;At this point you can select &lt;code&gt;Test&lt;/code&gt; on the top right and check to make sure your function's working.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Finally, you can set up a scheduled trigger by selecting &lt;code&gt;Add trigger&lt;/code&gt; in the &lt;strong&gt;Designer&lt;/strong&gt;. I'd recommend setting up a simple EventBridge trigger that runs on a cron (&lt;code&gt;cron(13 * * * ? *)&lt;/code&gt;; note that EventBridge cron expressions take six fields) or with a set frequency (&lt;code&gt;rate(1 hour)&lt;/code&gt;).&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Restoring from backup
&lt;/h3&gt;

&lt;p&gt;You could set up a Lambda to restore your database from backup that's triggered by emailing a photo of yourself crying to an unsolicited email address using AWS Computer Vision, but for the sake of this article I figured I'd just include the easy way to do it.&lt;/p&gt;

&lt;p&gt;The same repo that houses the backup script also contains a &lt;a href="https://github.com/schickling/dockerfiles/tree/master/postgres-restore-s3"&gt;restore script&lt;/a&gt;. It's hosted on &lt;a href="https://registry.hub.docker.com/r/schickling/postgres-restore-s3"&gt;DockerHub&lt;/a&gt;, making it really easy to pull and run locally:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker pull schickling/postgres-restore-s3
docker run &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;S3_ACCESS_KEY_ID&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;key &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;S3_SECRET_ACCESS_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;secret &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;S3_BUCKET&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;my-bucket &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;S3_PREFIX&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;backup &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;POSTGRES_DATABASE&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;dbname &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;POSTGRES_USER&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;user &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;POSTGRES_PASSWORD&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;password &lt;span class="nt"&gt;-e&lt;/span&gt; &lt;span class="nv"&gt;POSTGRES_HOST&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;localhost schickling/postgres-restore-s3
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>lambda</category>
      <category>docker</category>
      <category>postgres</category>
      <category>aws</category>
    </item>
  </channel>
</rss>
