When deploying Python code to AWS Lambda, you must zip all of your code into a single archive, with all of your package dependencies (the Python libraries you would usually pip install into your virtualenv) placed at the root level of the archive.
I wanted a way to do this easily, without having to log into the AWS console and use the GUI to zip up my code every time I modified the source. I also wanted a way to run the tests and stop the process from deploying if they didn't pass, much like what you would expect from a continuous integration server such as Jenkins. The goal was to be able to modify code or pip install packages, and with a single command test my code, zip my code, and deploy it to Lambda.
If you aren't familiar with make, here is a description: a GNU build automation tool that automatically builds programs from source code by reading files called Makefiles, which specify how to build the program.
This is commonly used in C projects to compile all the source code and bundle it into a single executable, and it comes pre-installed on most UNIX systems, such as macOS and Linux. Another great benefit of make is that it is smart enough to re-build only the files that have changed, rather than regenerating every file every time. Rules execute in the order you specify, and stop executing if any command fails.
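As a tiny illustration of that incremental behavior (the file names here are hypothetical), a rule only re-runs when its prerequisite is newer than its target:

```makefile
# `make report.txt` runs the sort only if data.csv is newer than
# report.txt; otherwise make reports that the target is up to date.
report.txt: data.csv
	sort data.csv > report.txt
```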
Ultimately, make seems like a great fit for our needs, since we essentially want to transform our source files into a single target (the zip file)! We also want the deployment process to halt if our tests fail.
First, just a quick intro to some
make commands and syntax:
- To use make, you type make in your terminal, in a directory that contains a file called "Makefile"
- A Makefile is a collection of rules. Each rule is a recipe to do a specific thing.
- Recipe syntax:

```makefile
<target>: <prerequisites...>
	<commands>
```
The target is what you call from the terminal to execute the commands, as in make <target>, similar to a function name.
The prerequisites are anything this target relies upon, which could be a file or another target.
The commands are a series of shell commands you'd like to execute, like a function body.
The target is required; the prerequisites and commands are optional, but you must have one or the other.
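As a concrete example (the file names are made up), here is a rule whose target is a file, with one prerequisite and two commands:

```makefile
# target: greeting.txt, prerequisite: name.txt, commands: the indented lines
greeting.txt: name.txt
	echo Hello > greeting.txt
	cat name.txt >> greeting.txt
```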
- Magic variables: special make syntax that lets us easily refer to the target / prerequisites.

```makefile
# Example:
libs/: requirements.txt requirements-test.txt

$@  # refers to the target: "libs/"
$<  # refers to the first prerequisite: "requirements.txt"
$^  # refers to all prerequisites: "requirements.txt requirements-test.txt"
```
The syntax is unique and can take some getting used to! But in my opinion it is part of what makes make (uh) a powerful tool for compiling source code, compared to a plain bash script. Here's a cheat sheet for reference.
We will want to do the following:
- Test our code
- Zip our code
- Deploy our code
- Clean up any files / dirs we created
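These steps map naturally onto Makefile targets; roughly sketched (each rule gets filled in over the next few sections):

```makefile
.PHONY: all test deploy clean
all: test deploy clean  # run the whole pipeline

test:               # run the test suite
libs:               # install dependencies into ./libs
output.zip: libs    # zip source code plus libs
deploy: output.zip  # push the zip to aws
clean:              # remove generated files
```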
This tutorial assumes you have the following installed:
- aws cli
Test our code
```makefile
.PHONY: test
test: requirements-test.txt ## Run all tests
	pip install -r $<  # install your test dependencies (e.g. mypy, pylint, nose, tox)
	## run your tests here. some examples:
	python -m unittest discover tests/
	nosetests
	pylint main.py
	mypy main.py
	tox
	# etc.
```
Zip our code
We begin by installing the library dependencies.
```makefile
# Our target is the libs dir, which has a requirements file as a prerequisite
libs: requirements.txt ## Install all libraries
	@[ -d $@ ] || mkdir $@   # Create the libs dir if it doesn't exist
	pip install -r $< -t $@  # We use -t to specify the destination of the
	                         # packages, so that they don't install into your
	                         # virtualenv by default
```
Zip the source code and the libraries.
```makefile
# Our target is the zip file, which has libs as a prerequisite
output.zip: libs ## Output all code to zip file
	zip -r $@ *.py  # zip all python source code into output.zip
	cd $< && zip -rm ../$@ *  # zip the installed libraries into output.zip
# We `cd` into the directory since zip always keeps the relative paths,
# and lambda requires the library dependencies at the root of the archive.
# Each line of a make recipe runs in a separate invocation of the shell,
# which is why the cd and zip commands must be combined on one line here.
# The -m flag moves the files into the zip, deleting them after archiving,
# since we don't need them anymore.
```
Deploy our code
```makefile
# Our target is deploy, which has the zip file as a prerequisite
# Note: since deploy doesn't refer to an actual file or directory, it's good
# practice to declare it as .PHONY, so make knows not to look for that file.
.PHONY: deploy
deploy: output.zip ## Deploy all code to aws
	-aws lambda update-function-code \
		--function-name my-lambda-function \
		--zip-file fileb://$<
# This assumes you have the aws cli; if not, run `pip install awscli`
# The "-" prefix says to ignore the exit status, and continue executing even
# if the command fails. This is so we can clean up the files when we're done,
# even if deployment failed.
```
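Note that update-function-code assumes the lambda function already exists. If you're starting from scratch, you'd create it once by hand first; here's a sketch, where the role ARN, runtime, and handler are placeholders you'd replace with your own:

```shell
# One-time setup (not part of the Makefile): create the function so that
# `make deploy` has something to update. All values below are examples.
aws lambda create-function \
    --function-name my-lambda-function \
    --runtime python3.9 \
    --role arn:aws:iam::123456789012:role/my-lambda-role \
    --handler main.handler \
    --zip-file fileb://output.zip
```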
And finally, clean up files
```makefile
.PHONY: clean
clean: ## Remove zipped files and directories generated during build
	rm output.zip
	rmdir libs
```
Then, we'll want to run all of these targets with a single command:
```makefile
# `make all` will execute all prerequisite targets in order!
.PHONY: all
all: test deploy clean
```
Make is a powerful tool for running a series of commands, and a useful build automation tool when deploying to Lambda. We use it to test and zip our Python source code along with its dependencies, and push the code to AWS.
Now, with a single command in your terminal, make all, we have our continuous integration pipeline set up for Lambda!
Make can do a lot more with its special syntax, so if you'd like to learn more, check out this tutorial.