Writing backend code - like web services or anything else really - with AWS Lambda functions is amazingly easy, in particular when you choose Node.js as your weapon of choice. The amount of code required to get going is so sparse, it's almost magical. However, as you build out your Lambda, complexity will quickly rear its head, and you'll soon feel the need to add some tests.
Unit testing is part of any good developer's workflow, but I feel it's especially important when dealing with dynamically typed languages like vanilla JavaScript. Its loose typing makes development fast, but it also introduces a degree of uncertainty when making changes or refactoring. Good test coverage can make up for this, and it can allow you to work faster. If you're able to mock your Lambda's dependencies, you can be reasonably confident that a passing unit test is representative of the eventual production code.
Dependency Injection
"Dependency Injection" is the somewhat intimidating term used in software engineering to describe something quite simple:
Dependency injection is a programming technique that makes a class independent of its dependencies. It achieves that by decoupling the usage of an object from its creation.
It's most useful when applied in the context of unit testing because it enables you to mock dependencies that shouldn't be active during tests.
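In JavaScript terms, the difference is whether a function builds its own dependencies or receives them from the caller. Here's a minimal sketch (all names are hypothetical, for illustration only):

```javascript
// Tightly coupled: the function constructs its own dependency,
// so a test can't substitute a fake database.
function saveUserCoupled(user) {
  const db = connectToDatabase() // hypothetical; hard to mock
  return db.put(user)
}

// Dependency injection: the caller supplies the dependency,
// so a test can pass in a stub instead of a live connection.
function saveUser(db, user) {
  return db.put(user)
}

// In a test, no real database is involved:
const fakeDb = { put: user => ({ ok: true, user }) }
saveUser(fakeDb, { name: 'Ada' }) // returns { ok: true, user: { name: 'Ada' } }
```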
In Node.js Lambda functions, dependencies are imported using the require() function. Calling it returns the module's exports, which you typically assign to a constant. By default, you'll do this at the top level of your Node.js file, effectively making the dependency globally accessible to said file. Consider this snippet, where we're importing the AWS SDK and creating a new instance of the DynamoDB DocumentClient:
```javascript
const AWS = require('aws-sdk')
const documentClient = new AWS.DynamoDB.DocumentClient()
```
What happens when you unit test code that imports the above dependency? In this case, your test will establish a live connection to DynamoDB and potentially start reading and writing data to it! While you could argue this is a test in and of itself, this situation is far from ideal. Each unit test invocation will
- potentially incur costs
- write data to a live database, possibly messing up its consistency
- be slow
Richard Hyatt's Medium post from 2016 is still relevant today, as it describes how we can make dependency loading asynchronous and injectable by using the exports object to store and reference dependencies.
```javascript
exports.deps = () => {
  const AWS = require('aws-sdk')
  const documentClient = new AWS.DynamoDB.DocumentClient()
  return Promise.resolve({
    dynamoBatchWrite: params => documentClient.batchWrite(params).promise()
  })
}
```
The actual dependency import is enclosed in the deps function's scope, and is made asynchronous by wrapping the result object in a Promise. This asynchronicity allows us to overwrite the deps function during tests, while leaving it as-is in production.
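The deferral is easy to demonstrate in isolation: wrapping work in a function means nothing runs until the function is actually invoked, which is what gives the test a window to swap it out. A tiny standalone sketch:

```javascript
let loaded = false

// Stand-in for the deps function: the body, which would normally
// call require('aws-sdk'), only runs when invoked.
const deps = () => {
  loaded = true
  return Promise.resolve({})
}

console.log(loaded) // false: defining deps ran nothing
deps()
console.log(loaded) // true: the body ran only upon invocation
```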
The production code will just await the dependencies at the top, after which you'll be able to access the fully constructed dependencies:
```javascript
exports.handler = async event => {
  const deps = await exports.deps()
  // ...
}
```
Now, for the test:
```javascript
require('chai').should()
const lambda = require('../index')
const sinon = require('sinon')

describe('importOPML', () => {
  beforeEach('mock dependencies', () => {
    const mockWriter = sinon.mock()
    mockWriter.resolves({ UnprocessedItems: [] })
    lambda.deps = () => Promise.resolve({
      dynamoBatchWrite: mockWriter
    })
  })

  it('should succeed with empty opml', async () => {
    // Using lambda here will call the version that uses the mocked DynamoDB writer.
  })
})
```
This happens to be a Chai test that uses Sinon for mocking, but the premise is the same. Before each test block is run, the beforeEach block is executed, which preps the lambda with the mock dependencies.
That's it. You're off to the races!
Top comments (8)
Great info. In your sample test, does the require('../index') not load the non-mocked lambda code first, before Sinon (via beforeEach) gets a chance to mock it out? Is there a GitHub repo? Thanks for your time.
The `deps` property does get created first, along with the rest of the lambda, but it gets overwritten in `beforeEach`. Because the `deps` property is wrapped in a `Promise`, it's not actually run immediately, allowing it to be overwritten by the test if necessary.

Thanks for the info. In the example given, will running the unit tests still result in connecting to the DynamoDB database via `const documentClient = new AWS.DynamoDB.DocumentClient()`?
I double-checked by setting a breakpoint, and it's not being triggered. Notice that the `deps` property is actually a function: its body contents are only run when `deps` is invoked as a function (in this case asynchronously).
Perfect, thanks for your time and this great article.
Alex, you are calling `const deps = await exports.deps()` inside the handler. Doesn't that mean every time the handler is executed, it will create an instance of `AWS.DynamoDB.DocumentClient`? Isn't it recommended for Lambdas to have all initialization code outside the handler? I assume that behind the scenes, a container is created and stays warm as long as you keep hitting it periodically, and any initialization outside the handler is called once per container. Just a thought.

Thanks for the suggestions here. I implemented something similar after reading this, but I'm wondering if there is really a need for the promise. In my case, what I did was create an `exports.deps` that returns an object like `{ s3: new AWS.S3() }`, and then I also created an `exports.initDeps`, which gets the object from `exports.deps` and assigns it to a top-level variable. I have a fairly complex lambda, and I wanted to be able to test functions independently. So now the handler is the only function that calls `exports.deps`, and in my test code I overwrite `initDeps` and then call any of the functions.
Would this work with AVA tests where tests are run in parallel?