
Michael Masterson

Originally published at Medium

Building a Serverless REST API with Go and AWS Lambda


November 11, 2025

I’ve built Go Lambda APIs a few different ways over the years. Some of those experiments are in production right now — including the API that powers this site. What follows is the pattern I’ve landed on: clean, minimal, and straightforward to operate.

The short version: Go is one of the best Lambda runtimes available. Fast cold starts, a single static binary, a strong standard library, and a compiler that catches whole categories of runtime errors before they reach production. If you’re starting a new serverless API and aren’t already committed to another language, this is the stack I’d reach for.

Why Go specifically

Lambda cold starts are real. A Go function compiled to a Linux arm64 binary typically initializes in under 100ms — often under 50ms. Compare that to Node.js or Python runtimes that need to load an interpreter and dependencies before your first line of code runs. For a public-facing API, that difference matters.

The other thing Go gives you is a single self-contained binary. No node_modules, no virtual environments, no dependency resolution at deploy time. You compile locally, upload a zip file, and the Lambda runtime executes it directly. That simplicity pays dividends when you're debugging at 2am.

Project structure

Here’s how I structure a Go Lambda project in a monorepo:

```
apps/api/
├── contact/
│   └── main.go      # POST /contact handler
├── dashboard/
│   └── main.go      # Auth-gated routes
└── shared/
    ├── models.go    # DynamoDB item structs
    ├── dynamo.go    # Query/put helpers
    ├── mailer.go    # SES wrapper
    └── util.go      # ID generation, CORS headers
```

Each Lambda function gets its own directory with a main.go. Common code lives in shared/ as an internal package. This avoids the mistake of deploying a massive monolith Lambda that does everything — instead, each function has a focused responsibility and independent deploy surface.

The shared package is where you put things that would otherwise be copy-pasted: DynamoDB helpers, the SES wrapper, CORS headers, ID generation. Keep it lean. If something is only used in one function, it stays in that function's package.
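As an illustration, a minimal CORS helper for `shared/util.go` could look like this. The exact parameter order and header set are assumptions, not the article's actual implementation, and it's shown as a standalone program so the sketch runs on its own:

```go
package main

import "fmt"

// CORSHeaders builds the response headers each handler attaches.
// In the project this would live in the shared package.
func CORSHeaders(origin, methods, headers string) map[string]string {
	return map[string]string{
		"Access-Control-Allow-Origin":  origin,
		"Access-Control-Allow-Methods": methods,
		"Access-Control-Allow-Headers": headers,
		"Content-Type":                 "application/json",
	}
}

func main() {
	h := CORSHeaders("https://example.com", "GET, POST, OPTIONS", "Content-Type, Authorization")
	fmt.Println(h["Access-Control-Allow-Origin"])
}
```

Because every handler sends JSON, baking `Content-Type` into the same map keeps each handler to a single headers value.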

The handler pattern

Every Lambda handler has the same shape: receive an events.APIGatewayV2HTTPRequest, return an events.APIGatewayV2HTTPResponse. I put all routing inside a single handler function using a switch on method + path:

```go
func handler(ctx context.Context, req events.APIGatewayV2HTTPRequest) (events.APIGatewayV2HTTPResponse, error) {
    headers := shared.CORSHeaders(allowedOrigin, "GET, POST, OPTIONS", "Content-Type, Authorization")
    method := req.RequestContext.HTTP.Method
    path := req.RequestContext.HTTP.Path

    if method == "OPTIONS" {
        return events.APIGatewayV2HTTPResponse{StatusCode: 204, Headers: headers}, nil
    }

    switch {
    case method == "POST" && path == "/contact":
        return handleContact(ctx, req, headers)
    case method == "GET" && path == "/health":
        return events.APIGatewayV2HTTPResponse{StatusCode: 200, Headers: headers, Body: `{"ok":true}`}, nil
    default:
        return errResponse(404, "not found", headers), nil
    }
}
```

This is simpler than it looks. You don’t need a router library for a Lambda that handles 3–5 routes. The switch is readable, the cases are explicit, and there’s no magic. If a function grows beyond ~10 routes, that’s a signal it should be split.
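The `errResponse` helper used in the default case isn't shown above. A plausible sketch follows, using a local stand-in struct in place of `events.APIGatewayV2HTTPResponse` so it compiles without the aws-lambda-go dependency; the real helper would return that type with the same three fields:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// response stands in for events.APIGatewayV2HTTPResponse so this
// sketch runs on its own.
type response struct {
	StatusCode int
	Headers    map[string]string
	Body       string
}

// errResponse wraps an error message in a small JSON body so every
// error the API returns has the same shape.
func errResponse(status int, msg string, headers map[string]string) response {
	body, _ := json.Marshal(map[string]string{"error": msg})
	return response{StatusCode: status, Headers: headers, Body: string(body)}
}

func main() {
	r := errResponse(404, "not found", map[string]string{"Content-Type": "application/json"})
	fmt.Println(r.StatusCode, r.Body) // 404 {"error":"not found"}
}
```

Marshaling the message rather than string-concatenating it means a quote or backslash in `msg` can't break the JSON.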

Reading the JWT for auth-gated routes

API Gateway v2 with a Cognito JWT authorizer makes auth simple. The claims are injected into the request context — no token validation code needed in your Lambda:

```go
func jwtEmail(req events.APIGatewayV2HTTPRequest) string {
    auth := req.RequestContext.Authorizer
    if auth == nil || auth.JWT == nil {
        return ""
    }
    return auth.JWT.Claims["email"]
}

func isAdmin(req events.APIGatewayV2HTTPRequest) bool {
    auth := req.RequestContext.Authorizer
    if auth == nil || auth.JWT == nil {
        return false
    }
    // API Gateway stringifies the groups array, e.g. "[admin users]".
    return strings.Contains(auth.JWT.Claims["cognito:groups"], "admin")
}
```

The Cognito authorizer rejects requests with invalid or expired tokens before they reach your code. Your handler only runs for requests that passed auth — so a missing or empty email is a programming error, not a security gap.
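One caveat with `strings.Contains`: since the `cognito:groups` claim arrives as a stringified list like `"[admin users]"`, a substring check would also match a group named `admin-readonly`. A stricter, exact-match helper (hypothetical, stdlib only) splits the claim first:

```go
package main

import (
	"fmt"
	"strings"
)

// hasGroup does an exact-match lookup in a Cognito groups claim,
// which the JWT authorizer delivers as a stringified list such
// as "[admin users]".
func hasGroup(claim, group string) bool {
	for _, g := range strings.Fields(strings.Trim(claim, "[]")) {
		if g == group {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(hasGroup("[admin users]", "admin"))    // true
	fmt.Println(hasGroup("[admin-readonly]", "admin")) // false
}
```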

Building and deploying with CDK

The build step is straightforward. For each Lambda, you cross-compile to Linux arm64 and zip the binary:

```bash
# CGO_ENABLED=0 guarantees a fully static binary; lambda.norpc drops the unused RPC mode
CGO_ENABLED=0 GOOS=linux GOARCH=arm64 go build -tags lambda.norpc -o bootstrap ./apps/api/contact/
zip contact.zip bootstrap
```

The CDK stack in Go defines the function pointing at that zip:

```go
contactFn := awslambda.NewFunction(stack, jsii.String("ContactFn"), &awslambda.FunctionProps{
    Runtime:      awslambda.Runtime_PROVIDED_AL2023(),
    Architecture: awslambda.Architecture_ARM_64(), // must match the GOARCH=arm64 build
    Handler:      jsii.String("bootstrap"),
    Code:         awslambda.Code_FromAsset(jsii.String("../../apps/api/contact.zip"), nil),
    Environment: &map[string]*string{
        "TABLE_NAME":     table.TableName(),
        "FROM_EMAIL":     jsii.String(fromEmail),
        "ALLOWED_ORIGIN": jsii.String(allowedOrigin),
    },
})
```

Wire the function to API Gateway, add a Cognito JWT authorizer for protected routes, and deploy. The whole stack — Lambda, API Gateway, DynamoDB table, IAM roles — comes up in a single cdk deploy.
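A rough sketch of that wiring, assuming aws-cdk-lib v2's stable `awsapigatewayv2` modules and the `stack`, `contactFn`, and `table` variables from above (construct IDs are placeholders; verify the module paths against your CDK version):

```go
// Infrastructure sketch, not a complete stack.
httpApi := awsapigatewayv2.NewHttpApi(stack, jsii.String("Api"), nil)

httpApi.AddRoutes(&awsapigatewayv2.AddRoutesOptions{
	Path:    jsii.String("/contact"),
	Methods: &[]awsapigatewayv2.HttpMethod{awsapigatewayv2.HttpMethod_POST},
	Integration: awsapigatewayv2integrations.NewHttpLambdaIntegration(
		jsii.String("ContactIntegration"), contactFn, nil),
})

// Least privilege: grant the function only the table access it needs.
table.GrantReadWriteData(contactFn)
```

Protected routes take a Cognito authorizer via the `Authorizer` field of `AddRoutesOptions`.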

The gotchas

A few things that aren’t obvious until you hit them:

The binary must be named bootstrap. The provided.al2023 runtime looks for a file named exactly bootstrap in the zip. Name it anything else and you'll get a cryptic runtime error on first invocation.

CORS preflight has to be handled explicitly. API Gateway v2 can handle CORS for you, or your Lambda can. Do one or the other — not both. If your Lambda returns CORS headers and the gateway also injects them, you’ll get duplicate headers and browsers will reject the response.

Cold start is not your P99. Cold starts only happen when a new execution environment is provisioned. Under steady traffic, most requests hit warm instances. Don’t optimize for cold start at the cost of readability — it’s rarely the bottleneck you think it is.

DynamoDB errors are usually IAM. If your Lambda can’t read or write to DynamoDB, check the execution role permissions before assuming your code is wrong. The AWS SDK swallows permission errors in ways that make them look like connection issues.

Is it worth it?

For the right workload — yes, absolutely. A Go Lambda API with DynamoDB behind API Gateway has essentially zero operational overhead. No servers to patch, no auto-scaling groups to tune, no capacity planning for moderate traffic. You pay for what you use, and Go’s efficiency means you use very little.

Where it breaks down: workloads that need persistent connections (WebSockets, long-running jobs), very high concurrency with unpredictable bursts (Lambda has account-level concurrency limits), or anything that genuinely benefits from a stateful server process. Know the constraints going in and it’s a powerful tool.

At M²S² Engineering Group, we help teams navigate exactly these kinds of architecture decisions — whether that’s serverless, containers, or something in between. If you’re weighing your options or just want a second opinion, let’s talk.
