DEV Community

Anand Rathnas

Terraform aws_s3_object Data Source: Why Your `body` Might Be Null

I just spent an hour debugging a Terraform failure that had a surprisingly simple cause: the aws_s3_object data source's body attribute was returning null even though the file existed in S3.

The Setup

I store SSH public keys in S3 and read them during Terraform runs to create EC2 key pairs:

data "aws_s3_object" "ssh_public_key" {
  bucket = "my-tfstate-bucket"
  key    = "ssh/prod_key.pub"
}

locals {
  ssh_public_key = try(trimspace(data.aws_s3_object.ssh_public_key.body), null)
}

resource "aws_key_pair" "server" {
  key_name   = "my-server-key"
  public_key = local.ssh_public_key

  lifecycle {
    precondition {
      condition     = local.ssh_public_key != null && local.ssh_public_key != ""
      error_message = "SSH public key must be available in S3."
    }
  }
}

The Error

Error: Resource precondition failed

  on security.tf line 241, in resource "aws_key_pair" "server":
  241:       condition     = local.ssh_public_key != null && local.ssh_public_key != ""
    ├────────────────
    │ local.ssh_public_key is null

SSH public key must be available in S3.

But the file DEFINITELY exists! I verified with AWS CLI:

$ aws s3 ls s3://my-tfstate-bucket/ssh/
2025-12-18 05:42:27        734 prod_key.pub

The Problem

The aws_s3_object data source's body attribute has a footnote in the docs:

body - (Optional, Computed) Object data (see limitations to understand cases in which this field is actually available)

The limitations include:

  1. Binary files won't have a body
  2. Large files won't have a body
  3. Files with certain content-types might not have a body

The file was uploaded with aws s3 cp, which guesses the content type from the file extension. `.pub` isn't in the standard MIME map, so the CLI most likely fell back to binary/octet-stream; the provider treats that as binary data and leaves body unset.

The Solution

Use an external data source with AWS CLI instead:

data "external" "ssh_public_key" {
  program = ["bash", "-c", <<-EOF
    KEY_B64=$(aws s3 cp s3://${local.bucket}/ssh/${var.environment}_key.pub - 2>/dev/null | base64 | tr -d '\n' || echo "")
    echo "{\"key_b64\": \"$KEY_B64\"}"
  EOF
  ]
}

locals {
  ssh_public_key_b64 = lookup(data.external.ssh_public_key.result, "key_b64", "")
  ssh_public_key     = local.ssh_public_key_b64 != "" ? trimspace(base64decode(local.ssh_public_key_b64)) : ""
}
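One subtlety in the script above: the external data source requires the program to print valid JSON even when the S3 read fails, otherwise Terraform aborts the whole plan with a parse error. The failure path can be checked locally, with `false` standing in for a failed `aws s3 cp`:

```shell
# Simulate `aws s3 cp` failing (object missing): `false` produces no output,
# so base64 encodes an empty stream and KEY_B64 ends up empty
KEY_B64=$(false 2>/dev/null | base64 | tr -d '\n' || echo "")

# The script still emits well-formed JSON with an empty value
echo "{\"key_b64\": \"$KEY_B64\"}"
```

The ternary in the locals block then maps that empty string to an empty key instead of crashing the run.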

Why Base64?

The external data source requires its program to print a single JSON object on stdout. Building that JSON by hand with echo breaks the moment the content contains a double quote, backslash, or newline, and an SSH key's comment field can contain any of those. Base64 encoding reduces the content to a character set that is always JSON-safe.
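A concrete demonstration (the key string below is made up): a double quote in the key's comment field corrupts naively interpolated JSON, while the base64 round-trip survives anything:

```shell
# Hypothetical key whose comment field contains double quotes
KEY='ssh-ed25519 AAAAC3Nza... user@"build"host'

# Naive interpolation: the embedded quotes produce invalid JSON
printf '%s\n' "{\"key\": \"$KEY\"}"

# Base64 round-trip: the encoded form uses only JSON-safe characters
KEY_B64=$(printf '%s' "$KEY" | base64 | tr -d '\n')
printf '%s\n' "{\"key_b64\": \"$KEY_B64\"}"
printf '%s' "$KEY_B64" | base64 -d
```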

Alternative Solutions

1. Fix the Content-Type at Upload

aws s3 cp key.pub s3://bucket/ssh/key.pub --content-type "text/plain"

If you control the upload, this might fix aws_s3_object.

2. Use aws_s3_object for Metadata Only

data "aws_s3_object" "ssh_key" {
  bucket = local.bucket
  key    = "ssh/key.pub"
}

# Just check it exists
locals {
  key_exists = data.aws_s3_object.ssh_key.content_length > 0
}

Then read the actual content via external data source.

3. Use local_file with aws s3 sync

# In CI/CD before terraform
aws s3 cp s3://bucket/ssh/key.pub ./key.pub

data "local_file" "ssh_key" {
  filename = "${path.module}/key.pub"
}

Why aws_s3_object Should Work (But Sometimes Doesn't)

In theory, for a 734-byte text file, the body should be populated. But I've seen issues with:

  • AWS provider version changes
  • S3 eventual consistency (largely historical; S3 has offered strong read-after-write consistency since December 2020)
  • Cross-account access
  • Missing content-type metadata

Wrapping the read in try() masks the difference between a real error and a legitimately null body, which makes debugging harder:

# This hides WHY body is null
ssh_public_key = try(data.aws_s3_object.key.body, null)

Debugging Tips

1. Check the object metadata

aws s3api head-object --bucket mybucket --key ssh/key.pub

Look for ContentType. If it's application/octet-stream, that might be the issue.
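Since head-object returns a JSON document, jq can pull out just the field in question. Here against a canned response (the values are illustrative, not from a real bucket):

```shell
# Extract ContentType from a sample head-object response
cat <<'EOF' | jq -r '.ContentType'
{
  "ContentLength": 734,
  "ContentType": "binary/octet-stream",
  "ETag": "\"0a1b2c3d\""
}
EOF
```

Against a live bucket the same filter works directly: `aws s3api head-object --bucket mybucket --key ssh/key.pub | jq -r '.ContentType'`.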

2. Try without try()

# Remove try() to see the actual error
ssh_public_key = trimspace(data.aws_s3_object.key.body)

3. Use terraform console

terraform console
> data.aws_s3_object.ssh_public_key

Lesson Learned

The aws_s3_object data source is great for checking if files exist and reading metadata. For reliably reading file content, especially in CI/CD pipelines, the external data source with AWS CLI is more robust.

# Reliable pattern for reading S3 text content.
# tr -d '\n' matters: base64 wraps its output in lines by default, and
# Terraform's base64decode() does not accept embedded newlines.
data "external" "file_content" {
  program = ["bash", "-c", "aws s3 cp s3://bucket/key - | base64 | tr -d '\n' | jq -Rs '{content: .}'"]
}

locals {
  content = base64decode(data.external.file_content.result.content)
}
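The S3 call aside, the jq/base64 plumbing can be verified entirely locally by substituting printf for `aws s3 cp`:

```shell
# Stand-in for `aws s3 cp s3://bucket/key -`
printf 'hello from s3' | base64 | tr -d '\n' | jq -Rs '{content: .}'

# Round-trip check: decode what Terraform's base64decode() would see
printf 'hello from s3' | base64 | tr -d '\n' \
  | jq -Rs '{content: .}' | jq -r '.content' | base64 -d
```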

Have you hit this issue? What workaround did you use? Let me know in the comments!

Building jo4.io - a URL shortener with analytics. Check it out at jo4.io
