
Mohammad Awwaad


The Zero-Cost Cloud Engineer Part 4: Cloud Storage, Secret Manager, and the Legacy Access Trap

The Zero-Cost Cloud Engineer

Part 4: Hybrid Storage, Secrets, and the Legacy VM Trap

In our previous tutorials, we secured an internet-less Compute Engine VM, established centralized logging, and decoupled our architecture with Pub/Sub. Now we hit the next major architectural bottleneck: our 30 GB hard drive limit.

If you let users upload files directly to your VM's block storage, you will quickly exhaust the Free Tier's 30 GB disk, filling the filesystem and crashing your OS. Resilient architectures offload files to object storage (Google Cloud Storage) and never hardcode connection properties.

This tutorial covers integrating Google Cloud Storage (GCS) and Secret Manager into a Spring Boot application, entirely zero-cost.


Step 1: Provisioning Hybrid Storage

In Google Cloud Storage (GCS), you don't have "folders"; you have "Buckets" filled with "Objects." For the Always Free tier, GCP gives you 5 GB-months of Standard Storage per month, but only in the us-east1, us-west1, and us-central1 regions.

🚨 The FinOps Trap (Soft Delete): A "GB-month" is prorated over time. If you upload a 5 GB file and delete it 24 hours later, you consume only a small fraction of a GB-month. However, Google now enables Soft Delete by default with a 7-day retention period. If you delete files to stay under the 5 GB limit, Soft Delete silently retains them for another week, counting against your quota and triggering a surprise billing alert.
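To make the proration concrete, here is a back-of-the-envelope calculation (a sketch; Google actually meters storage in finer-grained increments, but the proration works out the same):

```shell
# 5 GB stored for 1 day out of a 30-day month:
awk 'BEGIN { printf "5 GB x (1/30) month = %.3f GB-months\n", 5 / 30 }'

# The same 5 GB lingering in Soft Delete for its full 7-day retention window:
awk 'BEGIN { printf "5 GB x (7/30) month = %.3f GB-months\n", 5 * 7 / 30 }'
```

The second number is the trap: a file you believe is gone can consume roughly seven times the quota of the one-day upload that created it.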

To provision a strictly free bucket:

  1. Navigate to Cloud Storage Buckets in the GCP Console and click Create.
  2. Give it a globally unique name (e.g., zero-cost-bucket-yourname).
  3. Location: You must choose a Region that matches your VM (e.g., us-east1) to avoid data egress charges.
  4. Storage Class: Choose Standard.
  5. Protection: Expand "Choose how to protect object data" and Disable Soft Delete (or set retention to 0 days) to prevent hidden quota consumption.
  6. Click Create.
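If you prefer the CLI, the same bucket can be provisioned in one command. This is a sketch: the bucket name is a placeholder, and `--soft-delete-duration=0` is the flag that disables Soft Delete in current `gcloud` releases (verify against your installed version):

```shell
gcloud storage buckets create gs://zero-cost-bucket-yourname \
    --location=us-east1 \
    --default-storage-class=STANDARD \
    --uniform-bucket-level-access \
    --soft-delete-duration=0
```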

Step 2: The Security Vault (Secret Manager)

We don't want to hardcode our new bucket name into our source code; we want to fetch it securely on boot. Secret Manager's free tier covers 6 active secret versions and 10,000 access operations per month.

  1. Navigate to Secret Manager and click CREATE SECRET.
  2. Name: GCS_BUCKET_NAME
  3. Secret Value: your-bucket-name
  4. Leave replication as Automatic (Global) and click Create.
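The same secret can be created from the CLI (a sketch; piping the value with `echo -n` avoids a trailing newline sneaking into the secret):

```shell
echo -n "zero-cost-bucket-yourname" | \
    gcloud secrets create GCS_BUCKET_NAME \
        --replication-policy="automatic" \
        --data-file=-
```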

Step 3: Conquering the Legacy "Access Scopes" Trap

Your Compute Engine VM runs as a service account (an IAM identity). In the GCP Console (IAM & Admin), you attach the Storage Object Admin and Secret Manager Secret Accessor roles to it. You assume you're finished.
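Those console clicks map to two `gcloud` commands (a sketch: PROJECT_ID and the service-account email are placeholders for your project's values; new GCE VMs default to the Compute Engine default service account):

```shell
# Grant the VM's service account write access to GCS objects
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --role="roles/storage.objectAdmin"

# Grant it read access to secret payloads
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --role="roles/secretmanager.secretAccessor"
```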

🚨 The Legacy Architecture Trap: When you attempt to upload a file in Java, you instantly receive a PERMISSION_DENIED exception. Why? GCE VMs are handcuffed by a legacy system called "Access Scopes."

When you create a VM with GUI defaults, it applies the "Default access" scope set, which throttles storage to devstorage.read_only and omits Secret Manager entirely. Effective access is the intersection of IAM roles and access scopes, so these legacy scopes cap your shiny new IAM admin privileges!

The Architect's Fix: We must transition the VM to modern IAM validation by granting it the cloud-platform scope.

  1. Run these commands from your local laptop to strip the legacy handcuffs:
# 1. Stop the instance
gcloud compute instances stop free-tier-vm --zone=us-east1-b

# 2. Grant full API access (Delegating authority completely to IAM)
gcloud compute instances set-service-account free-tier-vm \
    --zone=us-east1-b \
    --scopes=https://www.googleapis.com/auth/cloud-platform

# 3. Restart the instance
gcloud compute instances start free-tier-vm --zone=us-east1-b
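To confirm the change took effect, you can inspect the instance's attached scopes (a sketch; the `--format` expression plucks just the scopes field from the describe output):

```shell
gcloud compute instances describe free-tier-vm \
    --zone=us-east1-b \
    --format="value(serviceAccounts[].scopes)"
```

You should now see the single https://www.googleapis.com/auth/cloud-platform scope instead of the legacy default list.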

Step 4: The Spring Boot Integration

With modern Spring Boot 3.4+ / 4.x, the config data loader evaluates cloud imports incredibly early. To prevent local tests running on your laptop from blindly seeking GCP credentials, use a profile-guarded application.yaml:

spring:
  application:
    name: hello-gcp
---
spring:
  config:
    activate:
      on-profile: "!test"
    import: sm://
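The `sm://` import only resolves if the Secret Manager starter is on the classpath. Assuming you already import the Spring Cloud GCP BOM in your `pom.xml`, the dependencies look roughly like this (artifact IDs taken from current `spring-cloud-gcp` releases; verify against the version you pin):

```xml
<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>spring-cloud-gcp-starter-secretmanager</artifactId>
</dependency>
<dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>spring-cloud-gcp-starter-storage</artifactId>
</dependency>
```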

Now, create a controller that demands its configuration strictly from the Secret Manager vault during instantiation:

import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.multipart.MultipartFile;
import java.io.IOException;

@RestController
@RequestMapping("/api/files")
public class GcsUploadController {

    private final Storage storage;
    private final String bucketName;

    // The sm:// prefix forces a fetch from GCP Secret Manager!
    public GcsUploadController(Storage storage, @Value("${sm://GCS_BUCKET_NAME}") String bucketName) {
        this.storage = storage;
        this.bucketName = bucketName;
    }

    @PostMapping("/upload")
    public String uploadFile(@RequestParam("file") MultipartFile file) throws IOException {
        BlobInfo blobInfo = BlobInfo.newBuilder(BlobId.of(bucketName, file.getOriginalFilename()))
                .setContentType(file.getContentType())
                .build();

        storage.create(blobInfo, file.getBytes());
        return "Uploaded " + file.getOriginalFilename() + " directly to secure bucket!";
    }
}

Note: For local Maven builds (mvn clean package), remember to create a test profile that uses @MockitoBean and spring.cloud.gcp.core.enabled=false to bypass these enterprise connections.
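A minimal sketch of that guard as an extra profile document in `application.yaml` (the `spring.cloud.gcp.*` property names come from the `spring-cloud-gcp` autoconfiguration; verify them against the version you use):

```yaml
---
spring:
  config:
    activate:
      on-profile: "test"
  cloud:
    gcp:
      core:
        enabled: false
      secretmanager:
        enabled: false
```

With this in place, `mvn clean package` runs the test profile entirely offline while the default profile still pulls its configuration from Secret Manager on the VM.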


Step 5: End-to-End Verification

Deploy your updated .jar to your internet-less VM via Identity-Aware Proxy (IAP) as we outlined in past tutorials.

To verify the file upload, we bridge our local laptop to the private server using port forwarding:

  1. Establish a Local Tunnel:

    gcloud compute ssh free-tier-vm --tunnel-through-iap -- -L 8080:localhost:8080
    
  2. Trigger the Upload via cURL:

    curl -F "file=@/path/to/any/local_test.jpg" http://localhost:8080/api/files/upload
    
  3. Validate: The terminal will return "Uploaded local_test.jpg directly to secure bucket!". If you refresh the GCP Console Storage Browser, you will see your file sitting securely in the cloud, completely detached from your VM's tiny hard drive.


Summary

By identifying the "Soft Delete" billing trap, locking credentials inside Secret Manager, tearing down the legacy Compute Engine "Access Scopes," and streaming files natively to GCS, you have built a production-grade file repository without spending a dime.

In Part 5, we will bring the entire ecosystem under Infrastructure as Code with Terraform.
