Weslley Ribeiro

Renaming and Migrating Google Cloud Storage Buckets Using Storage Transfer Service

Comparing Speed

gcloud storage cp

Average throughput: 14.1 MiB/s

gcloud transfer jobs create

Average throughput: 604.96 MiB/s

Google recommends the following guidelines:

  • Transferring less than 1 TB: use gcloud storage cp.
  • Transferring more than 1 TB: use Storage Transfer Service.
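As a rough sketch, the two approaches look like this (bucket names and the job description are placeholders, not values from this tutorial):

```shell
# Small transfers (< 1 TB): plain recursive copy with gcloud storage
gcloud storage cp --recursive gs://SOURCE_BUCKET/* gs://DEST_BUCKET/

# Large transfers (> 1 TB): a one-time Storage Transfer Service job
gcloud transfer jobs create gs://SOURCE_BUCKET gs://DEST_BUCKET \
  --description="one-time bucket migration"
```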

In practice, if you're moving more than about 0.4 TB, STS is already the more efficient and safer choice.

Utilizing Storage Transfer Service (STS)

Migrating and renaming Google Cloud Storage (GCS) buckets using Storage Transfer Service is the fastest and safest strategy for large datasets.

Why STS?

  • Parallelization: Transfers multiple files simultaneously.
  • Resilience: Handles failures and retries automatically.
  • Scheduling: Allows for one-time or recurring transfers.
  • Metadata Preservation: Retains object metadata during transfers. You can learn more about this in the Storage Transfer Service documentation.

In this tutorial, the challenge is to migrate a GCS bucket to a new project. Using gcloud storage cp for large datasets, however, is slow and inefficient, so we'll solve it with STS.

Before Executing

  • Enable the Storage Transfer Service API in both projects.
  • The first transfer job (source → temp) is created in the source project.
  • Ensure the logged-in Service Account has the necessary roles assigned:

roles/storage.admin
roles/storagetransfer.admin
roles/iam.securityAdmin (optional, for IAM policy manipulation)
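Granting these roles to the account running the scripts might look like the following (the project ID, bucket name, and service-account email are placeholders you'd replace with your own):

```shell
# Grant bucket-level storage admin on the source bucket
gcloud storage buckets add-iam-policy-binding gs://SOURCE_BUCKET \
  --member="serviceAccount:SA_EMAIL" \
  --role="roles/storage.admin"

# Grant project-level Storage Transfer admin
gcloud projects add-iam-policy-binding SOURCE_PROJECT_ID \
  --member="serviceAccount:SA_EMAIL" \
  --role="roles/storagetransfer.admin"
```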

Executing First Script

  • Clone the GitHub repository to follow this tutorial. It contains two scripts you’ll need to fill in with your environment variables; follow the included README.
  • Run the first script
bash 1_script_source_to_temp.sh

The script will:

  1. Create a temporary bucket
  2. Copy IAM policies from the original bucket to policies.json
  3. Let you edit policies.json to keep existing permissions and add new ones
  4. Apply all policies to the temporary bucket on confirmation
  5. Grant the required roles to the Storage Transfer Service account on the source and temp buckets
  6. Create a job and initiate transfer from source to temporary bucket

To monitor detailed progress (percentage, transfer speed, files copied), use the Google Cloud Console:
Console: https://console.cloud.google.com/transfer/
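You can also check progress from the CLI (the job and operation names below are placeholders):

```shell
# List operations for a given transfer job, including their status
gcloud transfer operations list --job-names=JOB_NAME

# Inspect a single operation in detail
gcloud transfer operations describe OPERATION_NAME
```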

Executing Second Script

  • Since the temp bucket and the new original bucket are in the same project (the new project), the transfer job from temp to new original is created in the new project.
  • You’ll need to fill in with your environment variables. Follow the included README.
  • Run the second script
bash 2_script_temp_to_new_source.sh

The script will:

  1. ⚠️ After ensuring all data is transferred, delete the Original Bucket
  2. Recreate the Bucket with the Original Name in the new project
  3. Copy IAM policies from the temp bucket to policies.json
  4. Let you edit policies.json to keep existing permissions and add new ones
  5. Apply all policies to the new Original Bucket on confirmation
  6. Grant the required roles to the Storage Transfer Service account on the temp and new original buckets
  7. Create a job and initiate transfer from temporary to new original bucket
  8. ⚠️ After ensuring all data is transferred, delete the temporary bucket
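In sketch form, the destructive and transfer steps above could look like this (again, names are placeholders and the repository script is authoritative; verify every transfer before deleting anything):

```shell
# 1. Remove the original bucket and its contents
#    (only after verifying the source-to-temp transfer completed!)
gcloud storage rm --recursive gs://SOURCE_BUCKET

# 2. Recreate the bucket with the original name in the new project
gcloud storage buckets create gs://SOURCE_BUCKET --project=DEST_PROJECT_ID

# 7. Transfer from temp into the recreated bucket
gcloud transfer jobs create gs://TEMP_BUCKET gs://SOURCE_BUCKET \
  --description="temp-to-new-original migration"

# 8. Delete the temporary bucket once this transfer is verified
gcloud storage rm --recursive gs://TEMP_BUCKET
```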

Conclusion

At the end, two transfer jobs will be created:

  • One in the source project (source → temp)

  • One in the destination project (temp → renamed bucket)

Both are one-time transfers using Storage Transfer Service, ensuring a high-speed, metadata-preserving, and downtime-free bucket migration.
