<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Weslley Ribeiro</title>
    <description>The latest articles on DEV Community by Weslley Ribeiro (@weslley_ribeiro01).</description>
    <link>https://dev.to/weslley_ribeiro01</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2815658%2F890c89de-93d1-4c75-9d39-4585c5f7dbc7.png</url>
      <title>DEV Community: Weslley Ribeiro</title>
      <link>https://dev.to/weslley_ribeiro01</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/weslley_ribeiro01"/>
    <language>en</language>
    <item>
      <title>Renaming and Migrating Google Cloud Storage Buckets Using Storage Transfer Service</title>
      <dc:creator>Weslley Ribeiro</dc:creator>
      <pubDate>Fri, 23 May 2025 14:38:02 +0000</pubDate>
      <link>https://dev.to/weslley_ribeiro01/renaming-and-migrating-google-cloud-storage-buckets-using-storage-transfer-service-2ail</link>
      <guid>https://dev.to/weslley_ribeiro01/renaming-and-migrating-google-cloud-storage-buckets-using-storage-transfer-service-2ail</guid>
      <description>&lt;h2&gt;
  
  
  Comparing Speed
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;gcloud storage &lt;span class="nb"&gt;cp&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Average throughput: 14.1 MiB/s &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn14tld9587nf6jfj4v1j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn14tld9587nf6jfj4v1j.png" alt="Average throughput: 14.1MiB/s" width="800" height="65"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;gcloud transfer &lt;span class="nb"&gt;jobs &lt;/span&gt;create
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Average throughput: 604.96 MiB/s &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo40ew84gtvsn38f4204s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo40ew84gtvsn38f4204s.png" alt="Average throughput: 604.96 MiB/s" width="800" height="161"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Google recommends the following guidelines:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Transferring less than 1 TB: Use &lt;code&gt;gcloud storage cp&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Transferring more than 1 TB: Use Cloud Storage Transfer Service.&lt;/li&gt;
&lt;/ul&gt;
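&lt;p&gt;&lt;em&gt;For context, the two truncated commands benchmarked above expand to something like the following sketch; the bucket names are placeholders and your exact flags may differ:&lt;/em&gt;&lt;/p&gt;

```shell
# Plain copy, suitable for smaller datasets
gcloud storage cp --recursive gs://SOURCE_BUCKET/* gs://DESTINATION_BUCKET

# Storage Transfer Service job, suitable for larger datasets
gcloud transfer jobs create gs://SOURCE_BUCKET gs://DESTINATION_BUCKET
```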

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;If you're moving more than 0.4 TB, STS is already a more efficient and safer choice.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Utilizing Storage Transfer Service (STS)
&lt;/h2&gt;

&lt;p&gt;Migrating and renaming Google Cloud Storage (GCS) buckets using Storage Transfer Service is the fastest and safest strategy for large datasets.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why STS?
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Parallelization: Transfers multiple files simultaneously.&lt;/li&gt;
&lt;li&gt;Resilience: Handles failures and retries automatically.&lt;/li&gt;
&lt;li&gt;Scheduling: Allows for one-time or recurring transfers.&lt;/li&gt;
&lt;li&gt;Metadata Preservation: Retains object metadata during transfers.
You can learn more about this &lt;a href="https://cloud.google.com/storage-transfer/docs/cloud-storage-to-cloud-storage" rel="noopener noreferrer"&gt;here&lt;/a&gt;.
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In the &lt;a href="https://dev.to/weslley_ribeiro01/moving-and-renaming-google-cloud-storage-buckets-without-downtime-a-safe-strategy-part-1-2kcl"&gt;tutorial&lt;/a&gt;, the challenge was to migrate a GCS bucket to a new project. However, using &lt;code&gt;gcloud storage cp&lt;/code&gt; for large datasets can be slow and inefficient. We’ll solve this using STS.&lt;/p&gt;

&lt;h2&gt;
  
  
  Before Executing
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Enable the Storage Transfer Service API in both projects.&lt;/li&gt;
&lt;li&gt;The first transfer job (source → temp) is created in the source project.&lt;/li&gt;
&lt;li&gt;Ensure the logged-in Service Account has the necessary roles assigned:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;roles/storage.admin&lt;/code&gt;&lt;br&gt;
&lt;code&gt;roles/storagetransfer.admin&lt;/code&gt;&lt;br&gt;
&lt;code&gt;roles/iam.securityAdmin&lt;/code&gt; (optional, for IAM policy manipulation)&lt;/p&gt;
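&lt;p&gt;&lt;em&gt;A minimal sketch of granting these roles; &lt;code&gt;PROJECT_ID&lt;/code&gt; and &lt;code&gt;SA_EMAIL&lt;/code&gt; are placeholders for your project and service account:&lt;/em&gt;&lt;/p&gt;

```shell
# Grant the storage and transfer admin roles to the service account
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SA_EMAIL" \
  --role="roles/storage.admin"

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SA_EMAIL" \
  --role="roles/storagetransfer.admin"
```

&lt;p&gt;Repeat the bindings in both projects, since a transfer job is created on each side of the migration.&lt;/p&gt;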

&lt;h2&gt;
  
  
  Executing First Script
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Clone the GitHub &lt;a href="https://github.com/wessRibeiro/moving-buckets-sts" rel="noopener noreferrer"&gt;repository&lt;/a&gt; to execute this tutorial. It contains 2 scripts you’ll need to fill in with your environment variables. Follow the included &lt;code&gt;README&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Run the first script
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;bash 1_script_source_to_temp.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The script will:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a temporary bucket&lt;/li&gt;
&lt;li&gt;Copy IAM policies from the original bucket to &lt;code&gt;policies.json&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Let you edit &lt;code&gt;policies.json&lt;/code&gt; to keep existing permissions and add new ones&lt;/li&gt;
&lt;li&gt;Apply all policies to the temporary bucket on confirmation&lt;/li&gt;
&lt;li&gt;Grant the Storage Transfer Service account the required roles on the source and temp buckets&lt;/li&gt;
&lt;li&gt;Create a job and initiate transfer from source to temporary bucket&lt;/li&gt;
&lt;/ol&gt;
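&lt;p&gt;&lt;em&gt;The transfer-job step boils down to a command along these lines; the script's actual flags may differ, and the bucket names are placeholders:&lt;/em&gt;&lt;/p&gt;

```shell
# One-time transfer job from the source bucket to the temporary bucket
gcloud transfer jobs create gs://SOURCE_BUCKET gs://TEMP_BUCKET \
  --project=SOURCE_PROJECT_ID
```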

&lt;blockquote&gt;
&lt;p&gt;To monitor detailed progress (percentage, transfer speed, files copied), use the Google Cloud Console:&lt;br&gt;
Console: &lt;a href="https://console.cloud.google.com/transfer/" rel="noopener noreferrer"&gt;https://console.cloud.google.com/transfer/&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
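&lt;p&gt;&lt;em&gt;If you prefer the terminal, you can also list a job's operations; &lt;code&gt;JOB_NAME&lt;/code&gt; is a placeholder for the name shown when the job was created:&lt;/em&gt;&lt;/p&gt;

```shell
# Show the operations (runs) of a specific transfer job
gcloud transfer operations list --job-names=JOB_NAME
```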

&lt;h2&gt;
  
  
  Executing Second Script
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;The temp and new original buckets are in the same project (the new project), so the job from temp to new original will be created in the new project&lt;/li&gt;
&lt;li&gt;Fill in your environment variables. Follow the included &lt;code&gt;README&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Run the second script
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;bash 2_script_temp_to_new_source.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The script will:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;⚠️ After ensuring all data is transferred, delete the Original Bucket&lt;/li&gt;
&lt;li&gt;Recreate the Bucket with the Original Name in the new project&lt;/li&gt;
&lt;li&gt;Copy IAM policies from the temp bucket to &lt;code&gt;policies.json&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Let you edit &lt;code&gt;policies.json&lt;/code&gt; to keep existing permissions and add new ones&lt;/li&gt;
&lt;li&gt;Apply all policies to the new Original Bucket on confirmation&lt;/li&gt;
&lt;li&gt;Grant the Storage Transfer Service account the required roles on the temp and new original buckets&lt;/li&gt;
&lt;li&gt;Create a job and initiate transfer from temporary to new original bucket&lt;/li&gt;
&lt;li&gt;⚠️ After ensuring all data is transferred, delete the temporary bucket&lt;/li&gt;
&lt;/ol&gt;
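&lt;p&gt;&lt;em&gt;Before the destructive steps (1 and 8), it's worth double-checking that both buckets hold the same amount of data; the bucket names below are placeholders:&lt;/em&gt;&lt;/p&gt;

```shell
# Compare total sizes before deleting anything
gcloud storage du -s gs://TEMP_BUCKET
gcloud storage du -s gs://SOURCE_BUCKET
```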

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;At the end, two transfer jobs will be created:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;One in the source project (source → temp)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;One in the destination project (temp → renamed bucket)&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Both are one-time transfers using Storage Transfer Service, ensuring a high-speed, metadata-preserving, and downtime-free bucket migration.&lt;/p&gt;

</description>
      <category>google</category>
      <category>googlecloud</category>
      <category>devops</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Moving Google Cloud Scheduler between projects: A Safe Strategy</title>
      <dc:creator>Weslley Ribeiro</dc:creator>
      <pubDate>Thu, 15 May 2025 17:45:03 +0000</pubDate>
      <link>https://dev.to/weslley_ribeiro01/moving-google-cloud-scheduler-between-projects-a-safe-strategy-9pm</link>
      <guid>https://dev.to/weslley_ribeiro01/moving-google-cloud-scheduler-between-projects-a-safe-strategy-9pm</guid>
      <description>&lt;h1&gt;
  
  
  🔄 Use Case
&lt;/h1&gt;

&lt;p&gt;You have scheduled jobs using Cloud Scheduler in a project called &lt;strong&gt;Project A&lt;/strong&gt; (e.g., &lt;code&gt;develop&lt;/code&gt;), but now you want to migrate them to &lt;strong&gt;Project B&lt;/strong&gt;, where you're centralizing the infrastructure resources of your application.&lt;/p&gt;

&lt;p&gt;This approach follows Google's recommended best practices:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Centralization&lt;/strong&gt;: makes maintenance and control easier.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost organization&lt;/strong&gt;: improves billing tracking.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automation&lt;/strong&gt;: avoids manually recreating jobs one by one.
&lt;/li&gt;
&lt;/ul&gt;




&lt;h1&gt;
  
  
  ⚙️ Prerequisites
&lt;/h1&gt;

&lt;p&gt;Clone the &lt;a href="https://github.com/wessRibeiro/moving-scheduler-script" rel="noopener noreferrer"&gt;repository&lt;/a&gt;; it contains the script you’ll need to fill in with your environment variables. Follow the included README.&lt;/p&gt;

&lt;h3&gt;
  
  
  Edit script variables
&lt;/h3&gt;

&lt;p&gt;At the beginning of the script, modify the values below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;SOURCE_PROJECT&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"source-project-id"&lt;/span&gt;
&lt;span class="nv"&gt;DESTINATION_PROJECT&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"destination-project-id"&lt;/span&gt;
&lt;span class="nv"&gt;SOURCE_LOCATION&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"source-region"&lt;/span&gt;         &lt;span class="c"&gt;# e.g., us-central1&lt;/span&gt;
&lt;span class="nv"&gt;DESTINATION_LOCATION&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"destination-region"&lt;/span&gt;   &lt;span class="c"&gt;# e.g., us-central1&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Install required tools
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;gcloud CLI&lt;/code&gt; (authenticated in both projects with appropriate permissions)
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;jq&lt;/code&gt; (for JSON manipulation)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Required permissions
&lt;/h3&gt;

&lt;p&gt;The account running the script must have permissions to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;List, describe, and pause jobs in the source project
&lt;/li&gt;
&lt;li&gt;Create and pause jobs in the destination project
&lt;/li&gt;
&lt;/ul&gt;
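&lt;p&gt;&lt;em&gt;One broad way to cover these permissions (tighten as needed) is the Cloud Scheduler admin role; &lt;code&gt;PROJECT_ID&lt;/code&gt; and &lt;code&gt;USER_EMAIL&lt;/code&gt; are placeholders, and you would repeat the binding in both projects (use &lt;code&gt;serviceAccount:&lt;/code&gt; instead of &lt;code&gt;user:&lt;/code&gt; for a service account):&lt;/em&gt;&lt;/p&gt;

```shell
# Grant Cloud Scheduler admin to the account running the script
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="user:USER_EMAIL" \
  --role="roles/cloudscheduler.admin"
```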




&lt;h1&gt;
  
  
  🚀 What the script does
&lt;/h1&gt;

&lt;ol&gt;
&lt;li&gt;Lists all Cloud Scheduler jobs in the source project:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;gcloud scheduler &lt;span class="nb"&gt;jobs &lt;/span&gt;list &lt;span class="nt"&gt;--project&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$SOURCE_PROJECT&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="nt"&gt;--location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="nv"&gt;$SOURCE_LOCATION&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="2"&gt;
&lt;li&gt;For each job found:&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Asks if you want to copy it (y/n)&lt;/li&gt;
&lt;li&gt;If the job already exists in the destination, it notifies and skips&lt;/li&gt;
&lt;li&gt;Otherwise:

&lt;ul&gt;
&lt;li&gt;Exports the job data (&lt;code&gt;gcloud scheduler jobs describe&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Saves it to a file named &lt;code&gt;schedulers.json&lt;/code&gt; (for backup/audit)&lt;/li&gt;
&lt;li&gt;Creates a new job in the destination with the same data (schedule, time zone, HTTP method, headers, body, retries, etc.)&lt;/li&gt;
&lt;li&gt;Pauses the job in the source project (to avoid duplicate execution)&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;(Optional)&lt;/em&gt; Pauses the job in the destination if it was already paused in the source&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
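&lt;p&gt;&lt;em&gt;The export step can also be reproduced by hand; assuming a job named &lt;code&gt;my-job&lt;/code&gt;, it looks something like:&lt;/em&gt;&lt;/p&gt;

```shell
# Export a single job's definition as JSON
gcloud scheduler jobs describe my-job \
  --project="$SOURCE_PROJECT" \
  --location="$SOURCE_LOCATION" \
  --format=json
```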




&lt;h1&gt;
  
  
  📝 Script Structure
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Fully interactive (confirmation for each job)&lt;/li&gt;
&lt;li&gt;Saves original job JSON data in the &lt;code&gt;schedulers.json&lt;/code&gt; file (for later review or re-execution)&lt;/li&gt;
&lt;li&gt;Supports:

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;httpTarget&lt;/code&gt; (including base64 body)&lt;/li&gt;
&lt;li&gt;Custom headers&lt;/li&gt;
&lt;li&gt;Retry settings (attempts, backoff, etc.)&lt;/li&gt;
&lt;li&gt;Paused jobs remain paused in the destination&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;




&lt;h1&gt;
  
  
  💡 Important Tips
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;Carefully review each job before applying.&lt;/li&gt;
&lt;li&gt;Use &lt;code&gt;schedulers.json&lt;/code&gt; as a backup.&lt;/li&gt;
&lt;li&gt;After migration, update any systems or variables that depend on job location (e.g., logging, monitoring, or IAM bindings).&lt;/li&gt;
&lt;/ul&gt;




&lt;h1&gt;
  
  
  📌 Practical Example
&lt;/h1&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;bash migrate_scheduler_jobs.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The script will list all available jobs.&lt;/p&gt;

&lt;p&gt;For each one, it will ask:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;wish copy job 'my-job' to project 'destination'? (y/n)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After copying, it will confirm:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;✅ Job 'my-job' created successfully.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And it will pause the original job:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;⏸️ Pausing job 'my-job' in source project...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h1&gt;
  
  
  📂 &lt;code&gt;schedulers.json&lt;/code&gt; File
&lt;/h1&gt;

&lt;p&gt;This file serves as an execution history, containing all exported jobs from the source project.&lt;/p&gt;

&lt;p&gt;You can later use it to compare jobs between environments or even recreate them manually if needed.&lt;/p&gt;
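&lt;p&gt;&lt;em&gt;Assuming the script stores the exported jobs as a top-level JSON array (a hypothetical structure; check your own file), &lt;code&gt;jq&lt;/code&gt; makes quick comparisons easy:&lt;/em&gt;&lt;/p&gt;

```shell
# List the names of all exported jobs (assumes a top-level JSON array)
jq -r '.[].name' schedulers.json
```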




&lt;h1&gt;
  
  
  ✅ Conclusion
&lt;/h1&gt;

&lt;p&gt;This script is a practical and safe solution to migrate Cloud Scheduler jobs between different projects in Google Cloud, maintaining configurations intact and reducing the risk of human error.&lt;/p&gt;

&lt;p&gt;If you want to evolve this process into a CI/CD scenario, you can automate script calls using environment variables defined in your pipeline.&lt;/p&gt;

&lt;p&gt;If you need help adapting or improving it, I can assist!&lt;/p&gt;

</description>
      <category>googlecloud</category>
      <category>devops</category>
      <category>cloud</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Moving and Renaming Google Cloud Storage Buckets Without Downtime: A Safe Strategy - part 1</title>
      <dc:creator>Weslley Ribeiro</dc:creator>
      <pubDate>Fri, 25 Apr 2025 19:19:49 +0000</pubDate>
      <link>https://dev.to/weslley_ribeiro01/moving-and-renaming-google-cloud-storage-buckets-without-downtime-a-safe-strategy-part-1-2kcl</link>
      <guid>https://dev.to/weslley_ribeiro01/moving-and-renaming-google-cloud-storage-buckets-without-downtime-a-safe-strategy-part-1-2kcl</guid>
      <description>&lt;h2&gt;
  
  
  📦 My Scenario
&lt;/h2&gt;

&lt;p&gt;Currently, I have a &lt;strong&gt;bucket&lt;/strong&gt; located in a project called &lt;strong&gt;Project 1&lt;/strong&gt;, which is serving the production environment of my application. As the project has grown, the goal is to move this bucket to &lt;strong&gt;Project 2&lt;/strong&gt;, which contains all other infrastructure resources for my application in a centralized way.&lt;br&gt;
This migration will follow Google's recommended &lt;strong&gt;best practices, centralizing all resources into a single project, which will provide&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Better cost tracking&lt;/strong&gt;: Keeping all resources in one project allows for more efficient and precise cost control.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Easier maintenance&lt;/strong&gt;: Consolidating infrastructure in a single project simplifies management and operations, reducing complexity.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  🔍 Summary of limitations and alternatives for bucket migration as recommended by Google
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Some bucket properties are &lt;strong&gt;permanent&lt;/strong&gt; and &lt;strong&gt;cannot be changed&lt;/strong&gt; after creation:

&lt;ul&gt;
&lt;li&gt;Bucket name&lt;/li&gt;
&lt;li&gt;Geographic location&lt;/li&gt;
&lt;li&gt;Associated project&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;The process of "moving" or "renaming" a bucket involves creating a new bucket and migrating the data.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Strategy for empty buckets&lt;/strong&gt;&lt;br&gt;
     - Delete the old bucket.&lt;br&gt;
     - Create a new bucket with the desired properties.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Delete the empty original bucket&lt;/span&gt;
gcloud storage buckets delete gs://SOURCE_BUCKET

&lt;span class="c"&gt;# Create the new bucket&lt;/span&gt;
gcloud storage buckets create gs://DESTINATION_BUCKET &lt;span class="se"&gt;\&lt;/span&gt;
       &lt;span class="nt"&gt;--location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;REGION &lt;span class="se"&gt;\&lt;/span&gt;
       &lt;span class="nt"&gt;--project&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;PROJECT_ID
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Strategy for buckets with data&lt;/strong&gt;&lt;br&gt;
     - Create a new bucket with the desired properties.&lt;br&gt;
     - Copy the data from the old bucket to the new one.&lt;br&gt;
     - Delete the old bucket and its contents.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Create the new bucket&lt;/span&gt;
gcloud storage buckets create gs://DESTINATION_BUCKET &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--project&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;ID_DO_PROJETO &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;LOCALIZACAO

&lt;span class="c"&gt;# Recursively copy all objects&lt;/span&gt;
gcloud storage &lt;span class="nb"&gt;cp&lt;/span&gt; &lt;span class="nt"&gt;--recursive&lt;/span&gt; gs://SOURCE_BUCKET/&lt;span class="k"&gt;*&lt;/span&gt; gs://DESTINATION_BUCKET

&lt;span class="c"&gt;# Delete the original bucket and its contents&lt;/span&gt;
gcloud storage &lt;span class="nb"&gt;rm&lt;/span&gt; &lt;span class="nt"&gt;--recursive&lt;/span&gt; gs://SOURCE_BUCKET
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Strategy to reuse the same bucket name&lt;/strong&gt;&lt;br&gt;
     - Create a temporary bucket with a different name.&lt;br&gt;
     - Copy the data into this temporary bucket.&lt;br&gt;
     - Delete the original bucket.&lt;br&gt;
     - Create a new bucket with the same name as the original.&lt;br&gt;
     - Copy the data back from the temporary bucket to the new bucket.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Create temporary bucket&lt;/span&gt;
gcloud storage buckets create gs://TEMP_BUCKET &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--project&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;ID_DO_PROJETO &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;LOCALIZACAO
&lt;span class="c"&gt;# Copy data from original bucket to temporary&lt;/span&gt;
gcloud storage &lt;span class="nb"&gt;cp&lt;/span&gt; &lt;span class="nt"&gt;--recursive&lt;/span&gt; gs://SOURCE_BUCKET/&lt;span class="k"&gt;*&lt;/span&gt; gs://TEMP_BUCKET
&lt;span class="c"&gt;# Delete original bucket&lt;/span&gt;
gcloud storage &lt;span class="nb"&gt;rm&lt;/span&gt; &lt;span class="nt"&gt;--recursive&lt;/span&gt; gs://SOURCE_BUCKET
&lt;span class="c"&gt;# Recreate the original bucket with the same name&lt;/span&gt;
gcloud storage buckets create gs://SOURCE_BUCKET &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--project&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;ID_DO_PROJETO &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--location&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;LOCALIZACAO
&lt;span class="c"&gt;# Copy data back from temporary to new bucket&lt;/span&gt;
gcloud storage &lt;span class="nb"&gt;cp&lt;/span&gt; &lt;span class="nt"&gt;--recursive&lt;/span&gt; gs://TEMP_BUCKET/&lt;span class="k"&gt;*&lt;/span&gt; gs://SOURCE_BUCKET
&lt;span class="c"&gt;# Delete temporary bucket&lt;/span&gt;
gcloud storage &lt;span class="nb"&gt;rm&lt;/span&gt; &lt;span class="nt"&gt;--recursive&lt;/span&gt; gs://TEMP_BUCKET
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://cloud.google.com/storage/docs/moving-buckets" rel="noopener noreferrer"&gt;Supporting documentation&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  🧪 Evaluation
&lt;/h2&gt;

&lt;p&gt;Both the development and production buckets are in Project 1. I’ll start by applying the migration strategy in the &lt;strong&gt;development&lt;/strong&gt; environment to fail safely before touching production.&lt;br&gt;
I chose the strategy that keeps the same bucket name. Unfortunately, any option involves a tradeoff — in this case, a short &lt;strong&gt;downtime&lt;/strong&gt;.&lt;br&gt;
My application uses a CI/CD pipeline, and the bucket name is stored as an environment variable in &lt;strong&gt;GitHub Actions&lt;/strong&gt;. So, theoretically, after making the changes in the cloud, I can just update the variable, and the application will start using the new bucket.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzokyj64ua1t4cyqey9pn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzokyj64ua1t4cyqey9pn.png" alt="print github actions env variables" width="800" height="85"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Before executing anything, always ask yourself: “What could go wrong?” By answering this, you’ll be more prepared to execute.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  💡 A fourth custom strategy (no downtime)
&lt;/h2&gt;

&lt;p&gt;While analyzing my scenario and Google's suggestion of preserving the bucket name, I identified the need for a fourth strategy that overcomes the downtime tradeoff with something more acceptable for my case.&lt;br&gt;
The goal is to &lt;strong&gt;move the bucket while keeping the same name and avoiding downtime&lt;/strong&gt;. Here's the proposed strategy:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create a temporary bucket;&lt;/li&gt;
&lt;li&gt;Copy data;&lt;/li&gt;
&lt;li&gt;Point the application to the temporary bucket;&lt;/li&gt;
&lt;li&gt;Delete the original bucket;&lt;/li&gt;
&lt;li&gt;Recreate the bucket with the original name;&lt;/li&gt;
&lt;li&gt;Copy data;&lt;/li&gt;
&lt;li&gt;Point the application back to it;&lt;/li&gt;
&lt;li&gt;Sync any final changes from the temporary bucket.
This avoids downtime while accepting other operational tradeoffs that are easier to manage in my case.
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv79knfkt5v51n9wste5i.png" alt="flow fourth estrategy" width="800" height="1200"&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;✅ Main Advantage&lt;/th&gt;
&lt;th&gt;⚠️ Main Tradeoff&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Zero Downtime&lt;/td&gt;
&lt;td&gt;Higher operational complexity&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
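&lt;p&gt;&lt;em&gt;Step 8 (the final sync) maps naturally onto gcloud's rsync-style command; a sketch with placeholder bucket names:&lt;/em&gt;&lt;/p&gt;

```shell
# Copy over only the objects written to the temporary bucket after the cutover
gcloud storage rsync --recursive gs://TEMP_BUCKET gs://SOURCE_BUCKET
```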

&lt;h2&gt;
  
  
  🚀 Executing
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Map all points where the bucket is used
&lt;em&gt;I searched the code and found some hardcoded URLs from an old implementation.
Since I’m keeping the bucket name, no changes are needed there.&lt;/em&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgqr2kf1cdwl5rjemp6ar.png" alt="URLs hardcoded" width="800" height="62"&gt;
&lt;/li&gt;
&lt;li&gt;Check the current bucket size to estimate the transfer duration and validate completeness
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;gcloud storage &lt;span class="nb"&gt;du&lt;/span&gt; &lt;span class="nt"&gt;-s&lt;/span&gt; gs://SOURCE_BUCKET
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Returned:&lt;br&gt;
&lt;code&gt;2064031471&lt;/code&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Unit&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Bytes&lt;/td&gt;
&lt;td&gt;2,064,031,471 B&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Megabytes&lt;/td&gt;
&lt;td&gt;~1,968 MB&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Gigabytes&lt;/td&gt;
&lt;td&gt;~1.92 GB&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;em&gt;This small size indicates that transfer won’t be a bottleneck.&lt;/em&gt;&lt;/p&gt;
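&lt;p&gt;&lt;em&gt;To avoid converting by hand, GNU coreutils' &lt;code&gt;numfmt&lt;/code&gt; turns the raw byte count into a human-readable size:&lt;/em&gt;&lt;/p&gt;

```shell
# Convert the byte count reported by `gcloud storage du -s` to IEC units
numfmt --to=iec 2064031471
```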

&lt;ul&gt;
&lt;li&gt;Make sure to replicate configurations such as lifecycle rules, IAM policies, object visibility, ACLs, etc.
Use this to inspect the original bucket:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;gcloud storage buckets describe gs://SOURCE_BUCKET
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
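&lt;p&gt;&lt;em&gt;The bucket's IAM bindings can be inspected separately, which helps when replicating policies on the new bucket:&lt;/em&gt;&lt;/p&gt;

```shell
# Dump the source bucket's IAM policy as JSON
gcloud storage buckets get-iam-policy gs://SOURCE_BUCKET --format=json
```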



&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Clone the &lt;a href="https://github.com/wessRibeiro/moving-buckets-script" rel="noopener noreferrer"&gt;repository&lt;/a&gt;; it contains 2 scripts you’ll need to fill in with your environment variables. Follow the included &lt;code&gt;README&lt;/code&gt;. &lt;br&gt;
Run the first script&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frx4mz4m122ual1euvlcz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frx4mz4m122ual1euvlcz.png" alt="bash script" width="800" height="39"&gt;&lt;/a&gt;&lt;br&gt;
The script will: &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a temporary bucket&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Copy IAM policies from the original bucket to policies.json&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Let you edit policies.json to keep existing permissions and add new ones&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Apply all policies to the temporary bucket on confirmation &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz1orvjftt1nz965ptdv8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz1orvjftt1nz965ptdv8.png" alt="first script steps" width="800" height="218"&gt;&lt;/a&gt;&lt;br&gt;
Then it starts copying objects and folders&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fixjo19qng70wuhetw1sq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fixjo19qng70wuhetw1sq.png" alt="coping files" width="800" height="55"&gt;&lt;/a&gt;&lt;br&gt;
It was fast!&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foswu6zkkc8rl9ozrmsc4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foswu6zkkc8rl9ozrmsc4.png" alt="copy time" width="800" height="135"&gt;&lt;/a&gt;&lt;br&gt;
Now, it’s time to &lt;strong&gt;point your app to the temporary bucket&lt;/strong&gt; while the script pauses. This avoids downtime since the temporary bucket is ready.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdf4yjhwifjvmswjpz6n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdf4yjhwifjvmswjpz6n.png" alt="changing bucket name to temporary" width="800" height="36"&gt;&lt;/a&gt;&lt;br&gt;
After updating the variables and redeploying with zero downtime:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0nrixgzka0irr58g3y4q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0nrixgzka0irr58g3y4q.png" alt="new application values" width="800" height="141"&gt;&lt;/a&gt;&lt;br&gt;
I tested the app — the image uploaded successfully to the temporary bucket!&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdtocynlwu45oc21feim8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdtocynlwu45oc21feim8.png" alt="application test" width="800" height="353"&gt;&lt;/a&gt;&lt;br&gt;
After confirming the application was using the temporary bucket, I resumed the script, and it performed a final sync.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmxk2a7x6brr0kbfkduld.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmxk2a7x6brr0kbfkduld.png" alt="sync buckets" width="800" height="107"&gt;&lt;/a&gt;&lt;br&gt;
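&lt;/p&gt;

&lt;p&gt;The resumed step boils down to an rsync between the two buckets; a minimal sketch with placeholder bucket names:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# placeholders: my-old-bucket, my-temporary-bucket
gcloud storage rsync --recursive gs://my-old-bucket gs://my-temporary-bucket
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;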
With that, we conclude &lt;strong&gt;Part 1&lt;/strong&gt; of the strategy — successfully migrating the app to use a temporary bucket with no downtime. &lt;a href="https://dev.to/weslley_ribeiro01/moving-and-renaming-google-cloud-storage-buckets-without-downtime-a-safe-strategy-part-2-3abb"&gt;Go to part 2&lt;/a&gt; to finish the transition: delete the old bucket, recreate it with the original name, and sync everything back.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>googlecloud</category>
      <category>tutorial</category>
      <category>cloud</category>
      <category>devops</category>
    </item>
    <item>
      <title>Moving and Renaming Google Cloud Storage Buckets Without Downtime: A Safe Strategy - part 2</title>
      <dc:creator>Weslley Ribeiro</dc:creator>
      <pubDate>Fri, 25 Apr 2025 19:19:11 +0000</pubDate>
      <link>https://dev.to/weslley_ribeiro01/moving-and-renaming-google-cloud-storage-buckets-without-downtime-a-safe-strategy-part-2-3abb</link>
      <guid>https://dev.to/weslley_ribeiro01/moving-and-renaming-google-cloud-storage-buckets-without-downtime-a-safe-strategy-part-2-3abb</guid>
      <description>&lt;p&gt;To move a bucket without downtime, we work with an interactive script that standardizes the actions, along with a few application updates during the process.&lt;br&gt;
In &lt;a href="https://dev.to/weslley_ribeiro01/moving-and-renaming-google-cloud-storage-buckets-without-downtime-a-safe-strategy-part-1-2kcl"&gt;Part 1&lt;/a&gt; we covered the steps required to start the migration strategy.&lt;br&gt;
After executing it, we need to check whether any issues occurred, such as errors accessing old files, difficulties inserting new objects into the new bucket, or any unexpected application errors.&lt;br&gt;
Once all the necessary validations pass, we can be confident that deleting the old and temporary buckets will not cause problems.&lt;/p&gt;

&lt;h2&gt;
  
  
  Completing the Migration Process
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;To execute this script, the account authenticated in the gcloud CLI must have the following permissions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Storage Admin or equivalent permissions in both the source and destination projects.&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;
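
&lt;p&gt;If the authenticated account is missing this role, it can be granted with something like the following (the project ID and member are placeholders; repeat for both the source and destination projects):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# placeholders: SOURCE_PROJECT_ID, you@example.com
gcloud projects add-iam-policy-binding SOURCE_PROJECT_ID \
  --member="user:you@example.com" \
  --role="roles/storage.admin"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;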

&lt;p&gt;Execute Script 2 (which includes the deletion actions):&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyeftzauzlk6xgx6v1z9q.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyeftzauzlk6xgx6v1z9q.webp" alt="running script 2" width="757" height="52"&gt;&lt;/a&gt;&lt;br&gt;
Confirm the deletion of the old bucket:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqek70qmceyo2l7aubjnu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqek70qmceyo2l7aubjnu.png" alt="Deleting the old bucket" width="800" height="47"&gt;&lt;/a&gt;&lt;br&gt;
Notice that the script deletes the old bucket and creates a new one with the same name in the destination project.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fej5umbyi42a3bib287vi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fej5umbyi42a3bib287vi.png" alt="Deletion result" width="800" height="186"&gt;&lt;/a&gt;&lt;br&gt;
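&lt;/p&gt;

&lt;p&gt;Under the hood, this step corresponds roughly to the commands below (the bucket name, project ID, and location are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# remove the old bucket together with its objects (placeholder name)
gcloud storage rm --recursive gs://my-old-bucket
# recreate it with the same name in the destination project
gcloud storage buckets create gs://my-old-bucket \
  --project=DESTINATION_PROJECT_ID \
  --location=LOCATION
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;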
After you validate the information inside the policies.json file and press ENTER, the policies are applied one by one.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu8oahbys70h3b8kxfkef.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu8oahbys70h3b8kxfkef.png" alt="Policies applied" width="800" height="257"&gt;&lt;/a&gt;&lt;br&gt;
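&lt;/p&gt;

&lt;p&gt;Adding the bindings one by one maps to calls like this one (the member and role shown are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# illustrative member and role; repeat per entry in policies.json
gcloud storage buckets add-iam-policy-binding gs://my-old-bucket \
  --member="serviceAccount:app@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;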
The script then starts copying the files from the temporary bucket to the newly recreated bucket.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxm0naecwy2rdcc171bhm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxm0naecwy2rdcc171bhm.png" alt="Copying files" width="800" height="46"&gt;&lt;/a&gt;&lt;br&gt;
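&lt;/p&gt;

&lt;p&gt;The copy itself is a recursive cp from the temporary bucket into the recreated one (bucket names are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# placeholders: my-temporary-bucket, my-old-bucket
gcloud storage cp --recursive "gs://my-temporary-bucket/*" gs://my-old-bucket/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;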
Once the copy finishes, the script will prompt you to update your application to use the newly created bucket (if it is not already), and will wait until you press ENTER to proceed.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faq6avjb4u9aotbrdpva5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faq6avjb4u9aotbrdpva5.png" alt="Copy completed" width="800" height="93"&gt;&lt;/a&gt;&lt;br&gt;
After updating the application, the script will perform a final sync to ensure consistency, and then prompt you to confirm the deletion of the temporary bucket.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg3xmwnj51c1xcksfqgee.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg3xmwnj51c1xcksfqgee.png" alt="sync" width="800" height="131"&gt;&lt;/a&gt;&lt;br&gt;
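&lt;/p&gt;

&lt;p&gt;The closing steps correspond roughly to a final rsync followed by removing the temporary bucket (placeholder names again):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# final sync to guarantee consistency
gcloud storage rsync --recursive gs://my-temporary-bucket gs://my-old-bucket
# removes the temporary bucket along with its objects
gcloud storage rm --recursive gs://my-temporary-bucket
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;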
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdcaw23y7wqbn9op21gp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdcaw23y7wqbn9op21gp.png" alt="Migration completed" width="423" height="114"&gt;&lt;/a&gt;&lt;br&gt;
Congratulations! 🎉&lt;br&gt;
You have successfully completed a bucket migration with zero downtime.&lt;/p&gt;

</description>
      <category>googlecloud</category>
      <category>tutorial</category>
      <category>cloud</category>
      <category>devops</category>
    </item>
  </channel>
</rss>
