Hello, I'm Maneshwar. I'm building git-lrc, an AI code reviewer that runs on every commit. It's free, unlimited, and source-available on GitHub. Star us to help other devs discover the project, and do give it a try and share your feedback so we can improve it.
---

Migrating a virtual machine between cloud providers is a powerful way to gain flexibility and avoid vendor lock-in.
If you've been running a service on an AWS EC2 instance and are ready to move it to Google Cloud Platform (GCP), this post will walk you through how to export the entire EC2 instance (OS, files, configs, apps) and recreate it on GCP as a Compute Engine VM.
Why Migrate This Way?
Typical migrations often involve recreating environments and manually syncing files — which is time-consuming and error-prone. This approach allows you to:
- Preserve full OS and disk state
- Avoid surprises with missing configs or packages
- Port legacy apps without rebuilding
Prerequisites
- An AWS EC2 instance running Linux
- Access to AWS CLI & GCP SDK
- A GCP project and bucket ready for the import
- An S3 bucket in AWS for temporary storage
- IAM permissions on both sides
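Before starting, it may be worth confirming both CLIs are authenticated against the right accounts. A minimal sanity check, assuming `aws` and `gcloud` are installed and configured:

```shell
# Print the current AWS identity and the active GCP project
# before kicking off a long migration.
check_clouds() {
  echo "AWS identity:"
  aws sts get-caller-identity --query 'Arn' --output text
  echo "GCP project:"
  gcloud config get-value project
}
```

If either command errors out, fix credentials first (`aws configure`, `gcloud auth login`) before continuing.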
Step 1: Create an AMI of Your EC2 Instance
You can’t export a live EC2 instance directly — you first create an Amazon Machine Image (AMI).
```shell
aws ec2 create-image \
  --region us-east-1 \
  --instance-id i-xxxxxxxxxxxxxxxxx \
  --name "my-ec2-export" \
  --no-reboot
```
Save the ImageId returned. You'll need it in the next step.
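To avoid copy-pasting the ID by hand, you can capture it directly with `--query` — a small sketch, where the region, instance ID, and image name are placeholders:

```shell
# Create the AMI and capture the returned ImageId in one step.
create_export_ami() {
  local region="$1" instance_id="$2" name="$3"
  aws ec2 create-image \
    --region "$region" \
    --instance-id "$instance_id" \
    --name "$name" \
    --no-reboot \
    --query 'ImageId' --output text
}

# Usage: AMI_ID=$(create_export_ami us-east-1 i-xxxxxxxxxxxxxxxxx my-ec2-export)
```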
Step 2: Export the AMI to an S3 Bucket
Before exporting, you need a service role named vmimport with specific trust and permission policies. Here’s how to set that up.
1. Create a trust policy file (trust-policy.json):
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "vmie.amazonaws.com" },
      "Action": "sts:AssumeRole",
      "Condition": { "StringEquals": { "sts:ExternalId": "vmimport" } }
    }
  ]
}
```
2. Create the role:
```shell
aws iam create-role \
  --role-name vmimport \
  --assume-role-policy-document file://trust-policy.json
```
3. Attach permissions (role-policy.json):
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetBucketLocation", "s3:GetObject", "s3:PutObject", "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::your-s3-bucket",
        "arn:aws:s3:::your-s3-bucket/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "ec2:ExportImage", "ec2:Describe*"
      ],
      "Resource": "*"
    }
  ]
}
```
```shell
aws iam put-role-policy \
  --role-name vmimport \
  --policy-name vmimport \
  --policy-document file://role-policy.json
```
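Once the role and policy are in place, a quick check confirms the role actually resolves before you kick off the export:

```shell
# Confirm the vmimport role exists and print its ARN.
verify_vmimport_role() {
  aws iam get-role --role-name vmimport --query 'Role.Arn' --output text
}

# Usage: verify_vmimport_role
```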
Export the AMI as VHD
```shell
aws ec2 export-image \
  --region us-east-1 \
  --image-id ami-xxxxxxxxxxxxxxx \
  --disk-image-format VHD \
  --s3-export-location S3Bucket=your-s3-bucket,S3Prefix=ec2-export/ \
  --role-name vmimport
```
Use describe-export-image-tasks to monitor progress:
```shell
aws ec2 describe-export-image-tasks --region us-east-1
```
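The export can take a while for large disks. A small polling loop saves re-running the command by hand — a sketch, where the task ID is whatever `export-image` returned:

```shell
# Poll an export task until it leaves the "active" state.
wait_for_export() {
  local region="$1" task_id="$2" status
  status=$(aws ec2 describe-export-image-tasks \
    --region "$region" \
    --export-image-task-ids "$task_id" \
    --query 'ExportImageTasks[0].Status' --output text)
  while [ "$status" = "active" ]; do
    echo "Export still running..."
    sleep 30
    status=$(aws ec2 describe-export-image-tasks \
      --region "$region" \
      --export-image-task-ids "$task_id" \
      --query 'ExportImageTasks[0].Status' --output text)
  done
  echo "Export finished with status: $status"
}

# Usage: wait_for_export us-east-1 export-ami-xxxxxxxxxxxxxxxxx
```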
Step 3: Transfer to GCP and Import
1. Download the .vhd from S3 and upload it to GCS:

```shell
# Fetch the exported VHD from S3 (the key includes the export task ID)
aws s3 cp s3://your-s3-bucket/ec2-export/export-ami-xxxxxxxxxxxxxxxxx.vhd local-image.vhd

# Push it up to your GCS bucket
gsutil cp local-image.vhd gs://your-gcp-bucket/ec2-image.vhd
```
Alternatively, skip the local round trip: generate a pre-signed S3 URL and stream the object straight into GCS, or use GCP's Storage Transfer Service to pull it from S3 directly.
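If you'd rather not stage a multi-GB file on local disk, one option is to pipe the download straight into `gsutil` — a sketch, where the bucket names and expiry are placeholders:

```shell
# Stream the VHD from S3 to GCS without touching local disk.
stream_s3_to_gcs() {
  local s3_uri="$1" gcs_uri="$2" signed_url
  # Pre-sign the S3 object (valid for 1 hour here).
  signed_url=$(aws s3 presign "$s3_uri" --expires-in 3600)
  # "-" tells gsutil to read the object from stdin.
  curl -fsSL "$signed_url" | gsutil cp - "$gcs_uri"
}

# Usage: stream_s3_to_gcs s3://your-s3-bucket/ec2-export/image.vhd gs://your-gcp-bucket/ec2-image.vhd
```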
2. Import the image to GCP:
```shell
gcloud compute images import my-ec2-image \
  --source-file=gs://your-gcp-bucket/ec2-image.vhd \
  --os=debian-10 \
  --timeout=60m
```
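Before launching anything, it's worth confirming the import actually produced a usable image — a small sketch, using the image name from the import above:

```shell
# Check that the imported image reached READY state.
image_ready() {
  local name="$1" status
  status=$(gcloud compute images describe "$name" --format='value(status)')
  [ "$status" = "READY" ]
}

# Usage: image_ready my-ec2-image && echo "image is ready"
```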
Step 4: Launch a VM from the Imported Image
```shell
gcloud compute instances create my-ec2-on-gcp \
  --image=my-ec2-image \
  --image-project=your-gcp-project \
  --zone=us-central1-a \
  --machine-type=e2-micro
```
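After the VM boots, grab its external IP and SSH in to verify your services came up. A sketch, with the zone and instance name matching the command above:

```shell
# Fetch the external IP of the new VM so you can SSH in and verify.
get_external_ip() {
  local instance="$1" zone="$2"
  gcloud compute instances describe "$instance" \
    --zone "$zone" \
    --format='value(networkInterfaces[0].accessConfigs[0].natIP)'
}

# Usage:
#   IP=$(get_external_ip my-ec2-on-gcp us-central1-a)
#   ssh user@"$IP" 'systemctl list-units --type=service --state=running'
```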
Done!
You now have your original AWS EC2 instance running in Google Cloud — same OS, same configuration, same files. This method is particularly useful when dealing with:
- Legacy systems
- Deeply customized Linux environments
- Self-hosted apps not easily containerized
Tips
- Always test the image in a staging VM before routing production traffic.
- For Windows instances, the process is similar but requires additional licensing checks.
- Consider switching to GCP’s OS login and IAM-managed SSH for added security.
*AI agents write code fast. They also silently remove logic, change behavior, and introduce bugs without telling you. You often find out in production. git-lrc fixes this. It hooks into git commit and reviews every diff before it lands. 60-second setup. Completely free.*
Any feedback or contributors are welcome! It's online, source-available, and ready for anyone to use.
⭐ Star it on GitHub: HexmosTech/git-lrc (Free, Unlimited AI Code Reviews That Run on Commit)