Every now and then, developers find themselves in a situation where they need to migrate container images between Amazon ECR repositories.
Maybe the naming convention changed.
Maybe a new repo structure was introduced.
Maybe you just want to clean things up.
Whatever the reason, doing this manually — one image at a time — is slow and painful. And in fast-moving teams, anything that slows us down affects our velocity and robustness.
That’s why I built an automation script to migrate images quickly and safely. It uses skopeo under the hood to copy images between repositories without pulling and pushing them manually.
Let’s dive in. 👇
🛠️ Prerequisites
- AWS CLI configured with a profile that has access to the ECR repositories
- jq for JSON parsing
- skopeo for copying container images
Installing skopeo
- macOS (Homebrew)

  brew install skopeo
- Ubuntu/Debian

  sudo apt-get update
  sudo apt-get -y install skopeo
- Windows (via WSL2 or Chocolatey)
  - If you’re using WSL2 (Ubuntu), just follow the Ubuntu instructions.
  - If you’re on Windows with Chocolatey:

    choco install skopeo
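Before running the migration, it’s worth a quick sanity check that all three tools are on your PATH and that your credentials resolve to the account you expect. A minimal check, assuming your AWS profile and region are already configured:

  aws --version
  jq --version
  skopeo --version
  # should print the account ID that owns your ECR repositories
  aws sts get-caller-identity --query Account --output text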
📜 The Migration Script
Below is the script I use on macOS. It works on GNU/Linux and WSL as well; the only platform difference (BSD vs. GNU date syntax) is detected and handled by the script itself.
It migrates all images pushed within the last N days from a source ECR repo to a destination ECR repo.
👉 Save it as migrate-ecr-images.sh, make it executable with chmod +x, and run it.
#!/usr/bin/env bash
set -euo pipefail
# --- check for skopeo and install if missing (macOS Homebrew) ---
if ! command -v skopeo >/dev/null 2>&1; then
  echo "skopeo not found, installing..."
  if ! command -v brew >/dev/null 2>&1; then
    echo "Homebrew is required but not found. Please install Homebrew first: https://brew.sh/"
    exit 1
  fi
  brew install skopeo
fi
# --- configuration (override any of these via environment variables) ---
SRC_REPO="${SRC_REPO:-java-micro-repo}"
DST_REPO="${DST_REPO:-repo}"
WINDOW_DAYS="${WINDOW_DAYS:-90}"
DRY_RUN="${DRY_RUN:-false}"
# --- detect env from AWS profile ---
REGION="${AWS_REGION:-${AWS_DEFAULT_REGION:-$(aws configure get region)}}"
: "${REGION:?Region not set in environment or AWS config}"
ACCOUNT_ID="$(aws sts get-caller-identity --query Account --output text)"
REG_HOST="${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com"
# --- BSD/GNU date helpers ---
is_bsd_date=false
if date -v-1d +%s >/dev/null 2>&1; then is_bsd_date=true; fi
cutoff_epoch() {
  if $is_bsd_date; then date -u -v-"${WINDOW_DAYS}"d +%s; else date -u -d "${WINDOW_DAYS} days ago" +%s; fi
}
to_epoch() {
  ts="$1"
  if $is_bsd_date; then
    ts_norm=$(printf "%s" "$ts" \
      | sed -E 's/Z$/+0000/' \
      | sed -E 's/([+-][0-9]{2}):([0-9]{2})$/\1\2/' \
      | sed -E 's/([0-9]{2}:[0-9]{2}:[0-9]{2})\.[0-9]+/\1/')
    case "$ts_norm" in *+????|*-[0-9][0-9][0-9][0-9]) : ;; *) ts_norm="${ts_norm}+0000" ;; esac
    date -j -u -f "%Y-%m-%dT%H:%M:%S%z" "$ts_norm" +%s
  else
    date -u -d "$ts" +%s
  fi
}
CUTOFF_EPOCH="$(cutoff_epoch)"
echo "Source:      ${REG_HOST}/${SRC_REPO}"
echo "Destination: ${REG_HOST}/${DST_REPO}"
echo "Window:      last ${WINDOW_DAYS} days"
echo "Dry run:     ${DRY_RUN}"
echo
# --- ensure destination repo exists ---
if ! aws ecr describe-repositories --repository-names "$DST_REPO" >/dev/null 2>&1; then
  echo "Creating destination repo: $DST_REPO"
  aws ecr create-repository --repository-name "$DST_REPO" >/dev/null
  aws ecr put-image-tag-mutability --repository-name "$DST_REPO" --image-tag-mutability MUTABLE >/dev/null || true
fi
# --- get ECR password for skopeo auth ---
ECR_PASS="$(aws ecr get-login-password --region "$REGION")"
# --- collect recent tags ---
TMP_LIST="$(mktemp)"; TMP_TAGS="$(mktemp)"
aws ecr describe-images --repository-name "$SRC_REPO" --output json \
| jq -r '.imageDetails[] | select(.imageTags!=null) | .imagePushedAt as $t | .imageTags[] | "\($t)\t\(.)"' > "$TMP_LIST"
while IFS=$'\t' read -r pushed tag; do
  epoch="$(to_epoch "$pushed")" || continue
  if [ "$epoch" -ge "$CUTOFF_EPOCH" ]; then printf "%s\n" "$tag"; fi
done < "$TMP_LIST" | sort -u > "$TMP_TAGS"
if ! [ -s "$TMP_TAGS" ]; then
  echo "No images pushed in the last ${WINDOW_DAYS} days. Nothing to do."
  rm -f "$TMP_LIST" "$TMP_TAGS"; exit 0
fi
echo "Found $(wc -l < "$TMP_TAGS" | tr -d ' ') tag(s):"
sed 's/^/  - /' "$TMP_TAGS"
echo
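# --- copy each selected tag from the source repo to the destination repo ---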
FAILED=()
while IFS= read -r TAG; do
  [ -z "$TAG" ] && continue
  SRC="docker://${REG_HOST}/${SRC_REPO}:${TAG}"
  DST="docker://${REG_HOST}/${DST_REPO}:${TAG}"
  echo "Copying ${TAG} …"
  if [ "$DRY_RUN" = true ]; then
    echo "  [dry-run] skopeo copy --all ${SRC} -> ${DST}"
    continue
  fi
  if ! skopeo copy --all \
        --src-creds "AWS:${ECR_PASS}" \
        --dest-creds "AWS:${ECR_PASS}" \
        "$SRC" "$DST"; then
    echo "  ❌ Failed: ${TAG}"
    FAILED+=("$TAG")
  else
    echo "  ✅ Done: ${TAG}"
  fi
done < "$TMP_TAGS"
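# --- clean up temp files and report results ---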
rm -f "$TMP_LIST" "$TMP_TAGS"
echo
if [ "${#FAILED[@]}" -gt 0 ]; then
  echo "Completed with failures:"
  printf '  - %s\n' "${FAILED[@]}"
  exit 1
else
  echo "All requested tags copied successfully."
fi
⚡ How It Works
- Finds recent images: using aws ecr describe-images, it keeps only tags whose imagePushedAt date falls within the last WINDOW_DAYS.
- Creates the destination repo if missing, so you don’t have to do it manually.
- Copies images with skopeo: no need to pull and re-push, skopeo copies directly between registries (a standalone single-tag sketch follows this list).
- Supports dry-run mode: set DRY_RUN=true to preview what will happen.
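If you only need to move a single tag, the core copy step can also be run by hand. Here’s a minimal sketch of that one-off case; the region value and the v1.2.3 tag are placeholders to swap for your own:

  REGION="eu-west-1"   # placeholder, use your own region
  ACCOUNT_ID="$(aws sts get-caller-identity --query Account --output text)"
  REG_HOST="${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com"
  PASS="$(aws ecr get-login-password --region "$REGION")"
  skopeo copy --all \
    --src-creds "AWS:${PASS}" \
    --dest-creds "AWS:${PASS}" \
    "docker://${REG_HOST}/java-micro-repo:v1.2.3" \
    "docker://${REG_HOST}/repo:v1.2.3"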
🔧 Example Usage
Copy the last 90 days of images from java-micro-repo to repo:
SRC_REPO="java-micro-repo" DST_REPO="repo" WINDOW_DAYS=90 ./migrate-ecr-images.sh
Run a dry run (no changes made):
DRY_RUN=true ./migrate-ecr-images.sh
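After a real (non-dry-run) pass, you can verify that the tags landed in the destination. A quick check, assuming the destination repo is named repo as in the example above:

  aws ecr list-images --repository-name repo \
    --query 'imageIds[].imageTag' --output table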
✅ Why This Helps
- Saves hours compared to manual copying
- Automates repo creation if missing
- Respects velocity → lets you focus on coding, not housekeeping
- Cross-platform: works on macOS, Ubuntu, and Windows (via WSL2)
🎯 Final Thoughts
This script has become a huge time-saver for my team. Anytime we need to rename or restructure ECR repositories, it’s just one command away.
If you work with AWS ECR often, I recommend keeping this script handy in your toolbox. 🔧
👉 What about you — have you run into ECR image migration headaches?
Drop your thoughts in the comments—I’d love to hear how you’ve solved it.
    