Jayanth MKV

Pushing Python Packages to Artifact Registry Using Cloud Build

Google Artifact Registry is a powerful solution for managing and hosting Python package artifacts in a private, secure, and scalable way. This guide provides a step-by-step walkthrough to push Python package .whl files to the Artifact Registry using Google Cloud Build and a secret (creds) from Google Secret Manager for authentication.


Prerequisites

  1. Artifact Registry Setup:

    • Create a Python repository in your Artifact Registry:
     gcloud artifacts repositories create python-packages \
       --repository-format=python \
       --location=us-central1 \
       --description="Python packages repository"
    
  2. Secret Setup:

    • Store your key as a secret in Google Secret Manager:
     gcloud secrets create creds --data-file=path/to/key.json
    
    • Grant Cloud Build access to the secret (optional here; this can also be done through IAM):

     gcloud secrets add-iam-policy-binding creds \
       --member="serviceAccount:$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)')@cloudbuild.gserviceaccount.com" \
       --role="roles/secretmanager.secretAccessor"
    
  3. Cloud Build Permissions: Ensure your Cloud Build service account has the necessary permissions to access Artifact Registry and Secret Manager.
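     For example, granting the default Cloud Build service account write access to the repository (Secret Manager access is covered by the binding above) could look like this:

     gcloud projects add-iam-policy-binding $PROJECT_ID \
       --member="serviceAccount:$(gcloud projects describe $PROJECT_ID --format='value(projectNumber)')@cloudbuild.gserviceaccount.com" \
       --role="roles/artifactregistry.writer"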

Cloud Build YAML Configuration

Here's the full working cloudbuild.yaml file:

options:
  machineType: E2_HIGHCPU_8
  substitutionOption: ALLOW_LOOSE
  logging: CLOUD_LOGGING_ONLY

steps:
  # Step 1: Access the secret `creds` and save it as `key.json`
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: bash
    args:
      - '-c'
      - |
        gcloud secrets versions access latest --secret=creds > /workspace/key.json

  # Step 2: Configure `.pypirc` with the Artifact Registry credentials
  - name: 'python'
    entrypoint: bash
    args:
      - '-c'
      - |
        cat > ~/.pypirc << EOL
        [distutils]
        index-servers = tower-common-repo

        [tower-common-repo]
        repository: https://us-central1-python.pkg.dev/$PROJECT_ID/python-packages/
        username: _json_key_base64
        password: $(base64 -w0 /workspace/key.json)
        EOL

        # Step 3: Build and upload the Python package
        pip install twine build && \
        python -m build && \
        twine upload --repository tower-common-repo dist/* --verbose

Step-by-Step Explanation

  1. Define Build Options:

    • Set the machine type, substitution behavior, and logging options.
    • These configurations ensure efficient builds and manageable logs.
  2. Retrieve key.json Secret:

    • Use gcloud secrets versions access to fetch the key.json file securely from Secret Manager.
    • Save the file to a known location (/workspace/key.json).
  3. Configure .pypirc:

    • Generate a .pypirc file dynamically. This file is required for twine to authenticate with the Artifact Registry.
    • The password is base64-encoded content of key.json.
  4. Build and Push Package:

    • Install necessary tools (twine, build).
    • Build the Python package (python -m build).
    • Use twine upload to push the .whl file to the Artifact Registry.
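
Note that python -m build expects packaging metadata at the project root (a pyproject.toml or setup.py). If your package doesn't define one yet, a minimal pyproject.toml could look like this (name, version, and description are placeholders):

[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "your-package"
version = "0.1.0"
description = "Example package pushed to Artifact Registry"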

Triggering the Build

Save the cloudbuild.yaml file and trigger the build manually, or connect it to a GitHub repository via a build trigger:

gcloud builds submit --config=cloudbuild.yaml .
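
If you prefer to trigger builds automatically from GitHub, you can create a Cloud Build trigger once the repository is connected. A sketch (owner, repo, and trigger name are placeholders):

gcloud builds triggers create github \
  --name="push-python-package" \
  --repo-owner="your-github-user" \
  --repo-name="your-repo" \
  --branch-pattern="^main$" \
  --build-config="cloudbuild.yaml"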

Key Points

  • Secure Secrets Management: The secret (key.json) is accessed securely using Google Secret Manager.
  • Dynamic Configuration: .pypirc is generated during the build, ensuring no sensitive data is stored in the repository.
  • Automated Upload: The process automates package building and pushing, reducing manual intervention.
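
For reference, once the build runs and the shell substitutions are expanded, the generated ~/.pypirc looks roughly like this (project ID and encoded key are placeholders):

[distutils]
index-servers = tower-common-repo

[tower-common-repo]
repository: https://us-central1-python.pkg.dev/your-project-id/python-packages/
username: _json_key_base64
password: <base64-encoded contents of key.json>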

Validation

After the build completes:

  1. Verify the uploaded package in the Artifact Registry:
   gcloud artifacts packages list --repository=python-packages --location=us-central1
  2. Check for errors or warnings in the build logs.
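
You can also confirm that the package is installable from the private index. A quick check, assuming the keyrings.google-artifactregistry-auth keyring plugin for authentication and a placeholder package name:

pip install keyring keyrings.google-artifactregistry-auth
pip install --index-url https://us-central1-python.pkg.dev/$PROJECT_ID/python-packages/simple/ your-package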
