DEV Community

hasanmn


Using Gemini CLI with Vertex AI (Without Worrying About Your Data)

If you've been playing around with Gemini CLI, you might have glossed over a small but important detail: when you use it through the standard Gemini API, there's a real chance your input data gets fed back into Google's model training pipeline. For personal tinkering, that's probably fine. For anything work-related -- internal tools, client data, proprietary code -- that's a different story.

The fix is straightforward: route everything through Vertex AI instead. Google's enterprise data handling guarantees apply there, which means your prompts stay your prompts. This post walks through the full setup from scratch.

Prerequisites

You'll need Node.js installed on your machine. That's it on the local side. On the cloud side, you'll need a GCP project with billing enabled and the gcloud CLI installed.

Step 1 -- Install Gemini CLI

npm install -g @google/gemini-cli

This is a global install, so you can call gemini from anywhere in your terminal.
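Before moving on, it's worth a quick sanity check that the binary actually landed on your PATH. A minimal sketch (the `--version` flag is standard for the CLI; the fallback message is just illustrative):

```shell
# Check that the gemini binary resolved after the global npm install
if command -v gemini >/dev/null 2>&1; then
  gemini --version
else
  # npm's global bin directory may not be on PATH yet
  echo "gemini not found; check that your npm global bin directory is on your PATH"
fi
```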

Step 2 -- Configure GCP

Two things to do in the GCP console:

Enable the Vertex AI API for your project. Navigate to APIs & Services → Library, search for "Vertex AI API", and hit Enable.

Create a service account for the CLI to impersonate:

# Create the service account
gcloud iam service-accounts create svc-vertex-cli \
  --display-name="Gemini CLI Service Account"

# Grant the minimum required role
gcloud projects add-iam-policy-binding <PROJECT_ID> \
  --member="serviceAccount:svc-vertex-cli@<PROJECT_ID>.iam.gserviceaccount.com" \
  --role="roles/aiplatform.user"

The roles/aiplatform.user role is the minimum needed to execute prompts. If your organization enforces keyless authentication (a smart policy worth enabling), you can ban service account key creation under Organization Policies via the iam.disableServiceAccountKeyCreation constraint -- the ADC approach in Step 4 doesn't need a key file anyway.
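To double-check that the binding landed, a sketch using gcloud's policy inspection (flags per the gcloud reference; with the role granted above, it should list roles/aiplatform.user):

```shell
# List the roles granted to the service account on the project
gcloud projects get-iam-policy <PROJECT_ID> \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:svc-vertex-cli@<PROJECT_ID>.iam.gserviceaccount.com" \
  --format="table(bindings.role)"
```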

Step 3 -- Set Up Environment Variables

Gemini CLI loads environment variables from ~/.gemini/.env. Create it if it doesn't exist:

mkdir -p ~/.gemini
vim ~/.gemini/.env

Then add the following:

GOOGLE_GENAI_USE_VERTEXAI=true
GOOGLE_CLOUD_PROJECT=<PROJECT_ID>
GOOGLE_CLOUD_LOCATION=us-central1

A note on location: us-central1 is the safest default since it has the broadest model availability. Other regions may not support all models yet.

Note: If you have GOOGLE_API_KEY or GEMINI_API_KEY set anywhere in your environment, you must unset them. They take precedence and will prevent ADC from working.

unset GOOGLE_API_KEY
unset GEMINI_API_KEY
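Because a stray key variable silently wins over ADC, a small hypothetical guard you could drop into your shell startup file to catch it -- plain POSIX shell, nothing CLI-specific:

```shell
# Warn if an API-key variable would shadow the Vertex AI / ADC path
for var in GOOGLE_API_KEY GEMINI_API_KEY; do
  if [ -n "$(printenv "$var")" ]; then
    echo "warning: $var is set and will take precedence over ADC -- run: unset $var"
  fi
done
```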

Step 4 -- Log In with ADC

Application Default Credentials (ADC) is how the CLI authenticates without needing a service account key file sitting on disk. The trick is to log in as yourself first, then impersonate the service account:

# Log in with your own Google account
gcloud auth login

# Create ADC credentials that impersonate the service account
gcloud auth application-default login \
  --impersonate-service-account=svc-vertex-cli@<PROJECT_ID>.iam.gserviceaccount.com

The second command creates a credentials file that the CLI (and any other GCP SDK) picks up automatically. Your personal account needs the roles/iam.serviceAccountTokenCreator role on the service account to impersonate it -- if you get a permission error, that's the likely cause.
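Granting yourself that impersonation permission looks roughly like this -- a sketch using gcloud's service-account policy binding, where <YOUR_EMAIL> is a placeholder for your own Google account:

```shell
# Allow your user account to mint tokens for the service account
gcloud iam service-accounts add-iam-policy-binding \
  svc-vertex-cli@<PROJECT_ID>.iam.gserviceaccount.com \
  --member="user:<YOUR_EMAIL>" \
  --role="roles/iam.serviceAccountTokenCreator"
```

Note that this binding is on the service account itself, not the project -- it only lets you impersonate this one account.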

Step 5 -- Run It

Interactive mode -- starts a REPL-style session:

gemini

Single prompt -- great for scripting or quick one-offs:

gemini -p "Summarize this in 3 bullet points: ..."

Specify a model -- pass the model ID directly:

gemini -m gemini-2.5-pro
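Because -p takes an ordinary string, it composes with normal shell substitution. A sketch for scripting (notes.txt is a hypothetical file; the flags are the same -p and -m shown above):

```shell
# Summarize a local file by splicing its contents into the prompt
gemini -p "Summarize this in 3 bullet points: $(cat notes.txt)"

# Combine with -m to pin the model in scripts
gemini -m gemini-2.5-pro -p "Review this diff for bugs: $(git diff HEAD)"
```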




Why Bother with All This?

Fair question -- the standard Gemini API is easier to set up. The trade-off comes down to data control. Vertex AI gives you:

  • No training data opt-in -- your inputs aren't used to improve Google's models
  • Enterprise compliance -- easier to satisfy security reviews and internal policies
  • Audit logging -- Cloud Logging captures every API call, useful for debugging and compliance
  • Keyless auth -- no credentials file to accidentally commit or rotate

For a CLI tool you're going to use daily on real work, that's a worthwhile few extra minutes of setup.

Quick Reference

Command                What it does
gemini                 Start interactive mode
gemini -p "..."        Run a single prompt
gemini -m <model-id>   Specify a model

Once it's running, the experience is identical to the regular Gemini CLI -- the Vertex AI routing is completely transparent. You get all the same features, just without the data handling concerns.
