Most documentation for Vertex AI Agent Engine focuses on the Python SDK (`vertexai.agent_engines.create`). That works fine for one-off deployments, but if you want your agent infrastructure managed declaratively alongside the rest of your GCP resources, Terraform is the right tool.

This post walks through a complete Terraform setup for deploying a Google ADK agent to Vertex AI Agent Engine using `google_vertex_ai_reasoning_engine`.
Prerequisites

- Terraform >= 1.5
- The `google` or `google-beta` provider
- `aiplatform.googleapis.com` enabled
- A Google ADK agent wrapped in `AdkApp`
How Agent Engine Deployment Works
The deployment model is straightforward: tar.gz your source code, base64-encode it, and pass it to the API via `inline_source`. The runtime handles dependency installation, session management, and streaming — you just provide the entrypoint.
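That packaging step can be sketched with standard shell tools. This is just an illustration of the archive format — the directory names are placeholders, and `-w0` assumes GNU `base64`:

```shell
# Build a demo layout, then produce the base64 tar.gz Agent Engine expects.
mkdir -p demo/src/myagent
printf 'agent_engine = None\n' > demo/src/myagent/agent.py
printf 'google-adk>=1.0.0\n' > demo/requirements.txt

# Archive the sources plus requirements.txt at the archive root, then encode.
tar -C demo -czf agent.tar.gz src requirements.txt
base64 -w0 agent.tar.gz > agent.b64
```

The Python build script below does the same thing portably, without shelling out.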
The Agent Entrypoint
The key requirement is an `AdkApp` instance at the module level. This is what Terraform's `entrypoint_object` points to.
```python
# src/myagent/agent.py
from google.adk.agents import Agent
from vertexai.agent_engines import AdkApp


def get_weather(city: str) -> dict:
    """Returns weather for a given city."""
    return {"city": city, "weather": "sunny"}


root_agent = Agent(
    name="weather_agent",
    model="gemini-2.5-flash",
    description="Answers weather questions",
    instruction="Answer the user's weather question using the get_weather tool.",
    tools=[get_weather],
)

# This is what entrypoint_object references
agent_engine = AdkApp(agent=root_agent)
```
Wrapping in `AdkApp` automatically exposes `create_session`, `stream_query`, and other ADK methods as callable endpoints.
Building the Source Archive
Agent Engine expects a base64-encoded tar.gz containing your source files and a `requirements.txt`. Here's a minimal build script that uses only the Python standard library:
```python
# scripts/build_source.py
import base64
import io
import json
import sys
import tarfile
from pathlib import Path


def main():
    query = json.load(sys.stdin)
    project_root = Path(query["project_root"]).resolve()
    src_dir = project_root / "src"
    requirements = project_root / "requirements.txt"

    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        for path in sorted(src_dir.rglob("*.py")):
            arcname = path.relative_to(project_root)
            tar.add(path, arcname=str(arcname))
        tar.add(requirements, arcname="requirements.txt")

    b64 = base64.b64encode(buf.getvalue()).decode()
    json.dump({"base64": b64}, sys.stdout)


if __name__ == "__main__":
    main()
```
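To sanity-check what the script produces, you can round-trip the archive: build one the same way against a throwaway project layout, decode the base64, and list the members. A minimal stdlib-only sketch:

```python
import base64
import io
import tarfile
import tempfile
from pathlib import Path

# Throwaway project layout mirroring the expected structure.
root = Path(tempfile.mkdtemp())
(root / "src" / "myagent").mkdir(parents=True)
(root / "src" / "myagent" / "agent.py").write_text("agent_engine = None\n")
(root / "requirements.txt").write_text("google-adk>=1.0.0\n")

# Package exactly as the build script does.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    for path in sorted((root / "src").rglob("*.py")):
        tar.add(path, arcname=str(path.relative_to(root)))
    tar.add(root / "requirements.txt", arcname="requirements.txt")
b64 = base64.b64encode(buf.getvalue()).decode()

# Round-trip: decode and confirm the members Agent Engine will see.
decoded = io.BytesIO(base64.b64decode(b64))
with tarfile.open(fileobj=decoded, mode="r:gz") as tar:
    names = tar.getnames()
print(names)  # ['src/myagent/agent.py', 'requirements.txt']
```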
The `requirements.txt` lists PyPI package names — the Agent Engine runtime installs them at deploy time:

```text
google-adk>=1.0.0
google-cloud-aiplatform[agent_engines]>=1.93.0
```
Terraform Configuration
provider.tf
```hcl
terraform {
  required_providers {
    google = {
      source  = "hashicorp/google"
      version = "~> 6.0"
    }
    google-beta = {
      source  = "hashicorp/google-beta"
      version = "~> 6.0"
    }
  }
}

provider "google-beta" {
  project = var.project_id
  region  = "asia-northeast1"
}
```
Wiring the Archive Build
Use the `external` data source to invoke the build script during `terraform plan`/`apply`:

```hcl
data "external" "agent_source" {
  program = ["python3", "${path.module}/scripts/build_source.py"]

  query = {
    project_root = "${path.module}/../.."
  }
}
```
Every apply picks up the latest source automatically.
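The `external` data source imposes a simple contract: Terraform writes the `query` object as JSON on the program's stdin, and the program must print a flat JSON object whose values are all strings. A sketch of that contract (the `fake_build` program here is a stand-in, not the real build script):

```python
import io
import json


def run_external_program(query: dict, program) -> dict:
    """Mimic how Terraform's external data source drives a program:
    JSON query on stdin, flat string-valued JSON object on stdout."""
    stdin = io.StringIO(json.dumps(query))
    stdout = io.StringIO()
    program(stdin, stdout)
    result = json.loads(stdout.getvalue())
    # Terraform rejects non-string values in the result object.
    assert all(isinstance(v, str) for v in result.values())
    return result


def fake_build(stdin, stdout):
    query = json.load(stdin)
    # A real script would tar + base64 here; "ZmFrZQ==" is a stand-in.
    json.dump({"base64": "ZmFrZQ==", "root": query["project_root"]}, stdout)


result = run_external_program({"project_root": "/repo"}, fake_build)
print(result["root"])  # /repo
```

This is why the build script emits `{"base64": ...}` rather than writing the archive to disk: the encoded string travels through the protocol directly into `data.external.agent_source.result.base64`.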
The Agent Engine Resource
```hcl
resource "google_vertex_ai_reasoning_engine" "my_agent" {
  provider     = google-beta
  display_name = "my-agent-${var.env}"
  description  = "ADK weather agent"
  region       = "asia-northeast1"

  spec {
    agent_framework = "google-adk"
    service_account = var.service_account_email

    class_methods = jsonencode([
      { name = "create_session", api_mode = "" },
      { name = "get_session", api_mode = "" },
      { name = "list_sessions", api_mode = "" },
      { name = "delete_session", api_mode = "" },
      { name = "stream_query", api_mode = "stream" },
    ])

    source_code_spec {
      inline_source {
        source_archive = data.external.agent_source.result.base64
      }

      python_spec {
        entrypoint_module = "src.myagent.agent"
        entrypoint_object = "agent_engine"
        requirements_file = "requirements.txt"
        version           = "3.12"
      }
    }

    deployment_spec {
      env {
        name  = "LOG_LEVEL"
        value = "INFO"
      }

      secret_env {
        name = "API_TOKEN"
        secret_ref {
          secret  = "my-api-token"
          version = "latest"
        }
      }
    }
  }
}
```
Block Reference

| Block | Purpose |
|---|---|
| `agent_framework` | Must be `"google-adk"` — tells the runtime which framework to use |
| `class_methods` | Enumerates callable methods; `api_mode = "stream"` enables SSE |
| `inline_source` | Embeds the base64 tar.gz directly — no GCS bucket needed |
| `python_spec` | Specifies the entrypoint and Python version |
| `deployment_spec` | Injects env vars and Secret Manager secrets at runtime |
class_methods in Detail
Every ADK method you want to expose must be declared explicitly:
```hcl
class_methods = jsonencode([
  { name = "create_session", api_mode = "" },
  { name = "get_session", api_mode = "" },
  { name = "list_sessions", api_mode = "" },
  { name = "delete_session", api_mode = "" },
  { name = "stream_query", api_mode = "stream" },
])
```
`api_mode = "stream"` makes the method return a Server-Sent Events stream. Only `stream_query` needs this — the rest are standard request/response.
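Since the argument goes through `jsonencode`, the API ultimately receives a plain JSON array. The equivalent payload, sketched in Python for clarity:

```python
import json

# Equivalent of Terraform's jsonencode([...]) for class_methods.
class_methods = [
    {"name": "create_session", "api_mode": ""},
    {"name": "get_session", "api_mode": ""},
    {"name": "list_sessions", "api_mode": ""},
    {"name": "delete_session", "api_mode": ""},
    {"name": "stream_query", "api_mode": "stream"},
]
payload = json.dumps(class_methods)

# Only one method opts into streaming; everything else is request/response.
streaming = [m["name"] for m in class_methods if m["api_mode"] == "stream"]
print(streaming)  # ['stream_query']
```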
Gotchas
**Provider choice matters.** `google_vertex_ai_reasoning_engine` is available in both the `google` and `google-beta` providers. Make sure the `provider` attribute on the resource matches whichever one you configure.

**The external data source re-runs on every plan.** This is by design — you always get the latest source. If your build script is slow, consider caching the archive or invoking the script only on apply.
**`entrypoint_module` uses dot notation, not file paths.** `src.myagent.agent` maps to `src/myagent/agent.py` in the archive. Match this to your actual directory structure.
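The dotted-path-to-file mapping can be checked mechanically against the archive listing. A tiny sketch (the helper and the member list are illustrative, not part of any SDK):

```python
def module_to_archive_path(module: str) -> str:
    """Map a dotted entrypoint_module value to the file path the
    runtime expects to find inside the tar.gz archive."""
    return module.replace(".", "/") + ".py"


# Members as produced by the build script above (illustrative).
archive_members = ["src/myagent/agent.py", "requirements.txt"]

path = module_to_archive_path("src.myagent.agent")
print(path, path in archive_members)  # src/myagent/agent.py True
```

If the printed path isn't in your archive, the deployment will fail to import the entrypoint.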
**Secret Manager secrets must already exist.** The `secret_ref` block references a secret by name — it doesn't create one. Provision secrets separately before running apply.
Deploy and Verify
```shell
terraform init
terraform plan
terraform apply
```
Export the resource name to call the agent from Python:
```hcl
output "agent_engine_resource_name" {
  value = google_vertex_ai_reasoning_engine.my_agent.name
}
```
```python
import vertexai
from vertexai.agent_engines import AdkApp

vertexai.init(project="my-project", location="asia-northeast1")

agent = AdkApp.from_resource_name(
    "projects/my-project/locations/asia-northeast1/reasoningEngines/<ID>"
)

session = agent.create_session(user_id="user-1")
for chunk in agent.stream_query(
    user_id="user-1",
    session_id=session["id"],
    message="What's the weather in Tokyo?",
):
    print(chunk)
```
Summary
- `google_vertex_ai_reasoning_engine` is available in both `google` and `google-beta` providers
- Source code is delivered as a base64 tar.gz via `inline_source` — no GCS required
- A minimal Python script using only the stdlib is enough to produce the archive
- `class_methods` must explicitly enumerate every ADK method you want to expose
- Secret Manager integration is declarative via `secret_env` blocks
Terraforming Agent Engine makes cleanup (`terraform destroy`), environment promotion, and drift detection straightforward. The ADK + Terraform combination has sparse documentation, so hopefully this fills the gap.