Alain Airom

My First Experience with Google’s Agent Development Kit (ADK-Go)

A first short experience with Google’s Go language Agent Development Kit (ADK-Go)

Introduction

The landscape of Artificial Intelligence is shifting from models that simply “chat” to agents that “do.” As developers look for more robust ways to integrate LLMs into real-world workflows, Google has released the Agent Development Kit (ADK) — an open-source framework designed to take the complexity out of building reliable, tool-augmented AI agents. Whether you are looking to automate multi-step research tasks, manage local file systems, or bridge the gap between cloud models and local execution, the ADK provides the standardized scaffolding necessary to move from experimental prompts to production-ready autonomous assistants. By open-sourcing this toolkit, Google is offering the developer community a structured, modular approach to session management, tool integration, and model orchestration, effectively lowering the barrier to entry for the next generation of “agentic” software.

Excerpt from ADK-Go Repository

Agent Development Kit (ADK) is a flexible and modular framework that applies software development principles to AI agent creation. It is designed to simplify building, deploying, and orchestrating agent workflows, from simple tasks to complex systems. While optimized for Gemini, ADK is model-agnostic, deployment-agnostic, and compatible with other frameworks.

This Go version of ADK is ideal for developers building cloud-native agent applications, leveraging Go’s strengths in concurrency and performance.

  • Idiomatic Go: Designed to feel natural and leverage the power of Go.
  • Rich Tool Ecosystem: Utilize pre-built tools, custom functions, or integrate existing tools to give agents diverse capabilities.
  • Code-First Development: Define agent logic, tools, and orchestration directly in Go for ultimate flexibility, testability, and versioning.
  • Modular Multi-Agent Systems: Design scalable applications by composing multiple specialized agents.
  • Deploy Anywhere: Easily containerize and deploy agents, with strong support for cloud-native environments like Google Cloud Run.

Running the Provided Sample Out-of-the-Box!

Creating a basic agent is a matter of minutes using the example provided on the documentation site!

  • Obtain your Google API key, which you can export as-is or put in a .env file.
export GOOGLE_API_KEY="abcdefghixxxxxxxxxxxxx"
  • Copy/paste the code
package main

import (
    "context"
    "log"
    "os"

    "google.golang.org/adk/agent"
    "google.golang.org/adk/agent/llmagent"
    "google.golang.org/adk/cmd/launcher"
    "google.golang.org/adk/cmd/launcher/full"
    "google.golang.org/adk/model/gemini"
    "google.golang.org/adk/tool"
    "google.golang.org/adk/tool/geminitool"
    "google.golang.org/genai"
)

func main() {
    ctx := context.Background()

    model, err := gemini.NewModel(ctx, "gemini-2.5-flash", &genai.ClientConfig{
        APIKey: os.Getenv("GOOGLE_API_KEY"),
    })
    if err != nil {
        log.Fatalf("Failed to create model: %v", err)
    }

    timeAgent, err := llmagent.New(llmagent.Config{
        Name:        "hello_time_agent",
        Model:       model,
        Description: "Tells the current time in a specified city.",
        Instruction: "You are a helpful assistant that tells the current time in a city.",
        Tools: []tool.Tool{
            geminitool.GoogleSearch{},
        },
    })
    if err != nil {
        log.Fatalf("Failed to create agent: %v", err)
    }

    config := &launcher.Config{
        AgentLoader: agent.NewSingleLoader(timeAgent),
    }

    l := full.NewLauncher()
    if err = l.Execute(ctx, config, os.Args[1:]); err != nil {
        log.Fatalf("Run failed: %v\n\n%s", err, l.CommandLineSyntax())
    }
}
  • Configure the dependencies
go mod init my-agent/main
go mod tidy
  • You can run the application either in console mode or as a web app!
go run agent.go


User -> what is the time in Paris?

Agent -> The current time in Paris, France is 08:31 AM. Paris is currently observing Central European Summer Time (CEST), which is UTC+2.
User -> 

ADK Web is not meant for use in production deployments. You should use ADK Web for development and debugging purposes only.

go run agent.go web api webui


Implementing a Local Agent Using Ollama and Lessons Learned

After a series of intensive integration tests exhausted my Google API quotas, I pivoted to a local development workflow. By leveraging Ollama to host models locally, I was able to continue refining the agent without API constraints. To speed up the transition, I brought in Bob, our AI construction specialist, to streamline the architecture and get the local environment running in record time.

First Test: Console Mode Application

The first attempt is just a simple Go application in console mode.

package main

import (
 "bufio"
 "context"
 "fmt"
 "iter"
 "log"
 "os"
 "os/exec"
 "strings"

 "google.golang.org/adk/agent"
 "google.golang.org/adk/agent/llmagent"
 "google.golang.org/adk/cmd/launcher"
 "google.golang.org/adk/cmd/launcher/full"
 "google.golang.org/adk/model"
 "google.golang.org/adk/session"
 "google.golang.org/adk/tool"
 "google.golang.org/adk/tool/geminitool"
 "google.golang.org/genai"

 openai "github.com/openai/openai-go/v3"
 "github.com/openai/openai-go/v3/option"
)

// OllamaModel wraps Ollama to implement the adk model.LLM interface
type OllamaModel struct {
 client    openai.Client
 modelName string
}

// NewOllamaModel creates a new Ollama model instance
func NewOllamaModel(modelName string, baseURL string) (*OllamaModel, error) {
 if baseURL == "" {
  baseURL = "http://localhost:11434/v1"
 }

 client := openai.NewClient(
  option.WithBaseURL(baseURL),
  option.WithAPIKey("ollama"), // Ollama doesn't need a real API key
 )

 return &OllamaModel{
  client:    client,
  modelName: modelName,
 }, nil
}

// Name returns the model name
func (o *OllamaModel) Name() string {
 return o.modelName
}

// GenerateContent implements the model.LLM interface
func (o *OllamaModel) GenerateContent(ctx context.Context, req *model.LLMRequest, stream bool) iter.Seq2[*model.LLMResponse, error] {
 return func(yield func(*model.LLMResponse, error) bool) {
  // Convert ADK request to OpenAI format
  messages := []openai.ChatCompletionMessageParamUnion{}

  // Add system instruction from config if present
  if req.Config != nil && req.Config.SystemInstruction != nil && len(req.Config.SystemInstruction.Parts) > 0 {
   var systemText strings.Builder
   for _, part := range req.Config.SystemInstruction.Parts {
    if part.Text != "" {
     systemText.WriteString(part.Text)
    }
   }
   if systemText.Len() > 0 {
    messages = append(messages, openai.SystemMessage(systemText.String()))
   }
  }

  // Add conversation history
  for _, content := range req.Contents {
   role := "user"
   if content.Role == "model" {
    role = "assistant"
   }

   var contentText strings.Builder
   for _, part := range content.Parts {
    if part.Text != "" {
     contentText.WriteString(part.Text)
    }
   }

   if contentText.Len() > 0 {
    if role == "user" {
     messages = append(messages, openai.UserMessage(contentText.String()))
    } else {
     messages = append(messages, openai.AssistantMessage(contentText.String()))
    }
   }
  }

  // Call Ollama via OpenAI-compatible API
  params := openai.ChatCompletionNewParams{
   Messages: messages,
   Model:    openai.ChatModel(o.modelName),
  }

  completion, err := o.client.Chat.Completions.New(ctx, params)
  if err != nil {
   yield(nil, fmt.Errorf("ollama API error: %w", err))
   return
  }

  // Convert response back to ADK format
  if len(completion.Choices) == 0 {
   yield(nil, fmt.Errorf("no response from Ollama"))
   return
  }

  responseText := completion.Choices[0].Message.Content

  response := &model.LLMResponse{
   Content: &genai.Content{
    Parts: []*genai.Part{
     genai.NewPartFromText(responseText),
    },
    Role: "model",
   },
  }

  yield(response, nil)
 }
}

func main() {
 // Load environment variables from .env file
 if data, err := os.ReadFile(".env"); err == nil {
  for _, line := range strings.Split(string(data), "\n") {
   line = strings.TrimSpace(line)
   if strings.HasPrefix(line, "export ") {
    line = strings.TrimPrefix(line, "export ")
   }
   if parts := strings.SplitN(line, "=", 2); len(parts) == 2 {
    key := strings.TrimSpace(parts[0])
    value := strings.Trim(strings.TrimSpace(parts[1]), "\"")
    os.Setenv(key, value)
   }
  }
 }

 // Set default model if not specified
 if os.Getenv("OLLAMA_MODEL") == "" {
  os.Setenv("OLLAMA_MODEL", "mistral:7b")
 }

 // Check if we are the sub-process worker
 if len(os.Args) > 1 && os.Args[1] == "internal-run" {
  runAgentWorker()
  return
 }

 // Interactive Loop
 fmt.Println("--- Ollama Agent (ADK) ---")
 fmt.Println("Type 'exit' to stop.")

 scanner := bufio.NewScanner(os.Stdin)
 for {
  fmt.Print("\nUser -> ")
  if !scanner.Scan() {
   break
  }

  input := strings.TrimSpace(scanner.Text())
  if input == "exit" || input == "quit" {
   fmt.Println("Goodbye!")
   break
  }
  if input == "" {
   continue
  }

  // Execute ourselves with the 'internal-run' flag
  cmd := exec.Command(os.Args[0], "internal-run", input)
  cmd.Stdout = os.Stdout
  cmd.Stderr = os.Stderr
  cmd.Env = os.Environ() // Pass environment variables to subprocess

  _ = cmd.Run()
 }
}

func runAgentWorker() {
 ctx := context.Background()

 // Get the user input from args
 if len(os.Args) < 3 {
  return
 }
 userInput := os.Args[2]

 // Create Ollama model - use model from environment or default
 modelName := os.Getenv("OLLAMA_MODEL")
 if modelName == "" {
  modelName = "mistral:7b"
 }

 ollamaModel, err := NewOllamaModel(modelName, "http://localhost:11434/v1")
 if err != nil {
  log.Fatalf("Failed to create Ollama model: %v", err)
 }
 log.Printf("DEBUG: Using Ollama model: %s", modelName)

 // Get Google API key for search tool
 apiKey := os.Getenv("GOOGLE_API_KEY")
 var tools []tool.Tool
 if apiKey != "" {
  // Add Google Search tool if API key is available
  tools = append(tools, geminitool.GoogleSearch{})
  log.Printf("DEBUG: Google Search tool added (API key present: %d chars)", len(apiKey))
 } else {
  log.Printf("DEBUG: No Google API key found, search tool not available")
 }

 // Create agent with Ollama model
 agentInstance, err := llmagent.New(llmagent.Config{
  Name:        "ollama_assistant",
  Model:       ollamaModel,
  Description: "A helpful assistant powered by a local model via Ollama.",
  Instruction: "You are a helpful assistant. When asked about current information like time, weather, or recent events, you MUST use the Google Search tool to find accurate, up-to-date information. Do not make up answers or say you cannot access real-time data - use the search tool instead.",
  Tools:       tools,
 })
 if err != nil {
  log.Fatalf("Failed to create agent: %v", err)
 }

 // Create session service
 sessionService := session.InMemoryService()

 config := &launcher.Config{
  SessionService: sessionService,
  AgentLoader:    agent.NewSingleLoader(agentInstance),
 }

 l := full.NewLauncher()

 // Run console mode and pipe the user input to it
 // Create a pipe to send input
 r, w, err := os.Pipe()
 if err != nil {
  log.Fatalf("Failed to create pipe: %v", err)
 }
 oldStdin := os.Stdin
 os.Stdin = r

 // Write the input and exit command
 go func() {
  fmt.Fprintln(w, userInput)
  fmt.Fprintln(w, "exit")
  w.Close()
 }()

 // Execute in console mode
 if err := l.Execute(ctx, config, []string{"console"}); err != nil {
  os.Stdin = oldStdin
  log.Fatalf("Run failed: %v", err)
 }

 os.Stdin = oldStdin
}

// Made with Bob

The application runs as expected, even in console mode!


Lessons learned while making this application

What one should be aware of (either by reading the documentation or by learning by doing 😂):

  • The ADK is not designed to function as a *standalone CLI* or monolithic application. Instead, it is built as highly modular agentic middleware. It is optimized to run in Interactive Mode, where it is orchestrated by other applications or agents, or in Web Mode, providing a robust backend for browser-based AI interfaces.
  • Another important fact: Not ALL models are fit for function calling!

It is entirely possible to architect simple, zero-tool agents that function purely on the model’s pre-existing weights. While useful for general reasoning or creative tasks, these agents lack the real-time grounding provided by the ADK’s orchestration layer, making them less effective for data-sensitive or live environments.

# Ollama Model Configuration
# Models that likely support function calling (my list):
# - llama3.2:latest (recommended, 2GB)
# - llama3:latest (4.7GB)
# - mistral:7b (4.4GB)
# - deepseek-r1:latest (5.2GB)
#
# Models without function calling:
# - ibm/granite4:3b (default, no tools)
# - gemma3:4b (no tools)
# export OLLAMA_MODEL="llama3.2:latest"
export OLLAMA_MODEL="mistral:7b"

Second Test: Web UI Mode Application

Next, I wanted to see this agent in its natural habitat: a Web UI. While I prepped the environment, Bob essentially finished the job in two clicks. The result? A perfectly functional web interface that looks and feels exactly like the live demos on the ADK documentation site — Ollama integration and all.


package main

import (
 "context"
 "fmt"
 "iter"
 "log"
 "os"
 "strings"

 "google.golang.org/adk/agent"
 "google.golang.org/adk/agent/llmagent"
 "google.golang.org/adk/cmd/launcher"
 "google.golang.org/adk/cmd/launcher/full"
 "google.golang.org/adk/model"
 "google.golang.org/adk/tool"
 "google.golang.org/adk/tool/geminitool"
 "google.golang.org/genai"

 openai "github.com/openai/openai-go/v3"
 "github.com/openai/openai-go/v3/option"
)

// OllamaModel wraps Ollama to implement the adk model.LLM interface
type OllamaModel struct {
 client    openai.Client
 modelName string
}

// NewOllamaModel creates a new Ollama model instance
func NewOllamaModel(modelName string, baseURL string) (*OllamaModel, error) {
 if baseURL == "" {
  baseURL = "http://localhost:11434/v1"
 }

 client := openai.NewClient(
  option.WithBaseURL(baseURL),
  option.WithAPIKey("ollama"), // Ollama doesn't need a real API key
 )

 return &OllamaModel{
  client:    client,
  modelName: modelName,
 }, nil
}

// Name returns the model name
func (o *OllamaModel) Name() string {
 return o.modelName
}

// GenerateContent implements the model.LLM interface
func (o *OllamaModel) GenerateContent(ctx context.Context, req *model.LLMRequest, stream bool) iter.Seq2[*model.LLMResponse, error] {
 return func(yield func(*model.LLMResponse, error) bool) {
  // Convert ADK request to OpenAI format
  messages := []openai.ChatCompletionMessageParamUnion{}

  // Add system instruction from config if present
  if req.Config != nil && req.Config.SystemInstruction != nil && len(req.Config.SystemInstruction.Parts) > 0 {
   var systemText strings.Builder
   for _, part := range req.Config.SystemInstruction.Parts {
    if part.Text != "" {
     systemText.WriteString(part.Text)
    }
   }
   if systemText.Len() > 0 {
    messages = append(messages, openai.SystemMessage(systemText.String()))
   }
  }

  // Add conversation history
  for _, content := range req.Contents {
   role := "user"
   if content.Role == "model" {
    role = "assistant"
   }

   var contentText strings.Builder
   for _, part := range content.Parts {
    if part.Text != "" {
     contentText.WriteString(part.Text)
    }
   }

   if contentText.Len() > 0 {
    if role == "user" {
     messages = append(messages, openai.UserMessage(contentText.String()))
    } else {
     messages = append(messages, openai.AssistantMessage(contentText.String()))
    }
   }
  }

  // Call Ollama via OpenAI-compatible API
  params := openai.ChatCompletionNewParams{
   Messages: messages,
   Model:    openai.ChatModel(o.modelName),
  }

  completion, err := o.client.Chat.Completions.New(ctx, params)
  if err != nil {
   yield(nil, fmt.Errorf("ollama API error: %w", err))
   return
  }

  // Convert response back to ADK format
  if len(completion.Choices) == 0 {
   yield(nil, fmt.Errorf("no response from Ollama"))
   return
  }

  responseText := completion.Choices[0].Message.Content

  response := &model.LLMResponse{
   Content: &genai.Content{
    Parts: []*genai.Part{
     genai.NewPartFromText(responseText),
    },
    Role: "model",
   },
  }

  yield(response, nil)
 }
}

func main() {
 // Load environment variables from .env file
 if data, err := os.ReadFile(".env"); err == nil {
  for _, line := range strings.Split(string(data), "\n") {
   line = strings.TrimSpace(line)
   if strings.HasPrefix(line, "export ") {
    line = strings.TrimPrefix(line, "export ")
   }
   if parts := strings.SplitN(line, "=", 2); len(parts) == 2 {
    key := strings.TrimSpace(parts[0])
    value := strings.Trim(strings.TrimSpace(parts[1]), "\"")
    os.Setenv(key, value)
   }
  }
 }

 ctx := context.Background()

 // Set default model if not specified
 modelName := os.Getenv("OLLAMA_MODEL")
 if modelName == "" {
  modelName = "mistral:7b"
 }

 // Create Ollama model
 ollamaModel, err := NewOllamaModel(modelName, "http://localhost:11434/v1")
 if err != nil {
  log.Fatalf("Failed to create Ollama model: %v", err)
 }
 log.Printf("Using Ollama model: %s", modelName)

 // Get Google API key for search tool
 apiKey := os.Getenv("GOOGLE_API_KEY")
 var tools []tool.Tool
 if apiKey != "" {
  // Add Google Search tool if API key is available
  tools = append(tools, geminitool.GoogleSearch{})
  log.Printf("Google Search tool enabled")
 } else {
  log.Printf("No Google API key found, search tool not available")
 }

 // Create agent with Ollama model
 agentInstance, err := llmagent.New(llmagent.Config{
  Name:        "ollama_assistant",
  Model:       ollamaModel,
  Description: "A helpful assistant powered via Ollama and ADK.",
  Instruction: "You are a helpful assistant. When asked about current information like time, weather, or recent events, you MUST use the Google Search tool to find accurate, up-to-date information. Do not make up answers or say you cannot access real-time data - use the search tool instead.",
  Tools:       tools,
 })
 if err != nil {
  log.Fatalf("Failed to create agent: %v", err)
 }

 config := &launcher.Config{
  AgentLoader: agent.NewSingleLoader(agentInstance),
 }

 l := full.NewLauncher()

 // Execute with command line arguments (supports web api webui)
 if err = l.Execute(ctx, config, os.Args[1:]); err != nil {
  log.Fatalf("Run failed: %v\n\n%s", err, l.CommandLineSyntax())
 }
}

// Made with Bob

Conclusion

The Agent Development Kit (ADK) is a framework that prioritizes interaction over isolation, designed to transform static LLMs into dynamic collaborators that can reach out to the world via Google Search or run privately through Ollama. By moving the logic into a Web UI and leveraging Bob’s structural guidance, we can demonstrate that the ADK isn’t just a tool for building chatbots — it’s an engine for creating “agentic” workflows that are as at home in a browser as they are in a complex multi-agent ecosystem. Whether you’re relying on a model’s internal knowledge or empowering it with real-time data, the ADK provides the standardized bridge needed to turn AI potential into practical, integrated reality.

>>> Thanks for reading <<<
