You Built a Kotlin AI Agent. Should You Migrate to Koog?

You've built a working AI agent with Google GenAI. Now JetBrains releases Koog, a Kotlin-first AI framework. Should you migrate? Probably not — here's why.

See also: SQLite Embeddings for Local Agents for the RAG implementation referenced in this article.

The Problem

When building AI agents in Kotlin, you have two main options:

  1. Google GenAI SDK — Direct API wrapper, full control, manual tool loop
  2. JetBrains Koog — Higher-level framework, Kotlin DSL, built-in agent patterns

Both work. The question is: what do you gain and lose by switching?

What Koog Offers

Koog is JetBrains' attempt to bring Kotlin-native ergonomics to AI agent development. It wraps multiple LLM providers (Gemini, OpenAI, Anthropic) behind a unified DSL.

Cleaner Tool Definitions

With Google GenAI, tool definitions are verbose:

val currentTimeDeclaration = FunctionDeclaration.builder()
    .name("get_current_time")
    .description("Get the current date and time")
    .parameters(
        Schema.builder()
            .type(Type.Known.OBJECT)
            .properties(emptyMap<String, Schema>())
            .build()
    )
    .build()

Koog simplifies this with a DSL approach:

// Conceptual Koog-style DSL -- verify against current Koog docs
val formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm")

val getCurrentTime = tool(
    name = "get_current_time",
    description = "Get the current date and time"
) {
    val now = LocalDateTime.now()
    // TimeResult is a serializable result type you define
    TimeResult(
        datetime = now.toString(),
        formatted = now.format(formatter)
    )
}

Automatic Tool Execution Loop

With Google GenAI, you write the loop yourself:

while (response.functionCalls()?.isNotEmpty() == true) {
    val functionCalls = response.functionCalls()!!
    val responseParts = mutableListOf<Part>()

    for (fc in functionCalls) {
        val result = handleFunction(fc.name(), fc.args())
        responseParts.add(Part.fromFunctionResponse(fc.name(), result))
    }

    // Append the tool results to the history, then ask the model again
    history.add(Content.fromParts(*responseParts.toTypedArray()))
    response = client.models.generateContent(model, history, config)
}

Koog handles this internally — you just define tools and let the framework orchestrate.

Multi-Provider Support

Koog abstracts the LLM provider:

// Conceptual example — verify against current Koog docs
val agent = AIAgent(
    executor = simpleGeminiExecutor(apiKey),
    toolRegistry = buildToolRegistry { tool(getCurrentTime) }
)

Switching from Gemini to OpenAI means swapping the executor, not rewriting tools.

What You Lose

Fine-Grained Control

Your manual loop lets you inject logic between iterations. For example, adding RAG context before each user message:

val relevantChunks = documents.search(input)
val message = if (relevantChunks.isNotEmpty()) {
    val context = relevantChunks.joinToString("\n\n")
    "Context:\n$context\n\nUser: $input"
} else {
    input
}

With Koog, you'd need to work within its abstraction — possible, but less direct.
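The context-injection step above is easy to keep testable by factoring it into a pure function. A minimal sketch, where the `search` callback stands in for your embedding store's lookup (the name is hypothetical):

```kotlin
// Hypothetical helper: prepend retrieved context to the user message.
// `search` stands in for your embedding store's similarity lookup.
fun augmentWithContext(
    input: String,
    search: (String) -> List<String>
): String {
    val relevantChunks = search(input)
    return if (relevantChunks.isNotEmpty()) {
        "Context:\n${relevantChunks.joinToString("\n\n")}\n\nUser: $input"
    } else {
        input
    }
}

fun main() {
    val prompt = augmentWithContext("What is Koog?") {
        listOf("Koog is a Kotlin AI agent framework.")
    }
    println(prompt)
}
```

Because the helper owns no SDK state, you can unit-test the prompt shaping without touching the model at all.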

Maturity

The Google GenAI SDK is battle-tested. Koog is a recent release and still evolving, so expect documentation gaps and edge cases.

Simplicity

Your current stack is lean. Adding Koog introduces another layer:

Current:                With Koog:
┌──────────────────┐    ┌──────────────────┐
│    Your Code     │    │    Your Code     │
├──────────────────┤    ├──────────────────┤
│  Google GenAI    │    │      Koog        │
└──────────────────┘    ├──────────────────┤
                        │  Provider SDK    │
                        └──────────────────┘

More layers = more things that can break.

A Related Decision: Structured Output

While evaluating frameworks, another question often comes up: should you use structured output?

Structured output forces the LLM to respond in a specific JSON schema:

val config = GenerateContentConfig.builder()
    .responseMimeType("application/json")
    .responseSchema(Schema.builder()
        .type(Type.Known.OBJECT)
        .properties(mapOf(
            "intent" to Schema.builder().type(Type.Known.STRING).build(),
            "confidence" to Schema.builder().type(Type.Known.NUMBER).build()
        ))
        .build())
    .build()

Use it when:

  • Extracting specific fields from user input
  • Building APIs that need predictable JSON responses
  • Downstream processing requires consistent format

Skip it when:

  • Building conversational bots (e.g., a Telegram bot)
  • Free-form responses are the goal
  • You want the model to format naturally

For a chat agent, free-form text is usually what you want.
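When you do use structured output, the payoff is a typed result on the Kotlin side. A minimal sketch of mapping the JSON from the schema above into a data class, using a naive regex extraction purely for illustration (a real project would deserialize with kotlinx.serialization; `IntentResult` and `parseIntent` are hypothetical names):

```kotlin
// Hypothetical typed result matching the response schema above.
data class IntentResult(val intent: String, val confidence: Double)

// Naive extraction for illustration only -- a real project would use
// kotlinx.serialization's Json.decodeFromString instead.
fun parseIntent(json: String): IntentResult {
    val intent = Regex("\"intent\"\\s*:\\s*\"([^\"]*)\"")
        .find(json)!!.groupValues[1]
    val confidence = Regex("\"confidence\"\\s*:\\s*([0-9.]+)")
        .find(json)!!.groupValues[1].toDouble()
    return IntentResult(intent, confidence)
}

fun main() {
    val result = parseIntent("""{"intent": "book_flight", "confidence": 0.92}""")
    println(result) // IntentResult(intent=book_flight, confidence=0.92)
}
```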

When to Switch to Koog

Consider Koog if:

Scenario                                         Koog value
---------------------------------------------    ----------
Supporting multiple LLM providers                High
Adding 10+ tools with complex schemas            High
Using advanced patterns (subagents, planning)    Medium
Current code is becoming hard to maintain        Medium
Just want cleaner syntax                         Low

Don't switch just for syntactic sugar. Your 30-line tool loop isn't the bottleneck.

Practical Recommendations

Stay with Google GenAI if:

  • Your agent works and you have full control
  • You're only using Gemini
  • RAG or custom logic is tightly integrated into your flow

Consider Koog if:

  • You're starting a new project with multi-provider requirements
  • You need complex agent patterns out of the box
  • You value Kotlin DSL ergonomics over explicit control

Hybrid approach:

  • Keep Google GenAI as the backend
  • Extract your tool definitions into a more DSL-like pattern
  • Get some ergonomic wins without a full framework migration
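As a sketch of that hybrid approach, here is a hypothetical thin wrapper (all names below are made up, not a real library): a `ToolSpec` plus a small registry keeps tool dispatch in one place, and in a real project `ToolSpec` would also emit the SDK's `FunctionDeclaration` via the builder calls shown earlier:

```kotlin
// Hypothetical DSL-ish layer over the manual tool loop.
// In a real project, ToolSpec would also produce the SDK's
// FunctionDeclaration via FunctionDeclaration.builder().
data class ToolSpec(
    val name: String,
    val description: String,
    val handler: (Map<String, Any?>) -> Any?
)

fun tool(
    name: String,
    description: String,
    handler: (Map<String, Any?>) -> Any?
) = ToolSpec(name, description, handler)

// Registry keeps dispatch in one place, so the manual loop stays simple.
class ToolRegistry(private val tools: Map<String, ToolSpec>) {
    constructor(vararg specs: ToolSpec) : this(specs.associateBy { it.name })
    fun dispatch(name: String, args: Map<String, Any?>): Any? =
        tools.getValue(name).handler(args)
}

fun main() {
    val registry = ToolRegistry(
        tool("get_current_time", "Get the current date and time") {
            java.time.LocalDateTime.now().toString()
        }
    )
    println(registry.dispatch("get_current_time", emptyMap()))
}
```

The manual loop then shrinks to `registry.dispatch(fc.name(), fc.args())`, without adopting a full framework.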

The Bottom Line

JetBrains Koog trades control for convenience. If you've already built a working agent with Google GenAI and manual tool orchestration, migration isn't worth it unless you need multi-provider support or advanced agent patterns. Your explicit loop gives you flexibility that abstractions hide. For new projects with complex requirements, Koog is worth evaluating — but for a functional Telegram bot with RAG, keep what works.
