<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Mollie Pettit</title>
    <description>The latest articles on DEV Community by Mollie Pettit (@mollie_pettit_2fa2a4d10f7).</description>
    <link>https://dev.to/mollie_pettit_2fa2a4d10f7</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2816678%2F76ab3841-67c1-4183-b6dc-1dd2b3afff1d.png</url>
      <title>DEV Community: Mollie Pettit</title>
      <link>https://dev.to/mollie_pettit_2fa2a4d10f7</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/mollie_pettit_2fa2a4d10f7"/>
    <language>en</language>
    <item>
      <title>Building Connected Agents with MCP and A2A</title>
      <dc:creator>Mollie Pettit</dc:creator>
      <pubDate>Wed, 28 Jan 2026 20:07:34 +0000</pubDate>
      <link>https://dev.to/googleai/building-connected-agents-with-mcp-and-a2a-47b6</link>
      <guid>https://dev.to/googleai/building-connected-agents-with-mcp-and-a2a-47b6</guid>
      <description>&lt;p&gt;To build a production-ready agentic system, where intelligent agents can freely collaborate and act, we need standards and shared protocols for how agents talk to tools and how they talk to each other.&lt;/p&gt;

&lt;p&gt;In the Agent Production Patterns module in the &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b466009122&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Production-Ready AI with Google Cloud Learning Path&lt;/a&gt;, we focus on interoperability, exploring the standard patterns for connecting agents to data, tools and each other. Here are three hands-on labs to help you build these skills.&lt;/p&gt;

&lt;h2&gt;The Foundations of ADK, MCP, and A2A&lt;/h2&gt;

&lt;p&gt;This lab serves as your "Hello World" for the modern agent stack. You will build a simple, specialized agent that demonstrates how &lt;a href="https://google.github.io/adk-docs/" rel="noopener noreferrer"&gt;Agent Development Kit (ADK)&lt;/a&gt;, &lt;a href="https://cloud.google.com/discover/what-is-model-context-protocol?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b466009122&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Model Context Protocol (MCP)&lt;/a&gt;, and the &lt;a href="https://a2a-protocol.org/latest/" rel="noopener noreferrer"&gt;Agent2Agent (A2A) Protocol&lt;/a&gt; work together.&lt;/p&gt;
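To make the division of labor concrete: MCP standardizes how an agent reaches tools, while A2A lets agents describe themselves to one another through an "agent card" they publish for discovery. The sketch below builds such a card as plain JSON. It follows the general shape of the A2A spec, but the agent name, URL, and skill are invented placeholders, not the lab's actual currency agent.

```python
import json

def make_agent_card(name, url, skills):
    """Build a dict matching the general shape of an A2A agent card."""
    return {
        "name": name,
        "description": f"{name} exposes {len(skills)} skill(s) over A2A.",
        "url": url,  # endpoint where the agent accepts A2A requests
        "capabilities": {"streaming": False},
        "skills": [
            {"id": s, "name": s.replace("_", " ").title()} for s in skills
        ],
    }

# Hypothetical card mirroring the lab's currency-agent theme.
card = make_agent_card(
    "currency_agent",
    "http://localhost:10000",
    ["get_exchange_rate"],
)
print(json.dumps(card, indent=2))
```

A client agent fetches this document first, then decides which skill to invoke; that discovery step is what keeps the two agents decoupled.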


&lt;div class="crayons-card c-embed"&gt;

  

&lt;p&gt;&lt;strong&gt;Start the lab!&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Lab:&lt;/strong&gt; &lt;a href="https://codelabs.developers.google.com/codelabs/currency-agent?utm_campaign=CDR_0x6e136736_default_b466009122&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Getting Started with MCP, ADK and A2A&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Build a specialized currency agent that handles exchange rates, demonstrating the core building blocks: ADK, MCP, and A2A communication.&lt;/em&gt;&lt;/p&gt;


&lt;/div&gt;


&lt;h2&gt;Connecting to Data with MCP&lt;/h2&gt;

&lt;p&gt;Once you understand the basics, the next step is giving your agent access to knowledge. Whether you are analyzing massive datasets or searching operational records, the &lt;a href="https://github.com/googleapis/genai-toolbox" rel="noopener noreferrer"&gt;MCP Toolbox&lt;/a&gt; provides a standard way to connect your agent to your databases.&lt;/p&gt;

&lt;h3&gt;Expose a BigQuery database to an MCP Client&lt;/h3&gt;

&lt;p&gt;This lab shows you how to expose &lt;a href="https://github.com/googleapis/genai-toolbox" rel="noopener noreferrer"&gt;BigQuery tables&lt;/a&gt; to an &lt;a href="https://cloud.google.com/discover/what-is-model-context-protocol?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b466009122&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;MCP client&lt;/a&gt;.&lt;/p&gt;
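Under the hood, the Toolbox is configured declaratively: you describe data sources and the tools built on them in a YAML file, and the Toolbox server exposes those tools over MCP. The fragment below is a hedged sketch of that file's general shape only; the source name, project ID, tool name, and SQL statement are invented for illustration, and the lab walks you through the real configuration.

```yaml
# tools.yaml (illustrative): one BigQuery source, one SQL-backed tool.
sources:
  my-bigquery-source:
    kind: bigquery
    project: my-gcp-project        # placeholder project ID
tools:
  count_shakespeare_rows:
    kind: bigquery-sql
    source: my-bigquery-source
    description: Count rows in a public sample table.
    statement: SELECT COUNT(*) AS n FROM `bigquery-public-data.samples.shakespeare`
```

The agent never sees raw credentials or connection strings; it only sees the named tools, which is what makes this pattern safe to put in front of an LLM.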


&lt;div class="crayons-card c-embed"&gt;

  

&lt;p&gt;&lt;strong&gt;Start the lab!&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Lab:&lt;/strong&gt; &lt;a href="https://codelabs.developers.google.com/mcp-toolbox-bigquery-dataset?utm_campaign=CDR_0x6e136736_default_b466009122&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;MCP Toolbox for Databases: Making BigQuery datasets available to MCP clients&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Configure the MCP Toolbox to expose a public BigQuery dataset to an AI agent, enabling it to query and analyze datasets using natural language.&lt;/em&gt;&lt;/p&gt;
&lt;/div&gt;


&lt;h3&gt;Expose a Cloud SQL database to an MCP Client&lt;/h3&gt;

&lt;p&gt;If you need your agent to search for specific records—like flight schedules or hotel inventory—this lab demonstrates how to connect to a &lt;a href="https://cloud.google.com/sql?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b466009122&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Cloud SQL&lt;/a&gt; relational database.&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

  

&lt;p&gt;&lt;strong&gt;Start the lab!&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Lab:&lt;/strong&gt; &lt;a href="https://codelabs.developers.google.com/travel-agent-mcp-toolbox-adk?utm_campaign=CDR_0x6e136736_default_b466009122&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Build a Travel Agent using MCP Toolbox for Databases and Agent Development Kit (ADK)&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Build a full-stack agent that interacts with a Cloud SQL database to search for flights and hotels, demonstrating how to securely expose relational data to an AI agent.&lt;/em&gt;&lt;/p&gt;
&lt;/div&gt;


&lt;h2&gt;From Prototype to Production&lt;/h2&gt;

&lt;p&gt;By moving away from custom integrations and adopting standards like &lt;a href="https://cloud.google.com/discover/what-is-model-context-protocol?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b466009122&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;MCP&lt;/a&gt; and &lt;a href="https://a2a-protocol.org/latest/" rel="noopener noreferrer"&gt;A2A&lt;/a&gt;, you can build agents that are easier to maintain and scale. These labs provide the practical patterns you need to connect your agents to your data, your tools, and each other.&lt;/p&gt;

&lt;p&gt;These labs are part of the &lt;strong&gt;Agent Production Patterns&lt;/strong&gt; module in our official &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b466009122&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Production-Ready AI with Google Cloud Learning Path&lt;/a&gt;. Explore the full curriculum for more content that will help you bridge the gap from a promising prototype to a production-grade AI application.&lt;/p&gt;

&lt;p&gt;Share your progress using the hashtag &lt;strong&gt;#ProductionReadyAI&lt;/strong&gt;. Happy learning!&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>agents</category>
      <category>ai</category>
      <category>database</category>
    </item>
    <item>
      <title>Agent Factory Recap: Build AI Apps in Minutes with Google's Logan Kilpatrick</title>
      <dc:creator>Mollie Pettit</dc:creator>
      <pubDate>Mon, 26 Jan 2026 15:01:57 +0000</pubDate>
      <link>https://dev.to/googleai/agent-factory-recap-build-ai-apps-in-minutes-with-googles-logan-kilpatrick-139l</link>
      <guid>https://dev.to/googleai/agent-factory-recap-build-ai-apps-in-minutes-with-googles-logan-kilpatrick-139l</guid>
      <description>&lt;p&gt;In our latest episode of &lt;a href="https://www.youtube.com/playlist?list=PLIivdWyY5sqLXR1eSkiM5bE6pFlXC-OSs" rel="noopener noreferrer"&gt;The Agent Factory&lt;/a&gt;, we were thrilled to welcome Logan Kilpatrick from &lt;a href="https://deepmind.google/" rel="noopener noreferrer"&gt;Google Deep Mind&lt;/a&gt; for a &lt;a href="https://cloud.google.com/discover/what-is-vibe-coding?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;vibe coding&lt;/a&gt; session that showcased the tools shaping the future of AI development. Logan, who has had a front-row seat to the generative AI revolution at both OpenAI and now Google, gave us a hands-on tour of the &lt;a href="https://cloud.google.com/discover/what-is-vibe-coding?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;vibe coding&lt;/a&gt; experience in &lt;a href="https://aistudio.google.com/?utm_source=partner&amp;amp;utm_medium=partner&amp;amp;utm_campaign=FY25-Global-DEVpartnership-misc-AIS&amp;amp;utm_content=-&amp;amp;utm_term=-" rel="noopener noreferrer"&gt;Google AI Studio&lt;/a&gt;, showing just how fast you can go from an idea to a fully-functional AI application.&lt;/p&gt;

&lt;p&gt;This post guides you through the key ideas from our conversation. Use it to quickly recap topics or dive deeper into specific segments with links and timestamps.&lt;/p&gt;

&lt;h2&gt;The Build Experience in Google AI Studio: What Is It?&lt;/h2&gt;

&lt;p&gt;This episode focused on the &lt;a href="https://aistudio.google.com/apps" rel="noopener noreferrer"&gt;Build feature&lt;/a&gt; in &lt;a href="https://aistudio.google.com/?utm_source=partner&amp;amp;utm_medium=partner&amp;amp;utm_campaign=FY25-Global-DEVpartnership-misc-AIS&amp;amp;utm_content=-&amp;amp;utm_term=-" rel="noopener noreferrer"&gt;Google AI Studio&lt;/a&gt;, and Logan used the term &lt;a href="https://cloud.google.com/discover/what-is-vibe-coding?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;vibe coding&lt;/a&gt; to describe the experience of using it. This feature is designed to radically accelerate how developers create &lt;a href="https://cloud.google.com/discover/ai-applications?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;AI-powered apps&lt;/a&gt;. The core idea is to move from a natural-language prompt describing an app idea to a live, running application in under a minute. It handles the scaffolding, code generation, and even error correction, allowing you to focus on iterating and refining your idea.&lt;/p&gt;

&lt;h2&gt;The Factory Floor&lt;/h2&gt;

&lt;p&gt;The Factory Floor is our segment for getting hands-on. Here, we moved from high-level concepts to practical code with live demos.&lt;/p&gt;

&lt;h3&gt;Vibe Coding a Virtual Food Photographer&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/azvA2Bn2aXw?si=-D3tQT9R06KkrdzM&amp;amp;t=74" rel="noopener noreferrer"&gt;01:14&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;To kick things off, Logan hit the "I'm Feeling Lucky" button to generate a random app idea: a virtual food photographer for restaurant owners. The goal was to &lt;a href="https://aistudio.google.com/apps" rel="noopener noreferrer"&gt;build&lt;/a&gt; an app that could:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Accept a simple text-based menu.&lt;/li&gt;
&lt;li&gt;Generate realistic, high-end photography for each dish.&lt;/li&gt;
&lt;li&gt;Allow for style toggles like "rustic and dark" or "bright and modern."&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In about 90 seconds, we had a running web app. Logan fed it a quirky menu of pizza, blueberries, and popcorn, and the app generated images of each. We also saw how you can use AI-suggested features to iteratively adjust the generated photos (like adding butter to the popcorn) and add functionality (like changing the entire design aesthetic of the site).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyjyza3mlt943fz7tkksb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyjyza3mlt943fz7tkksb.png" width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;Grounding with Google Maps&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/azvA2Bn2aXw?si=I7-NVKOnceWz5uUe&amp;amp;t=625" rel="noopener noreferrer"&gt;10:25&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Next, Logan showcased one of the most exciting new features: &lt;a href="https://blog.google/technology/developers/grounding-google-maps-gemini-api/?utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;grounding with Google Maps&lt;/a&gt;. This allows the &lt;a href="https://ai.google.dev/gemini-api/docs/models?utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini models&lt;/a&gt; to connect directly to Google Maps to pull in rich, real-time place data without setting up a separate API. He demonstrated a starter template app that acted as a local guide, finding Italian restaurants in Chicago and describing the neighborhood.&lt;/p&gt;
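At the API level, Maps grounding is declared like any other tool in the request. The raw payload below is only a rough sketch: the prompt text is invented, and the `google_maps` tool key is our reading of the launch announcement, so check the Gemini API docs for the exact field name before relying on it.

```python
import json

# Illustrative generateContent request body with Maps grounding enabled.
# The "google_maps" key is an assumption based on the feature announcement.
request_body = {
    "contents": [
        {
            "role": "user",
            "parts": [{"text": "Find Italian restaurants near the Loop in Chicago."}],
        }
    ],
    # Grounding tools sit alongside any function tools; an empty object
    # enables the tool with its default settings.
    "tools": [{"google_maps": {}}],
}

print(json.dumps(request_body, indent=2))
```

The appeal is exactly what Logan highlighted: the model pulls live place data through one tool entry, with no separate Maps API integration to stand up.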

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkeye9ci83qp5nnajwaxy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkeye9ci83qp5nnajwaxy.png" width="716" height="486"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;Exploring the AI Studio Gallery&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/azvA2Bn2aXw?si=XTNVEE70JsZ-64Gx&amp;amp;t=895" rel="noopener noreferrer"&gt;14:55&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;For developers looking for inspiration, Logan walked us through the &lt;a href="https://aistudio.google.com/apps?source=showcase" rel="noopener noreferrer"&gt;AI Studio Gallery&lt;/a&gt;. This is a collection of pre-built, interactive examples that show what the models are capable of. Two highlights were:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Prompt DJ&lt;/strong&gt;: An app that uses the &lt;a href="https://cloud.google.com/vertex-ai/generative-ai/docs/music/generate-music?utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Lyria model&lt;/a&gt; to generate novel, real-time music based on a prompt.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Vibe Check&lt;/strong&gt;: A fun tool for visually testing and comparing how different models respond to the same prompt, which is becoming a popular way for developers to quickly evaluate a model's suitability for their use case.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvfxhh9kdg34wqf3qaiux.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvfxhh9kdg34wqf3qaiux.png" width="800" height="475"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;"Yap to App": A Conversational Pair Programmer&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/azvA2Bn2aXw?si=orrbTaI8Hul5UWMu&amp;amp;t=1191" rel="noopener noreferrer"&gt;19:51&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;For the final demo, Logan used a speech-to-text input to describe an app idea, which he called "Yap to App". His pitch: an AI pair programmer that could generate HTML code and then vocally coach him on how to improve it. After turning his spoken request into a written prompt, AI Studio built a voice-interactive app. The AI assistant generated a simple HTML card and then, when asked, provided verbal suggestions for improvement.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4x9aj006tcci0psetd3p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4x9aj006tcci0psetd3p.png" width="800" height="413"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;The Agent Industry Pulse&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/azvA2Bn2aXw?si=0OoxIssx045SIByw&amp;amp;t=1579" rel="noopener noreferrer"&gt;26:19&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In this segment, we covered some of the biggest recent launches in the &lt;a href="https://cloud.google.com/discover/what-are-ai-agents?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;agent&lt;/a&gt; ecosystem:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://ai.google.dev/gemini-api/docs/video?utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Veo 3.1&lt;/a&gt;&lt;/strong&gt;: Google's new state-of-the-art video generation model that builds on Veo 3, adding richer native audio and the ability to define the first and last frames of a video to generate seamless transitions. Smitha showcased a quick applet, built entirely in AI Studio, where users can upload a selfie of themselves and generate a video of their future career in AI using Veo 3.1.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Anthropic's Skills&lt;/strong&gt;: A new feature that allows you to give Claude specific tools (like an Excel script) that it can decide to use on its own to complete a task. We compared this to Gemini Gems, noting the difference in approach between creating a persona (Gem) and providing a tool (Skill).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Recent Google Launches&lt;/strong&gt;: Logan highlighted several other key releases, including the new &lt;a href="https://blog.google/technology/google-deepmind/gemini-computer-use-model/?utm_campaign=CDR_0x6e136736_awareness_b452057599&amp;amp;utm_medium=external&amp;amp;utm_source=youtube" rel="noopener noreferrer"&gt;Gemini computer use model&lt;/a&gt; for building agents that can navigate browsers, updates to the &lt;a href="https://ai.google.dev/gemini-api/docs/models?utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Flash and Flash-Lite models&lt;/a&gt;, and &lt;a href="https://blog.google/technology/developers/ai-studio-updates-more-control/?utm_campaign=CDR_0x6e136736_awareness_b452057599&amp;amp;utm_medium=external&amp;amp;utm_source=youtube" rel="noopener noreferrer"&gt;foundational upgrades to the AI Studio experience&lt;/a&gt; itself.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Logan Kilpatrick on the Future of AI Development&lt;/h2&gt;

&lt;p&gt;We also had the chance to discuss the bigger picture with Logan, from developer reactions to the future of models themselves.&lt;/p&gt;

&lt;h3&gt;Developer Reactions to Grounding with Google Maps&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/azvA2Bn2aXw?si=oxku5g-tCB3O1oWJ&amp;amp;t=1886" rel="noopener noreferrer"&gt;31:26&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;When asked which launch developers have been most excited about, Logan admitted he was surprised by the overwhelmingly positive reception for &lt;a href="https://blog.google/technology/developers/grounding-google-maps-gemini-api/?utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;grounding with Google Maps&lt;/a&gt;. He noted that the &lt;a href="https://mapsplatform.google.com/lp/maps-apis/" rel="noopener noreferrer"&gt;Maps API&lt;/a&gt; is one of the most widely used developer APIs in the world, and making it incredibly simple to integrate with Gemini unlocked key use cases for countless developers and startups.&lt;/p&gt;

&lt;h3&gt;From Models to Systems: The Next Frontier&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/azvA2Bn2aXw?si=icS7YYRGOJOHRwes&amp;amp;t=1946" rel="noopener noreferrer"&gt;32:26&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Looking ahead, Logan shared his excitement for the continued progress on code generation, which he sees as a fundamental accelerant for all other AI capabilities. He also pointed out a trend: models are evolving from simple tools into complex systems.&lt;/p&gt;

&lt;p&gt;Historically, a model was something that took a token in and produced a token out. Now, models are starting to look more like &lt;a href="https://cloud.google.com/discover/what-are-ai-agents?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;agents&lt;/a&gt; out of the box. They can take actions: spinning up code sandboxes, pinging APIs, and navigating browsers. "Folks have thought about agents and models as these decoupled concepts," Logan said, "and it feels like they're coming closer and closer together as the model capabilities keep improving."&lt;/p&gt;

&lt;h2&gt;Conclusion&lt;/h2&gt;

&lt;p&gt;This conversation was a powerful reminder of how quickly the barrier to entry for building sophisticated AI applications is falling. With tools like &lt;a href="https://aistudio.google.com/?utm_source=partner&amp;amp;utm_medium=partner&amp;amp;utm_campaign=FY25-Global-DEVpartnership-misc-AIS&amp;amp;utm_content=-&amp;amp;utm_term=-" rel="noopener noreferrer"&gt;Google AI Studio&lt;/a&gt;, the ability to turn a creative spark into a working prototype is no longer a matter of weeks or days, but minutes. The focus is shifting from complex scaffolding to rapid, creative iteration.&lt;/p&gt;

&lt;h2&gt;Your turn to build&lt;/h2&gt;

&lt;p&gt;We hope this episode inspired you to get hands-on. Head over to &lt;a href="https://aistudio.google.com/?utm_source=partner&amp;amp;utm_medium=partner&amp;amp;utm_campaign=FY25-Global-DEVpartnership-misc-AIS&amp;amp;utm_content=-&amp;amp;utm_term=-" rel="noopener noreferrer"&gt;Google AI Studio&lt;/a&gt; to try out &lt;a href="https://cloud.google.com/discover/what-is-vibe-coding?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058652&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;vibe coding&lt;/a&gt; for yourself, and don't forget to watch the full episode for all the details.&lt;/p&gt;

&lt;h2&gt;Connect with us&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Logan  → &lt;a href="https://www.linkedin.com/in/logankilpatrick/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;, &lt;a href="https://x.com/OfficialLoganK" rel="noopener noreferrer"&gt;X&lt;/a&gt;, &lt;a href="https://bsky.app/profile/officiallogank.bsky.social" rel="noopener noreferrer"&gt;BlueSky&lt;/a&gt;, &lt;a href="https://logank.ai/" rel="noopener noreferrer"&gt;blog&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Mollie  → &lt;a href="https://www.linkedin.com/in/molliepettit/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;, &lt;a href="https://x.com/MollzMP" rel="noopener noreferrer"&gt;X&lt;/a&gt;, &lt;a href="https://bsky.app/profile/mollzmp.bsky.social" rel="noopener noreferrer"&gt;BlueSky&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Smitha → &lt;a href="https://www.linkedin.com/in/smithakolan/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;, &lt;a href="https://www.youtube.com/@smithakolan" rel="noopener noreferrer"&gt;YouTube&lt;/a&gt;, &lt;a href="https://x.com/smithakolan" rel="noopener noreferrer"&gt;X&lt;/a&gt;, &lt;a href="https://www.instagram.com/girlknowsai/" rel="noopener noreferrer"&gt;Instagram&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>gemini</category>
      <category>veo</category>
      <category>vibecoding</category>
      <category>ai</category>
    </item>
    <item>
      <title>Production-Ready AI with Google Cloud Learning Path</title>
      <dc:creator>Mollie Pettit</dc:creator>
      <pubDate>Tue, 20 Jan 2026 14:51:04 +0000</pubDate>
      <link>https://dev.to/googleai/production-ready-ai-with-google-cloud-learning-path-313d</link>
      <guid>https://dev.to/googleai/production-ready-ai-with-google-cloud-learning-path-313d</guid>
      <description>&lt;p&gt;We're excited to launch the &lt;strong&gt;Production-Ready AI with Google Cloud Learning Path&lt;/strong&gt;, a free series designed to take your AI projects from prototype to production.&lt;/p&gt;

&lt;p&gt;This page is the central hub for the curriculum. We'll be updating it weekly with new modules from now through mid-December.&lt;/p&gt;

&lt;h2&gt;Why We Built This: Bridging the Prototype-to-Production Gap&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://cloud.google.com/use-cases/generative-ai?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Generative AI&lt;/a&gt; makes it easy to build an impressive prototype. But moving from that proof-of-concept to a secure, scalable, and observable production system is where many projects stall. This is the &lt;strong&gt;prototype-to-production&lt;/strong&gt; gap. It's the challenge of answering hard questions about &lt;a href="https://cloud.google.com/security/securing-ai?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;security&lt;/a&gt;, infrastructure, and &lt;a href="https://cloud.google.com/docs/observability?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;monitoring&lt;/a&gt; for a system that now includes a probabilistic model.&lt;/p&gt;

&lt;p&gt;It’s a journey we’ve been on with our own teams at &lt;a href="https://cloud.google.com/free?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Google Cloud&lt;/a&gt;. To solve for this ongoing challenge, we built a comprehensive internal playbook focused on production-grade best practices. After seeing the playbook's success, we knew we had to share it.&lt;/p&gt;

&lt;p&gt;This learning path is that playbook, adapted for all developers. The path's curriculum combines the power of &lt;a href="https://cloud.google.com/vertex-ai/generative-ai/docs/models?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini models&lt;/a&gt; with production-grade tools like &lt;a href="https://cloud.google.com/vertex-ai?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Vertex AI&lt;/a&gt;, &lt;a href="https://cloud.google.com/kubernetes-engine?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Google Kubernetes Engine (GKE)&lt;/a&gt;, and &lt;a href="https://cloud.google.com/run?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Cloud Run&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;We're excited to share this curriculum with the developer community. Share your progress and connect with others on the journey using the hashtag &lt;strong&gt;#ProductionReadyAI&lt;/strong&gt;. Happy learning! &lt;/p&gt;

&lt;h2&gt;The Curriculum&lt;/h2&gt;

&lt;h3&gt;Developing Apps That Use LLMs&lt;/h3&gt;

&lt;p&gt;Start with the fundamentals of building applications and interacting with models using the &lt;a href="https://docs.cloud.google.com/vertex-ai/docs/python-sdk/use-vertex-ai-python-sdk-ref?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Vertex AI SDK&lt;/a&gt;.&lt;/p&gt;
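A recurring pattern in this module is wiring the model to external tools through function declarations, which are ordinary JSON Schema documents the SDK passes to the model. The snippet below sketches one such declaration; the `get_weather` tool and its schema are hypothetical stand-ins for illustration, not part of the lab.

```python
# A function declaration of the kind passed to Gemini for tool use:
# plain JSON Schema describing the function's name, purpose, and parameters.
weather_tool = {
    "name": "get_weather",  # hypothetical tool name
    "description": "Look up current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}

def validate_declaration(decl):
    """Minimal sanity check that the required declaration fields exist."""
    assert decl["name"] and decl["description"]
    assert decl["parameters"]["type"] == "object"
    return True

print(validate_declaration(weather_tool))  # prints True
```

The model never executes anything itself: it returns a structured call naming the function and arguments, and your application code runs the function and feeds the result back, which is the "integrating real-time data via external tools" step in the lab's objective.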

&lt;p&gt;&lt;strong&gt;Summary:&lt;/strong&gt; &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/your-first-ai-application-is-easier-than-you-think?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Your First AI Application is Easier Than You Think&lt;/a&gt;&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

  

&lt;p&gt;&lt;strong&gt;Go to lab!&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/1-developing-apps-that-use-llms/developing-LLM-apps-with-Vertex-AI-SDK?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Developing LLM Apps with the Vertex AI SDK&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Build a Gemini chatbot with the Vertex AI SDK, integrating real-time data via external tools and refining outputs with prompt engineering.&lt;/em&gt;&lt;/p&gt;


&lt;/div&gt;


&lt;h3&gt;Deploying Open Models&lt;/h3&gt;

&lt;p&gt;Learn to serve and scale open source models efficiently by deploying them on production-grade platforms like &lt;a href="https://cloud.google.com/kubernetes-engine?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Google Kubernetes Engine (GKE)&lt;/a&gt;, &lt;a href="https://cloud.google.com/run?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Cloud Run&lt;/a&gt;, and &lt;a href="https://docs.cloud.google.com/vertex-ai/docs/general/deployment?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Vertex AI endpoints&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary:&lt;/strong&gt; &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/hands-on-with-gemma-3-on-google-cloud?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Hands-on with Gemma 3 on Google Cloud&lt;/a&gt;&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

  

&lt;p&gt;&lt;strong&gt;Go to labs!&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/devsite/codelabs/serve-gemma3-with-vllm-on-cloud-run?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Serving Gemma 3 with vLLM on Cloud Run&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Deploy Gemma 3 to Cloud Run using vLLM, leveraging GPU acceleration to expose an OpenAI-compatible API endpoint.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/5-deploying-agents/deploying-open-models-gke?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Deploying Open Models on GKE&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Prototype locally using Ollama, then deploy a scalable inference service to GKE Autopilot using standard Kubernetes manifests.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/div&gt;


&lt;h3&gt;
  
  
  Developing Agents
&lt;/h3&gt;

&lt;p&gt;Learn to build AI agents that can reason, plan, and use tools to accomplish complex tasks with the &lt;a href="https://github.com/google/adk-python" rel="noopener noreferrer"&gt;Agent Development Kit (ADK)&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary:&lt;/strong&gt; &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/build-your-first-adk-agent-workforce?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Build Your First ADK Agent Workforce&lt;/a&gt;&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

  

&lt;p&gt;&lt;strong&gt;Go to labs!&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/devsite/codelabs/build-agents-with-adk-foundation?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Building AI Agents with ADK:The Foundation&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Write the essential code to define and run a basic agent using ADK.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/devsite/codelabs/build-agents-with-adk-empowering-with-tools?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Building AI Agents with ADK:Empowering with Tools&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Learn how to use tools to interact with external applications and services.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/3-developing-agents/build-a-multi-agent-system-with-adk?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Build Multi-Agent Systems with ADK&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Orchestrate a complex, automated workflow with a team of specialized agents that work in sequence, in loops, and in parallel.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/div&gt;
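&lt;p&gt;The reason-plan-act pattern these labs teach can be pictured with a minimal, framework-free sketch. This is plain Python, &lt;em&gt;not&lt;/em&gt; the ADK API: the stub model, the toy &lt;code&gt;get_weather&lt;/code&gt; tool, and the loop cap are hypothetical stand-ins for what ADK and Gemini actually provide.&lt;/p&gt;

```python
# Minimal sketch of the agent loop that frameworks like ADK implement:
# the model reasons about a request, picks a tool, observes the result,
# and repeats until it can answer. The "model" here is a stub, not Gemini.

def get_weather(city: str) -> str:
    """A toy tool; a real agent would call an external API."""
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

def fake_model(request: str, observations: list[str]) -> dict:
    """Stand-in for an LLM: returns either a tool call or a final answer."""
    if not observations:
        return {"tool": "get_weather", "args": {"city": "Paris"}}
    return {"answer": f"Based on {observations[-1]!r}: pack sunglasses."}

def run_agent(request: str) -> str:
    observations: list[str] = []
    for _ in range(5):  # cap the loop so a confused model can't spin forever
        step = fake_model(request, observations)
        if "answer" in step:
            return step["answer"]
        result = TOOLS[step["tool"]](**step["args"])
        observations.append(result)  # feed the observation back to the model
    return "Gave up."

print(run_agent("What should I pack for Paris?"))
```

&lt;p&gt;ADK replaces every stub here: the model is Gemini, tools are registered Python functions, and the loop, state, and safety limits come from the framework.&lt;/p&gt;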


&lt;h3&gt;
  
  
  Securing AI Applications
&lt;/h3&gt;

&lt;p&gt;Master the essential practices for securing your infrastructure, data, and AI-powered endpoints in a production environment.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary:&lt;/strong&gt; &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/building-a-production-ready-ai-security-foundation?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Building a Production-Ready AI Security Foundation&lt;/a&gt;&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

&lt;p&gt;&lt;strong&gt;Go to labs!&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/4-securing-ai-applications/securing-ai-applications#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Securing AI Applications&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Learn to use Model Armor to secure Generative AI applications against prompt injection and data leakage.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/4-securing-ai-applications/securing-data-used-for-ai-applications#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Securing Data Used for AI Applications&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Build an automated pipeline to inspect, classify, and de-identify PII for use in AI development using Sensitive Data Protection.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/4-securing-ai-applications/securing-data-used-for-ai-applications#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Securing Infrastructure for AI Applications&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Secure an AI development environment by implementing network isolation, hardened compute instances, and protected storage.&lt;/em&gt;

&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;




&lt;h3&gt;
  
  
  Deploying Agents
&lt;/h3&gt;

&lt;p&gt;Take your agents to production by deploying them on scalable, managed platforms like Agent Engine, Cloud Run, and Google Kubernetes Engine (GKE).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary:&lt;/strong&gt; &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/from-code-to-cloud-three-labs-for-deploying-your-ai-agent/?e=48754805" rel="noopener noreferrer"&gt;From Code to Cloud: Three Labs for Deploying Your AI Agent&lt;/a&gt;&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

  

&lt;p&gt;&lt;strong&gt;Go to labs!&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/create-multi-agents-adk-a2a#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Deploy ADK Agents to Agent Engine&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Deploy a multi-agent system with Agent Engine without provisioning any infrastructure.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/5-deploying-agents/deploy-an-adk-agent-to-cloud-run#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Build and deploy an ADK agent on Cloud Run&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Containerize an ADK agent and deploy it to a secure public HTTPS endpoint on Cloud Run to experience the speed of a serverless workflow.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/5-deploying-agents/deploy-adk-agents-to-gke#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Deploy ADK agents to Google Kubernetes Engine (GKE)&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Deploy an ADK agent to a managed GKE cluster, configuring autoscaling and resource limits using standard manifests.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/div&gt;


&lt;h3&gt;
  
  
  Evaluation
&lt;/h3&gt;

&lt;p&gt;Discover how to rigorously evaluate the performance of your LLM outputs, agents, and RAG systems to ensure quality and reliability.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary:&lt;/strong&gt; &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/master-generative-ai-evaluation-from-single-prompts-to-complex-agents?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Master Generative AI Evaluation: From Single Prompts to Complex Agents&lt;/a&gt;&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

&lt;p&gt;&lt;strong&gt;Go to labs!&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/6-ai-evaluation/evaluating-single-llm-outputs-with-vertex-ai-evaluation#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Evaluating Single LLM Outputs With Vertex AI Evaluation&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Run your first automated evaluation job to measure the quality of raw LLM responses.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/6-ai-evaluation/evaluate-rag-systems-with-vertex-ai#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Evaluate RAG Systems with Vertex AI&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Assess the performance of a RAG pipeline by measuring both retrieval quality and generation accuracy.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/adk-eval/instructions?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Evaluating Agents with ADK&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Capture agent execution traces and apply automated evaluation to ensure your agent makes the right decisions.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/bigquery-adk-eval#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Build and Evaluate BigQuery Agents using ADK and GenAI Eval Service&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Build a data-driven agent and test its ability to generate accurate SQL and data insights.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/div&gt;


&lt;h3&gt;
  
  
  Agent Production Patterns
&lt;/h3&gt;

&lt;p&gt;Learn how to enhance your agent's capabilities with agentic RAG, Model Context Protocol (MCP) tools, and the Agent2Agent (A2A) protocol.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary:&lt;/strong&gt; &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/building-connected-agents-with-mcp-and-a2a?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Building Connected Agents with MCP and A2A&lt;/a&gt;&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

  

&lt;p&gt;&lt;strong&gt;Go to labs!&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/currency-agent?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Getting Started with MCP, ADK and A2A&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Build a specialized currency ADK agent that leverages MCP and A2A communication.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/mcp-toolbox-bigquery-dataset?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;MCP Toolbox for Databases: Making BigQuery datasets available to MCP clients&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Configure the MCP Toolbox to expose a public BigQuery dataset to an AI agent, enabling it to query and analyze datasets using natural language.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/travel-agent-mcp-toolbox-adk?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Build a Travel Agent using MCP Toolbox for Databases and Agent Development Kit (ADK)&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Build a full-stack agent that interacts with a Cloud SQL database, demonstrating how to securely expose relational data to an AI agent.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/div&gt;


&lt;h3&gt;
  
  
  From Data Foundations to Advanced RAG
&lt;/h3&gt;

&lt;p&gt;Learn to build high-performance RAG systems by mastering the full data lifecycle, from generating vector embeddings within your database to implementing advanced retrieval patterns.&lt;/p&gt;

&lt;h4&gt;
  
  
  The AI Data Layer Foundation
&lt;/h4&gt;

&lt;p&gt;Discover how to transform your operational databases into AI-ready vector stores. Learn to generate embeddings, perform semantic search, and leverage built-in AI functions directly within AlloyDB and Cloud SQL.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary:&lt;/strong&gt; Coming Soon!&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

&lt;p&gt;&lt;strong&gt;Go to labs!&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/alloydb-ai-embedding#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Getting started with Vector Embeddings with AlloyDB AI&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Generate embeddings and perform semantic search within AlloyDB to ground Gen AI responses.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/csql-pg-ai-embedding#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Getting started with Vector Embeddings in Cloud SQL for PostgreSQL&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Generate embeddings and perform semantic search within Cloud SQL for PostgreSQL to ground Gen AI responses.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/csql-mysql-ai-embedding#0?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Getting started with Vector Embeddings in Cloud SQL for MySQL&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Generate embeddings and perform semantic search within Cloud SQL for MySQL to ground Gen AI responses.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/alloydb-ai-mm-embeddings?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Multimodal Embeddings in AlloyDB&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Perform semantic search on both text and images using multimodal embeddings in AlloyDB.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/alloydb-ai-operators?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;AlloyDB AI Operators and Reranking&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Filter data with natural language and apply reranking to improve search precision using AlloyDB AI functions.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/alloydb-ai-nl-sql?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Generate SQL using AlloyDB AI natural language&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Generate reliable SQL queries from natural language by configuring custom schema context in AlloyDB.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/div&gt;
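&lt;p&gt;Conceptually, the semantic search these labs build inside the database reduces to nearest-neighbor lookup over embedding vectors. A minimal pure-Python sketch, where the tiny hand-made vectors are hypothetical stand-ins for the model-generated embeddings AlloyDB and Cloud SQL store and index:&lt;/p&gt;

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity, the comparison behind pgvector-style distance operators."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Hand-made 3-d "embeddings"; a real system stores model-generated vectors
# in a table column and indexes them for approximate nearest-neighbor search.
documents = {
    "refund policy":   [0.9, 0.1, 0.0],
    "shipping times":  [0.1, 0.9, 0.1],
    "store locations": [0.0, 0.2, 0.9],
}

def semantic_search(query_vec: list[float], k: int = 1) -> list[str]:
    """Return the k documents whose embeddings are closest to the query's."""
    ranked = sorted(documents, key=lambda d: cosine(query_vec, documents[d]), reverse=True)
    return ranked[:k]

# A query vector near the "refund policy" embedding retrieves that document.
print(semantic_search([0.8, 0.2, 0.1]))  # ['refund policy']
```

&lt;p&gt;In the labs, the embedding generation and the similarity ranking both happen in SQL, so the application never leaves the database.&lt;/p&gt;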


&lt;h4&gt;
  
  
  Building the RAG application
&lt;/h4&gt;

&lt;p&gt;Move beyond basic vector search. Learn to architect robust RAG applications by using advanced retrieval strategies and leveraging tools.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary:&lt;/strong&gt; Coming soon!&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

&lt;p&gt;&lt;strong&gt;Go to labs!&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/7-advanced-agent-capabilities/building-agents-with-retrieval-augmented-generation?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Intro to Agentic RAG&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Build a multi-tool agent that combines retrieval from unstructured documents and structured data to answer reasoning-heavy questions.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/genai-db-retrieval-app?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;AlloyDB Agentic RAG Application with MCP Toolbox&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Deploy the MCP Toolbox to connect an interactive AI application to an AlloyDB database for grounded responses.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/8-advanced-rag-methods/advanced-rag-methods?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Advanced RAG Techniques&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Implement and evaluate advanced strategies (Chunking, Reranking, Query Transformation) to improve the precision and recall of your RAG pipeline.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/div&gt;


&lt;h3&gt;
  
  
  Fine-Tuning
&lt;/h3&gt;

&lt;p&gt;Go beyond prompting and learn how to fine-tune both open and proprietary models to improve performance on specific tasks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary:&lt;/strong&gt; Coming soon!&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

&lt;p&gt;&lt;strong&gt;Go to labs!&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/9-ai-finetuning/finetune-gemini-vertex-ai?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Fine-tune Gemini on Vertex AI&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Perform supervised fine-tuning on Gemini 2.5 Flash to adapt it for specific tasks like summarization.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/9-ai-finetuning/finetune-open-source-models-gke?utm_campaign=CDR_0x6e136736_default_b475324991&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Fine-tune Open Source LLMs on Google Cloud&lt;/a&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Build a production-grade fine-tuning pipeline for Llama 2 on GKE using LoRA and PyTorch.&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;/div&gt;


&lt;p&gt;We're committed to making this a living, evolving resource and will be adding to it over time.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Do you feel something is missing? &lt;a href="https://docs.google.com/forms/d/e/1FAIpQLSd2doeNpKTDFWKjPE5xJDi232TBZF-RaA3gcxfHY4QGEma1sg/viewform?usp=dialog" rel="noopener noreferrer"&gt;Tell us here!&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>vertexai</category>
      <category>agents</category>
      <category>security</category>
    </item>
    <item>
      <title>Agent Factory Recap: Deep Dive into Gemini CLI with Taylor Mullen</title>
      <dc:creator>Mollie Pettit</dc:creator>
      <pubDate>Wed, 17 Dec 2025 00:18:20 +0000</pubDate>
      <link>https://dev.to/googleai/agent-factory-recap-deep-dive-into-gemini-cli-with-taylor-mullen-51nf</link>
      <guid>https://dev.to/googleai/agent-factory-recap-deep-dive-into-gemini-cli-with-taylor-mullen-51nf</guid>
      <description>&lt;p&gt;In the latest episode of the &lt;a href="https://youtube.com/playlist?list=PLIivdWyY5sqLXR1eSkiM5bE6pFlXC-OSs&amp;amp;feature=shared" rel="noopener noreferrer"&gt;Agent Factory podcast&lt;/a&gt;, Amit Miraj and I took a deep dive into the &lt;a href="https://cloud.google.com/gemini/docs/codeassist/gemini-cli?utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini CLI&lt;/a&gt;. We were joined by the creator of the Gemini CLI, Taylor Mullen, who shared the origin story, design philosophy, and future roadmap.&lt;/p&gt;

&lt;p&gt;This post guides you through the key ideas from our conversation. Use it to quickly recap topics or dive deeper into specific segments with links and timestamps.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is the Gemini CLI?
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://cloud.google.com/gemini/docs/codeassist/gemini-cli?utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini CLI&lt;/a&gt; is a powerful, conversational &lt;a href="https://cloud.google.com/discover/what-are-ai-agents?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;AI agent&lt;/a&gt; that lives directly in your command line. It's designed to be a versatile assistant that can help you with your everyday workflows. Unlike a simple chatbot, the Gemini CLI is &lt;em&gt;&lt;a href="https://cloud.google.com/discover/what-is-agentic-ai?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;agentic&lt;/a&gt;&lt;/em&gt;. This means it can reason, choose tools, and execute multi-step plans to accomplish a goal, all while keeping you informed. It's &lt;a href="https://github.com/google-gemini/gemini-cli" rel="noopener noreferrer"&gt;open-source&lt;/a&gt;, &lt;a href="https://github.com/google-gemini/gemini-cli/blob/main/docs/extension.md" rel="noopener noreferrer"&gt;extensible&lt;/a&gt;, and as we learned from its creator, Taylor Mullen, it's built with a deep understanding of the developer workflow.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7ryqs8txt8utpg1x1io.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft7ryqs8txt8utpg1x1io.png" width="800" height="386"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Factory Floor
&lt;/h2&gt;

&lt;p&gt;The Factory Floor is our segment for getting hands-on. This week, we put the &lt;a href="https://cloud.google.com/gemini/docs/codeassist/gemini-cli?utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini CLI&lt;/a&gt; to the test with two real-world demos designed to tackle everyday challenges.&lt;/p&gt;

&lt;h3&gt;
  
  
  Onboarding to a New Codebase with Gemini CLI
&lt;/h3&gt;

&lt;p&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=B-UpaBm8tiw&amp;amp;t=142s" rel="noopener noreferrer"&gt;02:22&lt;/a&gt;]&lt;/p&gt;

&lt;p&gt;I kicked off the demos by tackling a problem I think every developer has faced: &lt;strong&gt;getting up to speed with a new codebase&lt;/strong&gt;. This included using the Gemini CLI to complete the following tasks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Clone the &lt;a href="https://github.com/google/adk-python" rel="noopener noreferrer"&gt;Python ADK repository&lt;/a&gt; from GitHub with a simple, natural language command&lt;/li&gt;
&lt;li&gt;Generate a complete project overview&lt;/li&gt;
&lt;li&gt;Utilize the &lt;a href="https://github.com/a-bonus/google-docs-mcp" rel="noopener noreferrer"&gt;google-docs-mcp&lt;/a&gt; (Model Context Protocol) server to save the generated summary directly to Google Docs&lt;/li&gt;
&lt;li&gt;Analyze the project's contribution history to understand its culture and workflow&lt;/li&gt;
&lt;li&gt;Find the best first task for a new contributor&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Read more on &lt;a href="https://cloud.google.com/discover/what-is-model-context-protocol?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;MCP servers and how they work here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnmyz4tarncsstu7cm13h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnmyz4tarncsstu7cm13h.png" width="800" height="429"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Supercharging Your Research with Gemini CLI
&lt;/h2&gt;

&lt;p&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=B-UpaBm8tiw&amp;amp;t=698s" rel="noopener noreferrer"&gt;11:38&lt;/a&gt;]&lt;/p&gt;

&lt;p&gt;For the next demo, Amit tackled a problem close to his heart: &lt;strong&gt;keeping up with the flood of new AI research papers&lt;/strong&gt;. He showed how he built a personal research assistant using the Gemini CLI to complete the following tasks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Process a directory of research papers and generate an interactive webpage explainer for each one&lt;/li&gt;
&lt;li&gt;Iterate on a simple prompt, creating a detailed, multi-part prompt to generate a better output&lt;/li&gt;
&lt;li&gt;Save the complex prompt as a reusable &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/gemini-cli-custom-slash-commands?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;custom slash command&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Amit also shared &lt;a href="https://github.com/amitkmaraj/gemini-cli-custom-slash-commands?tab=readme-ov-file" rel="noopener noreferrer"&gt;gemini-cli-custom-slash-commands&lt;/a&gt;, a repository he put together that contains 10 practical workflow commands for Gemini CLI.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6juf6b8arg15kcr80n9y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6juf6b8arg15kcr80n9y.png" width="800" height="362"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Agent Industry Pulse
&lt;/h2&gt;

&lt;p&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=B-UpaBm8tiw&amp;amp;t=1046s" rel="noopener noreferrer"&gt;17:26&lt;/a&gt;]&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://blog.langchain.com/langchain-langchain-1-0-alpha-releases/" rel="noopener noreferrer"&gt;Lang Chain 1.0 Alpha&lt;/a&gt;&lt;/strong&gt;: The popular library is refocusing around a new unified agent abstraction built on Lang Graph, bringing production-grade features like state management and human-in-the-loop to the forefront.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://ai.google.dev/gemma/docs/embeddinggemma?utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Embedding Gemma&lt;/a&gt;&lt;/strong&gt;: Google's new family of open, lightweight embedding models that allow developers to build on-device, privacy-centric applications.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://docs.google.com/document/d/1rsaK53T3Lg5KoGwvf8ukOUvbELRtH-V0LnOIFDxBryE/preview?tab=t.0#heading=h.pxcur8v2qagu" rel="noopener noreferrer"&gt;Agentic Design Patterns for Building AI Applications&lt;/a&gt;&lt;/strong&gt;: A new book that aims to create a repository of educational resources around agent patterns.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://developers.googleblog.com/en/introducing-gemma-3-270m?utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemma 3 270M&lt;/a&gt;&lt;/strong&gt;: A tiny 270 million parameter model from Google, perfect for creating small, efficient sub-agents for simple tasks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://developers.googleblog.com/en/gemini-cli-is-now-integrated-into-zed/?utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini CLI in Zed Code Editor&lt;/a&gt;&lt;/strong&gt;: The Gemini CLI is now integrated directly into the Z Code editor, allowing developers to explain code and generate snippets without switching contexts.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/ashishpatel26/500-AI-Agents-Projects" rel="noopener noreferrer"&gt;500 AI Agents Projects&lt;/a&gt;&lt;/strong&gt;: A GitHub repository with a categorized list of open-source agent projects.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/afshinea/stanford-cme-295-transformers-large-language-models" rel="noopener noreferrer"&gt;Transformers &amp;amp; LLMs cheatsheet&lt;/a&gt;&lt;/strong&gt;: A resource from a team at Stanford that provides a great starting point or refresher on the fundamentals of LLMs.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Taylor Mullen on the Gemini CLI
&lt;/h2&gt;

&lt;p&gt;The highlight of the episode for me was our in-depth conversation with Taylor Mullen. He gave us a fascinating look behind the curtain at the philosophy and future of the Gemini CLI. Here are some of the key questions we covered:&lt;/p&gt;

&lt;h3&gt;
  
  
  Gemini CLI Origin Story
&lt;/h3&gt;

&lt;p&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=B-UpaBm8tiw&amp;amp;t=1260s" rel="noopener noreferrer"&gt;21:00&lt;/a&gt;]&lt;/p&gt;

&lt;p&gt;Taylor explained that the project started about a year and a half ago as an experiment with &lt;a href="https://cloud.google.com/discover/what-is-a-multi-agent-system?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;multi-agent systems&lt;/a&gt;. While the CLI version was the most compelling, the technology at the time made it too slow and expensive. He said it was "one of those things... that was almost a little bit too early." Later, seeing the developer community embrace other AI-powered CLIs proved the demand was there. This inspired him to revisit the idea, leading to a week-long sprint where he built the first prototype.&lt;/p&gt;

&lt;h3&gt;
  
  
  On Building in the Open
&lt;/h3&gt;

&lt;p&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=B-UpaBm8tiw&amp;amp;t=1454s" rel="noopener noreferrer"&gt;24:14&lt;/a&gt;]&lt;/p&gt;

&lt;p&gt;For Taylor, the number one reason for making the Gemini CLI &lt;a href="https://github.com/google-gemini/gemini-cli" rel="noopener noreferrer"&gt;open source&lt;/a&gt; was &lt;strong&gt;trust and security&lt;/strong&gt;. He emphasized, "We want people to see exactly how it operates... so they can have trust." He also spoke passionately about the open-source community, calling it the "number one thing that's on my mind." He sees the community as an essential partner that helps keep the project grounded, secure, and building the right things for users.&lt;/p&gt;

&lt;h3&gt;
  
  
  Using Gemini CLI to Build Itself
&lt;/h3&gt;

&lt;p&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=B-UpaBm8tiw&amp;amp;t=1625s" rel="noopener noreferrer"&gt;27:05&lt;/a&gt;]&lt;/p&gt;

&lt;p&gt;When I asked Taylor how his team manages to ship an incredible &lt;strong&gt;100 to 150 features, bug fixes, and enhancements every single week&lt;/strong&gt;, his answer was simple: they use the Gemini CLI to build itself.&lt;/p&gt;

&lt;p&gt;Taylor shared a story about the CLI's first self-built feature: its own Markdown renderer. He explained that while using AI to 10x productivity is becoming easier, the real challenge is achieving 100x. For his team, this means using the agent to parallelize workflows and optimize human time. It's not about the AI getting everything right on the first try, but about creating a tight feedback loop for human-AI collaboration at scale.&lt;/p&gt;

&lt;h3&gt;
  
  
  Gemini CLI under the hood: "&lt;strong&gt;Do what a person would do&lt;/strong&gt;"
&lt;/h3&gt;

&lt;p&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=B-UpaBm8tiw&amp;amp;t=1858s" rel="noopener noreferrer"&gt;30:58&lt;/a&gt;]&lt;/p&gt;

&lt;p&gt;The guiding principle, Taylor said, is to "do what a person would do and don't take shortcuts." He revealed that, surprisingly, the Gemini CLI doesn't use embeddings for code search. Instead, it performs an agentic search, using tools like &lt;code&gt;grep&lt;/code&gt;, reading files, and finding references. This mimics the exact process a human developer would use to understand a codebase. The goal is to ground the AI in the most relevant, real-time context possible to produce the best results.&lt;/p&gt;
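The grep-then-read loop described above can be sketched in a few lines of plain Python. This is a hypothetical illustration of the idea, not the Gemini CLI's actual implementation; all function and variable names here are invented for the sketch.

```python
import re
from pathlib import Path

def agentic_search(root: str, symbol: str, max_files: int = 5):
    """Illustrative grep-style agentic search: scan source files for a
    symbol, then "read" the surrounding lines of each hit -- roughly the
    search-then-read process a human developer would follow."""
    hits = []
    pattern = re.compile(re.escape(symbol))
    for path in Path(root).rglob("*.py"):
        lines = path.read_text(errors="ignore").splitlines()
        for i, line in enumerate(lines):
            if pattern.search(line):
                # Keep a little surrounding context, as a developer
                # would when skimming a file for references.
                context = lines[max(0, i - 2): i + 3]
                hits.append((str(path), i + 1, context))
                break  # one hit per file is enough to rank candidates
        if len(hits) >= max_files:
            break
    return hits
```

The point of the pattern is that the results are grounded in the real, current state of the codebase rather than in a possibly stale embedding index.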

&lt;h3&gt;
  
  
  On Self-Healing and Creative Problem-Solving
&lt;/h3&gt;

&lt;p&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=B-UpaBm8tiw&amp;amp;t=1994s" rel="noopener noreferrer"&gt;33:14&lt;/a&gt;]&lt;/p&gt;

&lt;p&gt;We also discussed the agent's ability to "self-heal." When the CLI hits a wall, it doesn't just fail; it proposes a new plan. Taylor gave an example where the agent, after being asked for a shareable link, created a GitHub repo and used GitHub Pages to deploy the content.&lt;/p&gt;

&lt;h2&gt;
  
  
  What's Next: The Future is Extensible
&lt;/h2&gt;

&lt;p&gt;Timestamp: &lt;a href="https://www.youtube.com/watch?v=B-UpaBm8tiw&amp;amp;t=2119s" rel="noopener noreferrer"&gt;[35:19]&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The team is doubling down on &lt;strong&gt;extensibility&lt;/strong&gt;. The vision is to create a rich ecosystem where anyone can build, share, and install extensions. These are not just new tools, but curated bundles of commands, instructions, and &lt;a href="https://cloud.google.com/discover/what-is-model-context-protocol?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b442903523&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;MCP servers&lt;/a&gt; tailored for specific workflows. He's excited to see what the community will build and how users will customize the Gemini CLI for their unique needs.&lt;/p&gt;

&lt;h2&gt;
  
  
  Your turn to build
&lt;/h2&gt;

&lt;p&gt;The best way to understand the power of the Gemini CLI is to try it yourself. &lt;/p&gt;

&lt;p&gt;Check out the &lt;a href="https://github.com/google-gemini/gemini-cli/tree/main" rel="noopener noreferrer"&gt;Gemini CLI on GitHub&lt;/a&gt; to see community projects, file an issue, or contribute. Additionally, don't miss the full conversation: &lt;a href="https://www.youtube.com/watch?v=B-UpaBm8tiw" rel="noopener noreferrer"&gt;watch the episode&lt;/a&gt; and &lt;a href="https://www.youtube.com/playlist?list=PLIivdWyY5sqLXR1eSkiM5bE6pFlXC-OSs" rel="noopener noreferrer"&gt;subscribe to The Agent Factory&lt;/a&gt; to join us for our next deep dive.&lt;/p&gt;

&lt;h2&gt;
  
  
  Connect with us
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Taylor  → &lt;a href="https://www.linkedin.com/posts/ntaylormullen_geminicli-gemini-ai-activity-7371608566941540352-N6Dx?utm_source=social_share_send&amp;amp;utm_medium=member_desktop_web&amp;amp;rcm=ACoAAAWA_P8BnBYVs2I0ABuh8FjqB4glU_yGbtc" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;, &lt;a href="https://bsky.app/profile/ntaylormullen.bsky.social/post/3lyitypurnk2y" rel="noopener noreferrer"&gt;BlueSky&lt;/a&gt;, &lt;a href="https://x.com/ntaylormullen/status/1965842571961446524" rel="noopener noreferrer"&gt;X&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Amit  → &lt;a href="https://www.linkedin.com/in/amit-maraj/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;, &lt;a href="https://x.com/agenticamit" rel="noopener noreferrer"&gt;X&lt;/a&gt;, &lt;a href="http://tiktok.com/@agenticamit" rel="noopener noreferrer"&gt;TikTok&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Mollie  → &lt;a href="https://www.linkedin.com/in/molliepettit/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;, &lt;a href="https://bsky.app/profile/mollzmp.bsky.social" rel="noopener noreferrer"&gt;BlueSky&lt;/a&gt;, &lt;a href="https://x.com/MollzMP" rel="noopener noreferrer"&gt;X&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>gemini</category>
      <category>agents</category>
      <category>cli</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Build Your First ADK Agent Workforce</title>
      <dc:creator>Mollie Pettit</dc:creator>
      <pubDate>Fri, 12 Dec 2025 21:01:49 +0000</pubDate>
      <link>https://dev.to/googleai/build-your-first-adk-agent-workforce-5b55</link>
      <guid>https://dev.to/googleai/build-your-first-adk-agent-workforce-5b55</guid>
      <description>&lt;p&gt;The world of &lt;a href="https://cloud.google.com/use-cases/generative-ai?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Generative AI&lt;/a&gt; is evolving rapidly, and &lt;a href="https://cloud.google.com/discover/what-are-ai-agents?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;AI Agents&lt;/a&gt; are at the forefront of this change. An AI agent is a software system designed to act on your behalf. They show reasoning, planning, and memory and have a level of autonomy to make decisions, learn, and adapt.&lt;/p&gt;

&lt;p&gt;At its core, an AI agent uses a &lt;a href="https://cloud.google.com/ai/llms?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;large language model (LLM)&lt;/a&gt;, like &lt;a href="https://ai.google.dev/gemini-api/docs/models?utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini&lt;/a&gt;, to understand and reason. This allows it to process information from various sources, create a plan, and execute a series of tasks to reach a predefined objective. This is the key difference between a simple prompt-and-response and an agent: the ability to act on a multi-step plan.&lt;/p&gt;

&lt;p&gt;The great news is that you can now easily build your own AI agents, even without deep expertise, thanks to &lt;a href="https://google.github.io/adk-docs/" rel="noopener noreferrer"&gt;Agent Development Kit (ADK)&lt;/a&gt;. ADK is an open-source &lt;a href="https://google.github.io/adk-docs/get-started/python/" rel="noopener noreferrer"&gt;Python&lt;/a&gt; and &lt;a href="https://google.github.io/adk-docs/get-started/java/" rel="noopener noreferrer"&gt;Java&lt;/a&gt; framework by Google designed to simplify agent creation.&lt;/p&gt;

&lt;p&gt;To guide you, this post introduces three hands-on labs that cover the core patterns of agent development:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Building your first autonomous agent&lt;/li&gt;
&lt;li&gt;Empowering that agent with tools to interact with external services&lt;/li&gt;
&lt;li&gt;Orchestrating a multi-agent system where specialized agents collaborate&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Build your first agent
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://codelabs.developers.google.com/devsite/codelabs/build-agents-with-adk-foundation?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;This lab&lt;/a&gt; introduces the foundational principles of &lt;a href="https://github.com/google/adk-python" rel="noopener noreferrer"&gt;ADK&lt;/a&gt; by guiding you through the construction of a &lt;strong&gt;personal assistant agent&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;You will write the code for the agent itself and will interact directly with the agent's core reasoning engine, powered by &lt;a href="https://ai.google.dev/gemini-api/docs/models?utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini&lt;/a&gt;, to see how it responds to a simple request. This lab is focused on building the fundamental scaffolding of every agent you'll create.&lt;/p&gt;
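The foundational shape of an agent definition can be sketched in plain Python. This is a deliberately simplified stand-in for illustration, not the real ADK API; the class, field names, and model id below are all assumptions made for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """Minimal stand-in for an agent definition: a name, a model id,
    and an instruction that shapes the agent's behavior."""
    name: str
    model: str
    instruction: str
    tools: list = field(default_factory=list)

    def run(self, request: str) -> str:
        # A real agent would send the instruction plus the request to
        # the LLM; here we just echo the wiring to show the structure.
        return f"[{self.name}/{self.model}] {self.instruction}: {request}"

assistant = Agent(
    name="personal_assistant",
    model="gemini-2.0-flash",  # illustrative model id
    instruction="Answer the user's question helpfully",
)
```

The lab walks you through the real version of this scaffolding, where the model genuinely powers the reasoning behind each response.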


&lt;div class="crayons-card c-embed"&gt;

  &lt;br&gt;
&lt;strong&gt;Go to the lab!&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Lab:&lt;/strong&gt; &lt;a href="https://codelabs.developers.google.com/devsite/codelabs/build-agents-with-adk-foundation?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Building AI Agents with ADK: The Foundation&lt;/a&gt;

&lt;p&gt;&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Write the essential code to define and run a basic agent, learning the core structure of ADK.&lt;/em&gt;&lt;br&gt;

&lt;/p&gt;
&lt;/div&gt;


&lt;h2&gt;
  
  
  Empower your agent with tools
&lt;/h2&gt;

&lt;p&gt;An agent without custom tools can only rely on its built-in knowledge. To make it more powerful for your specific use case, you can give it access to specialized tools. In this lab, you will learn three different ways to add tools:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Build a Custom Tool&lt;/strong&gt;: Write a currency exchange tool from scratch.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integrate a Built-in Tool&lt;/strong&gt;: Add ADK's pre-built &lt;a href="https://google.github.io/adk-docs/tools/built-in-tools/#google-search" rel="noopener noreferrer"&gt;Google Search tool&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Leverage a Third-Party Tool&lt;/strong&gt;: Import and use a &lt;a href="https://python.langchain.com/docs/integrations/tools/wikipedia/" rel="noopener noreferrer"&gt;Wikipedia tool&lt;/a&gt; from the LangChain library.&lt;/li&gt;
&lt;/ul&gt;
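For the first of these, a custom tool is typically just a Python function whose name, signature, and docstring tell the model when to call it. A minimal sketch of a currency exchange tool follows; the function name, the return shape, and the hard-coded rate table are all hypothetical (a real tool would call a currency API).

```python
def get_exchange_rate(base: str, target: str) -> dict:
    """Illustrative custom tool: look up the exchange rate between two
    currencies. The rate table is hard-coded sample data for the sketch."""
    rates = {("USD", "EUR"): 0.92, ("EUR", "USD"): 1.09}  # sample data only
    rate = rates.get((base.upper(), target.upper()))
    if rate is None:
        # Returning a structured error lets the model explain the
        # failure to the user instead of crashing the conversation.
        return {"status": "error", "message": f"No rate for {base}->{target}"}
    return {"status": "ok", "base": base.upper(),
            "target": target.upper(), "rate": rate}
```

Returning a dict with an explicit status field is a common convention: it gives the model something concrete to reason over in both the success and failure cases.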


&lt;div class="crayons-card c-embed"&gt;

  &lt;br&gt;
&lt;strong&gt;Go to the lab!&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Lab:&lt;/strong&gt; &lt;a href="https://codelabs.developers.google.com/devsite/codelabs/build-agents-with-adk-empowering-with-tools?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Empower ADK Agents with Tools&lt;/a&gt;

&lt;p&gt;&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Learn how to make your agent truly useful by giving it tools to interact with external applications and services.&lt;/em&gt;&lt;br&gt;

&lt;/p&gt;
&lt;/div&gt;


&lt;h2&gt;
  
  
  Build a Team of Specialized Agents
&lt;/h2&gt;

&lt;p&gt;When a task is too complex for a single agent, you can build out a multi-agent team. This lab goes deep into the power of &lt;a href="https://cloud.google.com/discover/what-is-a-multi-agent-system?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;multi-agent systems&lt;/a&gt; by having you build a "movie pitch development team" that can research, write, and analyze a film concept.&lt;/p&gt;

&lt;p&gt;You will learn how to use &lt;a href="https://google.github.io/adk-docs/agents/workflow-agents/" rel="noopener noreferrer"&gt;ADK's Workflow Agents&lt;/a&gt; to control the flow of work automatically, without needing user input at every step. You'll also learn how to use the session state to pass information between the agents.&lt;/p&gt;
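The idea of a sequential workflow passing information through session state can be sketched in plain Python. This is a structural stand-in, not the real ADK Workflow Agents API; every function name and state key below is invented for the illustration.

```python
def researcher(state: dict) -> dict:
    # Each sub-agent reads from and writes to a shared session state;
    # that state is how information flows down the pipeline.
    state["research"] = f"facts about {state['topic']}"
    return state

def writer(state: dict) -> dict:
    state["draft"] = f"pitch built on {state['research']}"
    return state

def critic(state: dict) -> dict:
    state["review"] = f"notes on: {state['draft']}"
    return state

def run_sequential(pipeline, state: dict) -> dict:
    """Stand-in for a sequential workflow agent: run each specialized
    sub-agent in order over the shared state, with no user input
    between steps."""
    for step in pipeline:
        state = step(state)
    return state

result = run_sequential([researcher, writer, critic],
                        {"topic": "a heist movie"})
```

ADK's workflow agents generalize this pattern beyond simple sequences, adding loop and parallel orchestration over the same shared-state mechanism.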


&lt;div class="crayons-card c-embed"&gt;

  &lt;br&gt;
&lt;strong&gt;Go to the lab!&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Lab:&lt;/strong&gt; &lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/3-developing-agents/build-a-multi-agent-system-with-adk?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Build Multi-Agent Systems with ADK&lt;/a&gt;

&lt;p&gt;&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Orchestrate a complex, automated workflow with a team of specialized agents that work in sequence, in loops, and in parallel.&lt;/em&gt;&lt;br&gt;

&lt;/p&gt;
&lt;/div&gt;


&lt;h2&gt;
  
  
  Summary: Build Your First AI Teammate Today
&lt;/h2&gt;

&lt;p&gt;Ready to build your first &lt;a href="https://cloud.google.com/discover/what-are-ai-agents?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;AI agent&lt;/a&gt;? Dive into the codelabs from this post:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://codelabs.developers.google.com/devsite/codelabs/build-agents-with-adk-foundation?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Building AI Agents with ADK: The Foundation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://codelabs.developers.google.com/devsite/codelabs/build-agents-with-adk-empowering-with-tools?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Empower ADK Agents with Tools&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/3-developing-agents/build-a-multi-agent-system-with-adk?hl=en#0&amp;amp;utm_campaign=CDR_0x6e136736_default_b456239055&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Build Multi-Agent Systems with ADK&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Share your progress and connect with others on the journey using the hashtag &lt;strong&gt;#ProductionReadyAI&lt;/strong&gt;. Happy learning! &lt;/p&gt;

&lt;p&gt;These labs are part of the &lt;strong&gt;Developing Agents&lt;/strong&gt; module in our official &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/production-ready-ai-with-google-cloud-learning-path" rel="noopener noreferrer"&gt;Production-Ready AI with Google Cloud&lt;/a&gt; program. Explore the full curriculum for more content that will help you bridge the gap from a promising prototype to a production-grade AI application.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>agents</category>
      <category>gemini</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Agent Factory Recap: Keith Ballinger on AI, The Future of Development, and Vibe Coding</title>
      <dc:creator>Mollie Pettit</dc:creator>
      <pubDate>Fri, 12 Dec 2025 16:34:02 +0000</pubDate>
      <link>https://dev.to/googleai/agent-factory-recap-keith-ballinger-on-ai-the-future-of-development-and-vibe-coding-4ldf</link>
      <guid>https://dev.to/googleai/agent-factory-recap-keith-ballinger-on-ai-the-future-of-development-and-vibe-coding-4ldf</guid>
      <description>&lt;p&gt;In Episode #6 of the &lt;a href="https://youtube.com/playlist?list=PLIivdWyY5sqLXR1eSkiM5bE6pFlXC-OSs&amp;amp;feature=shared" rel="noopener noreferrer"&gt;Agent Factory podcast&lt;/a&gt;, Vlad Kolesnikov and I were joined by Keith Ballinger, VP and General Manager at &lt;a href="https://cloud.google.com/docs?_gl=1*1y8pxfz*_up*MQ..&amp;amp;gclid=CjwKCAjwq9rFBhAIEiwAGVAZP6h6c24JCTSWVSCEzWD_LsDgDhDB5b-3wvFqN5axIltM8EMHqo1AChoCPTMQAvD_BwE&amp;amp;gclsrc=aw.ds&amp;amp;utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Google Cloud&lt;/a&gt;, for a deep dive into the transformative future of software development with AI. We explore how AI agents are reshaping the developer's role and boosting team productivity.&lt;/p&gt;

&lt;p&gt;This post guides you through the key ideas from our conversation. Use it to quickly recap topics or dive deeper into specific segments with links and timestamps.&lt;/p&gt;

&lt;h2&gt;
  
  
  Keith Ballinger on the Future of Development
&lt;/h2&gt;

&lt;h3&gt;
  
  
  What is "Impossible Computing"?
&lt;/h3&gt;

&lt;p&gt;Timestamp: [&lt;a href="https://youtu.be/I-xS4nw-HfU?feature=shared&amp;amp;t=111" rel="noopener noreferrer"&gt;01:51&lt;/a&gt;]&lt;/p&gt;

&lt;p&gt;Keith Ballinger kicked off the discussion by redefining a term from his personal blog: "Impossible Computing." For him, it isn't about solving intractable computer science problems, but rather about making difficult, time-consuming tasks feel seamless and even joyful for developers.&lt;/p&gt;

&lt;p&gt;He described it as a way to “make things that were impossible or at least really, really hard for people, much more easy and almost seamless for them.”&lt;/p&gt;

&lt;h3&gt;
  
  
  AI's Impact on Team Productivity
&lt;/h3&gt;

&lt;p&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=I-xS4nw-HfU&amp;amp;t=303s" rel="noopener noreferrer"&gt;05:03&lt;/a&gt;]&lt;/p&gt;

&lt;p&gt;The conversation explored how AI's impact extends beyond the individual developer to the entire team. Keith shared a practical example of how his teams at Google Cloud use the &lt;a href="https://cloud.google.com/gemini/docs/codeassist/gemini-cli?utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini CLI&lt;/a&gt; as a GitHub action to triage issues and conduct initial reviews on pull requests, showcasing Google Cloud's commitment to AI-powered software development.&lt;/p&gt;

&lt;p&gt;This approach delegates the more mundane tasks, freeing up human developers to focus on higher-level logic and quality control, ultimately breaking down bottlenecks and increasing the team's overall velocity.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Developer's New Role: A Conductor of an Orchestra
&lt;/h3&gt;

&lt;p&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=I-xS4nw-HfU&amp;amp;t=597s" rel="noopener noreferrer"&gt;09:57&lt;/a&gt;]&lt;/p&gt;

&lt;p&gt;A central theme of the conversation was the evolution of the developer's role. Keith suggested that developers are shifting from being coders who write every line to becoming "conductors of an orchestra."&lt;/p&gt;

&lt;p&gt;In this view, the developer holds the high-level vision (the system architecture) and directs a symphony of AI agents to execute the specific tasks. This paradigm elevates the developer's most critical skills to high-level design and context engineering—the craft of providing AI agents with the right information at the right time for efficient software development.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftdjy65tjw8stmrcaft5l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftdjy65tjw8stmrcaft5l.png" width="800" height="483"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Factory Floor
&lt;/h2&gt;

&lt;p&gt;The Factory Floor is our segment for getting hands-on. Here, we moved from high-level concepts to practical code with live demos from both Keith and Vlad.&lt;/p&gt;

&lt;h2&gt;
  
  
  Showcase: The &lt;code&gt;Terminus&lt;/code&gt; and &lt;code&gt;Aether&lt;/code&gt; Projects
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;Timestamps: [&lt;a href="https://www.youtube.com/watch?v=I-xS4nw-HfU&amp;amp;t=1262s" rel="noopener noreferrer"&gt;21:02&lt;/a&gt;] and [&lt;a href="https://www.youtube.com/watch?v=I-xS4nw-HfU&amp;amp;t=1697s" rel="noopener noreferrer"&gt;28:17&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Keith shared two of his open-source projects as tangible "demonstration[s] of &lt;a href="https://cloud.google.com/discover/what-is-vibe-coding?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;vibe coding&lt;/a&gt; intended to provide a trustworthy and verifiable example that developers and researchers can use."&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://github.com/GoogleCloudPlatform/terminus" rel="noopener noreferrer"&gt;Terminus&lt;/a&gt;: A Go framework for building web applications with a terminal-style interface. Keith described it as a fun, exploratory project he built over a weekend. &lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/GoogleCloudPlatform/aether" rel="noopener noreferrer"&gt;Aether&lt;/a&gt;: An experimental programming language designed specifically for &lt;a href="https://ai.google.dev/gemini-api/docs/models?utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;LLMs&lt;/a&gt;. He explained his thesis that a language built for machines—highly explicit and deterministic—could allow an AI to generate code more effectively than with languages designed for human readability. &lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Vibe Coding a Markdown App
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=I-xS4nw-HfU&amp;amp;t=1901s" rel="noopener noreferrer"&gt;31:41&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Keith provided a live demonstration of his &lt;a href="https://cloud.google.com/discover/what-is-vibe-coding?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;vibe coding&lt;/a&gt; workflow. Starting with a single plain-English sentence, he guided the &lt;a href="https://cloud.google.com/gemini/docs/codeassist/gemini-cli?utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini CLI&lt;/a&gt; to generate a user guide, technical architecture, and a step-by-step plan. This resulted in a functional command-line markdown viewer in under 15 minutes. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F60dq1q4bdvxw0wk6xgxk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F60dq1q4bdvxw0wk6xgxk.png" width="800" height="401"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating a Video with AI
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=I-xS4nw-HfU&amp;amp;t=2833s" rel="noopener noreferrer"&gt;47:13&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Vlad showcased a different application of AI agents: creative, multi-modal content generation. He walked through a workflow that used &lt;a href="https://ai.google.dev/gemini-api/docs/image-generation?utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini 2.5 Flash Image&lt;/a&gt; (also known as &lt;strong&gt;Nano Banana&lt;/strong&gt;) and other AI tools to generate a viral video of a capybara for a fictional ad campaign. This demonstrated how to go from a simple prompt to a final video.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsf9ah2k3dhkzyl8ovahd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsf9ah2k3dhkzyl8ovahd.png" width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Inspired by Vlad's Demo?
&lt;/h3&gt;

&lt;p&gt;If you're interested in learning how to build and deploy creative AI projects like the one Vlad showcased, the &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/accelerate-ai-with-cloud-run-sign-up-now-for-a-developer-workshop-near-you?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Accelerate AI with Cloud Run&lt;/a&gt; program is designed to help you take your ideas from prototype to production with workshops, labs, and more.&lt;/p&gt;

&lt;p&gt;Take the next step and &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/accelerate-ai-with-cloud-run-sign-up-now-for-a-developer-workshop-near-you?e=48754805&amp;amp;utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;register here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgnrzvlsg79htajkwct4x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgnrzvlsg79htajkwct4x.png" width="800" height="406"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Developer Q&amp;amp;A
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=I-xS4nw-HfU&amp;amp;t=3397s" rel="noopener noreferrer"&gt;56:37&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;We wrapped up the episode by putting some great questions from the developer community to Keith.&lt;/p&gt;

&lt;h3&gt;
  
  
  On Infrastructure Bottlenecks for AI Workloads
&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/I-xS4nw-HfU?feature=shared&amp;amp;t=3401" rel="noopener noreferrer"&gt;56:42&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Keith explained that he sees a role for both major cloud providers and a "healthy ecosystem of startups" in solving challenges like GPU utilization. He was especially excited about how serverless platforms are adapting, highlighting that &lt;a href="https://cloud.google.com/run?utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Cloud Run&lt;/a&gt; now &lt;a href="https://cloud.google.com/run/docs/configuring/services/gpu?utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;offers GPUs&lt;/a&gt; to provide the same fast, elastic experience for AI workloads that developers expect for other applications.&lt;/p&gt;

&lt;h3&gt;
  
  
  On Multi-Cloud and Edge Deployment for AI
&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/I-xS4nw-HfU?feature=shared&amp;amp;t=3496" rel="noopener noreferrer"&gt;58:16&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In response to a question about a high-level service for orchestrating AI across &lt;a href="https://cloud.google.com/architecture/deployment-archetypes/multicloud?utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;multi-cloud&lt;/a&gt; and &lt;a href="https://cloud.google.com/architecture/hybrid-multicloud-patterns-and-practices/edge-hybrid-pattern?utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;edge&lt;/a&gt; deployment, Keith was candid that he hasn't heard a lot of direct customer demand for it yet. However, he called the area "untapped" and invited the question-asker to email him, showing a clear interest in exploring its potential.&lt;/p&gt;

&lt;h3&gt;
  
  
  On AI in Regulated Industries (Finance, Legal)
&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/I-xS4nw-HfU?feature=shared&amp;amp;t=3553" rel="noopener noreferrer"&gt;59:13&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Calling it the "billion-dollar question," Keith emphasized that as AI accelerates development, the need for a mature and robust compliance regime becomes even more critical. His key advice was that the human review piece is more important than ever. He suggested the best place to start is using AI to assist and validate human work. For example, brainstorm a legal brief with an AI rather than having the AI write the final brief for court submission.&lt;/p&gt;

&lt;p&gt;We concluded this conversation feeling inspired by the future of AI in software development and the potential of AI Agents and the &lt;a href="https://cloud.google.com/gemini/docs/codeassist/gemini-cli?utm_campaign=CDR_0x6e136736_awareness_b442602643&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini CLI&lt;/a&gt;. For the complete conversation, &lt;a href="https://www.youtube.com/watch?v=I-xS4nw-HfU" rel="noopener noreferrer"&gt;listen to our full episode with Keith Ballinger now&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Connect with us
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Keith  → &lt;a href="https://github.com/keithballinger/keithballinger/discussions" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Vlad  → &lt;a href="https://www.linkedin.com/in/vkolesnikov/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Mollie  → &lt;a href="https://www.linkedin.com/in/molliepettit/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>ai</category>
      <category>vibecoding</category>
      <category>productivity</category>
      <category>gemini</category>
    </item>
    <item>
      <title>Your First AI Application is Easier Than You Think</title>
      <dc:creator>Mollie Pettit</dc:creator>
      <pubDate>Wed, 03 Dec 2025 21:02:47 +0000</pubDate>
      <link>https://dev.to/googleai/your-first-ai-application-is-easier-than-you-think-4dbc</link>
      <guid>https://dev.to/googleai/your-first-ai-application-is-easier-than-you-think-4dbc</guid>
      <description>&lt;p&gt;If you're a developer, you've seen &lt;a href="https://cloud.google.com/use-cases/generative-ai?e=48754805&amp;amp;hl&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;generative AI&lt;/a&gt; everywhere. It can feel like a complex world of &lt;a href="https://cloud.google.com/discover/what-is-an-ai-model?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;models&lt;/a&gt; and advanced concepts. It can be difficult to know where to actually start.&lt;/p&gt;

&lt;p&gt;The good news is that building your first AI-powered application is more accessible than you might imagine. You don't need to be an AI expert to get started. This post introduces a new codelab designed to bridge this gap and provide you with a first step. We'll guide you through the entire process of building a functional, interactive travel chatbot using Google's &lt;a href="https://docs.cloud.google.com/vertex-ai/generative-ai/docs/models?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini model&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/1-developing-apps-that-use-llms/developing-LLM-apps-with-Vertex-AI-SDK#0?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Dive into the codelab and build your first AI application today!&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting the Stage: Your First Project
&lt;/h2&gt;

&lt;p&gt;In this &lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/1-developing-apps-that-use-llms/developing-LLM-apps-with-Vertex-AI-SDK#0?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;codelab&lt;/a&gt;, you'll step into the role of a developer at a travel company tasked with building a new chat application. You'll start with a basic web application frontend and, step-by-step, you will bring it to life by connecting it to the power of &lt;a href="https://cloud.google.com/use-cases/generative-ai?e=48754805&amp;amp;hl&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;generative AI&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;By the end, you will have built a travel assistant that can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Answer questions about travel destinations.&lt;/li&gt;
&lt;li&gt;Provide personalized recommendations.&lt;/li&gt;
&lt;li&gt;Fetch real-time data, like the weather, to give genuinely helpful advice.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The process is broken down into a few key stages.&lt;/p&gt;

&lt;h2&gt;
  
  
  Making the First Connection
&lt;/h2&gt;

&lt;p&gt;Before you can do anything fancy, you need to get your application talking to the &lt;a href="https://cloud.google.com/discover/what-is-an-ai-model?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;AI model&lt;/a&gt;. An easy way to do this is with the &lt;a href="https://cloud.google.com/vertex-ai/docs/python-sdk/use-vertex-ai-python-sdk?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Vertex AI SDK&lt;/a&gt;, a complete library for interacting with the &lt;a href="https://cloud.google.com/vertex-ai?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Vertex AI platform&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwetd2fsxejcypffmdtgo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwetd2fsxejcypffmdtgo.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While the &lt;a href="https://cloud.google.com/vertex-ai/docs/python-sdk/use-vertex-ai-python-sdk?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Vertex AI SDK&lt;/a&gt; is a powerful tool for the full machine learning lifecycle, this lab focuses on one of its most-used tools: building &lt;a href="https://cloud.google.com/use-cases/generative-ai?e=48754805&amp;amp;hl&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;generative AI&lt;/a&gt; applications. This part of the &lt;a href="https://cloud.google.com/vertex-ai/docs/python-sdk/use-vertex-ai-python-sdk?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Vertex AI SDK&lt;/a&gt; acts as the bridge between your application and the &lt;a href="https://docs.cloud.google.com/vertex-ai/generative-ai/docs/models?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini model&lt;/a&gt;. Without it, you would have to manually handle all the complex wiring yourself—writing code to manage authentication, formatting intricate API requests, and parsing the responses. The Vertex AI SDK handles all that complexity for you so you can focus on what you actually want to do: send a message and get a response. &lt;/p&gt;
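&lt;p&gt;As a rough sketch of what that first connection might look like in Python (the project ID, location, and model name below are placeholders, and the exact SDK surface in the codelab may differ):&lt;/p&gt;

```python
# Hypothetical sketch of a first Gemini call with the Vertex AI SDK.
# Assumes the google-cloud-aiplatform package is installed and you
# have authenticated (e.g. via Application Default Credentials).
# The project ID, location, and model name are placeholders.

def ask_gemini(prompt, project_id="your-project-id",
               location="us-central1", model_name="gemini-2.0-flash"):
    # Imported lazily so the sketch is readable without the SDK installed.
    import vertexai
    from vertexai.generative_models import GenerativeModel

    vertexai.init(project=project_id, location=location)
    model = GenerativeModel(model_name)
    response = model.generate_content(prompt)
    return response.text
```

&lt;p&gt;A call like &lt;code&gt;ask_gemini("Suggest three spring destinations in Portugal")&lt;/code&gt; would return the model's text reply; the SDK handles authentication, request formatting, and response parsing behind that one &lt;code&gt;generate_content&lt;/code&gt; call.&lt;/p&gt;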

&lt;p&gt;In &lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/1-developing-apps-that-use-llms/developing-LLM-apps-with-Vertex-AI-SDK#0?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;this codelab&lt;/a&gt;, you'll see just how simple it is.&lt;/p&gt;

&lt;h2&gt;
  
  
  Giving your AI purpose with system instructions
&lt;/h2&gt;

&lt;p&gt;Once your app is connected, you'll notice the AI's responses won't be tailored to your purposes yet. One way you can make it more useful for your specific use case is by giving it &lt;a href="https://cloud.google.com/vertex-ai/generative-ai/docs/learn/prompts/system-instructions?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;system instructions&lt;/a&gt;.&lt;/p&gt;
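&lt;p&gt;In code, a system instruction is typically attached once when the model is created rather than repeated in every prompt. A minimal sketch (the persona text and model name here are illustrative, not the codelab's exact values):&lt;/p&gt;

```python
# Hypothetical sketch: attaching a system instruction so every reply
# stays in the travel-assistant persona. The persona text and model
# name are illustrative placeholders.

TRAVEL_ASSISTANT_PERSONA = (
    "You are a sophisticated and friendly travel assistant. "
    "Answer questions about destinations, suggest itineraries, "
    "and politely decline requests unrelated to travel."
)

def make_travel_model(model_name="gemini-2.0-flash"):
    # Imported lazily so the sketch runs without the SDK installed.
    from vertexai.generative_models import GenerativeModel
    return GenerativeModel(model_name,
                           system_instruction=TRAVEL_ASSISTANT_PERSONA)
```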

&lt;h2&gt;
  
  
  Hot Tip: Use Google AI Studio to Create Your System Instructions
&lt;/h2&gt;

&lt;p&gt;A great way to develop your &lt;a href="https://cloud.google.com/vertex-ai/generative-ai/docs/learn/prompts/system-instructions?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;system instructions&lt;/a&gt; is to leverage &lt;a href="https://docs.cloud.google.com/vertex-ai/generative-ai/docs/models?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini&lt;/a&gt; as a creative partner to draft them for you. For example, you could ask &lt;a href="https://docs.cloud.google.com/vertex-ai/generative-ai/docs/models?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini&lt;/a&gt; in &lt;a href="https://aistudio.google.com/welcome?utm_source=google&amp;amp;utm_medium=cpc&amp;amp;utm_campaign=FY25-global-DR-gsem-BKWS-1710442&amp;amp;utm_content=text-ad-none-any-DEV_c-CRE_726176659625-ADGP_Hybrid%20%7C%20BKWS%20-%20EXA%20%7C%20Txt-AI%20Studio-AI%20Studio-KWID_1276544732073-kwd-1276544732073&amp;amp;utm_term=KW_google%20ai%20studio-ST_google%20ai%20studio&amp;amp;gclsrc=aw.ds&amp;amp;gad_source=1&amp;amp;gad_campaignid=21030196240&amp;amp;gbraid=0AAAAACn9t67aW4ac1BtMao1eA_HHRwMKa&amp;amp;gclid=EAIaIQobChMIp_fMhfXHkAMVPgytBh2JBgb2EAAYASAAEgJ3V_D_BwE" rel="noopener noreferrer"&gt;Google AI Studio&lt;/a&gt; to generate a thorough set of instructions for a "sophisticated and friendly travel assistant."&lt;/p&gt;

&lt;p&gt;Once you have a draft, you can immediately test it, also in &lt;a href="https://aistudio.google.com/welcome?utm_source=google&amp;amp;utm_medium=cpc&amp;amp;utm_campaign=FY25-global-DR-gsem-BKWS-1710442&amp;amp;utm_content=text-ad-none-any-DEV_c-CRE_726176659625-ADGP_Hybrid%20%7C%20BKWS%20-%20EXA%20%7C%20Txt-AI%20Studio-AI%20Studio-KWID_1276544732073-kwd-1276544732073&amp;amp;utm_term=KW_google%20ai%20studio-ST_google%20ai%20studio&amp;amp;gclsrc=aw.ds&amp;amp;gad_source=1&amp;amp;gad_campaignid=21030196240&amp;amp;gbraid=0AAAAACn9t67aW4ac1BtMao1eA_HHRwMKa&amp;amp;gclid=EAIaIQobChMIp_fMhfXHkAMVPgytBh2JBgb2EAAYASAAEgJ3V_D_BwE" rel="noopener noreferrer"&gt;Google AI Studio&lt;/a&gt;. Start a new chat and in the panel to the right, set the &lt;a href="https://docs.cloud.google.com/vertex-ai/generative-ai/docs/models?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini model&lt;/a&gt; to the one you're using in your app and paste the text into the system instruction field. This allows you to quickly interact with the model and see how it behaves with your instructions, all without writing any code. When you're happy with the results, you can copy the final version directly into your application.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4pwjlxu4r41phawqip0p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4pwjlxu4r41phawqip0p.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Connecting Your AI to the Real World
&lt;/h2&gt;

&lt;p&gt;This is where you break the model out of its knowledge silo and connect it to live data. By default, an &lt;a href="https://cloud.google.com/discover/what-is-an-ai-model?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;AI model&lt;/a&gt;'s knowledge is limited to the data it was trained on; it doesn't know today's weather. However, you can provide &lt;a href="https://docs.cloud.google.com/vertex-ai/generative-ai/docs/models?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini&lt;/a&gt; with access to external knowledge using a powerful feature called &lt;a href="https://docs.cloud.google.com/vertex-ai/generative-ai/docs/multimodal/function-calling?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;function calling&lt;/a&gt;! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F94bqsdkvxzws94f0e60t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F94bqsdkvxzws94f0e60t.png"&gt;&lt;/a&gt;&lt;br&gt;
The concept is simple: you write a basic Python function (like one to check the weather) and describe that tool to the model. When a user asks about the weather, the model can ask your application to run your function and use the live result in its answer. This allows the model to answer questions far beyond its training data, making it a much more powerful and useful assistant with access to up-to-the-minute information. &lt;/p&gt;
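&lt;p&gt;One way to picture the two halves involved (the function and its JSON-schema-style description below are illustrative; in the SDK this description is wrapped in its own function-declaration and tool types):&lt;/p&gt;

```python
# Hypothetical sketch of the two halves of function calling:
# a plain Python function, plus a JSON-schema-style description the
# model reads to decide when to request a call.

def get_weather(city):
    # Illustrative stub; a real version would call a weather API.
    return {"city": city, "condition": "sunny", "temp_c": 21}

GET_WEATHER_DECLARATION = {
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string",
                     "description": "City name, e.g. 'Lisbon'"},
        },
        "required": ["city"],
    },
}

# When the model responds with a function call such as
# {"name": "get_weather", "args": {"city": "Lisbon"}}, your app runs
# get_weather(**args) and sends the result back for the final answer.
```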

&lt;p&gt;In &lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/1-developing-apps-that-use-llms/developing-LLM-apps-with-Vertex-AI-SDK#0?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;this lab&lt;/a&gt;, we used the &lt;a href="https://open-meteo.com/en/docs/geocoding-api" rel="noopener noreferrer"&gt;Geocoding API&lt;/a&gt; and the &lt;a href="https://open-meteo.com/en/docs" rel="noopener noreferrer"&gt;Weather Forecast API&lt;/a&gt; to provide the app with the ability to factor in the weather when answering questions about travel. &lt;/p&gt;
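&lt;p&gt;As a sketch of how those two Open-Meteo calls chain together (the endpoints are real and key-free, but the parameter choices here are illustrative and may differ from the lab's):&lt;/p&gt;

```python
# Hypothetical sketch: chain Open-Meteo's Geocoding API (city name
# to coordinates) with its Forecast API (coordinates to weather).
# No API key is required; parameter choices are illustrative.
import json
import urllib.parse
import urllib.request

GEOCODE_BASE = "https://geocoding-api.open-meteo.com/v1/search"
FORECAST_BASE = "https://api.open-meteo.com/v1/forecast"

def build_url(base, params):
    # urlencode joins and escapes the query parameters for us.
    return base + "?" + urllib.parse.urlencode(params)

def fetch_json(url):
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def current_weather(city):
    # Step 1: resolve the city name to latitude/longitude.
    geo = fetch_json(build_url(GEOCODE_BASE, {"name": city, "count": 1}))
    place = geo["results"][0]
    # Step 2: fetch the current weather for those coordinates.
    forecast = fetch_json(build_url(FORECAST_BASE, {
        "latitude": place["latitude"],
        "longitude": place["longitude"],
        "current_weather": "true",
    }))
    return forecast["current_weather"]
```

&lt;p&gt;Wiring &lt;code&gt;current_weather&lt;/code&gt; up as a callable tool is what lets the chatbot factor live conditions into its travel advice.&lt;/p&gt;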
&lt;h2&gt;
  
  
  Your Journey Starts Here
&lt;/h2&gt;

&lt;p&gt;Building with &lt;a href="https://cloud.google.com/learn/what-is-artificial-intelligence?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;AI&lt;/a&gt; isn't about knowing everything at once. It's about taking the first step, building something tangible, and learning key concepts along the way. &lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/1-developing-apps-that-use-llms/developing-LLM-apps-with-Vertex-AI-SDK#0?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;This codelab&lt;/a&gt; was designed to be that first step. By the end, you won't just have a working travel chatbot—you'll have hands-on experience with the fundamental building blocks of a production-ready &lt;a href="https://cloud.google.com/discover/ai-applications?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;AI application&lt;/a&gt;. You'll be surprised at what you can build.&lt;/p&gt;


&lt;div class="crayons-card c-embed"&gt;

  &lt;br&gt;
&lt;strong&gt;Go to the lab!&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Lab:&lt;/strong&gt; &lt;a href="https://codelabs.developers.google.com/codelabs/production-ready-ai-with-gc/1-developing-apps-that-use-llms/developing-LLM-apps-with-Vertex-AI-SDK#0?utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Developing LLM Apps with the Vertex AI SDK&lt;/a&gt;

&lt;p&gt;&lt;strong&gt;Objective:&lt;/strong&gt; &lt;em&gt;Learn to build a chatbot using the Vertex AI SDK and Gemini, integrate real-time data with external tools, and refine model output with prompt engineering.&lt;/em&gt;&lt;/p&gt;


&lt;/div&gt;


&lt;p&gt;Share your progress and connect with others on the journey using the hashtag &lt;strong&gt;#ProductionReadyAI&lt;/strong&gt;. Happy learning! &lt;/p&gt;

&lt;h2&gt;
  
  
  From Prototype to Production
&lt;/h2&gt;

&lt;p&gt;This lab is also the first module in our official &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/production-ready-ai-with-google-cloud-learning-path" rel="noopener noreferrer"&gt;Production-Ready AI with Google Cloud&lt;/a&gt; program. Explore the full curriculum for more content that will help you bridge the gap from a promising prototype to a production-grade &lt;a href="https://cloud.google.com/discover/ai-applications?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_default_b452058013&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;AI application&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>gemini</category>
      <category>vertexai</category>
      <category>beginners</category>
    </item>
    <item>
      <title>From Skeptic to Believer: A Process-Driven Approach to Vibe Coding</title>
      <dc:creator>Mollie Pettit</dc:creator>
      <pubDate>Wed, 05 Nov 2025 23:05:11 +0000</pubDate>
      <link>https://dev.to/googleai/a-skeptics-guide-to-vibe-coding-213p</link>
      <guid>https://dev.to/googleai/a-skeptics-guide-to-vibe-coding-213p</guid>
      <description>&lt;p&gt;I’ll admit, I've in the past had a negative reaction to the term &lt;a href="https://cloud.google.com/discover/what-is-vibe-coding?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_awareness_b442599931&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;vibe coding&lt;/a&gt;. (And in some circumstances, I still do.) It often brought to mind an engineer carelessly committing code that’s buggy or inefficient without proper checks.&lt;/p&gt;

&lt;p&gt;My perspective changed after our conversation with Keith Ballinger, VP and General Manager at &lt;a href="https://cloud.google.com/docs?utm_campaign=CDR_0x6e136736_awareness_b442599931&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Google Cloud&lt;/a&gt;, on &lt;a href="https://www.youtube.com/watch?v=I-xS4nw-HfU&amp;amp;list=PLIivdWyY5sqLXR1eSkiM5bE6pFlXC-OSs&amp;amp;index=7" rel="noopener noreferrer"&gt;episode 6&lt;/a&gt; of the &lt;a href="https://youtube.com/playlist?list=PLIivdWyY5sqLXR1eSkiM5bE6pFlXC-OSs&amp;amp;feature=shared" rel="noopener noreferrer"&gt;Agent Factory podcast&lt;/a&gt;. He showed us that his approach to vibe coding isn't about chaos; it's about a surprisingly &lt;strong&gt;structured template&lt;/strong&gt; that turns a vague 'vibe' into a concrete plan.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Demo: Vibe coding a command line Markdown viewer with the Gemini CLI&lt;/strong&gt;&lt;br&gt;


  &lt;iframe src="https://www.youtube.com/embed/K1KAedjQx6M"&gt;
  &lt;/iframe&gt;


&lt;/p&gt;

&lt;p&gt;My biggest takeaway is that vibe coding is more nuanced than I first thought. I now see its value on a spectrum: it's a &lt;strong&gt;powerful tool for exploration when you're in unfamiliar territory&lt;/strong&gt;, and a &lt;strong&gt;tool for acceleration when you're an expert in your domain&lt;/strong&gt;. It's a workflow for both creative ideation and efficient execution.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Template: a structured workflow for AI-assisted development
&lt;/h2&gt;

&lt;p&gt;On the show, Keith suggested on a whim that we vibe-code a command-line markdown viewer from scratch using the &lt;a href="https://cloud.google.com/gemini/docs/codeassist/gemini-cli?utm_campaign=CDR_0x6e136736_awareness_b442599931&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini CLI&lt;/a&gt;. This turned out to be my favorite moment of the show. What stuck out most to me was his methodical approach (which I am totally stealing, btw). &lt;/p&gt;

&lt;p&gt;Here’s a breakdown of the template he used:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmcfqnzou72kuoq3m4h0b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmcfqnzou72kuoq3m4h0b.png" alt="Flow of steps: start with the user, get language selection, create technical blueprint, generate detailed plan, give AI meta-instructions, Add personality"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: start with the user, not the code
&lt;/h3&gt;

&lt;p&gt;Before writing a single line of code, Keith’s first move was to define the user experience. He prompted the &lt;a href="https://cloud.google.com/gemini/docs/codeassist/gemini-cli?utm_campaign=CDR_0x6e136736_awareness_b442599931&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini CLI&lt;/a&gt; to create a &lt;code&gt;UserGuide.md&lt;/code&gt;. &lt;em&gt;Timestamp: [&lt;a href="https://www.youtube.com/watch?v=K1KAedjQx6M&amp;amp;t=109s" rel="noopener noreferrer"&gt;01:49&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The demo prompt:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;I want to build a command line markdown viewer. It should paginate long markdown files, have some syntax highlighting, and does not include any editing features. Write me a user guide and save it to UserGuide.md. Wait for my review, don't write any code yet.&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkm4joc1yrmiowyvvekn1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkm4joc1yrmiowyvvekn1.png" alt="Screenshot of the Gemini CLI with the first prompt"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why I like it:&lt;/strong&gt; Doing this as a first step forces clarity on the project's goals from the end-user's perspective. It establishes a clear definition of "done" before delving into technical details. It also ensures the LLM proceeds incrementally and doesn't jump ahead.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: get a programming language suggestion
&lt;/h3&gt;

&lt;p&gt;Next, Keith requested advice on what programming language to use for this project. &lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/K1KAedjQx6M?si=LP2wQ4ASpuauiDlA&amp;amp;t=231" rel="noopener noreferrer"&gt;03:51&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The demo prompt:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Let's build a technical design for mdview. What language do you suggest we use? Give me three options and your final recommendation. And after I tell you, we'll write the technical design.&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why I like it:&lt;/strong&gt; It can be tempting to default to familiar tools. However, this approach can sometimes lead to using the wrong tool for the job. This step serves as a helpful reminder to consider alternatives, whether by asking the LLM or evaluating options yourself. When the best approach involves an unfamiliar tool, it presents an opportunity to acquire a new skill! &lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3: create a technical blueprint
&lt;/h3&gt;

&lt;p&gt;Once the user guide was approved, he had the agent act as a junior architect, suggesting technologies and outlining a technical design in an &lt;code&gt;Arch.md&lt;/code&gt; file. &lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/K1KAedjQx6M?si=6LLmLdeZihyPH3Tc&amp;amp;t=300" rel="noopener noreferrer"&gt;05:00&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The demo prompt:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Let's author a technical design and save it to Arch.md. Be very detailed. Wait for my review before coding.&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why I like it:&lt;/strong&gt; This separates the high-level design from the implementation. It creates a natural checkpoint for human review, allowing for changes and approval of the technical direction before committing to code.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4: generate a detailed, check-list style plan
&lt;/h3&gt;

&lt;p&gt;This was the core of the template. Keith asked the agent to break down the entire project into a series of discrete, numbered tasks in a &lt;code&gt;plan.md&lt;/code&gt; file. &lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/K1KAedjQx6M?si=pt1CZwTV4thoMw0z&amp;amp;t=420" rel="noopener noreferrer"&gt;07:00&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The demo prompt:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Now let's create a detailed task plan and save it to plan.md. Include in the plan.md some general workflow&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why I like it:&lt;/strong&gt; This is the step that turns a fuzzy idea into an actionable project plan. It creates a clear roadmap for both the developer and the AI, preventing the agent from going off-track or attempting to build everything simultaneously.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 5: give the AI meta-instructions on how to work
&lt;/h3&gt;

&lt;p&gt;I found this to be a subtle but powerful part of the process. In addition to asking the AI to execute the plan, this prompt told it &lt;em&gt;how to use the plan&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The demo prompt:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Update plan.md when each task is completed with implementation notes. Update arch.md with any design changes. I will review, and then we'll start.&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why I like it:&lt;/strong&gt; By instructing the agent to update the &lt;code&gt;plan.md&lt;/code&gt; file with checkboxes and implementation notes after each step, he created a self-documenting workflow. This is valuable because it preserves the session's state, making it easy to review progress, debug, or even step away and resume the project another day. It also keeps other documents up to date with changes.&lt;/p&gt;

&lt;h3&gt;
  
  
  Bonus step: add personality! 🤪
&lt;/h3&gt;

&lt;p&gt;The part of this demo that tickled me most is when Keith decided to inject some personality into the interaction. This was a fun reminder that we can adjust how the AI interacts with us to enhance our experience. &lt;em&gt;Timestamp: [&lt;a href="https://youtu.be/K1KAedjQx6M?si=trpxMTYilM0uvy4a&amp;amp;t=563" rel="noopener noreferrer"&gt;09:23&lt;/a&gt;]&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The demo prompt:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;From here on out, address me as K-bro. And use puns liberally.&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;The AI immediately obliged, quipping things like, "&lt;em&gt;Alright, K-bro, consider it done!&lt;/em&gt;" and "&lt;em&gt;It's time to make this code look rich&lt;/em&gt;" (punning on the use of the &lt;a href="https://github.com/Textualize/rich" rel="noopener noreferrer"&gt;rich library&lt;/a&gt;). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1lb83wbyc3bamv47fju0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1lb83wbyc3bamv47fju0.png" alt="Screenshot of the Gemini CLI's response to this prompt, as described above"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why I like it:&lt;/strong&gt; Adding these kinds of personality adjustments can inject intermittent giggles and delight into even the driest of projects. (Listen and you'll hear me laughing in the background at this part. :p) Why not make the entire development process more enjoyable, more creative, and less like working with a sterile machine?&lt;/p&gt;

&lt;p&gt;❓&lt;strong&gt;Question for you:&lt;/strong&gt; What ideas do you have for injecting personality into your workflow? I'd love to see your examples!&lt;/p&gt;

&lt;p&gt;Beyond the fun, the demo left me with a few key takeaways about how I'll approach AI-assisted development from now on.&lt;/p&gt;

&lt;h2&gt;
  
  
  My takeaways
&lt;/h2&gt;

&lt;h3&gt;
  
  
  It's about exploration and acceleration
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3bmfe29m1ogfb4kqhc2w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3bmfe29m1ogfb4kqhc2w.png" alt="A visual that depicts two people, one in exploration mode, and one in acceleration mode"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Keith’s workflow helped me reframe vibe coding. Instead of seeing it as just a tool for prototyping, I now see its value on a spectrum that goes from pure exploration to powerful acceleration.&lt;/p&gt;

&lt;p&gt;For work you're not familiar with, like learning a new language or system, it’s a tool for &lt;strong&gt;exploration&lt;/strong&gt;. The ability to experiment and iterate at a speed that was never before possible is a massive advantage. Projects you might have put aside because the learning curve was too steep are now accessible.&lt;/p&gt;

&lt;p&gt;For work you already know inside and out, it's a tool for &lt;strong&gt;acceleration&lt;/strong&gt;. As Keith's demo showed, the most productive AI users are experts in their domain. They can offload the mundane parts of their job—like writing boilerplate code or hundreds of tests—making them faster and, frankly, happier.&lt;/p&gt;

&lt;p&gt;This interview made me want to go out and build more things. To experiment more, prototype more, and see what I can quickly whip together to solve problems in my own work and life. I hope it inspires you as well. (Get started with &lt;a href="https://cloud.google.com/gemini/docs/codeassist/gemini-cli?utm_campaign=CDR_0x6e136736_awareness_b442599931&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;Gemini CLI&lt;/a&gt;.)&lt;/p&gt;

&lt;h3&gt;
  
  
  I love process
&lt;/h3&gt;

&lt;p&gt;This is something I already knew about myself. I've always appreciated the value of process, not for its own sake, but for its ability to improve collaboration and efficiency. When I identify areas where a little structure could greatly enhance how people work together, I am driven to implement it, regardless of my official role or priority.&lt;/p&gt;

&lt;p&gt;I don't think I realized how much this type of structure was missing from my own iterative LLM workflow. And not just for development, but for any task I'm trying to complete with an AI.&lt;/p&gt;

&lt;h2&gt;
  
  
  What else?
&lt;/h2&gt;

&lt;p&gt;There are many effective ways to structure an iterative LLM / &lt;a href="https://cloud.google.com/discover/what-is-vibe-coding?e=48754805&amp;amp;hl=en&amp;amp;utm_campaign=CDR_0x6e136736_awareness_b442599931&amp;amp;utm_medium=external&amp;amp;utm_source=blog" rel="noopener noreferrer"&gt;vibe coding&lt;/a&gt; workflow. I am curious about what other approaches people have found effective.&lt;/p&gt;

&lt;p&gt;❓&lt;strong&gt;Question for you:&lt;/strong&gt; What other workflows have you found effective?&lt;/p&gt;

&lt;p&gt;&lt;em&gt;&lt;strong&gt;For our complete conversation&lt;/strong&gt;, watch &lt;a href="https://www.youtube.com/watch?v=I-xS4nw-HfU&amp;amp;list=PLIivdWyY5sqLXR1eSkiM5bE6pFlXC-OSs&amp;amp;index=7" rel="noopener noreferrer"&gt;Agent Factory S1E6&lt;/a&gt; or check out the &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/agent-factory-recap-keith-ballinger-on-ai-the-future-of-development-and-vibe-coding?e=48754805" rel="noopener noreferrer"&gt;Episode Recap Blog Post&lt;/a&gt; for an overview of episode segments with links and timestamps.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>gemini</category>
      <category>vibecoding</category>
      <category>agents</category>
      <category>ai</category>
    </item>
  </channel>
</rss>
