<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Paige Bailey</title>
    <description>The latest articles on DEV Community by Paige Bailey (@dynamicwebpaige).</description>
    <link>https://dev.to/dynamicwebpaige</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3279421%2F805246e6-081d-47b6-bb19-515a33770589.jpg</url>
      <title>DEV Community: Paige Bailey</title>
      <link>https://dev.to/dynamicwebpaige</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/dynamicwebpaige"/>
    <language>en</language>
    <item>
      <title>Vandalizing My Own Wikipedia Experience: A 90s Cyberpunk GeoCities Makeover</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Fri, 20 Mar 2026 15:44:30 +0000</pubDate>
      <link>https://dev.to/googleai/vandalizing-my-own-wikipedia-experience-a-90s-cyberpunk-geocities-makeover-13ie</link>
      <guid>https://dev.to/googleai/vandalizing-my-own-wikipedia-experience-a-90s-cyberpunk-geocities-makeover-13ie</guid>
      <description>&lt;p&gt;Wikipedia is a marvel. It is the Library of Alexandria of our time, a meticulously curated repository of human knowledge, wrapped in a user interface so ruthlessly utilitarian it makes a hospital corridor look like a rave. &lt;/p&gt;

&lt;p&gt;But sometimes, when I am deep in a Wikipedia rabbit hole reading about &lt;a href="https://en.wikipedia.org/wiki/List_of_animals_with_fraudulent_diplomas" rel="noopener noreferrer"&gt;List of animals with fraudulent diplomas&lt;/a&gt; at 2:00 AM, the sterile white background feels... insufficient. I don't want brutalist minimalism. I want the web the way the ancients intended: dripping in neon pink, plastered in Comic Sans, and crawling with pixelated cats. &lt;/p&gt;

&lt;p&gt;So, I decided to write a custom Wikipedia &lt;code&gt;User Script&lt;/code&gt; to turn the site into a 1998 GeoCities cyberpunk fever dream. &lt;/p&gt;

&lt;p&gt;Instead of writing this from scratch, I wanted to see how well modern LLMs could handle writing niche MediaWiki API scripts. Here is a field report on how I built this abomination using Gemini 3.1 Pro Preview.&lt;/p&gt;




&lt;h3&gt;
  
  
  Grounding Gemini with Wikipedia-specific syntax
&lt;/h3&gt;

&lt;p&gt;LLMs are great at writing vanilla JavaScript, but Wikipedia user scripts rely on specific, slightly archaic MediaWiki globals (like &lt;code&gt;mw.loader.using&lt;/code&gt; and &lt;code&gt;mw.util.addCSS&lt;/code&gt;). If you just blindly ask an LLM to "make Wikipedia pink," it usually hallucinates browser extensions or generic Tampermonkey scripts. &lt;/p&gt;
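&lt;p&gt;For context, the scaffolding Gemini needs to produce looks roughly like this (a minimal sketch based on the standard &lt;code&gt;mw.loader.using&lt;/code&gt; / &lt;code&gt;mw.util.addCSS&lt;/code&gt; pattern; the &lt;code&gt;mw&lt;/code&gt; global is passed in as a parameter here only so the function can be exercised outside a browser):&lt;/p&gt;

```javascript
// Minimal user-script skeleton: wait for MediaWiki's utility module,
// then inject CSS through mw.util.addCSS. On Wikipedia you would call
// applyRetroTheme(mw, css) from Special:MyPage/common.js.
function applyRetroTheme(mwGlobal, css) {
  return mwGlobal.loader.using(["mediawiki.util"]).then(function () {
    mwGlobal.util.addCSS(css);
    return css;
  });
}
```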

&lt;p&gt;To bypass this, I jumped into &lt;strong&gt;Google AI Studio&lt;/strong&gt; and loaded up the &lt;code&gt;Gemini 3.1 Pro Preview&lt;/code&gt; model. &lt;/p&gt;

&lt;p&gt;The secret sauce here was using the &lt;a href="https://ai.google.dev/gemini-api/docs/url-context" rel="noopener noreferrer"&gt;URL Context feature&lt;/a&gt;. I toggled URL Context on and pasted in the URL for Wikipedia's custom scripting documentation: &lt;br&gt;
&lt;code&gt;https://en.wikipedia.org/wiki/Wikipedia:User_scripts&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;My prompt was simple but unhinged: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Using the provided documentation on Wikipedia User Scripts, write a script for my Special:MyPage/common.js that makes my Wikipedia viewing experience look like a 90s cyberpunk GeoCities page. I want a pink/cyan grid background, glowing Comic Sans headers, a massive scrolling &lt;code&gt;&amp;lt;marquee&amp;gt;&lt;/code&gt; for the article title, a giant glowing sparkle mouse trail, and a squad of animated cats walking across the top of my screen."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Because Gemini 3.1 Pro Preview had the actual documentation in its context window, it knew exactly how to inject CSS securely via MediaWiki's utility functions, and it gave me a plug-and-play script.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvufd1fm6j5rdpx0d6g34.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvufd1fm6j5rdpx0d6g34.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;


&lt;h3&gt;
  
  
  Breaking down the script
&lt;/h3&gt;

&lt;p&gt;The resulting script is a beautiful combination of modern DOM manipulation and deeply offensive 90s aesthetics.&lt;/p&gt;
&lt;h4&gt;
  
  
  1. The Marquee Title
&lt;/h4&gt;

&lt;p&gt;If we are going to read about the &lt;a href="https://en.wikipedia.org/wiki/Emu_War" rel="noopener noreferrer"&gt;Emu War&lt;/a&gt;, that title needs to &lt;em&gt;move&lt;/em&gt;. The script grabs the &lt;code&gt;#firstHeading&lt;/code&gt; element and violently wraps its inner HTML in a &lt;code&gt;&amp;lt;marquee&amp;gt;&lt;/code&gt; tag.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;$title&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;$&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;#firstHeading&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;titleText&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;$title&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;html&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="nx"&gt;$title&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;html&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;&amp;lt;marquee scrollamount="15" behavior="alternate"&amp;gt;&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="nx"&gt;titleText&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;&amp;lt;/marquee&amp;gt;&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Note: The fact that modern browsers in 2026 still parse and execute the &lt;code&gt;&amp;lt;marquee&amp;gt;&lt;/code&gt; tag is a testament to the web’s unbreakable backwards compatibility. It is the digital equivalent of a vestigial tail.&lt;/em&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  2. The Sparkle Trail (A Lesson in Throttling)
&lt;/h4&gt;

&lt;p&gt;To create the mouse trail, the script listens to the &lt;code&gt;mousemove&lt;/code&gt; event and appends absolutely-positioned &lt;code&gt;&amp;lt;span&amp;gt;&lt;/code&gt; elements containing cyberpunk symbols (&lt;code&gt;✦&lt;/code&gt;, &lt;code&gt;★&lt;/code&gt;, &lt;code&gt;✨&lt;/code&gt;) to the DOM. &lt;/p&gt;

&lt;p&gt;To prevent this from immediately melting my GPU (a very real threat when generating hundreds of DOM nodes a second), the model smartly implemented a timestamp throttle:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;now&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;now&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="nx"&gt;lastSparkleTime&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;40&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="c1"&gt;// Only spawn a sparkle every 40ms&lt;/span&gt;
&lt;span class="nx"&gt;lastSparkleTime&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;now&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It then applies a CSS &lt;code&gt;@keyframes&lt;/code&gt; animation to each sparkle so it slowly drifts downward, rotates 180 degrees, and fades to &lt;code&gt;opacity: 0&lt;/code&gt; before a &lt;code&gt;setTimeout&lt;/code&gt; removes it from the DOM a second later. &lt;/p&gt;
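&lt;p&gt;Extracted as a standalone function, the throttle pattern looks like this (the names here are mine, not the generated script's):&lt;/p&gt;

```javascript
// Returns a predicate that fires at most once per intervalMs.
// Keeping lastSparkleTime in a closure keeps the mousemove handler tiny.
function makeSparkleThrottle(intervalMs) {
  let lastSparkleTime = 0;
  return function shouldSpawn(now) {
    if (now - lastSparkleTime >= intervalMs) {
      lastSparkleTime = now;
      return true;
    }
    return false;
  };
}

// Usage sketch:
//   const shouldSpawn = makeSparkleThrottle(40);
//   document.addEventListener("mousemove", function (e) {
//     if (shouldSpawn(Date.now())) spawnSparkle(e.pageX, e.pageY);
//   });
```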

&lt;h4&gt;
  
  
  3. The Mathematics of Walking Cats
&lt;/h4&gt;

&lt;p&gt;Instead of using a clunky JS &lt;code&gt;setInterval&lt;/code&gt; to move the cats, Gemini 3.1 leaned into pure, hardware-accelerated CSS animations.&lt;/p&gt;

&lt;p&gt;It created a fixed header container (&lt;code&gt;pointer-events: none&lt;/code&gt; so I can still click the search bar through the cats' ethereal bodies). Then, it applied two separate animations. &lt;/p&gt;

&lt;p&gt;The first animation slides the whole squad from &lt;code&gt;100vw&lt;/code&gt; (off-screen right) to &lt;code&gt;-100%&lt;/code&gt; (off-screen left). &lt;/p&gt;

&lt;p&gt;The second animation creates the "walking" illusion. If you think about the geometry of a walking pixel cat, it's essentially a sine wave. To achieve this, the script applies a 10px vertical bounce to each cat (&lt;code&gt;transform: translateY(-10px)&lt;/code&gt;). &lt;/p&gt;

&lt;p&gt;To make it look like a chaotic squad rather than a synchronized military parade, the script uses the &lt;code&gt;nth-child(even)&lt;/code&gt; pseudo-class to offset the animation delay of every other cat:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight css"&gt;&lt;code&gt;&lt;span class="nc"&gt;.walking-cat&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nl"&gt;animation&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;catBounce&lt;/span&gt; &lt;span class="m"&gt;0.4s&lt;/span&gt; &lt;span class="n"&gt;alternate&lt;/span&gt; &lt;span class="n"&gt;infinite&lt;/span&gt; &lt;span class="n"&gt;ease-in-out&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="nc"&gt;.walking-cat&lt;/span&gt;&lt;span class="nd"&gt;:nth-child&lt;/span&gt;&lt;span class="o"&gt;(&lt;/span&gt;&lt;span class="nt"&gt;even&lt;/span&gt;&lt;span class="o"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nl"&gt;animation-delay&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="m"&gt;0.2s&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="c"&gt;/* Phase offset for the bounce! */&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We are essentially phase-shifting the vertical oscillation of our felines to simulate independent locomotion.&lt;/p&gt;
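&lt;p&gt;If you want to convince yourself the offset works, you can model it numerically (a sketch under my own assumptions: a 0.4s &lt;code&gt;alternate&lt;/code&gt; iteration is one full up/down oscillation every 0.8s, approximated here as a sine wave and ignoring the initial delay window):&lt;/p&gt;

```javascript
// Approximate translateY offset (in px) of one cat at time t (seconds).
// animation-delay shifts the whole oscillation in time.
const BOUNCE_PX = 10;
const PERIOD_S = 0.8; // 0.4s up, 0.4s back down

function catY(timeS, delayS) {
  return -BOUNCE_PX * Math.abs(Math.sin(Math.PI * (timeS - delayS) / PERIOD_S));
}
```

&lt;p&gt;At &lt;code&gt;t = 0.4s&lt;/code&gt;, odd cats are at the top of their bounce while even cats (delayed 0.2s) are still mid-climb.&lt;/p&gt;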




&lt;h3&gt;
  
  
  The final results
&lt;/h3&gt;

&lt;p&gt;I pasted the code into my &lt;code&gt;Special:MyPage/common.js&lt;/code&gt;, hit publish, and bypassed my cache. &lt;/p&gt;

&lt;p&gt;The result is staggering.&lt;/p&gt;

&lt;p&gt;I am currently reading the deeply serious, heavily cited Wikipedia article for &lt;a href="https://en.wikipedia.org/wiki/Maximilien_Robespierre" rel="noopener noreferrer"&gt;Maximilien Robespierre&lt;/a&gt;. The background is a dark void overlaid with a neon pink laser grid. The header "&lt;strong&gt;MAXIMILIEN ROBESPIERRE&lt;/strong&gt;" is glowing in hot pink Comic Sans, aggressively bouncing off the edges of my monitor. &lt;/p&gt;

&lt;p&gt;

&lt;iframe class="tweet-embed" id="tweet-2035009322531660256-82" src="https://platform.twitter.com/embed/Tweet.html?id=2035009322531660256"&gt;
&lt;/iframe&gt;






&lt;/p&gt;

&lt;p&gt;Every time I move my mouse to hover over a citation, a massive explosion of 45-pixel-wide cyan stars erupts across the text. And above it all, a squad of five neon cats marches endlessly toward the left side of my screen, oblivious to the Reign of Terror occurring in the text below them.&lt;/p&gt;

&lt;p&gt;It is awful. I am never turning it off. &lt;/p&gt;




&lt;p&gt;&lt;em&gt;If you want to ruin your own Wikipedia experience, you can find the complete script in the replies below. Just remember to log in, navigate to &lt;code&gt;Special:MyPage/common.js&lt;/code&gt; and &lt;code&gt;Special:MyPage/common.css&lt;/code&gt;, and let the 90s flow through you.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>ai</category>
      <category>javascript</category>
      <category>programming</category>
    </item>
    <item>
      <title>🦖 Image Manipulation on a Budget: Bounding Boxes and Transparency with Gemini 3.0 Flash and NanoBanana Pro</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Wed, 04 Feb 2026 02:04:25 +0000</pubDate>
      <link>https://dev.to/googleai/image-manipulation-on-a-budget-bounding-boxes-and-transparency-with-gemini-30-flash-and-cib</link>
      <guid>https://dev.to/googleai/image-manipulation-on-a-budget-bounding-boxes-and-transparency-with-gemini-30-flash-and-cib</guid>
      <description>&lt;p&gt;&lt;strong&gt;Hey friends! 👋&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Let’s talk about something that used to be a total headache: &lt;strong&gt;Computer Vision pipelines.&lt;/strong&gt; Usually, if you wanted to generate an asset, remove a background, and then detect specific objects with bounding boxes, you were looking at a complex stack. You’d need a generation model, a separate segmentation model (like SAM), and maybe some custom OpenCV scripts you had to write yourself.&lt;/p&gt;

&lt;p&gt;But the game has changed!&lt;/p&gt;

&lt;p&gt;I recently played around with &lt;strong&gt;NanoBanana Pro&lt;/strong&gt; and the new &lt;strong&gt;Gemini 3.0 Flash&lt;/strong&gt; with &lt;code&gt;High&lt;/code&gt; thinking and &lt;strong&gt;Code Execution&lt;/strong&gt; enabled, and my jaw hit the floor. We are talking about an end-to-end workflow that generates, processes, and analyzes images using a sandboxed Python environment... all for fractions of a penny. You can test it out today for free in &lt;a href="https://ai.dev" rel="noopener noreferrer"&gt;Google AI Studio&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Let’s dive into how we can turn a T-Rex and some LEGO bricks into production-ready assets without writing a single line of image processing code ourselves.&lt;/p&gt;




&lt;h2&gt;
  
  
  Tech Stack
&lt;/h2&gt;

&lt;p&gt;Here is the dynamic duo (plus one) we are using today:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;NanoBanana Pro:&lt;/strong&gt; To generate our initial complex images with clean backgrounds.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Gemini 3.0 Flash:&lt;/strong&gt; The speedster of the Gemini family, with &lt;code&gt;High&lt;/code&gt; Thinking enabled. But you can also experiment with lower thinking settings!&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Code Execution:&lt;/strong&gt; This is the killer feature. Gemini doesn't just "guess" where the pixels are; it writes and runs Python code in a secure sandbox to manipulate the image mathematically.&lt;/li&gt;
&lt;/ol&gt;
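
&lt;p&gt;If you are calling this outside AI Studio, enabling the sandbox via the &lt;code&gt;google-genai&lt;/code&gt; Python SDK is a one-line tool config (a sketch; the model ID below is a placeholder, so substitute whichever Flash model you are testing):&lt;/p&gt;

```python
from google import genai
from google.genai import types

client = genai.Client()  # reads GEMINI_API_KEY from the environment

response = client.models.generate_content(
    model="gemini-flash-latest",  # placeholder model ID
    contents="Remove the white background from the attached image.",
    config=types.GenerateContentConfig(
        # Lets the model write and run Python in the managed sandbox.
        tools=[types.Tool(code_execution=types.ToolCodeExecution())],
    ),
)
```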




&lt;h2&gt;
  
  
  Use Case 1: The Transparent T-Rex 🕶️🦖
&lt;/h2&gt;

&lt;p&gt;First, I used &lt;strong&gt;NanoBanana Pro&lt;/strong&gt; to conjure up a "Dinosaur with sunglasses on a white background," based on an initial input image. It did a great job, but for web design, we usually need a &lt;strong&gt;transparent PNG&lt;/strong&gt;, not a white JPEG.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvxuo5l7zksk3hkt935oo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvxuo5l7zksk3hkt935oo.png" alt=" " width="800" height="754"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Instead of opening Photoshop, I just asked Gemini 3.0 Flash to handle it.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Prompt:
&lt;/h3&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Remove the background from this image, only show the dinosaur wearing sunglasses on a white background and nothing else - no pole, no grass, etc."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  The Model's "Thinking" Process 🧠
&lt;/h3&gt;

&lt;p&gt;One of the coolest things about the new high-thinking capability is watching the model plan. It literally told me:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;em&gt;"I'm focused on extracting the T-Rex skeleton..."&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;em&gt;"I'm now zeroing in on a pristine white background."&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  The Execution 💻
&lt;/h3&gt;

&lt;p&gt;Gemini didn't hallucinate the transparency. It wrote this Python script using OpenCV (&lt;code&gt;cv2&lt;/code&gt;) and executed it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;numpy&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;

&lt;span class="c1"&gt;# Load the image
&lt;/span&gt;&lt;span class="n"&gt;image&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;imread&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;input_file_2.jpeg&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Convert to grayscale
&lt;/span&gt;&lt;span class="n"&gt;gray&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cvtColor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;image&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;COLOR_BGR2GRAY&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Threshold to get white background mask (background is 255)
# Using a threshold slightly below 255 to catch near-white pixels
&lt;/span&gt;&lt;span class="n"&gt;_&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;mask&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;threshold&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;gray&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;250&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;THRESH_BINARY_INV&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Convert to BGRA
&lt;/span&gt;&lt;span class="n"&gt;bgra&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cvtColor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;image&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;COLOR_BGR2BGRA&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Set alpha channel to 0 where mask is 0 (the white background)
&lt;/span&gt;&lt;span class="n"&gt;bgra&lt;/span&gt;&lt;span class="p"&gt;[:,&lt;/span&gt; &lt;span class="p"&gt;:,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;mask&lt;/span&gt;

&lt;span class="c1"&gt;# Save the transparent PNG
&lt;/span&gt;&lt;span class="n"&gt;output_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;dinosaur_transparent.png&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;
&lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;imwrite&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;output_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;bgra&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Boom!&lt;/strong&gt; 💥 A perfectly cut-out T-Rex with zero background artifacts.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9z8aot7ft6vttkfbx3mi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9z8aot7ft6vttkfbx3mi.png" alt=" " width="800" height="375"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Use Case 2: Object Detection (The LEGO Test) 🧱
&lt;/h2&gt;

&lt;p&gt;Next, let’s get a bit more technical. I uploaded an image of colorful LEGO bricks and wanted to identify &lt;strong&gt;only the green ones&lt;/strong&gt; and draw bounding boxes around them.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Prompt:
&lt;/h3&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"Draw bounding boxes around the green LEGO bricks. Display the image with bounding boxes."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  The Logic 🧠
&lt;/h3&gt;

&lt;p&gt;Gemini identified that color segmentation was the best path here. It analyzed the image, determined the HSV range for that specific shade of "LEGO lime green," and wrote the following logic:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;numpy&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;PIL&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Image&lt;/span&gt;

&lt;span class="c1"&gt;# Load the image
&lt;/span&gt;&lt;span class="n"&gt;image&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;imread&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;input_file_4.png&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;image_rgb&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cvtColor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;image&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;COLOR_BGR2RGB&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Convert to HSV for better color segmentation
&lt;/span&gt;&lt;span class="n"&gt;hsv&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cvtColor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;image&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;COLOR_BGR2HSV&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Define range for green color in HSV
# Looking at the image, it's a lime green
&lt;/span&gt;&lt;span class="n"&gt;lower_green&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;array&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;35&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="n"&gt;upper_green&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;array&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;85&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="c1"&gt;# Create a mask for green
&lt;/span&gt;&lt;span class="n"&gt;mask&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;inRange&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;hsv&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;lower_green&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;upper_green&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Find contours in the mask
&lt;/span&gt;&lt;span class="n"&gt;contours&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;findContours&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;mask&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RETR_EXTERNAL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CHAIN_APPROX_SIMPLE&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Create a copy of the image to draw on
&lt;/span&gt;&lt;span class="n"&gt;output_image&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;image_rgb&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;copy&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="c1"&gt;# Filter contours by area and draw bounding boxes
&lt;/span&gt;&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;cnt&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;contours&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;area&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;contourArea&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cnt&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;area&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;1000&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;  &lt;span class="c1"&gt;# Adjust threshold as needed
&lt;/span&gt;        &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;w&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;h&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;boundingRect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cnt&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="c1"&gt;# Blue bounding boxes for visibility on green
&lt;/span&gt;        &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;rectangle&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;output_image&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;w&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;h&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; 

&lt;span class="c1"&gt;# Save and display the result
&lt;/span&gt;&lt;span class="n"&gt;output_pil&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Image&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fromarray&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;output_image&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;output_pil&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;save&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;green_bricks_detected.png&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It automatically calculated the contours, filtered out noise (area &amp;lt; 1000), and drew the boxes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs628t0sy212l1rik5vdi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs628t0sy212l1rik5vdi.png" alt=" " width="800" height="341"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  The Cost? 💸
&lt;/h3&gt;

&lt;p&gt;This is the wildest part. Running this entire object detection workflow on the LEGO image cost approximately &lt;strong&gt;$0.006&lt;/strong&gt;. That is six-tenths of a penny for intelligent computer vision code generation and execution.&lt;/p&gt;




&lt;h2&gt;
  
  
  Use Case 3: Transparency 🏁
&lt;/h2&gt;

&lt;p&gt;Finally, I asked it to clean up the LEGO image just like the dinosaur—removing the white background to create a sprite-ready PNG.&lt;/p&gt;

&lt;p&gt;Gemini pivoted strategies. It saw that the background was pure white (255), so it used a high-threshold approach for perfectly clean edges:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Use threshold to find the white background
&lt;/span&gt;&lt;span class="n"&gt;_&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;mask&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;threshold&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;gray&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;250&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;THRESH_BINARY_INV&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Set the alpha channel based on the mask
&lt;/span&gt;&lt;span class="n"&gt;bgra&lt;/span&gt;&lt;span class="p"&gt;[:,&lt;/span&gt; &lt;span class="p"&gt;:,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;mask&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The result? A clean transparent asset ready for your next project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F04dy9ud6bud949usfyjp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F04dy9ud6bud949usfyjp.png" alt=" " width="800" height="447"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Why this matters 💡
&lt;/h2&gt;

&lt;p&gt;We are moving from "Prompts that guess" to "Prompts that do."&lt;/p&gt;

&lt;p&gt;By combining &lt;strong&gt;Gemini 3.0 Flash's High Thinking&lt;/strong&gt; (to plan the approach) with &lt;strong&gt;Code Execution&lt;/strong&gt; (to actually do the math using Python libraries like NumPy and OpenCV), we get results that are:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Deterministic:&lt;/strong&gt; The code runs the same way every time.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Verifiable:&lt;/strong&gt; You can see exactly &lt;em&gt;how&lt;/em&gt; the model solved the problem.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Incredibly Cheap:&lt;/strong&gt; We are leveraging efficient models to write code, rather than using massive vision models to brute-force pixel generation.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Have you tried Code Execution with Gemini yet? If you're building automated asset pipelines, this is a total game changer!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Check out the docs here:&lt;/strong&gt; &lt;a href="https://ai.google.dev/gemini-api/docs/code-execution" rel="noopener noreferrer"&gt;Gemini API Code Execution&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Happy coding! 👩‍💻✨&lt;/p&gt;

</description>
      <category>programming</category>
      <category>ai</category>
      <category>python</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Benchmarking on a Budget: Running massive evals for 50% less with the Gemini Batch API ⚡️</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Mon, 26 Jan 2026 05:09:53 +0000</pubDate>
      <link>https://dev.to/googleai/benchmarking-on-a-budget-running-massive-evals-for-50-less-with-the-gemini-batch-api-5d1j</link>
      <guid>https://dev.to/googleai/benchmarking-on-a-budget-running-massive-evals-for-50-less-with-the-gemini-batch-api-5d1j</guid>
      <description>&lt;p&gt;Hey developers! 👋&lt;/p&gt;

&lt;p&gt;Running evaluations on LLMs can be a bit of a headache. You hit rate limits, you stare at loading bars, and -- probably worst of all -- you burn through your API budget faster than a GPU on a training run. But what if I told you there’s a way to run thousands of prompts asynchronously, at &lt;strong&gt;50% of the cost&lt;/strong&gt;, without blocking your main thread?&lt;/p&gt;

&lt;p&gt;Enter the &lt;strong&gt;Gemini Batch API&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Today, we are going to take a classic coding benchmark from Hugging Face -- the &lt;a href="https://huggingface.co/datasets/openai/openai_humaneval" rel="noopener noreferrer"&gt;&lt;strong&gt;OpenAI HumanEval dataset&lt;/strong&gt;&lt;/a&gt; -- and run it through Google’s small, lightweight &lt;a href="https://deepmind.google/models/gemini/flash-lite/" rel="noopener noreferrer"&gt;&lt;code&gt;gemini-2.5-flash-lite&lt;/code&gt;&lt;/a&gt; model using &lt;a href="https://ai.google.dev/gemini-api/docs/batch-api?batch=file" rel="noopener noreferrer"&gt;the Batch API&lt;/a&gt;. Then, we’re going to evaluate the code it generates and visualize the results.&lt;/p&gt;

&lt;p&gt;Grab your coffee (or tea 🍵), and let’s dive in!&lt;/p&gt;




&lt;h2&gt;
  
  
  The stack
&lt;/h2&gt;

&lt;p&gt;Here's what we'll be building with today:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://github.com/googleapis/python-genai" rel="noopener noreferrer"&gt;&lt;strong&gt;Google Gen AI SDK:&lt;/strong&gt;&lt;/a&gt; To talk to Gemini.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Hugging Face Datasets:&lt;/strong&gt; To get our evals.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Pandas and Seaborn:&lt;/strong&gt; To make the data look pretty.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Python &lt;code&gt;exec&lt;/code&gt;:&lt;/strong&gt; To (carefully!) run the generated code.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 1: Preparing the Data
&lt;/h2&gt;

&lt;p&gt;First things first, we need our prompts. We are using the &lt;code&gt;openai_humaneval&lt;/code&gt; dataset, which contains 164 coding problems.&lt;/p&gt;

&lt;p&gt;The Batch API loves &lt;strong&gt;JSONL&lt;/strong&gt; (JSON Lines) files. Each line is a separate request. We need to iterate through the dataset and format it so Gemini understands that we want it to write Python code.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;datasets&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;load_dataset&lt;/span&gt;

&lt;span class="c1"&gt;# Load the test split
&lt;/span&gt;&lt;span class="n"&gt;ds&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;load_dataset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;openai/openai_humaneval&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;split&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;test&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;jsonl_filename&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;humaneval_input.jsonl&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Generating &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;jsonl_filename&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;jsonl_filename&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;w&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;ds&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="c1"&gt;# Sanitize the ID
&lt;/span&gt;        &lt;span class="n"&gt;custom_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;task_id&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;replace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;/&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;_&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Prompt Engineering: Be specific!
&lt;/span&gt;        &lt;span class="n"&gt;prompt_text&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Complete the following Python function. &lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
            &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Provide ONLY the code, no explanation.&lt;/span&gt;&lt;span class="se"&gt;\n\n&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;prompt&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Construct the request object
&lt;/span&gt;        &lt;span class="n"&gt;entry&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;custom_id&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;custom_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;method&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;generateContent&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;request&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;contents&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;role&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;parts&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;prompt_text&lt;/span&gt;&lt;span class="p"&gt;}]}],&lt;/span&gt;
                &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;generation_config&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;temperature&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;1.0&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;write&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;entry&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Pro Tip:&lt;/strong&gt; Notice &lt;code&gt;custom_id&lt;/code&gt;? That’s your best friend. Since batch jobs are asynchronous, results might not come back in the same order you sent them. The ID helps you map the answer back to the question.&lt;/p&gt;
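&lt;p&gt;To make that mapping concrete, here's a minimal sketch in plain Python (no API calls). The output schema below is an assumption for illustration -- check the Batch API docs for the real field names -- but the idea is the same: parse each line, key by &lt;code&gt;custom_id&lt;/code&gt;, and order stops mattering:&lt;/p&gt;

```python
import json

# Hypothetical output lines, deliberately out of order. The real Batch API
# output schema may differ; the point is the custom_id round-trip.
results_lines = [
    '{"custom_id": "HumanEval_1", "response": {"text": "def b(): pass"}}',
    '{"custom_id": "HumanEval_0", "response": {"text": "def a(): pass"}}',
]

results_map = {}
for line in results_lines:
    record = json.loads(line)
    results_map[record["custom_id"]] = record["response"]["text"]

# Look an answer up by the same sanitized ID we used in the request file.
task_id = "HumanEval/0".replace("/", "_")
print(results_map[task_id])  # def a(): pass
```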

&lt;h2&gt;
  
  
  Step 2: Upload the data to Google Cloud
&lt;/h2&gt;

&lt;p&gt;Now that we have our &lt;code&gt;humaneval_input.jsonl&lt;/code&gt;, we need to upload it to Google's Cloud Platform and tell Gemini to get to work. We are using &lt;code&gt;gemini-2.5-flash-lite&lt;/code&gt; here because it is fast, efficient, and perfect for high-volume tasks like this -- but you could use any supported Gemini model instead.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;google&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;genai&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;google.genai&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;types&lt;/span&gt;

&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;genai&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Client&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;YOUR_API_KEY&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# 1. Upload the file
&lt;/span&gt;&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Uploading file to Gemini API...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;uploaded_file&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;files&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;upload&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="nb"&gt;file&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;jsonl_filename&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;config&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;mime_type&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;application/jsonl&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# 2. Kick off the Batch Job
&lt;/span&gt;&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Submitting batch job...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;batch_job&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;batches&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gemini-2.5-flash-lite&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;src&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;uploaded_file&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;config&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;types&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;CreateBatchJobConfig&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;display_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;humaneval_batch_job&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Batch Job Created: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;batch_job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Current Status: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;batch_job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;state&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And now... we wait. ⏳&lt;/p&gt;

&lt;p&gt;Batch jobs aren't instant (that's the trade-off for the discount), but for an evaluation pipeline, it's perfect. Go stretch, grab a snack, or check Twitter. My HumanEval jobs have typically finished in ~10 minutes, and the majority of batch jobs complete within a few hours.&lt;/p&gt;
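&lt;p&gt;If you'd rather script the wait than babysit it, a small polling helper does the trick. This is a sketch: &lt;code&gt;fetch_state&lt;/code&gt; is a stand-in for something like &lt;code&gt;client.batches.get(name=batch_job.name).state.name&lt;/code&gt;, and the set of terminal state names is an assumption worth verifying against the docs:&lt;/p&gt;

```python
import time

# Assumed terminal states; JOB_STATE_SUCCEEDED is the one we saw above.
TERMINAL_STATES = {"JOB_STATE_SUCCEEDED", "JOB_STATE_FAILED", "JOB_STATE_CANCELLED"}

def wait_for_batch(fetch_state, poll_seconds=30, max_polls=240):
    """Poll fetch_state() until the batch job reaches a terminal state."""
    for _ in range(max_polls):
        state = fetch_state()
        if state in TERMINAL_STATES:
            return state
        time.sleep(poll_seconds)
    raise TimeoutError("Batch job did not finish within the polling budget")

# Demo with a stubbed fetch_state that succeeds on the third poll:
states = iter(["JOB_STATE_PENDING", "JOB_STATE_RUNNING", "JOB_STATE_SUCCEEDED"])
final = wait_for_batch(lambda: next(states), poll_seconds=0)
print(final)  # JOB_STATE_SUCCEEDED
```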

&lt;h2&gt;
  
  
  Step 3: Downloading the results
&lt;/h2&gt;

&lt;p&gt;Once the job hits &lt;code&gt;JOB_STATE_SUCCEEDED&lt;/code&gt;, the results are ready to come home.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Check status (you'd likely loop this in production)
&lt;/span&gt;&lt;span class="n"&gt;job&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;batches&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;batch_job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;state&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;name&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;JOB_STATE_SUCCEEDED&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;remote_filename&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dest&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;file_name&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Downloading results from: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;remote_filename&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Save the output locally
&lt;/span&gt;    &lt;span class="n"&gt;content_bytes&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;files&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;download&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;file&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;remote_filename&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;results.jsonl&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;wb&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;write&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;content_bytes&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;✅ Results saved locally!&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 4: Evaluations
&lt;/h2&gt;

&lt;p&gt;This is where the magic happens. We have the code Gemini wrote; now we need to see if it actually &lt;em&gt;works&lt;/em&gt;. We’re going to loop through our results, extract the Python code (removing those pesky markdown backticks), and run it against the unit tests provided in the HumanEval dataset.&lt;/p&gt;
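&lt;p&gt;Stripping the backticks is fiddlier than it looks, because the model sometimes returns bare code and sometimes a fenced block despite our prompt. Here's a small helper in the spirit of that step (my own utility, not part of any SDK) that handles both cases:&lt;/p&gt;

```python
import re

def extract_python_code(response_text):
    """Return the body of a fenced python code block, or the raw text if unfenced."""
    match = re.search(r"```(?:python)?\s*\n(.*?)```", response_text, re.DOTALL)
    if match:
        return match.group(1).strip()
    return response_text.strip()  # model already returned bare code

fenced = "```python\ndef add(a, b):\n    return a + b\n```"
print(extract_python_code(fenced))
```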

&lt;p&gt;&lt;strong&gt;⚠️ Warning:&lt;/strong&gt; We are using &lt;code&gt;exec()&lt;/code&gt; here. In a production app, running untrusted code is a huge security no-no. But for a local sandbox evaluation, we live on the edge! We’ll wrap it in a &lt;code&gt;signal&lt;/code&gt; timeout so infinite loops don’t freeze our machine.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;signal&lt;/span&gt;

&lt;span class="c1"&gt;# Safety timeout handler
&lt;/span&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;TimeoutException&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;Exception&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; &lt;span class="k"&gt;pass&lt;/span&gt;
&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;timeout_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;signum&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;frame&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; &lt;span class="k"&gt;raise&lt;/span&gt; &lt;span class="nc"&gt;TimeoutException&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="n"&gt;signal&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;signal&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;signal&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;SIGALRM&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;timeout_handler&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;results_map&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;
&lt;span class="c1"&gt;# ... (Load results into results_map dictionary) ...
&lt;/span&gt;
&lt;span class="n"&gt;passed&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;
&lt;span class="n"&gt;total&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ds&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Starting Evaluation with Timeouts...&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;ds&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;cid&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;task_id&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;replace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;/&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;_&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;generated_code&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;results_map&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;cid&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Combine prompt + generated code + test case
&lt;/span&gt;    &lt;span class="n"&gt;test_script&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;prompt&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;generated_code&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="se"&gt;\n\n&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;test&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;check(&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;entry_point&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;)&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

    &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;signal&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;alarm&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="c1"&gt;# 2-second timeout per problem
&lt;/span&gt;        &lt;span class="n"&gt;exec_globals&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;
        &lt;span class="nf"&gt;exec&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;test_script&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;exec_globals&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;signal&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;alarm&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="c1"&gt;# Disable alarm
&lt;/span&gt;        &lt;span class="n"&gt;passed&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;✅ &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;task_id&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;: Passed&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;signal&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;alarm&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;task_id&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;: FAILED (&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nf"&gt;type&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="n"&gt;__name__&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;)&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 5: The results!
&lt;/h2&gt;

&lt;p&gt;So, how did &lt;code&gt;gemini-2.5-flash-lite&lt;/code&gt; do? Let's visualize it using Seaborn.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;matplotlib.pyplot&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;plt&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;seaborn&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;sns&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;pandas&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;pd&lt;/span&gt;

&lt;span class="c1"&gt;# ... (Create DataFrame from results) ...
&lt;/span&gt;
&lt;span class="n"&gt;plt&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;figure&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;figsize&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="n"&gt;sns&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set_theme&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;style&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;whitegrid&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;colors&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Passed&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#2ca02c&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Failed&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#d62728&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#ff7f0e&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;No Code&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#7f7f7f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="n"&gt;ax&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;sns&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;countplot&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;status&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;df&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;hue&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;status&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;palette&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;colors&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;order&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;df&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;status&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;value_counts&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;plt&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;title&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;HumanEval Evaluation Results&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;fontsize&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;16&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;plt&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;show&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
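&lt;p&gt;&lt;em&gt;Aside: the DataFrame construction is elided in the snippet above. A minimal sketch of what it could look like is below, assuming each batch result was collected as a dict with &lt;code&gt;task_id&lt;/code&gt; and &lt;code&gt;status&lt;/code&gt; keys (field names here are illustrative, not from the original notebook):&lt;/em&gt;&lt;/p&gt;

```python
import pandas as pd

# Hypothetical results list -- the real one comes from the eval loop above,
# and these field names are assumptions for illustration only.
results = [
    {"task_id": "HumanEval/0", "status": "Passed"},
    {"task_id": "HumanEval/1", "status": "Failed"},
    {"task_id": "HumanEval/2", "status": "Passed"},
]

# One row per task; the "status" column feeds straight into sns.countplot.
df = pd.DataFrame(results)
print(df["status"].value_counts())
```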



&lt;h3&gt;
  
  
  🥁 Drumroll please...
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmrkahzeqntvsjxz5blkg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmrkahzeqntvsjxz5blkg.png" alt="Bar Chart" width="800" height="473"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Passed:&lt;/strong&gt; 142 tasks (86.59%)&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Failed:&lt;/strong&gt; 8 tasks&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Errors:&lt;/strong&gt; 9 tasks&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;86.59% Pass Rate!&lt;/strong&gt; 🤯&lt;/p&gt;

&lt;p&gt;That is incredibly impressive for a "Flash-Lite" model. It handled complex algorithmic logic, string manipulation, and math problems, passing the vast majority of them.&lt;/p&gt;
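&lt;p&gt;As a quick sanity check, the headline number falls straight out of HumanEval's 164 problems:&lt;/p&gt;

```python
# HumanEval contains 164 problems; 142 of them passed in this run.
passed, total = 142, 164
pass_rate = passed / total * 100
print(f"{pass_rate:.2f}%")  # 86.59%
```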

&lt;h2&gt;
  
  
  Final thoughts
&lt;/h2&gt;

&lt;p&gt;The Gemini Batch API is a game-changer for workflows like this.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Cost:&lt;/strong&gt; Batch requests are billed at a 50% discount, so we paid half the standard token price.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Scale:&lt;/strong&gt; We didn't have to manage async loops or retry logic.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Performance:&lt;/strong&gt; The Gemini 2.5 Flash-Lite model punched way above its weight class.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If you have large datasets, extensive prompting jobs, or nightly evaluations, definitely give the Batch API a spin. And if you'd like to see the full code, &lt;a href="https://github.com/dynamicwebpaige/gemini-and-gemma-examples/blob/main/Gemini_Batch_API_with_Hugging_Face_Datasets.ipynb" rel="noopener noreferrer"&gt;check out this Colab notebook&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Happy building! ✨&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;References:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://ai.google.dev/gemini-api/docs/batch-api?batch=file" rel="noopener noreferrer"&gt;Google Gemini Batch API Docs&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://huggingface.co/datasets/openai/openai_humaneval" rel="noopener noreferrer"&gt;HuggingFace HumanEval Dataset&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>python</category>
      <category>ai</category>
      <category>machinelearning</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Building a Neon-Soaked Last.fm Roast Bot for $1.75/Year</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Thu, 15 Jan 2026 22:04:37 +0000</pubDate>
      <link>https://dev.to/googleai/building-a-neon-soaked-lastfm-roast-bot-for-175year-545m</link>
      <guid>https://dev.to/googleai/building-a-neon-soaked-lastfm-roast-bot-for-175year-545m</guid>
      <description>&lt;p&gt;We all have that one song in our listening history that we hope nobody sees. But instead of hiding my shame, I decided to automate it!&lt;/p&gt;

&lt;p&gt;I recently built my own custom AI bot on &lt;strong&gt;Poe&lt;/strong&gt; called &lt;code&gt;lastfm-roaster&lt;/code&gt;. Its only job is to look at a person's Last.fm music taste (if you're curious, &lt;a href="https://www.last.fm/user/profoundlypaige" rel="noopener noreferrer"&gt;here's mine&lt;/a&gt; as an example) and absolutely destroy them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;But there was a problem:&lt;/strong&gt; The roasts were trapped in the chatbot interface. I wanted them delivered straight to my inbox every morning so I could easily forward the best (worst) burns to my friends for a laugh.&lt;/p&gt;

&lt;p&gt;So, I built a pipeline.&lt;/p&gt;

&lt;p&gt;Today, I’m going to show you how I connected my custom Poe bot to a Python script that pings the API, analyzes my listening history, and uses &lt;strong&gt;Regex&lt;/strong&gt; to inject random neon colors into a beautiful HTML email.&lt;/p&gt;

&lt;p&gt;Best of all? It runs entirely for free on GitHub Actions, and the API costs less than a cup of coffee &lt;strong&gt;for the entire year&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Onward!&lt;/p&gt;

&lt;h2&gt;
  
  
  Building the Bot 🤖
&lt;/h2&gt;

&lt;p&gt;I went to &lt;strong&gt;Poe&lt;/strong&gt; and created a new bot using their &lt;strong&gt;ScriptBot&lt;/strong&gt; feature. I gave it a specific system prompt roughly like this:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"You are a pretentious music critic. Your job is to analyze Last.fm profiles and roast them mercilessly. Be sarcastic, use slang, and do not hold back."&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;After iterating back and forth with the ScriptBot until I was satisfied with the results, I had the intelligence (&lt;code&gt;lastfm-roaster&lt;/code&gt;), but I needed to get the output out of Poe and into my email.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr9hkgsx204zjeuwlnzz2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr9hkgsx204zjeuwlnzz2.png" alt=" " width="800" height="270"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Keys and Secrets 🗝️
&lt;/h2&gt;

&lt;p&gt;We are going to use GitHub Actions to run this, which means we need to keep our API keys safe. &lt;strong&gt;Never hardcode passwords in your script!&lt;/strong&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Get the Keys
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Poe API Key:&lt;/strong&gt; Go to &lt;a href="https://poe.com/api_key" rel="noopener noreferrer"&gt;poe.com/api_key&lt;/a&gt; and grab your key.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Gmail App Password:&lt;/strong&gt; To send email via Python, you can't use your normal password. Go to your Google Account &amp;gt; Security &amp;gt; 2-Step Verification &amp;gt; &lt;strong&gt;App Passwords&lt;/strong&gt;. Generate one and save that 16-character code.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  2. Store them in GitHub
&lt;/h3&gt;

&lt;p&gt;Create a new private repository on GitHub, then go to &lt;strong&gt;Settings &amp;gt; Secrets and variables &amp;gt; Actions&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Add these three secrets:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;code&gt;POE_API_KEY&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;EMAIL_ADDRESS&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;code&gt;EMAIL_PASSWORD&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqfof1p1hxewda2gk55oc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqfof1p1hxewda2gk55oc.png" alt=" " width="800" height="212"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3: The Code 🐍
&lt;/h2&gt;

&lt;p&gt;We need a few Python libraries to make this magic happen. Create a file named &lt;code&gt;requirements.txt&lt;/code&gt; in your repo:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;openai
markdown
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;(Yes, we use &lt;code&gt;openai&lt;/code&gt;! Poe's API is compatible with the OpenAI client, which makes it super easy to use. Gemini 2.5 Flash is accessed through the Poe API, so I don't have to manage a separate Gemini key, either.)&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  The "Neon" Script (&lt;code&gt;lastfm_roast.py&lt;/code&gt;)
&lt;/h3&gt;

&lt;p&gt;Here is the cool part. I didn't want the email to look boring. I wanted the "insults" (the bold text) to pop in different colors.&lt;/p&gt;

&lt;p&gt;We use Python's &lt;code&gt;re&lt;/code&gt; (Regex) and &lt;code&gt;itertools&lt;/code&gt; to find every &lt;code&gt;&amp;lt;strong&amp;gt;&lt;/code&gt; tag my bot generates and cycle through a "Dracula" color palette to inject inline CSS styles.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;smtplib&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;markdown&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;re&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;itertools&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;email.message&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;EmailMessage&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;openai&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;OpenAI&lt;/span&gt;

&lt;span class="c1"&gt;# Configs (Loaded safely from GitHub Secrets)
&lt;/span&gt;&lt;span class="n"&gt;POE_API_KEY&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;environ&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;POE_API_KEY&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;EMAIL_ADDRESS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;environ&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;EMAIL_ADDRESS&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;EMAIL_PASSWORD&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;environ&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;EMAIL_PASSWORD&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;LASTFM_URL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://www.last.fm/user/profoundlypaige&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

&lt;span class="c1"&gt;# --- NEON PALETTE ---
# A list of bright colors that look good on dark backgrounds
# (Pink, Cyan, Green, Orange, Purple, Yellow)
&lt;/span&gt;&lt;span class="n"&gt;COLORS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#FF79C6&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#8BE9FD&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#50FA7B&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#FFB86C&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#BD93F9&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;#F1FA8C&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_roast&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Pings the Poe API to get the roast.&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;OpenAI&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;api_key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;POE_API_KEY&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;base_url&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://api.poe.com/v1&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;🔥 Fetching roast from Poe...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;completions&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;lastfm-roaster&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;messages&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;
                &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;role&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;content&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Roast my music taste: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;LASTFM_URL&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
            &lt;span class="p"&gt;]&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;choices&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="n"&gt;message&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;
    &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Error fetching roast: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;inject_colors&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;html_content&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    Finds every &amp;lt;strong&amp;gt; tag and injects a different color from the palette.
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;color_cycle&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;itertools&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cycle&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;COLORS&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;replace_match&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;next_color&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;next&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;color_cycle&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="c1"&gt;# Returns &amp;lt;strong style="color: #CODE"&amp;gt;
&lt;/span&gt;        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;&amp;lt;strong style=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;color: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;next_color&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;&amp;gt;&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;

    &lt;span class="c1"&gt;# Regex to replace &amp;lt;strong&amp;gt; with the colored version
&lt;/span&gt;    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;re&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sub&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;r&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;&amp;lt;strong&amp;gt;&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;replace_match&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;html_content&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;create_html_email&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;roast_text&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="c1"&gt;# 1. Convert Markdown to basic HTML
&lt;/span&gt;    &lt;span class="n"&gt;raw_html&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;markdown&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;markdown&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;roast_text&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# 2. Inject the rotating neon colors into bold tags
&lt;/span&gt;    &lt;span class="n"&gt;colorful_html&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;inject_colors&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;raw_html&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# 3. Wrap in the styled container
&lt;/span&gt;    &lt;span class="n"&gt;html_template&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    &amp;lt;!DOCTYPE html&amp;gt;
    &amp;lt;html&amp;gt;
    &amp;lt;head&amp;gt;
    &amp;lt;style&amp;gt;
        body {{ margin: 0; padding: 0; background-color: #121212; font-family: &lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Helvetica Neue&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;, Helvetica, Arial, sans-serif; }}

        .container {{ 
            max-width: 600px; 
            margin: 40px auto; 
            background-color: #1e1e1e; 
            border-radius: 16px; 
            overflow: hidden; 
            box-shadow: 0 10px 30px rgba(0,0,0,0.5); 
            border: 1px solid #333;
        }}

        .header {{ 
            background: linear-gradient(135deg, #2b2b2b 0%, #1a1a1a 100%); 
            padding: 30px; 
            text-align: center; 
            border-bottom: 2px solid #333;
        }}

        /* The title is now a gradient text effect */
        .header h1 {{ 
            margin: 0; font-size: 28px; letter-spacing: 2px; text-transform: uppercase; 
            background: -webkit-linear-gradient(#FF79C6, #8BE9FD);
            -webkit-background-clip: text;
            -webkit-text-fill-color: transparent;
        }}

        .content {{ padding: 30px; color: #d1d5db; line-height: 1.7; font-size: 16px; }}

        h2 {{ 
            color: #ffffff; 
            border-left: 5px solid #BD93F9; /* Purple accent */
            padding-left: 15px;
            margin-top: 30px; 
            text-transform: uppercase;
            font-size: 18px;
            letter-spacing: 1px;
        }}

        ul {{ padding-left: 20px; }}
        li {{ margin-bottom: 10px; }}

        /* Styles for links */
        a {{ color: #8BE9FD; text-decoration: none; border-bottom: 1px dotted #8BE9FD; }}

        .footer {{ 
            background-color: #121212; 
            padding: 20px; 
            text-align: center; 
            font-size: 12px; 
            color: #555; 
        }}
    &amp;lt;/style&amp;gt;
    &amp;lt;/head&amp;gt;
    &amp;lt;body&amp;gt;
        &amp;lt;div class=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;container&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;&amp;gt;
            &amp;lt;div class=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;header&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;&amp;gt;
                &amp;lt;h1&amp;gt;🔥 The Daily Burn&amp;lt;/h1&amp;gt;
            &amp;lt;/div&amp;gt;
            &amp;lt;div class=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;content&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;&amp;gt;
                &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;colorful_html&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;
            &amp;lt;/div&amp;gt;
            &amp;lt;div class=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;footer&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;&amp;gt;
                Served fresh by Poe API, Gemini 2.5 Flash, &amp;amp; GitHub Actions&amp;lt;br&amp;gt;
                &amp;lt;a href=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;LASTFM_URL&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt; style=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;color:#555; border:none;&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;&amp;gt;View your tragic Last.fm Profile&amp;lt;/a&amp;gt;
            &amp;lt;/div&amp;gt;
        &amp;lt;/div&amp;gt;
    &amp;lt;/body&amp;gt;
    &amp;lt;/html&amp;gt;
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;html_template&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;send_email&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;roast_text&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;msg&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;EmailMessage&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Subject&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Your Daily Last.fm Roast 🎸&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
    &lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;From&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;EMAIL_ADDRESS&lt;/span&gt;
    &lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;To&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;EMAIL_ADDRESS&lt;/span&gt;
    &lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set_content&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;roast_text&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_alternative&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;create_html_email&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;roast_text&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;subtype&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;html&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;smtplib&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;SMTP_SSL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;smtp.gmail.com&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;465&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;smtp&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;smtp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;login&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;EMAIL_ADDRESS&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;EMAIL_PASSWORD&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;smtp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;send_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;✅ Email sent successfully!&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;❌ Failed to send email: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;__name__&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;__main__&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;roast&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;get_roast&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="nf"&gt;send_email&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;roast&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Automating with GitHub Actions 🤖
&lt;/h2&gt;

&lt;p&gt;I don't want to run this manually; I want to be roasted automatically. So we'll use a GitHub Actions workflow to run this script every day at 12:00 UTC.&lt;/p&gt;

&lt;p&gt;Create &lt;code&gt;.github/workflows/daily_roast.yml&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Daily Lastfm Roast&lt;/span&gt;
&lt;span class="na"&gt;on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;schedule&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;cron&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;0&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;12&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;*&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;*&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;*'&lt;/span&gt; &lt;span class="c1"&gt;# Noon UTC, every day&lt;/span&gt;

&lt;span class="na"&gt;jobs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;roast&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;runs-on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;ubuntu-latest&lt;/span&gt;
    &lt;span class="na"&gt;steps&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/checkout@v2&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;uses&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;actions/setup-python@v2&lt;/span&gt;
        &lt;span class="na"&gt;with&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;python-version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.9'&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;pip install -r requirements.txt&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;run&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;python lastfm_roast.py&lt;/span&gt;
        &lt;span class="na"&gt;env&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
          &lt;span class="na"&gt;POE_API_KEY&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.POE_API_KEY }}&lt;/span&gt;
          &lt;span class="na"&gt;EMAIL_ADDRESS&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.EMAIL_ADDRESS }}&lt;/span&gt;
          &lt;span class="na"&gt;EMAIL_PASSWORD&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;${{ secrets.EMAIL_PASSWORD }}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
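&lt;p&gt;The workflow hands those secrets to the script as environment variables. As a minimal sketch of the receiving side (the variable names match the workflow above; the &lt;code&gt;get_env&lt;/code&gt; helper is just illustrative, not part of the original script), you can fail fast if one is missing:&lt;/p&gt;

```python
import os
import sys

def get_env(name):
    """Read a required secret from the environment; exit early if it's unset."""
    value = os.environ.get(name)
    if not value:
        sys.exit(f"Missing required environment variable: {name}")
    return value

# The same names the GitHub Actions workflow exports:
# POE_API_KEY, EMAIL_ADDRESS, EMAIL_PASSWORD
```

&lt;p&gt;A missing secret then fails the Actions run immediately with a readable message, instead of surfacing later as a cryptic SMTP login error.&lt;/p&gt;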



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3pjiwsb3zmr64usmtuzd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3pjiwsb3zmr64usmtuzd.png" alt=" " width="800" height="321"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Is it worth it? 💸
&lt;/h2&gt;

&lt;p&gt;This is my favorite part. Poe charges "Compute Points" to run the underlying model that powers the bot (mine uses Gemini 2.5 Flash). I checked my consumption after a few test runs to see what kind of bill I was racking up:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvdotoj03npw05u8brink.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvdotoj03npw05u8brink.png" alt=" " width="800" height="214"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here is the breakdown from my dashboard:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Model:&lt;/strong&gt; Gemini 2.5 Flash&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Cost per Request:&lt;/strong&gt; ~161 Compute Points&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Dollar Cost:&lt;/strong&gt; ~$0.0048 per roast&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If I run this every single day for a year:&lt;br&gt;
&lt;code&gt;$0.0048 * 365 days = $1.75&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;$1.75 per year.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For less than two bucks, I can get a state-of-the-art LLM to analyze my listening trends and tell me my taste in indie pop is "derivative and sad" every single morning. That is high-value ROI. 📈&lt;/p&gt;
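&lt;p&gt;If you want to double-check that math yourself, it really is just one multiplication (a quick sketch, using the per-roast cost from my dashboard):&lt;/p&gt;

```python
# Approximate dollar cost of a single roast (from the Poe dashboard).
cost_per_roast = 0.0048

# One roast per day, every day, for a year.
annual_cost = cost_per_roast * 365
print(f"${annual_cost:.2f} per year")  # $1.75 per year
```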

&lt;h2&gt;
  
  
  The Result: Before vs After
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Before (Chat Interface):&lt;/strong&gt;&lt;br&gt;
Trapped in an app. Hard to share. Markdown text.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;After (The Neon Upgrade):&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0h7ijwpb7ulzrcqil4uv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0h7ijwpb7ulzrcqil4uv.png" alt=" " width="800" height="606"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, when I open my email, I get a sleek, dark-mode card. The band names—the targets of the insults—are highlighted in &lt;strong&gt;Pink&lt;/strong&gt;, &lt;strong&gt;Cyan&lt;/strong&gt;, and &lt;strong&gt;Green&lt;/strong&gt;, making sure I don't miss exactly &lt;em&gt;who&lt;/em&gt; I'm being mocked for listening to.&lt;/p&gt;

&lt;p&gt;And because it's an email, I can instantly forward the roast to my friends so they, too, can laugh at my pain.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping Up
&lt;/h2&gt;

&lt;p&gt;This pattern (Custom Bot + API + HTML Generation + Actions) is my go-to for almost all my personal automation. It’s robust, free to host, and creates genuinely fun daily interactions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Repositories mentioned:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://poe.com/lastfm-roaster" rel="noopener noreferrer"&gt;lastfm-roaster on Poe&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://developer.poe.com/" rel="noopener noreferrer"&gt;Poe API Documentation&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://ai.google.dev" rel="noopener noreferrer"&gt;Gemini Developer Documentation&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let me know in the comments if you try this out, or share the worst roast the AI gave you, and happy coding! ✨&lt;/p&gt;

</description>
      <category>python</category>
      <category>automation</category>
      <category>githubactions</category>
      <category>ai</category>
    </item>
    <item>
      <title>Where we're going, we don't need chatbots: introducing the Antigravity IDE 🚀</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Tue, 23 Dec 2025 02:19:15 +0000</pubDate>
      <link>https://dev.to/googleai/where-were-going-we-dont-need-chatbots-introducing-the-antigravity-ide-2c3k</link>
      <guid>https://dev.to/googleai/where-were-going-we-dont-need-chatbots-introducing-the-antigravity-ide-2c3k</guid>
      <description>&lt;p&gt;Lately, I’ve been living in a new world—specifically, a world called &lt;strong&gt;&lt;a href="https://antigravity.google/" rel="noopener noreferrer"&gt;Antigravity&lt;/a&gt;&lt;/strong&gt;. It’s the agent-first IDE that the Google DeepMind team has been building, and honestly? It’s completely ruined "normal" coding for me. &lt;/p&gt;

&lt;p&gt;We talk a lot about AI in development, but most of the time this just ends up being a chat box in the browser. Antigravity is different - it's built on an &lt;strong&gt;"agent-first" architecture&lt;/strong&gt; (shoutout to the Windsurf team who brought this idea to life!). The idea is that the AI shouldn’t just &lt;em&gt;suggest&lt;/em&gt; code; it should be an autonomous partner that can navigate the terminal, the editor, and even a live browser to get things done. We're even starting to incorporate many of these agentic features into &lt;a href="https://ai.dev/build" rel="noopener noreferrer"&gt;AI Studio's Build feature&lt;/a&gt;!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fib7u4l6p2nwa2jv8onbp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fib7u4l6p2nwa2jv8onbp.png" width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I wanted to share two specific features that have become absolute game-changers for my daily workflow.&lt;/p&gt;




&lt;h2&gt;
  
  
  QA testing and documentation on auto-pilot
&lt;/h2&gt;

&lt;p&gt;If you're like me, the most exhausting part of any sprint isn't the code - it’s the housekeeping. You know the drill: you finish a feature, then you have to spend hours taking screenshots and writing detailed how-to guides for the quality assurance team and for users. It's incredibly important, but also tedious, work.&lt;/p&gt;

&lt;p&gt;In Antigravity, I’ve started using the &lt;strong&gt;Browser Subagent&lt;/strong&gt; (powered by Gemini 2.5 Computer Use) to handle this. I can literally just say: &lt;em&gt;"Go to this staging site, identify five core user journeys, and make a tutorial for all of them."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The agent doesn't just "guess." It explores the DOM, interacts with elements, and records its progress. But the "secret sauce" is the &lt;strong&gt;Artifacts System&lt;/strong&gt;. Instead of a wall of text, the agent produces a &lt;strong&gt;Walkthrough Artifact&lt;/strong&gt; - which is a structured, human-readable document complete with verification results, detailed steps expressed as text, and screenshots.&lt;/p&gt;

&lt;p&gt;&lt;iframe class="tweet-embed" id="tweet-2003159982464696827-445" src="https://platform.twitter.com/embed/Tweet.html?id=2003159982464696827"&gt;
&lt;/iframe&gt;

&lt;/p&gt;

&lt;p&gt;Because Antigravity supports &lt;strong&gt;asynchronous feedback&lt;/strong&gt;, I can hop into that Implementation Plan or Walkthrough and leave &lt;strong&gt;Google Docs-style comments&lt;/strong&gt; directly on the text or screenshots - there's no black box. The agent reads my feedback in real-time and adjusts its next steps without me having to restart the whole task.&lt;/p&gt;




&lt;h2&gt;
  
  
  Toolkit for Antigravity extension
&lt;/h2&gt;

&lt;p&gt;If you’re pushing the limits of agentic coding, you’re likely bouncing between models. I usually have &lt;strong&gt;Gemini 3 Pro&lt;/strong&gt; handling complex architectural planning while &lt;strong&gt;Gemini 3 Flash&lt;/strong&gt; handles the high-velocity subtasks. &lt;/p&gt;

&lt;p&gt;Keeping track of your quota (token consumption) across Google AI Studio, Anthropic's models (Opus, Sonnet), and OpenAI's models is usually a nightmare. Luckily, a community member built an incredible open-source extension called &lt;strong&gt;&lt;a href="https://open-vsx.org/extension/n2ns/antigravity-panel" rel="noopener noreferrer"&gt;Toolkit for Antigravity&lt;/a&gt;&lt;/strong&gt; (you can find it on the OpenVSX marketplace).&lt;/p&gt;

&lt;p&gt;&lt;iframe class="tweet-embed" id="tweet-2003052007192772726-87" src="https://platform.twitter.com/embed/Tweet.html?id=2003052007192772726"&gt;
&lt;/iframe&gt;

&lt;/p&gt;

&lt;p&gt;It adds these sleek &lt;strong&gt;Semi-Arc Gauges&lt;/strong&gt; to your sidebar that show:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Real-time Quota Tracking:&lt;/strong&gt; Hour-by-hour stats for Gemini, Claude 4.5, and the OpenAI GPT models.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Cache Management:&lt;/strong&gt; A one-click way to browse and purge conversation caches when things get a bit "loopy."&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Privacy First:&lt;/strong&gt; It’s 100% local with no telemetry, which is a must for those of us working in enterprise repos.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Shifting to "Manager View"
&lt;/h2&gt;

&lt;p&gt;The biggest mental shift with Antigravity is moving from the &lt;strong&gt;Editor View&lt;/strong&gt; to the &lt;strong&gt;Manager View&lt;/strong&gt;. In the Editor, you’re writing code; in the Manager, you’re an architect overseeing multiple agents working in parallel. &lt;/p&gt;

&lt;p&gt;Whether you’re refactoring a legacy auth module or having an agent vibe code a new frontend from scratch, the goal is to keep you building without distractions. Antigravity handles the terminal commands, the dependency installs, and the UI verification so you can focus on &lt;em&gt;what&lt;/em&gt; you're building instead of &lt;em&gt;how&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;So anyhow, I’m curious - what’s the one part of your development workflow you wish you could just hand off to an agent today? For me, it was definitely the "screenshotting for tutorials" chore, and I’m never going back.&lt;/p&gt;

&lt;p&gt;Catch you in the comments! ✌️&lt;/p&gt;

</description>
      <category>programming</category>
      <category>ai</category>
      <category>productivity</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Give your AI Studio deployed app a custom URL</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Thu, 11 Dec 2025 17:55:08 +0000</pubDate>
      <link>https://dev.to/googleai/give-your-ai-studio-deployed-app-a-custom-url-30f</link>
      <guid>https://dev.to/googleai/give-your-ai-studio-deployed-app-a-custom-url-30f</guid>
      <description>&lt;p&gt;So, you’ve just built an incredible AI application using Build in &lt;a href="https://ai.dev/build" rel="noopener noreferrer"&gt;Google’s AI Studio&lt;/a&gt;. You hit &lt;code&gt;Deploy&lt;/code&gt;, the code flies through the ether, and your app is live! But then you look at the URL. It’s something like: &lt;a href="https://burning-man-animal-cuddle-614365371127.us-west1.run.app/" rel="noopener noreferrer"&gt;https://burning-man-animal-cuddle-614365371127.us-west1.run.app/&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;While I personally love the vibe of &lt;strong&gt;"Burning Man Animal Cuddle"&lt;/strong&gt;, your users might find it a bit... suspicious. You bought a cool domain on &lt;a href="https://www.namecheap.com" rel="noopener noreferrer"&gt;Namecheap&lt;/a&gt; (like vibe-compose.com), and you want to use that instead.&lt;/p&gt;

&lt;p&gt;If you’ve never connected a Google Cloud Run service to a third-party registrar like Namecheap, the "DNS Dance" can be confusing. Here is exactly how to do it without pulling your hair out.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 1: Add the Mapping in Cloud Run
&lt;/h2&gt;

&lt;p&gt;First, head over to your Google Cloud Console and navigate to &lt;a href="https://console.cloud.google.com/run/domains" rel="noopener noreferrer"&gt;Cloud Run domains&lt;/a&gt;. Find your project in the dropdown list in the top left (e.g., &lt;code&gt;Generative Language Client&lt;/code&gt;). You’ll see a button to &lt;code&gt;Add Mapping&lt;/code&gt; in the &lt;code&gt;Domain Mappings&lt;/code&gt; section.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhzptfkumk5oenzizsa3b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhzptfkumk5oenzizsa3b.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the dropdown:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Select service:&lt;/strong&gt; Pick your app.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Select a verified domain:&lt;/strong&gt; Choose "Verify a new domain..."&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Base domain:&lt;/strong&gt; Type your root domain (e.g., vibe-compose.com). Do not type www yet. We start with the root.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 2: The Verification Dance
&lt;/h2&gt;

&lt;p&gt;Google needs to know you actually own the domain before they route traffic to it. When you click verify, a modal will pop up giving you a TXT record.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvugat1yc2ip6bcaplp42.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvugat1yc2ip6bcaplp42.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz6dvhrglplckvmzqtodj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz6dvhrglplckvmzqtodj.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you've entered your website name and hit &lt;code&gt;Continue&lt;/code&gt;, copy that long string of text. It's time to leave Google for a moment and head to Namecheap.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F09esp5vjxlt7d9nz9uba.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F09esp5vjxlt7d9nz9uba.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Log into &lt;strong&gt;Namecheap&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Go to your &lt;code&gt;Domain List&lt;/code&gt; -&amp;gt; &lt;code&gt;Manage&lt;/code&gt; -&amp;gt; &lt;code&gt;Advanced DNS&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click &lt;code&gt;Add New Record&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Select &lt;code&gt;TXT Record&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Host: @&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Value: Paste that Google verification string here.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;TTL: Automatic.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Save changes.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;☕ &lt;strong&gt;Coffee Break:&lt;/strong&gt; DNS changes need a little time to propagate. Give it a couple of minutes, then go back to the Google Cloud tab and press VERIFY.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3: Mapping the "Naked" Domain
&lt;/h2&gt;

&lt;p&gt;Once verified, the hard part begins. You want your site to load at &lt;code&gt;vibe-compose.com&lt;/code&gt; (no www). This is called the "Naked" or "Root" domain.&lt;/p&gt;

&lt;p&gt;Cloud Run does not use CNAMEs for root domains. It uses A Records (IPv4) and AAAA Records (IPv6). Cloud Run will display a list of IP addresses. You need to add all of them to Namecheap.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5epa535avyjgwxvh8wal.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5epa535avyjgwxvh8wal.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In Namecheap Advanced DNS, your list should look like the image above:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;4 A Records: Host is &lt;code&gt;@&lt;/code&gt;, Values are the IPs ending in &lt;code&gt;.32.21&lt;/code&gt;, &lt;code&gt;.34.21&lt;/code&gt;, &lt;code&gt;.36.21&lt;/code&gt;, and &lt;code&gt;.38.21&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;4 AAAA Records: Host is &lt;code&gt;@&lt;/code&gt;, Values are the long IPv6 addresses.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
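&lt;p&gt;Once the records have had time to propagate, you can spot-check what the root domain resolves to without leaving Python (a standard-library sketch; &lt;code&gt;vibe-compose.com&lt;/code&gt; stands in for your own domain):&lt;/p&gt;

```python
import socket

def resolve_a_records(domain):
    """Return the unique IPv4 addresses a domain currently resolves to."""
    infos = socket.getaddrinfo(domain, 443, family=socket.AF_INET,
                               type=socket.SOCK_STREAM)
    return sorted({info[4][0] for info in infos})

# Example (after propagation, the result should match the four A records
# that Cloud Run asked you to create):
# resolve_a_records("vibe-compose.com")
```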

&lt;h2&gt;
  
  
  Step 4: Don't Forget the &lt;code&gt;www&lt;/code&gt;
&lt;/h2&gt;

&lt;p&gt;If you stop now, vibe-compose.com will work, but &lt;a href="http://www.vibe-compose.com" rel="noopener noreferrer"&gt;www.vibe-compose.com&lt;/a&gt; won't resolve. We need to map the subdomain too.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Go back to &lt;strong&gt;Cloud Run Domain Mappings&lt;/strong&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click Add Mapping again.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Type &lt;a href="http://www.vibe-compose.com" rel="noopener noreferrer"&gt;www.vibe-compose.com&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Google will tell you to add a CNAME.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Back in Namecheap, add one final record:&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Type:&lt;/strong&gt; &lt;code&gt;CNAME&lt;/code&gt; Record&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Host:&lt;/strong&gt; &lt;code&gt;www&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Value:&lt;/strong&gt; &lt;code&gt;ghs.googlehosted.com.&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;⚠️ The "Gotcha": Many devs accidentally paste &lt;code&gt;ghs.googlehosted.com&lt;/code&gt; into the "Host" field. Don't do that! The Host is &lt;code&gt;www&lt;/code&gt;. If Namecheap adds a period at the end, that should be okay.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 5: The spinning wheel of patience
&lt;/h2&gt;

&lt;p&gt;Once your DNS records are in, head back to the Cloud Run dashboard. You will see your domains listed with a yellow spinner or a green checkmark.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9ae7ia7z6hoik77kdv9n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9ae7ia7z6hoik77kdv9n.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What is happening now?
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;DNS Propagation:&lt;/strong&gt; Google is checking to see if Namecheap updated (can take 10 mins to 24 hours).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Certificate Provisioning:&lt;/strong&gt; Once DNS is found, Google automatically creates a managed SSL certificate for you.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Taking the extra 5 minutes to map a custom domain adds a massive layer of polish to your projects. Now, instead of sending people to &lt;code&gt;burning-man-animal-cuddle&lt;/code&gt;, you can send them to &lt;a href="https://www.vibe-compose.com" rel="noopener noreferrer"&gt;vibe-compose.com&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;(Though, honestly, I'm going to miss the animal cuddles).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdcz5qpq54do2sfwyg8yp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdcz5qpq54do2sfwyg8yp.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Happy Coding! 🚀&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>programming</category>
      <category>javascript</category>
      <category>beginners</category>
    </item>
    <item>
      <title>I built a product "BS Detector" using Gemini 2.0 Flash and AI Studio 🕵️‍♂️</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Wed, 19 Nov 2025 05:33:43 +0000</pubDate>
      <link>https://dev.to/googleai/i-built-a-product-bs-detector-using-gemini-20-flash-and-ai-studio-69j</link>
      <guid>https://dev.to/googleai/i-built-a-product-bs-detector-using-gemini-20-flash-and-ai-studio-69j</guid>
      <description>&lt;p&gt;Let's be honest: online product reviews are often... broken. You see a gadget with 4.8 stars, but when you dig in, the reviews are either bots, "incentivized" reviews, or people who used the product for 5 minutes.&lt;/p&gt;

&lt;p&gt;I got tired of tab-switching between Amazon, Reddit, and YouTube to find the &lt;em&gt;actual&lt;/em&gt; truth about a product. So, I built a Chrome Extension to do it for me! It’s called &lt;strong&gt;The BS Detector&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;It lives in the Google Chrome Side Panel, scrapes the product name, and uses &lt;strong&gt;Google Gemini 2.0 Flash&lt;/strong&gt; (with Search Grounding) to cross-reference the product against real discussions on Reddit and independent forums. You could also use Gemini 2.5 Flash-Lite or Gemini 2.5 Flash.&lt;/p&gt;

&lt;p&gt;Here is how I built it, the tech stack I used, and the "Aha!" moment regarding JSON schemas.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Tech Stack
&lt;/h2&gt;

&lt;p&gt;I wanted to keep this lightweight, which meant no backend server. That means that the user has to supply their own API key, but the footprint for project files is pretty minimal:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Manifest V3&lt;/strong&gt;: The standard for modern Chrome Extensions.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Chrome Side Panel API&lt;/strong&gt;: Better than a popup because it stays open while you browse.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Gemini 2.0 Flash&lt;/strong&gt;: Fast, cheap (free tier available), and supports &lt;strong&gt;Google Search Grounding&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Vanilla JS &amp;amp; CSS&lt;/strong&gt;: Because sometimes you don't need a framework.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  1. The manifest and side panel
&lt;/h2&gt;

&lt;p&gt;Popups are annoying for research tools because they close when you click away. The Side Panel is the perfect UX for this.&lt;/p&gt;

&lt;p&gt;In &lt;code&gt;manifest.json&lt;/code&gt;, we define the behavior:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="nl"&gt;"side_panel"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"default_path"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"sidepanel.html"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="err"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="nl"&gt;"permissions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"sidePanel"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"activeTab"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"scripting"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"storage"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And in &lt;code&gt;background.js&lt;/code&gt;, we ensure the panel opens when the icon is clicked:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;chrome&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;sidePanel&lt;/span&gt;
  &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;setPanelBehavior&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;openPanelOnActionClick&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt;
  &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  2. Getting the context
&lt;/h2&gt;

&lt;p&gt;When the user clicks "Analyze," we need to know &lt;em&gt;what&lt;/em&gt; they are looking at. I used &lt;code&gt;chrome.scripting&lt;/code&gt; to inject a quick function into the active tab to grab the title.&lt;/p&gt;

&lt;p&gt;We look for specific Amazon ID selectors first, then fall back to the &lt;code&gt;&amp;lt;h1&amp;gt;&lt;/code&gt; tag.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;chrome&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;scripting&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;executeScript&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;target&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;tabId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;tab&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="na"&gt;func&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;amzn&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getElementById&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;productTitle&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;h1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;querySelector&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;h1&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;amzn&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="nx"&gt;amzn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;innerText&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;h1&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="nx"&gt;h1&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;innerText&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;title&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;results&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// Update UI with product title&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  3. Gemini 2.0 Flash + Search grounding
&lt;/h2&gt;

&lt;p&gt;This is where the magic happens. It's no secret that LLMs hallucinate: if I ask GPT-4 about a specific generic dropshipped air purifier, it might invent features or capabilities the product doesn't have.&lt;/p&gt;

&lt;p&gt;To fix this, I used Gemini's &lt;strong&gt;Search Grounding&lt;/strong&gt;. This allows the model to query Google Search live during generation.&lt;/p&gt;

&lt;p&gt;Here is the prompt strategy I used in &lt;code&gt;sidepanel.js&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;prompt&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`
You are a cynical consumer investigator. Analyze this product: "&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;productTitle&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;".

1. Use Google Search to find discussions on Reddit, YouTube, and independent forums.
2. Ignore marketing fluff. Look for "dealbreakers".
3. Determine if this is a high-quality item or generic "dropshipped" junk.
`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I pass &lt;code&gt;tools: [{ googleSearch: {} }]&lt;/code&gt; in the API payload. This tells Gemini: "If you don't know, Google it."&lt;/p&gt;
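&lt;p&gt;For reference, here is roughly how that request body is shaped. (This is a sketch: only the &lt;code&gt;tools&lt;/code&gt; entry mirrors what I described above; the helper name and the rest of the field layout are illustrative.)&lt;/p&gt;

```javascript
// Sketch of the Gemini API request body (helper name and field layout
// are illustrative; only the `tools` entry is the grounding switch).
function buildRequestBody(prompt) {
  return {
    contents: [{ parts: [{ text: prompt }] }],
    // Search Grounding: lets the model run live Google Search queries.
    tools: [{ googleSearch: {} }],
  };
}

const body = buildRequestBody('Analyze this product: "Generic Air Purifier"');
```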

&lt;h2&gt;
  
  
  4. Controlled JSON output
&lt;/h2&gt;

&lt;p&gt;The biggest pain in AI engineering is parsing the response. You usually get markdown, backticks, or conversational filler ("Here is the JSON you asked for...").&lt;/p&gt;

&lt;p&gt;Gemini 2.0 supports &lt;strong&gt;Controlled Generation&lt;/strong&gt; via &lt;code&gt;responseSchema&lt;/code&gt;: you define exactly what the JSON should look like, and the API enforces it, so there's no more regex parsing.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;generationConfig&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nl"&gt;responseMimeType&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;application/json&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nx"&gt;responseSchema&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nl"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;OBJECT&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="nx"&gt;properties&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="nl"&gt;real_score&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;NUMBER&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
            &lt;span class="nx"&gt;verdict&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nl"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;STRING&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
            &lt;span class="nx"&gt;dealbreakers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; 
                &lt;span class="nl"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ARRAY&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
                &lt;span class="nx"&gt;items&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nl"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;STRING&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; 
            &lt;span class="p"&gt;},&lt;/span&gt;
            &lt;span class="nx"&gt;pros&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; 
                &lt;span class="nl"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;ARRAY&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
                &lt;span class="nx"&gt;items&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nl"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;STRING&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; 
            &lt;span class="p"&gt;},&lt;/span&gt;
            &lt;span class="nx"&gt;source_count&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nl"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;NUMBER&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="nx"&gt;required&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;real_score&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;verdict&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;dealbreakers&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;pros&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Because of this schema, in my fetch request, I can simply do:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;analysis&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;parse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;candidates&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;parts&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nx"&gt;text&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="c1"&gt;// It just works. Every time.&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  The results
&lt;/h2&gt;

&lt;p&gt;The extension takes the product title, reads through the "BS" marketing, checks Reddit threads about the item, and outputs:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;A Real Sentiment Score:&lt;/strong&gt; (e.g., 6.5/10).&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;The Verdict:&lt;/strong&gt; A concise summary of &lt;em&gt;actual&lt;/em&gt; user experiences.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Dealbreakers:&lt;/strong&gt; The stuff Amazon hides (e.g., "proprietary charging cable," "app requires login").&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;iframe class="tweet-embed" id="tweet-1990887860237414821-994" src="https://platform.twitter.com/embed/Tweet.html?id=1990887860237414821"&gt;
&lt;/iframe&gt;

  // Detect dark theme
  var iframe = document.getElementById('tweet-1990887860237414821-994');
  if (document.body.className.includes('dark-theme')) {
    iframe.src = "https://platform.twitter.com/embed/Tweet.html?id=1990887860237414821&amp;amp;theme=dark"
  }



&lt;/p&gt;

&lt;h2&gt;
  
  
  TL;DR
&lt;/h2&gt;

&lt;p&gt;This project took a few minutes to build in AI Studio, but saves me time every time I shop. It’s a great example of how powerful &lt;strong&gt;Search Grounding&lt;/strong&gt; is when combined with client-side extensions. You don't need a massive backend to build useful AI tools anymore.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Future Improvements:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Add history storage to compare products.&lt;/li&gt;
&lt;li&gt;  Analyze the price history as well, to see how it's changed!&lt;/li&gt;
&lt;li&gt;  Detect specific "fake review" patterns in the text itself.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The code is open source (link below). Go fork it and stop buying junk: &lt;a href="https://github.com/dynamicwebpaige/product-bs-detector" rel="noopener noreferrer"&gt;https://github.com/dynamicwebpaige/product-bs-detector&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>ai</category>
      <category>programming</category>
      <category>productivity</category>
    </item>
    <item>
      <title>📻 I built an infinite 90s boombox with Gemini + Lyria (and it has an AI DJ!)</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Tue, 18 Nov 2025 06:08:28 +0000</pubDate>
      <link>https://dev.to/googleai/i-built-an-infinite-90s-boombox-with-gemini-lyria-and-it-has-an-ai-dj-3jh8</link>
      <guid>https://dev.to/googleai/i-built-an-infinite-90s-boombox-with-gemini-lyria-and-it-has-an-ai-dj-3jh8</guid>
      <description>&lt;p&gt;I recently built an experiment that I honestly can’t stop listening to. It’s a virtual, web-based Boombox. You tune the dial, the station changes, and the music crossfades in real-time.&lt;/p&gt;

&lt;p&gt;But here is the kicker: &lt;strong&gt;The DJ is AI.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Every time you settle on a station, a dynamically generated voice (courtesy of &lt;a href="https://ai.google.dev/gemini-api/docs/speech-generation" rel="noopener noreferrer"&gt;Gemini text-to-speech&lt;/a&gt;) chimes in to introduce the track and the genre, totally context-aware. It feels like a ghost in the machine, and it was built using the Google Gen AI SDK, Lit, and the Lyria Real-Time model.&lt;/p&gt;

&lt;p&gt;Here is how I built it.&lt;/p&gt;

&lt;p&gt;&lt;iframe class="tweet-embed" id="tweet-1990544411953168752-279" src="https://platform.twitter.com/embed/Tweet.html?id=1990544411953168752"&gt;
&lt;/iframe&gt;

  // Detect dark theme
  var iframe = document.getElementById('tweet-1990544411953168752-279');
  if (document.body.className.includes('dark-theme')) {
    iframe.src = "https://platform.twitter.com/embed/Tweet.html?id=1990544411953168752&amp;amp;theme=dark"
  }



&lt;/p&gt;

&lt;h2&gt;
  
  
  The Tech Stack
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Vibe-coding Platform:&lt;/strong&gt; &lt;a href="https://ai.dev/build" rel="noopener noreferrer"&gt;Google AI Studio&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Framework:&lt;/strong&gt; &lt;a href="https://lit.dev/" rel="noopener noreferrer"&gt;Lit&lt;/a&gt; (Web Components) + Vite&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Music Generation:&lt;/strong&gt; Google's &lt;code&gt;lyria-realtime-exp&lt;/code&gt; model&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;DJ Voice &amp;amp; Script:&lt;/strong&gt; Gemini 2.5 Flash &amp;amp; Gemini TTS (Fenrir voice)&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Visuals:&lt;/strong&gt; CSS for the radio, Gemini 2.5 Flash Image for the background.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  1. The Infinite Music Stream 🎵
&lt;/h2&gt;

&lt;p&gt;The core of the app is the &lt;code&gt;LiveMusicHelper&lt;/code&gt;. It connects to the &lt;code&gt;lyria-realtime-exp&lt;/code&gt; model. Instead of generating a static MP3 file, it establishes a session where we can steer the music in real-time by sending "Weighted Prompts."&lt;/p&gt;

&lt;p&gt;When you turn the tuning knob on the UI, we aren't downloading a new song; we are telling the AI to shift its attention.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// from utils/LiveMusicHelper.ts&lt;/span&gt;

&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;readonly&lt;/span&gt; &lt;span class="nx"&gt;setWeightedPrompts&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;throttle&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;prompts&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Map&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;Prompt&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// Convert our UI map to an array for the API&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;weightedPrompts&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;activePrompts&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;p&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;text&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;weight&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;weight&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// This is where the magic happens. &lt;/span&gt;
    &lt;span class="c1"&gt;// We tell the model: "be 100% Bossa Nova" or "mix 50% Dubstep and 50% Jazz"&lt;/span&gt;
    &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;session&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;setWeightedPrompts&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
      &lt;span class="nx"&gt;weightedPrompts&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="na"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;any&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The visual knob component maps a rotation angle to an index in a prompt array. If you are on index 0, "Bossa Nova" gets a weight of &lt;code&gt;1.0&lt;/code&gt;.&lt;/p&gt;
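&lt;p&gt;That mapping can be sketched like this (the station list and function name here are illustrative; the real list lives in the app):&lt;/p&gt;

```javascript
// Illustrative: turn the knob's active station index into the
// weighted-prompt list sent to the Lyria session.
const stations = ['Bossa Nova', 'Dubstep', 'Jazz'];

function promptsForIndex(index) {
  return stations.map((text, i) => ({
    text,
    weight: i === index ? 1.0 : 0.0, // only the tuned station is audible
  }));
}
```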

&lt;h2&gt;
  
  
  2. The AI DJ (The "Secret Sauce") 🎙️
&lt;/h2&gt;

&lt;p&gt;This is my favorite part. A radio isn't a radio without a DJ telling you what you're listening to. I created a &lt;code&gt;RadioAnnouncer&lt;/code&gt; class to handle this.&lt;/p&gt;

&lt;p&gt;It works in a two-step chain:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Generate the Script:&lt;/strong&gt; We ask Gemini to write a one-sentence intro.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Generate the Audio:&lt;/strong&gt; We pass that text to the TTS model.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Step 1: The Personality
&lt;/h3&gt;

&lt;p&gt;We prompt Gemini 2.5 Flash to adopt a persona. Note the specific constraints: short, punchy, no quotes.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// from utils/RadioAnnouncer.ts&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;scriptResponse&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ai&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;models&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;generateContent&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;model&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;gemini-2.5-flash&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;contents&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`You are a charismatic radio DJ. Write a single, short, punchy sentence to introduce the current song.
  The station frequency is &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;freq&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; FM.
  The music genre is &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;station&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;.
  Do not use quotes. Just the spoken text.
  Example: "You're locked in to 104.5, keeping it smooth with Bossa Nova."`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 2: The Voice
&lt;/h3&gt;

&lt;p&gt;We use the &lt;code&gt;Fenrir&lt;/code&gt; voice from the prebuilt configurations.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;ttsResponse&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ai&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;models&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;generateContent&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;model&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;gemini-2.5-flash-preview-tts&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;contents&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;parts&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt; &lt;span class="na"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;script&lt;/span&gt; &lt;span class="p"&gt;}]&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="na"&gt;config&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;responseModalities&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;Modality&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;AUDIO&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
    &lt;span class="na"&gt;speechConfig&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;voiceConfig&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;prebuiltVoiceConfig&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;voiceName&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Fenrir&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  The Logic: Debouncing
&lt;/h3&gt;

&lt;p&gt;Because the user might spin past five stations quickly to get to "Dubstep," we don't want the DJ to try to announce every single one. I used a debouncer so the generation only triggers once the user &lt;em&gt;stops&lt;/em&gt; turning the knob for 800ms.&lt;/p&gt;
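&lt;p&gt;A minimal debounce sketch, assuming the simplest timer-based approach (the actual &lt;code&gt;RadioAnnouncer&lt;/code&gt; wiring isn't shown here, and the function names are illustrative):&lt;/p&gt;

```javascript
// Minimal debounce: only the last call within the wait window fires.
function debounce(fn, waitMs) {
  let timer = null;
  return (...args) => {
    if (timer !== null) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), waitMs);
  };
}

// The DJ intro only triggers once the knob has been still for 800ms.
const announceStation = debounce((station) => {
  console.log(`Now tuning in: ${station}`);
}, 800);
```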

&lt;h2&gt;
  
  
  3. The Aesthetics 🎨
&lt;/h2&gt;

&lt;p&gt;The UI is built using Lit. The boombox itself is a mix of CSS styling and SVG for the speakers and knobs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Speaker Pulse:&lt;/strong&gt;&lt;br&gt;
I used the Web Audio API to create an &lt;code&gt;AudioAnalyser&lt;/code&gt;. We grab the current frequency data and map it to a CSS &lt;code&gt;transform: scale()&lt;/code&gt; on the speaker cones.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight css"&gt;&lt;code&gt;&lt;span class="c"&gt;/* from components/PromptDjMidi.ts */&lt;/span&gt;
&lt;span class="nc"&gt;.speaker-cone&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c"&gt;/* ... textures and gradients ... */&lt;/span&gt;
  &lt;span class="nl"&gt;transition&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;transform&lt;/span&gt; &lt;span class="m"&gt;0.05s&lt;/span&gt; &lt;span class="n"&gt;cubic-bezier&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="m"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="m"&gt;0.7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="m"&gt;1.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="m"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// In the render loop&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;pulseScale&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;audioLevel&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mf"&gt;0.15&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; 
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;speakerStyle&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;styleMap&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;transform&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`scale(&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;pulseScale&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;)`&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
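&lt;p&gt;The &lt;code&gt;audioLevel&lt;/code&gt; feeding that render loop comes from the analyser's frequency data. The &lt;code&gt;AnalyserNode&lt;/code&gt; setup is browser-only and omitted here, but collapsing its byte data into a 0..1 level might look like this (the averaging scheme is an assumption):&lt;/p&gt;

```javascript
// Sketch: average the analyser's byte frequency data (0..255 per bin)
// into a single 0..1 level for the speaker-cone scale transform.
function toAudioLevel(frequencyData) {
  if (frequencyData.length === 0) return 0;
  let sum = 0;
  for (const v of frequencyData) sum += v;
  return sum / (frequencyData.length * 255);
}
```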



&lt;p&gt;&lt;strong&gt;The Background:&lt;/strong&gt;&lt;br&gt;
To really sell the vibe, I generate a background image on load.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// The prompt used for the background&lt;/span&gt;
&lt;span class="nx"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;A 90s-style girl&lt;/span&gt;&lt;span class="se"&gt;\'&lt;/span&gt;&lt;span class="s1"&gt;s bedroom, dreamy, nostalgic, vaporwave aesthetic, anime posters on the wall, lava lamp, beaded curtains, photorealistic.&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  4. Managing the "Spin" 😵‍💫
&lt;/h2&gt;

&lt;p&gt;One of the hardest parts of this build was the math for the tuning knob. We need to convert mouse/touch movement into rotation, and then snap that rotation to specific "stations."&lt;/p&gt;

&lt;p&gt;I implemented a circular capture logic:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Calculate the angle between the center of the knob and the mouse cursor.&lt;/li&gt;
&lt;li&gt; Calculate &lt;code&gt;delta&lt;/code&gt; (change) from the start of the click.&lt;/li&gt;
&lt;li&gt; Handle the 0/360 degree wrap-around logic so you can spin it endlessly.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// from components/PromptDjMidi.ts&lt;/span&gt;
&lt;span class="k"&gt;private&lt;/span&gt; &lt;span class="nf"&gt;handlePointerMove&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;PointerEvent&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// ... math to get angle ...&lt;/span&gt;

  &lt;span class="c1"&gt;// Handle crossing the 0/360 boundary smoothly&lt;/span&gt;
  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;delta&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;180&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="nx"&gt;delta&lt;/span&gt; &lt;span class="o"&gt;-=&lt;/span&gt; &lt;span class="mi"&gt;360&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;delta&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;180&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="nx"&gt;delta&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;360&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;rotation&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;startRotation&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="nx"&gt;delta&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;360&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;%&lt;/span&gt; &lt;span class="mi"&gt;360&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="c1"&gt;// Map rotation to station index&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;floor&lt;/span&gt;&lt;span class="p"&gt;(((&lt;/span&gt;&lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;rotation&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="nx"&gt;segmentSize&lt;/span&gt;&lt;span class="o"&gt;/&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;%&lt;/span&gt; &lt;span class="mi"&gt;360&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="nx"&gt;segmentSize&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;setStation&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
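
&lt;p&gt;For intuition, the index mapping above can be pulled out into a standalone helper. This is a sketch, assuming &lt;code&gt;segmentSize&lt;/code&gt; is simply 360° split evenly across the station count:&lt;/p&gt;

```javascript
// Standalone sketch of the rotation-to-station mapping above.
// Assumes stations are evenly spaced around the dial.
function stationIndex(rotation, stationCount) {
  const segmentSize = 360 / stationCount; // degrees per station
  // Offset by half a segment so each station "owns" the arc centered on it.
  return Math.floor(((rotation + segmentSize / 2) % 360) / segmentSize);
}

console.log(stationIndex(0, 8));   // 0: the top of the dial
console.log(stationIndex(200, 8)); // 4
console.log(stationIndex(350, 8)); // 0: wraps back around
```

&lt;p&gt;The half-segment offset is what makes the dial snap to the &lt;em&gt;nearest&lt;/em&gt; station instead of flooring toward the previous one.&lt;/p&gt;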



&lt;h2&gt;
  
  
  TL;DR
&lt;/h2&gt;

&lt;p&gt;This project was a blast to build because it combines the tactile feel of old-school interfaces with bleeding-edge generative AI.&lt;/p&gt;

&lt;p&gt;The "Ghost DJ" effect really adds a layer of immersion that pure generative music apps usually lack. It gives the AI a voice—literally—and makes the infinite radio feel alive.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;You can check out the full code here in AI Studio:&lt;/strong&gt; &lt;a href="https://aistudio.google.com/apps/drive/1L23eufECJSn0KPta3eAVo-PGdM4iXm-1" rel="noopener noreferrer"&gt;https://aistudio.google.com/apps/drive/1L23eufECJSn0KPta3eAVo-PGdM4iXm-1&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let me know if you try building your own stations! I'm currently vibing to 94.0 "Shoegaze." 📻&lt;/p&gt;

</description>
      <category>gemini</category>
      <category>webdev</category>
      <category>ai</category>
      <category>showdev</category>
    </item>
    <item>
      <title>⛓️‍💥Chaining Veo 3.1 and NanoBanana with Gemini</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Sat, 15 Nov 2025 04:45:23 +0000</pubDate>
      <link>https://dev.to/googleai/chaining-veo-31-and-nanobanana-with-gemini-3ffi</link>
      <guid>https://dev.to/googleai/chaining-veo-31-and-nanobanana-with-gemini-3ffi</guid>
      <description>&lt;p&gt;As developers, we are used to chaining APIs to get a desired output. In the world of Generative AI, a similar pattern emerges: &lt;strong&gt;Model Chaining.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Creating high-quality AI video often requires orchestrating a workflow, not just typing in text and hitting the "generate!" button. Today, I’m going to walk through a specific stack—&lt;strong&gt;Gemini 2.5 Pro (for reasoning/prompting), NanoBanana (for base image generation), and Veo 3.1 (for image-to-video)&lt;/strong&gt;—to simulate a hyper-realistic doorbell security camera feed of a very cute fennec fox playing with LEGO bricks.&lt;/p&gt;

&lt;p&gt;Below is the breakdown of how we went from a blank slate to a coherent video, the prompts I used in &lt;a href="https://ai.studio" rel="noopener noreferrer"&gt;Google AI Studio&lt;/a&gt;, and a critique of the generated video output. Let's get started! 😄&lt;/p&gt;

&lt;h2&gt;
  
  
  The model chain
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Gemini&lt;/strong&gt;. Used to reason through the visual aesthetics and generate the complex prompts needed for the image and video generation models.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;NanoBanana&lt;/strong&gt;. Used to generate the initial static image asset, in portrait mode (9:16).&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Veo 3.1 Fast&lt;/strong&gt;. Used to apply physics and motion to the static image asset, also in portrait mode (9:16).&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Phase 1: The Base Image
&lt;/h2&gt;

&lt;p&gt;The hardest part of image generation is getting the "vibes" and the character consistency right. For this example, I wanted a &lt;em&gt;specific medium&lt;/em&gt; — a grainy, night-vision security camera from the perspective of a home doorbell.&lt;/p&gt;

&lt;p&gt;Instead of guessing keywords, I asked Gemini to act as a prompt engineer for the task at hand. I provided the concept ("Fennec fox, Legos, night, doorbell cam") and asked it to write a prompt for an image model.&lt;/p&gt;
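
&lt;p&gt;If you’d rather script that step than use the AI Studio UI, here is a minimal sketch against the public &lt;code&gt;generateContent&lt;/code&gt; REST endpoint. The model name and prompt wording below are just my placeholders:&lt;/p&gt;

```javascript
// Sketch: ask Gemini to draft the image-model prompt programmatically.
// The endpoint shape is the standard Gemini API generateContent contract;
// the model name is illustrative.
const API_BASE = "https://generativelanguage.googleapis.com/v1beta";

function endpointFor(model, apiKey) {
  return `${API_BASE}/models/${model}:generateContent?key=${apiKey}`;
}

async function draftImagePrompt(concept, apiKey) {
  const res = await fetch(endpointFor("gemini-2.5-pro", apiKey), {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      contents: [{
        parts: [{
          text: `Act as a prompt engineer for an image model. ` +
                `Write one detailed prompt for this concept: ${concept}`,
        }],
      }],
    }),
  });
  const data = await res.json();
  return data.candidates[0].content.parts[0].text; // the drafted prompt
}
```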

&lt;p&gt;&lt;strong&gt;The Gemini-Generated Prompt:&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;code&gt;A grainy, low-quality doorbell camera snapshot at night. Infrared night vision aesthetic with a slight monochromatic green tint. A wide-angle fisheye lens view looking down at a front porch welcome mat. A cute fennec fox with large ears is sitting on the mat, looking up at the camera with glowing reflective eyes. The fox is surrounded by scattered LEGO bricks. The LEGO bricks are arranged on the floor to clearly spell out the word "HI :)" in block letters. Digital overlay text in the corner says "FRONT DOOR - LIVE" and the current timestamp.&lt;/code&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Why this works:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Artifact Injection:&lt;/strong&gt; Words like "grainy," "low-quality," and "monochromatic green tint" prevent the model from making the image &lt;em&gt;too&lt;/em&gt; clean or artistic. It forces realism through imperfection.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Camera Specs:&lt;/strong&gt; Specifying "fisheye lens" and "looking down" enforces the correct perspective distortion typical of Ring/Nest cameras.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;The Result:&lt;/strong&gt;&lt;br&gt;
NanoBanana output a near-perfect static image. The lighting was flat (typical of IR), the eyes glowed (retro-reflection), and the angle was distinctively "doorbell."&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foz4x6qg2sg0aftdbbmu8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foz4x6qg2sg0aftdbbmu8.png" alt=" " width="768" height="1376"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;h2&gt;
  
  
  Phase 2: The Animation
&lt;/h2&gt;

&lt;p&gt;If you simply tell a video model "make it move," all of the models have a tendency to hallucinate wild camera pans or morph the subject. You need to provide &lt;strong&gt;direction&lt;/strong&gt;. To do this, I fed the static image back into Gemini and asked it to write a prompt for &lt;em&gt;animating&lt;/em&gt; the image. After taking a look at the example prompts, I selected one that focused on interaction and physics.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Video Prompt:&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;code&gt;The cute fennec fox looks down from the camera towards the LEGO bricks on the mat. It gently extends one front paw and nudges a loose LEGO brick near the "HI", sliding it slightly across the mat. The fox then looks back up at the camera with a playful, innocent expression. Its ears twitch. The camera remains static.&lt;/code&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;I fed this prompt and the static image into &lt;strong&gt;Veo 3.1 Fast&lt;/strong&gt;.&lt;/p&gt;


&lt;h2&gt;
  
  
  Phase 3: Analyzing the Veo Output
&lt;/h2&gt;

&lt;p&gt;Let’s look at the resulting video file and analyze the execution against the prompt:&lt;/p&gt;

&lt;p&gt;&lt;iframe class="tweet-embed" id="tweet-1989549550046986570-109" src="https://platform.twitter.com/embed/Tweet.html?id=1989549550046986570"&gt;
&lt;/iframe&gt;




&lt;/p&gt;

&lt;h3&gt;
  
  
  Wins
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Temporal coherence (lighting and texture):&lt;/strong&gt;&lt;br&gt;
The most impressive aspect is the consistency of the night-vision texture. The "grain" doesn't shimmer uncontrollably, and the monochromatic green remains stable throughout the 7 seconds. The fur texture on the fox changes naturally as it moves, rather than boiling or morphing.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;The "Fisheye" effect:&lt;/strong&gt;&lt;br&gt;
Veo 3.1 respected the distortion of the original image. When the fox leans down and back up, it moves &lt;em&gt;within&lt;/em&gt; the 3D space of that distorted lens. It doesn't flatten out.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Ear dynamics:&lt;/strong&gt;&lt;br&gt;
The prompt specifically asked for "ears twitch." Veo nailed this. The ears move independently and reactively, which is a critical trait of fennec foxes. This adds a layer of biological realism to the generated movement.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Camera locking:&lt;/strong&gt;&lt;br&gt;
The prompt specified "The camera remains static." This is crucial. Early video models often added unnecessary pans or zooms. Veo kept the frame locked, reinforcing the "mounted security camera" aesthetic.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Bugs
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Object Permanence (The LEGOs):&lt;/strong&gt;&lt;br&gt;
While the prompt asked the fox to "nudge a loose LEGO," the model struggled with rigid body physics. Instead of a clean slide, the LEGOs near the paws tend to morph or "melt" slightly as the fox interacts with them. The "HI" text also loses integrity, shifting into abstract shapes by the end of the clip.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Motion Interpretation:&lt;/strong&gt;&lt;br&gt;
The prompt asked for a gentle paw extension. The model interpreted this more as a "pounce" or a head-dive. The fox dips its whole upper body down rather than isolating the paw. While cute, it’s a deviation from the specific articulation requested.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Text Overlay (OCR Hallucination):&lt;/strong&gt;&lt;br&gt;
The original image had a crisp timestamp. As soon as motion begins, the text overlay ("FRONT DOOR - LIVE") becomes unstable. Video models still struggle to keep text overlays static while animating the pixels behind them. The timestamp blurs and fails to count up logically.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;The "Welcome" Mat:&lt;/strong&gt;&lt;br&gt;
If you look closely at the mat, the text (presumably "WELCOME") is geometrically inconsistent. As the fox moves over it, the letters seem to shift their orientation slightly, revealing that the model treats the mat as a texture rather than a flat plane in 3D space.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  TL;DR
&lt;/h2&gt;

&lt;p&gt;Using an LLM like Gemini to generate prompts for media models is a massive efficiency booster! And while Veo 3.1 Fast demonstrates incredible understanding of lighting, texture, and biological movement (the ears!), it can — like all current video models — still face challenges with rigid object interaction (LEGOs) and static text overlays.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Quick tips:&lt;/strong&gt; Be specific about camera angles and lighting in your text-to-image phase. In the video phase, focus your prompts on the &lt;em&gt;subject's&lt;/em&gt; movement, but expect some fluidity in the background objects. And use Gemini 2.5 Pro to help with prompting.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>cloud</category>
      <category>webdev</category>
      <category>programming</category>
    </item>
    <item>
      <title>I Cloned Myself in 2 Minutes to Answer Gemini API Questions</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Wed, 12 Nov 2025 19:25:55 +0000</pubDate>
      <link>https://dev.to/googleai/i-cloned-myself-in-2-minutes-to-answer-gemini-api-questions-2dmf</link>
      <guid>https://dev.to/googleai/i-cloned-myself-in-2-minutes-to-answer-gemini-api-questions-2dmf</guid>
      <description>&lt;h3&gt;
  
  
  Experimenting with Happyverse 2.0
&lt;/h3&gt;

&lt;p&gt;As developers, we are obsessed with automation. We automate our deployments, our testing, and our linting. But the one thing we’ve never quite been able to scale is &lt;em&gt;ourselves&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Until recently.&lt;/p&gt;

&lt;p&gt;I decided to take &lt;strong&gt;Happyverse 2.0&lt;/strong&gt; for a spin to see if I could build a digital twin capable of handling the technical questions I get bombarded with daily. Specifically, I wanted to see if an AI clone could handle queries about &lt;strong&gt;Google AI Studio&lt;/strong&gt; and the &lt;strong&gt;Gemini APIs&lt;/strong&gt; with enough accuracy to pass as me.&lt;/p&gt;

&lt;p&gt;&lt;iframe class="tweet-embed" id="tweet-1988420687246356930-659" src="https://platform.twitter.com/embed/Tweet.html?id=1988420687246356930"&gt;
&lt;/iframe&gt;




&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;The "Time to Hello World"&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;The setup time was the most surprising metric. It took approximately &lt;strong&gt;2 minutes&lt;/strong&gt; to go from zero to a fully rendered clone, with the assistance of the Happyverse UI.&lt;/p&gt;

&lt;p&gt;In the world of AI avatars, we usually expect a trade-off between rendering latency and visual fidelity. However, the result here was frighteningly close to real life. We aren't just talking about a static chatbot; this is a real-time participant in a meeting — an intersection of a lifelike avatar and an intelligent agent. You can also dial into Google Meet instances, Zoom meetings, and more with your generated Happyverse agents.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Use Case: Technical Support Agent&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;I pointed the clone at a URL for our Gemini API documentation. The goal? To see if "She" could accurately parse and explain API endpoints and studio features without hallucinating.&lt;/p&gt;

&lt;p&gt;The results were impressive. She handled the context well, staying tied to the real data and DeepMind models rather than hallucinating answers or drifting off-topic. This "grounding" is critical for developers looking to build agents that serve as actual product experts rather than just conversational novelties.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Edge Cases and "Human" Latency&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;While the technical accuracy is there, the "human" element is the next frontier of optimization.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Personality Injection:&lt;/strong&gt; She’s accurate, but she needs a crash course in Texas barbecue to truly pass as me. Customizing the system prompt, adding more docs for grounding, and injecting specific tonal quirks are the next steps.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;i18n:&lt;/strong&gt; I’m planning to test multilinguality next. Can the avatar switch context to German on the fly while maintaining lip-sync accuracy? That’s a stress test I’m looking forward to.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;The Stack&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;For those curious about the platform, this was built on &lt;strong&gt;Happyverse 2.0&lt;/strong&gt;. They position themselves as a platform for building "Confidants"—essentially AI agents wrapped in hyper-realistic, real-time video generation.&lt;/p&gt;

&lt;p&gt;If you are looking to build interfaces that feel less like a terminal and more like a colleague, this is worth a look.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Check out Happyverse 2.0 on Product Hunt:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://producthunt.com/products/happyverse-2" rel="noopener noreferrer"&gt;https://producthunt.com/products/happyverse-2&lt;/a&gt;&lt;/p&gt;

</description>
      <category>gemini</category>
      <category>cloud</category>
      <category>api</category>
    </item>
    <item>
      <title>How I Built a "Community Prayer Quilt" in 5 Minutes</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Wed, 12 Nov 2025 19:13:27 +0000</pubDate>
      <link>https://dev.to/googleai/how-i-built-a-community-prayer-quilt-in-5-minutes-5afk</link>
      <guid>https://dev.to/googleai/how-i-built-a-community-prayer-quilt-in-5-minutes-5afk</guid>
      <description>&lt;h3&gt;
  
  
  Stack: Google AI Studio + Google Sheets + Google Forms + Gemini 2.5 Flash-Lite
&lt;/h3&gt;

&lt;p&gt;We’ve all been there: you have an idea for a sweet, community-focused project, but the thought of spinning up a database, configuring auth, and building a backend API stops you before you even &lt;code&gt;npm init&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;This weekend, I decided to skip the boilerplate and ship.&lt;/p&gt;

&lt;p&gt;I wanted to build a &lt;strong&gt;Community Prayer Quilt&lt;/strong&gt;—a digital space where people could leave a prayer or wish, and have it stitched into a visual tapestry. The goal was to get it running in the time it takes to brew coffee.&lt;/p&gt;

&lt;p&gt;Here is the "lazy" architecture I used to build &lt;a href="http://prayerwall.club" rel="noopener noreferrer"&gt;prayerwall.club&lt;/a&gt; in about 5 minutes, using Google Sheets as my backend and Gemini 2.5 Flash-Lite as my content moderator.&lt;/p&gt;

&lt;p&gt;&lt;iframe class="tweet-embed" id="tweet-1988056184365216128-354" src="https://platform.twitter.com/embed/Tweet.html?id=1988056184365216128"&gt;
&lt;/iframe&gt;




&lt;/p&gt;

&lt;h3&gt;
  
  
  The "No-Backend" Architecture
&lt;/h3&gt;

&lt;p&gt;When you need to collect user data rapidly without a server, nothing beats the Google ecosystem's plumbing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. The Database &amp;amp; Ingestion: Google Forms + Sheets&lt;/strong&gt;&lt;br&gt;
Instead of building a React form and handling POST requests, I created a Google Form.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Input:&lt;/strong&gt; Users submit their prayers via the form.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Storage:&lt;/strong&gt; Responses are automatically piped into a Google Sheet.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;2. The API: Sheets CSV Export&lt;/strong&gt;&lt;br&gt;
Here’s a trick I love for hackathons. You don’t need the full Google Sheets API (and its OAuth dance) for public read-only data. You can publish a sheet to the web and access it via the visualization endpoint:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;SHEET_ID&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;YOUR_SHEET_ID&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;URL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`https://docs.google.com/spreadsheets/d/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;SHEET_ID&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/gviz/tq?tqx=out:csv`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// Fetch and parse&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;URL&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;csvText&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;text&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="c1"&gt;// ...parse CSV to JSON...&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Boom. An instant, no-auth, read-only data endpoint.&lt;/em&gt;&lt;/p&gt;
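
&lt;p&gt;The &lt;code&gt;// ...parse CSV to JSON...&lt;/code&gt; step is only a few lines if you accept a simplifying assumption: no commas inside cells. (Reach for a real CSV parser otherwise.)&lt;/p&gt;

```javascript
// Naive CSV parser: the header row becomes the keys, each later row
// becomes an object. Assumes no commas inside cells; gviz wraps every
// cell in double quotes, so those are stripped.
function csvToJson(csvText) {
  const rows = csvText.trim().split("\n")
    .map(line => line.split(",").map(cell => cell.replace(/^"|"$/g, "")));
  const [header, ...body] = rows;
  return body.map(cells =>
    Object.fromEntries(header.map((key, i) => [key, cells[i]])));
}

const sample = '"timestamp","prayer"\n"2025-11-12","peace for my family"';
console.log(csvToJson(sample)); // one object per data row, keyed by header
```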

&lt;h3&gt;
  
  
  The AI Bouncer: Gemini Flash-Lite
&lt;/h3&gt;

&lt;p&gt;The problem with public walls is moderation. I didn't want to manually review every entry, but I also didn't want to risk trolls posting toxicity on a prayer wall.&lt;/p&gt;

&lt;p&gt;Enter &lt;strong&gt;Gemini Flash-Lite&lt;/strong&gt; via Google AI Studio.&lt;/p&gt;

&lt;p&gt;I needed something fast and cheap to act as a "toxicity check" before rendering the prayers on the quilt. I hooked up the &lt;code&gt;@google/genai&lt;/code&gt; SDK to run a quick sanity check on the frontend.&lt;/p&gt;

&lt;p&gt;Here is the prompt logic I used:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;genAI&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getGenerativeModel&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;model&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;gemini-flash-lite-latest&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;prompt&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`
  Analyze the following prayer for safety. 
  It will be displayed on a public, all-ages community wall.
  The prayer must not contain toxic language, hate speech, or violence.

  Prayer: "&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;userPrayer&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"

  Return JSON: { "is_safe": boolean }
`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If &lt;code&gt;is_safe&lt;/code&gt; returns &lt;code&gt;true&lt;/code&gt;, the patch is stitched into the quilt. If not, it’s silently discarded. It adds a tiny bit of latency, but it ensures the vibe of the site remains wholesome without me having to play internet janitor.&lt;/p&gt;
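
&lt;p&gt;One gotcha to hedge against: even when you ask for JSON, models sometimes wrap the answer in a Markdown code fence. A small defensive parse (my own helper, not part of the SDK) keeps the gate from crashing, and it fails closed:&lt;/p&gt;

```javascript
// Defensive parse of the moderation verdict. Removes any backtick runs
// (and a leading "json" tag) in case the model fenced its answer, then
// fails closed: anything unparseable is treated as unsafe.
function isSafeVerdict(modelText) {
  const stripped = modelText.replace(/`+(json)?/gi, "").trim();
  try {
    return JSON.parse(stripped).is_safe === true;
  } catch {
    return false; // an unreadable verdict never reaches the quilt
  }
}

console.log(isSafeVerdict('{"is_safe": true}'));    // true
console.log(isSafeVerdict("sorry, I cannot help")); // false
```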

&lt;h3&gt;
  
  
  The Visuals: Procedural HTML5 Canvas
&lt;/h3&gt;

&lt;p&gt;For the frontend, I didn't want a simple list. I wanted it to &lt;em&gt;look&lt;/em&gt; like a quilt.&lt;/p&gt;

&lt;p&gt;I used React with the HTML5 &lt;code&gt;&amp;lt;canvas&amp;gt;&lt;/code&gt; API to procedurally generate each "patch."&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;strong&gt;Seeding:&lt;/strong&gt; I hash the prayer text to create a unique seed.&lt;/li&gt;
&lt;li&gt;  &lt;strong&gt;Pattern:&lt;/strong&gt; That seed determines the fabric color palette, the stitch patterns (sine waves), and the "fabric texture" (randomized stroke opacity).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This means every prayer generates a unique, deterministic visual representation of itself.&lt;/p&gt;
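
&lt;p&gt;The determinism comes from hashing the text into a seed. Here is a sketch of the idea; the hash (FNV-1a) and the palettes are illustrative stand-ins, not the exact ones on the site:&lt;/p&gt;

```javascript
// Deterministic patch styling: hash the prayer text to an unsigned
// 32-bit seed (FNV-1a), then let the seed pick from a fixed palette.
// Same text in, same patch out.
function hashSeed(text) {
  let h = 2166136261; // FNV offset basis
  for (const ch of text) {
    h ^= ch.codePointAt(0);
    h = Math.imul(h, 16777619); // FNV prime, with 32-bit overflow
  }
  return h >>> 0;
}

const PALETTES = [
  ["#7f1d1d", "#fca5a5"], // illustrative "fabric" colors
  ["#1e3a5f", "#93c5fd"],
  ["#14532d", "#86efac"],
];

function patchStyle(prayer) {
  const seed = hashSeed(prayer);
  return { seed, palette: PALETTES[seed % PALETTES.length] };
}
```

&lt;p&gt;The same seed can then drive the sine-wave stitch pattern and stroke opacity, so a given prayer renders as the identical patch on every visitor’s screen.&lt;/p&gt;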

&lt;h3&gt;
  
  
  Why This Matters
&lt;/h3&gt;

&lt;p&gt;Is this "Enterprise Grade"? Absolutely not. Is it scalable to millions of users? Probably not (Google Sheets has rate limits).&lt;/p&gt;

&lt;p&gt;But it is &lt;strong&gt;live&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;We often let architectural purity get in the way of shipping. By treating Google Sheets as a backend and using a lightweight AI model for logic that would otherwise require a human reviewer, I went from "idea" to "deployed" in minutes.&lt;/p&gt;

&lt;p&gt;Check it out live here: &lt;a href="http://prayerwall.club" rel="noopener noreferrer"&gt;prayerwall.club&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Go build something fun today. Worry about the migration to a more robust database later.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>webdev</category>
      <category>programming</category>
      <category>javascript</category>
    </item>
    <item>
      <title>We're the Google DeepMind Team building Gemini, Google AI Studio, and more! Ask Us Anything.</title>
      <dc:creator>Paige Bailey</dc:creator>
      <pubDate>Tue, 19 Aug 2025 19:08:46 +0000</pubDate>
      <link>https://dev.to/googleai/were-the-google-deepmind-team-building-gemini-google-ai-studio-and-more-ask-us-anything-4k4</link>
      <guid>https://dev.to/googleai/were-the-google-deepmind-team-building-gemini-google-ai-studio-and-more-ask-us-anything-4k4</guid>
      <description>&lt;p&gt;Hey DEV community! 👋&lt;/p&gt;

&lt;p&gt;We're the team behind &lt;a href="https://aistudio.google.com/?utm_source=partner&amp;amp;utm_medium=partner&amp;amp;utm_campaign=FY25-Global-DEVpartnership-AMA-AIS&amp;amp;utm_content=-&amp;amp;utm_term=-" rel="noopener noreferrer"&gt;Google AI Studio&lt;/a&gt; and the Gemini API at Google DeepMind. &lt;/p&gt;

&lt;p&gt;We'll be answering your questions live on August 28, 2025 starting at 1PM ET.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Thank you to everyone who participated in our AMA!&lt;/strong&gt; We'll do our best to keep answering questions asynchronously throughout the next few weeks so check back later if your question wasn't answered!&lt;/p&gt;




&lt;h2&gt;
  
  
  Who we are:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Paige Bailey (&lt;a class="mentioned-user" href="https://dev.to/dynamicwebpaige"&gt;@dynamicwebpaige&lt;/a&gt;)&lt;/strong&gt;: AI Developer Relations Lead&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Patrick Loeber (&lt;a class="mentioned-user" href="https://dev.to/pat_loeber"&gt;@pat_loeber&lt;/a&gt;)&lt;/strong&gt;: Developer Relations Engineer&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Alisa Fortin (&lt;a class="mentioned-user" href="https://dev.to/alisa_fortin"&gt;@alisa_fortin&lt;/a&gt;)&lt;/strong&gt;: Product Manager&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Ivan Solovyev (&lt;a class="mentioned-user" href="https://dev.to/ivan_solovyev"&gt;@ivan_solovyev&lt;/a&gt;)&lt;/strong&gt;: Product Manager&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Kate Olszewska (&lt;a class="mentioned-user" href="https://dev.to/kate_olszewska"&gt;@kate_olszewska&lt;/a&gt;)&lt;/strong&gt;: Product Manager&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Vivian Jair (&lt;a class="mentioned-user" href="https://dev.to/vivjair"&gt;@vivjair&lt;/a&gt;)&lt;/strong&gt;: Product Marketing Manager&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dave Elliot (&lt;a class="mentioned-user" href="https://dev.to/dave_elliott"&gt;@dave_elliott&lt;/a&gt;)&lt;/strong&gt;: Head of Developer Advocacy (Google Cloud)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  What we work on:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;🤖 &lt;strong&gt;AI Studio&lt;/strong&gt;: Our developer platform where you can experiment with Gemini models, including access to our latest experimental releases.&lt;/li&gt;
&lt;li&gt;🔧 &lt;strong&gt;Gemini API&lt;/strong&gt;: APIs that serve millions of developers and process trillions of tokens.&lt;/li&gt;
&lt;li&gt;🎨 &lt;strong&gt;Multi-modal &amp;amp; Open-Source Models&lt;/strong&gt;: Advanced AI models including Veo (video generation), Imagen (image generation), Lyria (music creation), and Gemma (open-source language models).&lt;/li&gt;
&lt;li&gt;📚 &lt;strong&gt;Developer Experience&lt;/strong&gt;: Making Google's most advanced AI models easier to integrate and use.&lt;/li&gt;
&lt;li&gt;🌍 &lt;strong&gt;Community&lt;/strong&gt;: Building resources, documentation, and support for the global developer community.&lt;/li&gt;
&lt;/ul&gt;


&lt;div class="crayons-card c-embed"&gt;

  
&lt;h2&gt;
  
  
  Ask us about:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;🚀 &lt;strong&gt;AI Studio &amp;amp; Gemini API&lt;/strong&gt;: Features and how to get started&lt;/li&gt;
&lt;li&gt;🎨 &lt;strong&gt;Our AI Models&lt;/strong&gt;: Veo, Imagen, Lyria, Gemma&lt;/li&gt;
&lt;li&gt;🔬 &lt;strong&gt;Working at Google DeepMind&lt;/strong&gt;: What it's like being at the intersection of research and developer tools&lt;/li&gt;
&lt;li&gt;🛠️ &lt;strong&gt;Building AI applications&lt;/strong&gt;: Best practices, common challenges, scaling tips&lt;/li&gt;
&lt;li&gt;💡 &lt;strong&gt;Career advice&lt;/strong&gt;: Breaking into AI/ML, developer relations, product management&lt;/li&gt;
&lt;li&gt;🌟 &lt;strong&gt;The future of AI development&lt;/strong&gt;: Where we see the space heading&lt;/li&gt;
&lt;li&gt;🏗️ &lt;strong&gt;Developer experience&lt;/strong&gt;: How we think about making AI accessible&lt;/li&gt;
&lt;/ul&gt;


&lt;/div&gt;


&lt;h2&gt;
  
  
  Please don't ask us about:
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Unreleased Google products or detailed internal roadmaps&lt;/li&gt;
&lt;li&gt;Proprietary technical implementations&lt;/li&gt;
&lt;li&gt;Confidential business information&lt;/li&gt;
&lt;li&gt;Personal/private information&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Get started with AI Studio:
&lt;/h2&gt;

&lt;p&gt;If you haven't tried &lt;a href="https://aistudio.google.com/?utm_source=partner&amp;amp;utm_medium=partner&amp;amp;utm_campaign=FY25-Global-DEVpartnership-AMA-AIS&amp;amp;utm_content=-&amp;amp;utm_term=-" rel="noopener noreferrer"&gt;AI Studio&lt;/a&gt; yet, it's the easiest way to start building with Gemini. You can turn on features like code execution, use extended context (2M+ tokens), and access our latest experimental models - all for free to get started!&lt;/p&gt;

&lt;p&gt;We'll be rotating through answers throughout the day, so you might hear from different team members. Let's dive in! 🔥&lt;/p&gt;

</description>
      <category>ama</category>
      <category>ai</category>
      <category>machinelearning</category>
      <category>discuss</category>
    </item>
  </channel>
</rss>
