<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: bebechien</title>
    <description>The latest articles on DEV Community by bebechien (@bebechien).</description>
    <link>https://dev.to/bebechien</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3714120%2F58c2015b-b06c-4f0e-9cb0-1014430d9773.png</url>
      <title>DEV Community: bebechien</title>
      <link>https://dev.to/bebechien</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/bebechien"/>
    <language>en</language>
    <item>
      <title>Building Your First RAG System</title>
      <dc:creator>bebechien</dc:creator>
      <pubDate>Thu, 12 Feb 2026 03:57:41 +0000</pubDate>
      <link>https://dev.to/googleai/building-your-first-rag-system-f0n</link>
      <guid>https://dev.to/googleai/building-your-first-rag-system-f0n</guid>
      <description>&lt;h1&gt;
  
  
  💎 From Mining Ores to Mining Insights
&lt;/h1&gt;

&lt;p&gt;Whether you are navigating the underground biomes of &lt;em&gt;&lt;a href="https://en.wikipedia.org/wiki/Core_Keeper" rel="noopener noreferrer"&gt;Core Keeper&lt;/a&gt;&lt;/em&gt; or the complex spreadsheets of a small business, "information overload" is the final boss.&lt;/p&gt;

&lt;p&gt;A &lt;strong&gt;RAG (Retrieval-Augmented Generation)&lt;/strong&gt; system is like giving an AI a specialized guidebook. Instead of relying on its general training, the AI looks at your specific data, like a game wiki, your research notes, or legal contracts, to give you pinpoint-accurate answers.&lt;/p&gt;

&lt;p&gt;Here is how to build a private, local knowledge base using &lt;strong&gt;&lt;a href="https://huggingface.co/google/gemma-3-4b-it" rel="noopener noreferrer"&gt;Gemma 3 4B&lt;/a&gt;&lt;/strong&gt; and &lt;strong&gt;&lt;a href="https://huggingface.co/google/embeddinggemma-300m" rel="noopener noreferrer"&gt;EmbeddingGemma&lt;/a&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  🛠️ The "Crafting" Station (Your Tech Stack)
&lt;/h1&gt;

&lt;p&gt;To build this, we're using a "Local-First" approach. This means your data never leaves your computer, which is perfect for keeping your secret base coordinates (or private client info) safe.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;The Brain (LLM):&lt;/strong&gt; &lt;code&gt;gemma3:4b&lt;/code&gt; - Google’s compact, highly efficient model.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Librarian (Embedder):&lt;/strong&gt; &lt;code&gt;embeddinggemma&lt;/code&gt; - A specialized model that "indexes" your data so it can be searched.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Server:&lt;/strong&gt; &lt;strong&gt;&lt;a href="https://ollama.com/" rel="noopener noreferrer"&gt;Ollama&lt;/a&gt;&lt;/strong&gt; - The engine that runs these models on your machine.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Interface:&lt;/strong&gt; &lt;strong&gt;&lt;a href="https://anythingllm.com/" rel="noopener noreferrer"&gt;AnythingLLM&lt;/a&gt;&lt;/strong&gt; - A user-friendly app that looks like a chat window but handles all the heavy lifting of document storage.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;NOTE: One of the best things about local AI in 2026 is that the tools are "plug-and-play". You can mix and match your server and UI depending on your technical comfort level. For example, you can use &lt;a href="https://lmstudio.ai/" rel="noopener noreferrer"&gt;LM Studio&lt;/a&gt; instead of Ollama, or use &lt;a href="https://openwebui.com/" rel="noopener noreferrer"&gt;Open WebUI&lt;/a&gt; instead of AnythingLLM. Experiment with different tools to see which one fits your style best!&lt;/p&gt;

&lt;h1&gt;
  
  
  📖 Step 1: Gathering Your Materials
&lt;/h1&gt;

&lt;p&gt;First, identify the "Source of Truth".&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;For the Gamer:&lt;/strong&gt; Use the &lt;em&gt;&lt;a href="https://corekeeper.atma.gg/en/Core_Keeper_Wiki" rel="noopener noreferrer"&gt;Core Keeper Wiki&lt;/a&gt;&lt;/em&gt; to track boss strategies and crafting recipes.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;For the Professional:&lt;/strong&gt; This could be a folder of PDFs, project logs, or even a specialized website.&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  ⚙️ Step 2: Set Up the Workshop (Ollama)
&lt;/h1&gt;

&lt;p&gt;&lt;em&gt;(Quick tip: You'll want about 8GB of VRAM to run a 4B model like this smoothly!)&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Download Ollama and run these two commands in your terminal to download the models:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Download the language model&lt;/span&gt;
ollama pull gemma3:4b

&lt;span class="c"&gt;# Download the embedding model&lt;/span&gt;
ollama pull embeddinggemma

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  🖥️ Step 3: Configuring the Interface (AnythingLLM)
&lt;/h1&gt;

&lt;p&gt;Open &lt;strong&gt;AnythingLLM&lt;/strong&gt; and follow these steps to link your models:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;LLM Settings:&lt;/strong&gt; Set the provider to &lt;strong&gt;Ollama&lt;/strong&gt; and choose &lt;code&gt;gemma3:4b&lt;/code&gt;. This model acts as the "speaker" that will read the retrieved context and formulate the final answer for you.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Embedder Settings:&lt;/strong&gt; Choose &lt;strong&gt;Ollama&lt;/strong&gt; and select &lt;code&gt;embeddinggemma&lt;/code&gt;. This model is a dedicated, high-performance embedding model that acts as your "search engine".&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Upload:&lt;/strong&gt; Create a "Workspace" and drop in your files (e.g., &lt;em&gt;Core Keeper&lt;/em&gt; wiki pages or your work documents). Click &lt;strong&gt;"Save and Embed."&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;
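
&lt;p&gt;Under the hood, "Save and Embed" plus a chat question boil down to a two-step loop: embed every document chunk once, then rank chunks against the embedded question. Here is a toy sketch of that loop. A real setup would call the embedding model (&lt;code&gt;embeddinggemma&lt;/code&gt; via Ollama); a bag-of-words counter stands in here so the sketch runs without a server, and the documents are invented examples.&lt;/p&gt;

```python
# Toy sketch of the embed-and-retrieve loop a RAG frontend runs for you.
# A real setup would embed with a model; a bag-of-words counter stands in.
import math
import re
from collections import Counter

def embed(text):
    """Stand-in embedder: word counts (a real system uses a neural model)."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# 1. "Save and Embed": index each document chunk once. (Invented examples.)
docs = [
    "Glurch the Abominous Mass bounces toward you; dodge sideways and attack after it lands.",
    "Copper bars are smelted from copper ore at a furnace.",
    "Fishing requires a fishing rod crafted at the workbench.",
]
index = [(d, embed(d)) for d in docs]

# 2. At question time: embed the query, rank the chunks by similarity.
query = "How do I beat Glurch the Abominous Mass?"
qv = embed(query)
best = max(index, key=lambda pair: cosine(qv, pair[1]))

# 3. The top-ranked chunk becomes the context pasted into the LLM prompt.
print(best[0])
```

&lt;p&gt;The retrieved chunk is what &lt;code&gt;gemma3:4b&lt;/code&gt; then reads to formulate the final answer.&lt;/p&gt;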

&lt;h1&gt;
  
  
  ⚔️ Step 4: Using Your Knowledge Base
&lt;/h1&gt;

&lt;p&gt;Now you can chat with your data.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Game Query:&lt;/strong&gt; &lt;em&gt;"How can I beat the Abominous Mass?"&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Work Query:&lt;/strong&gt; &lt;em&gt;"Summarize the main clauses in the Q3 marketing contract."&lt;/em&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Gemma 3 4B doesn't just "guess"; it retrieves the specific text from your files and explains it to you.&lt;/p&gt;

&lt;p&gt;See the difference between asking the AI before and after building your knowledge base:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Before: without your knowledge base, the AI gives a generic, vague, or incorrect answer.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7bq647gc5v4x7sxfzl3d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7bq647gc5v4x7sxfzl3d.png" alt="before" width="675" height="296"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;After: with your documents embedded, the AI gives a precise, accurate answer, with a citation back to your uploaded document.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff0q47cdvezpm9c5tlork.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff0q47cdvezpm9c5tlork.png" alt="after" width="657" height="674"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  👨🏻‍💻 Applying it to Your Data
&lt;/h1&gt;

&lt;p&gt;By building this yourself, you aren't just a user of AI; you're an architect. Whether you’re optimizing a game run or a business workflow, you now have a 100% private, offline assistant that knows exactly what you know.&lt;/p&gt;

&lt;p&gt;While we’ve been using &lt;em&gt;Core Keeper&lt;/em&gt; as an example, this "build" is a lifesaver for professional field work:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;For Field Researchers:&lt;/strong&gt; Imagine you are in a wild, remote region with &lt;strong&gt;zero internet access&lt;/strong&gt;. You can feed the AI your entire library of botanical guides, previous expedition logs, and geological maps before you leave.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;For Writers:&lt;/strong&gt; Feed it your draft chapters to check for world-building consistency without uploading your IP to a cloud.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;For Home Chefs:&lt;/strong&gt; Turn a messy folder of recipe screenshots into a searchable "Digital Cookbook".&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;The Big Win:&lt;/strong&gt; Because you are using &lt;strong&gt;Gemma 3 4B&lt;/strong&gt; and &lt;strong&gt;EmbeddingGemma&lt;/strong&gt; locally, your system is 100% &lt;strong&gt;OFFLINE&lt;/strong&gt;. Your data never leaves your machine, making it the perfect companion for researchers in the field who need instant answers without a satellite link.&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>embeddinggemma</category>
      <category>gemma</category>
      <category>ai</category>
    </item>
    <item>
      <title>Gemini versus Gemma: Still Confused?</title>
      <dc:creator>bebechien</dc:creator>
      <pubDate>Fri, 30 Jan 2026 06:24:10 +0000</pubDate>
      <link>https://dev.to/googleai/gemini-versus-gemma-still-confused-1dnd</link>
      <guid>https://dev.to/googleai/gemini-versus-gemma-still-confused-1dnd</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fborkt90dukjavakijgaj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fborkt90dukjavakijgaj.png" alt="4 cut comic" width="640" height="640"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When catching up on Google’s AI news, you hear &lt;strong&gt;Gemini&lt;/strong&gt;, and then you hear &lt;strong&gt;Gemma&lt;/strong&gt;. The names are so similar, like twin siblings, that it’s easy to scratch your head and wonder, &lt;em&gt;"What exactly is the difference?"&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Technically speaking, Gemini is a family of multimodal generative AI models and intelligent assistants developed by Google DeepMind, while Gemma is a collection of lightweight open models built from the same technology that powers our Gemini models... but wait!&lt;/p&gt;

&lt;p&gt;That explanation is a bit dry, isn't it? So, whenever I explain the difference to friends, I like to compare them to &lt;strong&gt;Ramen&lt;/strong&gt;. Let me share this tasty analogy with you today.&lt;/p&gt;

&lt;h1&gt;
  
  
  Gemini: The High-End Ramen Restaurant Run by a Big Corp
&lt;/h1&gt;

&lt;p&gt;First, think of &lt;strong&gt;Gemini&lt;/strong&gt; as a &lt;strong&gt;top-tier ramen restaurant&lt;/strong&gt; directly operated by a giant corporation called Google.&lt;/p&gt;

&lt;p&gt;How do we eat this ramen? We have to visit the restaurant (&lt;a href="https://gemini.google.com" rel="noopener noreferrer"&gt;gemini.google.com&lt;/a&gt;) or order delivery. We can’t see inside the kitchen to know what secret broth they’re using or how they control the heat.&lt;/p&gt;

&lt;p&gt;However, as soon as you sit down, a professional chef serves you a perfect bowl of ramen made with the best ingredients and know-how. All we have to do is enjoy it. The taste and quality are guaranteed to be at the highest level the company has to offer.&lt;/p&gt;

&lt;h1&gt;
  
  
  Gemma: The Premium Packaged Ramen You Can Take Home
&lt;/h1&gt;

&lt;p&gt;On the other hand, &lt;strong&gt;Gemma&lt;/strong&gt; is the &lt;strong&gt;instant noodle&lt;/strong&gt; released by that same restaurant.&lt;/p&gt;

&lt;p&gt;It might not look as flashy as the bowl served fresh at the restaurant. But the important thing is that it was created based on the &lt;strong&gt;exact same recipe and technology&lt;/strong&gt; as the famous shop. Thanks to that, it boasts a flavor that stands head and shoulders above other instant noodles.&lt;/p&gt;

&lt;p&gt;The biggest appeal? You can take it home for &lt;strong&gt;free&lt;/strong&gt;. Once you &lt;a href="https://huggingface.co/google/collections" rel="noopener noreferrer"&gt;download&lt;/a&gt; it to your computer, you can cook it up anytime, even if the internet goes down.&lt;/p&gt;

&lt;p&gt;The real fun of this "instant noodle" (Gemma) starts once you bring it home. (&lt;a href="https://ai.google.dev/gemma/docs/tune" rel="noopener noreferrer"&gt;Gemma model fine-tuning&lt;/a&gt;)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Respecting Your Tastes (LoRA Fine-tuning):&lt;/strong&gt; The base flavor is excellent, but you can chop up some green onions or crack in an egg to suit your palate. It's like tuning the model slightly to specialize in a specific area.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Creating Something New (Full Model Tuning):&lt;/strong&gt; You can even take the noodles and soup base and completely reinvent them into a new dish, like &lt;em&gt;&lt;a href="https://en.wikipedia.org/wiki/Rabokki" rel="noopener noreferrer"&gt;Rabokki&lt;/a&gt;&lt;/em&gt; (Ramen Tteokbokki) or &lt;em&gt;&lt;a href="https://en.wikipedia.org/wiki/Budae-jjigae" rel="noopener noreferrer"&gt;Budae-Jjigae&lt;/a&gt;&lt;/em&gt; (Army Stew).&lt;/li&gt;
&lt;/ul&gt;
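
&lt;p&gt;The "green onions" option is cheap for a concrete reason: LoRA trains two small rank-&lt;em&gt;r&lt;/em&gt; matrices instead of the full weight matrix. A back-of-the-envelope comparison (the layer sizes below are illustrative, not Gemma's actual dimensions):&lt;/p&gt;

```python
# Why LoRA fine-tuning is cheap: instead of updating a full d x k weight
# matrix, it learns two low-rank factors A (d x r) and B (r x k).
# Sizes are illustrative, not Gemma's real dimensions.
d, k, r = 4096, 4096, 8

full_params = d * k        # parameters touched by full fine-tuning
lora_params = r * (d + k)  # parameters LoRA actually trains

print(full_params)  # 16777216
print(lora_params)  # 65536
print(round(100 * lora_params / full_params, 2))  # 0.39 (% of the full matrix)
```

&lt;p&gt;Seasoning the ramen touches well under one percent of the recipe; reinventing it as Budae-Jjigae (full tuning) touches all of it.&lt;/p&gt;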

&lt;p&gt;At the Gemini restaurant, you have to eat from the set menu. But with the Gemma instant noodle, you have the freedom to change the flavor however you like.&lt;/p&gt;

&lt;p&gt;If you visit the &lt;a href="https://deepmind.google/models/gemma/gemmaverse/" rel="noopener noreferrer"&gt;Gemmaverse&lt;/a&gt;, you can check out various tasty experiments people have made.&lt;/p&gt;

&lt;h1&gt;
  
  
  But There Are a Few Caveats!
&lt;/h1&gt;

&lt;p&gt;Of course, to cook instant noodles, you need a few supplies.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Your Own Kitchen:&lt;/strong&gt; You need a pot and a stove with good heat; in other words, a computer with a &lt;strong&gt;High-Performance GPU&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cooking Utensils:&lt;/strong&gt; You need tools like a ladle or chopsticks, which correspond to the right environment and frameworks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cooking Skills:&lt;/strong&gt; Most importantly, you need the &lt;strong&gt;cooking (development) knowledge&lt;/strong&gt; to know how much water to add and how long to boil it. If you can't cook at all, even the most delicious instant noodles might remain "pie in the sky".&lt;/li&gt;
&lt;/ol&gt;

&lt;h1&gt;
  
  
  To Summarize
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Gemini:&lt;/strong&gt; "Forget the hassle, I want to eat the most delicious ramen prepared by Chef Google right now!" 😋&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Gemma:&lt;/strong&gt; "I want to use my own pot to cook my own custom ramen that fits my taste perfectly!" 🧑‍🍳&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Do the differences between Gemini and Gemma feel a bit more relatable now?&lt;br&gt;
If you want to enjoy a comfortable service, go with Gemini. If you want to tinker and build your own AI, give Gemma a try.&lt;/p&gt;

&lt;p&gt;Wishing you some delicious coding in your kitchen (PC) today!&lt;/p&gt;

&lt;h1&gt;
  
  
  Behind the Scenes of the 4-Panel Comic
&lt;/h1&gt;

&lt;p&gt;How did you enjoy the 4-panel comic included in this post?&lt;/p&gt;

&lt;p&gt;Actually, there's a little backstory to how this comic was born. Have you by any chance seen the cute &lt;a href="https://x.com/googlejapan/status/1862051304228434163" rel="noopener noreferrer"&gt;Chrome character&lt;/a&gt; from Google Japan's X (Twitter)?&lt;/p&gt;

&lt;p&gt;Looking at that little friend, I suddenly thought, &lt;em&gt;"It would be great if our Gemini and Gemma had characters like that too."&lt;/em&gt; So, feeling a bit shy about it, I summoned all of my (lacking) drawing skills and scribbled a very simple draft first.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkv4vb6zcfd5i775v7qj6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkv4vb6zcfd5i775v7qj6.png" alt="Gemini and Gemma character draft" width="565" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;(A truly "simple" draft, isn't it? Haha 😅)&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Next, I took this clumsy character sketch and rough story to &lt;a href="https://gemini.google/overview/image-generation/" rel="noopener noreferrer"&gt;Gemini's image generation feature&lt;/a&gt;, which carries the nickname "Nano Banana." I asked it, &lt;em&gt;"Please draw a 4-panel comic with these kids based on this story!"&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;With the power of AI, the artwork came out looking cool, but one last important task remained. As you know, my blog operates in three languages!&lt;/p&gt;

&lt;p&gt;To ensure readers all over the world could enjoy the story of these cute kids, I translated and edited the dialogue in the speech bubbles to fit each language, finally completing the comic.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft943cx035z1ups0vxtnb.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft943cx035z1ups0vxtnb.gif" alt="Gemini and Gemma translate" width="800" height="432"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I hope you enjoyed it!&lt;/p&gt;

</description>
      <category>gemini</category>
      <category>gemma</category>
      <category>ai</category>
    </item>
    <item>
      <title>Threading the Beads: Coding in the Era of AI</title>
      <dc:creator>bebechien</dc:creator>
      <pubDate>Thu, 22 Jan 2026 04:57:54 +0000</pubDate>
      <link>https://dev.to/googleai/threading-the-beads-coding-in-the-era-of-ai-211h</link>
      <guid>https://dev.to/googleai/threading-the-beads-coding-in-the-era-of-ai-211h</guid>
      <description>&lt;h1&gt;
  
  
  Threading the Beads: Coding in the Era of AI
&lt;/h1&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;"Even a bushel of beads isn't a treasure until they are threaded."&lt;/strong&gt; (Korean Proverb)&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Whenever I use Gemini to create something, this old proverb comes to mind.&lt;/p&gt;

&lt;p&gt;Today, Gemini pours out an endless stream of "beads"—the code. In the past, each bead was a precious thing I had to carve by hand, but now they are so abundant they roll around under my feet.&lt;/p&gt;

&lt;p&gt;However, if they are just scattered, they remain nothing more than fragments of data. Ultimately, it is still my job to &lt;strong&gt;thread them into a complete, meaningful whole.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;That’s why these days, I prefer to define myself as a &lt;strong&gt;"Creator."&lt;/strong&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  More than an "OK" Button
&lt;/h3&gt;

&lt;p&gt;Of course, there are moments of mild existential dread when I’m just hitting Enter or clicking &lt;code&gt;OK&lt;/code&gt; on code Gemini has written for me.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;"Am I just a biological machine that eats food and clicks 'OK'—a (productive) poop 💩 generator?"&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;This is why I try even harder to keep my wits about me. I focus on which colors to pick from the beads Gemini pours out and in what order to thread them to create a beautiful necklace. I remind myself to pour all my senses and intellect into the &lt;strong&gt;"Planning"&lt;/strong&gt; and &lt;strong&gt;"Intent."&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Otherwise, I might fall into a bottomless pit of despair, wondering if I'm a real human developer or just a "productive algorithm" that spits out results when fed data. (Well, at least results are coming out, so I pat myself on the back for being a "highly productive being." 😄)&lt;/p&gt;

&lt;h3&gt;
  
  
  Where Code Vanishes, Only "Intent" Remains
&lt;/h3&gt;

&lt;p&gt;As Generative AI brings the cost of coding toward zero, there is a sentiment many now agree with:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;"In an era where anyone can be a creator, the 'What' (the idea) matters more than the 'How' (the implementation)."&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I recently found a fascinating GitHub repository called &lt;strong&gt;&lt;a href="https://github.com/dbreunig/whenwords" rel="noopener noreferrer"&gt;whenwords&lt;/a&gt;&lt;/strong&gt;. Surprisingly, it is an open-source library with &lt;strong&gt;zero lines of code—only "Specifications."&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Inspired by this, I decided to start a similar experiment called &lt;strong&gt;&lt;code&gt;history-of-video-game&lt;/code&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;🔗 &lt;strong&gt;&lt;a href="https://github.com/bebechien/history-of-video-game" rel="noopener noreferrer"&gt;Github: bebechien/history-of-video-game&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Collecting the "Blueprints"
&lt;/h3&gt;

&lt;p&gt;The first bead I’ve picked is the classic masterpiece &lt;strong&gt;&lt;a href="https://github.com/bebechien/history-of-video-game/blob/main/pong.md" rel="noopener noreferrer"&gt;Pong&lt;/a&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;I’m not sure what kind of necklace this repository will become. But for now, I plan to collect the specifications of games released in the past, one by one.&lt;/p&gt;

&lt;p&gt;If you’re curious how these specs turn into a "treasure," check out &lt;strong&gt;&lt;a href="https://gemini.google.com/share/dde7d4c8deea" rel="noopener noreferrer"&gt;this link&lt;/a&gt;&lt;/strong&gt; to see Gemini's implementation.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4wu8w78s9ivxys4alloa.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4wu8w78s9ivxys4alloa.gif" alt="PONG built by Gemini" width="800" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The process with current technology is surprisingly simple. Go to &lt;a href="https://gemini.google.com" rel="noopener noreferrer"&gt;gemini.google.com&lt;/a&gt;, turn on the &lt;strong&gt;"Canvas"&lt;/strong&gt; feature, and hand over the game spec (&lt;code&gt;pong.md&lt;/code&gt;). Then, just say one thing: &lt;strong&gt;"Build this game."&lt;/strong&gt; The code appears like magic.&lt;/p&gt;
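
&lt;p&gt;To make "the spec is the treasure" concrete, here is the kind of one-sentence rule a document like &lt;code&gt;pong.md&lt;/code&gt; states in words ("the ball reflects off the top and bottom walls"), written out as one way it comes out as code. The court height and velocity are invented for the demo, not values from the actual spec.&lt;/p&gt;

```python
# A spec sentence, "the ball reflects off the top and bottom walls",
# turned into code. Court height and velocity are invented for the demo.
def step(y, vy, height=100):
    """Advance the ball one tick; flip vy when it would leave the court."""
    y = y + vy
    if y >= height or 0 >= y:
        vy = -vy
        y = max(0, min(height, y))
    return y, vy

y, vy = 95, 10
for _ in range(3):
    y, vy = step(y, vy)
print(y, vy)  # 80 -10
```

&lt;p&gt;The implementation is disposable; the rule it encodes is the bead worth keeping.&lt;/p&gt;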

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg3vf75l3b63go2khwt7h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg3vf75l3b63go2khwt7h.png" alt="Canvas in Gemini" width="800" height="498"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu5ru3r1r265js2b5undb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu5ru3r1r265js2b5undb.png" alt="Prompt to Gemini" width="800" height="498"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We live in an era where AI can rewrite code anytime, as many times as needed. That’s why I believe the &lt;strong&gt;essential logic of "how it should work"&lt;/strong&gt; is more precious than the code itself.&lt;/p&gt;

&lt;p&gt;Whether it’s a Pull Request, a request for a new game, or a small contribution—you are always welcome. I’m looking for people to thread these beads with me.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Future We Face
&lt;/h3&gt;

&lt;p&gt;Perhaps in the future, downloading finished games from an App Store will be a thing of the past.&lt;/p&gt;

&lt;p&gt;Sharing game "specs" on social media, thinking, "Oh, this rule looks fun," and then bringing it over to remix it into your own style—won't that become a common way to play?&lt;/p&gt;

&lt;p&gt;Services like &lt;strong&gt;&lt;a href="https://www.astrocade.com/" rel="noopener noreferrer"&gt;Astrocade&lt;/a&gt;&lt;/strong&gt; are already showing us glimpses of that experimental future.&lt;/p&gt;

&lt;p&gt;In an era overflowing with beads, what kind of treasure are you threading today?&lt;/p&gt;

</description>
      <category>gemini</category>
      <category>ai</category>
    </item>
    <item>
      <title>I Trained a Tiny AI to Judge My Hacker News Feed (And You Can Too)</title>
      <dc:creator>bebechien</dc:creator>
      <pubDate>Wed, 21 Jan 2026 01:16:31 +0000</pubDate>
      <link>https://dev.to/googleai/i-trained-a-tiny-ai-to-judge-my-hacker-news-feed-and-you-can-too-6og</link>
      <guid>https://dev.to/googleai/i-trained-a-tiny-ai-to-judge-my-hacker-news-feed-and-you-can-too-6og</guid>
      <description>&lt;h1&gt;
  
  
  It’s just too much noise.
&lt;/h1&gt;

&lt;p&gt;Keeping up with AI News is tough. I spend way too much time skimming past titles that &lt;em&gt;look&lt;/em&gt; techy but are totally irrelevant to what I actually want. Keyword filters are brittle because they miss the nuance.&lt;/p&gt;

&lt;p&gt;I wanted a way to filter news based on “vibes”, not just regex strings.&lt;/p&gt;

&lt;p&gt;So, I’ve been playing around with the &lt;strong&gt;EmbeddingGemma Tuning Lab&lt;/strong&gt;, a new Hugging Face Space that provides a tool for fine-tuning Google’s &lt;code&gt;embeddinggemma-300m&lt;/code&gt; model to understand your specific personal taste.&lt;/p&gt;

&lt;h1&gt;
  
  
  The Vibe Check
&lt;/h1&gt;

&lt;p&gt;The coolest part about this project is that it doesn’t rely on a massive LLM prompting strategy. It uses &lt;strong&gt;&lt;a href="https://huggingface.co/collections/google/embeddinggemma" rel="noopener noreferrer"&gt;EmbeddingGemma&lt;/a&gt;&lt;/strong&gt;, a lightweight 300M parameter model. Because it’s an embedding model, it turns text into vectors. Check out &lt;a href="https://developers.googleblog.com/gemma-explained-embeddinggemma-architecture-and-recipe/" rel="noopener noreferrer"&gt;my blog post&lt;/a&gt; if you want to learn more about how the model works and how it was trained.&lt;/p&gt;

&lt;p&gt;The core idea is actually pretty funny but effective. The system relies on a "Semantic Similarity" score against a hard-coded anchor phrase: &lt;code&gt;MY_FAVORITE_NEWS&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;By default, the model doesn't know what that means. But by fine-tuning it, you warp the model's understanding of the universe so that articles you &lt;em&gt;actually&lt;/em&gt; like are mathematically closer to that magic phrase, and the ones you hate are pushed away.&lt;/p&gt;
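
&lt;p&gt;In code terms, the scoring step is nothing more than a cosine similarity between each story's embedding and the embedding of the anchor phrase. A minimal sketch, with made-up 2-D vectors standing in for real &lt;code&gt;embeddinggemma-300m&lt;/code&gt; outputs:&lt;/p&gt;

```python
# Sketch of the anchor-phrase trick: score = cosine similarity between a
# story's embedding and the embedding of "MY_FAVORITE_NEWS".
# Toy 2-D vectors stand in for real embeddinggemma-300m outputs.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

anchor = (1.0, 0.0)  # pretend embedding of "MY_FAVORITE_NEWS"
stories = {
    "New open-weights model released": (0.9, 0.3),   # pulled close by tuning
    "Celebrity gadget rumor roundup":  (-0.2, 1.0),  # pushed away by tuning
}
scores = {title: cosine(anchor, vec) for title, vec in stories.items()}
for title, s in scores.items():
    print(title, round(s, 2))
```

&lt;p&gt;Fine-tuning moves the story vectors, not the scoring formula: liked stories end up near the anchor, skipped ones far from it.&lt;/p&gt;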

&lt;h1&gt;
  
  
  The "EmbeddingGemma Tuning Lab": 3 Ways to Run It
&lt;/h1&gt;

&lt;p&gt;The EmbeddingGemma Tuning Lab isn't just a training script; it contains three different apps depending on how you like to experiment:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;The Trainer (Gradio):&lt;/strong&gt; This is where the magic happens. You load up the Gradio app, it pulls the current top 10 &lt;a href="https://news.ycombinator.com/" rel="noopener noreferrer"&gt;Hacker News&lt;/a&gt; stories, and you just check the boxes next to the ones you like. Click "Fine-Tune", and under the hood, it uses &lt;a href="https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss" rel="noopener noreferrer"&gt;MultipleNegativesRankingLoss&lt;/a&gt; to update the model. You can literally watch the semantic search results shift in real-time.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;The Terminal Viewer (CLI):&lt;/strong&gt; This one is for the true terminal junkies. It’s an interactive CLI app that lets you scroll through the live feed. It color-codes the stories based on the model's score - green for "good vibes," red for skips.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;The Web Viewer (Flask):&lt;/strong&gt; Once you're happy with the model, there’s a lightweight Flask app included. You can deploy this as a standalone "Mood Reader" on a local server just to have your personalized feed running in the background.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
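
&lt;p&gt;The loss named above has a simple shape: in a batch of (anchor, liked-story) pairs, every other story in the batch serves as a free negative, and training pushes the highest similarity onto the true pair. A tiny numpy sketch of that idea, not the sentence-transformers implementation (the scale factor of 20 mirrors its default):&lt;/p&gt;

```python
# Tiny numpy illustration of the in-batch-negatives idea behind
# MultipleNegativesRankingLoss (a sketch, not the library's code).
import numpy as np

def mnr_loss(anchors, positives, scale=20.0):
    """Each anchor's positive is every other anchor's negative."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    sims = a @ p.T * scale  # scaled cosine-similarity matrix
    # cross-entropy where the "right answer" for row i is column i
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Matched pairs give a near-zero loss; shuffled pairs give a large one.
good = mnr_loss(np.eye(3), np.eye(3))
bad = mnr_loss(np.eye(3), np.roll(np.eye(3), 1, axis=0))
print(good, bad)
```

&lt;p&gt;Minimizing this loss is exactly the "warping" described earlier: stories you checked get pulled toward the anchor, everything else in the batch gets pushed away.&lt;/p&gt;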

&lt;h1&gt;
  
  
  Try It Out
&lt;/h1&gt;

&lt;p&gt;If you want to stop doomscrolling and start vibe-checking your news, check out the space or grab the code. It handles the data fetching, the training loop, and the visualization for you.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Check out the Space:&lt;/strong&gt; &lt;a href="https://huggingface.co/spaces/google/embeddinggemma-tuning-lab" rel="noopener noreferrer"&gt;EmbeddingGemma Tuning Lab&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;See the Code:&lt;/strong&gt; The &lt;a href="https://huggingface.co/spaces/google/embeddinggemma-tuning-lab/tree/main" rel="noopener noreferrer"&gt;repo&lt;/a&gt; includes everything you need to export your dataset and download your fine-tuned model as a ZIP.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Happy tuning!&lt;/p&gt;

</description>
      <category>embeddinggemma</category>
      <category>gemma</category>
      <category>ai</category>
    </item>
    <item>
      <title>On the 'Joy of Creating' in the Age of AI</title>
      <dc:creator>bebechien</dc:creator>
      <pubDate>Sun, 18 Jan 2026 03:38:43 +0000</pubDate>
      <link>https://dev.to/bebechien/on-the-joy-of-creating-in-the-age-of-ai-hm4</link>
      <guid>https://dev.to/bebechien/on-the-joy-of-creating-in-the-age-of-ai-hm4</guid>
      <description>&lt;h1&gt;
  
  
  Why we still build, even when machines can do it faster.
&lt;/h1&gt;

&lt;p&gt;I love scribbling down thoughts. There is a specific joy in taking difficult, complex concepts and breaking them down into soft, digestible pieces that anyone can understand.&lt;/p&gt;

&lt;p&gt;I also enjoy drawing. Although my skills are strictly "programmer art" level, I believe a single clean diagram is often far more powerful than a hundred words.&lt;/p&gt;

&lt;p&gt;And above all, I love coding. I truly cherish the process of creating something that actually &lt;em&gt;works&lt;/em&gt; right at my fingertips.&lt;/p&gt;

&lt;p&gt;I first dipped my toes into the massive wave of Generative AI back in 2023, right when Stable Diffusion was starting to take off. I had poked around game AI before that, but looking back, the shift that began then seems to be shaking the entire IT ecosystem to its roots.&lt;/p&gt;

&lt;p&gt;At the time, I felt a strange thirst when looking at models trained primarily on Western art styles. So, I spent time &lt;a href="https://huggingface.co/datasets/bebechien/shinyunbok_xl" rel="noopener noreferrer"&gt;painstakingly crafting datasets&lt;/a&gt; and fell deeply into the &lt;a href="https://civitai.com/user/bebechien" rel="noopener noreferrer"&gt;fun of teaching AI&lt;/a&gt; the brushstrokes of &lt;a href="https://en.wikipedia.org/wiki/Sin_Yunbok" rel="noopener noreferrer"&gt;Shin Yun-bok&lt;/a&gt;, a painter from the Joseon Dynasty.&lt;/p&gt;

&lt;p&gt;Then came a period where I felt infinitely small in front of the high-quality images pouring out so effortlessly. What sustained me through that overwhelming feeling was the realization that "teaching a new style and setting the direction" was still a human task.&lt;/p&gt;

&lt;p&gt;A similar sense of helplessness arrived with writing. Watching models evolve from ChatGPT and Gemini, I witnessed AI’s writing skills quickly surpass my own. However, I realized that deciding &lt;em&gt;what&lt;/em&gt; to write, bearing the weight of a piece published under my name, and finally putting the period at the end of the sentence is something only "I" can do. That sense of responsibility is something AI cannot take away.&lt;/p&gt;

&lt;p&gt;Is coding any different? Although I still do a lot of the typing myself, my jaw drops at the speed of evolution every time a new model is released.&lt;/p&gt;

&lt;p&gt;When it comes to writing or drawing, AI is already the superior player. So, a collaborative process has settled into my daily life: I throw out a rough draft, the AI polishes it smoothly, and I do the final review and adjustments. As the models get smarter, the parts I touch are becoming fewer and fewer. I have a hunch that coding will follow this exact process before long.&lt;/p&gt;

&lt;p&gt;At an &lt;a href="https://luma.com/6iyf5pca" rel="noopener noreferrer"&gt;AI Workshop&lt;/a&gt; I attended yesterday, someone asked me a heavy question:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"What on earth should humans do in the future?"&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;As is the case now, even more things will be automated by AI in the future.&lt;/p&gt;

&lt;p&gt;However, people like me will still want to make things ourselves. Even if we borrow the power of a tool as potent as AI, the starting point and the intention of that creation will still remain with the "person."&lt;/p&gt;

&lt;p&gt;There will be a clear distinction between what AI generates because it "wants" to (if ever), and what a human creates with specific intent. The value will likely be assessed differently, too.&lt;/p&gt;

&lt;p&gt;Isn't it similar to the variety of choices we have when we need a chair?&lt;/p&gt;

&lt;p&gt;Sure, you can pay money and buy a comfortable, finished product. But some enjoy the process of buying parts from IKEA and assembling them; others buy the tools and cut the lumber to build from scratch; and some even choose the primitive labor of carving wood by hand, chip by chip.&lt;/p&gt;

&lt;p&gt;Just as we pay different prices for factory-made goods and artisanal handicrafts today, I believe the "us" of the future will continue to live on, assigning different meanings based on the "process" and "values" embedded in the result.&lt;/p&gt;

</description>
      <category>ai</category>
    </item>
  </channel>
</rss>
