<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Yakub</title>
    <description>The latest articles on DEV Community by Yakub (@ykbmck).</description>
    <link>https://dev.to/ykbmck</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1371983%2Fca168c96-8551-431a-b04e-a53790d6ef7c.png</url>
      <title>DEV Community: Yakub</title>
      <link>https://dev.to/ykbmck</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ykbmck"/>
    <language>en</language>
    <item>
      <title>Running Local LLMs in Game Engines - Here's My Journey with Godot + Ollama</title>
      <dc:creator>Yakub</dc:creator>
      <pubDate>Sat, 27 Dec 2025 14:22:03 +0000</pubDate>
      <link>https://dev.to/ykbmck/running-local-llms-in-game-engines-heres-my-journey-with-godot-ollama-4hhd</link>
      <guid>https://dev.to/ykbmck/running-local-llms-in-game-engines-heres-my-journey-with-godot-ollama-4hhd</guid>
      <description>&lt;p&gt;I randomly got an idea yesterday. AI is everywhere right? &lt;strong&gt;Well let's make it even worse.&lt;/strong&gt; I realized I've never actually looked into integrating LLMs into games. Particularly game engines like Unreal, Unity, or Godot.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fko9btt2svkqxodlszbym.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fko9btt2svkqxodlszbym.png" alt="AI Funny GIF" width="480" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I've always loved playing around with game engines in the past, though I never really made a full game. So I was like, "let's just do it".&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The Spark: FunctionGemma&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;While researching, I came across Google releasing &lt;a href="https://blog.google/technology/developers/functiongemma/" rel="noopener noreferrer"&gt;FunctionGemma&lt;/a&gt; - a model specifically designed for &lt;strong&gt;function calling&lt;/strong&gt; from natural language. Basically, it takes text input and can identify when to call specific functions and with what parameters.&lt;/p&gt;

&lt;p&gt;This immediately clicked for me. Theoretically, I could build something like a helper bot in my game that actually understands player commands: "Go mine some iron", "Pick up all the dropped items nearby", "Build a solar panel over there"... The LLM would parse the intent and trigger the appropriate game functions.&lt;/p&gt;

&lt;p&gt;And the best part? It's small enough that I can run it completely locally on my RTX 3070. No API calls needed.&lt;/p&gt;

&lt;h2&gt;
  
  
  Time to Learn Local LLMs
&lt;/h2&gt;

&lt;p&gt;I've never run an LLM locally before. Didn't have any idea how to do it - I'd always just used OpenAI APIs. So I was like, now is the time!&lt;/p&gt;

&lt;p&gt;Researching how to connect a game to an AI led me to an important realization: running the LLM directly inside the game engine isn't ideal, especially for early testing and development. You want a separate server handling the inference.&lt;/p&gt;

&lt;p&gt;I decided to give &lt;strong&gt;Ollama&lt;/strong&gt; a try for the LLM server. For the game engine, I went with &lt;strong&gt;Godot&lt;/strong&gt; since I could quickly play with it. I downloaded it, set up the &lt;strong&gt;FunctionGemma&lt;/strong&gt; model with a simple &lt;code&gt;ollama pull functiongemma&lt;/code&gt;, and had a local LLM server running &lt;strong&gt;in minutes&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3i5x6tjr1zll2h7sxt4w.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3i5x6tjr1zll2h7sxt4w.gif" alt=" " width="270" height="195"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Connecting Godot to Ollama
&lt;/h2&gt;

&lt;p&gt;Now came the fun part: making Godot talk to the LLM.&lt;/p&gt;

&lt;p&gt;Godot has built-in &lt;code&gt;HTTPRequest&lt;/code&gt; nodes that make this surprisingly straightforward. I just needed to learn a bit of GDScript: how to make the request to the server and how to handle the response.&lt;/p&gt;

&lt;p&gt;The basic flow looks like this:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create an &lt;code&gt;HTTPRequest&lt;/code&gt; node in your scene&lt;/li&gt;
&lt;li&gt;Send a POST request to &lt;code&gt;http://127.0.0.1:11434/api/chat&lt;/code&gt; with your message&lt;/li&gt;
&lt;li&gt;Parse the JSON response to get the AI's reply&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Here's the simplified concept:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight gdscript"&gt;&lt;code&gt;&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="n"&gt;OLLAMA_URL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"http://127.0.0.1:11434/api/chat"&lt;/span&gt;
&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="n"&gt;MODEL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;"functiongemma"&lt;/span&gt;
&lt;span class="k"&gt;func&lt;/span&gt; &lt;span class="nf"&gt;send_request&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;void&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;var&lt;/span&gt; &lt;span class="n"&gt;body&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="s2"&gt;"model"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;MODEL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="s2"&gt;"messages"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;conversation_messages&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="s2"&gt;"tools"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;tools&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;  &lt;span class="c1"&gt;# Your game function definitions&lt;/span&gt;
        &lt;span class="s2"&gt;"stream"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="bp"&gt;false&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;var&lt;/span&gt; &lt;span class="n"&gt;json_body&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;JSON&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;body&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;var&lt;/span&gt; &lt;span class="n"&gt;headers&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"Content-Type: application/json"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="n"&gt;http_request&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;OLLAMA_URL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;HTTPClient&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="n"&gt;METHOD_POST&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;json_body&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
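
&lt;p&gt;Handling the response works through the node's &lt;code&gt;request_completed&lt;/code&gt; signal. Here's a rough sketch of how that could look, assuming the same &lt;code&gt;http_request&lt;/code&gt; node and &lt;code&gt;conversation_messages&lt;/code&gt; array as above (adapt it to your own setup):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight gdscript"&gt;&lt;code&gt;func _ready() -&amp;gt; void:
    # Fires once the POST from send_request() comes back
    http_request.request_completed.connect(_on_request_completed)

func _on_request_completed(result: int, response_code: int, headers: PackedStringArray, body: PackedByteArray) -&amp;gt; void:
    var response = JSON.parse_string(body.get_string_from_utf8())
    if typeof(response) != TYPE_DICTIONARY:
        push_error("Could not parse the Ollama response")
        return
    var message = response["message"]
    conversation_messages.append(message)  # keep the history for the next turn
    if message.has("tool_calls"):
        _handle_tool_calls(message["tool_calls"])  # hand off to your own dispatch function
    else:
        print(message["content"])  # plain text reply
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;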



&lt;p&gt;For function calling, you define your available "tools" (game functions) with their parameters. The LLM then decides which functions to call based on the user's request:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight gdscript"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="s2"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"function"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="s2"&gt;"function"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="s2"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"get_weather"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="s2"&gt;"description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"Get the current weather for a location"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="s2"&gt;"parameters"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="s2"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"object"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="s2"&gt;"properties"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="s2"&gt;"location"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="s2"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="s2"&gt;"description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"The city and country"&lt;/span&gt;
                &lt;span class="p"&gt;}&lt;/span&gt;
            &lt;span class="p"&gt;},&lt;/span&gt;
            &lt;span class="s2"&gt;"required"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"location"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When the LLM responds with &lt;code&gt;tool_calls&lt;/code&gt;, you execute those functions locally and send the results back. It's a conversation: &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;user → LLM → function call → result → LLM → final response&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Here's the important part: the &lt;strong&gt;LLM doesn't actually execute anything&lt;/strong&gt;. It just returns structured data saying "hey, I think you should call &lt;code&gt;mine_resource&lt;/code&gt; with these parameters." &lt;strong&gt;Your game code makes the final decision.&lt;/strong&gt; So this is not some AI slop game that breaks or does super weird stuff.&lt;/p&gt;
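
&lt;p&gt;To make that dispatch step concrete, here's one way it could look (a hypothetical sketch - the function names and the &lt;code&gt;match&lt;/code&gt; table are stand-ins for whatever your game actually exposes):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight gdscript"&gt;&lt;code&gt;func _handle_tool_calls(tool_calls: Array) -&amp;gt; void:
    for call in tool_calls:
        var fn_name = call["function"]["name"]
        var args = call["function"]["arguments"]
        var result = "Unknown function: " + fn_name
        match fn_name:
            "mine_resource":
                result = mine_resource(args.get("resource", ""))  # your game code
            "get_weather":
                result = get_weather(args.get("location", ""))
        # Feed the result back so the model can write its final reply
        conversation_messages.append({"role": "tool", "content": str(result)})
    send_request()  # second round trip: results in, final response out
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;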

&lt;h2&gt;
  
  
  What I Learned
&lt;/h2&gt;

&lt;p&gt;I loved this whole process. It made me understand LLMs and all the stuff around running them on a much deeper level — how inference servers work, function/tool calling behind the scenes, the round-trip conversation flow and much more cool stuff!&lt;/p&gt;

&lt;h2&gt;
  
  
  What's Next
&lt;/h2&gt;

&lt;p&gt;I definitely plan to play with this &lt;strong&gt;a lot more&lt;/strong&gt;. There are some genuinely interesting use cases where LLMs in games could actually be useful - not in the bad way we sometimes see today, where "AI-first" is just shoved everywhere and pushed in our faces whether we want it or not.&lt;/p&gt;

&lt;p&gt;Some ideas I want to explore:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Intelligent NPC companions that understand context and can perform complex tasks&lt;/li&gt;
&lt;li&gt;Natural language command interfaces for strategy or simulation games&lt;/li&gt;
&lt;li&gt;Dynamic dialogue systems that don't feel like scripted trees&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Oh, and if you want to &lt;strong&gt;try this yourself&lt;/strong&gt; 👉 I've put together a &lt;a href="https://github.com/yakubmurcek/godot-gemma-ollama-demo" rel="noopener noreferrer"&gt;working demo&lt;/a&gt; you can clone and run right now. It's a minimal Godot project with everything set up: the HTTP client, function definitions, and the full round-trip conversation flow.&lt;/p&gt;

&lt;p&gt;Feel free to use it as a starting point and build on top of it!&lt;/p&gt;

&lt;p&gt;If you're also thinking about experimenting with this, &lt;strong&gt;let's connect&lt;/strong&gt;! The barrier to entry is surprisingly low, and you'll learn a ton in the process. I study economics and come from web dev!&lt;/p&gt;

&lt;p&gt;Have you tried integrating LLMs into game engines? I'd love to hear about your experiences in the comments!&lt;/p&gt;

</description>
      <category>godot</category>
      <category>godotengine</category>
      <category>gamedev</category>
      <category>ai</category>
    </item>
    <item>
      <title>My Thesis Accidentally Made Me a Data Scientist</title>
      <dc:creator>Yakub</dc:creator>
      <pubDate>Fri, 26 Dec 2025 18:38:25 +0000</pubDate>
      <link>https://dev.to/ykbmck/my-thesis-accidentally-made-me-a-data-scientist-ol0</link>
      <guid>https://dev.to/ykbmck/my-thesis-accidentally-made-me-a-data-scientist-ol0</guid>
      <description>&lt;h2&gt;
  
  
  Prologue
&lt;/h2&gt;

&lt;p&gt;So after months of procrastination, I finally started my master's thesis. You know how it is. Anyway, I'm finishing my economics degree and I needed a &lt;strong&gt;topic&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;I went with something that actually kind of interested me: scanning and &lt;strong&gt;analyzing job postings&lt;/strong&gt; in the IT industry. The plan was to compare the EU vs US vs India – what skills are employers looking for, what's different across regions, that kind of stuff.&lt;/p&gt;

&lt;p&gt;So I started scraping Glassdoor for data. Easy enough, right? Then came the fun part – actually analyzing this stuff.&lt;/p&gt;

&lt;p&gt;See, I figured I'd use Stata or some other analysis software they taught us at school. That's what you do with data, right? Load it into Stata, run some regressions, call it a day.&lt;/p&gt;

&lt;p&gt;Except... I don't really have &lt;strong&gt;numbers&lt;/strong&gt;. I have text. Job descriptions. Thousands of them.&lt;/p&gt;

&lt;p&gt;And Stata doesn't really vibe with &lt;strong&gt;text&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;So I'd need to use something like Python anyway to process all this text before I could even think about analysis. And at that point I was like – aight, let's just do it &lt;strong&gt;ALL&lt;/strong&gt; in Python then. lol.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Never&lt;/strong&gt; coded in Python before, by the way. It couldn't go that badly, right?&lt;/p&gt;

&lt;p&gt;Well guess what. It actually started to be fun.&lt;/p&gt;

&lt;p&gt;And it turns out the methodology is going to be way more advanced and complicated than I expected. We're talking NLP, LLM pipelines, structured data extraction – the whole thing. This isn't just "import data, run analysis" anymore. This is actual engineering (I think – don't judge me, I come from React). And the problems I'm solving feel like they're actually helping me think more clearly. The decisions I make along the way are shaping the thesis, and I like that.&lt;/p&gt;

&lt;p&gt;I'm on day 3 now.&lt;/p&gt;

&lt;p&gt;My codebase is roughly &lt;strong&gt;5,800 lines&lt;/strong&gt; of Python.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxmrmr8gvpq2sxf5ssyyx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxmrmr8gvpq2sxf5ssyyx.png" alt="Here's proof since I didn't believe it either" width="800" height="116"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Just the Python files. We're not counting configs or existential crisis logs.&lt;/p&gt;

&lt;p&gt;I came here to write an economics thesis. I think I might be leaving as... something else entirely. A Python data analyst, or whatever they call it.&lt;/p&gt;

&lt;p&gt;And honestly? I'm kind of here for it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0inawkqwql95r5q04j2n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0inawkqwql95r5q04j2n.png" alt="Some Python code for better SEO" width="800" height="447"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the following posts, I'll dive into the interesting stuff I'm solving while building this project. Drop a follow if you want to follow along!&lt;/p&gt;

</description>
      <category>ai</category>
      <category>beginners</category>
      <category>python</category>
      <category>datascience</category>
    </item>
  </channel>
</rss>
