<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Muhammad Ikhwan Fathulloh</title>
    <description>The latest articles on DEV Community by Muhammad Ikhwan Fathulloh (@muhammadikhwanfathulloh).</description>
    <link>https://dev.to/muhammadikhwanfathulloh</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3869125%2Fdda57b91-6655-4326-803e-6700eebe3247.jpg</url>
      <title>DEV Community: Muhammad Ikhwan Fathulloh</title>
      <link>https://dev.to/muhammadikhwanfathulloh</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/muhammadikhwanfathulloh"/>
    <language>en</language>
    <item>
      <title>Professional Inverse Kinematics on Arduino: Deep Dive into NocKinematics</title>
      <dc:creator>Muhammad Ikhwan Fathulloh</dc:creator>
      <pubDate>Tue, 21 Apr 2026 08:04:28 +0000</pubDate>
      <link>https://dev.to/muhammadikhwanfathulloh/professional-inverse-kinematics-on-arduino-deep-dive-into-nockinematics-7hi</link>
      <guid>https://dev.to/muhammadikhwanfathulloh/professional-inverse-kinematics-on-arduino-deep-dive-into-nockinematics-7hi</guid>
      <description>&lt;p&gt;Building a robotic arm often starts with a simple challenge: &lt;em&gt;"I want the claw to move to point (X, Y, Z)."&lt;/em&gt; However, translating a 3D coordinate into specific servo angles is a nightmare of trigonometry and matrix calculus.&lt;/p&gt;

&lt;p&gt;While professional robotics often use the &lt;strong&gt;Jacobian Inverse&lt;/strong&gt; or &lt;strong&gt;Cyclic Coordinate Descent (CCD)&lt;/strong&gt;, these methods are either too "heavy" for an Arduino or produce jerky, unnatural movements.&lt;/p&gt;

&lt;p&gt;Today, we are exploring &lt;strong&gt;NocKinematics&lt;/strong&gt;, a lightweight C++ library that brings the industry-standard &lt;strong&gt;FABRIK&lt;/strong&gt; algorithm to the world of microcontrollers.&lt;/p&gt;




&lt;h2&gt;
  
  
  🔬 The Science: Why FABRIK?
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;NocKinematics&lt;/strong&gt; is built upon the foundational research of &lt;strong&gt;Andreas Aristidou&lt;/strong&gt; and &lt;strong&gt;Joan Lasenby&lt;/strong&gt;. If you are looking for the academic rigor behind this library, you should refer to the original paper:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.andreasaristidou.com/publications/papers/FABRIK.pdf" rel="noopener noreferrer"&gt;&amp;gt; &lt;strong&gt;Aristidou, A., &amp;amp; Lasenby, J. (2011).&lt;/strong&gt; &lt;em&gt;"FABRIK: A fast, iterative solver for the Inverse Kinematics problem."&lt;/em&gt; Graphical Models, 73(5), 243-260.&lt;br&gt;
&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Why this matters for Embedded Systems:
&lt;/h3&gt;

&lt;p&gt;Unlike traditional methods that rely on expensive trigonometric functions or matrix inversions, &lt;strong&gt;FABRIK&lt;/strong&gt; (Forward And Backward Reaching Inverse Kinematics) uses an iterative approach based on finding points on a line. &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Low Computational Cost:&lt;/strong&gt; Perfect for 8-bit AVR (Arduino Uno) and 32-bit ESP32 alike.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fast Convergence:&lt;/strong&gt; It usually finds a solution in just a few iterations.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Handles Unreachable Targets:&lt;/strong&gt; If a point is out of reach, it gracefully stretches the arm to its maximum extension toward that point.&lt;/li&gt;
&lt;/ol&gt;


&lt;h2&gt;
  
  
  🛠️ Key Features of NocKinematics
&lt;/h2&gt;

&lt;p&gt;Developed by &lt;strong&gt;Muhammad Ikhwan Fathulloh&lt;/strong&gt; and the &lt;strong&gt;Nocturnailed Community&lt;/strong&gt;, this library bridges the gap between game engine math and hardware.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;N-Joint Support:&lt;/strong&gt; Solve kinematics for 2, 3, or even 20+ joints (ideal for snake robots or tentacles).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Memory Optimized:&lt;/strong&gt; It avoids &lt;code&gt;std::vector&lt;/code&gt; to prevent heap fragmentation. It allocates memory exactly once during initialization.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Platform Agnostic:&lt;/strong&gt; Runs on &lt;strong&gt;Arduino Uno, Nano, Mega, ESP8266, and ESP32.&lt;/strong&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;h2&gt;
  
  
  🚀 Step-by-Step Implementation
&lt;/h2&gt;
&lt;h3&gt;
  
  
  1. Define Your Robot's Anatomy
&lt;/h3&gt;

&lt;p&gt;Think of your robot as a collection of "joints" connected by "bones." If you have 4 joints, you have 3 bones.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="cp"&gt;#include&lt;/span&gt; &lt;span class="cpf"&gt;&amp;lt;NocKinematics.h&amp;gt;&lt;/span&gt;&lt;span class="cp"&gt;
&lt;/span&gt;
&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;NUM_JOINTS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="c1"&gt;// Length of each bone segment in your chosen unit (cm, mm, etc.)&lt;/span&gt;
&lt;span class="kt"&gt;float&lt;/span&gt; &lt;span class="n"&gt;boneLengths&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;NUM_JOINTS&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="mf"&gt;10.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;8.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;5.0&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt; 

&lt;span class="c1"&gt;// Initialize the solver on the Heap&lt;/span&gt;
&lt;span class="n"&gt;NocKinematics&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;FABRIK&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;armSolver&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="n"&gt;NocKinematics&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;FABRIK&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;boneLengths&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;NUM_JOINTS&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  2. Set the Anchor
&lt;/h3&gt;

&lt;p&gt;The "Base" is the stationary part of your robot (e.g., the shoulder).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="n"&gt;armSolver&lt;/span&gt;&lt;span class="o"&gt;-&amp;gt;&lt;/span&gt;&lt;span class="n"&gt;setBasePosition&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;NocCore&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;Vector3&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  3. Reach the Target
&lt;/h3&gt;

&lt;p&gt;Instruct the arm to calculate the positions needed to touch a point in space.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="n"&gt;NocCore&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;Vector3&lt;/span&gt; &lt;span class="nf"&gt;target&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;12.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;5.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;3.0&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kt"&gt;bool&lt;/span&gt; &lt;span class="n"&gt;success&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;armSolver&lt;/span&gt;&lt;span class="o"&gt;-&amp;gt;&lt;/span&gt;&lt;span class="n"&gt;solve&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;target&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4. Mapping to Hardware (Servos)
&lt;/h3&gt;

&lt;p&gt;The library gives you &lt;code&gt;Vector3&lt;/code&gt; coordinates for each joint. To move a real servo, you simply use &lt;code&gt;atan2&lt;/code&gt; to find the angle between these points.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;NUM_JOINTS&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;NocCore&lt;/span&gt;&lt;span class="o"&gt;::&lt;/span&gt;&lt;span class="n"&gt;Vector3&lt;/span&gt; &lt;span class="n"&gt;pos&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;armSolver&lt;/span&gt;&lt;span class="o"&gt;-&amp;gt;&lt;/span&gt;&lt;span class="n"&gt;getJointPosition&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Joint "&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;": "&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pos&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;x_val&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;", "&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;println&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pos&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;y_val&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  📂 Real-World Examples Included
&lt;/h2&gt;

&lt;p&gt;The library ships with several built-in examples (&lt;code&gt;File &amp;gt; Examples &amp;gt; NocKinematics&lt;/code&gt;):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;BasicArm.ino&lt;/code&gt;&lt;/strong&gt;: The "Hello World" of kinematics.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;DynamicTarget.ino&lt;/code&gt;&lt;/strong&gt;: Watch the arm follow a moving coordinate in real-time.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;MultiJointSnake.ino&lt;/code&gt;&lt;/strong&gt;: A stress test showing 10+ joints moving smoothly on an Arduino.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;ServoArm4DOF/5DOF&lt;/code&gt;&lt;/strong&gt;: The most practical scripts. They show how to translate $X, Y, Z$ math into actual &lt;code&gt;servo.write(angles)&lt;/code&gt; commands.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  📦 Installation
&lt;/h2&gt;

&lt;p&gt;Get started today by downloading the library through your preferred channel:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Arduino Library Manager:&lt;/strong&gt; Search for &lt;code&gt;NocKinematics&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;GitHub:&lt;/strong&gt; &lt;a href="https://github.com/Nocturnailed-Community/NocKinematics" rel="noopener noreferrer"&gt;Nocturnailed-Community/NocKinematics&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Documentation:&lt;/strong&gt; &lt;a href="https://www.arduinolibraries.info/libraries/noc-kinematics" rel="noopener noreferrer"&gt;Arduino Libraries Info&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Inverse Kinematics doesn't have to be a "math wall" that stops your project. With &lt;strong&gt;NocKinematics&lt;/strong&gt;, you can focus on building the hardware and the behavior, while the FABRIK algorithm handles the heavy lifting of spatial geometry.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;License:&lt;/strong&gt; Released under the &lt;strong&gt;MIT License&lt;/strong&gt;.&lt;br&gt;
&lt;strong&gt;Maintained by:&lt;/strong&gt; Nocturnailed Community.&lt;/p&gt;

&lt;p&gt;#Robotics #Arduino #InverseKinematics #FABRIK #ESP32 #OpenSource #NocLab #Mathematics&lt;/p&gt;

</description>
      <category>arduino</category>
      <category>iot</category>
      <category>cpp</category>
      <category>kinematik</category>
    </item>
    <item>
      <title>Bringing Generative AI to Microcontrollers: Introducing NocLLM</title>
      <dc:creator>Muhammad Ikhwan Fathulloh</dc:creator>
      <pubDate>Tue, 21 Apr 2026 05:30:05 +0000</pubDate>
      <link>https://dev.to/muhammadikhwanfathulloh/bringing-generative-ai-to-microcontrollers-introducing-nocllm-2ii7</link>
      <guid>https://dev.to/muhammadikhwanfathulloh/bringing-generative-ai-to-microcontrollers-introducing-nocllm-2ii7</guid>
      <description>&lt;p&gt;The barrier between resource-constrained hardware and Large Language Models (LLMs) has finally been broken. While microcontrollers lack the VRAM to run a 70B parameter model locally, they can now act as intelligent gateways to the world's most powerful AI engines.&lt;/p&gt;

&lt;p&gt;Enter &lt;strong&gt;NocLLM&lt;/strong&gt;, an optimized integration and inference library designed specifically for Arduino and embedded systems.&lt;/p&gt;




&lt;h2&gt;
  
  
  What is NocLLM?
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;NocLLM&lt;/strong&gt; is a high-performance C++ library that allows microcontrollers to communicate with LLM providers (OpenAI, Gemini, Groq, DeepSeek) or local LLM servers (Ollama, LMStudio) using a &lt;strong&gt;non-blocking, stream-oriented architecture&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Unlike traditional HTTP clients that hang while waiting for a full JSON response, NocLLM parses incoming data chunks in real-time. This means your Arduino can keep reading sensors or driving motors while the AI is "typing" its response.&lt;/p&gt;

&lt;h3&gt;
  
  
  Core Strengths:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Zero-Overhead Streaming:&lt;/strong&gt; Uses background TCP polling to prevent CPU stalls.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Multi-Provider Support:&lt;/strong&gt; One unified syntax for various AI infrastructures.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Smart Parsing:&lt;/strong&gt; Automatically adapts its internal configuration based on the target URL (e.g., switching between Gemini and OpenAI protocols).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Edge-First Design:&lt;/strong&gt; Optimized for memory efficiency, preventing "Out of Memory" crashes during long AI conversations.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  5 Ways to Power Your Hardware with LLMs
&lt;/h2&gt;

&lt;p&gt;NocLLM ships with five comprehensive examples (found in &lt;code&gt;File -&amp;gt; Examples -&amp;gt; NocLLM&lt;/code&gt;) to get you started:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;01_Sumopod:&lt;/strong&gt; DeepSeek-V3 integration via Sumopod Cloud.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;02_OpenAI:&lt;/strong&gt; The industry standard—perfect for GPT-4o or GPT-3.5 Turbo.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;03_Gemini_Native:&lt;/strong&gt; Harness Google’s &lt;code&gt;gemini-3-flash&lt;/code&gt;. NocLLM handles the specific Google GenAI headers and parsing logic automatically.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;04_Groq:&lt;/strong&gt; Experience ultra-low latency with &lt;code&gt;llama3-70b&lt;/code&gt;. Ideal for voice assistants or real-time robotics.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;05_Local_LMStudio:&lt;/strong&gt; The privacy-focused choice. Connect to Ollama or LMStudio on your local network. It uses bare TCP streams with &lt;strong&gt;zero SSL overhead&lt;/strong&gt;, delivering very low latency for local AI setups.&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Technical Spotlight: Non-Blocking Execution
&lt;/h2&gt;

&lt;p&gt;The most powerful feature of NocLLM is its ability to multitask. Here is how simple it is to implement a streaming AI response without freezing your microcontroller:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="cp"&gt;#include&lt;/span&gt; &lt;span class="cpf"&gt;"NocLLM.h"&lt;/span&gt;&lt;span class="cp"&gt;
&lt;/span&gt;
&lt;span class="c1"&gt;// Initialize with your Key, Endpoint, and Model&lt;/span&gt;
&lt;span class="n"&gt;NocAI&lt;/span&gt; &lt;span class="nf"&gt;ai&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"YOUR_API_KEY"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"https://api.openai.com/v1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"gpt-3.5-turbo"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// Callback function triggered as each word/chunk arrives&lt;/span&gt;
&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;onStream&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;String&lt;/span&gt; &lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;setup&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;begin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;115200&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="c1"&gt;// ... [Insert your WiFi connection logic here] ...&lt;/span&gt;

    &lt;span class="c1"&gt;// Attach the listener and trigger a prompt&lt;/span&gt;
    &lt;span class="n"&gt;ai&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;onMessage&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;onStream&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="n"&gt;ai&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;beginStream&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Write a 1-sentence poem about a robot."&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; 
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;loop&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c1"&gt;// This gently pulls the data from the network in the background&lt;/span&gt;
    &lt;span class="n"&gt;ai&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;loop&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="c1"&gt;// Your main logic stays alive!&lt;/span&gt;
    &lt;span class="c1"&gt;// Example: Blink an LED or read a DHT22 sensor here.&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Why Use NocLLM for Your Next Project?
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Interactive Robotics:&lt;/strong&gt; Give your robot a "brain" that can understand complex natural language commands.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Smart Home Hubs:&lt;/strong&gt; Build a private voice assistant that processes logic via a local Ollama server.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Intelligent Data Analysis:&lt;/strong&gt; Send sensor logs to an LLM to receive a human-readable summary of system health.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Installation &amp;amp; Resources
&lt;/h2&gt;

&lt;p&gt;Ready to build the next generation of smart hardware?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Arduino Library Manager:&lt;/strong&gt; Search for &lt;strong&gt;"NocLLM"&lt;/strong&gt; and click install.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;GitHub Repository:&lt;/strong&gt; &lt;a href="https://github.com/Nocturnailed-Community/NocLLM" rel="noopener noreferrer"&gt;Nocturnailed-Community/NocLLM&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Registry Details:&lt;/strong&gt; &lt;a href="https://www.arduinolibraries.info/libraries/noc-llm" rel="noopener noreferrer"&gt;NocLLM on Arduino Libraries&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;NocLLM is part of the &lt;strong&gt;Noc Lab&lt;/strong&gt; ecosystem, dedicated to pushing the boundaries of what is possible on the edge. &lt;/p&gt;

&lt;p&gt;#Arduino #LLM #GenerativeAI #IoT #EdgeAI #NocLab #OpenSource #Programming&lt;/p&gt;

</description>
      <category>arduino</category>
      <category>iot</category>
      <category>cpp</category>
      <category>llm</category>
    </item>
    <item>
      <title>Bringing Intelligence to the Edge: Introduction to NocML for Arduino</title>
      <dc:creator>Muhammad Ikhwan Fathulloh</dc:creator>
      <pubDate>Tue, 21 Apr 2026 05:22:10 +0000</pubDate>
      <link>https://dev.to/muhammadikhwanfathulloh/bringing-intelligence-to-the-edge-introduction-to-nocml-for-arduino-4fmd</link>
      <guid>https://dev.to/muhammadikhwanfathulloh/bringing-intelligence-to-the-edge-introduction-to-nocml-for-arduino-4fmd</guid>
      <description>&lt;p&gt;Edge AI is often seen as a field reserved for powerful single-board computers, but what if you could run machine learning logic directly on a standard Arduino?&lt;/p&gt;

&lt;p&gt;In this article, we will explore &lt;strong&gt;NocML&lt;/strong&gt;, an efficient machine learning library specifically designed for resource-constrained microcontrollers.&lt;/p&gt;




&lt;h2&gt;
  
  
  What is NocML?
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;NocML&lt;/strong&gt; is a lightweight C++ library built to bridge the gap between complex ML logic and the limited processing power of microcontrollers like the ESP32, Arduino Uno, or Nano. Inspired by the Scikit-Learn API, it offers a familiar workflow for developers coming from a Python background.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Features:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Low Memory Footprint:&lt;/strong&gt; Optimized to run within the tight SRAM limits of common microcontrollers.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data Preprocessing:&lt;/strong&gt; Includes tools like &lt;code&gt;MinMaxScaler&lt;/code&gt; for data normalization on-device.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Versatile Algorithms:&lt;/strong&gt; Supports Classification (&lt;strong&gt;KNN, Naive Bayes&lt;/strong&gt;), Clustering (&lt;strong&gt;K-Means&lt;/strong&gt;), and Regression (&lt;strong&gt;Linear Regression&lt;/strong&gt;).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;No Network Latency:&lt;/strong&gt; Inference runs locally on the device, ensuring privacy and real-time response without any cloud dependency.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Use Case: K-Nearest Neighbors (KNN) Classification
&lt;/h2&gt;

&lt;p&gt;One of the strongest features of NocML is its ability to perform sensor classification directly on the "edge." Imagine building a device that classifies activity types based on accelerometer data.&lt;/p&gt;

&lt;p&gt;Here is a practical example of how to implement a &lt;strong&gt;KNN&lt;/strong&gt; algorithm using NocML:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="cp"&gt;#include&lt;/span&gt; &lt;span class="cpf"&gt;&amp;lt;NocML.h&amp;gt;&lt;/span&gt;&lt;span class="cp"&gt;
&lt;/span&gt;
&lt;span class="c1"&gt;// Define training data (Features)&lt;/span&gt;
&lt;span class="kt"&gt;float&lt;/span&gt; &lt;span class="n"&gt;X_train&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="mf"&gt;1.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;2.0&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="c1"&gt;// Category A&lt;/span&gt;
  &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="mf"&gt;1.5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;1.8&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="c1"&gt;// Category A&lt;/span&gt;
  &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="mf"&gt;5.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;8.0&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="c1"&gt;// Category B&lt;/span&gt;
  &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="mf"&gt;6.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;7.0&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;  &lt;span class="c1"&gt;// Category B&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="c1"&gt;// Labels for training data&lt;/span&gt;
&lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;y_train&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="c1"&gt;// Initialize KNN with k=3&lt;/span&gt;
&lt;span class="n"&gt;KNN&lt;/span&gt; &lt;span class="nf"&gt;knn&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;setup&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;begin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;115200&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="c1"&gt;// Local training (Fit)&lt;/span&gt;
  &lt;span class="n"&gt;knn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;fit&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="kt"&gt;float&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="n"&gt;X_train&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;y_train&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;println&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"KNN Model Ready!"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;loop&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// New sensor data to classify&lt;/span&gt;
  &lt;span class="kt"&gt;float&lt;/span&gt; &lt;span class="n"&gt;input&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="mf"&gt;1.2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mf"&gt;2.1&lt;/span&gt;&lt;span class="p"&gt;};&lt;/span&gt;

  &lt;span class="c1"&gt;// Perform prediction&lt;/span&gt;
  &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;prediction&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;knn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;predict&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Classification Result: "&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;println&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;prediction&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="o"&gt;?&lt;/span&gt; &lt;span class="s"&gt;"Category A"&lt;/span&gt; &lt;span class="o"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"Category B"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="n"&gt;delay&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;2000&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Why NocML?
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Smart IoT:&lt;/strong&gt; Transform passive sensors into intelligent nodes that make decisions without a server.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Predictive Maintenance:&lt;/strong&gt; Detect anomalous vibration patterns in industrial motors before failure occurs.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Human-Machine Interaction:&lt;/strong&gt; Recognize gestures or simple audio patterns in real-time.&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Getting Started
&lt;/h2&gt;

&lt;p&gt;You can easily integrate NocML into your project via the Arduino Library Manager or by visiting the official repositories.&lt;/p&gt;

&lt;h3&gt;
  
  
  Installation
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt; Open your &lt;strong&gt;Arduino IDE&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt; Go to &lt;strong&gt;Sketch&lt;/strong&gt; -&amp;gt; &lt;strong&gt;Include Library&lt;/strong&gt; -&amp;gt; &lt;strong&gt;Manage Libraries...&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt; Search for &lt;strong&gt;"NocML"&lt;/strong&gt; and click install.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Alternatively, explore the source code and documentation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;GitHub Repository:&lt;/strong&gt; &lt;a href="https://github.com/Nocturnailed-Community/NocML" rel="noopener noreferrer"&gt;Nocturnailed-Community/NocML&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Arduino Library Registry:&lt;/strong&gt; &lt;a href="https://www.arduinolibraries.info/libraries/noc-ml" rel="noopener noreferrer"&gt;NocML on Arduino Libraries&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;NocML is part of the &lt;strong&gt;TinyML&lt;/strong&gt; movement, making artificial intelligence accessible on devices costing only a few dollars. By bringing logic closer to the data source, we create faster, more reliable, and smarter IoT systems.&lt;/p&gt;

&lt;p&gt;Are you working on an Edge AI project? Give &lt;strong&gt;NocML&lt;/strong&gt; a try and share your results!&lt;/p&gt;

&lt;p&gt;#Arduino #MachineLearning #TinyML #IoT #OpenSource #NocLab #ArtificialIntelligence&lt;/p&gt;

</description>
      <category>arduino</category>
      <category>iot</category>
      <category>cpp</category>
      <category>machinelearning</category>
    </item>
  </channel>
</rss>
