<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Jennifer Davis</title>
    <description>The latest articles on DEV Community by Jennifer Davis (@sigje).</description>
    <link>https://dev.to/sigje</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F150821%2F75e6e1cd-7d12-4a70-9ece-de03f308349b.JPG</url>
      <title>DEV Community: Jennifer Davis</title>
      <link>https://dev.to/sigje</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/sigje"/>
    <language>en</language>
    <item>
      <title>Why your `curl` logic just bit you 🐾</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Mon, 09 Feb 2026 05:01:07 +0000</pubDate>
      <link>https://dev.to/googlecloud/why-your-curl-logic-just-bit-you-5gk1</link>
      <guid>https://dev.to/googlecloud/why-your-curl-logic-just-bit-you-5gk1</guid>
      <description>&lt;p&gt;It’s a common strategy to test a new API directly with &lt;a href="https://curl.se/docs/manual.html" rel="noopener noreferrer"&gt;curl&lt;/a&gt;. It feels intuitive, fast, and removes the overhead of a language runtime. For example, if you are testing out the &lt;a href="https://cloud.google.com/logging/docs/reference/rest?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;Google Cloud Logging API&lt;/a&gt;, you might start with a simple request to list logs from a &lt;a href="https://docs.cloud.google.com/kubernetes-engine/docs/learn/containers?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;Kubernetes container&lt;/a&gt;.:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;-X&lt;/span&gt; POST &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Authorization: Bearer &lt;/span&gt;&lt;span class="si"&gt;$(&lt;/span&gt;gcloud auth print-access-token&lt;span class="si"&gt;)&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Content-Type: application/json"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  https://logging.googleapis.com/v2/entries:list &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="s1"&gt;'{
    "resourceNames": ["projects/your-project-id"],
    "filter": "resource.type=\"k8s_container\""
  }'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The JSON comes back perfectly. But then you move that logic into a &lt;a href="https://nodejs.org/en/docs/" rel="noopener noreferrer"&gt;Node.js application&lt;/a&gt; using the official &lt;a href="https://cloud.google.com/nodejs/docs/reference/logging/latest?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;&lt;code&gt;@google-cloud/logging&lt;/code&gt; library&lt;/a&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;Logging&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;@google-cloud/logging&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;readLogsAsync&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// Common approach: Initializing the client without explicit auth&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;logging&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Logging&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;entries&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;logging&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getEntries&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;filter&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;resource.type="k8s_container"&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;resourceNames&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;projects/your-project-id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;entries&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;nextPageToken&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;All of a sudden, the code fails with a cryptic error: &lt;a href="https://cloud.google.com/docs/authentication/troubleshoot-adc?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;&lt;code&gt;Error: Could not load the default credentials&lt;/code&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This happens because of a disconnect between how the &lt;a href="https://cloud.google.com/sdk/gcloud?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;&lt;code&gt;gcloud&lt;/code&gt; CLI&lt;/a&gt; manages session tokens and how the Google Cloud client libraries search for credentials.&lt;/p&gt;

&lt;p&gt;While your &lt;code&gt;curl&lt;/code&gt; command relies on the &lt;a href="https://cloud.google.com/docs/authentication/production?utm_campaign=CDR_0x0d701af0_default_b482487387#obtaining_and_providing_service_account_credentials_manually" rel="noopener noreferrer"&gt;&lt;strong&gt;explicit token&lt;/strong&gt;&lt;/a&gt; you provided via &lt;a href="https://cloud.google.com/sdk/gcloud/reference/auth/print-access-token?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;&lt;code&gt;gcloud auth print-access-token&lt;/code&gt;&lt;/a&gt;, the client libraries are designed to look for &lt;a href="https://cloud.google.com/docs/authentication/application-default-credentials?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;&lt;strong&gt;Application Default Credentials (ADC)&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Running &lt;a href="https://cloud.google.com/sdk/gcloud/reference/auth/login?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;&lt;code&gt;gcloud auth login&lt;/code&gt;&lt;/a&gt; authenticates, but it does not create the specific credential file that the Node.js library requires to run locally.&lt;/p&gt;

&lt;h2&gt;
  
  
  🤫 Authenticate for local development
&lt;/h2&gt;

&lt;p&gt;The most efficient way to solve this is to provide the library with the credentials it is looking for. Run the &lt;a href="https://cloud.google.com/sdk/gcloud/reference/auth/application-default/login?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;&lt;code&gt;gcloud auth application-default login&lt;/code&gt;&lt;/a&gt; command in your terminal.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;gcloud auth application-default login
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This opens a browser window to authorize your account and saves a JSON file to your local configuration folder. Once this is done, your Node.js code will automatically find these credentials—no code changes required.&lt;/p&gt;

&lt;h2&gt;
  
  
  🏡 Moving to Production
&lt;/h2&gt;

&lt;p&gt;Once you get your local environment running, you need to think about "productionizing" your code. Hardcoded project IDs and lack of &lt;a href="https://cloud.google.com/apis/design/errors?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;robust error handling&lt;/a&gt; are common causes of production outages.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;🛡️ Robust Error Handling&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In production, your app should handle permission issues or network timeouts gracefully.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;entries&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;logging&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;getEntries&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;filter&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;resource.type="k8s_container"&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;resourceNames&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;`projects/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;GOOGLE_CLOUD_PROJECT&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
    &lt;span class="na"&gt;pageSize&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;code&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="mi"&gt;7&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Permission Denied: Check your Service Account roles.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Unexpected Logging Error:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;🎭 Service Accounts&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In production, you shouldn't use your personal user credentials. Use a Service Account with the &lt;a href="https://cloud.google.com/logging/docs/access-control?utm_campaign=CDR_0x0d701af0_default_b482487387#logging.viewer" rel="noopener noreferrer"&gt;"Logs Viewer" role&lt;/a&gt;.&lt;/p&gt;
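&lt;p&gt;As a sketch, provisioning that identity with the &lt;code&gt;gcloud&lt;/code&gt; CLI looks like this (the account name and project ID below are placeholders):&lt;/p&gt;

```shell
# Create a dedicated service account (the name is a placeholder).
gcloud iam service-accounts create log-reader \
  --display-name="Log Reader"

# Grant it only the Logs Viewer role on the project.
gcloud projects add-iam-policy-binding your-project-id \
  --member="serviceAccount:log-reader@your-project-id.iam.gserviceaccount.com" \
  --role="roles/logging.viewer"
```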

&lt;ul&gt;
&lt;li&gt;🌍 Environment Variables&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Avoid hardcoding project IDs. Use an environment variable to make your code portable across staging and production environments.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;PROJECT_ID&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;GOOGLE_CLOUD_PROJECT&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;your-project-id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;...&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;readLogsAsync&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;logging&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Logging&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;projectId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;PROJECT_ID&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Structured Logging&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Instead of just reading logs, think about how you write them. Using structured JSON logs makes them much easier to query later in the Cloud Logging console.&lt;/p&gt;

&lt;p&gt;If you are running on &lt;a href="https://cloud.google.com/kubernetes-engine?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;Google Kubernetes Engine (GKE)&lt;/a&gt;, you don't even need to use the Logging library to write logs. Printing a JSON string to stdout allows GKE to parse your data into searchable fields automatically.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;logStructuredStatus&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;logEntry&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;severity&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;INFO&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Container health check successful&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;container_info&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;v2.1.0&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;uptime&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;uptime&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="na"&gt;http_stats&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;active_connections&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;42&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;

  &lt;span class="c1"&gt;// GKE picks this up and converts it to a structured log automatically&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;logEntry&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Get started
&lt;/h2&gt;

&lt;p&gt;I'll be sharing a more thorough walk-through on using the native observability features provided with GKE soon—including how to automate this setup so you can avoid manually debugging credential errors.&lt;/p&gt;

&lt;p&gt;In the meantime, dive deeper here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;📖 &lt;a href="https://cloud.google.com/docs/authentication/application-default-credentials?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;How Application Default Credentials work&lt;/a&gt; – Understand the "magic" behind the lookup.&lt;/li&gt;
&lt;li&gt;🛠️ &lt;a href="https://cloud.google.com/docs/authentication/provide-credentials-adc?utm_campaign=CDR_0x0d701af0_default_b482487387" rel="noopener noreferrer"&gt;Providing credentials to ADC&lt;/a&gt; – Setup guides for every environment.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>googlecloud</category>
      <category>node</category>
      <category>devops</category>
      <category>authentication</category>
    </item>
    <item>
      <title>Giving my droid a personality with ESP32 and Arduino Modulino sensors</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Fri, 23 Jan 2026 07:07:07 +0000</pubDate>
      <link>https://dev.to/sigje/giving-my-droid-a-personality-with-esp32-and-arduino-modulinos-5c0p</link>
      <guid>https://dev.to/sigje/giving-my-droid-a-personality-with-esp32-and-arduino-modulinos-5c0p</guid>
      <description>&lt;p&gt;In my last post, Willard the Smart Car was undergoing surgery for ghost states and memory leaks. While I continue to debug the "Reflex Layer," I wanted to work more on the emotional response layer. &lt;/p&gt;

&lt;p&gt;As a recap, the plan is for the Raspberry Pi to be the &lt;strong&gt;Cognition Layer&lt;/strong&gt; (high-level thinking) and the Arduino Uno (or in Willard's case the Elegoo board) to be the &lt;strong&gt;Reflex Layer&lt;/strong&gt; (movement). That leaves a critical piece for interaction and sensing. I decided to experiment with the &lt;strong&gt;ESP32&lt;/strong&gt; as the &lt;strong&gt;Interaction Layer&lt;/strong&gt;—the "Feeling Brain" that gives my droid its personality.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Hardware Pivot: SparkFun &amp;amp; Qwiic
&lt;/h2&gt;

&lt;p&gt;This layer uses the &lt;a href="https://www.sparkfun.com/sparkfun-thing-plus-esp32-wroom-usb-c.html" rel="noopener noreferrer"&gt;&lt;strong&gt;SparkFun ESP32 Thing Plus&lt;/strong&gt;&lt;/a&gt;. I couldn't be happier with the ease of experimentation here, primarily because of the board’s design and the &lt;strong&gt;Qwiic (I2C)&lt;/strong&gt; ecosystem. (I will publish more of my experiments to my &lt;a href="https://github.com/iennae/uno-sketch-box#" rel="noopener noreferrer"&gt;uno-sketch-box&lt;/a&gt; repo on GitHub, though I'm kind of regretting the name choice as I'm experimenting with more than just the Arduino Uno at this point.) &lt;/p&gt;

&lt;p&gt;One of the biggest challenges with Willard was the bird’s nest of jumper wires taking over the chassis. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3h3tfb4vvv3v8fhtvh9d.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3h3tfb4vvv3v8fhtvh9d.jpg" alt="Willard's robot car chassis is full." width="800" height="1731"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Thing Plus uses the modern USB-C standard, has built-in wireless, and features a Qwiic connector. This allows me to daisy-chain &lt;strong&gt;Arduino Modulino&lt;/strong&gt; sensors with a single cable. I’m spending less time fighting with breadboards and more time imagining what the droid can actually &lt;em&gt;do&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;My current interaction stack includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Modulino Distance:&lt;/strong&gt; The droid’s "social eyes."&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Modulino Pixels:&lt;/strong&gt; For visual "eye" expressions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Modulino Buzzer:&lt;/strong&gt; The droid’s "voice box."&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Sound Engine
&lt;/h2&gt;

&lt;p&gt;I didn't want my droid to sound like a microwave or a digital alarm. I wanted it to have a distinct personality. This led me into a deep dive on Granular Synthesis and Vibrato.&lt;/p&gt;

&lt;p&gt;Instead of playing a static frequency, I wrote a sound engine that uses micro-slides in pitch. By rapidly oscillating the frequency with a &lt;code&gt;sin()&lt;/code&gt; function and adding tiny 20ms gaps, the buzzer produces a "croaky," mechanical texture.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;800&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;1200&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;vibrato&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;millis&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mf"&gt;50.0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;15&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; 
  &lt;span class="n"&gt;buzzer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tone&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;vibrato&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;           
  &lt;span class="n"&gt;delay&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;15&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;                              
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;By starting a sound at a low 300Hz "croak" and sliding it up to a 900Hz "chirp," the droid goes from sounding like a telephone to sounding like it’s asking a question.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// An inquisitive chirp&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;300&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;900&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;buzzer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tone&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;15&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;delay&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;25&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="c1"&gt;// The gap creates the mechanical "clicky" texture&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To simulate active awareness when it isn't interacting, I used the same mathematical logic to create a pulse for its eyes. By mapping &lt;code&gt;millis()&lt;/code&gt; to a sine wave, the LEDs oscillate in brightness smoothly rather than just staying on.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;idleBlink&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;b&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;millis&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mf"&gt;2000.0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;15&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="n"&gt;pixels&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;set&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;b&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;pixels&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;set&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;255&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;b&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;pixels&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;show&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Bridging the Social Gap
&lt;/h2&gt;

&lt;p&gt;The goal for this droid is to nudge my son when it's time for school in a way that is fun (but not too much fun) and gets him engaged and aware of the time. &lt;/p&gt;

&lt;p&gt;Currently, I’m using distance as the primary input. In the future, the droid will use vision to categorize what he's doing (e.g., "Is he wearing shoes yet?").&lt;/p&gt;

&lt;p&gt;During testing, I discovered that if I set a single distance trigger, the droid "stutters" when my hand sits statically near the sensor. To fix this, I implemented a 250mm "buffer zone." The ESP32 greets you when you enter the 20cm range, but "locks" that interaction until you move past 45cm. Only then does it return to its idle state (pulsing two blue LEDs).&lt;/p&gt;
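&lt;p&gt;The buffer zone is classic hysteresis: two thresholds instead of one, so a hand hovering at the boundary can't rapidly flip the state. Here's a minimal, hardware-free sketch of that logic (the names are mine and the thresholds match the ranges above; it's not the exact droid code):&lt;/p&gt;

```cpp
// Hysteresis for the greeting state: greet when closer than 200 mm,
// but only return to idle once the distance exceeds 450 mm.
// The 250 mm gap between the thresholds is the "buffer zone".
enum class DroidState { Idle, Greeting };

const int ENTER_MM = 200;  // greet inside the 20 cm social range
const int EXIT_MM  = 450;  // unlock only past 45 cm

DroidState updateState(DroidState current, int distance_mm) {
  if (current == DroidState::Idle) {
    if (ENTER_MM > distance_mm) {  // closer than 20 cm: start greeting
      return DroidState::Greeting;
    }
  } else if (distance_mm > EXIT_MM) {  // well out of range: unlock
    return DroidState::Idle;
  }
  return current;  // anywhere in between: hold the current state
}
```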

&lt;p&gt;Here's the ESP32 greeting:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjdyisoymfpjpcm2v939j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjdyisoymfpjpcm2v939j.png" alt="A top-down view of a LEGO-encased droid interaction hub. Proximity has triggered the translucent LEGO " width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And a nervous "red" when I'm a bit too close:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbyul2uq67wgzvaajx89l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbyul2uq67wgzvaajx89l.png" alt="The droid's translucent LEGO head is illuminated with a sharp red glow. The person's hand is held closer to the sensor, demonstrating where the droid has shifted from a friendly greeting to a worried " width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What's Next?
&lt;/h2&gt;

&lt;p&gt;I'm working on image recognition on the Raspberry Pi with the &lt;a href="https://github.com/google-ai-edge/mediapipe" rel="noopener noreferrer"&gt;Google MediaPipe&lt;/a&gt; and &lt;a href="https://docs.opencv.org/4.x/" rel="noopener noreferrer"&gt;OpenCV&lt;/a&gt; libraries to train a facial recognition model. Then the Pi will identify who is in the room, and the ESP32 will decide which personality profile to use in response.&lt;/p&gt;

&lt;p&gt;I’d love to hear from you: how do you handle "personality" in your builds? Have you found a way to make simple buzzers sound more organic? &lt;/p&gt;

</description>
      <category>arduino</category>
      <category>hardware</category>
      <category>beginners</category>
      <category>robotics</category>
    </item>
    <item>
      <title>Willard’s First Flight (and Unplanned Disassembly)</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Sun, 18 Jan 2026 02:43:07 +0000</pubDate>
      <link>https://dev.to/sigje/willards-first-flight-and-unplanned-disassembly-1ohd</link>
      <guid>https://dev.to/sigje/willards-first-flight-and-unplanned-disassembly-1ohd</guid>
      <description>&lt;p&gt;In my last post, I was teaching my droid project to "see" and "think" on a Raspberry Pi 5. But while waiting for my Build Hat, I discovered these car kits—a ready-made option to explore physical motion immediately.&lt;/p&gt;

&lt;p&gt;Enter the &lt;a href="https://us.elegoo.com/products/elegoo-smart-robot-car-kit-v-4-0" rel="noopener noreferrer"&gt;&lt;strong&gt;Elegoo Smart Robot Car Kit v4.0&lt;/strong&gt;&lt;/a&gt;, or as my son affectionately calls him, &lt;strong&gt;Willard&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Physical Build
&lt;/h2&gt;

&lt;p&gt;The assembly was surprisingly seamless. Elegoo helpfully provides the three screwdrivers you need, and the bags are neatly labeled. It felt very much like "IKEA instructions meet electronics."&lt;/p&gt;

&lt;p&gt;However, I quickly hit a physical "hardware tax." Some of these components are so tiny that getting them into place is a challenge. Even worse, once I got the robot moving, the jerky default programming caused the whole chassis to vibrate. Within minutes, bolts and screws were flying off the robot like it was undergoing unplanned disassembly! It took some time to figure out where each piece had come from, and I'm not sure if there's a recommended way to secure everything so parts don't shake loose.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lesson learned:&lt;/strong&gt; I’ve taken Willard’s wheels off for now. This lets me debug the logic safely on my desk without him shaking himself to pieces or driving off the desk. &lt;/p&gt;

&lt;h2&gt;
  
  
  The "Hidden" Software Gatekeepers
&lt;/h2&gt;

&lt;p&gt;Even though it’s a kit, customizing your robot car isn't exactly "plug and play." I ran into two gatekeepers that weren't in the printed manual (but were discoverable in the many docs available in the downloadable &lt;a href="https://github.com/elegooofficial/ELEGOO-Smart-Robot-Car-Kit-V4.0" rel="noopener noreferrer"&gt;zip file&lt;/a&gt;). &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;The Upload Toggle:&lt;/strong&gt; The expansion hat has a tiny toggle switch. If it's set to "App" mode, you cannot upload code. You must flip it to &lt;strong&gt;"Upload"&lt;/strong&gt; to talk to the Elegoo. It’s printed on the board, but so small you might miss it.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Missing Driver:&lt;/strong&gt; Due to chip shortages, different serial chips might be installed. My Arduino IDE didn't "see" the board until I hunted down the &lt;strong&gt;CH340 driver&lt;/strong&gt; hidden in the tutorial zip file.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Pro-tip:&lt;/strong&gt; I’ve switched to the &lt;strong&gt;Arduino CLI&lt;/strong&gt; for faster feedback:&lt;br&gt;
&lt;code&gt;arduino-cli compile --fqbn arduino:avr:uno --libraries . smart-robot-car-firmware.ino&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;This gives me fast feedback if something needs to be fixed. &lt;/p&gt;

&lt;p&gt;And, I need that feedback because I keep finding out new things. One of the most frustrating things to debug in hardware is the Restart Loop. Unlike a computer, where an app crashes and gives you an error, an Arduino often just restarts. Because it boots so quickly, it can look like it's just "ignoring" your code or behaving weirdly, when in reality, it’s crashing and rebooting every few seconds.&lt;/p&gt;
&lt;h2&gt;
  
  
  Every Byte Counts
&lt;/h2&gt;

&lt;p&gt;One of the weirdest hurdles was moving to the CLI. The code compiled fine in the IDE, but the CLI threw a "Sketch too big" error. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;ETA: I might be misremembering this step; the error may actually have been about the global variable size. This is what my usage looks like now:&lt;br&gt;
&lt;/p&gt;


&lt;/blockquote&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Sketch uses 21166 bytes (65%) of program storage space. Maximum is 32256 bytes.                                           │
│ Global variables use 1208 bytes (58%) of dynamic memory, leaving 840 bytes for local variables. Maximum is 2048 bytes. 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It turns out the original firmware was already redlining the Arduino Uno's 32KB limit. Adding even the "negligible" overhead of the Servo library was enough to push it over the edge. I wrote a Python script to see which symbols were eating my space.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;subprocess&lt;/span&gt;
&lt;span class="c1"&gt;# Runs avr-nm to sort symbols by size
&lt;/span&gt;&lt;span class="n"&gt;nm_output&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;subprocess&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;check_output&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;avr-nm&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;-S&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;--size-sort&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;-C&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;elf_file&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This led me to refactor the logic into a non-blocking state machine, reducing the &lt;code&gt;JsonDocument&lt;/code&gt; size from 200 to 128 bytes, and replacing &lt;code&gt;sprintf&lt;/code&gt; with &lt;code&gt;Serial.write&lt;/code&gt; to avoid pulling in heavy string-formatting libraries.&lt;/p&gt;
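The non-blocking pattern is worth spelling out. Below is a minimal, desktop-testable sketch of the idea; the GuardState names and the 100 ms interval are my own illustration rather than the kit firmware's, and on the Arduino the `now` argument would come from `millis()`:

```cpp
#include <cassert>
#include <cstdint>

// States for a simple "guard" behavior. Explicit values make a
// corrupted state variable easy to spot in serial logs.
enum GuardState : uint8_t { IDLE = 0, SCANNING = 1, ALARM = 2 };

struct Guard {
  GuardState state = IDLE;
  unsigned long lastCheck = 0;
};

// One non-blocking tick: instead of delay(), compare timestamps and
// return immediately so the main loop keeps servicing other work.
GuardState tick(Guard &g, unsigned long now, int distanceCm) {
  if (now - g.lastCheck < 100) return g.state;  // act at most every 100 ms
  g.lastCheck = now;
  switch (g.state) {
    case IDLE:     g.state = SCANNING; break;
    case SCANNING: if (distanceCm < 15) g.state = ALARM; break;
    case ALARM:    if (distanceCm >= 15) g.state = SCANNING; break;
  }
  return g.state;
}
```

On the Uno this keeps loop() spinning fast, so sensor reads and serial output interleave instead of blocking each other.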




&lt;h2&gt;
  
  
  An Independent Eye
&lt;/h2&gt;

&lt;p&gt;The most interesting discovery? The camera isn't connected to the Elegoo brain at all in this kit. It operates on its own IP address, which means the "Reflex" layer (the Elegoo board) is completely blind to what the camera sees. Even so, this Elegoo kit is incredibly comprehensive; having all the sensors and the camera in one package helps me visualize how much can actually fit into a consumer-level robot.&lt;/p&gt;

&lt;p&gt;It also gave me a much-needed reality check on weight restrictions. I’ve been questioning how the Raspberry Pi and the Arduino, along with the LEGO structure, might impede movement. Watching Willard navigate has given me a baseline for how much mass the motors can handle before performance starts to drop. My eventual goal will be to bridge the reflex/cognition gap—having the Pi analyze the visual data and then send high-level commands to my droid's Arduino "muscles."&lt;/p&gt;




&lt;h2&gt;
  
  
  What's Next
&lt;/h2&gt;

&lt;p&gt;Willard is currently on the operating table. I’m deep in the weeds debugging why his &lt;strong&gt;Guard Duty&lt;/strong&gt; behavior is returning "Ghost States."&lt;/p&gt;

&lt;p&gt;Take a look at this serial output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Current Distance: 13
Current Distance: 12
Alarm triggered!
Guard Duty State: 3  &amp;lt;-- Ghost State Detected
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;State 3 doesn't exist. This is the classic signature of a Buffer Overflow. Something in Willard's memory is spilling over and rewriting his state variables in real-time.&lt;br&gt;
Next time, we perform surgery to find the leak!&lt;/p&gt;
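One common way this happens on an Uno is a string-formatting call writing past a small buffer into whatever global sits next to it. As a hedged sketch (the function name and sizes are mine, not Willard's firmware), snprintf bounds the write and reports truncation instead of silently clobbering its neighbors:

```cpp
#include <cassert>
#include <cstdio>
#include <cstring>

// snprintf never writes more than len bytes and returns how many
// characters the full string would need, so truncation is detectable
// instead of silently overwriting adjacent memory (a "ghost state").
bool formatDistance(char *buf, int len, int distanceCm) {
  int needed = snprintf(buf, len, "Current Distance: %d", distanceCm);
  return needed >= 0 && needed < len;  // false means output was cut off
}
```

Swapping sprintf for a bounds-checked call like this is usually the first thing to try when a state variable mutates "on its own."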




&lt;p&gt;If you've built this car and made modifications, or if you have suggestions for other ideas, please respond in the comments. So far, I've come up with a "Guard" mode (detecting motion via proximity changes) and a "Dance" mode (multi-color lights and pre-programmed steps).&lt;/p&gt;

&lt;p&gt;I'm also considering adding an Arduino Modulino Buzzer module. It uses I2C (Qwiic), which would keep the wiring clean and finally let Willard "speak" with custom tones and alerts.&lt;/p&gt;

</description>
      <category>arduino</category>
      <category>hardware</category>
      <category>beginners</category>
      <category>robotics</category>
    </item>
    <item>
      <title>Reflexes, Cognition, and Thought</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Wed, 07 Jan 2026 07:55:39 +0000</pubDate>
      <link>https://dev.to/sigje/reflexes-cognition-and-thought-5o</link>
      <guid>https://dev.to/sigje/reflexes-cognition-and-thought-5o</guid>
      <description>&lt;p&gt;In my previous posts, I focused on sharing the basics—making LEDs blink and understanding wiring. Today’s adventure was about expanding on what my droid will actually need to function. &lt;/p&gt;

&lt;p&gt;The droid will have a multi-layered "brain." I’ve been working on the &lt;strong&gt;Reflex Layer&lt;/strong&gt; with the Arduino Uno for prototyping. In this post, I’ll review what I’ve learned there and explore the &lt;strong&gt;Cognition Layer&lt;/strong&gt; using computer vision and local AI.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Reflex Layer: Data Types &amp;amp; Geometry
&lt;/h2&gt;

&lt;p&gt;Before the droid can walk, it has to have a way to traverse the world (or at least my home). I used the Arduino to test the "logic" of movement before ever plugging in a motor.&lt;/p&gt;

&lt;h3&gt;
  
  
  Visual Odometer
&lt;/h3&gt;

&lt;p&gt;I built a Visual Odometer using 4 LEDs to represent 4 bits of a &lt;code&gt;signed char&lt;/code&gt;. I wanted to visualize an "integer overflow." By starting the counter at 120 (near the 127 limit of a 1-byte signed integer), I could watch the moment the droid "lost its mind." As soon as it hit 128, the odometer flipped to -128. &lt;/p&gt;

&lt;p&gt;Seeing the LEDs jump and the Serial Monitor report a negative distance was a tactile lesson: &lt;strong&gt;pick the right "storage box" (data type) for your sensors&lt;/strong&gt;, or your droid will think it's traveling backward just because it reached a limit.&lt;/p&gt;
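The same wraparound reproduces on a desktop compiler. A minimal sketch (the function name is mine; storing 128 into an int8_t wraps to -128 on mainstream compilers, and is defined to do so as of C++20):

```cpp
#include <cassert>
#include <cstdint>

// An 8-bit signed "odometer": arithmetic promotes to int, and storing
// the result back into int8_t wraps modulo 256, so 127 + 1 becomes -128.
int8_t stepOdometer(int8_t odometer) {
  return static_cast<int8_t>(odometer + 1);
}
```

This is exactly why the Serial Monitor reported a negative distance the moment the counter passed 127.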

&lt;h3&gt;
  
  
  Simulating Motion with Light
&lt;/h3&gt;

&lt;p&gt;Since I don’t have something that physically moves yet, I used a photoresistor (light sensor) to simulate "steps." Every time I flashed my phone light, the Arduino registered a movement. I also had the LED change color based on the light being detected so I could see really quickly whether my code was working. &lt;/p&gt;

&lt;p&gt;I used the &lt;strong&gt;Pythagorean Theorem&lt;/strong&gt; ($a^2 + b^2 = h^2$) to calculate the "as-the-crow-flies" distance from the starting point. Using the Serial Plotter, I could see the $X$ and $Y$ coordinates stair-stepping while the distance tracked a smooth, calculated curve.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="cp"&gt;#include&lt;/span&gt; &lt;span class="cpf"&gt;&amp;lt;math.h&amp;gt;&lt;/span&gt;&lt;span class="cp"&gt;
&lt;/span&gt;&lt;span class="c1"&gt;// ... logic to detect light pulse ...&lt;/span&gt;
&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sensorValue&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;400&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="n"&gt;triggered&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; 
    &lt;span class="n"&gt;xPos&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; 
    &lt;span class="n"&gt;yPos&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; 
    &lt;span class="c1"&gt;// h = sqrt(x^2 + y^2)&lt;/span&gt;
    &lt;span class="n"&gt;hypotenuse&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;sqrt&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pow&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;xPos&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;pow&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;yPos&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
    &lt;span class="n"&gt;triggered&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;true&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Adding Some Motion
&lt;/h3&gt;

&lt;p&gt;At this point, I figured I could just add some hardware and watch the motor spin based on the distance traveled. I was a bit surprised when I opened the servos' packaging and discovered that I didn't know what to do with them.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn8vqaa0a3k3t5mxuekva.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn8vqaa0a3k3t5mxuekva.jpeg" alt="Close-up of a GeekServo with a 2-pin connector. A hand holds a red and black wire, highlighting the connection confusion." width="350" height="465"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I unpacked the Arduino motor shield figuring it would be obvious where it would plug in, but nope. While the shield was easily installed, the wiring wasn't straightforward.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fstl2szrr04o8c19l5q81.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fstl2szrr04o8c19l5q81.jpeg" alt="Side profile of an Arduino Uno with an L298N motor shield stacked on top, showing the various header pins and screw terminals." width="400" height="301"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I could not figure out where the wires on the Geek Servos were supposed to go. &lt;/p&gt;

&lt;p&gt;I tried guessing the connection and successfully saw my LED light up, but there was no spinning motor.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fywmbgh94qans34itcztw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fywmbgh94qans34itcztw.jpeg" alt="GeekServo connected to a breadboard using male-to-male jumper wires. A status LED is lit, but the motor is stationary." width="450" height="598"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Which is when I realized that my servo was actually just a motor.&lt;/p&gt;

&lt;p&gt;I also realized that I likely needed an external power source to support this hardware. I have other servos to try, but I really want these to work since they are LEGO-compatible. To keep the momentum, I decided to look at the second layer of the brain.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Cognition Layer: Enter the Raspberry Pi 5
&lt;/h2&gt;

&lt;p&gt;I set up my sparkly new Raspberry Pi 5 from a &lt;a href="https://www.canakit.com/pi5-case" rel="noopener noreferrer"&gt;CanaKit&lt;/a&gt;. This is the "High-Functioning" brain. This kit was super easy and the video was very straightforward to follow—a great "intro to building a computer" kit. After a quick setup and package update, I dove straight into Edge AI.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Sidequest: The Screenshot Struggle. I spent way too long trying to automate screenshots on the Pi for this blog. I learned the hard way that scrot only produces black screens on the new Pi OS (Wayland). After fighting with grim and slurp, I decided to figure that part out later. No screenshots for now!&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  "I See You"
&lt;/h3&gt;

&lt;p&gt;I hooked up an ELP 2.0 Megapixel camera and installed &lt;a href="https://ollama.com/" rel="noopener noreferrer"&gt;Ollama&lt;/a&gt; to run a local &lt;a href="https://ollama.com/library/minicpm-v" rel="noopener noreferrer"&gt;Vision Language Model (openbmb/minicpm-v4.5)&lt;/a&gt;. I wrote a Python script using &lt;a href="https://opencv.org/" rel="noopener noreferrer"&gt;OpenCV (cv2)&lt;/a&gt; to grab a frame and feed it to the model.&lt;/p&gt;

&lt;p&gt;The Result: Success! The Pi analyzed the camera feed locally and described me and the room.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;DROID SAYS: 
Observing: A human with glasses and purple attire occupies the center of an indoor space; 
ceiling fan whirs above while wall decor and doorframes frame background elements—a 
truly multifaceted environment!
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It took about 3 minutes to process one frame. My droid currently has the processing speed of a very deep philosopher—it’s not going to win any races yet, but it is truly thinking about its surroundings.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Vision Script (vision_test.py)
&lt;/h3&gt;

&lt;p&gt;Here is the bridge between the camera and the AI:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ollama&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;time&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;capture_and_analyze&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="c1"&gt;# Initialize USB Camera
&lt;/span&gt;    &lt;span class="n"&gt;cam&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;VideoCapture&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;cam&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;isOpened&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Error: Could not access /dev/video0. Check USB connection.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;--- Droid Vision Active ---&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Warm-up: Skip a few frames so the auto-exposure adjusts
&lt;/span&gt;    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;cam&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;read&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sleep&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;ret&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;frame&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cam&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;read&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;ret&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;img_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;droid_snapshot.jpg&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;
        &lt;span class="n"&gt;cv2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;imwrite&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;frame&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Image captured! Sending to MiniCPM-V-4.5...&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# Querying the local Ollama model
&lt;/span&gt;            &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ollama&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;chat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;openbmb/minicpm-v:4.5&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="n"&gt;messages&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[{&lt;/span&gt;
                    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;role&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;user&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;content&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Act as a helpful LEGO droid. Describe what you see in one short, robotic sentence.&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;images&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;img_path&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
                &lt;span class="p"&gt;}]&lt;/span&gt;
            &lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;DROID SAYS:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;message&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;content&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
        &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Ollama Error: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Clean up the photo after analysis
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;path&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;exists&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img_path&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;remove&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;img_path&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Error: Could not grab a frame.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;cam&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;release&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;__name__&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;__main__&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;capture_and_analyze&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  What's Next
&lt;/h2&gt;

&lt;p&gt;I'm going to figure out the motors, but for now, I'm going to focus on refining the Vision + AI pieces. I am going to try using the &lt;code&gt;opencv-face-recognition&lt;/code&gt; library and experiment with different, smaller models to see if I can speed up that 3-minute "thought" process!&lt;/p&gt;

</description>
      <category>arduino</category>
      <category>hardware</category>
      <category>beginners</category>
      <category>showdev</category>
    </item>
    <item>
      <title>The RGB LED Sidequest 💡</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Mon, 05 Jan 2026 23:00:30 +0000</pubDate>
      <link>https://dev.to/sigje/the-rgb-led-sidequest-4519</link>
      <guid>https://dev.to/sigje/the-rgb-led-sidequest-4519</guid>
      <description>&lt;p&gt;Welcome back to my journey &lt;strong&gt;From Dust to Droid Dreams&lt;/strong&gt;. While the primary path to building a droid is a long-term mission, sometimes you have to take a sidequest to master the fundamentals of the "nervous system." Today’s mission: &lt;strong&gt;Multi-LED color blending and the hidden traps of Serial debugging&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;I discovered that I had an RGB LED and decided that I would rewire the existing Red, Blue, and Green LEDs and keep the yellow LED as is. The goal was to create a smooth transition between Red, Green, and Blue using Pulse Width Modulation (PWM), plus a dedicated Yellow LED that glows based on the overlap of the Red and Green channels.&lt;/p&gt;

&lt;h2&gt;
  
  
  The "Fiddly" Hardware
&lt;/h2&gt;

&lt;p&gt;First lesson of the sidequest: &lt;strong&gt;The physical world is messy.&lt;/strong&gt; I quickly learned that the RGB LED setup is super fiddly. Wires cross over easily, and it requires very careful application of the resistors to keep the signals clean and the components safe. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fscn1zmu6uge3uxys0xwr.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fscn1zmu6uge3uxys0xwr.jpeg" alt="LED surrounded by jumper wires and resistors" width="450" height="598"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Encountering the "Lag" Monster
&lt;/h2&gt;

&lt;p&gt;Once I got the LEDs in sync, I wanted to monitor the values changing in real time. This did not work the way I was expecting.  Suddenly, the synchronization disappeared—the Yellow LED started stuttering and wouldn't come on at the right time.&lt;/p&gt;

&lt;h3&gt;
  
  
  Problematic Code:
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;updateLEDs&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;analogWrite&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;RED&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;redValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;analogWrite&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;GREEN&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;greenValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;analogWrite&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;BLUE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;blueValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="c1"&gt;// Serial output inside the main update loop&lt;/span&gt;
  &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"R: "&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;redValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;" | G: "&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;greenValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;" | B: "&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;blueValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;" | Y-LED: "&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;yellowBright&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="c1"&gt;// ... this was slowing everything down!&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;At the default 9600 baud, Serial communication is slow: roughly a millisecond per character. Every time the code reached these print statements, it paused the loop to push that data out. This created a "hiccup" in the PWM signal, making the LEDs flicker and the timing feel "off."&lt;/p&gt;

&lt;h3&gt;
  
  
  The Optimization Fix
&lt;/h3&gt;

&lt;p&gt;To resolve the lag and reclaim that smooth fade, I applied two key fixes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Baud Rate Boost:&lt;/strong&gt; I increased the speed from 9600 to 115200 baud (updating it in both the sketch and the Serial Monitor configuration).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Timed Printing (Non-blocking):&lt;/strong&gt; Instead of printing on every pass through the loop, I used a &lt;code&gt;millis()&lt;/code&gt; timer to update the Serial output every 100ms. This keeps the processor focused on the LED pulses.&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;updateLEDs&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;analogWrite&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;RED&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;redValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;analogWrite&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;GREEN&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;greenValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;analogWrite&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;BLUE&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;blueValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;yellowBright&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;redValue&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;greenValue&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;?&lt;/span&gt; &lt;span class="n"&gt;redValue&lt;/span&gt; &lt;span class="o"&gt;:&lt;/span&gt; &lt;span class="n"&gt;greenValue&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="n"&gt;analogWrite&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;YELLOW&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;yellowBright&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="c1"&gt;// Non-blocking timer: Only print every 100ms&lt;/span&gt;
  &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;millis&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="n"&gt;lastPrintTime&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"R:"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;redValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;" G:"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;greenValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;" B:"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;blueValue&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="c1"&gt;// Added a small offset (+5) so the yellow line is visible on the plotter&lt;/span&gt;
    &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;" Y-LED:"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="n"&gt;Serial&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;println&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;yellowBright&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; 
    &lt;span class="n"&gt;lastPrintTime&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;millis&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  New Tool Use Unlocked: The Serial Plotter
&lt;/h3&gt;

&lt;p&gt;I also spent some time with the Arduino IDE's Serial Plotter (Tools &amp;gt; Serial Plotter). It’s a fantastic tool for visualizing how the color variables oscillate in waves.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgxjca8pjcy0xgtz0gf76.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgxjca8pjcy0xgtz0gf76.png" alt="Yellow LED blip on the Serial Plotter graph" width="800" height="497"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I had to go through a few iterations to get the yellow line to show up properly. Because the values were identical to the Red or Green lines, they were hidden behind them. Adding a small offset (&lt;em&gt;+5&lt;/em&gt;) to the print statement finally made the yellow line visible!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi21c5rhzp5r3lf3m9b2v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi21c5rhzp5r3lf3m9b2v.png" alt="Yellow LED now visible on the Serial Plotter graph" width="800" height="494"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;A minor mystery: I haven't figured out how to get the line colors in the plotter to actually match the RGB colors I'm outputting. The IDE seems to assign them automatically based on the order of the variables. If anyone has a hack for this, please let me know in the comments! Even without matching colors, seeing the waves overlap in real-time is a huge help for debugging the logic.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Key Takeaways for the Droid Build
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Communication Overhead:&lt;/strong&gt; Consider how much "talking" your microcontroller is doing—it might be slowing down its "thinking."&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Visualization over Text:&lt;/strong&gt; Data is much easier to digest as a wave than as a wall of scrolling numbers.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Timing is Everything:&lt;/strong&gt; Don't just default to &lt;code&gt;delay()&lt;/code&gt;. &lt;code&gt;millis()&lt;/code&gt; allows for multitasking, whereas &lt;code&gt;delay()&lt;/code&gt; blocks the board from doing anything else.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>arduino</category>
      <category>hardware</category>
      <category>beginners</category>
      <category>showdev</category>
    </item>
    <item>
      <title>Four Sketches and a Rewire: The Path to Droid Brains</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Mon, 05 Jan 2026 04:27:14 +0000</pubDate>
      <link>https://dev.to/sigje/four-sketches-and-a-rewire-the-path-to-droid-brains-5e7e</link>
      <guid>https://dev.to/sigje/four-sketches-and-a-rewire-the-path-to-droid-brains-5e7e</guid>
      <description>&lt;p&gt;In my last post, I was staring at a dark LED, wondering if my 16-year-old hardware was faulty or if I was failing to understand something fundamental. I’ll admit, I struggled with the introductory ELEC 2200 curriculum back at Auburn University (though that may have had more to do with the class being at 8:00 AM after working until 1:00 AM at Taco Bell). &lt;/p&gt;

&lt;p&gt;The good news? I found working hardware by switching up the breadboard. The better news? I’ve officially moved past the "Blink" stage and into logic. I’m sharing my progress via a new open-source project: &lt;a href="https://github.com/iennae/uno-sketch-box" rel="noopener noreferrer"&gt;&lt;strong&gt;uno-sketch-box&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Pivoting to the "Serial" Heartbeat
&lt;/h3&gt;

&lt;p&gt;Before I found my breakthrough with the physical LEDs, I decided not to let the hardware block me. I jumped ahead to &lt;a href="https://www.ladyada.net/learn/arduino/lesson4.html" rel="noopener noreferrer"&gt;&lt;strong&gt;Lesson 4&lt;/strong&gt;&lt;/a&gt;. If the droid's "eyes" (the LEDs) weren't working yet, I wanted to see if its "voice" was intact.&lt;/p&gt;

&lt;p&gt;By using the &lt;code&gt;Serial&lt;/code&gt; library, I established a "Hello World" heartbeat between the Arduino and my Mac. Seeing that text scroll across the Serial Monitor confirmed that my Arduino had a voice and was ready to communicate.&lt;/p&gt;

&lt;h3&gt;
  
  
  Crossing the Rev 3 Bridge
&lt;/h3&gt;

&lt;p&gt;While troubleshooting my initial Ladyada kit, I realized I actually had a second "time capsule" on my shelf: the &lt;strong&gt;"Emerging Objects" kit from Sparklelabs&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdxy4xt4armbtgfaknub0.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdxy4xt4armbtgfaknub0.jpeg" alt="Sparklelabs kit"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Sadly, the Sparklelabs tutorial has vanished from the web, but the hardware provided was exactly what I needed: a more modern &lt;strong&gt;Arduino Uno Rev 3&lt;/strong&gt; (identifiable by the red reset button moved to the top left) and a full-sized &lt;strong&gt;830-point breadboard with power rails&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqqlx87ya29bcbxs7ai8w.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqqlx87ya29bcbxs7ai8w.jpeg" alt="Uno side by side"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhqtuh3hhlurfhl1jvcc8.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhqtuh3hhlurfhl1jvcc8.jpeg" alt="830 point breadboard with power rails"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  The "Everything is Blinking" Bug
&lt;/h3&gt;

&lt;p&gt;With the better breadboard, the mystery of the dark LED was solved. I finally had a blinking circuit! However, as soon as I tried to scale up to an &lt;strong&gt;LED Scanner&lt;/strong&gt; (think &lt;em&gt;Knight Rider&lt;/em&gt;), I hit my first logic bug. I had several LEDs hooked up, but instead of a smooth back-and-forth scan, every single LED was blinking in unison.&lt;/p&gt;

&lt;p&gt;The culprit was a physical wiring error that ignored the board's architecture. By moving the power and ground connections to a more stable configuration on the new rails, I was able to see the scanning pattern between the blue and green LEDs—but the red LED stayed dark.&lt;/p&gt;

&lt;p&gt;It turns out I was using &lt;strong&gt;Pin 13&lt;/strong&gt;, which is tied to the onboard "L" LED. By moving the red LED's jumper cable to &lt;strong&gt;Pin 8&lt;/strong&gt;, I finally achieved discrete control. I then brainstormed other patterns I could try without touching the physical wiring, and decided to share my adventures so others could try them too.&lt;/p&gt;

&lt;h3&gt;
  
  
  Open Sourcing the Journey: The Uno Sketch Box
&lt;/h3&gt;

&lt;p&gt;I’ve officially open-sourced my learning progress in a new repository: &lt;a href="https://github.com/iennae/uno-sketch-box" rel="noopener noreferrer"&gt;&lt;strong&gt;uno-sketch-box&lt;/strong&gt;&lt;/a&gt;. Instead of just following the lessons linearly, I’ve added four projects that serve as building blocks for the droid's future "brain":&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;LED Scanner:&lt;/strong&gt; A study in &lt;code&gt;for&lt;/code&gt; loops and multi-output management.&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Police Strobe:&lt;/strong&gt; An exercise in timing and alternating state patterns.&lt;br&gt;


  &lt;iframe src="https://www.youtube.com/embed/mL4Sdts8zg4"&gt;
  &lt;/iframe&gt;


 &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Binary Counter:&lt;/strong&gt; A brain-bender that visualizes numbers in binary using 4 LEDs—because droid logic is all 1s and 0s.&lt;br&gt;


  &lt;iframe src="https://www.youtube.com/embed/1vEevR4qY-8"&gt;
  &lt;/iframe&gt;


 &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Random Winner Picker:&lt;/strong&gt; Introducing the &lt;code&gt;random()&lt;/code&gt; function for a bit of unpredictable personality.&lt;br&gt;


  &lt;iframe src="https://www.youtube.com/embed/YaFgHoTzhx0"&gt;
  &lt;/iframe&gt;


 &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Automation &amp;amp; Clean Code
&lt;/h3&gt;

&lt;p&gt;I didn't want to leave the code messy, and I wanted a way for others to contribute. I’ve set up &lt;a href="https://github.com/iennae/uno-sketch-box/blob/main/.github/workflows/compile.yml" rel="noopener noreferrer"&gt;&lt;strong&gt;GitHub Actions&lt;/strong&gt;&lt;/a&gt; to lint my sketches and ensure they compile correctly on every push. It feels good to bring modern CI practices to this project.&lt;/p&gt;




&lt;h3&gt;
  
  
  What’s Next? "Input"
&lt;/h3&gt;

&lt;p&gt;I’ve got the LED "Output" figured out with making lights dance. Now, I need to teach the droid to take some input. My Sparklelabs kit has some buttons, and I’m planning to tackle the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; &lt;strong&gt;Visualizing Math &amp;amp; Data Types:&lt;/strong&gt; Extending Lady Ada's &lt;a href="https://www.ladyada.net/learn/arduino/lesson4.html" rel="noopener noreferrer"&gt;Lesson 4&lt;/a&gt; by building a "Position Tracker" that uses the Pythagorean theorem ($a^2 + b^2 = h^2$) to calculate distance traveled, displayed over Serial.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;The Odometer Challenge:&lt;/strong&gt; Testing &lt;strong&gt;variable overflows&lt;/strong&gt;—watching a number jump from 32,767 to -32,768—to ensure I pick the right "storage boxes" (&lt;code&gt;int&lt;/code&gt; vs &lt;code&gt;long&lt;/code&gt;) for droid sensors.&lt;/li&gt;
&lt;li&gt; &lt;strong&gt;Interactive Logic:&lt;/strong&gt; Using a physical button to trigger an LED pattern or reset the binary counter.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Before I move on, I’m polishing the visuals by swapping a green LED for a yellow one and possibly adding a white one.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;I’d love to hear your thoughts:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What other LED-related logic projects should I try?&lt;/li&gt;
&lt;li&gt;Any creative ideas for projects that combine these multi-LED patterns with button inputs?&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>arduino</category>
      <category>hardware</category>
      <category>beginners</category>
      <category>showdev</category>
    </item>
    <item>
      <title>Relearning the Arduino</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Sun, 04 Jan 2026 01:01:06 +0000</pubDate>
      <link>https://dev.to/sigje/from-2010-dust-to-droid-dreams-relearning-the-arduino-4f5g</link>
      <guid>https://dev.to/sigje/from-2010-dust-to-droid-dreams-relearning-the-arduino-4f5g</guid>
      <description>&lt;p&gt;I have a goal: I want to build a real, functional LEGO droid.&lt;/p&gt;

&lt;p&gt;It’s an ambitious project involving modern sensors, servos, and logic. But instead of jumping straight into the deep end with expensive new kits, I decided to start with a "time capsule" I’ve had on my shelf since 2010.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1obh2q0wejg0fchb014y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1obh2q0wejg0fchb014y.png" alt="Original Order the Arduino Kit was $65" width="800" height="254"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I’m dusting off an original Arduino Starter Pack. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foylyhy8euxrdltjbgydx.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foylyhy8euxrdltjbgydx.jpg" alt="Arduino Starter Pack Box" width="400" height="300"&gt;&lt;/a&gt;&lt;br&gt;
My plan is to learn the hardware incrementally, starting with the basics I already own before graduating to the droid build.&lt;/p&gt;
&lt;h2&gt;
  
  
  The "Lady Ada" Time Machine
&lt;/h2&gt;

&lt;p&gt;The box is labeled &lt;a href="http://ladyada.net/learn/arduino" rel="noopener noreferrer"&gt;ladyada.net/learn/arduino&lt;/a&gt;. It’s a trip down memory lane. While that URL still works, the experience today is a bit of a scavenger hunt. The old tutorials are still there, but they often point to newer versions of the site, making the lessons feel a little disjointed.&lt;/p&gt;

&lt;p&gt;I wasn't even sure if a board this old would talk to a computer from 2026.&lt;/p&gt;
&lt;h2&gt;
  
  
  The Modern Mac Roadblock
&lt;/h2&gt;

&lt;p&gt;Hardware might be timeless, but software architecture moves fast. When I tried to upload the classic "Blink" sketch from Lesson 1, my Mac threw a fit:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.../bin/avr-g++: bad CPU type in executable
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;My M-series Mac couldn't run the Intel-only binaries bundled with the older Arduino toolchain.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;💡 &lt;strong&gt;The Fix:&lt;/strong&gt; If you see this error, your Mac needs to translate older Intel code. Open your Terminal and run:&lt;br&gt;
&lt;code&gt;/usr/sbin/softwareupdate --install-rosetta --agree-to-license&lt;/code&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Small Wins, Big Goals
&lt;/h2&gt;

&lt;p&gt;Once Rosetta was in place, the "Blink" sketch compiled perfectly:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Sketch uses 924 bytes (2%) of program storage space. Global variables use 9 bytes (0%) of dynamic memory.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Watching that LED blink for the first time (even if it's a decade late) feels like a win. It’s a small step—just a single light—but it’s the first milestone on the road to a LEGO droid.&lt;/p&gt;





&lt;p&gt;Now that the "Blink" sketch works, I’ve moved on to &lt;a href="https://www.ladyada.net/learn/arduino/lesson3.html" rel="noopener noreferrer"&gt;Lesson 3&lt;/a&gt;, which introduces the breadboard and external LEDs. This is where the 16-year gap becomes a bit of a headache.&lt;/p&gt;

&lt;p&gt;The tutorial images show an older Arduino shield setup that doesn't quite match my Uno and tiny breadboard. I’m currently stuck in "debug mode" with my wiring.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Breadboard Mystery: Why won't my LED light up?
&lt;/h2&gt;

&lt;h3&gt;
  
  
  The Mental Model
&lt;/h3&gt;

&lt;p&gt;In theory, the circuit is a simple loop. Current flows from &lt;strong&gt;High (5V)&lt;/strong&gt; to &lt;strong&gt;Low (GND)&lt;/strong&gt;. Along the way, it must pass through the &lt;strong&gt;Resistor&lt;/strong&gt; (to limit current) and the &lt;strong&gt;LED&lt;/strong&gt; (to create light).&lt;/p&gt;

&lt;h3&gt;
  
  
  The Setup Challenge
&lt;/h3&gt;

&lt;p&gt;I've been experimenting with two configurations (which should be electrically equivalent, since it's a single series loop), but neither has worked yet:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Option A:&lt;/strong&gt; 5V → Resistor → Anode (Long Leg) → LED → Cathode (Short Leg) → GND.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Option B:&lt;/strong&gt; 5V → Anode (Long Leg) → LED → Cathode (Short Leg) → Resistor → GND.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So far, the Uno powers up fine, but my external LED is staying stubbornly dark. &lt;/p&gt;

&lt;h2&gt;
  
  
  Seeking Your "Droid-Tech" Wisdom
&lt;/h2&gt;

&lt;p&gt;I’m going to keep plugging away and swapping wires, but if any of you have experience with these vintage kits or the specific Lady Ada Lesson 3 layout, I’d love some advice! &lt;/p&gt;

&lt;p&gt;Is there a common mistake with these tiny breadboards?  I'm determined to solve this before I even think about LEGO motors!&lt;/p&gt;

&lt;h2&gt;
  
  
  ETA - Success
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fat9c5hlqtipiibwxdh8w.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fat9c5hlqtipiibwxdh8w.jpeg" alt="Uno with breadboard and a single red LED illuminated" width="400" height="185"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I switched out the tiny breadboard for one with power rails (another incomplete kit, hah!), wired it up with Option A, and it works! I also switched over to the more modern Uno from the other kit (I can tell because the reset button is in a slightly different place), but I'm sure the wiring would work with the older board as well.&lt;/p&gt;

</description>
      <category>arduino</category>
      <category>hardware</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Systems, Leadership, and the Power of 'We'</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Sat, 03 Jan 2026 01:47:36 +0000</pubDate>
      <link>https://dev.to/sigje/systems-leadership-and-the-power-of-we-11pm</link>
      <guid>https://dev.to/sigje/systems-leadership-and-the-power-of-we-11pm</guid>
      <description>&lt;p&gt;My career has been defined by "hard things"—from building a noSQL service platform at Yahoo to writing &lt;strong&gt;&lt;a href="http://shop.oreilly.com/product/0636920268635.do" rel="noopener noreferrer"&gt;Modern System Administration&lt;/a&gt;&lt;/strong&gt;. Over two decades in the trenches of systems engineering, I’ve learned a vital truth: the secret to tackling complexity isn't just technical expertise; it's the strength of the community around you.&lt;/p&gt;

&lt;p&gt;I recently visited &lt;a href="https://www.yelp.com/biz/nirvana-soul-coffee-san-francisco-7" rel="noopener noreferrer"&gt;&lt;strong&gt;Nirvana Soul&lt;/strong&gt;&lt;/a&gt; in San Francisco and was struck by the reminder on their wall: &lt;strong&gt;"We Can Do Hard Things."&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fywi68b49bh0ip03rc1ey.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fywi68b49bh0ip03rc1ey.jpg" alt="Jennifer Davis at Nirvana Soul with 'We Can Do Hard Things' sign" width="800" height="1066"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As an Engineering Manager, I often reflect on how we challenge our teams to reach their full potential without hitting a breaking point. We &lt;em&gt;can&lt;/em&gt; do hard things—whether that’s improving operability or overhauling legacy infrastructure—but we can only do them consistently if we build a sustainable foundation.&lt;/p&gt;

&lt;p&gt;To me, "doing hard things" in 2026 means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Fostering Community:&lt;/strong&gt; Encouraging a culture of experimentation where it is safe to fail, learn, and iterate.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Prioritizing Reliability:&lt;/strong&gt; Moving beyond "firefighting" mode to build systems that allow for deep work and innovation.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Leading with Empathy:&lt;/strong&gt; Ensuring the people building the systems are as well-cared for as the systems themselves.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Systems are hard. Leadership is hard. Community building is hard. But the &lt;strong&gt;"We"&lt;/strong&gt; is what makes it possible.&lt;/p&gt;




&lt;h3&gt;
  
  
  Let’s celebrate your wins
&lt;/h3&gt;

&lt;p&gt;What is one "hard thing" you tackled in the last year that you’re proud of? I’d love to hear about the people or communities that helped you get across the finish line. Let’s start the year celebrating our wins in the comments!&lt;/p&gt;

</description>
      <category>leadership</category>
      <category>devops</category>
      <category>career</category>
      <category>softskills</category>
    </item>
    <item>
      <title>Adventures in System Seeing Day 1</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Tue, 02 Dec 2025 05:44:40 +0000</pubDate>
      <link>https://dev.to/sigje/adventures-in-system-seeing-day-1-iag</link>
      <guid>https://dev.to/sigje/adventures-in-system-seeing-day-1-iag</guid>
      <description>&lt;p&gt;We often think we understand the systems we interact with every day. If you ride a bicycle, you know how it works. But do you? &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Join the &lt;a href="https://www.ruthmalan.com/Advent/2025/Day1.html" rel="noopener noreferrer"&gt;Advent of System Seeing&lt;/a&gt; and follow along with the prompts&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;To kick off December, my family and I joined &lt;a href="https://www.ruthmalan.com/" rel="noopener noreferrer"&gt;Ruth Malan&lt;/a&gt;’s "Advent of System Seeing." The Day 1 prompt sounds deceptively simple: Draw a bicycle. The goal isn't art; it is a warm-up exercise for your brain. It forces you to take your fuzzy mental models and make them concrete on paper. When we sat down to do this—covering bubble diagrams, memory sketches, and reflection—we quickly realized that "the system" looked completely different to each of us.&lt;/p&gt;

&lt;h1&gt;
  
  
  ⚙️ The Bubble Diagram
&lt;/h1&gt;

&lt;p&gt;The first part of the prompt asks for a bubble diagram to show key concepts and relationships.&lt;/p&gt;

&lt;p&gt;This was where my brain went immediately. My drawing was a network of functional parts: frame, gears, seat, and shock absorbers. My husband, Brian, pointed out that this was a distinctly "engineering mind" view: I was defining the system by its internal architecture.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3woykoimrhooxtoh3yq.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc3woykoimrhooxtoh3yq.jpg" alt="sigje's bubble diagram and sketch of a bike from memory" width="800" height="634"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;However, even my "engineering" view had gaps. As I described what I was attempting to draw (the complex braking system, for example), Brian noted, "Most bikes don't have complex brakes. Frankie's doesn't have calipers or cantilevers." I was drawing &lt;em&gt;my bike&lt;/em&gt;, and it's important to make that distinction: there isn't just one perfect bike.&lt;/p&gt;

&lt;h1&gt;
  
  
  🚴 Sketching from Memory
&lt;/h1&gt;

&lt;p&gt;The next step is to sketch a bicycle from memory without looking at one. This is where the &lt;a href="https://en.wikipedia.org/wiki/Illusion_of_explanatory_depth" rel="noopener noreferrer"&gt;illusion of explanatory depth&lt;/a&gt; usually collapses.&lt;/p&gt;

&lt;p&gt;Brian struggled to draw the bike in isolation. For him, the system was incomplete without its environment. He eventually drew a rider—a figure resembling Abraham Lincoln—and a cat in a basket. He realized that it was actually easier to draw the mechanics once he placed a human in the center.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9unz96jo1hdfuc9siucg.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9unz96jo1hdfuc9siucg.jpg" alt="brian's bubble diagram and sketch of a bike from memory with Abe Lincoln" width="800" height="587"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This mirrors a reflection from &lt;a href="https://hachyderm.io/@sebhans/115644816763170515" rel="noopener noreferrer"&gt;Sebastian Hans&lt;/a&gt;, another participant in the challenge: "The system only works at all in connection with its environment. Without the ground, the bicycle doesn't move." Brian couldn't see the system until he saw the relationship between the machine and the user.&lt;/p&gt;

&lt;h1&gt;
  
  
  🚀 The Visionary Gap
&lt;/h1&gt;

&lt;p&gt;Then there was our son, Frankie. While the prompt asks us to notice what is there, Frankie focused on what should be there.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0htr7xxlrtrna8be5bdp.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0htr7xxlrtrna8be5bdp.jpg" alt="Frankie's bubble drawing and rocket boostered tricycle" width="800" height="665"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;His mental model of a "functional bike" included rocket launchers and a puppy (heavily influenced by the block-style art of Minecraft). His default starting point was actually a tricycle. This sparked a real debate about the boundaries of the system—specifically, does a tricycle count when drawing a bicycle? It was a reminder that users often bring expectations to a system that the designers never anticipated.&lt;/p&gt;

&lt;h1&gt;
  
  
  💡 Why Mental Models Matter
&lt;/h1&gt;

&lt;p&gt;Ruth Malan sums up the purpose of this exercise: "Our mental models are incomplete, but we don't know this until we really engage with them."&lt;/p&gt;

&lt;p&gt;We assume we know how the parts connect. We assume we know what the user needs. But when we are forced to draw it out, we see the gaps.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I saw the mechanical structure.&lt;/li&gt;
&lt;li&gt;Brian saw the human relationship.&lt;/li&gt;
&lt;li&gt;Frankie saw the potential features.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;None of us were wrong, but none of us had the full picture alone. To build better systems, we have to stop assuming our mental model is the only one that matters. We have to pick up a pencil, draw, share our drawing, and then see what everyone else is drawing.&lt;/p&gt;

&lt;h1&gt;
  
  
  Join us
&lt;/h1&gt;

&lt;p&gt;You can join the "System Seeing" practice yourself. It only takes 15-20 minutes.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Visit the prompt: Go to &lt;a href="https://www.ruthmalan.com/Advent/2025/Day1.html" rel="noopener noreferrer"&gt;Ruth Malan’s Day 1 page&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Don't skip the bubble diagram: It reveals how you structure relationships in your head.&lt;/li&gt;
&lt;li&gt;Compare notes: Share your drawing (or a link to your drawing) in the comments and let's talk about it! The differences in the drawings are more valuable than the drawings themselves.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>watercooler</category>
      <category>discuss</category>
      <category>architecture</category>
    </item>
    <item>
      <title>The lumberjack paradox: From theory to practice</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Wed, 19 Nov 2025 00:42:14 +0000</pubDate>
      <link>https://dev.to/googlecloud/the-lumberjack-paradox-from-theory-to-practice-2lb5</link>
      <guid>https://dev.to/googlecloud/the-lumberjack-paradox-from-theory-to-practice-2lb5</guid>
      <description>&lt;p&gt;&lt;a href="https://www.linkedin.com/posts/sigje_i-have-so-many-thoughts-about-this-interesting-ugcPost-7389735278690742272-viPE" rel="noopener noreferrer"&gt;Previously&lt;/a&gt;, I shared my thoughts on Neal Sample’s "&lt;a href="https://www.linkedin.com/pulse/challenge-all-leaders-how-do-you-create-right-culture-david-reimer-1ydbc/" rel="noopener noreferrer"&gt;lumberjack paradox&lt;/a&gt;" and the urgent need to build the systems thinkers of tomorrow. I argued that leaders must move beyond simple efficiency and focus on &lt;a href="https://www.researchgate.net/publication/227690136_Deliberate_Performance_Accelerating_Expertise_in_Natural_Settings" rel="noopener noreferrer"&gt;re-engineering the experience&lt;/a&gt; (Dr. Gary Klein) and creating context to ensure we don't lose the path to deep expertise.&lt;/p&gt;

&lt;p&gt;But what does "leadership as context creator" look like in practice?&lt;/p&gt;

&lt;p&gt;For us in Cloud DevRel Engineering, it isn't abstract. It comes down to how we manage the most fundamental unit of our developer experience: the code sample.&lt;/p&gt;

&lt;p&gt;As Neal notes, AI will lead to the "industrialization of creativity"—an infinite supply of ideas and code. In this world, the premium shifts to discernment: the ability to distinguish quality from mediocrity.&lt;/p&gt;

&lt;p&gt;But this isn't a choice between the axe (manual craft) and the chainsaw (AI). The modern expert needs both.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;If you only have the axe, you are restricted to the problems that fit within manual reach. It is the perfect tool for the campsite, but it cannot clear the forest.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;But if you only have the chainsaw, without the judgment to guide it, you are dangerous. You lack the control to distinguish a clean cut from a destructive one.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You need the deep expertise of the axe to get the precise, consistent outcomes from the chainsaw.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;From theory to practice: The catalog as ground truth&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;In my previous post, I mentioned Dr. Richard Cook's work on "building common ground" and Donella Meadows’ warnings about &lt;a href="https://donellameadows.org/archives/leverage-points-places-to-intervene-in-a-system/" rel="noopener noreferrer"&gt;suboptimization&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;In Cloud DevRel Engineering, we realized that our code samples are the primary tool for building this common ground. In Dr. Cook’s terms, they form the "&lt;a href="https://queue.acm.org/detail.cfm?id=3380777" rel="noopener noreferrer"&gt;Line of Representation&lt;/a&gt;"—the tangible surface that connects the human "above the line" to the complex system "below the line."&lt;/p&gt;

&lt;p&gt;When a developer (the human) learns a new platform, the sample is their manual for the "axe." When an AI assistant generates a solution, the sample is the training data that guides the "chainsaw."&lt;/p&gt;

&lt;p&gt;When we looked at our systems, we saw suboptimization. By treating samples as low-priority content maintained by individual contributors, we created a fractured reality.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;We broke the Line of Representation.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We saw this failure hit on two fronts:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;We break the human judgment loop:&lt;/strong&gt; If samples are inconsistent, developers cannot learn "good" from "bad." We fail to re-engineer the experience (Dr. Klein) necessary to build expertise.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;We poison the AI well:&lt;/strong&gt; AI models ingest our official repositories. If those repositories contain flawed patterns, the AI learns the flaws, scales them, and feeds them back to the user.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We are currently witnessing exactly how this hand-crafted approach fails at scale.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;The high cost of "geological strata" in code&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Without central standardization, our repositories accumulated "geological strata"—layers of outdated practices—because manual maintenance cannot keep up with language evolution. This makes it hard to know what is correct today.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Node.js' paradigm tax:&lt;/strong&gt; Our Node.js repositories contain a mix of callbacks, raw promises, and async/await. A user learning Pub/Sub sees one era, while a user learning Cloud Storage sees another. The AI sees all of it and treats it all as valid, stripping away the context of "outdated" versus "modern."
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Python: The contributor long tail:&lt;/strong&gt; With over 650 contributors, our Python samples suffer from extreme fragmentation. The &lt;strong&gt;total cost of ownership (TCO)&lt;/strong&gt; of manually bringing thousands of older snippets up to modern Python 3.10+ standards is astronomically high, so it simply doesn't happen. This leaves a massive surface area of "technical debt" that the AI happily recycles.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Inconsistent quality creates "false best practices"&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;When samples are hand-written by federated teams, personal "developer flair" masquerades as industry best practice. Users copy-paste these patterns, inadvertently adopting technical debt.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Java's framework creep:&lt;/strong&gt; Instead of teaching the core platform, contributors often introduce heavy frameworks for simple tasks. This increases the "time-to-hello-world" and teaches the AI that simple tasks require complex dependencies.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Python vs. Go:&lt;/strong&gt; Most Go samples handle errors correctly because the language forces it. Many Python samples show only the "happy path," skipping &lt;strong&gt;critical distributed systems patterns&lt;/strong&gt; like exponential backoff or retry logic. The AI then generates code that looks clean but fails in production.&lt;/li&gt;
&lt;/ul&gt;
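&lt;p&gt;To make the "happy path" gap concrete, here is a minimal sketch of the retry-with-exponential-backoff pattern many samples skip. The helper name and delay values are illustrative, not from any Google Cloud library:&lt;/p&gt;

```python
import random
import time

def call_with_backoff(operation, max_attempts=5, base_delay=1.0):
    """Retry an operation prone to transient failures, with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of retries: surface the error to the caller
            # Wait base_delay * 2^attempt, plus jitter to avoid retry storms.
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, base_delay))
```

&lt;p&gt;A production sample would catch only the specific transient errors the API documents, but even this sketch shows the shape that "happy path" samples leave out.&lt;/p&gt;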

&lt;h2&gt;
  
  
  &lt;strong&gt;The hidden cost of incoherence&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;This is the "suboptimization" Donella Meadows warned about. It is not enough for individual samples to be correct in isolation; they must function as a cohesive unit.&lt;/p&gt;

&lt;p&gt;For a human developer, shifting between products that use different coding styles creates friction. They have to spend mental energy decoding the "dialect" of a specific product team rather than focusing on the logic.&lt;/p&gt;

&lt;p&gt;For an AI, this lack of cohesion is even more dangerous.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;The Context Gap:&lt;/strong&gt; When our samples for Cloud Storage look structurally different from our samples for BigQuery, the AI treats them as unrelated entities. It fails to learn the underlying "grammar" of our platform.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The Integration Failure:&lt;/strong&gt; When a user asks for a solution that combines these products, the AI struggles to bridge the gap. Lacking a consistent pattern to follow, it often hallucinates a messy, "glue code" solution that is brittle and insecure.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By allowing fragmentation, we aren't just impacting the docs; we are training the AI to misunderstand how our platform is supposed to fit together.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Get started&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;We cannot view code samples as static documentation. They are the active constraints of our system—the "environment" we design for our users. If we fail to maintain them, we dull the tools that build developer judgment, and we degrade the quality of the AI they trust.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Recommended Reading&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;If you want to dig deeper into the systems thinking concepts behind this post, I recommend starting here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;On the "Line of Representation":&lt;/strong&gt; &lt;a href="https://queue.acm.org/detail.cfm?id=3380777" rel="noopener noreferrer"&gt;Above the Line, Below the Line&lt;/a&gt; by Dr. Richard Cook — The essential framework for understanding why we must care about the representations (like code samples) that sit between us and complex systems.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;On System Failure:&lt;/strong&gt; &lt;a href="https://how.complexsystems.fail/" rel="noopener noreferrer"&gt;How Complex Systems Fail&lt;/a&gt; by Dr. Richard Cook — His classic treatise on why failure is never about a single "root cause" but the result of multiple latent factors.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;On Suboptimization:&lt;/strong&gt; &lt;a href="https://donellameadows.org/archives/leverage-points-places-to-intervene-in-a-system/" rel="noopener noreferrer"&gt;Leverage Points: Places to Intervene in a System&lt;/a&gt; by Donella Meadows — The definitive essay on why optimizing parts often destroys the whole.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;On Re-engineering Experience:&lt;/strong&gt; &lt;a href="https://www.researchgate.net/publication/227690136_Deliberate_Performance_Accelerating_Expertise_in_Natural_Settings" rel="noopener noreferrer"&gt;Deliberate Performance&lt;/a&gt; by Dr. Gary Klein — Research on how to build expertise when you can't stop the work to train.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Coming up next
&lt;/h2&gt;

&lt;p&gt;Next in this series, I will share our structural solution: the "Golden Path." This approach moves us away from isolated automation and towards a human-led, AI-scaled system that improves consistency.&lt;/p&gt;

&lt;p&gt;I’ll be focusing more on the strategy in this series, but the execution is its own journey. Using AI to write code is well-known, but relying on it to produce production-ready educational content? &lt;a href="https://dev.to/grayside"&gt;Adam Ross&lt;/a&gt; and &lt;a href="https://www.linkedin.com/in/nimjay/" rel="noopener noreferrer"&gt;Nim Jayawardena&lt;/a&gt; have shared the technical reality of our team's shift in their post, &lt;a href="https://cloud.google.com/blog/topics/developers-practitioners/7-technical-takeaways-from-using-gemini-to-generate-code-samples-at-scale?e=48754805" rel="noopener noreferrer"&gt;&lt;strong&gt;7 takeaways from generating samples at scale with Gemini&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Until then, ask yourself:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Are you trying to automate away your documentation debt without first defining a standard of quality?
&lt;/li&gt;
&lt;li&gt;Are your samples strong enough to serve as the "ground truth" for the AI models your developers rely on?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Special thanks to &lt;a href="https://dev.to/glasnt"&gt;Katie McLaughlin&lt;/a&gt;, &lt;a href="https://dev.to/grayside"&gt;Adam Ross&lt;/a&gt;, and &lt;a href="https://www.linkedin.com/in/nimjay/" rel="noopener noreferrer"&gt;Nim Jayawardena&lt;/a&gt; for reviewing early drafts of this post.&lt;/em&gt; &lt;/p&gt;

</description>
      <category>cloud</category>
      <category>ai</category>
      <category>programming</category>
    </item>
    <item>
      <title>How do you translate Cloud Computing?</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Thu, 12 Jun 2025 03:04:41 +0000</pubDate>
      <link>https://dev.to/sigje/how-do-you-translate-cloud-computing-5h39</link>
      <guid>https://dev.to/sigje/how-do-you-translate-cloud-computing-5h39</guid>
      <description>&lt;p&gt;I was reading a web page in español and saw the term "en la computación en la nube" (the translation for cloud computing) and that gave me pause. Is there a shorter version to communicate the concept of hosted/managed compute? How do you convey cloud computing in your primary language? I'm curious as my primary language is English and I'd never considered it before. &lt;/p&gt;

&lt;p&gt;It would be great to know the words used and the approximate meaning of those words. &lt;/p&gt;

&lt;p&gt;I think "en la computación en la nube" translates to "computing in the cloud" but it might actually have a different meaning contextually that I'm not aware of. &lt;/p&gt;

&lt;p&gt;Thanks in advance! &lt;/p&gt;

</description>
      <category>discuss</category>
    </item>
    <item>
      <title>TIL - April 25, 2025</title>
      <dc:creator>Jennifer Davis</dc:creator>
      <pubDate>Sat, 26 Apr 2025 05:31:24 +0000</pubDate>
      <link>https://dev.to/sigje/til-april-25-2025-54kk</link>
      <guid>https://dev.to/sigje/til-april-25-2025-54kk</guid>
      <description>&lt;p&gt;By default, &lt;a href="https://cloud.google.com/pubsub/docs/push#receive_push" rel="noopener noreferrer"&gt;Google Cloud Pub/Sub Push Subscriptions&lt;/a&gt; deliver a message to a push endpoint as a &lt;em&gt;Wrapped&lt;/em&gt; message in the following (abbrievated) format&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "message": {
        "data": string,
        "messageId": string,
        "message_id": string,
        "publishTime": string,
        "publish_time": string
    },
    "subscription": string
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
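&lt;p&gt;In this wrapped format, the &lt;code&gt;data&lt;/code&gt; field is base64-encoded. As a minimal sketch (the function name and subscription path here are hypothetical, not from the Pub/Sub client library), a push handler would decode it like this:&lt;/p&gt;

```python
import base64
import json

def parse_push_body(body: bytes):
    """Unpack a wrapped Pub/Sub push delivery: the JSON envelope nests the
    message, and its 'data' field is base64-encoded."""
    envelope = json.loads(body)
    message = envelope["message"]
    payload = base64.b64decode(message["data"]).decode("utf-8")
    return message["messageId"], payload

# Example body matching the abbreviated wrapped format above.
body = json.dumps({
    "message": {
        "data": base64.b64encode(b"hello").decode("utf-8"),
        "messageId": "1234",
    },
    "subscription": "projects/my-project/subscriptions/my-sub",
}).encode("utf-8")
```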



&lt;p&gt;This is hidden in the Push Subscription documentation and not surfaced in the &lt;a href="https://cloud.google.com/pubsub/docs/reference/rest/v1/PubsubMessage" rel="noopener noreferrer"&gt;PubsubMessage reference documentation&lt;/a&gt; which shows the following format:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "data": string,
  "attributes": {
    string: string,
    ...
  },
  "messageId": string,
  "publishTime": string,
  "orderingKey": string
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I haven't verified it, but I suspect the way to receive unwrapped messages is to &lt;a href="https://cloud.google.com/pubsub/docs/create-push-subscription#payload-unwrapping" rel="noopener noreferrer"&gt;Enable payload unwrapping&lt;/a&gt; on the subscription.&lt;/p&gt;

&lt;p&gt;Have you used Cloud Run with Pub/Sub and run into this? Let me know in the comments.&lt;/p&gt;

</description>
      <category>googlecloud</category>
      <category>todayilearned</category>
    </item>
  </channel>
</rss>
