<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: vinal-2</title>
    <description>The latest articles on DEV Community by vinal-2 (@vinal2).</description>
    <link>https://dev.to/vinal2</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3825918%2Fad4d4f19-0b14-440c-928b-c4f2f9f1812d.jpeg</url>
      <title>DEV Community: vinal-2</title>
      <link>https://dev.to/vinal2</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/vinal2"/>
    <language>en</language>
    <item>
      <title>How I Verify Azure Pricing Accuracy Against Real Invoices (And Why I Had To)</title>
      <dc:creator>vinal-2</dc:creator>
      <pubDate>Thu, 09 Apr 2026 17:32:10 +0000</pubDate>
      <link>https://dev.to/vinal2/-how-i-verify-azure-pricing-accuracy-against-real-invoices-and-why-i-had-to-44m7</link>
      <guid>https://dev.to/vinal2/-how-i-verify-azure-pricing-accuracy-against-real-invoices-and-why-i-had-to-44m7</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;From "AI garbage" accusations to a trusted calculator — the technical journey of validating every price on AzureCalc.uk&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;I built an Azure cost calculator &lt;a href="https://www.azure-calc.co.uk/" rel="noopener noreferrer"&gt;https://www.azure-calc.co.uk/&lt;/a&gt; as a weekend project. The first piece of feedback I got was: “this looks AI-generated.”&lt;/p&gt;

&lt;p&gt;They weren't entirely wrong.&lt;/p&gt;

&lt;p&gt;Ouch. But fair: I had hardcoded pricing constants, generic guide content, and no methodology transparency.&lt;/p&gt;

&lt;p&gt;The antidote to "AI garbage" is showing your working. This article explains exactly how I did that.&lt;/p&gt;

&lt;h2&gt;Layer 1: The Data Pipeline — Azure Retail Prices API to Cloudflare D1&lt;/h2&gt;

&lt;p&gt;The foundation is the &lt;strong&gt;Azure Retail Prices API&lt;/strong&gt; (&lt;code&gt;prices.azure.com&lt;/code&gt;). Unlike scraping or manual spreadsheets, this is Microsoft's official price feed. But it's 50,000+ rows for UK South alone, updated monthly. You can't query this in real-time for every calculator request.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdwatzknit3otj4rj0b3s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdwatzknit3otj4rj0b3s.png" alt=" " width="645" height="258"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Worker filters to UK South, GBP-only, then inserts into a database table. This gives us:&lt;/p&gt;
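&lt;p&gt;Conceptually, the filter step looks something like this. This is a sketch, not the production Worker; the field names (&lt;code&gt;Items&lt;/code&gt;, &lt;code&gt;armRegionName&lt;/code&gt;, &lt;code&gt;currencyCode&lt;/code&gt;, &lt;code&gt;retailPrice&lt;/code&gt;) follow the public Retail Prices API response shape, while the output object shape is my own illustration:&lt;/p&gt;

```javascript
// Sketch of the nightly sync's filter step: take one page of the Azure
// Retail Prices API response and keep only UK South rows priced in GBP.
function filterUkSouthGbp(apiPage) {
  return apiPage.Items.filter(function (row) {
    if (row.armRegionName !== "uksouth") return false;
    if (row.currencyCode !== "GBP") return false;
    return true;
  }).map(function (row) {
    // Keep just the columns the D1 table needs (illustrative shape).
    return {
      sku: row.skuName,
      meter: row.meterName,
      service: row.serviceName,
      unit: row.unitOfMeasure,
      price: row.retailPrice,
    };
  });
}
```

In the real pipeline this runs per page while following the API's pagination link until the feed is exhausted.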

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fympqb9yexnylpz20p2od.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fympqb9yexnylpz20p2od.png" alt=" " width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Layer 2: The Verification Trap — Hardcoded Constants vs. Live Data
&lt;/h2&gt;

&lt;p&gt;Here's where I went wrong in Sprint 0. I hardcoded the Log Analytics PAYG rate as &lt;code&gt;£2.76/GB&lt;/code&gt; based on Azure's documentation. But documentation lags reality.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9rk3su32yf6z35lr6li5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9rk3su32yf6z35lr6li5.png" alt=" " width="800" height="77"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That came directly from Azure documentation.&lt;/p&gt;

&lt;p&gt;After querying actual pricing data:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpwwwo5oquwkfil4p0596.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpwwwo5oquwkfil4p0596.png" alt=" " width="800" height="105"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That is a 27% error.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Fix: D1-First Development
&lt;/h3&gt;

&lt;p&gt;Now, no price enters the codebase unless it exists in D1.&lt;/p&gt;

&lt;p&gt;Every calculator now follows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Query D1 for available SKUs&lt;/li&gt;
&lt;li&gt;Match frontend inputs to actual SKU names&lt;/li&gt;
&lt;li&gt;Store the query used for verification&lt;/li&gt;
&lt;li&gt;Surface the same value in the UI&lt;/li&gt;
&lt;/ol&gt;
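&lt;p&gt;The lookup itself can be sketched like this. The &lt;code&gt;prepare().bind().first()&lt;/code&gt; chain is the standard Cloudflare D1 client API; the table and column names are assumptions for illustration:&lt;/p&gt;

```javascript
// D1-first rule in miniature: the calculator never ships a literal price,
// it asks the database. Throwing on a miss is deliberate: a missing SKU
// should fail loudly, not silently fall back to a hardcoded constant.
async function getUnitPrice(db, serviceName, skuName) {
  const row = await db
    .prepare(
      "SELECT retail_price FROM prices WHERE service_name = ? AND sku_name = ? LIMIT 1"
    )
    .bind(serviceName, skuName)
    .first();
  if (!row) throw new Error(`No D1 price for ${serviceName} / ${skuName}`);
  return row.retail_price;
}
```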

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F29m2i4ghtujvt0maswz6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F29m2i4ghtujvt0maswz6.png" alt=" " width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Layer 3: Real Invoice Validation — The Ultimate Ground Truth
&lt;/h2&gt;

&lt;p&gt;API prices are theoretical. Invoices are reality. The gap between them is where discount programs (EA, CSP, MACC) live.&lt;/p&gt;

&lt;p&gt;To reconcile the two, every quarter I run a controlled &lt;strong&gt;Invoice vs. Calculator&lt;/strong&gt; reconciliation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Deploy a known workload (e.g. Log Analytics + App Service)&lt;/li&gt;
&lt;li&gt;Let it run for a billing cycle&lt;/li&gt;
&lt;li&gt;Capture Azure invoice data&lt;/li&gt;
&lt;li&gt;Run identical inputs through the calculator&lt;/li&gt;
&lt;li&gt;Compare line-by-line&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7e8jnfvwd19eb3qqle4j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7e8jnfvwd19eb3qqle4j.png" alt=" " width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; These are PAYG rates. EA/CSP discounts would show variance here — that's expected and documented on the &lt;code&gt;/methodology&lt;/code&gt; page.&lt;/p&gt;
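&lt;p&gt;The line-by-line comparison reduces to a variance calculation per meter. A minimal sketch (function name and percentage convention are my own):&lt;/p&gt;

```javascript
// Percentage variance between an invoice line and the calculator's
// estimate for the same meter. 0 means the calculator matched the bill.
function variancePct(invoiceAmount, calculatorAmount) {
  if (invoiceAmount === 0) return calculatorAmount === 0 ? 0 : 100;
  return Math.abs(invoiceAmount - calculatorAmount) * 100 / invoiceAmount;
}
```

On PAYG rates the target is 0%; a consistent non-zero variance is the signature of an EA/CSP discount rather than a calculator bug.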

&lt;h2&gt;
  
  
  Layer 4: The Formula Disclosure — Showing Your Working
&lt;/h2&gt;

&lt;p&gt;The most effective trust signal I added was the &lt;strong&gt;FormulaDisclosure&lt;/strong&gt; component. Every calculator result shows the exact arithmetic:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxphl6h54laz6210q4z4z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxphl6h54laz6210q4z4z.png" alt=" " width="800" height="38"&gt;&lt;/a&gt;&lt;br&gt;
Tier: Pay-as-you-go · Region: UK South&lt;br&gt;
Price fetched: 08 Apr 2026 from Azure Retail Prices API&lt;/p&gt;

&lt;p&gt;This serves two purposes:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Verification&lt;/strong&gt; — Engineers can check the unit price against their own sources&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Education&lt;/strong&gt; — Shows how Azure billing actually works (unit price × quantity × time)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The implementation is a React component that takes the raw API response and the user's inputs, then generates the formula string dynamically. No hardcoded example text — real data only.&lt;/p&gt;
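&lt;p&gt;The string-building core of that component reduces to something like this (a sketch for the per-GB case; names and the &lt;code&gt;/mo&lt;/code&gt; suffix are illustrative):&lt;/p&gt;

```javascript
// Build the disclosed formula from the live unit price and the user's
// input, so the UI can never show arithmetic that differs from the quote.
function formulaString(unitPrice, quantityGb) {
  const total = unitPrice * quantityGb;
  return (
    "£" + unitPrice.toFixed(2) + "/GB × " + quantityGb + " GB = £" + total.toFixed(2) + "/mo"
  );
}
```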

&lt;h2&gt;
  
  
  Layer 5: Price History &amp;amp; Alerts — Proving the Data Is Live
&lt;/h2&gt;

&lt;p&gt;Static pricing pages are the hallmark of abandoned tools. Live data needs evidence of life.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Price History Page
&lt;/h3&gt;

&lt;p&gt;Every price change is logged to a &lt;code&gt;price_history&lt;/code&gt; table and published at &lt;a href="https://www.azure-calc.co.uk/history/" rel="noopener noreferrer"&gt;https://www.azure-calc.co.uk/history/&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The page shows the last 10 price movements. This proves:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The data isn't static&lt;/li&gt;
&lt;li&gt;Someone is monitoring it&lt;/li&gt;
&lt;li&gt;Price changes are tracked with timestamps&lt;/li&gt;
&lt;/ul&gt;
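&lt;p&gt;Populating that table is a diff between last night's snapshot and tonight's. A sketch, with the map-of-SKU-to-price shape assumed for illustration:&lt;/p&gt;

```javascript
// Compare yesterday's price snapshot with today's and emit one record
// per movement, suitable for appending to price_history.
function diffPrices(oldBySku, newBySku) {
  const changes = [];
  for (const sku of Object.keys(newBySku)) {
    const before = oldBySku[sku];
    if (before === undefined) continue; // brand-new SKU, not a movement
    if (before === newBySku[sku]) continue; // unchanged
    changes.push({ sku: sku, from: before, to: newBySku[sku], at: new Date().toISOString() });
  }
  return changes;
}
```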

&lt;h3&gt;
  
  
  The TrustBar
&lt;/h3&gt;

&lt;p&gt;Every page shows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Last update timestamp&lt;/li&gt;
&lt;li&gt;Number of prices cached&lt;/li&gt;
&lt;li&gt;UK South + GBP&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is not just a UI feature. It is evidence that the system is actively maintained.&lt;/p&gt;

&lt;h2&gt;
  
  
  Layer 6: Zod Schema Validation — Preventing Frontend/Backend Drift
&lt;/h2&gt;

&lt;p&gt;Another failure mode: the frontend adds a new SKU, but the API Worker's Zod schema doesn't recognize it. Result: HTTP 400 errors and "Error /mo" on the calculator.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Pattern
&lt;/h3&gt;

&lt;p&gt;Every calculator has a shared Zod schema in &lt;code&gt;workers/api/validation.ts&lt;/code&gt;. When the frontend adds a new tier or SKU, the schema must be updated first. This is now part of the end-of-sprint checklist.&lt;/p&gt;
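&lt;p&gt;The real project expresses this as a Zod schema; as a dependency-free sketch of the same contract, the idea is: reject any request whose tier or inputs the backend doesn't recognize, so drift surfaces as a loud 400 rather than a silently wrong number. The tier names and field names here are illustrative, not the actual schema:&lt;/p&gt;

```javascript
// Backend-side request validation in miniature. The allow-list is the
// single source of truth: adding a frontend tier without updating it
// fails every request for that tier, which is exactly the point.
const KNOWN_TIERS = ["payg", "commitment-100gb", "commitment-200gb"];

function validateRequest(body) {
  const errors = [];
  if (!KNOWN_TIERS.includes(body.tier)) {
    errors.push("unknown tier: " + body.tier);
  }
  if (typeof body.gbPerDay !== "number" || !(body.gbPerDay > 0)) {
    errors.push("gbPerDay must be a positive number");
  }
  return { ok: errors.length === 0, errors: errors };
}
```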

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcq157cq2cu4hdn3tngro.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcq157cq2cu4hdn3tngro.png" alt=" " width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Technical Tips for Building Your Own Verified Calculator
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;[ ] Query D1 for every price used in the calculator — verify exact match&lt;/li&gt;
&lt;li&gt;[ ] Test every API endpoint with real frontend requests (DevTools → Network)&lt;/li&gt;
&lt;li&gt;[ ] Verify FormulaDisclosure shows live D1 rate&lt;/li&gt;
&lt;li&gt;[ ] Hard refresh the live site, check TrustBar timestamp updates&lt;/li&gt;
&lt;li&gt;[ ] Zero TypeScript errors under strict mode&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Results: From "AI Garbage" to "Tight Loop No Other Tool Does"
&lt;/h2&gt;

&lt;p&gt;Three sprints later, the same Reddit thread had this comment from a cloud architect:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"The KQL query builder is the part I'd lean into hardest... If you can pair the calculator output with the KQL query that surfaces what it actually costs in the user's workspace, you've got a really tight loop that no other tool in this space does well right now."&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;That's the difference between a tool that feels generated and one that feels maintained. The methodology isn't just documentation — it's the actual process I follow every night at 02:00 UTC when the cron trigger fires.&lt;/p&gt;

&lt;p&gt;The "AI garbage" label sticks to tools that feel generated rather than maintained. The antidote isn't better copy — it's evidence of ongoing operational work.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;For &lt;a href="https://www.azure-calc.co.uk/" rel="noopener noreferrer"&gt;https://www.azure-calc.co.uk/&lt;/a&gt;, that evidence is:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Live price data&lt;/strong&gt; refreshed nightly, with history&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Real invoice reconciliation&lt;/strong&gt; showing 0% variance on PAYG rates&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Formula disclosure&lt;/strong&gt; on every result showing the exact arithmetic&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Verified KQL queries&lt;/strong&gt; tested against real workspaces&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Public changelog&lt;/strong&gt; documenting actual fixes, not feature lists&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you're building a data-driven tool, apply the same rigor. Your users might not read your methodology page, but they'll feel the difference between a static page and a living system.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Try it:&lt;/strong&gt; &lt;a href="https://www.azure-calc.co.uk/methodology" rel="noopener noreferrer"&gt;https://www.azure-calc.co.uk/methodology&lt;/a&gt; — see the exact API query and verification process.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Open source:&lt;/strong&gt; The methodology is public. If something looks wrong, &lt;a href="mailto:gravitycontextdev@gmail.com"&gt;email me&lt;/a&gt; and I'll check it against the Azure API within 24 hours.&lt;/p&gt;

&lt;h2&gt;
  
  
  Series
&lt;/h2&gt;

&lt;p&gt;This is post 1 of "Building AzureCalc.uk" — a technical series on building credible infrastructure tools. Follow for Sprint 3 (Networking), the Price Alerts feature, and the Invoice Reconciliation deep-dive.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>cloudcomputing</category>
      <category>ai</category>
      <category>cloudflarechallenge</category>
    </item>
    <item>
      <title>Automated Handwritten Food Order Processing with n8n and Claude's Vision API</title>
      <dc:creator>vinal-2</dc:creator>
      <pubDate>Sun, 15 Mar 2026 22:06:52 +0000</pubDate>
      <link>https://dev.to/vinal2/automated-handwritten-food-order-processing-with-n8n-and-claudes-vision-api-3g5f</link>
      <guid>https://dev.to/vinal2/automated-handwritten-food-order-processing-with-n8n-and-claudes-vision-api-3g5f</guid>
      <description>&lt;p&gt;A family firend run food business was drowning in handwritten orders. Here's how I built an n8n pipeline that uses vision API to read handwritten order slips and log them automatically to Google Sheets without the need for manual data entry.&lt;/p&gt;

&lt;p&gt;In this article I'll walk you through exactly how I built it, the specific problems I hit along the way, and how you can replicate it in an afternoon.&lt;/p&gt;

&lt;h2&gt;Prerequisites&lt;/h2&gt;

&lt;p&gt;Before you start you'll need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A running n8n instance (cloud or self-hosted)&lt;/li&gt;
&lt;li&gt;An Anthropic API key&lt;/li&gt;
&lt;li&gt;A Google account with Drive and Sheets access&lt;/li&gt;
&lt;li&gt;Basic familiarity with n8n nodes&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Step 1: Setting Up the Trigger&lt;/h2&gt;

&lt;p&gt;I used a Google Drive trigger node set to watch a specific folder called Orders. &lt;/p&gt;

&lt;p&gt;Whenever a new image file lands in that folder — whether dropped there manually or sent from a phone — the workflow fires.&lt;/p&gt;

&lt;p&gt;To configure it:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a Google Drive Trigger node in n8n&lt;/li&gt;
&lt;li&gt;Connect your Google account via OAuth (more on the gotcha here later)&lt;/li&gt;
&lt;li&gt;Set the event to "File Created"&lt;/li&gt;
&lt;li&gt;Point it at your incoming orders folder&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you'd rather use Telegram, you can swap this node for a Telegram Trigger instead. I tested both. The Drive approach works better for a shared team workflow. Telegram is better if you're the only one sending orders in.&lt;/p&gt;

&lt;h2&gt;Step 2: Sending the Image to Claude&lt;/h2&gt;

&lt;p&gt;Once the trigger fires and the file is downloaded, we send it to Claude's vision API using an HTTP Request node. The API endpoint is &lt;code&gt;POST https://api.anthropic.com/v1/messages&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The headers you need:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;{
  "x-api-key": "YOUR_ANTHROPIC_API_KEY",
  "anthropic-version": "2023-06-01",
  "content-type": "application/json"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;The body is where the prompt lives. Here's the exact prompt I used:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;{
  "model": "claude-opus-4-6",
  "max_tokens": 1024,
  "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "image",
          "source": {
            "type": "base64",
            "media_type": "image/jpeg",
            "data": "{{ $binary.data.toString('base64') }}"
          }
        },
        {
          "type": "text",
          "text": "This is a handwritten food order slip. Extract all order items, quantities, and any special instructions. Return the result as a JSON object with the following structure: { \"items\": [ { \"name\": \"\", \"quantity\": 0, \"notes\": \"\" } ], \"table\": \"\", \"timestamp\": \"\" }. Return only the JSON, no explanation."
        }
      ]
    }
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;h2&gt;Step 3: Parsing Claude's Response&lt;/h2&gt;

&lt;p&gt;Claude returns a response object. The actual content you want is nested inside it at:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;{{ $json.content[0].text }}&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;That string is your JSON order data, but it's still a string at this point. Add a Code node in n8n with this snippet to parse it cleanly:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;const raw = $input.first().json.content[0].text;
const parsed = JSON.parse(raw);
return [{ json: parsed }];
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;After this node your workflow has a clean JavaScript object with the order items, table number, and any special instructions.&lt;/p&gt;
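&lt;p&gt;One defensive refinement worth considering: models sometimes wrap the JSON in markdown fences or add a stray word despite the "return only the JSON" instruction, which makes a bare &lt;code&gt;JSON.parse&lt;/code&gt; throw. Slicing from the first brace to the last before parsing tolerates that (a sketch; the helper name is my own):&lt;/p&gt;

```javascript
// Tolerant parse for the Code node: extract the outermost JSON object
// from the model's text, ignoring any wrapping the model added.
function parseOrderJson(raw) {
  const start = raw.indexOf("{");
  const end = raw.lastIndexOf("}");
  if (start === -1 || end === -1) {
    throw new Error("No JSON object found in model output");
  }
  return JSON.parse(raw.slice(start, end + 1));
}
```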

&lt;h2&gt;Step 4: Writing to Google Sheets&lt;/h2&gt;

&lt;p&gt;Add a Google Sheets node set to Append Row. Connect your Google account, point it at your orders spreadsheet, and map the fields from the parsed output.&lt;/p&gt;

&lt;p&gt;Keep the columns simple to start; you can build on them later. For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Timestamp&lt;/li&gt;
&lt;li&gt;Table&lt;/li&gt;
&lt;li&gt;Item&lt;/li&gt;
&lt;li&gt;Quantity&lt;/li&gt;
&lt;li&gt;Notes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Because the AI returns an array of items, you may have multiple rows per order slip. Handle this with a Split In Batches node before the Sheets node: it loops through each item in the array and writes one row per item.&lt;/p&gt;
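&lt;p&gt;An equivalent alternative to Split In Batches is a small Code node that fans the parsed order out into one n8n item per row before the Sheets node. The field names match the JSON structure from the prompt; the column names are the ones listed above:&lt;/p&gt;

```javascript
// Expand one parsed order into one n8n item per line, each shaped as
// { json: {...} } so the Sheets node appends one row per order item.
function orderToRows(order) {
  return order.items.map(function (item) {
    return {
      json: {
        timestamp: order.timestamp,
        table: order.table,
        item: item.name,
        quantity: item.quantity,
        notes: item.notes,
      },
    };
  });
}
```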

&lt;h2&gt;What Tripped Me Up (and What I Learned)&lt;/h2&gt;

&lt;h3&gt;Google OAuth verification&lt;/h3&gt;

&lt;p&gt;When you first connect Google Drive and Google Sheets in n8n, Google flags the OAuth app as unverified if you're running a self-hosted instance. You'll see a warning screen. The fix is to click "Advanced" and then "Go to app". During development this is fine for a personal workflow.&lt;/p&gt;

&lt;h3&gt;Handwriting quality&lt;/h3&gt;

&lt;p&gt;AI vision handles surprisingly messy handwriting well, but very faint pencil or heavily smudged ink causes extraction errors. I added a simple validation step: if the returned items array is empty, the workflow sends a Telegram alert flagging the image for manual review.&lt;/p&gt;

&lt;p&gt;That's it!&lt;br&gt;
If you build this or adapt it for something else, drop a comment below.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>webdev</category>
      <category>programming</category>
      <category>productivity</category>
    </item>
  </channel>
</rss>
