<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Eric Rodríguez</title>
    <description>The latest articles on DEV Community by Eric Rodríguez (@ericrodriguez10).</description>
    <link>https://dev.to/ericrodriguez10</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3677578%2Fcebf54ee-c6c0-436d-8570-7412b62d4321.jpeg</url>
      <title>DEV Community: Eric Rodríguez</title>
      <link>https://dev.to/ericrodriguez10</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ericrodriguez10"/>
    <language>en</language>
    <item>
      <title>Day 60: Decoupling State and CloudWatch FinOps</title>
      <dc:creator>Eric Rodríguez</dc:creator>
      <pubDate>Sat, 18 Apr 2026 16:00:00 +0000</pubDate>
      <link>https://dev.to/ericrodriguez10/day-60-decoupling-state-and-cloudwatch-finops-4g5f</link>
      <guid>https://dev.to/ericrodriguez10/day-60-decoupling-state-and-cloudwatch-finops-4g5f</guid>
      <description>&lt;p&gt;Today, on Day 60 of my 100 Days of Cloud challenge, I had to pause building new features to clean up some critical technical debt in my Serverless AI Financial Agent. When you are preparing an application to scale and handle real users, the little things you hardcoded during the sandbox phase will inevitably come back to haunt you. I faced two specific operational leaks today: one affecting my application's state, and the other quietly threatening my cloud billing.&lt;/p&gt;

&lt;p&gt;Fix 1: Decoupling Identity with Lambda Environment Variables&lt;/p&gt;

&lt;p&gt;My application was processing duplicate user reports, specifically sending two emails at the exact same time every afternoon. After thoroughly investigating Amazon DynamoDB and confirming the database was completely clean, I realized the issue was hiding inside the Lambda execution environment itself. During my initial testing weeks ago, I left a mock USER_ID hardcoded as a fallback in my Python logic.&lt;/p&gt;

&lt;p&gt;Because this hardcoded ID didn't match my real Amazon Cognito UUID in the database, the code generated a fake profile in memory and merged it with the real database records just before processing the SQS queue. The solution was to completely decouple the configuration from the code. I stripped the hardcoded ID from the Python script and injected the target user securely via AWS Lambda Environment Variables. Now, the code is dynamic, stateless, and ready to handle multiple tenants without identity collisions. Your configuration should always live outside your code.&lt;/p&gt;
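&lt;p&gt;As a minimal sketch (the variable name TARGET_USER_ID is illustrative, not my exact key), reading the identity from the environment instead of the code looks like this:&lt;/p&gt;

```python
import os

def get_target_user_id() -> str:
    # TARGET_USER_ID is an illustrative variable name, not the post's actual key.
    # Failing fast beats silently falling back to a mock profile in memory.
    user_id = os.environ.get("TARGET_USER_ID")
    if not user_id:
        raise RuntimeError("TARGET_USER_ID environment variable is not set")
    return user_id
```

&lt;p&gt;If the variable is missing, the Lambda fails loudly instead of inventing a fake profile.&lt;/p&gt;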

&lt;p&gt;Fix 2: The Infinite Log Trap in Amazon CloudWatch&lt;/p&gt;

&lt;p&gt;The second issue was a silent FinOps time bomb. If you use AWS Lambda, you know that it automatically logs all output to Amazon CloudWatch. However, you might not realize that by default, the retention policy for these Log Groups is set to "Never Expire".&lt;/p&gt;

&lt;p&gt;If you have a high-traffic application, retaining debug logs indefinitely will eventually result in a hefty and unnecessary storage bill. I navigated to the CloudWatch console and changed the retention policy for my Lambda functions to 14 days. This quick 30-second fix acts as an automated garbage collector. It gives me a comfortable two-week sliding window to troubleshoot any bugs, while AWS automatically destroys the useless historical text logs before I have to pay for storing them.&lt;/p&gt;

&lt;p&gt;Architecture is not just about what you build, but also about what you actively choose not to keep. Never hardcode your state, and never keep your logs forever!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv8x3psrek5fnpfzblgq9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv8x3psrek5fnpfzblgq9.png" alt=" " width="800" height="408"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>serverless</category>
      <category>finops</category>
      <category>python</category>
    </item>
    <item>
      <title>Day 59: Fixing Race Conditions with DynamoDB Atomic Locks 🔒</title>
      <dc:creator>Eric Rodríguez</dc:creator>
      <pubDate>Fri, 17 Apr 2026 15:00:00 +0000</pubDate>
      <link>https://dev.to/ericrodriguez10/day-59-fixing-race-conditions-with-dynamodb-atomic-locks-35mm</link>
      <guid>https://dev.to/ericrodriguez10/day-59-fixing-race-conditions-with-dynamodb-atomic-locks-35mm</guid>
      <description>&lt;p&gt;Are your AWS Lambdas doing things twice? Here is how to fix it.&lt;/p&gt;

&lt;p&gt;Today, my SQS queue delivered duplicate messages, and my Lambda function sent two emails to the same user. My initial code checked if a lock existed before writing it, but a classic "Race Condition" bypassed it.&lt;/p&gt;

&lt;p&gt;The Fix: Use attribute_not_exists in DynamoDB.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import time
import botocore

email_lock_key = f"email_report_{today}_{user_id}"

try:
    # ATOMIC OPERATION: Write ONLY if it doesn't exist
    cache_table.put_item(
        Item={
            'cache_key': email_lock_key,
            'status': 'sent',
            'ttl': int(time.time()) + 86400,  # 24h expiration
            'user_id': user_id
        },
        ConditionExpression='attribute_not_exists(cache_key)'
    )
except botocore.exceptions.ClientError as e:
    if e.response['Error']['Code'] == 'ConditionalCheckFailedException':
        print("Race condition intercepted! Lock already exists.")
        # Abort the duplicate process here
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;By doing this, you let DynamoDB act as the absolute source of truth. The hardest part? Having to manually delete these cache keys in the AWS Console just to be able to test the system again!&lt;/p&gt;
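&lt;p&gt;That manual cleanup could also be scripted. A tiny hypothetical helper (not part of the original Lambda; the key name matches the lock key above) might look like:&lt;/p&gt;

```python
def release_email_lock(cache_table, email_lock_key: str):
    # Hypothetical helper (not in the post): clears the idempotency lock so the
    # flow can be re-tested without deleting keys by hand in the AWS Console.
    # cache_table is a boto3 DynamoDB Table resource, passed in for testability.
    cache_table.delete_item(Key={"cache_key": email_lock_key})
```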

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4zytmiwro8mukmnyudqx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4zytmiwro8mukmnyudqx.png" alt=" " width="800" height="408"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>python</category>
      <category>dynamodb</category>
      <category>architecture</category>
    </item>
    <item>
      <title>Day 58: Fix your Fintech app's calendar logic (The Payroll Offset)</title>
      <dc:creator>Eric Rodríguez</dc:creator>
      <pubDate>Thu, 16 Apr 2026 15:00:00 +0000</pubDate>
      <link>https://dev.to/ericrodriguez10/day-58-fix-your-fintech-apps-calendar-logic-the-payroll-offset-2kl</link>
      <guid>https://dev.to/ericrodriguez10/day-58-fix-your-fintech-apps-calendar-logic-the-payroll-offset-2kl</guid>
      <description>&lt;p&gt;When building AI-powered apps, your standard notification logic needs a serious rethink.&lt;/p&gt;

&lt;p&gt;Today, I wanted my AI Financial Agent to send a daily morning email to all users by default.&lt;/p&gt;

&lt;p&gt;The Problem: Traditional B2C apps send static templates (cheap). My app uses Amazon Bedrock to generate custom semantic text (expensive). If I blindly loop through 500 inactive or test users and call Bedrock 500 times just to send emails they will never read, my AWS bill will explode. 💥&lt;/p&gt;

&lt;p&gt;The Fix: I added a "FinOps" guardrail in my EventBridge Orchestrator Lambda before it pushes tasks to the SQS worker queue.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Before sending to SQS for AI processing, filter out the noise.
valid_users = []
for u in user_metadata_list:
    # 1. Product Decision: Opt-out approach
    wants_email = u.get('wants_daily_email', True)

    # 2. FinOps Decision: Drop ghost/test accounts
    is_test_account = str(u.get('user_id', '')).startswith('test_')

    if wants_email and not is_test_account:
        valid_users.append(u)

# ONLY push valid, engaged users to the expensive AI Worker queue
for user in valid_users:
    sqs.send_message(QueueUrl=SQS_URL, MessageBody=json.dumps({"user_id": user['user_id']}))
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;By filtering test_ accounts and respecting user Opt-Outs at the router level, my AI Worker Lambda only invokes the LLM for users that actually matter.&lt;/p&gt;

&lt;p&gt;Code for engagement, but architect for cost!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzi6ep3c1f4vtqd3twz7z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzi6ep3c1f4vtqd3twz7z.png" alt=" " width="800" height="410"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>python</category>
      <category>fintech</category>
      <category>serverless</category>
    </item>
    <item>
      <title>Day 58: Don't let GenAI bankrupt your Serverless App</title>
      <dc:creator>Eric Rodríguez</dc:creator>
      <pubDate>Wed, 15 Apr 2026 15:00:00 +0000</pubDate>
      <link>https://dev.to/ericrodriguez10/day-58-dont-let-genai-bankrupt-your-serverless-app-2gnj</link>
      <guid>https://dev.to/ericrodriguez10/day-58-dont-let-genai-bankrupt-your-serverless-app-2gnj</guid>
      <description>&lt;p&gt;When building AI-powered apps, be very careful with cron jobs.&lt;/p&gt;

&lt;p&gt;Today I wanted to make my AI Financial Agent send a daily morning email to all users by default (like traditional finance apps do).&lt;/p&gt;

&lt;p&gt;The Problem: Traditional apps send static templates. My app uses Amazon Bedrock to generate custom text. If I loop through 500 inactive users and call Bedrock 500 times just to send emails they won't read, my AWS bill will explode. 💥&lt;/p&gt;

&lt;p&gt;The Fix: I added a "FinOps" filter in my EventBridge orchestrator before it sends tasks to the SQS queue.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Before sending to SQS for AI processing, filter out the noise.
valid_users = []
for u in all_users:
    wants_email = u.get('wants_daily_email', True)  # Opt-out approach
    is_test = str(u.get('user_id', '')).startswith('test_')

    if wants_email and not is_test:
        valid_users.append(u)
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;By filtering test_ accounts and respecting user Opt-Outs before the SQS queue, my AI worker Lambda only invokes Bedrock for users that actually matter. Code for engagement, architect for cost!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5xqw57ahsd82wv1hmank.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5xqw57ahsd82wv1hmank.png" alt=" " width="800" height="481"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>serverless</category>
      <category>python</category>
      <category>finops</category>
    </item>
    <item>
      <title>Day 57: Dynamic HTML Emails in AWS Lambda (FinTech UX) 🎨</title>
      <dc:creator>Eric Rodríguez</dc:creator>
      <pubDate>Tue, 14 Apr 2026 15:00:00 +0000</pubDate>
      <link>https://dev.to/ericrodriguez10/day-57-dynamic-html-emails-in-aws-lambda-fintech-ux-2n3f</link>
      <guid>https://dev.to/ericrodriguez10/day-57-dynamic-html-emails-in-aws-lambda-fintech-ux-2n3f</guid>
      <description>&lt;p&gt;Financial APIs like Plaid are great, but their data structures are built for accountants, not normal humans. &lt;/p&gt;

&lt;p&gt;Today, I rewrote the notification engine for my AI Agent. &lt;/p&gt;

&lt;h3&gt;
  
  
  The Problem
&lt;/h3&gt;

&lt;p&gt;My Lambda was sending emails that looked like a raw JSON dump. Refunds appeared as negative numbers (&lt;code&gt;-4.22€&lt;/code&gt;), and expenses were mixed with income. &lt;/p&gt;

&lt;p&gt;The Fix&lt;br&gt;
I built a dynamic HTML builder in Python that conditionally renders sections based on the day's payload:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Separate transactions and fix signs
for t in transactions:
    amount = float(t['amount'])
    if amount &amp;lt; 0 or is_income_keyword(t['description']):
        income_txs.append(t)
    else:
        expense_txs.append(t)

# Conditionally render HTML
if income_txs:
    html += "&amp;lt;h4&amp;gt;Income / Credits&amp;lt;/h4&amp;gt;"
    # Format negative API floats to positive UX strings
    html += build_rows(abs(amount), color="#10b981")
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Now, the emails are cleanly divided, visually color-coded, and highly readable. Plus, the AWS EventBridge orchestrator now perfectly balances hourly SMS alerts with the 9:00 AM daily SQS report.&lt;/p&gt;

&lt;p&gt;Never underestimate the power of formatting your data before passing it to the user (or to the LLM!).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw3bato7ba8gigf7w76an.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw3bato7ba8gigf7w76an.png" alt=" " width="800" height="692"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>python</category>
      <category>ux</category>
      <category>serverless</category>
    </item>
    <item>
      <title>Day 56: Beating LLM Latency with Amazon SQS Decoupling ⚡</title>
      <dc:creator>Eric Rodríguez</dc:creator>
      <pubDate>Mon, 13 Apr 2026 15:00:00 +0000</pubDate>
      <link>https://dev.to/ericrodriguez10/day-56-beating-llm-latency-with-amazon-sqs-decoupling-1fac</link>
      <guid>https://dev.to/ericrodriguez10/day-56-beating-llm-latency-with-amazon-sqs-decoupling-1fac</guid>
      <description>&lt;p&gt;If you are building apps with LLMs, you already know the pain: generation takes seconds, and users hate waiting. &lt;/p&gt;

&lt;p&gt;Today, I fixed the latency of my AI Financial Agent by implementing the &lt;strong&gt;Asynchronous Worker Pattern&lt;/strong&gt; using Amazon SQS.&lt;/p&gt;

&lt;p&gt;The Problem&lt;br&gt;
My AWS Lambda was running synchronously:&lt;br&gt;
Fetch Bank Data -&amp;gt; Run Heavy AI Prompt -&amp;gt; Generate Email -&amp;gt; Return UI Data. &lt;br&gt;
This caused 5+ second load times and occasional API timeouts.&lt;/p&gt;

&lt;p&gt;The Solution&lt;br&gt;
I split my Python lambda_handler to detect the event source. &lt;/p&gt;

&lt;p&gt;If the request comes from API Gateway (React Frontend), it bypasses the heavy AI email generation completely and returns the dashboard data instantly.&lt;/p&gt;

&lt;p&gt;If the request comes from my EventBridge daily cronjob, it acts as a Fan-Out orchestrator:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Scan DynamoDB and queue the heavy work
for user in users:
    sqs.send_message(
        QueueUrl=SQS_QUEUE_URL,
        MessageBody=json.dumps({"task": "daily_report", "user_id": user['user_id']})
    )
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Then, SQS automatically invokes the Lambda in the background to handle the heavy Bedrock processing and SES email delivery.&lt;/p&gt;
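&lt;p&gt;The split described above can be sketched as a single handler that branches on the event source (field names follow the standard AWS event shapes; the heavy-work calls are stubbed out, so this is a sketch, not my verbatim handler):&lt;/p&gt;

```python
import json

def lambda_handler(event, context):
    # Sketch only: routes by event source; heavy Bedrock/SES work is stubbed.
    records = event.get("Records", [])
    if records and records[0].get("eventSource") == "aws:sqs":
        # SQS worker path: heavy Bedrock + SES work, no user waiting
        for record in records:
            task = json.loads(record["body"])
            # process_daily_report(task["user_id"])  # hypothetical worker call
        return {"status": "worker"}
    if event.get("source") == "aws.events":
        # EventBridge cron path: fan out one SQS message per user
        return {"status": "fan-out"}
    # API Gateway / Function URL path: return dashboard data instantly
    return {"statusCode": 200, "body": json.dumps({"status": "dashboard"})}
```

&lt;p&gt;The same deployment serves three roles, but the user-facing path never waits on the LLM.&lt;/p&gt;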

&lt;p&gt;By decoupling the architecture, UI latency dropped by over 70%. Stop making your users wait for background tasks! Have you implemented SQS in your serverless apps yet?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs517cn19w59cj5la0jgg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs517cn19w59cj5la0jgg.png" alt=" " width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>serverless</category>
      <category>python</category>
      <category>architecture</category>
    </item>
    <item>
      <title>Day 55: Single Table Design for User Profiles in DynamoDB</title>
      <dc:creator>Eric Rodríguez</dc:creator>
      <pubDate>Sat, 11 Apr 2026 16:00:00 +0000</pubDate>
      <link>https://dev.to/ericrodriguez10/day-55-single-table-design-for-user-profiles-in-dynamodb-1696</link>
      <guid>https://dev.to/ericrodriguez10/day-55-single-table-design-for-user-profiles-in-dynamodb-1696</guid>
      <description>&lt;p&gt;Hardcoding variables is a developer habit. Building user configuration is a Product mindset. 🛠️&lt;/p&gt;

&lt;p&gt;Up until today, my Serverless AI Financial Agent suffered from the classic "Minimum Viable Product" disease: hardcoded assumptions. The AI assumed a fixed €15 daily budget for everyone to maintain their savings streak, and it lazily parsed usernames directly from their Cognito JWT Token emails (e.g., turning &lt;code&gt;ericridri11@gmail.com&lt;/code&gt; into "Ericridri11"). &lt;/p&gt;

&lt;p&gt;It worked for a proof of concept, but as a user experience? It was terrible.&lt;/p&gt;

&lt;p&gt;For Day 55 of my #100DaysOfCloud challenge, it was time to mature the architecture and build a Stateful User Configuration Panel. The users need to take the wheel.&lt;/p&gt;

&lt;p&gt;The Challenge: Database Bloat&lt;/p&gt;

&lt;p&gt;The immediate thought of any developer is: &lt;em&gt;"I need to save user preferences. Let's spin up a &lt;code&gt;UserPreferences&lt;/code&gt; table in the database."&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In relational databases (SQL), that's standard practice. But in the cloud, specifically with NoSQL databases like Amazon DynamoDB, compute is cheap but IOPS (Input/Output Operations Per Second) and maintaining multiple tables cost money and operational overhead. &lt;/p&gt;

&lt;p&gt;How do we store custom user data without bloating our infrastructure?&lt;/p&gt;

&lt;p&gt;The Solution: Single Table Design 🧠&lt;/p&gt;

&lt;p&gt;Instead of provisioning a completely new DynamoDB table, I implemented an advanced NoSQL pattern called &lt;strong&gt;Single Table Design&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;I overloaded my existing &lt;code&gt;FinanceAgent-Transactions&lt;/code&gt; table. By querying a specific sort key formatted as &lt;code&gt;PROFILE#{user_id}&lt;/code&gt;, I can retrieve a metadata record that acts as a container for user preferences, living right alongside their financial transactions.&lt;/p&gt;

&lt;p&gt;With this single record, I can now store:&lt;br&gt;
✅ Display Name (No more email parsing).&lt;br&gt;
✅ Custom Daily Budget Goal (Dynamic streak calculation).&lt;br&gt;
✅ Monthly Salary (For better FinOps forecasting).&lt;br&gt;
✅ AI Personality Tone (Users can choose "Brutal", "Sarcastic", or "Polite").&lt;/p&gt;

&lt;p&gt;Fun fact on the "Polite" tone: The app's core identity is "Tough Love". If a user selects "Polite", the system prompt instructs the AI to subtly mock them for being emotionally fragile and unable to handle the truth. The AI never loses! 🤖&lt;/p&gt;

&lt;p&gt;The Lambda Router&lt;/p&gt;

&lt;p&gt;Since my entire backend runs on a single AWS Lambda Function URL, I had to upgrade my Python code to act as a basic RESTful router. &lt;/p&gt;

&lt;p&gt;I added logic to intercept the HTTP method from the &lt;code&gt;requestContext&lt;/code&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;When the user hits "Save" on the React Frontend, a &lt;code&gt;POST&lt;/code&gt; request is fired. The Lambda parses the JSON payload and updates the &lt;code&gt;PROFILE&lt;/code&gt; record in DynamoDB.&lt;/li&gt;
&lt;li&gt;When a standard &lt;code&gt;GET&lt;/code&gt; request is made to load the dashboard, the Lambda fetches this profile first, and dynamically injects the user's real name and requested tone directly into the Amazon Nova Micro system prompt.&lt;/li&gt;
&lt;/ul&gt;
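&lt;p&gt;A minimal sketch of that routing (the event shape follows the Lambda Function URL payload; the DynamoDB write is stubbed, and the return bodies are illustrative):&lt;/p&gt;

```python
import json

def route(event):
    # Sketch of the method routing described above; the event shape follows
    # the Lambda Function URL payload, and the DynamoDB calls are stubbed.
    method = event["requestContext"]["http"]["method"]
    if method == "POST":
        prefs = json.loads(event.get("body") or "{}")
        # table.put_item(Item={...})  # update the PROFILE#{user_id} record
        return {"statusCode": 200, "body": json.dumps({"saved": sorted(prefs)})}
    # GET: fetch the profile first, then inject name/tone into the system prompt
    return {"statusCode": 200, "body": json.dumps({"profile": "loaded"})}
```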

&lt;p&gt;Here is a snippet of how I implemented the Profile extraction in Python using &lt;code&gt;boto3&lt;/code&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def get_user_profile(user_id, default_name):
    profile_id = f"PROFILE#{user_id}"
    try:
        response = table.get_item(Key={'user_id': user_id, 'transaction_date': profile_id})
        if 'Item' in response:
            return response['Item']
    except Exception as e:
        logger.error("Error fetching profile", extra={"details": str(e)})

    # Return default schema if no profile exists yet
    return {
        'user_id': user_id,
        'transaction_date': profile_id,
        'display_name': default_name,
        'daily_budget': 15.00,
        'ai_tone': 'brutal'
    }
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;The Lesson&lt;/p&gt;

&lt;p&gt;Building the engine is only half the battle. Giving the driver a steering wheel is what makes it a real application. Think about your data access patterns before provisioning new cloud resources. Overloading NoSQL tables is an absolute superpower for keeping costs down while scaling up features.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F10rehzanzce6k97g0dlc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F10rehzanzce6k97g0dlc.png" alt=" " width="800" height="484"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>dynamodb</category>
      <category>serverless</category>
      <category>react</category>
    </item>
    <item>
      <title>Day 54: Giving an LLM Long-Term Memory with DynamoDB</title>
      <dc:creator>Eric Rodríguez</dc:creator>
      <pubDate>Fri, 10 Apr 2026 16:00:00 +0000</pubDate>
      <link>https://dev.to/ericrodriguez10/day-54-giving-an-llm-long-term-memory-with-dynamodb-16n5</link>
      <guid>https://dev.to/ericrodriguez10/day-54-giving-an-llm-long-term-memory-with-dynamodb-16n5</guid>
      <description>&lt;p&gt;One of the biggest limitations of stateless serverless applications using LLMs (like Amazon Bedrock or OpenAI) is amnesia. Every API call starts from zero.&lt;/p&gt;

&lt;p&gt;Today, I built a persistent memory engine for my AI Financial Agent so it remembers a user's behavior from previous months.&lt;/p&gt;

&lt;p&gt;The Architecture:&lt;br&gt;
I created a DynamoDB table with user_id (from Cognito JWT) as the Partition Key and month_year as the Sort Key.&lt;/p&gt;

&lt;p&gt;Before calling Bedrock, my Python Lambda fetches last month's summary:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def get_user_memory(user_id, prev_month_str):
    # user_id is the Partition Key, month_year the Sort Key
    # (the table handle and the fallback string are illustrative)
    response = memory_table.get_item(Key={'user_id': user_id, 'month_year': prev_month_str})
    return response.get('Item', {}).get('memory', 'No previous data.')
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;I then inject this into the LLM prompt:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;PREVIOUS MONTH MEMORY:
"{past_memory}"
Analyze the current transactions and compare them to the past memory.
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;The crucial part? I force the LLM to output a memory_for_next_month string in its JSON response.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "score_feedback": "You spent 50€ less on eating out than last month. Good job.",
  "memory_for_next_month": "User successfully cut back on dining expenses but increased tech spending."
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Lambda intercepts this key and saves it back to DynamoDB using the current month's timestamp. It creates an infinite loop of context, turning a basic API wrapper into a continuous AI companion!&lt;/p&gt;
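&lt;p&gt;That save-back step can be sketched like this (key names follow the schema above; the table handle is passed in for testability, so this is not my verbatim Lambda code):&lt;/p&gt;

```python
from datetime import date

def save_user_memory(table, user_id: str, memory: str):
    # Sketch of closing the memory loop; `table` is a boto3 DynamoDB Table
    # resource, and the key names follow the schema described above.
    month_year = date.today().strftime("%Y-%m")
    table.put_item(Item={
        "user_id": user_id,        # Partition Key (Cognito sub)
        "month_year": month_year,  # Sort Key
        "memory": memory,          # the LLM's memory_for_next_month string
    })
    return month_year
```

&lt;p&gt;Next month, get_user_memory reads this record back, and the loop continues.&lt;/p&gt;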

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foi5tj40gvjsjsdphm1um.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foi5tj40gvjsjsdphm1um.png" alt=" " width="800" height="410"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>python</category>
      <category>ai</category>
      <category>serverless</category>
    </item>
    <item>
      <title>Day 53: CI/CD for React on AWS S3 &amp; CloudFront (No Access Keys!) 🚀</title>
      <dc:creator>Eric Rodríguez</dc:creator>
      <pubDate>Thu, 09 Apr 2026 16:00:00 +0000</pubDate>
      <link>https://dev.to/ericrodriguez10/day-53-cicd-for-react-on-aws-s3-cloudfront-no-access-keys-34lm</link>
      <guid>https://dev.to/ericrodriguez10/day-53-cicd-for-react-on-aws-s3-cloudfront-no-access-keys-34lm</guid>
      <description>&lt;p&gt;Deploying React to S3 manually gets old fast. If you are still dragging and dropping folders into the AWS Console, it's time to stop.&lt;/p&gt;

&lt;p&gt;Today, I built a GitHub Actions pipeline that builds my React app, syncs it to S3, and clears the CloudFront cache automatically. Best part? Zero static AWS credentials.&lt;/p&gt;

&lt;p&gt;The Workflow&lt;/p&gt;

&lt;p&gt;Assuming you already have an OIDC Identity Provider set up in AWS IAM (which you should, to avoid storing Access Keys in GitHub Secrets), here is the workflow I wrote today:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: Deploy Frontend
on:
  push:
    branches: [ main ]
    paths: [ 'src/**', 'App.tsx', 'package.json' ]

permissions:
  id-token: write
  contents: read

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with: { node-version: '20' }

      - run: npm ci
      - run: npm run build
        env:
          VITE_AWS_REGION: "eu-north-1"
          VITE_USER_POOL_ID: "your-pool-id"

      - name: Configure AWS Credentials via OIDC
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::1234567890:role/GitHubDeployRole
          aws-region: eu-north-1

      - name: Sync to S3
        run: aws s3 sync dist/ s3://my-react-bucket --delete

      - name: Invalidate CloudFront
        run: aws cloudfront create-invalidation --distribution-id E1ABCD2345 --paths "/*"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
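&lt;p&gt;As an aside, for the role-to-assume step to work, the IAM role's trust policy for GitHub's OIDC provider looks roughly like this (account ID and repo path are placeholders):&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Federated": "arn:aws:iam::1234567890:oidc-provider/token.actions.githubusercontent.com"
      },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
        },
        "StringLike": {
          "token.actions.githubusercontent.com:sub": "repo:your-org/your-repo:ref:refs/heads/main"
        }
      }
    }
  ]
}
```

&lt;p&gt;The sub condition is what scopes the role to a single repo and branch, so a workflow in any other repository cannot assume it.&lt;/p&gt;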

&lt;p&gt;Why the --delete flag?&lt;/p&gt;

&lt;p&gt;Using aws s3 sync with --delete is crucial. It removes old JavaScript chunks from S3 that were generated in previous builds, preventing your bucket from growing infinitely with dead code.&lt;/p&gt;

&lt;p&gt;My deployments now take 45 seconds and require zero clicks in the AWS console. Automate everything!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn3srsm7r4nyqi7da6y9s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn3srsm7r4nyqi7da6y9s.png" alt=" " width="800" height="439"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>react</category>
      <category>devops</category>
      <category>github</category>
    </item>
    <item>
      <title>Day 52: How to build a Split-Screen Login &amp; Parse JWTs in AWS Lambda 🔐</title>
      <dc:creator>Eric Rodríguez</dc:creator>
      <pubDate>Wed, 08 Apr 2026 14:00:00 +0000</pubDate>
      <link>https://dev.to/ericrodriguez10/day-52-how-to-build-a-split-screen-login-parse-jwts-in-aws-lambda-47b1</link>
      <guid>https://dev.to/ericrodriguez10/day-52-how-to-build-a-split-screen-login-parse-jwts-in-aws-lambda-47b1</guid>
      <description>&lt;p&gt;I hated the default Amazon Cognito login screen. It looked like a dashboard from 2015.&lt;/p&gt;

&lt;p&gt;Today, I decided to build a custom Split-Screen Auth UI for my React app and update my AWS Lambda backend to actually read the JSON Web Tokens (JWT) sent by the frontend. Here is exactly how I did it.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The Frontend: Split-Screen Auth with Amplify&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Instead of using the standard &amp;lt;Authenticator&amp;gt; wrapper, which forces a centered card, I used Authenticator.Provider. This gives you access to the useAuthenticator hook so you can render the login form exactly where you want it.&lt;/p&gt;

&lt;p&gt;I built a two-column layout using Tailwind CSS. Left side: Branding. Right side: Login.&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { Authenticator, useAuthenticator } from '@aws-amplify/ui-react';

const CustomAuthWrapper = () =&amp;gt; {
  const { authStatus } = useAuthenticator((context) =&amp;gt; [context.authStatus]);

  if (authStatus === 'authenticated') return &amp;lt;App /&amp;gt;; // App: your signed-in view (name is illustrative)

  return (
    &amp;lt;div className="flex h-screen"&amp;gt;
      {/* Left Column: Branding */}
      &amp;lt;div className="w-1/2"&amp;gt;
        &amp;lt;h1&amp;gt;FinAI Agent&amp;lt;/h1&amp;gt;
      &amp;lt;/div&amp;gt;

      {/* Right Column: Auth Form */}
      &amp;lt;div className="w-1/2 flex items-center justify-center"&amp;gt;
        &amp;lt;Authenticator /&amp;gt;
      &amp;lt;/div&amp;gt;
    &amp;lt;/div&amp;gt;
  );
};
&lt;/code&gt;&lt;/pre&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Injecting the Bearer Token&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;To make the backend aware of who is calling it, we need to send the Cognito JWT. I updated my fetch calls in React to grab the session token dynamically:&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { fetchAuthSession } from 'aws-amplify/auth';

const { tokens } = await fetchAuthSession();
const jwtToken = tokens?.idToken?.toString();

const response = await fetch(API_URL, {
  headers: { 'Authorization': `Bearer ${jwtToken}` }
});
&lt;/code&gt;&lt;/pre&gt;

&lt;ol start="3"&gt;
&lt;li&gt;The Backend: Parsing JWT in Python (AWS Lambda)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;API Gateway handles the heavy lifting of verifying the signature, but I still needed my Lambda function to know who the user was to query DynamoDB correctly. I wrote a small base64 decoding snippet to extract the sub (user ID) and email.&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import base64
import json

def lambda_handler(event, context):
    headers = event.get('headers', {})
    auth_header = headers.get('authorization', headers.get('Authorization', ''))

    if auth_header.startswith('Bearer '):
        token = auth_header.split(' ')[1]
        payload_b64 = token.split('.')[1]
        payload_b64 += '=' * (-len(payload_b64) % 4)  # restore stripped padding

        # JWT segments are base64url-encoded, so use the urlsafe decoder
        payload = json.loads(base64.urlsafe_b64decode(payload_b64).decode('utf-8'))

        user_id = payload.get('sub')
        user_email = payload.get('email')

        print(f"Authenticated: {user_email} (ID: {user_id})")
&lt;/code&gt;&lt;/pre&gt;
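&lt;p&gt;The same decode works outside Lambda. Here is a self-contained demo with a hand-built token (the header and signature segments are dummies; urlsafe decoding is used because JWT segments are base64url-encoded):&lt;/p&gt;

```python
import base64
import json

def decode_jwt_payload(token):
    # Decode the middle (payload) segment of a JWT without verifying it;
    # in the article's setup, API Gateway has already checked the signature.
    payload_b64 = token.split('.')[1]
    payload_b64 += '=' * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64).decode('utf-8'))

# Hand-built token: only the payload segment matters for this demo
claims = {'sub': 'abc-123', 'email': 'user@example.com'}
payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip('=')
token = f'header.{payload}.signature'
print(decode_jwt_payload(token))  # → {'sub': 'abc-123', 'email': 'user@example.com'}
```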

&lt;p&gt;My Serverless app is now officially Multi-Tenant! Every user gets their own DynamoDB sandbox, and the AI greets them by their real name. 🚀&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg5gw478hgudyn31utxmo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg5gw478hgudyn31utxmo.png" alt=" " width="800" height="483"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>react</category>
      <category>python</category>
      <category>security</category>
    </item>
    <item>
      <title>Day 51: I stopped building Login pages manually 🛑🔑</title>
      <dc:creator>Eric Rodríguez</dc:creator>
      <pubDate>Tue, 07 Apr 2026 16:00:00 +0000</pubDate>
      <link>https://dev.to/ericrodriguez10/day-51-i-stopped-building-login-pages-manually-2ilc</link>
      <guid>https://dev.to/ericrodriguez10/day-51-i-stopped-building-login-pages-manually-2ilc</guid>
      <description>&lt;p&gt;I am officially done writing &lt;/p&gt; tags for login and registration.

&lt;p&gt;Today (Day 51 of my cloud journey), I connected my React application to Amazon Cognito using AWS Amplify UI.&lt;/p&gt;

&lt;p&gt;With just a few lines of configuration pointing to my User Pool ID and Client ID, I wrapped my app in the &amp;lt;Authenticator&amp;gt; component.&lt;/p&gt;

&lt;p&gt;What it gave me out of the box:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A clean UI for Sign-In and Sign-Up.&lt;/li&gt;
&lt;li&gt;Automatic verification code emails (OTP).&lt;/li&gt;
&lt;li&gt;Secure JWT session management in local storage.&lt;/li&gt;
&lt;li&gt;A signOut hook to instantly kill the session.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you are a frontend developer building on AWS, using the Amplify UI library is practically a superpower. My dashboard is now completely locked down.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ynughr08k9cb1gifurl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ynughr08k9cb1gifurl.png" alt=" " width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>react</category>
      <category>aws</category>
      <category>security</category>
      <category>frontend</category>
    </item>
  </channel>
</rss>
