Welcome to Day 12. Today we connect the two islands we've built: The Database (Memory) and the AI (Brain).
The Concept
We want to inject our DynamoDB data into the prompt before sending it to the Large Language Model (LLM). This allows the AI to give personalized answers based on our actual history.
The Code
I updated my Lambda function to use boto3 for both services.
1. Get Data from DynamoDB

```python
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('Transactions')  # assuming the table name from the earlier posts

# scan() reads the whole table -- fine for a small demo,
# but note it returns at most 1 MB of items per call.
response = table.scan()
items = response['Items']
```

2. Turn Data into Text

```python
context_str = ""
for item in items:
    context_str += f"- {item['description']}: {item['amount']}\n"
```

3. Send to Bedrock

```python
prompt = f"Analyze this data:\n{context_str}"
# ... invoke_model code ...
```
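The elided `invoke_model` step might look something like this. This is a sketch, not the post's exact code: it assumes the Titan Text Express model ID and the boto3 `bedrock-runtime` client, and the helper names are mine.

```python
import json


def build_titan_body(prompt: str) -> str:
    # Titan Text models expect an "inputText" field plus generation settings.
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {"maxTokenCount": 512, "temperature": 0.5},
    })


def ask_titan(prompt: str, model_id: str = "amazon.titan-text-express-v1") -> str:
    # Imported here so the sketch stays importable without AWS configured.
    import boto3

    bedrock = boto3.client("bedrock-runtime")
    response = bedrock.invoke_model(
        modelId=model_id,
        body=build_titan_body(prompt),
        contentType="application/json",
        accept="application/json",
    )
    # The response body is a stream; parse it and pull out the generated text.
    payload = json.loads(response["body"].read())
    return payload["results"][0]["outputText"]
```

Calling `ask_titan(prompt)` with the prompt built above returns the model's summary as a plain string.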
The Result
Now, when I run the function, Amazon Titan reads the transactions I inserted two days ago and summarizes them. This is the core logic of most "AI Agents" on the market today: Fetch Context -> Process -> Answer.
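That Fetch Context -> Process -> Answer loop can be sketched as one generic function with pluggable stages. The names here (`agent_answer` and the stub stages) are hypothetical, not part of the Lambda above:

```python
from typing import Callable, Iterable, Mapping


def agent_answer(
    fetch: Callable[[], Iterable[Mapping]],        # e.g. lambda: table.scan()['Items']
    render: Callable[[Iterable[Mapping]], str],    # items -> context text
    ask: Callable[[str], str],                     # e.g. a Bedrock invoke_model wrapper
    question: str,
) -> str:
    """Fetch Context -> Process -> Answer, with each stage swappable."""
    context = render(fetch())
    return ask(f"{question}\n\nContext:\n{context}")


# Stub stages to show the shape of the loop without touching AWS:
answer = agent_answer(
    fetch=lambda: [{"description": "Coffee", "amount": "4.50"}],
    render=lambda items: "".join(f"- {i['description']}: {i['amount']}\n" for i in items),
    ask=lambda prompt: prompt,  # echo instead of a real model call
    question="Analyze this data:",
)
```

Swapping the stubs for the real DynamoDB scan and Bedrock call gives you the Lambda from this post; swapping them for other sources gives you a different agent with the same skeleton.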
See you on Day 13!
