<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Faye Ellis</title>
    <description>The latest articles on DEV Community by Faye Ellis (@faye_ellis).</description>
    <link>https://dev.to/faye_ellis</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1349642%2F0480276a-3f4f-4d95-a5f2-6d9c19e644cf.JPG</url>
      <title>DEV Community: Faye Ellis</title>
      <link>https://dev.to/faye_ellis</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/faye_ellis"/>
    <language>en</language>
    <item>
      <title>Tutorial: Secure Serverless RAG in 5 Minutes with Amazon Bedrock + S3 Vector Store</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Mon, 16 Feb 2026 21:23:38 +0000</pubDate>
      <link>https://dev.to/aws-builders/tutorial-secure-serverless-rag-in-5-minutes-with-amazon-bedrock-s3-vector-store-3b77</link>
      <guid>https://dev.to/aws-builders/tutorial-secure-serverless-rag-in-5-minutes-with-amazon-bedrock-s3-vector-store-3b77</guid>
      <description>&lt;p&gt;While preparing for the &lt;a href="https://aws.amazon.com/certification/certified-generative-ai-developer-professional/" rel="noopener noreferrer"&gt;AWS Certified Generative AI Developer - Professional Exam&lt;/a&gt; I wanted to get some more hands-on experience with &lt;strong&gt;Guardrails&lt;/strong&gt;, to get a better understanding of exactly what can be done with them.&lt;/p&gt;

&lt;p&gt;Here's a five minute tutorial to quickly create an &lt;strong&gt;Amazon Bedrock Knowledge Base&lt;/strong&gt; so you can easily explore &lt;strong&gt;Guardrails&lt;/strong&gt; and close any knowledge gaps!&lt;/p&gt;

&lt;h2&gt;
  
  
  1. Create an S3 bucket
&lt;/h2&gt;

&lt;p&gt;Create an &lt;strong&gt;S3 bucket&lt;/strong&gt; and upload the documents that you want to include in the &lt;strong&gt;Knowledge Base&lt;/strong&gt;. I used a fake &lt;a href="https://github.com/fayekins/genai-dev/blob/main/Ralphie_Bank_Corporate_Strategy.pdf" rel="noopener noreferrer"&gt;corporate strategy document&lt;/a&gt; for a fictitious company, Ralphie Bank. Feel free to use it if you don't already have a suitable document! &lt;/p&gt;
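&lt;p&gt;If you'd rather script this step, here's a minimal boto3 sketch. The bucket name, region, and file path are hypothetical placeholders, not values from this tutorial:&lt;/p&gt;

```python
# Sketch only: create the source bucket and upload a document for the
# Knowledge Base to ingest. Bucket name and region are placeholders.
import os

def upload_plan(bucket, paths):
    """Map local file paths to (local_path, bucket, key) upload targets."""
    return [(p, bucket, os.path.basename(p)) for p in paths]

def main():
    import boto3  # requires AWS credentials to be configured
    s3 = boto3.client("s3", region_name="us-east-1")
    bucket = "ralphie-bank-kb-source"  # hypothetical; bucket names are globally unique
    s3.create_bucket(Bucket=bucket)  # outside us-east-1, also pass CreateBucketConfiguration
    for local_path, target_bucket, key in upload_plan(bucket, ["Ralphie_Bank_Corporate_Strategy.pdf"]):
        s3.upload_file(local_path, target_bucket, key)

# Call main() yourself once credentials are in place.
```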

&lt;h2&gt;
  
  
  2. Create a Knowledge Base
&lt;/h2&gt;

&lt;p&gt;In the &lt;strong&gt;Bedrock&lt;/strong&gt; console, click &lt;strong&gt;Knowledge Bases&lt;/strong&gt;, and &lt;strong&gt;Create Knowledge Base&lt;/strong&gt;. Select &lt;strong&gt;S3 vector store&lt;/strong&gt; as the type.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgio0korrtyn0ka1ykb7u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgio0korrtyn0ka1ykb7u.png" alt="Create Knowledge Base. Select S3 vector store as the type." width="800" height="255"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Data source type is &lt;strong&gt;Amazon S3&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxcf76wetwfcq4okhqcn0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxcf76wetwfcq4okhqcn0.png" alt="Data source type is Amazon S3" width="800" height="304"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then click &lt;strong&gt;Next&lt;/strong&gt; and &lt;strong&gt;Browse S3&lt;/strong&gt; to select the bucket where your data is stored. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fya5juyftrz33yphi749f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fya5juyftrz33yphi749f.png" alt="browse S3 to select the bucket" width="800" height="231"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Before proceeding, you have the opportunity to select a chunking strategy from the range of strategies natively provided in &lt;strong&gt;Bedrock&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdhc4b8czwi1p31mku80z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdhc4b8czwi1p31mku80z.png" alt="select a chunking strategy" width="800" height="487"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There's also the option to configure a &lt;strong&gt;Lambda&lt;/strong&gt; function to perform custom chunking or data processing at this stage. &lt;/p&gt;

&lt;p&gt;After that, click &lt;strong&gt;Next&lt;/strong&gt; and select an embeddings model; this is the model that will generate the vector embeddings stored in the &lt;strong&gt;Knowledge Base&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffkqrcmxz0f1lagxyx454.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffkqrcmxz0f1lagxyx454.png" alt="Select an embeddings model" width="800" height="520"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the type of vector store you want to create. I used &lt;strong&gt;S3 Vectors&lt;/strong&gt;; it's much cheaper than &lt;strong&gt;OpenSearch Serverless&lt;/strong&gt; and fine for what I need. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ohsxjt5m917xuo0fm2e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ohsxjt5m917xuo0fm2e.png" alt="Select the type of vector store" width="800" height="238"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then click &lt;strong&gt;Next&lt;/strong&gt; and &lt;strong&gt;Create Knowledge Base&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  3. Sync Your Data Source
&lt;/h2&gt;

&lt;p&gt;After the Knowledge Base is showing as available, you'll need to sync your data source to actually populate the vector store with the embeddings. Just select your &lt;strong&gt;Knowledge Base&lt;/strong&gt; and click &lt;strong&gt;Sync&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fol113nvw8ns4v50mnot9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fol113nvw8ns4v50mnot9.png" alt="Sync the data source" width="800" height="317"&gt;&lt;/a&gt;&lt;/p&gt;
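&lt;p&gt;The Sync button maps to the &lt;strong&gt;StartIngestionJob&lt;/strong&gt; API, so you can also trigger it from code. A sketch, assuming placeholder IDs:&lt;/p&gt;

```python
# Sketch: trigger a Knowledge Base sync (ingestion job) programmatically.
# The knowledge base and data source IDs are hypothetical placeholders.

def ingestion_request(kb_id, ds_id):
    """Build the StartIngestionJob parameters."""
    return {"knowledgeBaseId": kb_id, "dataSourceId": ds_id}

def main():
    import boto3  # requires AWS credentials
    agent = boto3.client("bedrock-agent")
    job = agent.start_ingestion_job(**ingestion_request("KBID1234", "DSID5678"))
    print(job["ingestionJob"]["status"])

# Call main() yourself; the console Sync button does the same thing.
```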

&lt;h2&gt;
  
  
  4. Create a Guardrail
&lt;/h2&gt;

&lt;p&gt;Click &lt;strong&gt;Guardrails&lt;/strong&gt; in the &lt;strong&gt;Amazon Bedrock&lt;/strong&gt; menu on the left, then &lt;strong&gt;Create Guardrail&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Name it &lt;strong&gt;PII-filter&lt;/strong&gt; and click &lt;strong&gt;Next&lt;/strong&gt;. This is where you can explore all the different capabilities available in Guardrails. Some of them I hadn't used before, so I wanted to review everything here and understand what can be configured. For instance, there's a prompt attack filter to protect against attempts to override system instructions. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fooxvtrz89l5vuywtam8h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fooxvtrz89l5vuywtam8h.png" alt="Prompt attack filter" width="800" height="266"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After that, click &lt;strong&gt;Next&lt;/strong&gt;; here you can add denied topics. As my data contains information about corporate strategy and some financial data, I want my Guardrail to prevent the model from giving investment advice. Click &lt;strong&gt;Add Denied Topic&lt;/strong&gt;, name it &lt;strong&gt;Financial-Advice&lt;/strong&gt;, and then provide text describing the topic to block. Here's what I added:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Financial advice, investment advice, stock purchase, investment fund
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foe5j662t7vgweeudcm5s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foe5j662t7vgweeudcm5s.png" alt="Adding a denied topic" width="800" height="478"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Keeping &lt;strong&gt;Input&lt;/strong&gt; and &lt;strong&gt;Output actions&lt;/strong&gt; set to &lt;strong&gt;Block&lt;/strong&gt;, click &lt;strong&gt;Confirm&lt;/strong&gt;, then &lt;strong&gt;Next&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Word filters allow you to add profanity filters as well as custom words and phrases, then click &lt;strong&gt;Next&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;After that you can specify how to handle PII.&lt;/p&gt;

&lt;p&gt;Click &lt;strong&gt;Add New PII&lt;/strong&gt;, then select the type of PII you want to identify. You then select how to deal with that type of PII: block, mask, or detect (no action). &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7tl0t6uqku958icenar.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7tl0t6uqku958icenar.png" alt="Select the type of PII" width="800" height="722"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I selected phone and email since I know that this type of PII exists in my data. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fesriyhkimd1lztcng7ak.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fesriyhkimd1lztcng7ak.png" alt="Blocking and masking different types of PII" width="800" height="274"&gt;&lt;/a&gt;&lt;/p&gt;
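&lt;p&gt;The same configuration can be created with the &lt;strong&gt;CreateGuardrail&lt;/strong&gt; API. Here's a sketch mirroring the console steps above; the blocked-message strings are illustrative placeholders, not the console defaults:&lt;/p&gt;

```python
# Sketch: create the PII-filter guardrail via the API, mirroring the
# console steps above. Messages are illustrative placeholders.

def guardrail_config():
    """Build CreateGuardrail parameters for a denied topic plus PII handling."""
    return {
        "name": "PII-filter",
        "blockedInputMessaging": "Sorry, I cannot help with that request.",
        "blockedOutputsMessaging": "Sorry, I cannot help with that request.",
        "topicPolicyConfig": {
            "topicsConfig": [{
                "name": "Financial-Advice",
                "definition": "Financial advice, investment advice, stock purchase, investment fund",
                "type": "DENY",
            }]
        },
        "sensitiveInformationPolicyConfig": {
            "piiEntitiesConfig": [
                {"type": "PHONE", "action": "BLOCK"},
                {"type": "EMAIL", "action": "ANONYMIZE"},  # ANONYMIZE masks the value
            ]
        },
    }

def main():
    import boto3  # requires AWS credentials
    bedrock = boto3.client("bedrock")
    resp = bedrock.create_guardrail(**guardrail_config())
    print(resp["guardrailId"])

# Call main() yourself once credentials are configured.
```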

&lt;p&gt;After that, click &lt;strong&gt;Next&lt;/strong&gt;. On the next page you can optionally add a &lt;strong&gt;contextual grounding check&lt;/strong&gt; and a &lt;strong&gt;relevance check&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Contextual grounding lets you set a threshold for what is acceptable in terms of the model's contextual grounding confidence score. The default setting is 0.7 out of 1; if the score is lower than the defined threshold, the model response will be blocked.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx88ky39nos4d7ijpl4dm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx88ky39nos4d7ijpl4dm.png" alt="Contextual grounding check" width="800" height="386"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Relevance checks work in a similar way: a relevance score is calculated, and if it is lower than the defined threshold, the model response will be blocked.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9chqc8irmwh43rua6344.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9chqc8irmwh43rua6344.png" alt="Relevance check" width="800" height="389"&gt;&lt;/a&gt;&lt;/p&gt;
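&lt;p&gt;In the API, both checks live in a single contextual grounding policy. A sketch of the configuration, using the 0.7 default described above:&lt;/p&gt;

```python
# Sketch: the contextual grounding and relevance checks map to one
# policy in the CreateGuardrail/UpdateGuardrail API. The 0.7 values
# match the console default described above.

def grounding_policy(grounding_threshold=0.7, relevance_threshold=0.7):
    """Build the contextual grounding policy; responses scoring below a threshold are blocked."""
    return {
        "filtersConfig": [
            {"type": "GROUNDING", "threshold": grounding_threshold},
            {"type": "RELEVANCE", "threshold": relevance_threshold},
        ]
    }

# Passed as contextualGroundingPolicyConfig=grounding_policy() in create_guardrail().
```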

&lt;p&gt;Then click &lt;strong&gt;Next&lt;/strong&gt;, &lt;strong&gt;Next&lt;/strong&gt;, and &lt;strong&gt;Create Guardrail&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;After that you are ready to test the functionality and see it in action. &lt;/p&gt;

&lt;h2&gt;
  
  
  5. Testing the Knowledge Base Functionality
&lt;/h2&gt;

&lt;p&gt;In the left hand menu, click &lt;strong&gt;Knowledge Bases&lt;/strong&gt;, select your &lt;strong&gt;Knowledge Base&lt;/strong&gt;, then click &lt;strong&gt;Test Knowledge Base&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjxd2gnslsgxxms5yhoma.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjxd2gnslsgxxms5yhoma.png" alt="Test the Knowledge Base" width="800" height="139"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select the model you want to test with; a lite model is fine for this task. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs40ubkohr46pq7bybon9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs40ubkohr46pq7bybon9.png" alt=" " width="800" height="421"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Under &lt;strong&gt;Data Manipulation&lt;/strong&gt;, open the &lt;strong&gt;Guardrails&lt;/strong&gt; dropdown and then select your Guardrail.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe2ntg4w422nc8ulyncj8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe2ntg4w422nc8ulyncj8.png" alt=" " width="800" height="200"&gt;&lt;/a&gt;  &lt;/p&gt;

&lt;p&gt;There is also the option to use a reranking model, like &lt;strong&gt;Cohere Rerank&lt;/strong&gt;, to rescore the relevance of the retrieved data and override the vector database's relevance ranking. &lt;/p&gt;

&lt;p&gt;After that I added a prompt to test out the configuration: &lt;br&gt;
&lt;code&gt;What are Ralphie Bank’s three strategic pillars?&lt;/code&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa0cid54qhhbxr4duk1jx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa0cid54qhhbxr4duk1jx.png" alt="Prompting the model" width="800" height="302"&gt;&lt;/a&gt;&lt;/p&gt;
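&lt;p&gt;The test panel corresponds to the &lt;strong&gt;RetrieveAndGenerate&lt;/strong&gt; API, which can attach your Guardrail to generation. A sketch, where the IDs, model ARN, and version are placeholders for your own values:&lt;/p&gt;

```python
# Sketch: the console "Test Knowledge Base" panel corresponds to the
# RetrieveAndGenerate API. IDs, model ARN, and guardrail version below
# are hypothetical placeholders.

def rag_request(kb_id, model_arn, guardrail_id, guardrail_version, question):
    """Build RetrieveAndGenerate parameters with a guardrail attached."""
    return {
        "input": {"text": question},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
                "generationConfiguration": {
                    "guardrailConfiguration": {
                        "guardrailId": guardrail_id,
                        "guardrailVersion": guardrail_version,
                    }
                },
            },
        },
    }

def main():
    import boto3  # requires AWS credentials
    runtime = boto3.client("bedrock-agent-runtime")
    resp = runtime.retrieve_and_generate(**rag_request(
        "KBID1234",
        "arn:aws:bedrock:us-east-1::foundation-model/amazon.nova-lite-v1:0",
        "GRID1234",
        "DRAFT",
        "What are Ralphie Bank's three strategic pillars?",
    ))
    print(resp["output"]["text"])

# Call main() yourself with your own IDs.
```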

&lt;h2&gt;
  
  
  6. Testing the Guardrail
&lt;/h2&gt;

&lt;p&gt;Next, I tried some prompts that should trigger the Guardrail. Here's what I tried: &lt;/p&gt;

&lt;p&gt;&lt;code&gt;Should I invest in Ralphie Bank shares this year?&lt;/code&gt;&lt;br&gt;
&lt;code&gt;What is the internal strategy hotline number?&lt;/code&gt;&lt;br&gt;
&lt;code&gt;What is the compliance email address?&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;If it's all worked, the &lt;strong&gt;Guardrail&lt;/strong&gt; will block the request from being sent to the model, and &lt;strong&gt;Bedrock&lt;/strong&gt; should not fulfill the request.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9qjaqxsbecbqe2uiunqr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9qjaqxsbecbqe2uiunqr.png" alt="Bedrock should not fulfill the request" width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;
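&lt;p&gt;You can also exercise a Guardrail directly, without invoking a model at all, using the &lt;strong&gt;ApplyGuardrail&lt;/strong&gt; API. A sketch with a placeholder Guardrail ID:&lt;/p&gt;

```python
# Sketch: test a guardrail on its own (no model call) with ApplyGuardrail.
# The guardrail ID below is a hypothetical placeholder.

def guardrail_content(text):
    """Wrap input text in the content shape ApplyGuardrail expects."""
    return [{"text": {"text": text}}]

def check_prompt(guardrail_id, guardrail_version, text):
    """Return True if the guardrail intervenes on the given input."""
    import boto3  # requires AWS credentials
    runtime = boto3.client("bedrock-runtime")
    resp = runtime.apply_guardrail(
        guardrailIdentifier=guardrail_id,
        guardrailVersion=guardrail_version,
        source="INPUT",
        content=guardrail_content(text),
    )
    return resp["action"] == "GUARDRAIL_INTERVENED"

# Example call (needs your own guardrail ID):
# check_prompt("GRID1234", "DRAFT", "Should I invest in Ralphie Bank shares this year?")
```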

&lt;h2&gt;
  
  
  7. Clean-up
&lt;/h2&gt;

&lt;p&gt;Be sure to delete the &lt;strong&gt;Guardrail&lt;/strong&gt;, &lt;strong&gt;Knowledge Base&lt;/strong&gt;, and &lt;strong&gt;S3 bucket&lt;/strong&gt; to avoid unnecessary charges!&lt;/p&gt;
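&lt;p&gt;Here's a boto3 sketch of the clean-up, assuming the placeholder identifiers used earlier. Deletion is irreversible, so double-check the IDs:&lt;/p&gt;

```python
# Sketch: clean up the tutorial resources so nothing keeps billing.
# All identifiers below are hypothetical placeholders.

def main():
    import boto3  # requires AWS credentials

    boto3.client("bedrock").delete_guardrail(guardrailIdentifier="GRID1234")
    boto3.client("bedrock-agent").delete_knowledge_base(knowledgeBaseId="KBID1234")

    # Empty the source bucket (required before deletion), then delete it
    s3 = boto3.resource("s3")
    bucket = s3.Bucket("ralphie-bank-kb-source")
    bucket.objects.all().delete()
    bucket.delete()

# Call main() yourself; deletion is irreversible.
```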

</description>
      <category>ai</category>
      <category>aws</category>
      <category>tutorial</category>
      <category>rag</category>
    </item>
    <item>
      <title>Gen AI Is Only as Reliable as Your Data: Lessons from AWS re:Invent 2025</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Tue, 16 Dec 2025 09:12:00 +0000</pubDate>
      <link>https://dev.to/aws-builders/gen-ai-is-only-as-reliable-as-your-data-lessons-from-aws-reinvent-2025-6j3</link>
      <guid>https://dev.to/aws-builders/gen-ai-is-only-as-reliable-as-your-data-lessons-from-aws-reinvent-2025-6j3</guid>
      <description>&lt;p&gt;These days, it feels like everyone is building something with AI. From specialized analytics assistants, diagnostics and decision-making systems to project management and CRM platforms. AI is becoming part of every tool we use. All powered by large language models and agents that increasingly depend on complex data pipelines.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;But what happens when the data behind those applications breaks?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The short answer is that things go wrong very quickly, and not always in obvious ways!&lt;/p&gt;

&lt;p&gt;While at re:Invent this year, I attended a session from Commvault on &lt;a href="https://www.youtube.com/watch?v=d9f3NH_aDa8"&gt;Best practices to simplify resilience at scale for Gen AI data &amp;amp; apps&lt;/a&gt; to understand how enterprise customers are approaching AI resilience properly.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzg7h0r9qkdj3c1k7h7v6.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzg7h0r9qkdj3c1k7h7v6.jpeg" alt="Commvault recently announced Iceberg-aware recovery" width="800" height="1422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Gen AI is only as good as its data pipeline
&lt;/h2&gt;

&lt;p&gt;Gen AI applications rarely fail because of the model itself. Instead, failures often start much lower down the stack in the data layer. &lt;br&gt;
AI is only as good as the data we feed it. Missing records, corrupted partitions, overwritten tables, or deleted objects don’t just cause downtime. They cause:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Models to hallucinate&lt;/li&gt;
&lt;li&gt;Dashboards to show incorrect insights&lt;/li&gt;
&lt;li&gt;Applications to behave unpredictably&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The concerning thing is that you may not even realize it: applications can keep running while quietly becoming wrong.&lt;/p&gt;

&lt;p&gt;In the session, this was demonstrated using real-world AWS services like DynamoDB, Amazon S3, and Apache Iceberg, showing how failures in each layer of the data pipeline can directly impact Gen AI behavior - even when the application itself appears healthy.&lt;/p&gt;

&lt;h2&gt;
  
  
  The data protection gap
&lt;/h2&gt;

&lt;p&gt;During the session, the speakers asked a simple question: how many people are building GenAI applications in AWS? Unsurprisingly, many hands were raised. However, when they asked how many are actively protecting the data layer that powers those applications, I saw far fewer hands.&lt;/p&gt;

&lt;p&gt;This gap isn’t surprising. Most teams invest heavily in scaling compute, tuning models, and improving performance, while assuming data protection is handled somewhere in the background. Many organizations expect their existing backup system will handle the restoration of data following a loss or corruption.&lt;/p&gt;

&lt;p&gt;In reality, many modern data pipelines are only partially protected, or protected in ways that don’t support the fast, clean recovery that customers expect.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why traditional backup isn’t enough
&lt;/h2&gt;

&lt;p&gt;Traditional backup tools weren’t designed for the way modern cloud-native and Gen AI data pipelines fail, particularly when recovery needs to happen quickly and at a very granular level.&lt;br&gt;
A recurring theme throughout the talk was that &lt;strong&gt;recovery speed and simplicity matter more than theoretical durability.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Traditional recovery approaches often look like this:&lt;br&gt;
&lt;strong&gt;Step 1:&lt;/strong&gt; Restore an entire dataset&lt;br&gt;
&lt;strong&gt;Step 2:&lt;/strong&gt; Create temporary tables or buckets&lt;br&gt;
&lt;strong&gt;Step 3:&lt;/strong&gt; Repoint applications&lt;br&gt;
&lt;strong&gt;Step 4:&lt;/strong&gt; Rebuild downstream dependencies, like manually recreating dashboards&lt;/p&gt;

&lt;p&gt;This process is slow, error-prone, and expensive, especially when customers are waiting for systems to be back online.&lt;/p&gt;

&lt;p&gt;What stood out to me was the emphasis on in-place, granular recovery.&lt;/p&gt;

&lt;p&gt;Recover only what broke, restore it directly to the original location, and avoid reconfiguration wherever possible. Less operational work means faster recovery, and fewer mistakes during already stressful incidents.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.commvault.com/clumio/workloads" rel="noopener noreferrer"&gt;Clumio Backtrack&lt;/a&gt; enables individual records or partitions to be recovered in-place without creating temporary tables or repointing applications.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffdt99b1lzgmr92wouckb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffdt99b1lzgmr92wouckb.png" alt="Clumio Backtrack enables point-in-time recovery for DynamoDB tables following accidental deletion, or data compromise&amp;lt;br&amp;gt;
" width="800" height="465"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The “last known good state” problem
&lt;/h2&gt;

&lt;p&gt;Another subtle but important insight: in distributed cloud systems, there is rarely a single “last known good” timestamp.&lt;/p&gt;

&lt;p&gt;Different components fail at different times. DynamoDB partitions may be corrupted minutes apart, S3 objects may be deleted individually, Iceberg tables may be overwritten by a single bad query. Effective recovery needs to work at the level of records, partitions, objects, and snapshots, not just entire databases or buckets.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.commvault.com/clumio/workloads" rel="noopener noreferrer"&gt;Clumio Backtrack&lt;/a&gt; also supports point-in-time and granular recovery across distributed services like DynamoDB and S3 rather than forcing teams into lengthy all-or-nothing restores.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fefubemxnf26qooyow25m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fefubemxnf26qooyow25m.png" alt="Clumio Backtrack point-in-time recovery for individual S3 objects, you can even restore to a completely different AWS account&amp;lt;br&amp;gt;
" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  RAG pipelines are especially vulnerable
&lt;/h2&gt;

&lt;p&gt;Retrieval-augmented generation (RAG) pipelines introduce an extra layer of fragility. Vector stores depend on underlying source data; if that disappears, embeddings quickly become useless. If you’ve ever ingested data into an Amazon Bedrock knowledge base, you’ll know that recreating vectors is time-consuming and costly for large datasets.&lt;/p&gt;

&lt;p&gt;Even small amounts of data loss can lead to wildly inaccurate AI responses, delivered with high confidence. Protecting raw source data turns out to be one of the most important steps in keeping Gen AI outputs trustworthy.&lt;/p&gt;

&lt;p&gt;S3 object-level recovery from &lt;a href="https://www.commvault.com/clumio/workloads" rel="noopener noreferrer"&gt;Clumio Backtrack&lt;/a&gt; enables only the impacted objects to be restored, avoiding the need to recompute whole vector stores or rebuild the entire RAG pipeline.&lt;/p&gt;

&lt;h2&gt;
  
  
  Apache Iceberg: powerful, but not immune
&lt;/h2&gt;

&lt;p&gt;The session also covered Apache Iceberg, which has become a hugely popular foundation for modern lakehouse architectures, supporting large-scale analytics, transactional consistency, and schema evolution. &lt;/p&gt;

&lt;p&gt;However it also introduces new failure modes. A single overwrite or schema change can silently break dashboards and analytics without obviously deleting data.&lt;/p&gt;

&lt;p&gt;If you are only backing up the general-purpose S3 bucket data that powers the Iceberg tables, you’ll first need to do a full restore of every S3 bucket that holds Iceberg data, then restore the Iceberg table structure, including reconfiguring your manifest files, metadata, and data files. After that, you must repoint your applications to the new table and recreate all the dashboards that existed for the previous dataset.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The key takeaway for me is that Iceberg-aware recovery matters. Restoring raw files alone isn’t sufficient.&lt;/strong&gt; Table structure, metadata, and dashboards must be preserved to truly recover.&lt;/p&gt;

&lt;h2&gt;
  
  
  The first Iceberg-aware and S3 tables data protection solution
&lt;/h2&gt;

&lt;p&gt;For organizations that require true AI maturity, Commvault recently announced an &lt;a href="https://www.commvault.com/clumio/workloads/apache-iceberg" rel="noopener noreferrer"&gt;Iceberg-aware recovery capability in Clumio Backtrack&lt;/a&gt;: an industry-first that preserves table structure, metadata, and snapshots, and delivers recovery without rebuilding dashboards or reconfiguring applications. Just pick a snapshot, pick a point in time, and recover, all in one step.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsmjpp7592f4lbyxuz11t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsmjpp7592f4lbyxuz11t.png" alt="Clumio Backtrack Iceberg-aware point-in-time recovery for S3 tables" width="800" height="680"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Key takeaways I’ll be applying going forward
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Design resilience at the data layer, not just the model layer&lt;/li&gt;
&lt;li&gt;Protect the entire data pipeline, not individual services in isolation&lt;/li&gt;
&lt;li&gt;Optimize for fast, in-place recovery&lt;/li&gt;
&lt;li&gt;Plan for granular failures, not entire rollbacks&lt;/li&gt;
&lt;li&gt;Focus on user experience, not uptime metrics&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As GenAI applications become more central to how all of us operate, AI maturity demands that data resilience becomes non-negotiable. The quality, accuracy, and trustworthiness of AI outputs all depend on it.&lt;br&gt;&lt;br&gt;
To learn more, check out &lt;a href="https://youtu.be/d9f3NH_aDa8?si=65BmxTUTbxYzswi2" rel="noopener noreferrer"&gt;Best practices to simplify resilience at scale for Gen AI data &amp;amp; apps&lt;/a&gt; or &lt;a href="https://www.commvault.com/request-demo" rel="noopener noreferrer"&gt;request a demo&lt;/a&gt; of Commvault’s cloud-native cyber resiliency, data protection and recovery solutions.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>iceberg</category>
      <category>s3</category>
      <category>resilience</category>
    </item>
    <item>
      <title>Getting Started with Spec-driven Development Using Kiro</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Sat, 13 Dec 2025 18:33:12 +0000</pubDate>
      <link>https://dev.to/aws-heroes/getting-started-with-spec-driven-development-using-kiro-400l</link>
      <guid>https://dev.to/aws-heroes/getting-started-with-spec-driven-development-using-kiro-400l</guid>
      <description>&lt;p&gt;Curious to learn how Kiro could be your development partner for spec-driven development? Here's my simple tutorial to help you get started.&lt;/p&gt;

&lt;h2&gt;
  
  
  Creating a Simple Voting API
&lt;/h2&gt;

&lt;h3&gt;
  
  
  What You’ll Build
&lt;/h3&gt;

&lt;p&gt;A simple REST API where users can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a poll with options&lt;/li&gt;
&lt;li&gt;Vote on a poll&lt;/li&gt;
&lt;li&gt;View poll results&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Pre-requisites
&lt;/h2&gt;

&lt;p&gt;Download and install the &lt;a href="https://kiro.dev/" rel="noopener noreferrer"&gt;Kiro IDE&lt;/a&gt; for your operating system.&lt;/p&gt;

&lt;h2&gt;
  
  
  Create a Project Folder and Start Developing!
&lt;/h2&gt;

&lt;p&gt;1) Create a folder in your local file system named &lt;strong&gt;Kiro_Voting_API&lt;/strong&gt;; this will be your project folder.&lt;/p&gt;

&lt;p&gt;2) Launch Kiro, select &lt;strong&gt;File&lt;/strong&gt;, then &lt;strong&gt;Open Folder&lt;/strong&gt;, and open the folder you just created.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqu4k4jzxhjitu1djb6qg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqu4k4jzxhjitu1djb6qg.png" alt="Launch Kiro, select File and open the folder" width="714" height="660"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3) To begin a spec-driven development session, select &lt;strong&gt;Spec&lt;/strong&gt; from the options.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh0pymmcv28pj1a1bzdao.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh0pymmcv28pj1a1bzdao.png" alt="Select Spec from the options " width="800" height="676"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;4) Enter a natural language prompt into the chat window. Here's what I used:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Build a simple poll and voting REST API using Python and FastAPI

Requirements:

- A poll has: id, question, options, votes
- Options are strings
- Votes are counted per option
- Endpoints:
  - POST /polls to create a poll
  - GET /polls/{id} to retrieve a poll and its results
  - POST /polls/{id}/vote to vote for an option
- Store all data in memory
- Return JSON responses
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
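Kiro will generate its own implementation from this spec. As a rough illustration of the in-memory data model the prompt describes (the function and variable names here are my own for illustration, not Kiro's output), a minimal sketch might look like:

```python
import uuid

# In-memory store described by the spec: poll id -> poll record
polls = {}

def create_poll(question, options):
    """Create a poll with zero votes per option and store it in memory."""
    poll = {
        "id": str(uuid.uuid4()),
        "question": question,
        "options": list(options),
        "votes": {option: 0 for option in options},
    }
    polls[poll["id"]] = poll
    return poll

def vote(poll_id, option):
    """Cast a vote for an option; raise if the poll or option is unknown."""
    poll = polls[poll_id]
    if option not in poll["votes"]:
        raise ValueError(f"unknown option: {option}")
    poll["votes"][option] += 1
    return poll
```

A FastAPI layer would then expose these functions behind the POST /polls, POST /polls/{id}/vote, and GET /polls/{id} endpoints from the spec.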



&lt;p&gt;5) Wait for Kiro to create a requirements document based on your prompt; it will produce a document containing requirements, user stories, and acceptance criteria:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyje8gyc209x3n2icwbh8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyje8gyc209x3n2icwbh8.png" alt="Kiro creates a requirements document" width="800" height="639"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;6) To refine the requirements, enter anything you think is missing into the chat window. Here's what I added:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Add support for displaying poll results as a percentage

Poll results must be calculated as a percentage as well as displaying the raw numbers of votes
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
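To illustrate the refinement above, a results helper that returns raw counts alongside percentages (a hypothetical function for illustration, not Kiro's generated code) could look something like:

```python
def poll_results(votes):
    """Map option -> vote count to option -> counts plus percentages.

    Percentages are rounded to one decimal place; a poll with no votes
    yields 0.0 for every option rather than dividing by zero.
    """
    total = sum(votes.values())
    return {
        option: {
            "votes": count,
            "percentage": round(100 * count / total, 1) if total else 0.0,
        }
        for option, count in votes.items()
    }
```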



&lt;p&gt;7) If the requirements look good to you, click &lt;strong&gt;Move to design phase&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7qmwt6a8dohww5lfezsr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7qmwt6a8dohww5lfezsr.png" alt="If the requirements look good move to design phase" width="800" height="702"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;8) Kiro will create a design based on the agreed requirements. Click the generated &lt;strong&gt;design.md&lt;/strong&gt; file to view the design document.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note that Kiro includes error handling and a testing strategy. In practice I found that this added a lot of time and iteration to my implementation steps, as it was validating changes and asking for permission to run tests throughout implementation.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You may prefer to reduce the amount of testing and error handling so you can quickly see something in action, then add the appropriate tests afterwards. It's up to you: you can refine the requirements and design to fit your needs as you go!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flkknmzjzbqz84y59v4lx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flkknmzjzbqz84y59v4lx.png" alt="View the design document" width="800" height="595"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;9) After reviewing the design, you can ask Kiro to make any modifications you like. When you're ready, click &lt;strong&gt;Move to implementation plan&lt;/strong&gt;, and Kiro will create an implementation plan based on the agreed requirements and design.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffltidsm7lxmkmcjsyb2b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffltidsm7lxmkmcjsyb2b.png" alt="The Implementation plan is ready and we can now start implementing the feature" width="800" height="945"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;10) Kiro creates the implementation plan in a file named &lt;strong&gt;tasks.md&lt;/strong&gt;. Open this file to view the contents and begin the implementation, either by clicking &lt;strong&gt;Task list&lt;/strong&gt; at the top of the screen, or &lt;strong&gt;tasks.md&lt;/strong&gt; in the folder view on the left.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fphls2r359f6lycnjz6y9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fphls2r359f6lycnjz6y9.png" alt="Open tasks.md to view the implementation plan&amp;lt;br&amp;gt;
" width="800" height="869"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;11) To begin implementing your feature, click &lt;strong&gt;Start task&lt;/strong&gt; above the first task in your implementation plan. If you allow it, Kiro will set up everything needed to run your code and begin testing it. Make sure you understand and are happy with everything it is doing before allowing it!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwzwltpl9qwdmb2kcn553.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwzwltpl9qwdmb2kcn553.png" alt="Kiro can help you implement the feature you just designed&amp;lt;br&amp;gt;
" width="800" height="368"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Before installing dependencies, Kiro asks for your permission. Review the contents of requirements.txt and only click &lt;strong&gt;Run&lt;/strong&gt; if you are happy to install the dependencies.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn99aedrkkqkrw4wqrddv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn99aedrkkqkrw4wqrddv.png" alt="Kiro asks permission before installing dependencies, be sure to review them and understand them before allowing it! " width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;12) After that, work through the rest of the steps in the implementation plan. As you go, Kiro will ask for permission to run tests at each stage to ensure the code works as expected.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvf8xenchnu40yr9h950k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvf8xenchnu40yr9h950k.png" alt="Running tests at each stage to ensure everything is working" width="800" height="509"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;13) I ended up aborting the step to add error handling and response formatting during the initial implementation, since I wanted to check the API's basic functionality myself before adding this. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fji330n3jn7kx2tjx2dqu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fji330n3jn7kx2tjx2dqu.png" alt="I aborted the addition of error handling at this stage, preferring to add error handling after running my own tests" width="800" height="444"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;14) After all the implementation tasks have been completed, open the &lt;strong&gt;README.md&lt;/strong&gt; file to see the steps to run your API. If anything doesn't make sense, or doesn't work as expected, you can chat with Kiro to ask for help with troubleshooting or interacting with your API.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzcudhcgu95sdmws00c6g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzcudhcgu95sdmws00c6g.png" alt="README.md explains how to use the API" width="800" height="583"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;15) From the terminal window in Kiro, making sure I am in the folder containing &lt;strong&gt;main.py&lt;/strong&gt;, I'll run the application in the background using the following command (check your own README file to see how to start your app):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python main.py &amp;amp;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4zpc1ugmksy2z8kxa8og.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4zpc1ugmksy2z8kxa8og.png" alt="The application starts and runs on localhost" width="800" height="353"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;16) In the terminal, hit the &lt;strong&gt;enter&lt;/strong&gt; key to get your cursor back, so you can run the next command in the same terminal window.&lt;/p&gt;

&lt;p&gt;Next I'll test that the API is running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl http://localhost:8000/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm35zw18jlfpj1tl8o91p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm35zw18jlfpj1tl8o91p.png" alt="The API is listening on localhost" width="800" height="114"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;17) Finally, I can try interacting with my API:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbcmkexy6jmluks3eu10y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbcmkexy6jmluks3eu10y.png" alt="Using my API to create a poll " width="800" height="185"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;My poll ID is: &lt;strong&gt;1f46ff20-88e0-4349-a31a-ce588c187b62&lt;/strong&gt; and I'll use it in the next command to cast a vote.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1knig70if13edwob37gi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1knig70if13edwob37gi.png" alt="Casting a vote in my poll" width="800" height="82"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I can also view the votes that have been cast.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fshtix4rws79z8kipmf7l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fshtix4rws79z8kipmf7l.png" alt="Viewing the created poll and votes" width="800" height="112"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Kiro Credits
&lt;/h2&gt;

&lt;p&gt;The Kiro free tier allows you 50 credits per month, with the most affordable pro tier providing 1,000 credits for $20/month. I can see how you could easily burn through a lot of credits just trying it out; this entire exercise used around 33 credits to tackle what I think is a very simple use case.&lt;/p&gt;

&lt;h2&gt;
  
  
  Working in a Structured Way
&lt;/h2&gt;

&lt;p&gt;Building this simple voting API was a fun way to understand how Kiro’s spec-driven workflow actually feels in practice. Rather than jumping straight into writing code, Kiro encouraged me to slow down and think about requirements, design, and implementation as distinct steps, and then guided me through each phase in a structured way!&lt;/p&gt;

&lt;h2&gt;
  
  
  Other Practical Use-cases
&lt;/h2&gt;

&lt;p&gt;A lot of my work involves creating CloudFormation templates, simple Python scripts, IAM policies and other AWS-related code, so having explored the spec-driven workflow I can imagine lots of ways that Kiro can help increase productivity when working with AWS APIs and SDKs. I can't wait to try it out for some of these use cases!&lt;/p&gt;

&lt;p&gt;If you try out this exercise or similar for yourself, please let me know what you think!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>kiro</category>
      <category>programming</category>
    </item>
    <item>
      <title>The Future of Cyber Resilience for Complex AWS Environments is Here</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Thu, 27 Nov 2025 15:05:48 +0000</pubDate>
      <link>https://dev.to/aws-builders/the-future-of-cyber-resilience-for-complex-aws-environments-is-here-26c1</link>
      <guid>https://dev.to/aws-builders/the-future-of-cyber-resilience-for-complex-aws-environments-is-here-26c1</guid>
      <description>&lt;p&gt;2025 has seen the cloud landscape continue to evolve at an extraordinary pace. As organizations accelerate their AI, analytics, and digital transformation workloads, many of us are experiencing a significant increase in complexity. &lt;/p&gt;

&lt;p&gt;Systems are becoming more distributed, with workloads spread across multiple regions, accounts, and vendors. With complexity comes fragmentation, and a sharp rise in risk around cyber threats, identity compromise, and multi-cloud governance, leading many of us to wonder how to maintain visibility across disparate systems as well as how to handle protection, resilience, and recovery at scale.&lt;/p&gt;

&lt;p&gt;This is why I was excited to learn about some of the latest announcements and releases from Commvault, announced at &lt;a href="https://www.commvault.com/shift-virtual" rel="noopener noreferrer"&gt;SHIFT 2025.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc9fnel5pddou1u0dx404.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc9fnel5pddou1u0dx404.png" alt="Commvault Cloud Unity - unifying data security, cyber recovery, and identity resilience into one AI-enabled platform" width="800" height="231"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Commvault Cloud Unity - a unified platform designed for the realities of cloud at scale
&lt;/h2&gt;

&lt;p&gt;Most organizations’ AWS environments have grown organically, spanning multiple accounts and regions, with the vast majority using multiple cloud vendors as well as running hosted workloads and data on-premises. This approach allows for the adoption of best-of-breed technologies and services; however, the trade-off is that such mixed environments become increasingly difficult to manage and protect.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.commvault.com/news/new-era-in-enterprise-resilience-commvault-cloud-unity-platform-release" rel="noopener noreferrer"&gt;Commvault Cloud Unity&lt;/a&gt; is a major release that unifies data security, cyber recovery, and identity resilience into one AI-enabled platform. It provides a single pane of glass spanning all workloads, regions, and protection policies, across AWS, on-premises, and hybrid environments.&lt;/p&gt;

&lt;h3&gt;
  
  
  Features of the Commvault Cloud Unity platform:
&lt;/h3&gt;

&lt;h4&gt;
  
  
  AI-driven discovery of all AWS workloads across accounts and regions
&lt;/h4&gt;

&lt;p&gt;Commvault Cloud Unity automatically identifies AWS workloads and data across EC2, EKS, RDS, DynamoDB, S3, Lambda-backed services, and more.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftdsxrfel3qfa1gcougdq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftdsxrfel3qfa1gcougdq.png" alt="Addressing the challenge of understanding where data is located, and what’s protected " width="800" height="227"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Clear visibility into what’s protected (and what isn’t)
&lt;/h4&gt;

&lt;p&gt;One of the biggest challenges is understanding where data is located. What’s protected? What’s under-protected, or not protected at all? In addition to helping you discover your data landscape, Commvault Cloud Unity also provides automated classification and protection policy recommendations.&lt;/p&gt;

&lt;h4&gt;
  
  
  Synthetic Recovery: Clean, Complete Restorations for AWS Workloads
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;This is, in my view, one of the most exciting capabilities Commvault has introduced.&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;AWS estates often include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Distributed EC2 workloads&lt;/li&gt;
&lt;li&gt;Massive S3 data lakes&lt;/li&gt;
&lt;li&gt;Numerous databases (RDS, Aurora, DynamoDB)&lt;/li&gt;
&lt;li&gt;Containerized workloads running on EKS&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If any part of this is compromised, restoring cleanly can be incredibly complex and nuanced. Previously, you’d have to choose between an older backup that’s clean, or a recent snapshot that might be contaminated. Neither option is great.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsvbpuz3ppo5sjrmk77mx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsvbpuz3ppo5sjrmk77mx.png" alt="Between an older backup that’s clean, or a recent snapshot that might be contaminated, neither option is great." width="800" height="255"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h5&gt;
  
  
  Synthetic Recovery changes that completely.
&lt;/h5&gt;

&lt;p&gt;It uses AI to identify compromised blocks or files, remove them automatically, and reassemble the rest into a synthetically clean recovery point that preserves all clean, recent data. This is incredibly valuable for AWS environments where speed and precision are essential.&lt;/p&gt;

&lt;p&gt;No more rolling back to a recovery point from last Tuesday because it’s the only one you trust. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.commvault.com/platform" rel="noopener noreferrer"&gt;Request a demo&lt;/a&gt; of this exciting feature to see it in action!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fskfavmswk4na2z4foj2q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fskfavmswk4na2z4foj2q.png" alt="Threat detection across cloud providers" width="800" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Threat Scan: Protecting AWS Backups from Hidden Threats
&lt;/h4&gt;

&lt;p&gt;For AWS customers maintaining vast amounts of data in S3 or using snapshot-heavy workflows for EC2 and RDS, this adds vital intelligence to the recovery process.&lt;/p&gt;

&lt;p&gt;Threat Scan brings:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AI-driven scanning of AWS backup datasets&lt;/li&gt;
&lt;li&gt;Detection of encrypted files, malware, and indicators of compromise&lt;/li&gt;
&lt;li&gt;The ability to inspect recovery points before restoring them&lt;/li&gt;
&lt;li&gt;Proactive identification of risks inside S3 object versions, EC2 snapshots, and more&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With attackers now targeting backups directly, the security of AWS backup data has never been more critical.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F00h0z25cf1afpvfaruxn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F00h0z25cf1afpvfaruxn.png" alt="80% of attacks involve an identity breach!" width="800" height="254"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Identity Resilience for AWS-Hybrid Environments
&lt;/h4&gt;

&lt;p&gt;AWS customers who rely on Active Directory for authentication, whether that’s through AWS Managed AD or integrated with on-premises AD, will benefit from new identity resilience enhancements, which detect, audit, and reverse malicious identity changes.&lt;/p&gt;

&lt;p&gt;Commvault Unity also includes the ability to spot identity anomalies, maintain forensic change logs, roll back malicious AD changes in real time, and even safely test AD recovery inside a cleanroom on AWS. All of this is invaluable for anyone operating a hybrid IAM setup on AWS.&lt;/p&gt;

&lt;h2&gt;
  
  
  Solving the challenges that AWS customers struggle with the most
&lt;/h2&gt;

&lt;p&gt;Collectively, these announcements represent a major step forward for AWS resilience. They bring clarity where there has been confusion, automation where there has been manual effort, and integrated protection where there have been fragmented tools.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Commvault Cloud Unity solves the challenges that AWS customers struggle with the most, like data sprawl, inconsistent policies, cyber risk, and complex backup management.&lt;/strong&gt; With one secure, automated platform spanning hybrid and multi-cloud environments, organizations benefit from faster recovery, streamlined operations, and complete confidence that their critical data is properly protected and recoverable when it matters most.&lt;/p&gt;

&lt;h2&gt;
  
  
  Want to Learn More?
&lt;/h2&gt;

&lt;p&gt;Exciting times for Commvault, for AWS, and for those of us responsible for mission critical workloads in the cloud! If you’re interested in hearing more about all of these announcements, you can &lt;a href="https://www.commvault.com/shift-virtual" rel="noopener noreferrer"&gt;watch all the sessions from SHIFT 2025&lt;/a&gt; on demand, and &lt;a href="https://www.commvault.com/platform" rel="noopener noreferrer"&gt;request a demo!&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Commvault at re:Invent 2025
&lt;/h2&gt;

&lt;p&gt;If you’re heading to AWS re:Invent this year, visit the Commvault team in the Expo Hall at booth #621 to talk cyber recovery and AWS-native resilience, experience some very cool demos, and more!&lt;/p&gt;

</description>
      <category>cybersecurity</category>
      <category>aws</category>
      <category>ai</category>
    </item>
    <item>
      <title>From SysOps to CloudOps: Breaking Down the New SOA-C03 Exam from AWS</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Tue, 09 Sep 2025 20:59:49 +0000</pubDate>
      <link>https://dev.to/aws-heroes/from-sysops-to-cloudops-breaking-down-the-new-soa-c03-exam-from-aws-518c</link>
      <guid>https://dev.to/aws-heroes/from-sysops-to-cloudops-breaking-down-the-new-soa-c03-exam-from-aws-518c</guid>
      <description>&lt;p&gt;The AWS Certified SysOps Administrator - Associate (SOA-C02) exam had been the go-to certification for cloud operations and support folks for several years. However, as cloud roles continue to evolve, AWS recently announced plans to retire it and introduce a brand-new certification. Let’s look at what’s new with AWS Certified CloudOps Engineer – Associate (SOA-C03) and what to expect in the exam.&lt;/p&gt;

&lt;h2&gt;
  
  
  What’s new in the SOA-C03 exam?
&lt;/h2&gt;

&lt;p&gt;The most important thing to note is that the exam is still intended for the same audience, and at least one year of experience in cloud operations roles is recommended.&lt;/p&gt;

&lt;p&gt;The great news is that it still includes 65 multiple-choice and multiple-response questions, 15 of which are unscored, so you are only scored on 50 of the questions. The passing score is still 720 out of a possible 1000.&lt;/p&gt;

&lt;p&gt;On the whole, the exam domains and topics have not changed very much. My interpretation is that nothing has really been removed from the exam, and just a few extra things have been added. The &lt;a href="https://d1.awsstatic.com/onedam/marketing-channels/website/aws/en_US/certification/approved/pdfs/docs-cloudops-associate/AWS-Certified-CloudOps-Engineer-Associate_Exam-Guide.pdf" rel="noopener noreferrer"&gt;exam guide&lt;/a&gt; now explicitly calls out skills like containerization, orchestration, and a general understanding of CI/CD practices.&lt;/p&gt;

&lt;h2&gt;
  
  
  Exam Domains Compared
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0f18rdv2codhabu5kan9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0f18rdv2codhabu5kan9.png" alt=" " width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At first glance, the exam domains seem extremely similar. The Cost and Performance Optimization domain from SOA-C02 has been removed; however, the content of that domain appears to have been distributed across the remaining domains. When reviewing the &lt;a href="https://d1.awsstatic.com/onedam/marketing-channels/website/aws/en_US/certification/approved/pdfs/docs-cloudops-associate/AWS-Certified-CloudOps-Engineer-Associate_Exam-Guide.pdf" rel="noopener noreferrer"&gt;exam guide&lt;/a&gt;, I noticed that nothing has been removed.&lt;/p&gt;

&lt;h2&gt;
  
  
  Key shifts to pay attention to
&lt;/h2&gt;

&lt;p&gt;The most significant shift in the new SOA-C03 exam is the increased emphasis on modern cloud operations practices, with a greater focus on the following:&lt;/p&gt;

&lt;h2&gt;
  
  
  Automation and IaC (infrastructure as code)
&lt;/h2&gt;

&lt;p&gt;You should know how to manage stacks of resources using CloudFormation and the AWS Cloud Development Kit (AWS CDK).&lt;/p&gt;
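For example, a minimal CloudFormation template defining a single versioned S3 bucket (an illustrative fragment of my own, not taken from the exam guide) looks like this:

```yaml
# Minimal CloudFormation template: one versioned S3 bucket
AWSTemplateFormatVersion: "2010-09-09"
Description: Example stack for CloudOps practice
Resources:
  LogsBucket:
    Type: AWS::S3::Bucket
    Properties:
      VersioningConfiguration:
        Status: Enabled
```

You could deploy it with the AWS CLI (aws cloudformation deploy --template-file template.yaml --stack-name example-stack), or define the equivalent construct in Python with the CDK.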

&lt;h2&gt;
  
  
  Hands-on knowledge of containers and orchestration
&lt;/h2&gt;

&lt;p&gt;Be familiar with the basics of running and orchestrating workloads on Amazon Elastic Container Service (Amazon ECS) and Amazon Elastic Kubernetes Service (Amazon EKS). You should also understand how to collect metrics and logs from these services using the Amazon CloudWatch Agent.&lt;/p&gt;

&lt;h2&gt;
  
  
  Multi-account and multi-region design skills
&lt;/h2&gt;

&lt;p&gt;You'll need an understanding of operating in complex environments, including building CloudWatch dashboards to handle metrics and alarms across multiple accounts and AWS Regions, provisioning and sharing resources across accounts and Regions, and securely implementing multi-account setups.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cost optimization and cloud financial management
&lt;/h2&gt;

&lt;p&gt;The ability to optimize for cost when configuring core AWS services like EBS, RDS, and networking services will also be useful, as will an understanding of how to use AWS Cost Explorer and the Cost and Usage Reports.&lt;/p&gt;
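
&lt;p&gt;If you want to get hands-on with Cost Explorer programmatically, boto3 exposes it through the &lt;code&gt;ce&lt;/code&gt; client. Here's a minimal sketch that just builds the request parameters for &lt;code&gt;get_cost_and_usage&lt;/code&gt; (the grouping by service is my own choice for the example; no AWS call is made):&lt;/p&gt;

```python
from datetime import date, timedelta

def cost_and_usage_request(months=3):
    """Build request parameters for Cost Explorer's GetCostAndUsage API.

    Returns a dict suitable for boto3.client("ce").get_cost_and_usage(**params);
    no AWS call is made here.
    """
    end = date.today().replace(day=1)                       # start of the current month
    start = (end - timedelta(days=months * 31)).replace(day=1)  # snap back N months
    return {
        "TimePeriod": {"Start": start.isoformat(), "End": end.isoformat()},
        "Granularity": "MONTHLY",
        "Metrics": ["UnblendedCost"],
        "GroupBy": [{"Type": "DIMENSION", "Key": "SERVICE"}],
    }

params = cost_and_usage_request(3)
print(params["TimePeriod"], params["Granularity"])
```

&lt;p&gt;Passing the resulting dict to &lt;code&gt;boto3.client("ce").get_cost_and_usage(**params)&lt;/code&gt; would return monthly unblended costs broken down by service.&lt;/p&gt;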

&lt;h2&gt;
  
  
  New AWS services to know
&lt;/h2&gt;

&lt;p&gt;The SOA-C03 exam now includes additional AWS services, like:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon Managed Service for Prometheus&lt;/strong&gt; (managed, Prometheus-compatible metrics monitoring)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon Managed Grafana&lt;/strong&gt; (a widely used service for data visualization)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS Network Firewall&lt;/strong&gt; (network-level traffic inspection and filtering)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon Elastic Container Service&lt;/strong&gt; (managed container orchestration service)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon Elastic Container Registry&lt;/strong&gt; (used to centrally store container images)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon EKS&lt;/strong&gt; (managed Kubernetes service)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;IAM Identity Center&lt;/strong&gt; (centralized identity management)&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;In short, AWS has rebranded and revamped this exam to match the realities of modern cloud management and some newer technologies that many of us are using. The SOA-C03 is now all about proving you’re ready to manage AWS at scale in today’s cloud-driven world, using the latest tools and technologies. &lt;/p&gt;

&lt;p&gt;I've booked to take the new exam on September 30th and will let you know what I think as soon as I can. Are you planning to try SOA-C03?&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloudcomputing</category>
      <category>careerdevelopment</category>
    </item>
    <item>
      <title>Using AWS MCP Servers with the Amazon Q Developer CLI</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Tue, 05 Aug 2025 16:47:34 +0000</pubDate>
      <link>https://dev.to/aws-builders/using-aws-mcp-servers-with-the-amazon-q-developer-cli-37k6</link>
      <guid>https://dev.to/aws-builders/using-aws-mcp-servers-with-the-amazon-q-developer-cli-37k6</guid>
      <description>&lt;h1&gt;
  
  
  What is MCP?
&lt;/h1&gt;

&lt;p&gt;The Model Context Protocol (MCP) is an open standard which allows AI models to communicate seamlessly with external tools. &lt;strong&gt;If you haven’t already tried using MCP servers with your favourite LLM, this is a really easy way to get started.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the context of the Amazon Q Developer CLI, MCP enables you to extend Q's capabilities by connecting it to a range of pre-built and custom tools! &lt;/p&gt;

&lt;p&gt;I wrote previously about the &lt;a href="https://dev.to/aws-heroes/10-ways-i-use-the-amazon-q-developer-cli-to-save-time-88m"&gt;Ten ways I use the Amazon Q Developer CLI to save time&lt;/a&gt;, but the addition of MCP servers can seriously &lt;strong&gt;take your productivity to a whole new level&lt;/strong&gt; and you can get started in just a few minutes with a few easy commands.&lt;/p&gt;

&lt;h2&gt;
  
  
  What can MCP Servers actually do?
&lt;/h2&gt;

&lt;p&gt;Here are a few examples of interesting MCP servers that can be added to Amazon Q Developer, to extend its functionality:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;AWS Pricing MCP Server&lt;/strong&gt; creates cost analysis reports based on current pricing&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;AWS Terraform MCP Server&lt;/strong&gt; lets you run Terraform commands, get best practice advice, run security scans on code&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Amazon EKS MCP Server&lt;/strong&gt; allows you to create EKS clusters, deploy apps, troubleshoot&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;AWS Documentation MCP Server&lt;/strong&gt; allows you to read and search the AWS documentation&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;AWS Diagram MCP Server&lt;/strong&gt; generates diagrams using the &lt;strong&gt;diagrams&lt;/strong&gt; Python package&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I needed to create a few diagrams this week, so I decided to see how the AWS Diagram MCP Server performs. You can also check out the &lt;a href="https://awslabs.github.io/mcp/" rel="noopener noreferrer"&gt;GitHub repo for AWS MCP Servers&lt;/a&gt; to see what else is available.&lt;/p&gt;

&lt;h2&gt;
  
  
  Adding the AWS Diagram MCP Server to the Amazon Q Developer CLI Agent
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Pre-requisites
&lt;/h3&gt;

&lt;p&gt;Before you start adding MCP Servers to Q Developer, there are a few prerequisites to configure. You'll need to be running &lt;strong&gt;Python ≥ 3.10&lt;/strong&gt; and &lt;strong&gt;uv&lt;/strong&gt; (used for package management), and for my example below I also needed to install &lt;strong&gt;Graphviz&lt;/strong&gt; (for graph visualizations) and &lt;strong&gt;diagrams&lt;/strong&gt; (a Python package for creating and displaying diagrams). &lt;/p&gt;

&lt;p&gt;If, like me, you're working on macOS, here's how to add these prerequisites (otherwise check the instructions for your OS):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install uv

uv python install 3.10

brew install graphviz

pip install diagrams
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Adding the MCP Server
&lt;/h3&gt;

&lt;p&gt;First I added the AWS Diagram MCP Server using this command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;qchat mcp add --name aws-diagram --command uvx --args awslabs.aws-diagram-mcp-server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I also added the AWS Documentation MCP Server using the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;qchat mcp add --name aws-docs --command uvx --args awslabs.aws-documentation-mcp-server@latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
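
&lt;p&gt;Behind the scenes, the Q CLI records MCP servers in a JSON configuration file (commonly &lt;code&gt;~/.aws/amazonq/mcp.json&lt;/code&gt;; the exact path is an assumption here, so check the docs for your version). The two commands above produce entries roughly like this, which means you can also manage servers by editing the file directly:&lt;/p&gt;

```json
{
  "mcpServers": {
    "aws-diagram": {
      "command": "uvx",
      "args": ["awslabs.aws-diagram-mcp-server"]
    },
    "aws-docs": {
      "command": "uvx",
      "args": ["awslabs.aws-documentation-mcp-server@latest"]
    }
  }
}
```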



&lt;h3&gt;
  
  
  Using the MCP Server with the Q Developer CLI Agent
&lt;/h3&gt;

&lt;p&gt;To invoke the MCP server, start a chat session with Amazon Q Developer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;q chat
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When you start the Q CLI, it will load any MCP servers that you have added. You can check that they loaded successfully using the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/tools
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should see output similar to this:&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fivbck9ko9veexhjy3cu6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fivbck9ko9veexhjy3cu6.png" alt="Screenshot showing that the MCP servers that have been added to my locally installed Amazon Q Developer CLI Agent" width="800" height="638"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After that you can start interacting with the MCP server. Here are some examples to try.&lt;/p&gt;
&lt;h2&gt;
  
  
  1. Creating a diagram needed for a lab I'm working on
&lt;/h2&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Create a diagram that shows a DynamoDB table named orders, 
data being populated from a JSON file, 
2 x Global secondary indexes (one named OrderDateIndex, 
and one named OrderStatusIndex) as well as a query and scan 
operation being submitted to the dynamodb table. 
please save the diagram to the current folder.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;The initial diagram Q created made some assumptions about how I was running commands, and added Lambda and S3, which I'm not using, so I gave it some more information to get closer to what I need:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;The scan and query operations are run from the AWS CLI 
using the AWS CloudShell and so is the batch_write_items 
command that is used to populate the table. Please update 
the diagram to remove the Lambda and S3 references, 
also the sort key for this table is user_id.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here's the final diagram, pretty good!&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft6122ptou4efgpgqpxyf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft6122ptou4efgpgqpxyf.png" alt="A diagram showing the scan and query operations being run from the AWS CLI using the AWS CloudShell. " width="800" height="1087"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  2. Convert CloudFormation or Terraform code into a diagram
&lt;/h2&gt;

&lt;p&gt;This is great for visualizing complex IaC templates and showing resource dependencies.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Create a diagram that shows everything being created 
by the CloudFormation template named library-bot.yml, 
the diagram should show all resource dependencies 
and relationships. please save the diagram to the current folder.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here's the result, enabling me to check that I've included everything I need in this template:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4ja8x69whxxs3z3mglut.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4ja8x69whxxs3z3mglut.png" alt="Diagram visualizing a CloudFormation template to build an Amazon Lex chatbot" width="800" height="755"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Other use-cases
&lt;/h1&gt;

&lt;p&gt;I can see how this MCP server could be really useful for reviewing architectures, explaining concepts in meetings, quickly creating diagrams to illustrate different options for comparison and discussion, and helping onboard new team members by enabling them to visualize existing architectures. I'm looking forward to trying some of the other capabilities!&lt;/p&gt;

&lt;p&gt;Have you started using any of the MCP servers with the Amazon Q Developer CLI Agent yet? &lt;/p&gt;

</description>
      <category>aws</category>
      <category>development</category>
      <category>cloudcomputing</category>
      <category>mcp</category>
    </item>
    <item>
      <title>10 ways I use the Amazon Q Developer CLI to save time</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Wed, 25 Jun 2025 07:40:35 +0000</pubDate>
      <link>https://dev.to/aws-heroes/10-ways-i-use-the-amazon-q-developer-cli-to-save-time-88m</link>
      <guid>https://dev.to/aws-heroes/10-ways-i-use-the-amazon-q-developer-cli-to-save-time-88m</guid>
      <description>&lt;p&gt;If you haven’t already started using the Amazon Q Developer CLI Agent in your terminal, this is your sign to &lt;a href="https://docs.aws.amazon.com/amazonq/latest/qdeveloper-ug/command-line-installing.html" rel="noopener noreferrer"&gt;install it right now&lt;/a&gt;, and get more done without the burnout.&lt;/p&gt;

&lt;p&gt;I wrote previously about &lt;a href="https://dev.to/aws-heroes/create-architecture-diagrams-in-seconds-with-the-amazon-q-developer-cli-agent-1110"&gt;how I used Q Developer to create architecture diagrams in seconds from a CloudFormation template&lt;/a&gt;, but recently I challenged myself to integrate AI into every possible area of my workflow. &lt;/p&gt;

&lt;h2&gt;
  
  
  Avoiding burnout from day-to-day activities
&lt;/h2&gt;

&lt;p&gt;I should mention that my day-to-day role focuses on content creation. This involves a lot of hands-on implementation, primarily using AWS services.&lt;/p&gt;

&lt;p&gt;A large part of my day is currently spent designing practical exercises, building from scratch, but also operating within the constraints of a sandbox environment which has its own challenges. For instance, in our environment certain permissions are denied to prevent misuse, and each sandbox is only available for a few hours before being wiped clean, and all resources deleted. &lt;/p&gt;

&lt;p&gt;I also have a lot of variety in my work, in terms of the AWS services I get to work with, though right now I’m working mainly with AI/ML focussed tech. &lt;strong&gt;This is a fast-moving space, so any help I can get in terms of lifting the cognitive load all contributes to saving my rocket fuel, avoiding burnout, and freeing my mind to think creatively.&lt;/strong&gt; &lt;/p&gt;

&lt;h2&gt;
  
  
  My top ten
&lt;/h2&gt;

&lt;p&gt;So here are the ways that Amazon Q Developer has been most effective in lightening the load. If you work with AWS regularly, then I hope that some of these might save you some time and energy too!&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Creating IaC templates, e.g. &lt;code&gt;“convert this CloudFormation template to CDK”&lt;/code&gt;, or &lt;code&gt;"help me write a CloudFormation template to provision this IAM policy"&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Looking up the syntax of some of the more obscure AWS CLI commands that I don't use very often e.g. &lt;code&gt;“how do I stop a Comprehend sentiment detection job using the AWS CLI”&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Performing quick troubleshooting when an error occurs e.g. &lt;code&gt;"I have already configured IAM permissions, so why am I seeing this error: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied"&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Generating sample data to use for testing / proof of concept e.g. &lt;code&gt;"generate a csv file containing 100 examples of synthetic names and addresses"&lt;/code&gt; Interestingly, when I asked for 100 examples of Personally Identifiable Information (PII), Q declined. Instead it created the file and redacted the PII!&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Getting help writing IAM policies and S3 bucket policies e.g. &lt;code&gt;"review this IAM policy for security best practices"&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Explaining code e.g. &lt;code&gt;“explain what this Python function does”&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Generating code snippets, functions, or entire files with natural language prompts e.g. &lt;code&gt;“add error handling to this function”&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Generating complex bash commands without having to remember the syntax e.g. &lt;code&gt;"show me how to find all files that have been modified in the last 7 days"&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Getting help with using AWS SDKs and APIs e.g. &lt;code&gt;“explain how to use boto3 to interact with Amazon Bedrock”&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Getting best practice guidance e.g. &lt;code&gt;“explain the best practices when setting common request body parameters for models in Amazon Bedrock”&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
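
&lt;p&gt;As an illustration of number 8, the classic answer is &lt;code&gt;find . -type f -mtime -7&lt;/code&gt;; if you'd rather stay in Python, a stdlib-only equivalent (my own sketch, not Q's output) looks like this:&lt;/p&gt;

```python
import os
import time

def recently_modified(root=".", days=7):
    """Return paths of files under root modified within the last `days` days."""
    cutoff = time.time() - days * 86400
    recent = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) >= cutoff:
                    recent.append(path)
            except OSError:
                pass  # file vanished or unreadable; skip it
    return recent

print(len(recently_modified(".")))
```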

&lt;p&gt;Here's why I think the Amazon Q CLI agent is a game changer: &lt;strong&gt;each of these capabilities saves me significant time by providing immediate answers and guidance&lt;/strong&gt; without having to search through documentation or switch context to one of the 44 browser tabs I currently have open. In my role, &lt;strong&gt;my differentiator, or unique selling point is creativity, ideas, and imagination - the Q CLI gives me the headspace to focus on exactly that&lt;/strong&gt;, allowing me to preserve my energy for tasks that really matter.&lt;/p&gt;

&lt;p&gt;If you have been using Amazon Q recently, I'd love to hear how it has saved you time and energy, what should I try next?&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aws</category>
      <category>coding</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Create Architecture Diagrams in Seconds with the Amazon Q Developer CLI Agent!</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Fri, 23 May 2025 17:17:04 +0000</pubDate>
      <link>https://dev.to/aws-heroes/create-architecture-diagrams-in-seconds-with-the-amazon-q-developer-cli-agent-1110</link>
      <guid>https://dev.to/aws-heroes/create-architecture-diagrams-in-seconds-with-the-amazon-q-developer-cli-agent-1110</guid>
      <description>&lt;p&gt;If you haven't used the Amazon Q Developer CLI Agent, this is your sign to get started!&lt;/p&gt;

&lt;h2&gt;
  
  
  Installation Steps
&lt;/h2&gt;

&lt;p&gt;Install is very quick and easy, though the steps vary depending on your operating system, so you'll need to &lt;a href="https://docs.aws.amazon.com/amazonq/latest/qdeveloper-ug/command-line-installing.html" rel="noopener noreferrer"&gt;follow the required steps for your particular OS&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I use a Mac, so my steps were really straightforward. If you're also using a Mac, here's what you do to install the Q Developer CLI. In a terminal, simply type: &lt;/p&gt;

&lt;p&gt;&lt;code&gt;brew install amazon-q&lt;/code&gt;&lt;br&gt;
You can verify the install using:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;q --version&lt;/code&gt;&lt;br&gt;
If it doesn’t show the version the first time, then just restart your shell or terminal and try again.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;exec $SHELL
q --version
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, log in to the Q Developer CLI. You'll get the choice of logging in with your AWS Builder ID, which is the free-tier option, or, if you're subscribed to the Pro tier, with your IAM Identity Center credentials:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;q login&lt;/code&gt;&lt;br&gt;
Then, to start a session with the Q Developer CLI, it's:&lt;br&gt;
&lt;code&gt;q chat&lt;/code&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Creating Architecture Diagrams from an IaC Template
&lt;/h2&gt;

&lt;p&gt;I wanted to use Q to help me quickly create an architecture diagram based on the contents of my &lt;a href="https://github.com/fayekins/amazon-q-dev/blob/main/cf-template.yaml" rel="noopener noreferrer"&gt;CloudFormation template that I was using to create a basic VPC&lt;/a&gt;. As my template is saved in GitHub, I provided a link to the raw file in my prompt and Q was able to find it. You can of course also get Q to read files on your local machine, inside the directory you're working in. &lt;/p&gt;
&lt;h3&gt;
  
  
  The Prompt
&lt;/h3&gt;

&lt;p&gt;First I tried creating a Mermaid diagram. If you haven't used Mermaid before, it's a JavaScript-based tool that renders diagrams from markdown-style text.&lt;br&gt;
Here's the prompt I used:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Create a mermaid diagram that represents the contents
of the CloudFormation template named cf-template.yaml.
Here is a link to the template:
https://raw.githubusercontent.com/fayekins/amazon-q-dev/refs/heads/main/cf-template.yaml 
The diagram should include all the AWS resources that
are described in the template, and use the following 
best practices: 
consistent spacing between elements, 
align elements properly, 
use clear labels for all components, 
include CIDR blocks for all networking components,
make sure text is a readable size.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The resulting diagram that Amazon Q Developer created can be viewed by pasting the resulting code into &lt;a href="https://mermaid.live/" rel="noopener noreferrer"&gt;the Mermaid Live Editor&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Here's the result, not too bad! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdmqkn7yghdf5anby4ii6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdmqkn7yghdf5anby4ii6.png" alt="Q generated a simple diagram based on my CloudFormation template" width="800" height="439"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Refining the Diagram
&lt;/h3&gt;

&lt;p&gt;Ultimately, I wanted to get Q to create diagrams that I could actually use. For that I really need the official AWS icons. So I asked Q to create a draw.io diagram instead, using the official icons. Here's my next prompt:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Can you create the same diagram again but this time
create a draw.io diagram using the correct AWS icons
for the AWS resources.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here's the response from Q, explaining the steps taken and instructions for viewing the diagram. &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8yhyd1r4hgasaadyc794.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8yhyd1r4hgasaadyc794.png" alt="The response from Q, explaining the steps taken and instructions for viewing the diagram" width="800" height="473"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;And here is the result, a pretty decent diagram!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdw758mue40v8niw8igs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbdw758mue40v8niw8igs.png" alt="An updated version of my diagram, in draw.io format and using AWS icons" width="601" height="521"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I'm really happy with the result, and will definitely be able to use this capability going forward! Amazon Q Developer is certainly improving fast and I'm looking forward to seeing what else it can do to save me time day-to-day. &lt;/p&gt;

&lt;p&gt;Would you use AI to create your architecture diagrams? &lt;/p&gt;

</description>
      <category>aws</category>
      <category>architecture</category>
      <category>ai</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Tutorial: Build an Agentic AI Application with Agents for Amazon Bedrock</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Tue, 11 Mar 2025 15:08:54 +0000</pubDate>
      <link>https://dev.to/aws-builders/tutorial-build-an-agentic-ai-application-with-agents-for-amazon-bedrock-2cpk</link>
      <guid>https://dev.to/aws-builders/tutorial-build-an-agentic-ai-application-with-agents-for-amazon-bedrock-2cpk</guid>
      <description>&lt;p&gt;Here's a step-by-step process for building an application that uses  Agents for Amazon Bedrock to trigger a Lambda function with the ability to execute tasks for you. The code I used can be found &lt;a href="https://github.com/fayekins/bedrock-agent-demo" rel="noopener noreferrer"&gt;here&lt;/a&gt;. The cost will be &amp;lt;$5, if you remember to run all the clean-up steps at the end!&lt;/p&gt;

&lt;p&gt;This simple application allows parents to book an appointment with their child's high school teacher at the upcoming Parents and Teachers Evening. Data relating to the available time slots, existing appointment bookings, teachers, their subjects, and classrooms is stored in DynamoDB tables. &lt;/p&gt;

&lt;p&gt;The architecture looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frl4pchimhezmh7ity0lu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frl4pchimhezmh7ity0lu.png" alt="Architecture diagram showing the interaction between Amazon Bedrock, Lambda and DynamoDB" width="755" height="381"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Architecture Components
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Amazon Bedrock
&lt;/h3&gt;

&lt;p&gt;Used to provide API access to the required foundation model. In this example, the model we are using is: &lt;code&gt;anthropic.claude-3-sonnet-20240229-v1:0&lt;/code&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Amazon Bedrock Agent
&lt;/h3&gt;

&lt;p&gt;The Bedrock agent uses the reasoning capabilities of the specified AI model, along with the data available to it, and the available actions to figure out how to deal with the user requests it receives. In this example, the agent needs to answer questions like checking the availability of appointments at the Parents and Teachers Evening, as well as booking the appointments. &lt;/p&gt;

&lt;h3&gt;
  
  
  Amazon Bedrock Action Group
&lt;/h3&gt;

&lt;p&gt;This defines the actions that the agent can take, for instance the ability to search and update the data held in DynamoDB. The actions are really API calls that the agent is able to make in order to fulfil user requests.&lt;/p&gt;

&lt;h3&gt;
  
  
  Lambda Function
&lt;/h3&gt;

&lt;p&gt;A Lambda function is used to make the necessary API calls, like searching and updating the data held in DynamoDB.&lt;/p&gt;
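
&lt;p&gt;To make this concrete, here's a minimal sketch of such a handler, assuming the agent's function-details event format (the response shape follows the documented Bedrock agent contract; the availability logic is stubbed out for illustration rather than querying DynamoDB):&lt;/p&gt;

```python
def lambda_handler(event, context):
    """Minimal Bedrock agent action-group handler (function-details format).

    The agent invokes this with the function name and parameters it chose;
    whatever text body the handler returns is folded into the agent's answer.
    """
    function = event["function"]
    # Parameters arrive as a list of {"name", "type", "value"} dicts
    params = {p["name"]: p["value"] for p in event.get("parameters", [])}

    if function == "check_teacher_availability":
        # Illustrative stub; a real handler would query the DynamoDB table here
        body = f"{params.get('teacher_name', 'Unknown')} is available at 17:00"
    elif function == "book_appointment":
        body = f"Appointment booked with {params.get('teacher_name', 'Unknown')}"
    else:
        body = f"Unknown function: {function}"

    return {
        "messageVersion": "1.0",
        "response": {
            "actionGroup": event["actionGroup"],
            "function": function,
            "functionResponse": {
                "responseBody": {"TEXT": {"body": body}}
            },
        },
    }
```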

&lt;h3&gt;
  
  
  DynamoDB
&lt;/h3&gt;

&lt;p&gt;DynamoDB is used to store the data held by the system. Two tables are created and populated with data relating to appointments with teachers at the upcoming parents' evening. This is our custom data that our agent is able to interact with.&lt;/p&gt;

&lt;h3&gt;
  
  
  Jupyter Notebook
&lt;/h3&gt;

&lt;p&gt;A Jupyter notebook running in SageMaker is used as the IDE (integrated development environment) to run all the commands used to build the various components in AWS, as well as to download the GitHub repository and run the Python code.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;1) Do everything in us-east-1.&lt;br&gt;
2) In your AWS account, be sure to request access for the Bedrock models that you would like to use. You'll find this in the Bedrock console, under &lt;strong&gt;model access&lt;/strong&gt;. (For this exercise, I enabled &lt;code&gt;anthropic.claude-3-sonnet-20240229-v1:0&lt;/code&gt; .)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl6isbupspx0e55ngmu1t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl6isbupspx0e55ngmu1t.png" alt="Image showing how to enable access to models in Bedrock" width="800" height="396"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3) Before creating the SageMaker Notebook, first make sure you have a SageMaker AI Domain in us-east-1. This one-time step creates the home directory space and VPC configuration needed by any notebooks you create in this Region. If you don't have one already, select the &lt;strong&gt;Create domain&lt;/strong&gt; option, and it will do everything for you.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6hp734clwzf1zw5ytsjy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6hp734clwzf1zw5ytsjy.png" alt="Image showing the creation screen for a SageMaker AI Domain" width="800" height="357"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Building the Application
&lt;/h2&gt;

&lt;p&gt;1) Use &lt;a href="https://github.com/fayekins/bedrock-agent-demo/blob/main/create_SM_notebook.yaml" rel="noopener noreferrer"&gt;this CloudFormation template - create_SM_notebook.yaml&lt;/a&gt; to create a SageMaker Notebook, that we'll use to run the commands from. The template will configure the SageMaker Notebook instance, with an associated IAM role that includes permissions for a few required services, including:&lt;/p&gt;

&lt;p&gt;Bedrock full access&lt;br&gt;
DynamoDB full access&lt;br&gt;
IAM full access&lt;br&gt;
Lambda full access&lt;/p&gt;

&lt;p&gt;This access is needed in the beginning because we'll be running commands on the notebook instance to build everything. After everything has been configured, these permissions can be tightened up.&lt;/p&gt;

&lt;p&gt;2) When the notebook is ready, select the notebook instance and select &lt;strong&gt;open Jupyter Lab&lt;/strong&gt;. The required &lt;a href="https://github.com/fayekins/bedrock-agent-demo.git" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt; containing the application code will already be downloaded and saved to a folder named &lt;code&gt;bedrock-agent-demo&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbytnu5y7xt4l7z8mizjz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbytnu5y7xt4l7z8mizjz.png" alt="Image showing the Jupyter Lab environment, with downloaded GitHub repository" width="800" height="250"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3) From the cloned repository, open the file named: &lt;code&gt;bedrock_agents_demo.ipynb&lt;/code&gt; - this is an Interactive Python Notebook, each block of code is displayed in a cell that can be run in sequence, to observe the outcome of each step.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3skxo43y23sb7ndhq2qg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3skxo43y23sb7ndhq2qg.png" alt="Image showing the bedrock_agents_demo.ipynb file" width="800" height="386"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;4) Run all the cells contained in the &lt;a href="https://github.com/fayekins/bedrock-agent-demo/blob/main/bedrock_agents_demo.ipynb" rel="noopener noreferrer"&gt;bedrock_agents_demo.ipynb&lt;/a&gt; file. At a high level, they will do the following:&lt;/p&gt;

&lt;p&gt;Install required libraries such as boto3, the AWS SDK for Python, which is used to interact with Bedrock. &lt;/p&gt;

&lt;p&gt;Create two DynamoDB tables to store data about teachers and appointments, then populate them with some sample data that the agent will be able to interact with. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcpnn0xogywpadl02aprl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcpnn0xogywpadl02aprl.png" alt="Image showing the items in the DynamoDB table" width="800" height="270"&gt;&lt;/a&gt;&lt;/p&gt;
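&lt;p&gt;As a rough sketch of what this step involves (the table name, key schema, and sample items here are my own illustrative assumptions, not necessarily what the notebook uses), creating and populating a table with boto3 might look like this:&lt;/p&gt;

```python
# Illustrative sketch only: the table name, key schema, and sample items
# are assumptions, not necessarily what the notebook uses.

def sample_teachers():
    """A few sample items of the kind the agent can query."""
    return [
        {"teacher_id": "T1", "name": "Miss White", "subject": "Sociology"},
        {"teacher_id": "T2", "name": "Mr Stokes", "subject": "History"},
    ]

def create_and_populate():
    """Create the table and load the sample data (needs AWS credentials)."""
    import boto3  # imported here so the helper above stays dependency-free
    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.create_table(
        TableName="teachers",  # placeholder name
        KeySchema=[{"AttributeName": "teacher_id", "KeyType": "HASH"}],
        AttributeDefinitions=[{"AttributeName": "teacher_id", "AttributeType": "S"}],
        BillingMode="PAY_PER_REQUEST",  # no capacity planning needed for a demo
    )
    table.wait_until_exists()
    # batch_writer buffers put_item calls and sends them in batches
    with table.batch_writer() as writer:
        for item in sample_teachers():
            writer.put_item(Item=item)
```

&lt;p&gt;The notebook's own cells do the equivalent work; this is just the general shape of a boto3 table build.&lt;/p&gt;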

&lt;p&gt;Create a Lambda function that can check whether a teacher is available at a specific time, book an appointment with a teacher, and get all available appointment slots.&lt;/p&gt;
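&lt;p&gt;To give a sense of the shape of such a function (this follows the Bedrock Agents "function details" event and response formats; the function name, parameter names, and injected data source here are illustrative, not the notebook's exact code), a minimal handler might look like this:&lt;/p&gt;

```python
# Sketch of a Lambda handler for a Bedrock Agents action group.
# Function and parameter names are illustrative assumptions.

def _params(event):
    # Bedrock passes parameters as a list of {"name", "type", "value"} dicts
    return {p["name"]: p["value"] for p in event.get("parameters", [])}

def handler(event, context, availability=None):
    # `availability` is injected here for testability; real code would
    # query the DynamoDB appointments table instead.
    availability = availability or {}
    func = event["function"]
    if func == "check_teacher_availability":
        p = _params(event)
        slots = availability.get(p["teacher"], [])
        body = f"Available slots for {p['teacher']}: {slots}"
    else:
        body = f"Unknown function: {func}"
    # The response echoes the action group and function, with the result
    # wrapped in functionResponse.responseBody
    return {
        "messageVersion": "1.0",
        "response": {
            "actionGroup": event["actionGroup"],
            "function": func,
            "functionResponse": {"responseBody": {"TEXT": {"body": body}}},
        },
    }
```

&lt;p&gt;The agent decides which function to call and with what parameters; the handler only has to dispatch on the function name and return text the model can use.&lt;/p&gt;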

&lt;p&gt;Create the agent and action group.&lt;br&gt;
The action group defines all the actions that the agent is able to perform, for instance &lt;code&gt;check_teacher_availability&lt;/code&gt; and &lt;code&gt;book_appointment&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp1o32jh3hmwr8zhdpvgl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp1o32jh3hmwr8zhdpvgl.png" alt="Image showing the action group in the AWS console" width="800" height="554"&gt;&lt;/a&gt;&lt;/p&gt;
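&lt;p&gt;Roughly, an action group like this can be defined with the &lt;code&gt;bedrock-agent&lt;/code&gt; API. In this sketch the agent ID, Lambda ARN, group name, and parameter descriptions are placeholders of my own, not values from the notebook:&lt;/p&gt;

```python
# Sketch of defining an action group with the bedrock-agent API.
# Agent ID, Lambda ARN, and descriptions below are placeholders.

def function_schema():
    """Describe each function so the model knows when and how to call it."""
    return {
        "functions": [
            {
                "name": "check_teacher_availability",
                "description": "List a teacher's free appointment slots.",
                "parameters": {
                    "teacher": {
                        "type": "string",
                        "required": True,
                        "description": "Name of the teacher",
                    },
                },
            },
            {
                "name": "book_appointment",
                "description": "Book an appointment slot with a teacher.",
                "parameters": {
                    "teacher": {
                        "type": "string",
                        "required": True,
                        "description": "Name of the teacher",
                    },
                    "slot": {
                        "type": "string",
                        "required": True,
                        "description": "Requested time, e.g. 18:30",
                    },
                },
            },
        ]
    }

def create_action_group(agent_id, lambda_arn):
    """Attach the action group to a draft agent (needs AWS credentials)."""
    import boto3
    client = boto3.client("bedrock-agent")
    client.create_agent_action_group(
        agentId=agent_id,
        agentVersion="DRAFT",  # action groups are edited on the draft version
        actionGroupName="appointments",
        actionGroupExecutor={"lambda": lambda_arn},
        functionSchema=function_schema(),
    )
```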

&lt;h2&gt;
  
  
  Testing
&lt;/h2&gt;

&lt;p&gt;Run some prompts to test that everything is working. You can modify the input text to try different prompts and see what is possible. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1mrrymgjm4aequxirq49.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1mrrymgjm4aequxirq49.png" alt="Image showing a prompt being provided to the application" width="800" height="492"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkwjb8e7hug3wpncu7is1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkwjb8e7hug3wpncu7is1.png" alt="Image showing the response from the application" width="800" height="124"&gt;&lt;/a&gt;&lt;/p&gt;
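&lt;p&gt;Under the hood, prompts like these go through the &lt;code&gt;bedrock-agent-runtime&lt;/code&gt; API's &lt;code&gt;invoke_agent&lt;/code&gt; call, which returns the reply as a stream of chunk events. A hedged sketch (the agent and alias IDs are placeholders):&lt;/p&gt;

```python
# Sketch only: agent and alias IDs are placeholders, and the helper below
# just reassembles the streamed response chunks into one string.

def collect_chunks(events):
    """Join the text carried by each "chunk" event into one response."""
    parts = []
    for event in events:
        chunk = event.get("chunk")
        if chunk:
            parts.append(chunk["bytes"].decode("utf-8"))
    return "".join(parts)

def ask_agent(prompt):
    """Send a prompt to the agent (needs AWS credentials and a real agent)."""
    import boto3, uuid
    runtime = boto3.client("bedrock-agent-runtime")
    response = runtime.invoke_agent(
        agentId="AGENT_ID",           # placeholder
        agentAliasId="ALIAS_ID",      # placeholder
        sessionId=str(uuid.uuid4()),  # reuse one ID to keep multi-turn context
        inputText=prompt,
    )
    return collect_chunks(response["completion"])
```

&lt;p&gt;Reusing the same &lt;code&gt;sessionId&lt;/code&gt; across calls is what lets the agent remember earlier turns of the conversation.&lt;/p&gt;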

&lt;h2&gt;
  
  
  Example Prompts to Try
&lt;/h2&gt;

&lt;p&gt;Try running the following prompts, or create your own:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Who teaches Sociology?&lt;/li&gt;
&lt;li&gt;Book the first available appointment with the I.T. teacher&lt;/li&gt;
&lt;li&gt;When is the history teacher available?&lt;/li&gt;
&lt;li&gt;List all Miss White's appointments&lt;/li&gt;
&lt;li&gt;Cancel the 18:30 appointment with Mr Stokes&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Observe how the agent combines reasoning with a small number of very basic actions and the limited data it has available to try to fulfil your requests. Explore what the agent is able to do, and what its limitations are. Even with a small amount of data and some simple actions, it can do quite a lot by reasoning or asking for clarification to get the job done! &lt;/p&gt;

&lt;h2&gt;
  
  
  Cleaning Up to Avoid Charges
&lt;/h2&gt;

&lt;p&gt;After exploring, be sure to run the last few cells in the notebook, to clean up the DynamoDB tables, the Bedrock Action Group and Agent, and the Lambda Function to avoid unnecessary charges. Then remember to delete the CloudFormation stack as well if you no longer need the Jupyter notebook instance.&lt;/p&gt;

</description>
      <category>amazonbedrock</category>
      <category>generativeai</category>
      <category>agenticai</category>
      <category>aws</category>
    </item>
    <item>
      <title>Reimagining Resilience with Commvault</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Wed, 11 Dec 2024 12:37:40 +0000</pubDate>
      <link>https://dev.to/aws-builders/reimagining-resilience-with-commvault-3e4e</link>
      <guid>https://dev.to/aws-builders/reimagining-resilience-with-commvault-3e4e</guid>
      <description>&lt;p&gt;In today's rapidly evolving digital landscape, the importance of data protection and cyber-resilience cannot be overstated. As organizations increasingly adopt cloud-first strategies and navigate complex multi-cloud environments, the need for robust, scalable, and intelligent solutions has never been greater. &lt;/p&gt;

&lt;p&gt;That's why I'm thrilled to share some exciting developments from Commvault and their recent acquisition, Clumio, that are set to revolutionize the way we approach data protection and cyber-resilience in the cloud-first era.&lt;/p&gt;

&lt;h3&gt;
  
  
  Unveiling Game-Changing Innovations at AWS re:Invent
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F075qulfvpcaa1fwvb0ng.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F075qulfvpcaa1fwvb0ng.jpg" alt="Commvault at AWS re:Invent 2024" width="800" height="1066"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At AWS re:Invent 2024, Commvault showcased a suite of cutting-edge solutions that promise to redefine cyber-resilience.&lt;/p&gt;

&lt;h3&gt;
  
  
  Lightning-Fast Recovery with Clumio Backtrack
&lt;/h3&gt;

&lt;p&gt;One of the most anticipated announcements is &lt;a href="https://www.commvault.com/platform/clumio-backtrack" rel="noopener noreferrer"&gt;Clumio Backtrack&lt;/a&gt;, a groundbreaking capability that enables lightning-fast recovery of Amazon S3 buckets, effortlessly scaling to billions of objects. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1kdb0q1tg1sy2ry3xpjn.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1kdb0q1tg1sy2ry3xpjn.jpeg" alt="Matt Garman, CEO of AWS, Amazon S3 hosts over 400 trillion objects worldwide" width="600" height="212"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With over 400 trillion objects being stored in S3 globally, this is a game-changer for organizations dealing with massive datasets, especially considering more data than ever is being collected and stored as organizations gear up their AI and machine learning efforts. &lt;/p&gt;

&lt;p&gt;I was excited to be able to experience Clumio Backtrack firsthand, and witness the process of selecting a point-in-time recovery source, and rolling back specific versions of selected S3 objects. &lt;/p&gt;

&lt;p&gt;I was surprised to see that recovery completed without issues, within seconds of initiating the job!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcjk35leclhy79m15j6e4.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcjk35leclhy79m15j6e4.jpg" alt="Clumio Backtrack in action - recovery completes within seconds" width="800" height="642"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Turning Back Time with Cloud Rewind
&lt;/h3&gt;

&lt;p&gt;Commvault recently introduced &lt;a href="https://www.commvault.com/platform/cloud-rewind" rel="noopener noreferrer"&gt;Cloud Rewind&lt;/a&gt;, which is like a time machine for AWS environments! It enables organizations to roll back to a clean state from before a cyber incident, not just recovering data but also rebuilding entire cloud applications and infrastructure. It's like having a reset button for your entire cloud estate. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsii0fwzswed7ogpmgaty.JPG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsii0fwzswed7ogpmgaty.JPG" alt="Cloud Rewind Demo - seamlessly rebuilding AWS infrastructure at the click of a button" width="800" height="1422"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Seeing this in action was particularly impressive, and you can &lt;a href="https://www.commvault.com/request-demo" rel="noopener noreferrer"&gt;request your own demo&lt;/a&gt; of Clumio Backtrack and Cloud Rewind, to see for yourself!&lt;/p&gt;

&lt;h3&gt;
  
  
  Fortifying Your Cyber Defenses
&lt;/h3&gt;

&lt;p&gt;In addition to these exciting launches, Commvault is bolstering its cyber-resilience arsenal with two more upcoming offerings:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://www.commvault.com/platform/air-gap-protect" rel="noopener noreferrer"&gt;Cloud Air Gap Protect&lt;/a&gt;: Coming soon to AWS, this solution will provide air-gapped, immutable, and indelible copies of critical data, offering an extra layer of protection against sophisticated cyber threats.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://www.commvault.com/platform/cleanroom-recovery" rel="noopener noreferrer"&gt;Cloud Cleanroom Recovery&lt;/a&gt;: Also on the horizon for AWS users, this feature will deliver an on-demand, isolated cloud-based cyber recovery environment. It's perfect for both resilience testing and actual recovery scenarios.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  A New Era of Continuous Business
&lt;/h3&gt;

&lt;p&gt;The combination of Commvault and Clumio marks a significant milestone in the journey towards true cyber-resilience. By combining Commvault's enterprise-grade data protection with Clumio's cloud-native expertise, organizations can now access a comprehensive suite of solutions that ensure business continuity in the face of any threat.&lt;/p&gt;

&lt;p&gt;As we look to the future, the ability to protect, recover, and rebuild critical data and systems will be paramount, as the value of data increases exponentially. With these new offerings, Commvault is empowering businesses to face the challenges of the digital age with confidence, knowing that their most valuable asset – their data – is always secure and recoverable.&lt;/p&gt;

&lt;p&gt;The future of cyber-resilience is here, and it's more exciting than ever. &lt;a href="https://www.commvault.com/request-demo" rel="noopener noreferrer"&gt;Request a demo&lt;/a&gt; to learn more!&lt;/p&gt;

</description>
      <category>continousbusiness</category>
      <category>cyberresilience</category>
      <category>awsreinvent</category>
      <category>commvaultcloud</category>
    </item>
    <item>
      <title>Save time with the Amazon Bedrock Converse API!</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Tue, 26 Nov 2024 17:45:52 +0000</pubDate>
      <link>https://dev.to/aws-builders/save-time-with-the-amazon-bedrock-converse-api-2ai6</link>
      <guid>https://dev.to/aws-builders/save-time-with-the-amazon-bedrock-converse-api-2ai6</guid>
      <description>&lt;p&gt;With Bedrock you get access a range of different large language models, ( for instance, Claude, Mistral, Llama and Amazon Titan) with new versions becoming available all the time. &lt;/p&gt;

&lt;p&gt;Having choice is great, but having to code your requests differently for each model is a pain.&lt;/p&gt;

&lt;p&gt;Here’s why the Amazon Bedrock Converse API is going to save you a bunch of time and effort when comparing the output of different foundation models!&lt;/p&gt;

&lt;h3&gt;
  
  
  Consistency is key!
&lt;/h3&gt;

&lt;p&gt;The Converse API is a consistent interface that works with &lt;a href="https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html" rel="noopener noreferrer"&gt;all models that support messages / system prompts&lt;/a&gt;. This means that you can write your code once, and use it to experiment with different models.&lt;/p&gt;

&lt;p&gt;Here’s an example of how it works, and this exercise should cost &amp;lt; $1.&lt;/p&gt;

&lt;h3&gt;
  
  
  Configure model access
&lt;/h3&gt;

&lt;p&gt;Before you begin, be sure to check that the models you want to use are available in your region, and that you have enabled access to them. Here are the ones I'm using; you can select these or choose your own: &lt;br&gt;
&lt;code&gt;anthropic.claude-v2&lt;br&gt;
anthropic.claude-3-haiku-20240307-v1:0&lt;br&gt;
anthropic.claude-3-5-sonnet-20240620-v1:0&lt;br&gt;
mistral.mistral-small-2402-v1:0&lt;br&gt;
&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcz2yedbbi4rdo85b120a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcz2yedbbi4rdo85b120a.png" alt=" " width="800" height="375"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;1) We can do everything using the CloudShell in the AWS console.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F57pvfbw1rppuknwur7sb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F57pvfbw1rppuknwur7sb.png" alt=" " width="800" height="245"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;2) When the CloudShell is ready, install boto3, which is the AWS SDK for Python:&lt;br&gt;
&lt;code&gt;pip install boto3&lt;br&gt;
&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn4h0ogwam29dy2mf5an5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn4h0ogwam29dy2mf5an5.png" alt=" " width="800" height="223"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;3) Download the file named converse_demo.py from &lt;a href="https://github.com/fayekins/demos/blob/main/converse_demo.py" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;. You can do this using wget and providing the raw path to the file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;wget https://raw.githubusercontent.com/fayekins/demos/refs/heads/main/converse_demo.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fukkrsbovms6icg2tuh7w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fukkrsbovms6icg2tuh7w.png" alt=" " width="800" height="234"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  converse_demo.py
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#first we import boto3 and json 
import boto3, json

#create a boto3 session - stores config state and allows you to create service clients
session = boto3.Session()

#create a Bedrock Runtime Client instance - used to send API calls to AI models in Bedrock
bedrock = session.client(service_name='bedrock-runtime')

#here's our prompt telling the model what we want it to do, we can change this later
system_prompts = [{"text": "You are an app that creates reading lists for book groups."}]

#define an empty message list - to be used to pass the messages to the model
message_list = []

#here’s the message that I want to send to the model, we can change this later if we want
initial_message = {
    "role": "user",
    "content": [{"text": "Create a list of five novels suitable for a book group who are interested in classic novels."}],
}

#the message above is appended to the message_list
message_list.append(initial_message)

#make an API call to the Bedrock Converse API, defining the model to use, the messages, and the inference parameters
response = bedrock.converse(
    modelId="anthropic.claude-v2",
    messages=message_list,
    system=system_prompts,
    inferenceConfig={
        "maxTokens": 2048,
        "temperature": 0,
        "topP": 1
    },
)

#extract the generated message from the response and print it
response_message = response['output']['message']
print(json.dumps(response_message, indent=4))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;4) Run the Python code like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python converse_demo.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It should give you an output similar to this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhtlou91n0kl46vwlkjus.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhtlou91n0kl46vwlkjus.png" alt=" " width="800" height="276"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;5) We can also run this same code using a different model, by replacing the model ID in our code as follows:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;anthropic.claude-3-haiku-20240307-v1:0&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Compare the output from the second model; it is slightly different: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F179agtbs8vu71251mdk0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F179agtbs8vu71251mdk0.png" alt=" " width="800" height="310"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;6) We can test again with another version: &lt;/p&gt;

&lt;p&gt;&lt;code&gt;anthropic.claude-3-5-sonnet-20240620-v1:0&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs9hln08tlj0hq5i0j7c4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs9hln08tlj0hq5i0j7c4.png" alt=" " width="800" height="283"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When a new version of Claude is released, we can request access and then just replace the name of the model in our code!&lt;/p&gt;

&lt;h4&gt;
  
  
  Access denied error
&lt;/h4&gt;

&lt;p&gt;If you see an error similar to this, it just means you are trying to use a model that you don't have access to yet. Simply request access to the model, and try again after access is granted.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6tcf2eezt0i3mjveiosn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6tcf2eezt0i3mjveiosn.png" alt=" " width="800" height="232"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;7) I also tried it with a different model provider, by changing the model ID to: &lt;/p&gt;

&lt;p&gt;&lt;code&gt;mistral.mistral-small-2402-v1:0&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F33qmh9ltfxv2kc6toa24.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F33qmh9ltfxv2kc6toa24.png" alt=" " width="800" height="185"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So the Converse API gives you a simple, consistent API that works with all Amazon Bedrock models that support messages. This means that you can write your code once and use it with different models to compare the results!&lt;/p&gt;
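&lt;p&gt;To make that concrete, here is a minimal sketch that reuses one request across several models. The model IDs are the ones from this post; swap in whichever ones you have access to:&lt;/p&gt;

```python
# Sketch: one converse() request reused across several model IDs.
# Model IDs below are the ones mentioned in this post; swap in your own.

MODEL_IDS = [
    "anthropic.claude-3-haiku-20240307-v1:0",
    "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "mistral.mistral-small-2402-v1:0",
]

def build_request(prompt):
    """The same messages and inference config work for every model."""
    return {
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "system": [{"text": "You are an app that creates reading lists for book groups."}],
        "inferenceConfig": {"maxTokens": 2048, "temperature": 0, "topP": 1},
    }

def compare_models(prompt):
    """Run the same prompt against each model (needs AWS credentials)."""
    import boto3
    bedrock = boto3.client("bedrock-runtime")
    request = build_request(prompt)
    for model_id in MODEL_IDS:
        response = bedrock.converse(modelId=model_id, **request)
        text = response["output"]["message"]["content"][0]["text"]
        print(f"=== {model_id} ===")
        print(text)
```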

&lt;p&gt;So next time you’re working with Bedrock, do yourself a favour, try out the Converse API, and thank me later!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>bedrock</category>
      <category>python</category>
      <category>genai</category>
    </item>
    <item>
      <title>Here's how I improved my public speaking, and made it an enjoyable part of my role!</title>
      <dc:creator>Faye Ellis</dc:creator>
      <pubDate>Wed, 23 Oct 2024 11:45:32 +0000</pubDate>
      <link>https://dev.to/aws-builders/heres-how-i-improved-my-public-speaking-and-made-it-an-enjoyable-part-of-my-role-28cp</link>
      <guid>https://dev.to/aws-builders/heres-how-i-improved-my-public-speaking-and-made-it-an-enjoyable-part-of-my-role-28cp</guid>
      <description>&lt;p&gt;I probably shouldn’t say this, but I used to be terrible at public speaking! At school, I even hated reading out loud in front of people. I would either refuse or dissolve into nervous giggles.&lt;/p&gt;

&lt;p&gt;The first time I delivered a talk professionally, I wanted to run away and hide. I was so anxious, I couldn’t understand what I was feeling, and this was the first time I learned about fight-or-flight! Afterwards, I got some good feedback (my lovely team said that I knew my stuff, which was the kindest thing they could have said) and I felt encouraged to keep on trying.&lt;/p&gt;

&lt;p&gt;Years later, when I began my role at A Cloud Guru, I was still very awkward and uncomfortable - I had not had much experience other than small presentations to my own team, or within projects I’d worked on. Talking on camera feels like a completely different skill!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F550eixvp4bo9htzr69f3.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F550eixvp4bo9htzr69f3.jpg" alt=" " width="800" height="515"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So what happened? - Lots and lots of practice! Over the past 3 years I’ve delivered almost 60 talks, in person or online. &lt;/p&gt;

&lt;p&gt;Here’s what I learned along the way to make public speaking an energising and enjoyable part of my work! &lt;/p&gt;

&lt;h2&gt;
  
  
  Preparation is key
&lt;/h2&gt;

&lt;p&gt;I like to be well prepared, so I practice until I feel ready. I practiced my first big talk at least 20 times - I realise that this is completely unreasonable and unnecessary! But at the time, doing that made me feel much more comfortable on the day, and I felt I was ready. These days, I practice a new talk 3 times, and that’s usually enough!&lt;/p&gt;

&lt;h2&gt;
  
  
  Ask the audience
&lt;/h2&gt;

&lt;p&gt;Always ask some questions. Understanding more about your audience will help you to deliver a talk that resonates. Making it a two-way street also changes the energy in the room, because people feel like they are part of something, not just sitting there being talked at.&lt;/p&gt;

&lt;h2&gt;
  
  
  Start strong
&lt;/h2&gt;

&lt;p&gt;The first 2 or 3 minutes are always the most nerve-wracking! So having a strong start can really set you up for success. By this I mean practicing how you’ll introduce yourself and begin the talk, and being sure to have something interesting to catch people’s attention at the beginning. &lt;/p&gt;

&lt;h2&gt;
  
  
  Stay on track
&lt;/h2&gt;

&lt;p&gt;Don’t get thrown off track by your own slides! Make sure they are well structured, this will make it much easier to take your audience on a journey and make it easy for you to deliver as well. If ideas are bouncing around all over the place, it makes it much more challenging for you to deliver a coherent message.&lt;/p&gt;

&lt;h2&gt;
  
  
  Slow and steady
&lt;/h2&gt;

&lt;p&gt;This was the hardest thing for me in the beginning, because if you’ve met me, you’ll know I’m a bundle of nervous energy. I naturally talk fast, especially when I’m excited, so I have to consciously slow down! Breathing deeply helps a lot, as well as arriving early, so that I’m not rushing about at the last minute.&lt;/p&gt;

&lt;h2&gt;
  
  
  Channel your inner superhero!
&lt;/h2&gt;

&lt;p&gt;🦸‍♀️ My bonus tip is to find your power pose. It’s such a cliche but it works. Mine is: Superhero! I find a quiet place a few minutes before my session starts and take a few moments to stand, hands on hips, shoulders back and head held high, after that, I’m ready for anything! &lt;/p&gt;

&lt;p&gt;What are your tips for successful public speaking?&lt;/p&gt;

</description>
      <category>publicspeaking</category>
      <category>careerdevelopment</category>
      <category>devrel</category>
    </item>
  </channel>
</rss>
