<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Carlos Cortez 🇵🇪 [AWS Hero]</title>
    <description>The latest articles on DEV Community by Carlos Cortez 🇵🇪 [AWS Hero] (@ccortezb).</description>
    <link>https://dev.to/ccortezb</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F493613%2F13ece14d-59af-4076-ab2d-ac6d1aaf1526.jpg</url>
      <title>DEV Community: Carlos Cortez 🇵🇪 [AWS Hero]</title>
      <link>https://dev.to/ccortezb</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/ccortezb"/>
    <language>en</language>
    <item>
      <title>Building Your First RAG System with Amazon Bedrock Agents and FAISS: A Developer's Journey</title>
      <dc:creator>Carlos Cortez 🇵🇪 [AWS Hero]</dc:creator>
      <pubDate>Wed, 03 Sep 2025 03:14:15 +0000</pubDate>
      <link>https://dev.to/aws-heroes/building-your-first-rag-system-with-amazon-bedrock-agents-and-faiss-a-developers-journey-11ad</link>
      <guid>https://dev.to/aws-heroes/building-your-first-rag-system-with-amazon-bedrock-agents-and-faiss-a-developers-journey-11ad</guid>
      <description>&lt;h1&gt;Building Your First RAG System with Amazon Bedrock Agents and FAISS: A Developer's Journey&lt;/h1&gt;

&lt;p&gt;Discover how AI is transforming the way we build intelligent applications in the cloud. Let's explore how to create your first Retrieval-Augmented Generation (RAG) system using &lt;a href="https://docs.aws.amazon.com/bedrock/latest/userguide/agents.html" rel="noopener noreferrer"&gt;Amazon Bedrock Agents&lt;/a&gt; (managed AI agents service) with &lt;a href="https://github.com/facebookresearch/faiss" rel="noopener noreferrer"&gt;FAISS&lt;/a&gt; (Facebook AI Similarity Search) as our local vector database, perfect for those first steps into GenAI territory.&lt;/p&gt;

&lt;h2&gt;Why FAISS for Your First RAG Adventure?&lt;/h2&gt;

&lt;p&gt;The idea here is simple: when you're starting with RAG (Retrieval-Augmented Generation, a technique that enriches LLMs with external data), you want to focus on the core concepts without getting lost in database configurations. FAISS runs locally, is fast, and gives you complete control over your vector operations. Plus, it's free and requires no AWS infrastructure setup for the vector storage part.&lt;/p&gt;

&lt;p&gt;In practice, this means FAISS is your local playground where you can experiment, learn, and prototype before moving to production-ready solutions like &lt;a href="https://docs.aws.amazon.com/opensearch-service/latest/developerguide/serverless.html" rel="noopener noreferrer"&gt;Amazon OpenSearch Serverless&lt;/a&gt; or &lt;a href="https://docs.aws.amazon.com/memorydb/" rel="noopener noreferrer"&gt;Amazon MemoryDB&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;The Architecture We're Building&lt;/h2&gt;

&lt;p&gt;Our setup combines the best of both worlds:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Amazon Bedrock Agents&lt;/strong&gt;: Handles the orchestration, reasoning, and LLM (Large Language Model) interactions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;FAISS&lt;/strong&gt;: Manages vector storage and similarity search locally&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Custom Action Group&lt;/strong&gt;: Bridges the agent with our FAISS operations
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;User Query → Bedrock Agent → Action Group → FAISS Search → Context → LLM Response
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
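&lt;p&gt;To make that flow concrete before any AWS wiring, here is a minimal rehearsal with plain-Python stand-ins; the &lt;code&gt;retrieve&lt;/code&gt; and &lt;code&gt;answer&lt;/code&gt; functions below are hypothetical stubs, not Bedrock or FAISS APIs:&lt;/p&gt;

```python
# Rehearsal of: query -> retrieve -> context -> answer.
# Both functions are local stand-ins, not real Bedrock or FAISS calls.

def retrieve(query, docs, k=2):
    """Toy retriever: rank docs by word overlap with the query."""
    q_words = set(query.lower().split())
    scored = [(len(q_words.intersection(d.lower().split())), d) for d in docs]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:k] if score]

def answer(query, context):
    """Stub LLM: grounds its reply in the retrieved context."""
    return f"Answer to {query!r} using: " + " | ".join(context)

docs = ["FAISS stores vectors locally", "Bedrock Agents orchestrate tools"]
context = retrieve("where are vectors stored", docs)
print(answer("where are vectors stored", context))
```

&lt;p&gt;Swapping those stubs for a real FAISS store and a Bedrock agent is exactly what the rest of this post builds.&lt;/p&gt;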



&lt;h2&gt;Setting Up Your Local FAISS Environment&lt;/h2&gt;

&lt;p&gt;Let's explore the initial configuration. What's interesting is that we can have everything running with just a few key components:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;boto3&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;faiss&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;numpy&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;sentence_transformers&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;SentenceTransformer&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;uuid&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;typing&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;List&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Dict&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Any&lt;/span&gt;

&lt;span class="c1"&gt;# Initialize the embedding model
&lt;/span&gt;&lt;span class="n"&gt;embedding_model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;SentenceTransformer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;all-MiniLM-L6-v2&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;embedding_dimension&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;384&lt;/span&gt;

&lt;span class="c1"&gt;# Create FAISS index
&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;faiss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;IndexFlatIP&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;embedding_dimension&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;  &lt;span class="c1"&gt;# Inner product; equals cosine similarity on normalized embeddings
&lt;/span&gt;&lt;span class="n"&gt;document_store&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;  &lt;span class="c1"&gt;# Store original documents with metadata
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;Creating Your Document Ingestion Pipeline&lt;/h2&gt;

&lt;p&gt;Next comes a simple but effective ingestion system. My recommendation is to start with a class that encapsulates all the functionality:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;FAISSDocumentStore&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;embedding_model_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;all-MiniLM-L6-v2&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;embedding_model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;SentenceTransformer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;embedding_model_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dimension&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;embedding_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_sentence_embedding_dimension&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;faiss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;IndexFlatIP&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dimension&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;add_documents&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;texts&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;List&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;metadata&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;List&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;Dict&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Add documents to the FAISS index&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;metadata&lt;/span&gt; &lt;span class="ow"&gt;is&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;metadata&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[{}]&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;texts&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Generate embeddings
&lt;/span&gt;        &lt;span class="n"&gt;embeddings&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;embedding_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;encode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;texts&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;normalize_embeddings&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Add to FAISS index
&lt;/span&gt;        &lt;span class="n"&gt;start_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;embeddings&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;astype&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;float32&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

        &lt;span class="c1"&gt;# Store documents with metadata
&lt;/span&gt;        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;meta&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;enumerate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;zip&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;texts&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;metadata&lt;/span&gt;&lt;span class="p"&gt;)):&lt;/span&gt;
            &lt;span class="n"&gt;doc_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;start_id&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;doc_id&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;metadata&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;meta&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;id&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;doc_id&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;List&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;Dict&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
        &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;Search for similar documents&lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
        &lt;span class="n"&gt;query_embedding&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;embedding_model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;encode&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;normalize_embeddings&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;scores&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;indices&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;query_embedding&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;astype&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;float32&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;score&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;idx&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;zip&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;scores&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;indices&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]):&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;idx&lt;/span&gt; &lt;span class="o"&gt;!=&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;  &lt;span class="c1"&gt;# Valid result
&lt;/span&gt;                &lt;span class="n"&gt;doc&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;idx&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;copy&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
                &lt;span class="n"&gt;doc&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;similarity_score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;score&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;doc&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;results&lt;/span&gt;

&lt;span class="c1"&gt;# Initialize your document store
&lt;/span&gt;&lt;span class="n"&gt;doc_store&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;FAISSDocumentStore&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
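&lt;p&gt;One pure-Python pitfall to keep in mind when pre-filling default metadata (as &lt;code&gt;add_documents&lt;/code&gt; does when none is provided): &lt;code&gt;[{}] * n&lt;/code&gt; produces n references to a single dict, so a comprehension is the safe way to get independent ones. A quick demonstration:&lt;/p&gt;

```python
shared = [{}] * 3                      # three references to ONE dict
shared[0]["tag"] = "a"                 # the change appears in every slot
independent = [{} for _ in range(3)]   # three distinct dicts
independent[0]["tag"] = "a"            # only the first changes

print(shared)       # [{'tag': 'a'}, {'tag': 'a'}, {'tag': 'a'}]
print(independent)  # [{'tag': 'a'}, {}, {}]
```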



&lt;h2&gt;Building the Bedrock Agent Action Group&lt;/h2&gt;

&lt;p&gt;Let's explore how to create the action group that will handle RAG operations. In practice, this means creating a Lambda function that acts as a bridge (note: the Lambda needs its own copy of the FAISS index, for example loaded from a bundled or persisted file, since it can't reach the in-memory store on your machine):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;lambda_handler&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    Lambda function to handle Bedrock Agent action group calls
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;

    &lt;span class="c1"&gt;# Parse the incoming request
&lt;/span&gt;    &lt;span class="n"&gt;action_group&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;actionGroup&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;''&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;api_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;apiPath&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;''&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;http_method&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;httpMethod&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;''&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;parameters&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;parameters&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;[])&lt;/span&gt;

    &lt;span class="c1"&gt;# Extract query parameter
&lt;/span&gt;    &lt;span class="n"&gt;query&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;param&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;parameters&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;param&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;name&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;query&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;query&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;param&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;value&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="k"&gt;break&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;messageVersion&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;1.0&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;response&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;actionGroup&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;action_group&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;apiPath&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;api_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;httpMethod&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;http_method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;httpStatusCode&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;400&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;responseBody&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;application/json&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;body&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;error&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Query parameter is required&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;
                    &lt;span class="p"&gt;}&lt;/span&gt;
                &lt;span class="p"&gt;}&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="c1"&gt;# Perform FAISS search
&lt;/span&gt;        &lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;doc_store&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Format results for the agent
&lt;/span&gt;        &lt;span class="n"&gt;context_documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;context_documents&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;content&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;similarity_score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;metadata&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;metadata&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{})&lt;/span&gt;
            &lt;span class="p"&gt;})&lt;/span&gt;

        &lt;span class="n"&gt;response_body&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;query&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;documents&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;context_documents&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;total_results&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;context_documents&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;

        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;messageVersion&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;1.0&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;response&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;actionGroup&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;action_group&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;apiPath&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;api_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;httpMethod&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;http_method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;httpStatusCode&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;responseBody&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;application/json&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;body&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response_body&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                    &lt;span class="p"&gt;}&lt;/span&gt;
                &lt;span class="p"&gt;}&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;messageVersion&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;1.0&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;response&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;actionGroup&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;action_group&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;apiPath&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;api_path&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;httpMethod&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;http_method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;httpStatusCode&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;responseBody&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;application/json&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                        &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;body&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;error&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)})&lt;/span&gt;
                    &lt;span class="p"&gt;}&lt;/span&gt;
                &lt;span class="p"&gt;}&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Configuring Your Bedrock Agent
&lt;/h2&gt;

&lt;p&gt;We can create the entire setup with the AWS CLI. I recommend this approach because it is scriptable and easy to reproduce:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Create the agent&lt;/span&gt;
aws bedrock-agent create-agent &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--agent-name&lt;/span&gt; &lt;span class="s2"&gt;"faiss-rag-agent"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--description&lt;/span&gt; &lt;span class="s2"&gt;"RAG agent using FAISS for vector search"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--foundation-model&lt;/span&gt; &lt;span class="s2"&gt;"anthropic.claude-3-sonnet-20240229-v1:0"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--instruction&lt;/span&gt; &lt;span class="s2"&gt;"You are a helpful assistant that can search through documents to answer questions. When a user asks a question, use the search_documents function to find relevant information, then provide a comprehensive answer based on the retrieved context."&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--region&lt;/span&gt; us-east-1

&lt;span class="c"&gt;# Create action group (after deploying your Lambda function)&lt;/span&gt;
aws bedrock-agent create-agent-action-group &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--agent-id&lt;/span&gt; &lt;span class="s2"&gt;"YOUR_AGENT_ID"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--agent-version&lt;/span&gt; &lt;span class="s2"&gt;"DRAFT"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--action-group-name&lt;/span&gt; &lt;span class="s2"&gt;"document-search"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--description&lt;/span&gt; &lt;span class="s2"&gt;"Search documents using FAISS vector database"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--action-group-executor&lt;/span&gt; &lt;span class="nv"&gt;lambda&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:lambda:us-east-1:YOUR_ACCOUNT:function:faiss-rag-function"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--api-schema&lt;/span&gt; &lt;span class="s1"&gt;'{
        "openapi": "3.0.0",
        "info": {
            "title": "Document Search API",
            "version": "1.0.0"
        },
        "paths": {
            "/search": {
                "post": {
                    "description": "Search for relevant documents",
                    "parameters": [
                        {
                            "name": "query",
                            "in": "query",
                            "required": true,
                            "schema": {
                                "type": "string"
                            },
                            "description": "The search query"
                        }
                    ],
                    "responses": {
                        "200": {
                            "description": "Search results",
                            "content": {
                                "application/json": {
                                    "schema": {"type": "object"}
                                }
                            }
                        }
                    }
                }
            }
        }
    }'&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--region&lt;/span&gt; us-east-1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
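&lt;p&gt;Two steps are easy to miss here: the agent must be prepared (so the DRAFT version picks up the new action group), and it needs an alias, because the runtime API is invoked through an alias ID. A minimal sketch using the same placeholders as above (the alias name "dev" is just an example):&lt;br&gt;
&lt;/p&gt;

```shell
# Build the DRAFT version so it includes the new action group
aws bedrock-agent prepare-agent \
    --agent-id "YOUR_AGENT_ID" \
    --region us-east-1

# Create an alias to invoke; note the agentAliasId in the output
aws bedrock-agent create-agent-alias \
    --agent-id "YOUR_AGENT_ID" \
    --agent-alias-name "dev" \
    --region us-east-1
```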



&lt;h2&gt;
  
  
  Loading Your First Documents
&lt;/h2&gt;

&lt;p&gt;Let's explore how to populate our FAISS index with sample documents:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Sample documents about AWS services
&lt;/span&gt;&lt;span class="n"&gt;sample_docs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Amazon S3 is a highly scalable object storage service that offers industry-leading durability, availability, and performance.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;AWS Lambda lets you run code without provisioning or managing servers. You pay only for the compute time you consume.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Amazon EC2 provides secure, resizable compute capacity in the cloud. It&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;s designed to make web-scale cloud computing easier.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Amazon RDS makes it easy to set up, operate, and scale a relational database in the cloud.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Amazon DynamoDB is a key-value and document database that delivers single-digit millisecond performance at any scale.&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;]&lt;/span&gt;

&lt;span class="n"&gt;metadata&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;service&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;S3&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;category&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Storage&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;service&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Lambda&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;category&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Compute&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;service&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;EC2&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;category&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Compute&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;service&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;RDS&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;category&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Database&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;service&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;DynamoDB&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;category&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Database&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;]&lt;/span&gt;

&lt;span class="c1"&gt;# Add documents to FAISS
&lt;/span&gt;&lt;span class="n"&gt;doc_store&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_documents&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sample_docs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;metadata&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Added &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sample_docs&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt; documents to FAISS index&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Testing Your RAG System
&lt;/h2&gt;

&lt;p&gt;Validate that the search works on its own before wiring the components together:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Test the search functionality
&lt;/span&gt;&lt;span class="n"&gt;test_query&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;What database services does AWS offer?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;doc_store&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;test_query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Query: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;test_query&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Results:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;enumerate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="s"&gt;. Score: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;similarity_score&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;:&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;   Text: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;text&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;   Service: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;metadata&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;service&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;N/A&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Connecting Everything with Bedrock Agents SDK
&lt;/h2&gt;

&lt;p&gt;Once the agent is prepared and aliased, we can interact with it programmatically:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;boto3&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;chat_with_rag_agent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;agent_id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;agent_alias_id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    Chat with the Bedrock agent that uses FAISS for RAG
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;bedrock_agent_runtime&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;boto3&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;client&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;bedrock-agent-runtime&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;region_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;us-east-1&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;bedrock_agent_runtime&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;invoke_agent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;agentId&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;agent_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;agentAliasId&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;agent_alias_id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;sessionId&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;uuid&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;uuid4&lt;/span&gt;&lt;span class="p"&gt;()),&lt;/span&gt;
            &lt;span class="n"&gt;inputText&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="c1"&gt;# Process the streaming response
&lt;/span&gt;        &lt;span class="n"&gt;full_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;""&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;completion&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;chunk&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="n"&gt;chunk&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;chunk&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
                &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;bytes&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                    &lt;span class="n"&gt;full_response&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;bytes&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;decode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;utf-8&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;full_response&lt;/span&gt;

    &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Error invoking agent: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

&lt;span class="c1"&gt;# Example usage
&lt;/span&gt;&lt;span class="n"&gt;agent_response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;chat_with_rag_agent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;What are the benefits of using AWS Lambda?&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;agent_id&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;YOUR_AGENT_ID&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;agent_alias_id&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;YOUR_ALIAS_ID&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Agent Response:&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;agent_response&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Optimizing Your FAISS Performance
&lt;/h2&gt;

&lt;p&gt;When you scale up, consider these optimizations:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# For larger datasets, use IndexIVFFlat for faster search
&lt;/span&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;create_optimized_index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dimension&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;nlist&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    Create an optimized FAISS index for larger datasets
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="n"&gt;quantizer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;faiss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;IndexFlatIP&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dimension&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;faiss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;IndexIVFFlat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;quantizer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;dimension&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;nlist&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;

&lt;span class="c1"&gt;# Add GPU support if available
&lt;/span&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;create_gpu_index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dimension&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;int&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="sh"&gt;"""&lt;/span&gt;&lt;span class="s"&gt;
    Create GPU-accelerated FAISS index
    &lt;/span&gt;&lt;span class="sh"&gt;"""&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;faiss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get_num_gpus&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;faiss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;StandardGpuResources&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;faiss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;IndexFlatIP&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dimension&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;gpu_index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;faiss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;index_cpu_to_gpu&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;res&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;gpu_index&lt;/span&gt;
    &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;faiss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;IndexFlatIP&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dimension&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Common Pitfalls and How to Avoid Them
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;1. Embedding Model Consistency&lt;/strong&gt;&lt;br&gt;
Always use the same embedding model for indexing and querying. Different models produce vectors in incompatible spaces, so mixing them yields meaningless similarity scores.&lt;/p&gt;
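&lt;p&gt;One cheap guard is to record which model built the index and check it at query time. This is a sketch with hypothetical helper names (&lt;code&gt;save_index_manifest&lt;/code&gt; and &lt;code&gt;check_query_model&lt;/code&gt; are not part of the store class built earlier):&lt;br&gt;
&lt;/p&gt;

```python
import json

def save_index_manifest(path: str, model_name: str, dimension: int) -> None:
    """Record which embedding model (and dimension) built the index."""
    with open(path, "w") as f:
        json.dump({"embedding_model": model_name, "dimension": dimension}, f)

def check_query_model(path: str, model_name: str) -> None:
    """Refuse to query an index built with a different embedding model."""
    with open(path) as f:
        manifest = json.load(f)
    if manifest["embedding_model"] != model_name:
        raise ValueError(
            f"Index was built with {manifest['embedding_model']!r}, "
            f"but queries use {model_name!r}"
        )
```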

&lt;p&gt;&lt;strong&gt;2. Normalization Matters&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Always normalize embeddings for cosine similarity
&lt;/span&gt;&lt;span class="n"&gt;embeddings&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;encode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;texts&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;normalize_embeddings&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
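&lt;p&gt;Why this matters: &lt;code&gt;IndexFlatIP&lt;/code&gt; ranks by inner product, and inner product equals cosine similarity only when the vectors are L2-normalized. A quick check with plain NumPy:&lt;br&gt;
&lt;/p&gt;

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([1.0, 0.0])

# Cosine similarity computed directly
cos = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Inner product of the L2-normalized vectors gives the same number,
# which is why IndexFlatIP + normalize_embeddings=True does cosine search
a_n = a / np.linalg.norm(a)
b_n = b / np.linalg.norm(b)
ip = float(a_n @ b_n)

print(cos, ip)  # 0.6 0.6
```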



&lt;p&gt;&lt;strong&gt;3. Batch Processing&lt;/strong&gt;&lt;br&gt;
For large document sets, process in batches to avoid memory issues:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;add_documents_in_batches&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;doc_store&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;texts&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;batch_size&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;texts&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;batch_size&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;batch&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;texts&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="o"&gt;+&lt;/span&gt;&lt;span class="n"&gt;batch_size&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
        &lt;span class="n"&gt;doc_store&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;add_documents&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;batch&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Processed batch &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="o"&gt;//&lt;/span&gt;&lt;span class="n"&gt;batch_size&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
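To sanity-check the slicing logic above without a real vector store, here's a quick run against a stub doc_store (`StubDocStore` is hypothetical, just for illustration):

```python
# Stub document store that records each add_documents() call,
# so we can observe the batching behavior without a real vector DB.
class StubDocStore:
    def __init__(self):
        self.batches = []

    def add_documents(self, docs):
        self.batches.append(list(docs))

def add_documents_in_batches(doc_store, texts, batch_size=100):
    # Same helper as above: walk the list in fixed-size slices.
    for i in range(0, len(texts), batch_size):
        batch = texts[i:i + batch_size]
        doc_store.add_documents(batch)
        print(f"Processed batch {i // batch_size + 1}")

store = StubDocStore()
add_documents_in_batches(store, [f"doc-{n}" for n in range(250)])

# 250 texts with batch_size=100 gives batches of 100, 100, 50
print([len(b) for b in store.batches])
```

Note that the final batch is simply whatever remains, so no padding logic is needed.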



&lt;h2&gt;
  
  
  Next Steps and Production Considerations
&lt;/h2&gt;

&lt;p&gt;This setup is perfect for learning and prototyping, but for production my recommendations are:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Persistence&lt;/strong&gt;: Save your FAISS index to disk
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;faiss&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;write_index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;my_index.faiss&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
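Saving the index covers the vectors, but you also need to persist the mapping from vector ids back to the original text. A minimal sketch using the standard library (`doc_map` here is a hypothetical id-to-text dict; in a real pipeline you would build it while adding documents to the index):

```python
import json
import tempfile
from pathlib import Path

# Hypothetical mapping from FAISS vector ids to the original document text.
doc_map = {0: "Bedrock Agents overview", 1: "FAISS quickstart notes"}

path = Path(tempfile.gettempdir()) / "doc_map.json"

def save_doc_map(doc_map, path):
    path.write_text(json.dumps(doc_map))

def load_doc_map(path):
    # JSON object keys are always strings, so convert ids back to int.
    return {int(k): v for k, v in json.loads(path.read_text()).items()}

save_doc_map(doc_map, path)
restored = load_doc_map(path)
print(restored == doc_map)  # True
```

Reload both the index (with `faiss.read_index`) and this mapping at startup so search results can be resolved back to documents.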



&lt;ol start="2"&gt;
&lt;li&gt;
&lt;strong&gt;Scalability&lt;/strong&gt;: Consider Amazon OpenSearch Serverless for production workloads&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Monitoring&lt;/strong&gt;: Add logging and metrics to your Lambda function&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security&lt;/strong&gt;: Implement proper IAM roles and VPC configurations&lt;/li&gt;
&lt;/ol&gt;
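For the monitoring point, a common lightweight pattern is one structured JSON log line per invocation plus a simple latency measurement; a sketch of a handler (the event fields and metric name are assumptions, not part of this article's setup):

```python
import json
import logging
import time

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    start = time.perf_counter()
    query = event.get("query", "")

    # ... retrieval + generation would happen here ...
    answer = f"stub answer for: {query}"

    elapsed_ms = (time.perf_counter() - start) * 1000
    # One JSON log line per invocation is easy to query later,
    # e.g. with CloudWatch Logs Insights.
    logger.info(json.dumps({
        "metric": "rag_latency_ms",   # hypothetical metric name
        "value": round(elapsed_ms, 2),
        "query_length": len(query),
    }))
    return {"statusCode": 200, "body": json.dumps({"answer": answer})}

result = lambda_handler({"query": "What is FAISS?"}, None)
print(result["statusCode"])
```

Emitting metrics as structured logs keeps the handler dependency-free; a metrics filter can turn them into real CloudWatch metrics without code changes.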


&lt;h2&gt;
  
  
  Quick Actions for You
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;[ ] Set up your local Python environment with the required dependencies&lt;/li&gt;
&lt;li&gt;[ ] Create a simple FAISS index with your own documents&lt;/li&gt;
&lt;li&gt;[ ] Deploy the Lambda function for the action group&lt;/li&gt;
&lt;li&gt;[ ] Configure your Bedrock Agent with the action group&lt;/li&gt;
&lt;li&gt;[ ] Test the end-to-end RAG pipeline&lt;/li&gt;
&lt;li&gt;[ ] Experiment with different embedding models&lt;/li&gt;
&lt;li&gt;[ ] Try optimizing FAISS for your specific use case&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  My Final Recommendation
&lt;/h2&gt;

&lt;p&gt;Start small with this local FAISS approach to understand RAG fundamentals. Once you're comfortable, explore Amazon Bedrock Knowledge Bases for a fully managed solution, or Amazon OpenSearch Serverless for more advanced vector search capabilities.&lt;/p&gt;

&lt;p&gt;The beauty of this approach is that you can iterate quickly, understand every component, and gradually move to more sophisticated setups as your needs grow.&lt;/p&gt;




&lt;p&gt;Thanks for reading, now it's your turn to try it out! Let's connect and share our experiences building with GenAI:&lt;/p&gt;

&lt;p&gt;🔗 &lt;strong&gt;LinkedIn&lt;/strong&gt;: &lt;a href="https://www.linkedin.com/in/carloscortezcloud" rel="noopener noreferrer"&gt;https://www.linkedin.com/in/carloscortezcloud&lt;/a&gt;&lt;br&gt;&lt;br&gt;
🐦 &lt;strong&gt;X&lt;/strong&gt;: &lt;a href="https://x.com/ccortezb" rel="noopener noreferrer"&gt;https://x.com/ccortezb&lt;/a&gt;&lt;br&gt;&lt;br&gt;
💻 &lt;strong&gt;GitHub&lt;/strong&gt;: &lt;a href="https://github.com/ccortezb" rel="noopener noreferrer"&gt;https://github.com/ccortezb&lt;/a&gt;&lt;br&gt;&lt;br&gt;
📝 &lt;strong&gt;Dev.to&lt;/strong&gt;: &lt;a href="https://dev.to/ccortezb"&gt;https://dev.to/ccortezb&lt;/a&gt;&lt;br&gt;&lt;br&gt;
🏆 &lt;strong&gt;AWS Heroes&lt;/strong&gt;: &lt;a href="https://builder.aws.com/community/@breakinthecloud" rel="noopener noreferrer"&gt;https://builder.aws.com/community/@breakinthecloud&lt;/a&gt;&lt;br&gt;&lt;br&gt;
📖 &lt;strong&gt;Medium&lt;/strong&gt;: &lt;a href="https://ccortezb.medium.com" rel="noopener noreferrer"&gt;https://ccortezb.medium.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Stay curious, keep experimenting — and I'll catch you in the next one!&lt;/p&gt;

</description>
      <category>programming</category>
      <category>ai</category>
      <category>python</category>
      <category>aws</category>
    </item>
    <item>
      <title>Starting with NLP on AWS from scratch — NLP Series part 1 | by Carlos Cortez | Breaking the Cloud</title>
      <dc:creator>Carlos Cortez 🇵🇪 [AWS Hero]</dc:creator>
      <pubDate>Tue, 13 Apr 2021 01:17:56 +0000</pubDate>
      <link>https://dev.to/ccortezb/starting-with-nlp-on-aws-from-scratch-nlp-series-part-1-by-carlos-cortez-breaking-the-cloud-4gha</link>
      <guid>https://dev.to/ccortezb/starting-with-nlp-on-aws-from-scratch-nlp-series-part-1-by-carlos-cortez-breaking-the-cloud-4gha</guid>
      <description>&lt;h3&gt;
  
  
  Listen to this article in our new format: AudioBlog
&lt;/h3&gt;

&lt;p&gt;&lt;iframe width="100%" height="166" src="https://w.soundcloud.com/player/?url=https://soundcloud.com/karlosgnr_22/audio-blog-starting-with-nlp-on-aws-from-scratch-nlp-series-part-1-english-1&amp;amp;auto_play=false&amp;amp;color=%23000000&amp;amp;hide_related=false&amp;amp;show_comments=true&amp;amp;show_user=true&amp;amp;show_reposts=false&amp;amp;show_teaser=true"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;It would be great to start segmenting our texts like the cover image, which shows an open-source tool called Coreference created by Hugging Face, right? Let's take it step by step...&lt;/p&gt;

&lt;p&gt;In case you are interested in hearing about Hugging Face and SageMaker, I spoke in depth a few days ago about this topic and its recent alliance with AWS here:&lt;/p&gt;

&lt;h2&gt;
  
  
  The learning curve
&lt;/h2&gt;

&lt;p&gt;The learning curve is long and can get complex if we let it. That's why there are workflows and frameworks that help us know exactly where we are at any given moment.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdtldro6fg8z8jwix1ylq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdtldro6fg8z8jwix1ylq.png" width="800" height="113"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;em&gt;Natural Language Processing (NLP) flow&lt;/em&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Many times we skip key steps before really understanding each of these procedures; for example, data ingestion, web scraping, text analytics, EDA (exploratory data analysis), and much more.&lt;/p&gt;

&lt;h2&gt;
  
  
  Remember CRISP-DM?
&lt;/h2&gt;

&lt;p&gt;It is a framework for data mining, the field now better known as Machine Learning, Deep Learning, Data Science, and so on.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5hxb65z3idhp8e9wsgap.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5hxb65z3idhp8e9wsgap.png" width="447" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Essentials to get started on AWS
&lt;/h2&gt;

&lt;p&gt;Wherever we go, let's keep this pair in our pocket:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Python 3.x + boto3&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Boto3&lt;/strong&gt; is the AWS Python library that gives us access to run and manage any resource in the cloud, as simple as &lt;code&gt;import boto3&lt;/code&gt;, once you have your private keys configured or your roles correctly created.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pre-trained models
&lt;/h2&gt;

&lt;p&gt;Pre-trained models help us shorten research, training, and analysis time, so we don't reinvent what already exists; and for this we can use cloud services.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon Comprehend&lt;/strong&gt; (applied NLP) is a SaaS-type service, which means it is ready to consume as an API through the AWS boto3 library so you can start extracting insights from your data. It contains models already trained by Amazon Web Services that we can reuse to bootstrap our own; we will start by getting to know this great service and building interesting applications with other AWS services.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Hugging Face&lt;/strong&gt; (advanced): later we will play with transformers, tokenizers, and more using these Hugging Face deep learning models.&lt;/p&gt;

&lt;h2&gt;
  
  
  Build your own models:
&lt;/h2&gt;

&lt;p&gt;Once we have a good understanding of the managed NLP services in the cloud, we can move to the next level with Amazon SageMaker, with your own containers (which we will also learn), or locally.&lt;/p&gt;

&lt;h2&gt;
  
  
  NLP Open Source Libraries
&lt;/h2&gt;

&lt;p&gt;I'll mention these for now; we will surely get to know more than two from this list in the coming months:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;NLTK&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Spark NLP&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;CoreNLP&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;SpaCy&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Pytorch&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Tensorflow&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;etc..&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Tools and IDEs
&lt;/h2&gt;

&lt;p&gt;From now on we will get to know and learn, step by step, what it is like to work in the cloud with an IDE for data science, and explore the AI services on AWS.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Femoxl6wa85am4cvyi3jx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Femoxl6wa85am4cvyi3jx.png" width="800" height="322"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I recommend taking a look at the following, some paid and some free.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Sagemaker Studio / Sagemaker Notebooks&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Jupyter Notebooks / Jupyter Lab&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Databricks&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Google Colab&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;among others ...&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;To start for free, I suggest &lt;strong&gt;Databricks and Colab&lt;/strong&gt;: the former has a community platform that lets you create free clusters (they are deleted after limited periods of inactivity), and Google Colab can run directly from your Google Drive at no cost.&lt;/p&gt;

&lt;h2&gt;
  
  
  Deployment of my models in the cloud
&lt;/h2&gt;

&lt;p&gt;It is not yet time to think about deployment, but yes, we will find many ways to do it, from the oldest to the most modern.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Traditional servers (VMs, Bare Metal)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Containers (such as AWS ECS, EKS, Fargate)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Serverless (Functions as a Service)&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa77wjl1bwcwzcu4cbx7d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa77wjl1bwcwzcu4cbx7d.png" width="800" height="600"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is a very good way to start, combining serverless and SageMaker.&lt;/p&gt;

&lt;h2&gt;
  
  
  Think Serverless First
&lt;/h2&gt;

&lt;p&gt;Now we can start working without having to run servers using AWS Lambda, AWS Step Functions, Sagemaker Pipelines, AWS CodePipeline, which are services already deployed in the cloud that we can use to speed up our development and start-up.&lt;/p&gt;

&lt;p&gt;In the following posts we will learn how to deploy with AWS Lambda and Amazon Comprehend.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuzmg2me9iavpfjah4vby.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuzmg2me9iavpfjah4vby.png" width="800" height="367"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Amazon Comprehend and its superpowers
&lt;/h2&gt;

&lt;p&gt;Can you imagine writing just a few lines of code to create a word detector and key phrase extractor in minutes, instead of spending many hours on research? It is possible if you start taking advantage of Amazon Comprehend, a little Python, and a couple of drops of serverless.&lt;/p&gt;

&lt;p&gt;Key phrase extraction:&lt;/p&gt;

&lt;p&gt;Here we are using Python 3.6 with boto3, the AWS library:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;response = client.detect_key_phrases(
    Text='string',
    LanguageCode='en'|'es'
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
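The call returns a dict of detected phrases with confidence scores; a sketch of filtering them with plain Python (the sample response below is illustrative, mirroring the shape of the Comprehend `detect_key_phrases` output, with made-up scores and offsets):

```python
# Illustrative response, shaped like Amazon Comprehend's
# detect_key_phrases output (scores and offsets are made up).
response = {
    "KeyPhrases": [
        {"Score": 0.99, "Text": "natural language processing",
         "BeginOffset": 0, "EndOffset": 27},
        {"Score": 0.42, "Text": "a few",
         "BeginOffset": 31, "EndOffset": 36},
    ]
}

def confident_phrases(response, min_score=0.9):
    # Keep only phrases the service is confident about.
    return [kp["Text"] for kp in response["KeyPhrases"] if kp["Score"] >= min_score]

print(confident_phrases(response))  # ['natural language processing']
```

Filtering by score is usually worthwhile, since low-confidence phrases tend to be noise.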



&lt;p&gt;Entity extraction, sentiment, language, syntax, topics, and document classification are some of the additional things we are going to do with Comprehend, along with a medicine-specific version called &lt;strong&gt;Amazon Comprehend Medical&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  What's next?
&lt;/h2&gt;

&lt;p&gt;Data ingestion and text analytics are important foundations for understanding NLP, so in the next post we will build our first data ingestion from a website, collecting data, texts, and titles into a data frame to play with.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Next post: (on air April 2)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;(Original post: &lt;a href="https://ccortezb.medium.com/empezando-con-nlp-en-aws-desde-cero-nlp-series-parte-1-por-carlos-cortez-breaking-the-cloud-13da2adc0151" rel="noopener noreferrer"&gt;Medium&lt;/a&gt; and &lt;br&gt;
&lt;a href="https://dev.cortez.cloud/posts/empezando-con-nlp-en-aws-desde-cero-nlp-series-1-3nak/" rel="noopener noreferrer"&gt;Dev.cortez.cloud&lt;/a&gt; )&lt;/p&gt;

&lt;p&gt;If you liked this post, give it a like, share and comment.&lt;/p&gt;

&lt;h2&gt;
  
  
  Learn by breaking things.
&lt;/h2&gt;

&lt;p&gt;Subscribe to my channel, Breaking the Cloud, and stay up to date with AWS at &lt;a href="https://cortez.cloud/" rel="noopener noreferrer"&gt;https://cortez.cloud&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;⭐Subscribe to my channel : &lt;a href="http://bit.ly/aldiaconaws" rel="noopener noreferrer"&gt;http://bit.ly/aldiaconaws&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;videos, AWS news, analysis, reviews, demos, workshops&lt;/p&gt;

&lt;h2&gt;
  
  
  🔥🔥 Follow me on social media 🔥🔥
&lt;/h2&gt;

&lt;p&gt;follow me:&lt;/p&gt;

&lt;p&gt;🦜 My Twitter: &lt;a href="https://twitter.com/ccortezb" rel="noopener noreferrer"&gt;https://twitter.com/ccortezb&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;📺 Youtube Channel: &lt;a href="http://bit.ly/aldiaconaws" rel="noopener noreferrer"&gt;http://bit.ly/aldiaconaws&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;📺 AWSUGPerú: &lt;a href="https://www.youtube.com/awsusergroupperuoficial" rel="noopener noreferrer"&gt;https://www.youtube.com/awsusergroupperuoficial&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;📟 My Facebook: &lt;a href="https://www.facebook.com/ccortezb/" rel="noopener noreferrer"&gt;https://www.facebook.com/ccortezb/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🤳 My Instagram: ccortezbazan&lt;/p&gt;

&lt;p&gt;📜My AWS Courses: &lt;a href="https://cennticloud.thinkific.com/" rel="noopener noreferrer"&gt;https://cennticloud.thinkific.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🕮 My blog — &lt;a href="https://cortez.cloud/" rel="noopener noreferrer"&gt;cortez.cloud&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Thank you very much, and I hope we meet again&lt;/p&gt;

&lt;h2&gt;
  
  
  🔥🔥 About me 🔥🔥
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.cortez.cloud/whoami" rel="noopener noreferrer"&gt;Cortez.Cloud/whoami&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let me present my little personal website &lt;a href="https://www.cortez.cloud/" rel="noopener noreferrer"&gt;https://www.Cortez.Cloud&lt;/a&gt;, called “Breaking the Cloud”.&lt;/p&gt;

&lt;p&gt;I will continue to create AWS content each week on AI/ML, Serverless, Security, and how to break the rules!&lt;/p&gt;

&lt;p&gt;I'll also share my upcoming initiatives, workshops, courses, free videos, awsugperu, and more.&lt;/p&gt;

</description>
      <category>machinelearning</category>
      <category>aws</category>
      <category>beginners</category>
      <category>nlp</category>
    </item>
    <item>
      <title>Starting with NLP on AWS from scratch - NLP Series 1</title>
      <dc:creator>Carlos Cortez 🇵🇪 [AWS Hero]</dc:creator>
      <pubDate>Wed, 31 Mar 2021 04:15:22 +0000</pubDate>
      <link>https://dev.to/aws-builders/empezando-con-nlp-en-aws-desde-cero-nlp-series-1-3nak</link>
      <guid>https://dev.to/aws-builders/empezando-con-nlp-en-aws-desde-cero-nlp-series-1-3nak</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffcy5bb9uw68qc0xbyrnr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffcy5bb9uw68qc0xbyrnr.png" alt="Alt Text" width="800" height="263"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It would be great to start segmenting our texts like the cover image, which shows an open-source tool called Coreference created by Hugging Face, right? Let's take it step by step...&lt;/p&gt;

&lt;p&gt;In case you're interested in hearing about Hugging Face and SageMaker, a few days ago I spoke in depth about this topic and its recent alliance with AWS here:&lt;/p&gt;

&lt;p&gt;Al día con AWS - Ep 4: The week of ML with some ROSAs on AWS (March 21 to 29)&lt;br&gt;
&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/4yJFcv1sASA"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  The learning curve
&lt;/h2&gt;

&lt;p&gt;The learning curve is long and can get complex if we let it. That's why there are workflows and frameworks that help us know exactly where we are at any given moment.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9utl41wbqw9zitcyzpl8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9utl41wbqw9zitcyzpl8.png" alt="Alt Text" width="800" height="113"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Natural Language Processing (NLP) flow&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Many times we skip key steps before really understanding each of these procedures; for example, data ingestion, web scraping, text analytics, EDA (exploratory data analysis), and much more.&lt;/p&gt;
&lt;h2&gt;
  
  
  Remember CRISP-DM?
&lt;/h2&gt;

&lt;p&gt;It is a framework for data mining, the field now better known as Machine Learning, Deep Learning, Data Science, and so on.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj48y78jfk953x9zkp8po.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj48y78jfk953x9zkp8po.png" alt="Alt Text" width="447" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The essentials to get started on AWS&lt;br&gt;
Wherever we go, let's keep this pair in our pocket:&lt;/p&gt;
&lt;h2&gt;
  
  
  Python 3.x + boto3
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Boto3&lt;/strong&gt; is the AWS Python library that gives us access to run and manage any resource in the cloud, as simple as &lt;code&gt;import boto3&lt;/code&gt;, once you have your private keys configured or your roles correctly created.&lt;/p&gt;
&lt;h2&gt;
  
  
  Pre-trained models
&lt;/h2&gt;

&lt;p&gt;Pre-trained models help us shorten research, training, and analysis time, so we don't reinvent what already exists.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Amazon Comprehend&lt;/strong&gt; (applied NLP) is a SaaS-type service, which means it is ready to consume as an API through the AWS boto3 library so you can start extracting insights from your data. It contains models already trained by Amazon Web Services that we can reuse to bootstrap our own; we will start by getting to know this great service and building interesting applications with other AWS services.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Hugging Face&lt;/strong&gt; (advanced): later we will play with transformers, tokenizers, and more using these Hugging Face deep learning models.&lt;/p&gt;
&lt;h2&gt;
  
  
  Build your own models:
&lt;/h2&gt;

&lt;p&gt;Once we understand the managed NLP cloud services very well, we can move to the next level, whether with &lt;strong&gt;Amazon Sagemaker&lt;/strong&gt;&lt;br&gt;
or with your own &lt;strong&gt;Containers&lt;/strong&gt;, which we will also learn, or locally&lt;/p&gt;
&lt;h2&gt;
  
  
  NLP Open Source Libraries
&lt;/h2&gt;

&lt;p&gt;I'll mention these for now; we will surely get to know more than two from this list in the coming months:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;NLTK&lt;/li&gt;
&lt;li&gt;Spark NLP&lt;/li&gt;
&lt;li&gt;CoreNLP&lt;/li&gt;
&lt;li&gt;SpaCy&lt;/li&gt;
&lt;li&gt;Pytorch&lt;/li&gt;
&lt;li&gt;Tensorflow&lt;/li&gt;
&lt;li&gt;etc.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2&gt;
  
  
  Tools and IDEs
&lt;/h2&gt;

&lt;p&gt;From now on we will get to know and learn, step by step, what it is like to work in the cloud with an IDE for data science, and explore the AI services on AWS.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5jfa8nsum3x9noohpl92.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5jfa8nsum3x9noohpl92.png" alt="Alt Text" width="800" height="321"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I recommend taking a look at the following, some paid and some free.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Sagemaker Studio / Sagemaker Notebooks&lt;/li&gt;
&lt;li&gt;Jupyter Notebooks / Jupyter Lab&lt;/li&gt;
&lt;li&gt;Databricks&lt;/li&gt;
&lt;li&gt;Google Colab&lt;/li&gt;
&lt;li&gt;among others...&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To start for free, I suggest Databricks and Colab: the former has a community platform that lets you create free clusters (they are deleted after limited periods of inactivity), and Google Colab can run directly from your Google Drive at no cost.&lt;/p&gt;
&lt;h2&gt;
  
  
  Deploying my models in the cloud
&lt;/h2&gt;

&lt;p&gt;It is not yet time to think about deployment, but yes, we will find many ways to do it, from the oldest to the most modern.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Traditional servers (VMs, Bare Metal)&lt;/li&gt;
&lt;li&gt;Containers (such as AWS ECS, EKS, Fargate)&lt;/li&gt;
&lt;li&gt;Serverless (Functions as a Service)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbzvxsyop79xckj137xbf.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbzvxsyop79xckj137xbf.jpeg" alt="Alt Text" width="800" height="600"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is a very good way to start, combining serverless and Sagemaker.&lt;/p&gt;
&lt;h2&gt;
  
  
  Think Serverless First
&lt;/h2&gt;

&lt;p&gt;Now we can start working without having to run servers, using AWS Lambda, AWS Step Functions, Sagemaker Pipelines, and AWS CodePipeline: services already deployed in the cloud that we can use to speed up our development and launch.&lt;/p&gt;

&lt;p&gt;In the following posts we will learn how to deploy with AWS Lambda and Amazon Comprehend.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frluc4duzbh0904ki5qno.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frluc4duzbh0904ki5qno.png" alt="Alt Text" width="800" height="367"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Amazon Comprehend and its superpowers
&lt;/h2&gt;

&lt;p&gt;Can you imagine writing just a few lines of code to create a word detector and key phrase extractor in minutes, instead of spending many hours on research? It is possible if you start taking advantage of Amazon Comprehend, a little Python, and a couple of drops of serverless.&lt;/p&gt;
&lt;h3&gt;
  
  
  Key phrase extraction:
&lt;/h3&gt;

&lt;p&gt;Here we are using Python 3.6 with boto3, the AWS library:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;response = client.detect_key_phrases(
    Text='string',
    LanguageCode='en'|'es'
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Entity extraction, sentiment, language, syntax, topics, and document classification are some of the additional things we are going to do with Comprehend, along with a medicine-specific version called Amazon Comprehend Medical.&lt;/p&gt;

&lt;h2&gt;
  
  
  What's next?
&lt;/h2&gt;

&lt;p&gt;Data ingestion and text analytics are important foundations for understanding NLP, so in the next post we will build our first data ingestion from a website, collecting data, texts, and titles into a data frame to play with.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Next post: (on air April 2)&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;em&gt;If you liked this post, give it a like, share, and comment.&lt;/em&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Learn by breaking things
&lt;/h1&gt;

&lt;p&gt;Subscribe to my channel, Breaking the Cloud and Al día con AWS, at &lt;a href="https://cortez.cloud" rel="noopener noreferrer"&gt;https://cortez.cloud&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;⭐Subscribe to my channel: &lt;a href="http://bit.ly/aldiaconaws" rel="noopener noreferrer"&gt;http://bit.ly/aldiaconaws&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;videos, AWS news, analysis, demos, workshops&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  🔥🔥 Follow me on social media 🔥🔥
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;follow &amp;lt;- me()&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;🦜 My Twitter: &lt;a href="https://twitter.com/ccortezb" rel="noopener noreferrer"&gt;https://twitter.com/ccortezb&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;📺 Youtube Channel: &lt;a href="http://bit.ly/aldiaconaws" rel="noopener noreferrer"&gt;http://bit.ly/aldiaconaws&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;📺 AWSUGPerú: &lt;a href="https://www.youtube.com/awsusergroupperuoficial" rel="noopener noreferrer"&gt;https://www.youtube.com/awsusergroupperuoficial&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;📟 My Facebook: &lt;a href="https://www.facebook.com/ccortezb/" rel="noopener noreferrer"&gt;https://www.facebook.com/ccortezb/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🤳 My Instagram: ccortezbazan&lt;/p&gt;

&lt;p&gt;📜 My AWS courses: &lt;a href="https://cennticloud.thinkific.com" rel="noopener noreferrer"&gt;https://cennticloud.thinkific.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🕮 My blog: &lt;a href="https://cortez.cloud" rel="noopener noreferrer"&gt;cortez.cloud&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Thank you very much, I hope we meet again&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  🔥🔥 About me 🔥🔥
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://www.cortez.cloud/whoami" rel="noopener noreferrer"&gt;Cortez.Cloud/whoami&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Let me present my little personal website &lt;a href="https://www.Cortez.Cloud" rel="noopener noreferrer"&gt;https://www.Cortez.Cloud&lt;/a&gt;, called “Breaking the Cloud”.&lt;/p&gt;

&lt;p&gt;I will continue creating AWS content every week on AI/ML, Serverless, Security, and how to break the rules!&lt;/p&gt;

&lt;p&gt;I'll also share my upcoming initiatives, workshops, courses, free videos, awsugperu, and more.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>aws</category>
      <category>nlp</category>
      <category>machinelearning</category>
    </item>
    <item>
      <title>Working with Amazon DocumentDB from my local machine</title>
      <dc:creator>Carlos Cortez 🇵🇪 [AWS Hero]</dc:creator>
      <pubDate>Tue, 30 Mar 2021 17:46:02 +0000</pubDate>
      <link>https://dev.to/aws-builders/trabajando-con-amazon-documentdb-desde-mi-local-164m</link>
      <guid>https://dev.to/aws-builders/trabajando-con-amazon-documentdb-desde-mi-local-164m</guid>
      <description>&lt;p&gt;Hoy vamos a continuar con la 2da parte de las series de Aprendiendo DocumentDB en AWS, Si aún no has visto la 1ra parte, ve aquí: &lt;a href="https://www.cortez.cloud/blog/primeros-pasos-con-amazon-documentdb-y-aws-cli" rel="noopener noreferrer"&gt;https://www.cortez.cloud/blog/primeros-pasos-con-amazon-documentdb-y-aws-cli&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It is always important to work locally, and mistakes and oversights almost always leave our development environments exposed to insecure local connections. To prevent that, we will secure our connection step by step, and in this post we will learn to use the tools that give us better administration of our instance.&lt;/p&gt;

&lt;p&gt;In the previous post we explained how to create a DocumentDB cluster and instances with the AWS CLI, so let's start from there and see how to work with it locally.&lt;/p&gt;

&lt;h2&gt;
  
  
  Using Robo3T (formerly Robomongo)
&lt;/h2&gt;

&lt;p&gt;First of all, let's look at the essential and most efficient architecture for working with DocumentDB:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd6yodl703cszdby6vju2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd6yodl703cszdby6vju2.png" alt="Alt Text" width="800" height="471"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From my desktop, I install Robo3T and the MongoDB 3.6 client (available for macOS, Windows, and Linux):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://robomongo.org/download" rel="noopener noreferrer"&gt;https://robomongo.org/download&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Nice architecture, but what is a Bastion Host?
&lt;/h2&gt;

&lt;p&gt;It is a jump server: a host we use as a stepping stone into other parts of the network. In this case, we will use it to jump from a public subnet into the private subnet where DocumentDB lives.&lt;/p&gt;

&lt;p&gt;For example:&lt;br&gt;
an EC2 instance, a virtual machine, a droplet, etc.&lt;/p&gt;
&lt;h2&gt;
  
  
  Step by step: configuring the connection to my cluster from scratch
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Create an EC2 instance as the bastion host:&lt;/p&gt;

&lt;p&gt;We have already covered the CLI, so let's do the following: create an EC2 instance and a security group, and add rules allowing access from your home IP:&lt;br&gt;
&lt;/p&gt;


&lt;/li&gt;

&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ aws ec2 create-security-group --group-name docdb --description "docdb" --vpc-id  vpc-81a06de4
{
    "GroupId": "sg-031f86d07f253eacf"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now check your public IP, also from the CLI; we are CLI fanatics by now!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;curl https://checkip.amazonaws.com
190.236.xxx.xxx 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;*(use your own result in the following lines...)&lt;/p&gt;

&lt;p&gt;Let's add these rules to our new security group so we can then attach it to the bastion host EC2:&lt;/p&gt;

&lt;p&gt;Open port 22&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ec2 authorize-security-group-ingress --group-id sg-031f86d07f253eacf --protocol tcp --port 22 --cidr 190.236.xxx.xxx/32
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Open ports 27017 and 27018 (MongoDB, plus an extra port for the bastion host)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ec2 authorize-security-group-ingress --group-id sg-031f86d07f253eacf --protocol tcp --port 27017 --cidr 190.236.xxx.xxx/32
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ec2 authorize-security-group-ingress --group-id sg-031f86d07f253eacf --protocol tcp --port 27018 --cidr 190.236.xxx.xxx/32
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Allow MongoDB access from the security group to itself:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ec2 authorize-security-group-ingress --group-id sg-031f86d07f253eacf --protocol tcp --port 27017 --source-group sg-031f86d07f253eacf
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now create the instance:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ aws ec2 run-instances --image-id ami-067f5c3d5a99edc80 --count 1 --instance-type t2.micro --key-name docdb-ccortez --security-group-ids sg-031f86d07f253eacf --subnet-id subnet-6d558c08
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="2"&gt;
&lt;li&gt;Configure the SSH tunnel so we can enable or disable it whenever we want:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sudo vim .ssh/config

Host tunnel
HostName 52.89.199.68
User ec2-user
IdentitiesOnly yes
IdentityFile ~/llaves/docdb-ccortez.pem
LocalForward 27018 docdb-001.cluster-cgyttffk8nhp.us-west-2.docdb.amazonaws.com:27017
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can also put this in a separate executable script instead of overwriting the ~/.ssh/config file.&lt;/p&gt;

&lt;p&gt;NOTE: if you have trouble opening the tunnel, or you hit an error, you can refresh, clear the cache, or open a new terminal to renew the tunnel connection.&lt;/p&gt;
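&lt;p&gt;With the &lt;code&gt;Host tunnel&lt;/code&gt; entry above in place, opening the tunnel and pointing a client at it looks roughly like this (a sketch only: the alias &lt;code&gt;tunnel&lt;/code&gt; comes from the config above, and the &lt;code&gt;masteruser&lt;/code&gt; credentials are placeholders):&lt;/p&gt;

```shell
# -f: go to the background once connected; -N: forward ports only, no remote shell
ssh -f -N tunnel

# Any local client can now reach DocumentDB through localhost:27018, e.g.:
mongo --host localhost:27018 --username masteruser --password
```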

&lt;ol start="3"&gt;
&lt;li&gt;Start our connection with Robo3T. On the first tab, &lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Initial config:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd0gpxuhp22pqqy0q4w9o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd0gpxuhp22pqqy0q4w9o.png" alt="Alt Text" width="800" height="547"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="4"&gt;
&lt;li&gt;Authentication&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Optional:&lt;br&gt;
    Database: admin (we will not be able to see the admin database itself, but we will see all the others)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjn9xqidqfegx9kjd9s3c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjn9xqidqfegx9kjd9s3c.png" alt="Alt Text" width="800" height="544"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="5"&gt;
&lt;li&gt;SSH tunnel:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ctqg8e9c678p50i3yxp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ctqg8e9c678p50i3yxp.png" alt="Alt Text" width="800" height="545"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="6"&gt;
&lt;li&gt;SSL Certificate (if applicable):&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;(Advanced Options)&lt;br&gt;
Invalid Hostnames: Allowed&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foexxdjm984zxkrr8682v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foexxdjm984zxkrr8682v.png" alt="Alt Text" width="800" height="549"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="7"&gt;
&lt;li&gt;Final connection test:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxy34139prxongu13hbpk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxy34139prxongu13hbpk.png" alt="Alt Text" width="800" height="232"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol start="8"&gt;
&lt;li&gt;If your cluster was created without SSL: (optional; in this example I used another cluster, called docdb2, that I created without SSL)&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fycrr2z6vj6msu0dzfou1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fycrr2z6vj6msu0dzfou1.png" alt="Alt Text" width="800" height="232"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusions:
&lt;/h2&gt;

&lt;p&gt;Connecting to DocumentDB is easy, and secure.&lt;br&gt;
It encourages us to build safer architectures and to keep databases in the private layer.&lt;br&gt;
With these tutorials, you will learn to use a NoSQL database from scratch.&lt;/p&gt;

&lt;h2&gt;
  
  
  Next steps
&lt;/h2&gt;

&lt;p&gt;In the next post, we will start migrating to Amazon DocumentDB with AWS DMS.&lt;/p&gt;

&lt;p&gt;We still need to dig deeper into security and access, and keep learning more about the AWS CLI.&lt;/p&gt;

&lt;p&gt;Subscribe to my channel, Breaking the Cloud, and Al día con AWS at &lt;a href="https://cortez.cloud" rel="noopener noreferrer"&gt;https://cortez.cloud&lt;/a&gt;&lt;br&gt;
If you liked this post, give it a like, share, and comment.&lt;br&gt;
⭐Subscribe to my channel: &lt;a href="http://bit.ly/aldiaconaws" rel="noopener noreferrer"&gt;http://bit.ly/aldiaconaws&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  🔥🔥 Follow me on social media 🔥🔥
&lt;/h2&gt;

&lt;p&gt;🦜 My Twitter: &lt;a href="https://twitter.com/ccortezb" rel="noopener noreferrer"&gt;https://twitter.com/ccortezb&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;📺 YouTube Channel: Al día con AWS / Breaking The Cloud: &lt;a href="http://bit.ly/aldiaconaws" rel="noopener noreferrer"&gt;http://bit.ly/aldiaconaws&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;📺 AWS Perú Community AWS UG Perú Oficial: &lt;a href="https://www.youtube.com/awsusergroupperuoficial" rel="noopener noreferrer"&gt;https://www.youtube.com/awsusergroupperuoficial&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;📟 My Facebook: &lt;a href="https://www.facebook.com/ccortezb/" rel="noopener noreferrer"&gt;https://www.facebook.com/ccortezb/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🤳 My Instagram: ccortezbazan&lt;/p&gt;

&lt;p&gt;📜 My AWS courses: &lt;a href="https://cennticloud.thinkific.com" rel="noopener noreferrer"&gt;https://cennticloud.thinkific.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🕮 My blog — &lt;a href="https://cortez.cloud" rel="noopener noreferrer"&gt;https://cortez.cloud&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  🔥🔥 About me 🔥🔥
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.cortez.cloud/whoami" rel="noopener noreferrer"&gt;https://www.cortez.cloud/whoami&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here is my small personal site, &lt;a href="https://www.Cortez.Cloud" rel="noopener noreferrer"&gt;https://www.Cortez.Cloud&lt;/a&gt;, called “Breaking the Cloud”.&lt;/p&gt;

&lt;p&gt;I will keep creating weekly AWS content on AI/ML, Serverless, Security, and how to break the rules!&lt;/p&gt;

&lt;p&gt;Plus my upcoming initiatives, workshops, courses, free videos, awsugperu, and more.&lt;/p&gt;

&lt;h1&gt;
  
  
  #aws #breakingthecloud
&lt;/h1&gt;

</description>
      <category>aws</category>
      <category>documentdb</category>
      <category>mongodb</category>
      <category>database</category>
    </item>
    <item>
      <title>Hot Take on Andy Jassy's Keynote, Part 1 - re:Invent 2020</title>
      <dc:creator>Carlos Cortez 🇵🇪 [AWS Hero]</dc:creator>
      <pubDate>Wed, 02 Dec 2020 01:40:08 +0000</pubDate>
      <link>https://dev.to/aws-builders/analisis-en-caliente-de-la-keynote-de-andy-jassy-parte-1-reinvent-2020-58mh</link>
      <guid>https://dev.to/aws-builders/analisis-en-caliente-de-la-keynote-de-andy-jassy-parte-1-reinvent-2020-58mh</guid>
      <description>&lt;p&gt;Hola a todos, bienvenido a #BreakingtheCloud, yo soy Carlos Cortez &lt;a class="mentioned-user" href="https://dev.to/ccortezb"&gt;@ccortezb&lt;/a&gt; y esto es cobertura del re:invent 2020 día tras día.&lt;/p&gt;

&lt;p&gt;Let's get straight to the point!&lt;/p&gt;

&lt;h1&gt;
  
  
  New instances and more names to memorize
&lt;/h1&gt;

&lt;p&gt;Some of them support Arm Graviton2.&lt;br&gt;
Amazon EC2 C6gn, with 100 Gbps!&lt;/p&gt;

&lt;p&gt;Also launched during the morning:&lt;br&gt;
*D3dn (dense storage)&lt;br&gt;
*R5b (EBS performance)&lt;br&gt;
*M5zn (Intel Xeon Scalable CPU)&lt;br&gt;
*G4ad (AMD GPUs)&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F0zdheh71oooyjbibqn5h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F0zdheh71oooyjbibqn5h.png" alt="graviton2" width="800" height="448"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  New features:
&lt;/h2&gt;

&lt;p&gt;The new EC2 D3 and D3dn are instances with low-cost magnetic disks. They are perfect for big data processing, hosting data on HDFS, log processing, and so on.&lt;/p&gt;

&lt;p&gt;Note that the cost per terabyte is 80% lower than when we used D2.&lt;/p&gt;

&lt;p&gt;The new EC2 R5b joins the family with the AWS Nitro System, 60 Gbps of EBS bandwidth, and 260,000 IOPS; that is off the charts! OMG! The regular R5 only reaches 19 Gbps.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fu58ihfv299v1q3yy1bpa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fu58ihfv299v1q3yy1bpa.png" alt="ec2 new instances" width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Just when we thought we had touched the sky, here comes the new EC2 M5zn! The evolution of z1d, also with the AWS Nitro System and EFA (Elastic Fabric Adapter), designed especially for HPC. &lt;/p&gt;

&lt;p&gt;But what would you need this kind of instance for? We saw it in the keynote; a good example is Boom, which builds supersonic passenger airplanes.&lt;/p&gt;

&lt;p&gt;*Aerospace industry&lt;br&gt;
*Automotive&lt;br&gt;
*Financial applications&lt;br&gt;
*Application modeling and simulation&lt;br&gt;
*Energy, etc.&lt;/p&gt;
&lt;h1&gt;
  
  
  Habana Gaudi-based Amazon EC2 instances
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1bkt8r6jiwdfbbhsy8y9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1bkt8r6jiwdfbbhsy8y9.png" alt="gaudi" width="800" height="452"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  NVIDIA's real competition
&lt;/h2&gt;

&lt;p&gt;AWS never misses a chance to present new technology for machine learning, but what exactly is Habana Gaudi? &lt;br&gt;
Habana Labs is an Israeli start-up, later acquired by Intel, that builds AI processors called Gaudi AI processors. &lt;/p&gt;

&lt;p&gt;They are excellent for running deep learning models across the board: NLP, computer vision, recommendation engines, etc.&lt;br&gt;
Why Habana Gaudi?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fsg9aeqw709zrivmn8jyo.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fsg9aeqw709zrivmn8jyo.jpeg" alt="Alt Text" width="309" height="163"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;40% cheaper than the previous GPU-based EC2 instances for ML. And most striking of all, we will get up to 8 Gaudi cards:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;An 8-card Gaudi EC2 instance can process about 12,000 images per second training the ResNet-50 model on TensorFlow. Each Gaudi processor integrates 32GB of HBM2 memory and features RoCE on-chip integration used for inter-processor connectivity inside the server. Scaling across servers will be enabled using the AWS Elastic Fabric Adapter (EFA) technology, allowing AWS and its customers to seamlessly expand use of multiple Gaudi based systems for efficient and scalable distributed training.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;code&gt;Source: Habana: https://habana.ai/habana-gaudi-ai-processors-to-bring-lower-cost-to-train-to-amazon-ec2-customers/&lt;/code&gt;&lt;/p&gt;
&lt;h1&gt;
  
  
  Inferentia's little brother: AWS Trainium
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F002odau9yy21qr0q4upf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F002odau9yy21qr0q4upf.png" alt="trainium" width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A trillion calculations per second? The highest TFLOPS count? A training chip? What are these things, and how do we make them easier to understand?&lt;/p&gt;

&lt;p&gt;First of all, let's go back to Inferentia, which was launched in 2019. &lt;code&gt;Key words?&lt;/code&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An inference chip for ML models&lt;/li&gt;
&lt;li&gt;Neuron SDK. What is it? A compiler.&lt;/li&gt;
&lt;li&gt;Inf1 EC2 instances, which run on the Inferentia chip.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Let's learn the terms:
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;The term FLOPS (floating point operations per second) is a unit used to measure how many mathematical calculations a CPU or GPU can perform per second.&lt;br&gt;
From there come GigaFLOPS, billions of calculations per second, and TeraFLOPS, trillions of calculations per second.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;AWS Trainium comes to cover everything Inferentia does not, namely the training side. It will also work with the Neuron SDK and will be available through its own EC2 instances.&lt;/p&gt;
&lt;h3&gt;
  
  
  Can I use it yet?
&lt;/h3&gt;

&lt;p&gt;Not yet; it will become available in the first months of 2021. So stay tuned.&lt;/p&gt;
&lt;h1&gt;
  
  
  Aligning with the other clouds. ECS and EKS from anywhere.
&lt;/h1&gt;

&lt;p&gt;Now we can literally run workloads from wherever we please. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fltx5415pdpk9oiozo4uz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fltx5415pdpk9oiozo4uz.png" alt="ecs" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  How will it work?
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;We keep using the classic ECS control plane inside the region&lt;/li&gt;
&lt;li&gt;The AWS Systems Manager agent is installed &lt;/li&gt;
&lt;li&gt;A new launch type appears for ECS: besides the familiar “EC2” and “FARGATE”, there will be a new one called “EXTERNAL”.&lt;/li&gt;
&lt;/ol&gt;
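&lt;p&gt;Once external instances are registered to a cluster, running a task on them is just a matter of choosing the new launch type. A hedged sketch (the cluster and task definition names here are made up):&lt;/p&gt;

```shell
# Run a task on on-premises capacity using the new EXTERNAL launch type
aws ecs run-task \
  --cluster my-anywhere-cluster \
  --launch-type EXTERNAL \
  --task-definition my-app:1
```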

&lt;blockquote&gt;
&lt;p&gt;Check the official blog, where you will find a demo and a diagram of how it will really look:&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F9b7ntwynnc9n0jbccwce.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F9b7ntwynnc9n0jbccwce.png" alt="Alt Text" width="800" height="397"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  EKS Distro goes open source!
&lt;/h2&gt;

&lt;p&gt;ECS is not the only one getting an update: EKS will also be able to “deploy anywhere”, in a different way. The Kubernetes distribution that EKS uses will be made available to the community. &lt;/p&gt;

&lt;p&gt;Let's see what we can build with this new capability, to which we will only get full access around mid-2021.&lt;/p&gt;

&lt;p&gt;GitHub repo:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;a href="https://github.com/aws/eks-distro" rel="noopener noreferrer"&gt;https://github.com/aws/eks-distro&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h1&gt;
  
  
  Lambda now bills per 1 ms. Should I worry?
&lt;/h1&gt;

&lt;p&gt;Serverless lovers, isn't it great that Lambda now charges us for each millisecond of compute? &lt;/p&gt;

&lt;p&gt;Only if you have workloads with millions and millions of transactions will you see a substantial difference. &lt;br&gt;
I should highlight that it will also be supported with Provisioned Concurrency, useful for load testing. Now that is spectacular!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fm53w6s9bs47wcvu7en8z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fm53w6s9bs47wcvu7en8z.png" alt="lambda" width="800" height="451"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Example:
&lt;/h2&gt;

&lt;p&gt;50 million transactions at 100 ms vs 50 ms yields roughly a 25% cost improvement.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6ra0t0w4k7ql1l9z2hri.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6ra0t0w4k7ql1l9z2hri.png" alt="Lambda comparison ms" width="800" height="419"&gt;&lt;/a&gt;&lt;/p&gt;
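&lt;p&gt;A quick back-of-envelope check of that figure: under the old 100 ms granularity, a 50 ms invocation was still billed as 100 ms, while per-millisecond billing charges the real 50 ms. This sketch assumes 128 MB functions, $0.0000166667 per GB-second, and $0.20 per million requests (my assumptions, not figures from the keynote):&lt;/p&gt;

```shell
# 50M invocations: duration cost (seconds * GB * $/GB-s) plus request cost
old=$(awk 'BEGIN { printf "%.2f", 50e6 * 0.100 * 0.125 * 0.0000166667 + 50 * 0.20 }')
new=$(awk 'BEGIN { printf "%.2f", 50e6 * 0.050 * 0.125 * 0.0000166667 + 50 * 0.20 }')
echo "billed at 100 ms: \$$old ; billed at 50 ms: \$$new"
```

&lt;p&gt;That comes out to about $20.42 versus $15.21, a drop of roughly 25%, in line with the slide.&lt;/p&gt;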
&lt;h1&gt;
  
  
  Developing microservices now with AWS Proton! Seriously? More vendor lock-in? More simplicity?
&lt;/h1&gt;

&lt;p&gt;After reading through part of the documentation, I have to tell you to make the most of this post, because I will explain something that took me several minutes to understand. Come on, AWS, you really made something complicated at first, with all those new components: templates, environments, services, service templates, and more! Ufff, now let's get to AWS Proton:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffdl0ia4g2u7d7m48mjz0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffdl0ia4g2u7d7m48mjz0.png" alt="proton" width="800" height="413"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  What is it, really?
&lt;/h2&gt;
&lt;h3&gt;
  
  
  It automates:
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;provisioning and &lt;/li&gt;
&lt;li&gt;deployment: only for serverless- and container-related infrastructure, with two paths: the infra team provisions and the development team consumes.&lt;/li&gt;
&lt;li&gt;Standardized infrastructure: CloudFormation&lt;/li&gt;
&lt;li&gt;Integrated CI/CD, ready for developers to consume.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2&gt;
  
  
  How will it work?
&lt;/h2&gt;

&lt;p&gt;First of all, the documentation lists some prerequisites!!!!!&lt;br&gt;
&lt;code&gt;Admin permissions, IAM roles, knowledge of CloudFormation and Jinja, and a GitHub repository&lt;/code&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Proton from scratch:
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Create an environment template&lt;/li&gt;
&lt;li&gt;Create an environment (Proton Environment)&lt;/li&gt;
&lt;li&gt;Create a service template&lt;/li&gt;
&lt;li&gt;Create a Proton Service and deploy an application&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffbiramik5ad8atx0munx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffbiramik5ad8atx0munx.png" alt="proton_Desde_cero" width="800" height="364"&gt;&lt;/a&gt;&lt;/p&gt;
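&lt;p&gt;On the CLI side, the first of those steps maps onto the &lt;code&gt;aws proton&lt;/code&gt; commands. A sketch only: the template name is made up, and since the service was announced in preview, flags may differ from what finally ships:&lt;/p&gt;

```shell
# Step 1: register an environment template, then confirm it is listed
aws proton create-environment-template --name my-env-template
aws proton list-environment-templates
```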

&lt;p&gt;Proton simplifies infrastructure delivery and also defines a standard. True, we are still using CloudFormation; we may well see internal integrations with SAM, and later third-party apps like Terraform, being added to this great tool.&lt;/p&gt;

&lt;p&gt;It does create some vendor lock-in, since we keep using CloudFormation; perhaps the roadmap includes integrating template testing tools as well as security testing. &lt;/p&gt;
&lt;h2&gt;
  
  
  Benefits?
&lt;/h2&gt;

&lt;p&gt;*Fast infrastructure delivery for those of us who already run everything on CloudFormation.&lt;br&gt;
*Simplicity for developers to choose and deploy into isolated environments&lt;br&gt;
*Quick application testing&lt;/p&gt;

&lt;p&gt;Keep an eye on what's coming with AWS Proton!&lt;/p&gt;
&lt;h1&gt;
  
  
  Updated at last! The new generation of EBS is here: gp3 and io2
&lt;/h1&gt;
&lt;h2&gt;
  
  
  What's new with gp3? 20% cheaper and 4x the performance
&lt;/h2&gt;

&lt;p&gt;Simply a very good option: we now get a baseline of 3,000 IOPS for the whole volume, whereas gp2 gives us 3 IOPS per GB.&lt;/p&gt;

&lt;p&gt;We will have to fine-tune the calculator to see which one wins, and when.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;Immediate action&lt;/code&gt;: migrate everything to gp3 right now!&lt;/p&gt;
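&lt;p&gt;Migrating is an in-place &lt;code&gt;modify-volume&lt;/code&gt; call with &lt;code&gt;gp3&lt;/code&gt; as the target type, with no detach needed (the volume ID below is a placeholder):&lt;/p&gt;

```shell
# Convert an existing gp2 volume to gp3 in place
aws ec2 modify-volume --volume-id vol-0123456789abcdef0 --volume-type gp3
```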

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fplwo8vwaktn2vgy8gtp9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fplwo8vwaktn2vgy8gtp9.png" alt="gp3" width="800" height="446"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;You can read more on the official AWS blog:&lt;br&gt;
&lt;a href="https://aws.amazon.com/es/blogs/aws/new-amazon-ebs-gp3-volume-lets-you-provision-performance-separate-from-capacity-and-offers-20-lower-price/" rel="noopener noreferrer"&gt;https://aws.amazon.com/es/blogs/aws/new-amazon-ebs-gp3-volume-lets-you-provision-performance-separate-from-capacity-and-offers-20-lower-price/&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;And finally, we have:&lt;/p&gt;
&lt;h1&gt;
  
  
  The new EBS io2 Block Express at the same price as io1?
&lt;/h1&gt;

&lt;p&gt;The new generation of EBS volumes specialized for IOPS brings a big increase in the IOPS-per-GB ratio, going up to 500 IOPS per GB.&lt;/p&gt;

&lt;p&gt;What is interesting, and odd at the same time, is that it costs the same as the previous generation while supporting far more IOPS, so migrating everything currently on io1 to io2 is a must.&lt;/p&gt;

&lt;p&gt;Something as simple as this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws ec2 modify-volume --volume-id vol-0b3c663aeca5aabb7 --volume-type io2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can read more on the official AWS blog:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;a href="https://aws.amazon.com/es/blogs/aws/new-ebs-volume-type-io2-more-iops-gib-higher-durability/" rel="noopener noreferrer"&gt;https://aws.amazon.com/es/blogs/aws/new-ebs-volume-type-io2-more-iops-gib-higher-durability/&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fyemqovbf1250kohk1qb5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fyemqovbf1250kohk1qb5.png" alt="io2 block express" width="800" height="448"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It was definitely a very long day, full of emotion. We lived it as if it were in person, like a football match, like when Peru returned to the World Cup after 36 years: shouting at every new service launched, especially the SageMaker ones, which are my favorites, and I will write a dedicated post on each of them!&lt;/p&gt;

&lt;p&gt;That wraps up part 1 of the keynote, truly spectacular. I am looking forward to trying these services in the coming days. I hope you found this useful; please comment and share.&lt;/p&gt;

&lt;p&gt;Coming up in this series:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Part 1: EC2 and microservices&lt;/li&gt;
&lt;li&gt;Part 2: Databases and ML &lt;/li&gt;
&lt;li&gt;Part 3: Contact Center and Industrial IoT&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Join the AWS UG Perú Slack to get live news from re:Invent 2020. &lt;/p&gt;

&lt;p&gt;Stay tuned for the After Party sessions we will launch in the evenings to go over the announcements, where you can join the discussion. Experience the reactions live, review the new services, plus more surprises and giveaways!&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Slack: &lt;a href="https://join.slack.com/t/aws-peru/shared_invite/zt-jt2324io-3VBHUQ8iH5rEkqGchKicEw" rel="noopener noreferrer"&gt;https://join.slack.com/t/aws-peru/shared_invite/zt-jt2324io-3VBHUQ8iH5rEkqGchKicEw&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ft60e3l2vqu6jyvgoq5ug.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ft60e3l2vqu6jyvgoq5ug.png" alt="Alt Text" width="800" height="1131"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Carlos Cortez - AWS UG Perú Leader / AWS ML Community Builder&lt;/strong&gt;&lt;br&gt;
&lt;a href="mailto:ccortez@aws.pe"&gt;ccortez@aws.pe&lt;/a&gt;&lt;br&gt;
&lt;a class="mentioned-user" href="https://dev.to/ccortezb"&gt;@ccortezb&lt;/a&gt;&lt;br&gt;
Podcast: imperiocloud.com @imperiocloud&lt;br&gt;
twitch.tv/awsugperu&lt;br&gt;
cennticloud.thinkific.com&lt;/p&gt;

</description>
      <category>aws</category>
      <category>awsugperu</category>
      <category>reinvent2020</category>
      <category>awsperu</category>
    </item>
    <item>
      <title>First Steps with Amazon DocumentDB and the AWS CLI</title>
      <dc:creator>Carlos Cortez 🇵🇪 [AWS Hero]</dc:creator>
      <pubDate>Tue, 20 Oct 2020 06:22:26 +0000</pubDate>
      <link>https://dev.to/aws-builders/primeros-pasos-con-amazon-documentdb-y-aws-cli-5ela</link>
      <guid>https://dev.to/aws-builders/primeros-pasos-con-amazon-documentdb-y-aws-cli-5ela</guid>
      <description>&lt;p&gt;Hola a todos, bienvenido a #BreakingtheCloud, yo soy Carlos Cortez &lt;a class="mentioned-user" href="https://dev.to/ccortezb"&gt;@ccortezb&lt;/a&gt;&lt;br&gt;
Y estas serán unas series donde hablaremos de una de las bases de datos más interesantes en AWS: Amazon DocumentDB&lt;/p&gt;

&lt;p&gt;First of all, we should know what kind of database we're talking about. DocumentDB belongs to the family of document-oriented NoSQL databases.&lt;/p&gt;

&lt;p&gt;NoSQL stands for Not Only SQL; its data models allow flexible schemas that adapt to the requirements of modern applications.&lt;/p&gt;

&lt;p&gt;I recommend reading Martin Fowler:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;a href="https://martinfowler.com/tags/noSQL.html" rel="noopener noreferrer"&gt;https://martinfowler.com/tags/noSQL.html&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;But why should we use a document-oriented database? What is its real advantage? What kinds of applications are usually built on top of a document-oriented database?&lt;/p&gt;

&lt;p&gt;Let's go step by step; there is a lot to cover in the NoSQL world, and even more so on Amazon Web Services.&lt;/p&gt;

&lt;p&gt;The first thing to know is that one of the best-known commercial document-oriented databases is MongoDB.&lt;/p&gt;

&lt;p&gt;Now, according to the AWS documentation, DocumentDB is compatible with MongoDB 3.6 (at the time I am writing this post). But what does "compatible" really mean? The truth behind this database engine is that it doesn't reuse any MongoDB 3.6 code; instead, it emulates the behavior of the engine and its components.&lt;/p&gt;

&lt;p&gt;Many of the features we'll get to know work in DocumentDB: the way we run queries and create indexes is compatible, and the commands stay the same:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fbrxtvc75uphr8ny55lu3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fbrxtvc75uphr8ny55lu3.png" alt="Alt Text" width="593" height="234"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/documentdb/latest/developerguide/what-is.html" rel="noopener noreferrer"&gt;https://docs.aws.amazon.com/documentdb/latest/developerguide/what-is.html&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that we know this, let's review DocumentDB's components and features. The service is composed of a cluster layer, an instance layer, and a storage layer. Just as in MongoDB, where creating multiple instances introduces the concept of a replica set (ReplSet) with one master node and replica nodes, in DocumentDB the replicas are called reader nodes and the master is the writer node.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Flxyaqlbvhupl0bteg2zw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Flxyaqlbvhupl0bteg2zw.png" alt="Alt Text" width="800" height="464"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now let's get hands-on and start playing with the AWS command line, the famous CLI. We'll do it inside a Cloud9 instance with an instance role.&lt;/p&gt;

&lt;p&gt;First, let's create a Cloud9 instance.&lt;/p&gt;

&lt;p&gt;Next we'll create an instance role: go to the IAM service, click Create role, and attach the DocumentDB Full Access policy to the role.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnctviipnmz8v2b728533.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fnctviipnmz8v2b728533.png" alt="Alt Text" width="800" height="336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Choose the required IAM policies, in this case full access to DocumentDB; this will let us call the entire DocumentDB API from the command line.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fwg4bbqpv1clkt4n4i11w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fwg4bbqpv1clkt4n4i11w.png" alt="Alt Text" width="800" height="630"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Select EC2 as the service that will assume the role.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ftvpifhftbbz4ot3o0v4m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ftvpifhftbbz4ot3o0v4m.png" alt="Alt Text" width="800" height="580"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Create the instance role in IAM and switch the Cloud9 EC2 instance over to the new role.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F7kt3kc2wd5cdbs9zxxn6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F7kt3kc2wd5cdbs9zxxn6.png" alt="Alt Text" width="800" height="673"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then just replace Cloud9's default role with the newly created one. Alternatively, you can simply edit the default role and attach whatever policies you need.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F9p5204eegajb3q13y1yu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F9p5204eegajb3q13y1yu.png" alt="Alt Text" width="800" height="472"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we need to disable temporary credentials in Cloud9 so we can work with roles:&lt;br&gt;
Go to Preferences&lt;br&gt;
AWS Settings&lt;br&gt;
Disable Temporary Credentials&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fytd2nbzjz5l8i3ocwwdc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fytd2nbzjz5l8i3ocwwdc.png" alt="Alt Text" width="800" height="404"&gt;&lt;/a&gt;&lt;/p&gt;
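&lt;p&gt;With temporary credentials disabled, a quick sanity check (my own suggestion, not part of the original steps) is to ask STS which identity the CLI is now using; the ARN should reference the instance role you attached:&lt;/p&gt;

```shell
# Confirm the CLI is using the instance role rather than Cloud9's
# temporary credentials. The role name in the ARN depends on what you created.
aws sts get-caller-identity
# The "Arn" field should look like:
# arn:aws:sts::123456789012:assumed-role/YOUR_ROLE_NAME/i-0abc123def456
```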

&lt;p&gt;If you want to learn more about the principle of least privilege, take a look at IAM Keeper:&lt;br&gt;
&lt;a href="https://medium.com/@iamkeeper/achieving-least-privilege-permissions-in-aws-97ab8378fb2#:%7E:text=One%20of%20the%20basic%20principles,required%20to%20get%20job%20done" rel="noopener noreferrer"&gt;https://medium.com/@iamkeeper/achieving-least-privilege-permissions-in-aws-97ab8378fb2#:~:text=One%20of%20the%20basic%20principles,required%20to%20get%20job%20done&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Now that we can run the CLI in Cloud9, let's create some DocumentDB resources.&lt;/p&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws docdb create-db-cluster \
      --db-cluster-identifier docdb-cluster \
      --engine docdb \
      --deletion-protection \
      --master-username ccortez \
      --master-user-password cortez,,123 \
      --region us-west-2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can verify what you create with describe-db-clusters, like this:&lt;/p&gt;

&lt;p&gt;Describe clusters:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws docdb describe-db-clusters --filter Name=engine,Values=docdb --region us-west-2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Another way to search with filters using the CLI is the following:&lt;/p&gt;

&lt;p&gt;Describe clusters with filters:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws docdb describe-db-clusters \
   --filter Name=engine,Values=docdb \
   --db-cluster-identifier docdb-cluster \
   --region us-west-2
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
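&lt;p&gt;A handy variation (my own addition, using the CLI's standard &lt;code&gt;--query&lt;/code&gt; flag) is to extract just the cluster endpoint, which you'll need later to connect with the mongo shell:&lt;/p&gt;

```shell
# The JMESPath query pulls out only the endpoint;
# --output text drops the JSON quoting.
aws docdb describe-db-clusters \
    --db-cluster-identifier docdb-cluster \
    --query 'DBClusters[0].Endpoint' \
    --output text \
    --region us-west-2
```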



&lt;p&gt;Now that the cluster is created, let's create instances inside it:&lt;/p&gt;

&lt;p&gt;Create a DocumentDB instance in the new cluster:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws docdb create-db-instance \
    --db-cluster-identifier docdb-cluster \
    --db-instance-class db.t3.medium \
    --db-instance-identifier docdb-cluster-instance01 \
    --engine docdb
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
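&lt;p&gt;Provisioning takes several minutes. If you're scripting these steps, the CLI's built-in waiter can block until the instance is ready (a sketch using the standard &lt;code&gt;aws docdb wait&lt;/code&gt; subcommand):&lt;/p&gt;

```shell
# Block until the instance reaches the "available" state.
aws docdb wait db-instance-available \
    --db-instance-identifier docdb-cluster-instance01 \
    --region us-west-2

# Then check its status explicitly:
aws docdb describe-db-instances \
    --db-instance-identifier docdb-cluster-instance01 \
    --query 'DBInstances[0].DBInstanceStatus' \
    --output text \
    --region us-west-2
```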



&lt;p&gt;(Optional) Another way to launch DocumentDB clusters and instances is CloudFormation, AWS's infrastructure-as-code tool:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;a href="https://s3.amazonaws.com/amazon-documentdb-samples/v1/cloudformation-templates/documentdb_full_stack.yaml" rel="noopener noreferrer"&gt;https://s3.amazonaws.com/amazon-documentdb-samples/v1/cloudformation-templates/documentdb_full_stack.yaml&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
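&lt;p&gt;If you go the CloudFormation route, deploying the template looks roughly like this (a sketch: the stack name is made up, and the template declares parameters, such as the master password, that you should review and supply before launching):&lt;/p&gt;

```shell
# Launch the sample template as a stack; check the template's Parameters
# section first, since any required parameters must be passed with
# --parameters ParameterKey=...,ParameterValue=...
aws cloudformation create-stack \
    --stack-name docdb-sample \
    --template-url https://s3.amazonaws.com/amazon-documentdb-samples/v1/cloudformation-templates/documentdb_full_stack.yaml \
    --region us-west-2
```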

&lt;p&gt;Finally, we need to install the MongoDB 3.6 client in our Cloud9 console.&lt;br&gt;
We can do that by following the official documentation:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;a href="https://docs.mongodb.com/v3.6/tutorial/install-mongodb-on-amazon/" rel="noopener noreferrer"&gt;https://docs.mongodb.com/v3.6/tutorial/install-mongodb-on-amazon/&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;And let's try to connect from our Cloud9 instance:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mongo --host docdb-cluster.cluster-cgyttffk8nhp.us-west-2.docdb.amazonaws.com:27017 --username ccortez --password cortez,,123
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
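&lt;p&gt;One caveat (my own note, not in the original steps): DocumentDB clusters usually have TLS enabled in their parameter group by default, in which case the plain connection above will be rejected. If so, download the RDS CA bundle and pass it to the mongo shell:&lt;/p&gt;

```shell
# Fetch the CA bundle used by DocumentDB (the bundle URL current in 2020)
wget https://s3.amazonaws.com/rds-downloads/rds-combined-ca-bundle.pem

# Connect over TLS, using the same cluster endpoint and credentials as above
mongo --ssl --sslCAFile rds-combined-ca-bundle.pem \
    --host docdb-cluster.cluster-cgyttffk8nhp.us-west-2.docdb.amazonaws.com:27017 \
    --username ccortez --password cortez,,123
```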



&lt;p&gt;That's it for this first tutorial. I hope you found it useful; please comment and share.&lt;/p&gt;

&lt;p&gt;This DocumentDB series consists of the following posts:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Getting started with Amazon DocumentDB and the AWS CLI&lt;/li&gt;
&lt;li&gt;Working with Amazon DocumentDB from my local machine&lt;/li&gt;
&lt;li&gt;Migrating to Amazon DocumentDB with AWS DMS&lt;/li&gt;
&lt;li&gt;Migrating to Amazon DocumentDB the traditional way&lt;/li&gt;
&lt;li&gt;Configuring SSL on Amazon DocumentDB&lt;/li&gt;
&lt;li&gt;Backing up indexes with the Amazon DocumentDB Index Tool&lt;/li&gt;
&lt;li&gt;Profiling slow queries in Amazon DocumentDB&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Thank you very much,&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6vp4cr2kvfzt0tlvc1df.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F6vp4cr2kvfzt0tlvc1df.png" alt="Alt Text" width="526" height="484"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Carlos Cortez&lt;br&gt;
&lt;a href="mailto:ccortez@aws.pe"&gt;ccortez@aws.pe&lt;/a&gt;&lt;br&gt;
&lt;a class="mentioned-user" href="https://dev.to/ccortezb"&gt;@ccortezb&lt;/a&gt;&lt;br&gt;
Podcast: imperiocloud.com @imperiocloud&lt;br&gt;
twitch.tv/breakingthecloud&lt;br&gt;
cennticloud.thinkific.com&lt;/p&gt;

</description>
      <category>aws</category>
      <category>ccortezb</category>
      <category>breakingthecloud</category>
      <category>awsperu</category>
    </item>
  </channel>
</rss>
