🤖 Build Your Own AI Chatbot with AWS — Step-by-Step for Beginners

Want to create an intelligent chatbot that can answer questions using your own documents? In this guide, you'll build a self-service digital assistant with Amazon Lex, Amazon Bedrock, Amazon S3, and Amazon OpenSearch Serverless, with no machine learning expertise required.

Let me walk you through it all: from setting up your data to building a fully functional Retrieval-Augmented Generation (RAG) chatbot.


🔧 Step 1: Prerequisites and Setup

Before building anything, you’ll need a few things ready:

An AWS Account

👤 An IAM user with admin access (create a dedicated IAM user for testing)

📚 Access to Bedrock Models:

  • Anthropic Claude 2 – for generating human-like responses
  • Amazon Titan Embeddings G1 - Text – for converting documents into searchable vector embeddings

📥 How to Request Model Access

  1. Go to Amazon Bedrock in the AWS Console
  2. Click Base Models on the left menu
  3. Find Claude 2 and Titan Embeddings G1 - Text
  4. If access isn’t granted yet, hover over the model name and click the request link
  5. On the request page, select the models and submit. ⏱️ Approval usually takes just a few minutes!
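
If you want to double-check access from code rather than the console, here is a minimal sketch using boto3. It assumes your IAM user's credentials are configured locally, that you work in a Bedrock-supported region such as us-east-1, and that the usual model IDs (anthropic.claude-v2 and amazon.titan-embed-text-v1) apply to your account.

```python
import boto3

# List the foundation models visible to your account and highlight the two we need.
bedrock = boto3.client("bedrock", region_name="us-east-1")

wanted = {"anthropic.claude-v2", "amazon.titan-embed-text-v1"}
for model in bedrock.list_foundation_models()["modelSummaries"]:
    if model["modelId"] in wanted:
        print(model["modelId"], "-", model["modelName"])
```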


🗃️ Step 2: Upload Your Documents to Amazon S3

Your chatbot needs information to answer questions—this comes from your own documents.

🪣 Create an S3 Bucket

  1. Go to Amazon S3 in the AWS Console
  2. Click Create bucket and give it a unique name
  3. Leave other settings as default and click Create bucket

📄 Upload Your Documents

  1. Open your bucket
  2. Click Upload and select your documents (e.g., your own PDFs)

💡 Note: S3 is just storage—it doesn’t make your data intelligent on its own. That’s where Bedrock and OpenSearch come in next.
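
If you prefer to script this step, here is a small boto3 sketch. The bucket name and file name are placeholders; remember that bucket names must be globally unique.

```python
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

bucket_name = "my-chatbot-docs-123456"  # placeholder: pick your own unique name

# Outside us-east-1 you also need CreateBucketConfiguration with a LocationConstraint.
s3.create_bucket(Bucket=bucket_name)

# Upload a document for the knowledge base to ingest later.
s3.upload_file("expense-policy.pdf", bucket_name, "expense-policy.pdf")
```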


🧠 Step 3: Create a Knowledge Base in Amazon Bedrock

A knowledge base connects your documents to your chatbot. It transforms your content into a searchable format using embeddings—mathematical representations of text.

📘 Think of embeddings as turning sentences into numbers that capture meaning. Similar ideas = similar numbers. These go into a vector database, so the AI can find the best match when answering questions.
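
To make the idea concrete, here is a sketch that asks the Titan embeddings model to turn one sentence into a vector. It assumes model access has been granted and uses the common model ID amazon.titan-embed-text-v1; the sample sentence is just an illustration.

```python
import json
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Convert one sentence into an embedding with Titan Embeddings G1 - Text.
response = runtime.invoke_model(
    modelId="amazon.titan-embed-text-v1",
    contentType="application/json",
    accept="application/json",
    body=json.dumps({"inputText": "Meals with clients can be expensed up to $50."}),
)

embedding = json.loads(response["body"].read())["embedding"]
print(len(embedding), embedding[:5])  # a long list of floats that encodes the meaning
```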

🧩 Create Your Knowledge Base

  1. Go to Amazon Bedrock > Knowledge Bases
  2. Click Create knowledge base
  3. Give it a name and let Bedrock create a new IAM role
  4. Select Amazon S3 as the data source and point to your bucket
  5. Choose the Titan Embeddings G1 - Text model
  6. For the vector database, select Amazon OpenSearch Serverless, and choose Quick create a new vector store
  7. Click Create

🔄 Sync Your Data Source

  1. In the knowledge base view, go to Data sources
  2. Select your source and click Sync

📌 This process converts your documents into searchable data.
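
The sync can also be triggered from code as an ingestion job. In this sketch the knowledge base and data source IDs are placeholders; you'll find your real values on the knowledge base's detail page in the Bedrock console.

```python
import boto3

agent = boto3.client("bedrock-agent", region_name="us-east-1")

kb_id = "YOURKBID123"  # placeholder: knowledge base ID from the Bedrock console
ds_id = "YOURDSID456"  # placeholder: data source ID listed under Data sources

# Start an ingestion job: this chunks, embeds, and indexes the S3 documents.
job = agent.start_ingestion_job(knowledgeBaseId=kb_id, dataSourceId=ds_id)
print(job["ingestionJob"]["status"])  # STARTING at first; poll get_ingestion_job until COMPLETE
```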

💬 Step 4: Build the Chatbot with Amazon Lex

Now, let’s create the chatbot that your users will talk to.

🛠️ Create a New Bot

  1. Go to Amazon Lex
  2. Click Create bot and choose Blank Bot
  3. Fill in the basics:
    • Bot name (e.g., DevOps Descent Bot)
    • New role with basic Lex permissions
    • Not subject to COPPA
    • Primary language (e.g., English (US), or your preferred language)
    • Add a short description
    • Choose a voice for spoken interactions (optional)
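
If you'd rather script the bot creation, here is a minimal equivalent of the console steps above. The role ARN is a placeholder for an IAM role with basic Lex permissions, and the bot name is just an example.

```python
import boto3

lex = boto3.client("lexv2-models", region_name="us-east-1")

bot = lex.create_bot(
    botName="DevOpsDescentBot",
    description="Self-service assistant that answers questions from my documents",
    roleArn="arn:aws:iam::123456789012:role/LexBotRole",  # placeholder role ARN
    dataPrivacy={"childDirected": False},                 # "Not subject to COPPA"
    idleSessionTTLInSeconds=300,
)
bot_id = bot["botId"]

# Add the primary language (US English here) before creating any intents.
lex.create_bot_locale(
    botId=bot_id,
    botVersion="DRAFT",
    localeId="en_US",
    nluIntentConfidenceThreshold=0.40,
)
```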

👋 Set Up a Welcome Intent

  1. Lex starts you in a default intent. Rename it to Welcome Intent
  2. Add sample phrases users might say, like:
    • “hi”
    • “hello”
    • “heyaaa”
  3. Set the initial response (e.g., “Hi, how can I help you today?”)
  4. Choose “wait for user input” as the next step
  5. Click Save
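
The same welcome intent can be sketched via the Lex V2 API. The bot ID is a placeholder for the bot you created above; the greeting message and "wait for user input" step are easiest to configure in the console as described.

```python
import boto3

lex = boto3.client("lexv2-models", region_name="us-east-1")
bot_id = "YOURBOTID12"  # placeholder: the ID of the bot created earlier

# Create the welcome intent with the same sample utterances as above.
lex.create_intent(
    botId=bot_id,
    botVersion="DRAFT",
    localeId="en_US",
    intentName="WelcomeIntent",
    sampleUtterances=[{"utterance": u} for u in ("hi", "hello", "heyaaa")],
)
```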

🧱 Build the Bot (Initial)

Click Build to make the bot understand your welcome intent.
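
The "Build" button has an API equivalent as well; a short sketch, again with a placeholder bot ID:

```python
import boto3

lex = boto3.client("lexv2-models", region_name="us-east-1")

# Building compiles the DRAFT locale so the bot can recognize its intents.
lex.build_bot_locale(botId="YOURBOTID12", botVersion="DRAFT", localeId="en_US")

# Check progress; repeat this call until the status reaches Built.
status = lex.describe_bot_locale(
    botId="YOURBOTID12", botVersion="DRAFT", localeId="en_US"
)["botLocaleStatus"]
print(status)
```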



🧠 Step 5: Add Generative AI (Q&A Intent)

Let’s now connect the bot to your documents using Bedrock and Claude.

➕ Add a New Intent

  1. Click Add intent
  2. Choose Built-in intent and select AMAZON.QnAIntent
  3. Name it (e.g., Q&A bot intent)
  4. Choose:
    • Model provider: Anthropic
    • Model version: Claude V2 (or whichever version you have access to)
    • Knowledge base: paste your Knowledge Base ID from Bedrock
  5. Leave other settings as default and click Save
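
You can also sanity-check the knowledge base and Claude pairing outside Lex with Bedrock's RetrieveAndGenerate API. In this sketch the knowledge base ID and region are placeholders, and the model ARN assumes the standard Claude 2 foundation model.

```python
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# Ask the knowledge base a question directly and let Claude 2 write the answer.
response = agent_runtime.retrieve_and_generate(
    input={"text": "What items can I expense?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "YOURKBID123",  # placeholder
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",
        },
    },
)
print(response["output"]["text"])
```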

🧱 Build the Bot (Final)

Click Build again to include your Q&A features.


🧪 Step 6: Test Your Chatbot

Now try it out right in the console!

✅ How to Test

  1. Click Test
  2. Start with a welcome phrase: “howdy”
  3. Ask real questions based on your documents:
    • “What items can I expense?”
    • “How can I become an astronaut?”

🛑 If there’s no good match, Lex will use a fallback intent. You can customize this too!

🧑‍💻 For a full frontend experience, check out AWS GitHub repos that provide chatbot UIs.
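
If you'd rather test from code than the console, here is a minimal sketch using the Lex runtime API. The bot ID is a placeholder; "TSTALIASID" is the alias Lex V2 uses for the built-in test version of the bot.

```python
import boto3

lex_runtime = boto3.client("lexv2-runtime", region_name="us-east-1")

# Send one test utterance to the draft bot and print whatever it replies.
response = lex_runtime.recognize_text(
    botId="YOURBOTID12",       # placeholder
    botAliasId="TSTALIASID",   # built-in test alias in Lex V2
    localeId="en_US",
    sessionId="test-session-1",
    text="What items can I expense?",
)

for message in response.get("messages", []):
    print(message["content"])
```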


💸 Step 7: Delete Resources to Avoid Extra Costs

Once you're done, clean up everything to prevent ongoing charges:

🧼 Clean-Up Checklist

  • 🧠 Delete the Bedrock Knowledge Base > Bedrock > Knowledge Bases > Select > Delete
  • 🧾 Delete the OpenSearch Collection > OpenSearch > Serverless > Dashboard > Delete
  • 📁 Delete S3 Bucket > Empty the bucket first, then delete
  • 🤖 Delete the Lex Bot > Lex Console > Bots > Select > Actions > Delete
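
For a scripted teardown, here is a rough sketch of the same checklist with boto3. Every ID and name is a placeholder for the resources you actually created, and the bucket must be emptied before it can be deleted.

```python
import boto3

region = "us-east-1"

# Empty the bucket first (S3 refuses to delete non-empty buckets), then remove it.
bucket = boto3.resource("s3", region_name=region).Bucket("my-chatbot-docs-123456")
bucket.objects.all().delete()
bucket.delete()

# Remove the Bedrock knowledge base and the Lex bot.
boto3.client("bedrock-agent", region_name=region).delete_knowledge_base(
    knowledgeBaseId="YOURKBID123"
)
boto3.client("lexv2-models", region_name=region).delete_bot(
    botId="YOURBOTID12", skipResourceInUseCheck=True
)

# The OpenSearch Serverless collection is billed while it exists, so delete it too.
boto3.client("opensearchserverless", region_name=region).delete_collection(
    id="YOURCOLLECTIONID"
)
```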

🎯 You Did It!

You've built an AI chatbot that answers questions from your own documents using Retrieval-Augmented Generation—all powered by fully managed AWS services.

Because every piece is a fully managed, serverless AWS service, this architecture scales with usage and is a solid starting point for a production assistant. Time to put your chatbot to work!


💬 Feel free to drop a question if you get stuck anywhere or need more clarity on any step — happy to help! 😊
