No OpenAI API key needed. No expensive subscriptions. Just AWS and Python.
Why I Built This
Everyone is talking about AI chatbots.
But every tutorial I found required:
- OpenAI API key (costs money)
- Complex setup
- Third party services
- Hours of configuration
Then I discovered AWS Bedrock.
Bedrock gives you access to powerful AI models — Claude, Llama, Mistral — directly through your existing AWS account. No separate subscription. No new account. Just boto3 and Python.
I built a working AI chatbot in 30 minutes. Here's exactly how. 🚀
What We're Building
By the end of this article you'll have:
- ✅ A working AI chatbot in Python
- ✅ Powered by a Claude model via AWS Bedrock
- ✅ Runs from your terminal
- ✅ Remembers conversation history
- ✅ Handles errors gracefully
Let's build it. ⏱️
Prerequisites
Before starting, make sure you have:
- AWS account (free tier works)
- Python installed
- boto3 installed (`pip install boto3`)
- AWS credentials configured (`aws configure`)
That's it. Nothing else needed. ✅
Step 1 — Enable AWS Bedrock Access (5 minutes)
By default Bedrock models are not enabled. You need to request access first.
1. Go to AWS Console → Amazon Bedrock
2. Click "Model access" in left sidebar
3. Click "Manage model access"
4. Select "Claude" models (Anthropic)
5. Click "Request model access"
6. Wait 2-3 minutes for approval
Once approved you'll see green checkmarks next to Claude models. ✅
Important: Bedrock is available in specific regions. Use us-east-1 (N. Virginia) for best model availability.
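If you'd rather confirm access from code, boto3 also has a control-plane client (service name `bedrock`, not `bedrock-runtime`) with a `list_foundation_models` call. Here's a minimal sketch — the `sample` data below is just my illustration of the response shape, so the helper stays runnable without AWS credentials:

```python
# Sketch: verify Claude access programmatically. With credentials configured,
# you would fetch the live list using the control-plane client:
#   bedrock = boto3.client("bedrock", region_name="us-east-1")
#   summaries = bedrock.list_foundation_models()["modelSummaries"]

def claude_model_ids(model_summaries):
    """Pick out the Anthropic model IDs from a list_foundation_models response."""
    return [m["modelId"] for m in model_summaries
            if m.get("providerName") == "Anthropic"]

# Illustrative entries shaped like the live response
sample = [
    {"modelId": "anthropic.claude-3-haiku-20240307-v1:0", "providerName": "Anthropic"},
    {"modelId": "meta.llama3-8b-instruct-v1:0", "providerName": "Meta"},
]
print(claude_model_ids(sample))  # only the Anthropic entry
```

If the Claude IDs show up in the live list, your access request went through.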
Step 2 — Test Your First Bedrock API Call (5 minutes)
Before building the chatbot, let's make sure everything works.
Create a file called test_bedrock.py:
```python
import boto3
import json

def test_bedrock():
    # Create Bedrock client
    # Note: use us-east-1 for best model availability
    bedrock = boto3.client(
        service_name='bedrock-runtime',
        region_name='us-east-1'
    )

    # Prepare the request
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 500,
        "messages": [
            {
                "role": "user",
                "content": "Say hello and tell me what you can help with in one sentence."
            }
        ]
    })

    # Call the Claude model
    response = bedrock.invoke_model(
        body=body,
        modelId="anthropic.claude-3-haiku-20240307-v1:0",
        accept="application/json",
        contentType="application/json"
    )

    # Parse and print the response
    response_body = json.loads(response.get('body').read())
    message = response_body['content'][0]['text']
    print(f"AI Response: {message}")

test_bedrock()
```
Run it:
```bash
python test_bedrock.py
```
If you see an AI response — everything is working! 🎉
Step 3 — Build the Complete Chatbot (15 minutes)
Now let's build the real chatbot with conversation memory.
Create a file called chatbot.py:
```python
import boto3
import json

class BedrockChatbot:
    def __init__(self):
        # Initialize Bedrock client
        self.bedrock = boto3.client(
            service_name='bedrock-runtime',
            region_name='us-east-1'
        )

        # Model ID — Claude Haiku is fast and cost effective
        self.model_id = "anthropic.claude-3-haiku-20240307-v1:0"

        # Conversation history — this gives the chatbot memory
        self.conversation_history = []

        # System prompt — defines the chatbot's personality
        self.system_prompt = """You are a helpful AI assistant specializing in
AWS cloud services and Python development. You give clear, practical
answers with code examples when relevant. Keep responses concise and useful."""

        print("🤖 AWS Bedrock Chatbot initialized!")
        print("Type 'quit' to exit, 'clear' to reset conversation\n")

    def chat(self, user_message):
        # Add the user message to history
        self.conversation_history.append({
            "role": "user",
            "content": user_message
        })

        # Prepare the request body
        body = json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 1000,
            "system": self.system_prompt,
            "messages": self.conversation_history
        })

        try:
            # Call the Bedrock API
            response = self.bedrock.invoke_model(
                body=body,
                modelId=self.model_id,
                accept="application/json",
                contentType="application/json"
            )

            # Parse the response
            response_body = json.loads(response.get('body').read())
            assistant_message = response_body['content'][0]['text']

            # Add the assistant response to history —
            # this is what gives the chatbot memory of the conversation
            self.conversation_history.append({
                "role": "assistant",
                "content": assistant_message
            })

            return assistant_message

        except Exception as e:
            print(f"❌ Error calling Bedrock: {e}")
            # Remove the failed message from history
            self.conversation_history.pop()
            return None

    def clear_history(self):
        self.conversation_history = []
        print("✅ Conversation history cleared!\n")

    def run(self):
        print("=" * 50)
        print("  AWS Bedrock AI Chatbot")
        print("  Powered by Claude via Amazon Bedrock")
        print("=" * 50)
        print()

        while True:
            # Get user input
            try:
                user_input = input("You: ").strip()
            except KeyboardInterrupt:
                print("\n\nGoodbye! 👋")
                break

            # Handle special commands
            if not user_input:
                continue
            if user_input.lower() == 'quit':
                print("Goodbye! 👋")
                break
            if user_input.lower() == 'clear':
                self.clear_history()
                continue
            if user_input.lower() == 'history':
                print(f"\n📝 Conversation has {len(self.conversation_history)} messages\n")
                continue

            # Get the AI response
            print("\n🤖 AI: ", end="", flush=True)
            response = self.chat(user_input)
            if response:
                print(response)
            print()

# Run the chatbot
if __name__ == "__main__":
    chatbot = BedrockChatbot()
    chatbot.run()
```
Run it:
```bash
python chatbot.py
```
Step 4 — See It In Action
Here's what a real conversation looks like:
```
==================================================
  AWS Bedrock AI Chatbot
  Powered by Claude via Amazon Bedrock
==================================================

You: What is AWS Lambda?

🤖 AI: AWS Lambda is a serverless compute service that runs your code
in response to events without requiring you to manage servers. You pay
only for the compute time you consume — there's no charge when your
code is not running.

Key benefits:
- No server management
- Automatic scaling
- Pay per use
- Supports Python, Node.js, Java, and more

You: Can you show me a simple Python Lambda function?

🤖 AI: Here's a simple Python Lambda function:

def lambda_handler(event, context):
    name = event.get('name', 'World')
    return {
        'statusCode': 200,
        'body': f'Hello, {name}!'
    }
```
Notice the chatbot remembers you asked about Lambda!
```
You: clear
✅ Conversation history cleared!

You: quit
Goodbye! 👋
```
Your AI chatbot is working! 🎉
Step 5 — Make It Better (5 minutes)
Let's add a few improvements:
Add conversation saving
Add this method to your BedrockChatbot class:

```python
import json
from datetime import datetime

def save_conversation(self):
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    filename = f"conversation_{timestamp}.json"
    with open(filename, 'w') as f:
        json.dump(self.conversation_history, f, indent=2)
    print(f"✅ Conversation saved to {filename}")
```
Add to your chatbot's run() method:
```python
if user_input.lower() == 'save':
    self.save_conversation()
    continue
```
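A matching load command is an easy extension. This sketch is my own addition rather than part of the article's chatbot, so the method name and filename argument are assumptions:

```python
import json

def load_conversation(self, filename):
    # Replace the current history with a previously saved one.
    # Written to be pasted into BedrockChatbot as a method.
    with open(filename) as f:
        self.conversation_history = json.load(f)
    print(f"✅ Loaded {len(self.conversation_history)} messages from {filename}")
```

Wire it up the same way as `save`, reading the filename from the user's input.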
Change the personality
Want a different chatbot? Just change the system prompt:
```python
# Customer support bot
self.system_prompt = "You are a friendly customer support agent. Be helpful and empathetic."

# Code review bot
self.system_prompt = "You are an expert code reviewer. Review code for bugs, security issues, and improvements."

# Interview prep bot
self.system_prompt = "You are a technical interviewer. Ask coding and system design questions."
```
One line change. Completely different chatbot. 🔥
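One more optional tweak — this one is my own suggestion, not part of the article's code: `chat()` resends the entire history on every call, so very long conversations gradually cost more tokens. Capping the history is a few lines:

```python
# Sketch: cap conversation history so long chats don't grow unbounded.
def trim_history(history, max_messages=20):
    """Keep the last max_messages entries, starting on a user message
    so the user/assistant roles stay alternating."""
    if len(history) <= max_messages:
        return history
    trimmed = history[-max_messages:]
    # Drop a leading assistant message left over from a split pair
    while trimmed and trimmed[0]["role"] != "user":
        trimmed = trimmed[1:]
    return trimmed
```

You would call `self.conversation_history = trim_history(self.conversation_history)` just before building the request body in `chat()`.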
How Much Does This Cost?
This is the question everyone asks. Let me be honest.
AWS Bedrock pricing (Claude Haiku):
- Input tokens: $0.00025 per 1,000 tokens
- Output tokens: $0.00125 per 1,000 tokens
What that means in real terms:
- One average conversation (20 messages): ~$0.002 (less than 0.2 rupees!)
- 100 conversations per day: ~$0.20 per day (about ₹17)
- For personal use and learning: practically free
This is why Bedrock is perfect for developers learning AI — the cost is negligible. ✅
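You can sanity-check these numbers yourself. The sketch below uses the Haiku rates from the table above; the per-message token counts (~100 in, ~50 out) are my rough assumptions, and it ignores the fact that the chatbot resends history each turn, so treat it as a floor:

```python
# Claude 3 Haiku rates from the table above, converted to USD per token
INPUT_RATE = 0.00025 / 1000
OUTPUT_RATE = 0.00125 / 1000

def estimate_cost(input_tokens, output_tokens):
    # Simple linear cost model: tokens in/out times their per-token rates
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# 20 messages at roughly 100 input and 50 output tokens each
# lands near the ~$0.002 figure quoted above
print(f"${estimate_cost(20 * 100, 20 * 50):.4f}")
```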
Common Errors and Fixes
Error 1 — AccessDeniedException
```
botocore.exceptions.ClientError: AccessDeniedException
```
Fix: You haven't requested model access yet. Go to Bedrock console → Model access → Enable Claude models.
Error 2 — Could not connect to endpoint
```
EndpointResolutionError: Could not resolve endpoint
```
Fix: Bedrock is not available in all regions. Change region to us-east-1.
Error 3 — ValidationException
```
ValidationException: The provided model identifier is invalid
```
Fix: Double check the model ID. Use exactly: anthropic.claude-3-haiku-20240307-v1:0
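In practice all three surface as the exception caught in `chat()`. If you want friendlier messages, you can map the error code — available on a caught `botocore.exceptions.ClientError` as `e.response['Error']['Code']` — to a hint. The hint strings below are just illustrative:

```python
# Hypothetical helper: map Bedrock error codes to the fixes above
HINTS = {
    "AccessDeniedException": "Enable Claude under Bedrock -> Model access.",
    "ValidationException": "Double check the model ID spelling.",
}

def hint_for(error_code):
    # Anything else (e.g. endpoint resolution) is usually a region problem
    return HINTS.get(error_code, "Try region_name='us-east-1'.")

print(hint_for("AccessDeniedException"))
```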
What You Just Built
Let's appreciate what you created in 30 minutes 👇
- ✅ Real AI chatbot powered by Claude
- ✅ Conversation memory across messages
- ✅ Customizable personality via system prompt
- ✅ Error handling
- ✅ Conversation saving
- ✅ Running on AWS infrastructure
This is not a toy. This is production-quality foundation code.
With a few more hours you could add:
- Web interface with Flask
- Deploy on AWS Lambda
- Connect to a database
- Add user authentication
Why This Matters For Your Career
Here's something important, friend.
Most developers applying for jobs in 2026 know EITHER cloud OR AI.
You now know BOTH — AWS infrastructure AND AI integration.
That combination is genuinely rare and genuinely valuable.
Put this project on your GitHub. Add "AWS Bedrock" and "LLM Integration" to your LinkedIn. Mention it in interviews.
You just joined a very small group of developers who have actually built something with AWS AI services. 💪
What's Next?
Now that you have a working chatbot — here's what to build next:
- Add a web interface — Flask + Bedrock chatbot
- Deploy to AWS Lambda — serverless chatbot API
- Add RAG — chatbot that answers from your documents
- Connect to S3 — chatbot that reads your files
I'll be writing about all of these. Follow LearnWithPrashik so you don't miss them! 🙌
Final Thoughts
AWS Bedrock removes the biggest barrier to AI development — cost and complexity.
You don't need an expensive OpenAI subscription. You don't need a new account. You don't need complex setup.
Just your existing AWS account, Python, and boto3.
30 minutes from zero to working AI chatbot.
Now go build something with it. 💪
Follow LearnWithPrashik for more practical AWS and AI development content.
Connect with me:
LinkedIn: linkedin.com/in/prashik-besekar
GitHub: github.com/prashikBesekar