In 2024, I built Curiouser, an agent-to-agent (A2A) application using CrewAI with a local LLM served through Ollama. It was my first real experience orchestrating multiple agents in a single app. As someone curious about agentic frameworks, I found CrewAI intuitive and developer-friendly.
But I ran into limitations. I was running the application locally on my 8GB RAM AMD Ryzen laptop with a 256GB SSD, and it would take an average of 2–3 hours to complete a run (no bluff). Plus, I couldn't take it beyond my local machine.
Then AWS announced Amazon Bedrock AgentCore at the New York Summit, and it felt like it was built for people like me. It's a serverless runtime for deploying and operating AI agents securely at scale, using any framework (including CrewAI, LangGraph, LlamaIndex, and Strands Agents) and any foundation model in or outside of Amazon Bedrock, without managing infrastructure. So I decided to migrate my local CrewAI app to production using this new service.
This blog post is a step-by-step guide for doing exactly that.
Note: This guide assumes you already have a working CrewAI application. I'll only walk you through the changes needed to get it running on Bedrock AgentCore.
What I Built
I built Curiouser, a multi-agent AI application designed for curious minds.
- It provides detailed yet concise summaries of tech topics.
- It generates test questions to help you evaluate your understanding.
- It uses three cooperating agents, each with a different role.
- I switched from Ollama to the Amazon Bedrock Nova Pro model.
- It also stores user-generated content in Amazon S3 (a rough sketch of this layout follows below).
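To make that concrete, here is a rough sketch of how a crew like this can be wired up. The agent and task names and the S3 helper are hypothetical stand-ins, not the actual Curiouser source; they only illustrate the three-agent layout and the S3 storage step.

# A rough, illustrative sketch of the crew layout; agent/task names and the
# S3 helper are hypothetical, not the actual Curiouser source.
import boto3
from crewai import Agent, Task, Crew

researcher = Agent(
    role="Tech Researcher",
    goal="Gather accurate, up-to-date information on a given tech topic",
    backstory="A meticulous researcher who loves primary sources.",
)
summarizer = Agent(
    role="Summarizer",
    goal="Turn research notes into a detailed yet concise summary",
    backstory="An explainer writing for curious minds.",
)
quiz_master = Agent(
    role="Quiz Master",
    goal="Write test questions that check the reader's understanding",
    backstory="A teacher who loves probing questions.",
)

research_task = Task(
    description="Research the topic: {topic}.",
    expected_output="Well-organized research notes.",
    agent=researcher,
)
summary_task = Task(
    description="Summarize the research notes on {topic}.",
    expected_output="A detailed yet concise summary.",
    agent=summarizer,
)
quiz_task = Task(
    description="Create test questions on {topic} from the summary.",
    expected_output="A short list of questions with answers.",
    agent=quiz_master,
)

crew = Crew(
    agents=[researcher, summarizer, quiz_master],
    tasks=[research_task, summary_task, quiz_task],
)

def save_to_s3(bucket: str, key: str, body: str) -> None:
    """Store generated content in S3; bucket and key are placeholders."""
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=body.encode("utf-8"))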
Prerequisites
To follow along, you'll need:
- A working CrewAI app
- An AWS account with access to:
- Amazon Bedrock
- Bedrock AgentCore
- S3
- Python 3.10+
- Docker installed and running
- AWS CLI installed and configured
- The bedrock-agentcore and bedrock-agentcore-starter-toolkit Python packages
Step 1 - Update Your Agents to Use a Bedrock-Supported LLM
Since the agent will now run in the cloud rather than against a local Ollama server, you'll want to update your CrewAI agents to use an Amazon Bedrock foundation model.
Check out the list of supported foundation models in the Bedrock documentation (linked in the References below).
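If your agents currently point at Ollama, the switch is usually a small change, because CrewAI routes model calls through LiteLLM. Here's a minimal sketch, assuming the LiteLLM-style "bedrock/..." model string; verify the exact Nova Pro model ID (or inference profile) exposed in your region before using it.

# Minimal sketch of swapping the LLM, assuming CrewAI's LiteLLM-backed LLM
# class and a "bedrock/..." model string. Verify the exact Nova Pro model ID
# (or inference profile) available in your region.
from crewai import Agent, LLM

bedrock_llm = LLM(model="bedrock/amazon.nova-pro-v1:0")  # check the model catalog

summarizer = Agent(
    role="Summarizer",
    goal="Turn research notes into a detailed yet concise summary",
    backstory="An explainer writing for curious minds.",
    llm=bedrock_llm,  # previously something like LLM(model="ollama/llama3")
)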
Step 2 - Create a Remote Entrypoint for Bedrock AgentCore
We'll convert your local app into a remote one by modifying the entrypoint. Here's what to do:
- Import AgentCore:
from bedrock_agentcore.runtime import BedrockAgentCoreApp
- Initialize the runtime app:
app = BedrockAgentCoreApp()
- Define the entrypoint handler:
@app.entrypoint
def app_entry(payload, context):
    """Handle agent invocations"""
    try:
        # `crew` is the existing CrewAI Crew instance defined in your project
        user_message = payload.get("prompt", "Web3")
        result = crew.kickoff(inputs={'topic': user_message})
        return {"result": result.raw}
    except Exception as e:
        return {"error": str(e)}
- Run the app:
if __name__ == "__main__":
    app.run()
AgentCore automatically:
- Sets up an HTTP server on port 8080
- Exposes /invocations and /ping endpoints
- Manages content types, responses, and errors
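Before touching any AWS infrastructure, you can smoke-test the new entrypoint locally. The sketch below assumes the defaults described above (port 8080, /ping and /invocations) and the "prompt" key used in app_entry; it's just a convenience script, not part of the deployment.

# Local smoke test for the AgentCore entrypoint. Start the app first
# (python src/curiouser/main.py), then run this script. Assumes the default
# port 8080 and the "prompt" key used in app_entry above.
import requests

ping = requests.get("http://localhost:8080/ping", timeout=10)
print("ping:", ping.status_code)

resp = requests.post(
    "http://localhost:8080/invocations",
    json={"prompt": "Serverless computing"},
    timeout=600,  # multi-agent runs can take a while
)
print(resp.json())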
Step 3 - Configure IAM Role for AgentCore
You'll need a dedicated IAM role that allows your agent to run in the cloud. Below is a Bash script I wrote to:
- Create a trust policy
- Create a permissions policy
- Attach the permissions policy to the role
- Grant access to ECR, CloudWatch Logs, X-Ray, CloudWatch metrics, Bedrock AgentCore, and Bedrock
The script also configures your agent with AgentCore:
agentcore configure --entrypoint src/curiouser/main.py -er <ROLE_ARN>
#!/bin/bash
# Setup script for CrewAI with Amazon Bedrock AgentCore
# This script creates the necessary IAM role and configures agentcore
set -e # Exit on any error
# Configuration variables
ROLE_NAME="Curioser-CrewAI-BedrockAgentCore-Role"
POLICY_NAME="Curioser-CrewAI-BedrockAgentCore-Policy"
TRUST_POLICY_FILE="trust-policy.json"
PERMISSIONS_POLICY_FILE="permissions-policy.json"
ENTRYPOINT_FILE="src/curiouser/main.py"
YELLOW='\033[1;33m'
NC='\033[0m'
echo -e "Setting up CrewAI with Amazon Bedrock AgentCore${NC}"
# Check if AWS CLI is installed and configured
if ! command -v aws &> /dev/null; then
    echo -e "AWS CLI is not installed. Please install it first.${NC}"
    exit 1
fi
# Get AWS account ID and region
echo -e "Getting AWS account information...${NC}"
ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
REGION=$(aws configure get region)
if [ -z "$REGION" ]; then
    echo -e "No default region set. Using us-east-1${NC}"
    REGION="us-east-1"
fi
echo -e "Account ID: $ACCOUNT_ID"
echo -e "Region: $REGION"
# Create trust policy for the IAM role
echo -e "Creating trust policy..."
cat > $TRUST_POLICY_FILE << EOF
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "AssumeRolePolicy",
"Effect": "Allow",
"Principal": {
"Service": "bedrock-agentcore.amazonaws.com"
},
"Action": "sts:AssumeRole",
"Condition": {
"StringEquals": {
"aws:SourceAccount": "$ACCOUNT_ID"
},
"ArnLike": {
"aws:SourceArn": "arn:aws:bedrock-agentcore:$REGION:$ACCOUNT_ID:*"
}
}
}
]
}
EOF
# Create permissions policy with dynamic values
echo -e "Creating permissions policy..."
cat > $PERMISSIONS_POLICY_FILE << EOF
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "ECRImageAccess",
"Effect": "Allow",
"Action": [
"ecr:BatchGetImage",
"ecr:GetDownloadUrlForLayer"
],
"Resource": [
"arn:aws:ecr:$REGION:$ACCOUNT_ID:repository/*"
]
},
{
"Effect": "Allow",
"Action": [
"logs:DescribeLogStreams",
"logs:CreateLogGroup"
],
"Resource": [
"arn:aws:logs:$REGION:$ACCOUNT_ID:log-group:/aws/bedrock-agentcore/runtimes/*"
]
},
{
"Effect": "Allow",
"Action": [
"logs:DescribeLogGroups"
],
"Resource": [
"arn:aws:logs:$REGION:$ACCOUNT_ID:log-group:*"
]
},
{
"Effect": "Allow",
"Action": [
"logs:CreateLogStream",
"logs:PutLogEvents"
],
"Resource": [
"arn:aws:logs:$REGION:$ACCOUNT_ID:log-group:/aws/bedrock-agentcore/runtimes/*:log-stream:*"
]
},
{
"Sid": "ECRTokenAccess",
"Effect": "Allow",
"Action": [
"ecr:GetAuthorizationToken"
],
"Resource": "*"
},
{
"Effect": "Allow",
"Action": [
"xray:PutTraceSegments",
"xray:PutTelemetryRecords",
"xray:GetSamplingRules",
"xray:GetSamplingTargets"
],
"Resource": [ "*" ]
},
{
"Effect": "Allow",
"Resource": "*",
"Action": "cloudwatch:PutMetricData",
"Condition": {
"StringEquals": {
"cloudwatch:namespace": "bedrock-agentcore"
}
}
},
{
"Sid": "GetAgentAccessToken",
"Effect": "Allow",
"Action": [
"bedrock-agentcore:GetWorkloadAccessToken",
"bedrock-agentcore:GetWorkloadAccessTokenForJWT",
"bedrock-agentcore:GetWorkloadAccessTokenForUserId"
],
"Resource": [
"arn:aws:bedrock-agentcore:$REGION:$ACCOUNT_ID:workload-identity-directory/default",
"arn:aws:bedrock-agentcore:$REGION:$ACCOUNT_ID:workload-identity-directory/default/workload-identity/agentName-*"
]
},
{
"Sid": "BedrockModelInvocation",
"Effect": "Allow",
"Action": [
"bedrock:InvokeModel",
"bedrock:InvokeModelWithResponseStream"
],
"Resource": [
"arn:aws:bedrock:*::foundation-model/*",
"arn:aws:bedrock:$REGION:$ACCOUNT_ID:*"
]
}
]
}
EOF
# Check if role already exists
echo -e "Checking if IAM role exists...${NC}"
if aws iam get-role --role-name $ROLE_NAME &> /dev/null; then
    echo -e "Role $ROLE_NAME already exists. Updating policies...${NC}"
    # Update the role's trust policy
    aws iam update-assume-role-policy --role-name $ROLE_NAME --policy-document file://$TRUST_POLICY_FILE
    # Delete existing inline policy if it exists
    aws iam delete-role-policy --role-name $ROLE_NAME --policy-name $POLICY_NAME &> /dev/null || true
else
    echo -e "Creating IAM role..."
    aws iam create-role \
        --role-name $ROLE_NAME \
        --assume-role-policy-document file://$TRUST_POLICY_FILE \
        --description "IAM role for CrewAI with Bedrock AgentCore"
fi
# Attach the permissions policy
echo -e "${YELLOW}Attaching permissions policy...${NC}"
aws iam put-role-policy \
--role-name $ROLE_NAME \
--policy-name $POLICY_NAME \
--policy-document file://$PERMISSIONS_POLICY_FILE
# Get the role ARN
ROLE_ARN=$(aws iam get-role --role-name $ROLE_NAME --query 'Role.Arn' --output text)
echo -e "IAM Role ARN: $ROLE_ARN"
# Check if src/curiouser/main.py exists
if [ -f "$ENTRYPOINT_FILE" ]; then
    echo -e "Entrypoint file $ENTRYPOINT_FILE already exists and is configured for Bedrock AgentCore"
else
    echo -e "Entrypoint file $ENTRYPOINT_FILE not found. Please ensure your CrewAI project structure is correct."
    exit 1
fi
# Wait a moment for IAM role to propagate
echo -e "Waiting for IAM role to propagate..."
sleep 10
# Configure agentcore
echo -e "Configuring agentcore..."
if command -v agentcore &> /dev/null; then
    agentcore configure --entrypoint $ENTRYPOINT_FILE -er $ROLE_ARN
    echo -e "AgentCore configured successfully!"
else
    echo -e "agentcore command not found. Please install the Bedrock AgentCore starter toolkit first."
    echo -e "You can install it with: pip install bedrock-agentcore-starter-toolkit"
    echo -e "Manual configuration command:"
    echo -e "agentcore configure --entrypoint $ENTRYPOINT_FILE -er $ROLE_ARN"
fi
Also attach any additional permissions your project needs beyond what the script grants, for example S3 access if, like Curiouser, you store user-generated content in a bucket.
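If you want to double-check the role before launching, here's an optional sketch using boto3. The role name matches the setup script above; adjust it if you renamed anything.

# Optional sanity check before `agentcore launch`: confirm the role exists
# and that its trust policy allows bedrock-agentcore to assume it.
import boto3

iam = boto3.client("iam")
role = iam.get_role(RoleName="Curioser-CrewAI-BedrockAgentCore-Role")["Role"]
print("Role ARN:", role["Arn"])

services = set()
for stmt in role["AssumeRolePolicyDocument"]["Statement"]:
    principal = stmt.get("Principal", {})
    if isinstance(principal, dict):
        service = principal.get("Service")
        if isinstance(service, str):
            services.add(service)
        elif isinstance(service, list):
            services.update(service)

assert "bedrock-agentcore.amazonaws.com" in services, "trust policy missing bedrock-agentcore"
print("Trust policy OK")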
Step 4 - Generate Docker Configuration with the AgentCore Command
Run the configure command (the setup script above already does this for you):
agentcore configure --entrypoint src/curiouser/main.py -er <ROLE_ARN>
This will:
- Generate a Dockerfile
- Create a .bedrock_agentcore.yaml configuration file
- Prepare your application for containerization
Important Fix
The generated Dockerfile doesn't install bedrock_agentcore, so be sure to add the line below or you'll run into a dependency error at runtime:
RUN pip install bedrock_agentcore
Your Dockerfile should look like this:
FROM public.ecr.aws/docker/library/python:3.10-slim
WORKDIR /app
COPY . .
RUN pip install .
RUN pip install bedrock_agentcore
RUN pip install "aws-opentelemetry-distro>=0.10.0"
ENV AWS_REGION=<your-region>
ENV AWS_DEFAULT_REGION=<your-region>
ENV DOCKER_CONTAINER=1
RUN useradd -m -u 1000 bedrock_agentcore
USER bedrock_agentcore
EXPOSE 8080
# The module path must match your entrypoint (src/curiouser/main.py here)
CMD ["opentelemetry-instrument", "python", "-m", "src.curiouser.main"]
Step 5 - Launch the Agent to the Cloud
Once everything is ready, run:
agentcore launch
This command will:
- Build the Docker image
- Push it to Amazon ECR
- Create a Bedrock AgentCore runtime
- And deploy your agent to the cloud
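To confirm the runtime actually exists after the launch finishes, you can list runtimes from code. This is a hedged sketch assuming boto3's "bedrock-agentcore-control" client and its list_agent_runtimes call; check the Boto3 docs for the exact response fields in your SDK version.

# Optional: list AgentCore runtimes to confirm the deployment. Assumes boto3's
# "bedrock-agentcore-control" client and its list_agent_runtimes call; the
# response field names may differ by SDK version.
import boto3

control = boto3.client("bedrock-agentcore-control")
for runtime in control.list_agent_runtimes().get("agentRuntimes", []):
    print(runtime.get("agentRuntimeName"), runtime.get("status"), runtime.get("agentRuntimeArn"))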
Step 6 - Test Your Deployed Agent
Use the CLI to invoke your agent:
agentcore invoke '{"prompt": "Hello"}'
Or test directly from the AWS Console.
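If you'd rather call the agent from your own application code than from the CLI or console, the data-plane API can be used directly. The sketch below assumes boto3's "bedrock-agentcore" client and its invoke_agent_runtime operation; the runtime ARN is a placeholder (take the real one from the agentcore launch output or the console), and the response handling may need adjusting for streaming responses.

# Programmatic invocation sketch, assuming boto3's "bedrock-agentcore"
# data-plane client and its invoke_agent_runtime operation. The runtime ARN
# below is a placeholder.
import json
import uuid
import boto3

client = boto3.client("bedrock-agentcore")
response = client.invoke_agent_runtime(
    agentRuntimeArn="arn:aws:bedrock-agentcore:<region>:<account-id>:runtime/<runtime-id>",
    runtimeSessionId=str(uuid.uuid4()),  # any sufficiently long unique session id
    payload=json.dumps({"prompt": "Hello"}),
    qualifier="DEFAULT",
)
body = response["response"].read()  # non-streaming responses come back as a stream
print(json.loads(body))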
Final Thoughts
That's it: you've just seen how to take a multi-agent CrewAI application from local development to a cloud-native, production-ready deployment using Amazon Bedrock AgentCore.
Personally, moving from a painfully slow local setup on limited hardware to a serverless runtime that scales, monitors, and secures my workload automatically was a game-changer. No more babysitting long runs or worrying about whether my laptop will crash before the task finishes. With Amazon Bedrock AgentCore, deployment is fast, observability is built in, and your agentic logic becomes accessible from anywhere, anytime.
For developers building serious agentic applications, especially multi-agent systems with orchestration and external data workflows, Amazon Bedrock AgentCore unlocks scalability and reliability without the traditional ops overhead. And because it's part of the AWS ecosystem, integrating other services like S3, CloudWatch, and IAM is straightforward and secure.
Whether you're:
- Building a knowledge assistant like Curiouser,
- Orchestrating tools and LLMs across complex workflows,
- Experimenting with autonomous agents in a safe and scalable environment,
Amazon Bedrock AgentCore gives you a strong foundation to focus on the intelligence of your application, not the plumbing.
If youβve already got a local CrewAI project up and running, thereβs no better time to bring it to life in the cloud.
Let me know if you try this out or run into any blockers; I'm always happy to connect with fellow builders.
References
- Amazon Bedrock AgentCore Documentation
- Amazon Bedrock AgentCore GitHub Samples
- Supported Foundation Models on Bedrock
- CrewAI Framework Documentation
- Amazon Bedrock Developer Guide