This blog is an English translation of the following article:
https://zenn.dev/clouddevcode/articles/53713a1bd72fb7
We tried out Amazon Bedrock AgentCore, a service announced at AWS Summit NYC that lets you deploy AI agents in a managed environment, using a sample application.
https://aws.amazon.com/bedrock/agentcore/
Strands Agents Sample Application
This is the code for calling the AWS Knowledge MCP server from Strands Agents.
from strands import Agent
from mcp.client.streamable_http import streamablehttp_client
from strands.tools.mcp import MCPClient
from strands.models import BedrockModel
from bedrock_agentcore.runtime import BedrockAgentCoreApp

app = BedrockAgentCoreApp()

bedrock_model = BedrockModel(
    model_id="us.anthropic.claude-sonnet-4-20250514-v1:0"
)

@app.entrypoint
async def invoke(payload):
    streamable_http_mcp_client = MCPClient(lambda: streamablehttp_client(
        "https://knowledge-mcp.global.api.aws"
    ))
    # Create an agent with MCP tools
    with streamable_http_mcp_client:
        # Get the tools from the MCP server
        tools = streamable_http_mcp_client.list_tools_sync()
        agent = Agent(model=bedrock_model, tools=tools)
        user_message = payload.get("prompt", "hello")
        stream = agent.stream_async(user_message)
        async for event in stream:
            print(event)
            yield event

if __name__ == "__main__":
    app.run()
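Before deploying, you can smoke-test the entrypoint locally: running main.py starts an HTTP server on port 8080 that follows the AgentCore Runtime contract (POST /invocations, GET /ping). A sketch, with the prompt being just an example:

```shell
# POST a payload to the locally running AgentCore runtime server
curl -X POST http://localhost:8080/invocations \
  -H "Content-Type: application/json" \
  -d '{"prompt": "describe best practice about vpc lattice"}'
```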
Deploying to Bedrock AgentCore
When you deploy a new revision with agentcore run (described below), the latest tag is overwritten. So when you create the ECR repository, make sure Tag immutability is set to Mutable.
uv run agentcore configure --entrypoint main.py -er $IAM_ROLE_ARN
uv run agentcore run
:::message
When deploying AgentCore, the following error occurred. It was caused by the installed boto3 not yet supporting Bedrock AgentCore; upgrading boto3 resolves it.
Launch failed: Unknown service: 'bedrock-agentcore-control'. Valid service
:::
AgentCore currently supports only the ARM64 architecture, so builds may fail when run from an Intel (x86_64) environment.
You will need to set up a multi-platform build environment using QEMU.
https://qiita.com/hayao_k/items/aed1b7062ba403d70a12
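As a sketch of that setup, assuming Docker with the buildx plugin is installed (the registry and repository names below are placeholders):

```shell
# Register QEMU emulators so an x86_64 host can build ARM64 images
docker run --privileged --rm tonistiigi/binfmt --install arm64

# Create and switch to a buildx builder that supports cross-platform builds
docker buildx create --name agentcore-builder --use

# Build a linux/arm64 image and push it to ECR (placeholder registry/repo)
docker buildx build --platform linux/arm64 \
  -t 123456789012.dkr.ecr.us-west-2.amazonaws.com/my-agent:latest \
  --push .
```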
It seems this can also be addressed using CodeBuild, but as mentioned in the issue below, that was not yet supported as of 0.1.0.
agentcore run --codebuild
Frontend
By calling invoke_agent_runtime from Streamlit, you can easily create a chat UI.
https://github.com/minorun365/bedrock-agentcore-sample/blob/main/frontend/client.py
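As a sketch of what such a client does, assuming a recent boto3 that includes the bedrock-agentcore service (see the upgrade note above) and a placeholder runtime ARN:

```python
import json
import uuid

# Placeholder ARN: replace with the runtime ARN printed by `agentcore run`
AGENT_RUNTIME_ARN = (
    "arn:aws:bedrock-agentcore:us-west-2:123456789012:runtime/main-xxxxxxxxxx"
)

def build_payload(prompt: str) -> bytes:
    # The entrypoint reads payload.get("prompt"), so send {"prompt": ...}
    return json.dumps({"prompt": prompt}).encode("utf-8")

def invoke(prompt: str) -> str:
    import boto3  # imported lazily so build_payload works without the AWS SDK

    client = boto3.client("bedrock-agentcore", region_name="us-west-2")
    resp = client.invoke_agent_runtime(
        agentRuntimeArn=AGENT_RUNTIME_ARN,
        runtimeSessionId=str(uuid.uuid4()),  # any sufficiently long unique ID
        payload=build_payload(prompt),
    )
    # The response body streams back; read it fully here for simplicity
    return resp["response"].read().decode("utf-8")
```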
Logging
AgentCore logs are output only to CloudWatch Logs.
All prompts and Tool use steps are recorded as well.
{
  "resource": {
    "attributes": {
      "deployment.environment.name": "bedrock-agentcore:default",
      "aws.local.service": "main.DEFAULT",
      "service.name": "main.DEFAULT",
      "cloud.region": "us-west-2",
      "aws.log.stream.names": "runtime-logs",
      "telemetry.sdk.name": "opentelemetry",
      "aws.service.type": "gen_ai_agent",
      "telemetry.sdk.language": "python",
      "cloud.provider": "aws",
      "cloud.resource_id": "arn:aws:bedrock-agentcore:us-west-2:xxxxxxxxxxxx:runtime/main-rVLdVz8WIg/runtime-endpoint/DEFAULT:DEFAULT",
      "aws.log.group.names": "/aws/bedrock-agentcore/runtimes/main-rVLdVz8WIg-DEFAULT",
      "telemetry.sdk.version": "1.33.1",
      "cloud.platform": "aws_bedrock_agentcore",
      "telemetry.auto.version": "0.1.6-aws"
    }
  },
  "scope": {
    "name": "opentelemetry.instrumentation.botocore.bedrock-runtime",
    "schemaUrl": "https://opentelemetry.io/schemas/1.30.0"
  },
  "timeUnixNano": 1752890294921734500,
  "observedTimeUnixNano": 1752890294921745001,
  "severityNumber": 9,
  "severityText": "",
  "body": {
    "content": [
      {
        "text": "describe best practice about vpc lattice"
      }
    ]
  },
  "attributes": {
    "event.name": "gen_ai.user.message",
    "gen_ai.system": "aws.bedrock"
  },
  "flags": 1,
  "traceId": "687afbb1fd5cc8569d31603b101fdc6c",
  "spanId": "a79e2e26dfb98b99"
}
In addition, OpenTelemetry trace information is recorded as CloudWatch custom metrics.
Since the CloudWatch Agent does not support histogram metrics, response-time metrics are not available.
We are also concerned about the cost of writing large amounts of data to custom metrics.
We hope that AWS will support Amazon Managed Service for Prometheus in the future.
{
  "_aws": {
    "Timestamp": 1752883420402,
    "CloudWatchMetrics": [
      {
        "Namespace": "bedrock-agentcore",
        "Dimensions": [
          [
            "http.scheme",
            "http.host",
            "http.flavor",
            "http.method",
            "http.server_name"
          ]
        ],
        "Metrics": [
          {
            "Name": "http.server.active_requests"
          }
        ]
      }
    ]
  },
  "Version": "1",
  "otel.resource.telemetry.sdk.language": "python",
  "otel.resource.telemetry.sdk.name": "opentelemetry",
  "otel.resource.telemetry.sdk.version": "1.33.1",
  "otel.resource.service.name": "main.DEFAULT",
  "otel.resource.aws.log.group.names": "/aws/bedrock-agentcore/runtimes/main-rVLdVz8WIg-DEFAULT",
  "otel.resource.aws.log.stream.names": "runtime-logs",
  "otel.resource.deployment.environment.name": "bedrock-agentcore:default",
  "otel.resource.cloud.resource_id": "arn:aws:bedrock-agentcore:us-west-2:xxxxxxxxxxxx:runtime/main-rVLdVz8WIg/runtime-endpoint/DEFAULT:DEFAULT",
  "otel.resource.cloud.platform": "aws_bedrock_agentcore",
  "otel.resource.cloud.provider": "aws",
  "otel.resource.cloud.region": "us-west-2",
  "otel.resource.telemetry.auto.version": "0.1.6-aws",
  "otel.resource.aws.local.service": "main.DEFAULT",
  "otel.resource.aws.service.type": "gen_ai_agent",
  "http.server.active_requests": 0,
  "http.scheme": "http",
  "http.host": "127.0.0.1:8080",
  "http.flavor": "1.1",
  "http.method": "GET",
  "http.server_name": "localhost:8080"
}
Ping requests arrive every minute, so the trace information above gets buried in log entries like the following.
INFO: 127.0.0.1:40694 - "GET /ping HTTP/1.1" 200 OK
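If the ping entries drown out the traces you care about, a CloudWatch Logs Insights query can filter them out; a minimal sketch, run against the runtime log group shown above:

```
fields @timestamp, @message
| filter @message not like "GET /ping"
| sort @timestamp desc
```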