Ömer Berat Sezer

What is AWS Strands Agent? 🧬 AI App with AWS Strands, Bedrock, Nova, FastAPI, Streamlit UI

AI agents are becoming the brains behind modern apps (handling decisions, calling tools, and generating smart responses in real time). But building these agents at scale requires a solid framework.

In the past three months, two powerful AI agent development frameworks have been released:

  • AWS Strands Agents
  • Google Agent Development Kit (ADK)

In this post, we’ll explore what the AWS Strands Agent is and how you can build an end-to-end app using Strands, Nova, FastAPI, and a Streamlit UI. Whether you’re curious about how agents actually work or looking to build one yourself, this is your starting point.


What is AWS Strands Agents?

  • Strands Agents is an open-source framework for building AI agents that can run anywhere:
    • VS Code, terminal
    • Docker containers
    • AWS Lambda
    • AWS ECS (Elastic Container Service)
    • AWS EKS (Elastic Kubernetes Service)
  • It ships with excellent documentation covering agents, tools, workflows, model providers, streaming, multi-agent patterns, and more. A minimal agent example is shown below.
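
To get a first feel for the framework, here is a minimal sketch of a Strands agent. It assumes your AWS credentials and Bedrock model access are already configured, since Bedrock is the default model provider:

# minimal_agent.py -- minimal Strands agent sketch
# Assumes AWS credentials and Bedrock model access are already set up.
from strands import Agent

agent = Agent()  # with no arguments, the default Bedrock model provider is used

# A single call runs the agent loop once and returns the model's response.
agent("Explain in one sentence what an AI agent is.")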

Motivation: Why Use an Agent Framework?

  • Structured workflows: Automates decision-making, tool use, and response generation in a clear loop.
  • Session & memory support: Keeps context across interactions for smarter, personalized behavior.
  • Multi-agent orchestration: Lets specialized agents collaborate on complex tasks.
  • Tool integration: Easily plug in MCP tools, APIs, and functions; agents know when and how to use them.
  • Multi-model flexibility: Use and switch between different LLMs (e.g. GPT, Claude, Nova).
  • Production-ready: Built-in logging, monitoring, and error handling for real-world use.

Agent Loop

  • The agent loop is the mechanism through which a Strands agent interprets user input, makes decisions, uses tools, and produces responses. It enables advanced, multi-step reasoning and actions by tightly integrating tools with language models.

(Figure: agent loop)

Fundamentally, the agent loop operates through the following steps:

  • Receives user input and contextual information,
  • Processes the input using a language model (LLM),
  • Decides whether to use tools to gather information or perform actions,
  • Executes tools and receives results,
  • Continues reasoning with the new information,
  • Produces a final response or iterates again through the loop.

This cycle can occur several times during a single user interaction, enabling the agent to carry out intricate, multi-step reasoning and act autonomously.
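
As a rough sketch of this loop from the developer's side, the snippet below registers a simple custom tool and lets the agent decide when to call it. The word_count tool is a made-up example for illustration, not part of the Strands tool library:

# agent_loop_sketch.py -- illustrative sketch of the agent loop with a custom tool
from strands import Agent, tool

@tool
def word_count(text: str) -> int:
    """Count the number of words in a piece of text."""
    return len(text.split())

agent = Agent(tools=[word_count])

# One user turn can trigger several loop iterations: the LLM reads the prompt,
# decides to call word_count, receives the tool result, then reasons again
# to produce the final answer.
agent("How many words are in the sentence 'Strands agents run anywhere'?")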

Link: https://strandsagents.com/

Develop Custom App with Strands, Nova, FastAPI and Streamlit

(Figure: app UI)

Installing Libraries & Reaching LLM Model

  • Please install AWS Strands Agents & Tools libraries:
pip install strands-agents
pip install strands-agents-tools strands-agents-builder
  • Enable AWS Bedrock model access in your region (e.g., us-west-2):

    • AWS Bedrock > Bedrock Configuration > Model Access > Nova-Pro, Claude 3.7 Sonnet, or Llama 4
  • In this code, we'll use AWS Nova-Pro because AWS serves it in multiple regions.

  • After enabling model access, grant your IAM user/role access to AWS Bedrock by attaching the AmazonBedrockFullAccess policy.

  • There are two options for reaching the AWS Bedrock model from your AWS account:

    • AWS CLI configuration: run aws configure to create the config and credentials files.
    • Environment variables via a .env file: add a .env file containing the following (see the loading sketch after this list):
AWS_ACCESS_KEY_ID=PASTE_YOUR_ACCESS_KEY_ID_HERE
AWS_SECRET_ACCESS_KEY=PASTE_YOUR_SECRET_ACCESS_KEY_HERE
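
If you go with the .env option, the variables still need to be loaded into the process environment before the Bedrock model client is created. A minimal sketch using python-dotenv (an extra dependency, not part of Strands) could look like this; boto3 then picks the credentials up from the environment:

# load_env.py -- sketch: load AWS credentials from a .env file (assumes python-dotenv is installed)
import os
from dotenv import load_dotenv

load_dotenv()  # reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from .env into os.environ

# Optionally pin the region where your Bedrock model access is enabled.
os.environ.setdefault("AWS_DEFAULT_REGION", "us-west-2")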

Application Code

GitHub Link: Project in GitHub

Agent App:

# agent.py
from fastapi import FastAPI
from pydantic import BaseModel
from strands import Agent, tool
from strands.models import BedrockModel
from strands_tools import calculator, current_time, python_repl

MODEL = "us.amazon.nova-pro-v1:0"

bedrock_model = BedrockModel(
    model_id=MODEL,
    temperature=0.3,
    top_p=0.8,
)

@tool
def letter_counter(word: str, letter: str) -> int:
    """Count how many times a letter occurs in a word (case-insensitive)."""
    if not isinstance(word, str) or not isinstance(letter, str):
        return 0
    if len(letter) != 1:
        raise ValueError("The 'letter' parameter must be a single character")
    return word.lower().count(letter.lower())

agent = Agent(
    model=bedrock_model,
    system_prompt='Help user interact using available tools. \n'
        '- For all other questions, respond using your own knowledge.',
    tools=[calculator, current_time, python_repl, letter_counter])

app = FastAPI()

class QueryRequest(BaseModel):
    query: str

@app.post("/ask")
async def ask(request: QueryRequest):
    try:
        # A plain call runs the full agent loop and returns the complete (non-streaming) result.
        response = agent(request.query)
        return {"response": response}
    except Exception as e:
        return {"error": str(e)}

if __name__ == "__main__":
    import uvicorn
    # The module is agent.py, so the import string must be "agent:app".
    uvicorn.run("agent:app", host="0.0.0.0", port=8000)

Run agent.py:

uvicorn agent:app --host 0.0.0.0 --port 8000
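
Once the API is running, you can sanity-check the /ask endpoint before wiring up the UI. This is a minimal test sketch using the requests library; the commented response shape is roughly what the Streamlit client in the next section expects to parse:

# test_ask.py -- quick sanity check of the /ask endpoint (assumes the API runs on localhost:8000)
import requests

resp = requests.post("http://localhost:8000/ask", json={"query": "What time is it right now?"})
print(resp.status_code)
print(resp.json())
# Expected shape (roughly):
# {"response": {"message": {"content": [{"text": "..."}], ...}, ...}}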

Streamlit UI:

# app.py
import streamlit as st
import requests

st.set_page_config(page_title="AWS Strands Agent Chat", layout="centered")

st.title("AWS Strands Agent")

# initialize chat history
if "messages" not in st.session_state:
    st.session_state.messages = []

user_query = st.chat_input("Ask for system file tool commands or anything...")

if user_query:
    st.session_state.messages.append({"role": "user", "content": user_query})

    try:
        response = requests.post("http://localhost:8000/ask", json={"query": user_query})
        response.raise_for_status()

        response_json = response.json()
        # The agent result nests the reply text under message -> content[0] -> text.
        assistant_text = response_json["response"]["message"]["content"][0]["text"]

    except Exception as e:
        assistant_text = f"Error: {str(e)}"

    st.session_state.messages.append({"role": "assistant", "content": assistant_text})

# display all messages in order
for msg in st.session_state.messages:
    st.chat_message(msg["role"]).markdown(msg["content"])    


Run app.py:

streamlit run app.py
or
python -m streamlit run app.py

Demo

Demo GIF in GitHub

Conclusion

In this post, we covered:

  • how to access AWS Bedrock foundation models,
  • how to implement your first AWS Strands agent with Nova-Pro, FastAPI, and a Streamlit UI.

If you found the tutorial interesting, I’d love to hear your thoughts. Feel free to share your reactions or leave a comment; I truly value your input and engagement 😉

For other posts 👉 https://dev.to/omerberatsezer 🧐


Your comments 🤔

  • Which tools are you using to develop AI agents (e.g., AWS Strands, Google ADK, CrewAI, LangChain)? Please share your experience and interests in the comments.
  • What do you think about AWS Strands Agents?
  • What areas of LLM and agent development are you most interested in learning about?

Top comments (16)

Dotallio

Really enjoying how AWS Strands ties right into Bedrock and Nova - I've mostly been building agent flows using Langchain inside Dotallio for the flexibility, but I'm curious how Strands handles plugins or custom tools if you want to go beyond the basics?

Have you tried mixing Strands with other SDKs for more experimental side projects?

Ömer Berat Sezer

Thanks for your comment and question :) I'm currently developing different small projects and samples in the repo to try out different features (session, memory, MCP tools, multi-agent, etc.) of both AWS Strands and Google ADK (repo: github.com/omerbsezer/Fast-LLM-Age... ). As I implement them, I plan to write some of them up as posts on DEV.to.

But my first impressions and experience show that different LLM models can be integrated via LiteLLM (Ollama, Gemini, OpenAI, etc.) and different MCP tools via MCP StdioServerParameters. The same can be done with Google ADK. It would be even better if, in the future, AWS Strands also supported other protocols such as the A2A protocol.

In fact, I think the frameworks that gain the most traction among developers will be the ones that integrate with other SDKs easily and let you develop quickly.

Ömer Berat Sezer

I evaluated several AI agent frameworks (LangChain, CrewAI, AWS Strands, and Google ADK) across various use cases such as multi-agent collaboration, integration with MCP tools, support for multiple models, and workflow orchestration. Among them, I find AWS Strands and Google ADK particularly compelling. Both offer similar functionality and integrate well with other open-source solutions like LiteLLM (for handling various models) and MCP tools (e.g., StdioServerParameters).

PHANI KUMAR KOLLA

This is so good! Very well explained!
Love the content 👍

Ömer Berat Sezer

thanks..

Nathan Tarbert

pretty cool, love how you just lay it all out without gatekeeping, curious if you think focusing on open-source here makes it stick around for the long haul or if people just hop between shiny things

Ömer Berat Sezer

Thanks 😊 I think open source has a strong chance of sticking around, because it's community driven, transparent, and evolves fast. While people chase shiny new tools, the ones with open source and real utility tend to endure.

Nathan Tarbert

Yes, I don't think open source is going anywhere :)

Ömer Berat Sezer

😂

Crypto.Andy (DEV)

This is awesome - thanks for putting this together! Also curious: have you built any custom tools beyond the basics like calculator or python_repl?

Ömer Berat Sezer

Thanks. No, I haven't yet, but I've tested the built-in ones. calculator and python_repl are AWS Strands tools (all Strands tools => github.com/strands-agents/tools/tr...)

Three Strands tools are called with this prompt =>
"I have 3 requests:

  1. What is the time right now?
  2. Calculate 3111696 / 74088
  3. Tell me how many letter R's are in the word "strawberry",

Code => github.com/omerbsezer/Fast-LLM-Age...

GIF demo => github.com/omerbsezer/Fast-LLM-Age...

Parag Nandy Roy

Love how you broke it all down ..

Ömer Berat Sezer

Thanks.

