AWS Bedrock with LangChain

A demo of AWS Bedrock integration with LangChain, Streamlit, and the Titan model, with a Docker setup (Free Tier).
This post demonstrates how to integrate AWS Bedrock with LangChain and Streamlit using the Titan model, and how to run the app in Docker.

Prerequisites/Project Structure

  • requirements.txt
  • Dockerfile
  • Local AWS setup: config at `~/.aws/config` and credentials at `~/.aws/credentials` (example files below)
  • AWS Bedrock access (with the model IDs you plan to use enabled under Model Access)
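
For reference, here is a minimal sketch of those two AWS files. The region and key values are placeholders, not real credentials; substitute your own:

~/.aws/config:

[default]
region = ap-south-1
output = json

~/.aws/credentials:

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY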

Code base

Main Python file (`main.py`):

import streamlit as st
import boto3
from botocore.exceptions import ClientError
from langchain_aws import ChatBedrock
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

st.set_page_config(page_title=" AWS Bedrock Docker", layout="wide")

st.title("🐳 AWS Bedrock + Docker + LangChain + Streamlit")
st.caption("Connected to AWS Region: `ap-south-1` via local AWS Config")
# --- Verify AWS credentials are available before building the Bedrock client ---
try:
    boto_session = boto3.Session()
    if not boto_session.get_credentials():
        st.error("❌ No AWS Credentials found. Did you mount ~/.aws in docker-compose?")
        st.stop()

    st.sidebar.success(f"AWS Profile Loaded: {boto_session.profile_name or 'default'}")

except Exception as e:
    st.error(f"AWS Config Error: {e}")
    st.stop()

model_id = st.sidebar.selectbox(
    "Select Model",
    ["anthropic.claude-3-sonnet-20240229-v1:0", "anthropic.claude-v2:1", "amazon.titan-text-express-v1"]
)

llm = ChatBedrock(
    model_id=model_id,
    region_name="ap-south-1",
    model_kwargs={"temperature": 0.5, "max_tokens": 512}
)

# --- Prompt input and LangChain chain invocation ---
user_input = st.text_area("Enter your prompt:", "Explain how Docker containers work in 3 sentences.")

if st.button("Generate Response"):
    if not user_input:
        st.warning("Please enter a prompt.")
    else:
        try:
            with st.spinner("Calling AWS Bedrock API..."):
                prompt = ChatPromptTemplate.from_messages([
                    ("system", "You are a helpful AI assistant."),
                    ("user", "{input}")
                ])
                output_parser = StrOutputParser()

                chain = prompt | llm | output_parser

                response = chain.invoke({"input": user_input})

                st.subheader("AI Response:")
                st.write(response)

        except ClientError as e:
            st.error(f"AWS API Error: {e}")
            if "AccessDenied" in str(e):
                st.warning("👉 Hint: Did you enable this specific Model ID in the AWS Console > Bedrock > Model Access?")
        except Exception as e:
            st.error(f"An unexpected error occurred: {e}")
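
If you hit AccessDenied errors, it helps to check which foundation models Bedrock exposes in your region. A minimal sketch using boto3's Bedrock control-plane client (the region and the output-modality filter are assumptions; a listed model still needs access enabled under Model Access in the console):

import boto3

# The "bedrock" control-plane client (not "bedrock-runtime") lists foundation models.
bedrock = boto3.client("bedrock", region_name="ap-south-1")

response = bedrock.list_foundation_models(byOutputModality="TEXT")
for model in response["modelSummaries"]:
    print(model["modelId"])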

Dockerfile:

FROM python:3.11-slim
LABEL authors="naush"

WORKDIR /chatgpt-bedrock-langchain-demo
COPY requirements.txt .
RUN pip install --upgrade pip && pip install -r requirements.txt
COPY main.py .
EXPOSE 8501
CMD ["streamlit", "run", "main.py", "--server.port=8501", "--server.address=0.0.0.0"]
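
The credential check in main.py expects the host's ~/.aws directory to be available inside the container. A minimal docker-compose sketch that mounts it read-only (the service name and the file itself are assumptions, not part of the original repo; python:3.11-slim runs as root, so the mount target is /root/.aws):

docker-compose.yml:

services:
  bedrock-app:
    build: .
    ports:
      - "8501:8501"
    volumes:
      - ~/.aws:/root/.aws:ro

With this file in place, docker compose up --build builds the image and starts the app on http://localhost:8501.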

Requirements File

streamlit
boto3
langchain-aws
langchain-community
langchain-core


Setup Instructions

  1. Get the code:
    git clone the repository and cd into it, or
    create a folder and add the files above (main.py, requirements.txt, Dockerfile).

  2. Install the required packages:
    pip install -r requirements.txt or
    pip install streamlit boto3 langchain-aws langchain-community langchain-core

  3. Configure your AWS credentials:
    Make sure your AWS credentials are set up in ~/.aws/credentials and ~/.aws/config
    (a quick connectivity check is sketched after these steps).

  4. Run the Streamlit app:
    streamlit run main.py
    Open your browser and navigate to http://localhost:8501 to access the Streamlit app.

  5. Docker setup. To run the application with Docker, build the image:
    docker build -t bedrock-langchain-streamlit .
    Run the container, mounting your local AWS config so the app can find your credentials:
    docker run -p 8501:8501 -v ~/.aws:/root/.aws:ro bedrock-langchain-streamlit
    Alternatively, use the docker-compose.yml sketched above: docker compose up --build
    Open your browser and navigate to http://localhost:8501 to access the Streamlit app running in the Docker container.
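
Before launching the UI, you can sanity-check Bedrock connectivity from a Python shell. A minimal sketch using the same ChatBedrock class as main.py (the Titan model ID and region are taken from the app; adjust to whichever model you enabled):

from langchain_aws import ChatBedrock

# A single invoke call confirms that credentials and model access are working.
llm = ChatBedrock(model_id="amazon.titan-text-express-v1", region_name="ap-south-1")
print(llm.invoke("Say hello in one short sentence.").content)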

[High-level design (HLD) diagram]

Enter your prompt in the Streamlit app interface and interact with the selected model (Titan or Claude) served by AWS Bedrock.

[Demo screenshot]
