
AI Chatbot powered by Amazon Bedrock 🚀🤖

Introduction

I have created a sample chatbot application that uses Chainlit and LangChain to showcase Amazon Bedrock.

You can interact with the AI assistant while switching between multiple models.


This sample application has been tested in the following environments:

  • AWS Region: us-east-1
  • AWS Copilot CLI: v1.30.1
  • Python: v3.11.5
  • boto3: v1.28.57
  • LangChain: v0.0.305
  • Chainlit: v0.7.0

Note: To use the GA version of the Bedrock API, make sure the LangChain and boto3 versions you install are no older than those listed above.
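If you manage dependencies yourself rather than using the requirements file in the repository, a minimal set of constraints reflecting this note might look like the following (illustrative, not the repository's exact file):

boto3>=1.28.57
langchain>=0.0.305
chainlit>=0.7.0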

Source Code

Check out my GitHub repository: https://github.com/hayao-k/Bedrock-AIChatbot-Sample


Please ensure you enable access to each model in Amazon Bedrock beforehand.

Features

  • Customizable Chat Settings: Through Chainlit, it offers a user-friendly interface for selecting a model and adjusting settings before the chat starts.
  • Model Selection: The Amazon Bedrock model list is retrieved automatically, and any available model can be selected (see the sketch after this list).
  • Document Chat: Chat about uploaded documents; refer to the official documentation for the support status of each model.
  • Image Processing (Anthropic Models Only): Supports image input for Claude 3 models.
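
To illustrate how the automatic model list and the pre-chat settings fit together, here is a minimal sketch combining the boto3 Bedrock control-plane client with Chainlit's chat settings. It is not the repository's actual code; the widget IDs, defaults, and output-modality filter are illustrative.

import boto3
import chainlit as cl
from chainlit.input_widget import Select, Slider

bedrock = boto3.client("bedrock")  # Bedrock control-plane client (model listing)

@cl.on_chat_start
async def start():
    # Retrieve the text-generation models visible to this account
    summaries = bedrock.list_foundation_models(byOutputModality="TEXT")["modelSummaries"]
    model_ids = [s["modelId"] for s in summaries]

    # Let the user pick a model and tune parameters before the chat starts
    settings = await cl.ChatSettings(
        [
            Select(id="model", label="Model", values=model_ids, initial_index=0),
            Slider(id="temperature", label="Temperature", initial=0.5, min=0, max=1, step=0.1),
        ]
    ).send()
    cl.user_session.set("settings", settings)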

Prerequisites

Before deploying this application, ensure you have the following:

  • The AWS Copilot CLI installed
  • Access enabled for each model in Amazon Bedrock

Getting Started

Deploy to AWS App Runner using the AWS Copilot CLI. To deploy this application, first clone the repository:

git clone https://github.com/hayao-k/Bedrock-AIChatbot-Sample
cd Bedrock-AIChatbot-Sample

Start using Bedrock

Before you can start using Amazon Bedrock, you must request access to each model. To add access to a model, go to the Model Access section in the Bedrock console and click Edit.


Select the models you want to use and save the settings.

Note: Third-party models such as Jurassic-2 and Claude require access through the AWS Marketplace. This means you will also be charged a Marketplace fee for using the model.


It will take some time for each model to become available; models with an Access status of Access granted are ready for use.

Note: Some models, such as Titan Text G1 - Express, are in Preview and may not be immediately available.
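
If you prefer to confirm from code that access has actually propagated, one rough check is a minimal InvokeModel call through the bedrock-runtime client. The snippet below assumes Claude v2 is enabled; request body formats differ per provider.

import json
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Minimal Anthropic text-completion request (Claude v2 prompt format)
body = json.dumps({
    "prompt": "\n\nHuman: Say hello.\n\nAssistant:",
    "max_tokens_to_sample": 50,
})
response = runtime.invoke_model(modelId="anthropic.claude-v2", body=body)
print(json.loads(response["body"].read())["completion"])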

How to Deploy

The GitHub repository for the sample application contains a manifest file for deploying the container to AWS App Runner using the AWS Copilot CLI.

Follow these steps to deploy:

  1. Install the AWS Copilot CLI if necessary (this example installs the Linux binary):


sudo curl -Lo /usr/local/bin/copilot https://github.com/aws/copilot-cli/releases/latest/download/copilot-linux && sudo chmod +x /usr/local/bin/copilot


  2. Deploy to App Runner!


 export AWS_REGION=us-east-1
 copilot app init bedrockchat-app
 copilot deploy --name bedrockchat --env dev



If the deployment is successful, access the URL displayed in the message. If the following screen appears, the deployment has succeeded!
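
If you need to look the URL up again later, the Copilot CLI can print the service details (the routes section should include the App Runner URL):

copilot svc show --name bedrockchat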


To clean up, delete the application (including its environments) with the following command:

copilot app delete

How to use the app

The model, temperature, and maximum token size can be changed from the Settings panel.
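
Conceptually, these settings end up as model parameters on a LangChain Bedrock LLM. The sketch below is illustrative rather than the app's exact code, and uses Claude v2's parameter names (other providers use different keys, e.g. maxTokenCount for Titan):

from langchain.llms import Bedrock

llm = Bedrock(
    model_id="anthropic.claude-v2",
    region_name="us-east-1",
    model_kwargs={
        "temperature": 0.5,            # from the Settings panel
        "max_tokens_to_sample": 1024,  # Claude's name for the maximum token size
    },
)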


The sample code allows you to choose between Claude v2, Jurassic-2 Ultra, and Amazon Titan Text G1 - Express. This enables you to compare performance while switching between models.

Note: As of September 29, 2023, Amazon Titan Text G1 - Express is in Preview status. It is being rolled out gradually, so it may not be available in some accounts.


You can enjoy a free-flowing conversation with the AI assistant! The conversation history during the session is retained, allowing it to reply to you while considering the context.
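
In LangChain terms, keeping context within a session is typically done with a conversation chain plus buffer memory. A minimal sketch, not necessarily the repository's exact wiring:

from langchain.chains import ConversationChain
from langchain.llms import Bedrock
from langchain.memory import ConversationBufferMemory

llm = Bedrock(model_id="anthropic.claude-v2", region_name="us-east-1")
chain = ConversationChain(llm=llm, memory=ConversationBufferMemory())

print(chain.predict(input="My name is Alex."))
print(chain.predict(input="What is my name?"))  # answered using the buffered history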


I hope this will be of help to someone.
