
Building a Christmas-Themed Chatbot: My ServerlessGuru Hackathon Journey

Introduction

I recently came across an exciting opportunity: the Serverless Holiday Hackathon organized by Serverless Guru. The hackathon, which took place throughout the first half of December 2023, challenged participants to showcase their creativity and skills by developing the best holiday-themed chat application using any LLM (Large Language Model).

As I delve into the world of serverless development and explore the possibilities of creating a holiday-themed chat application, I'm excited to share my journey and insights with you.

Demo and Architecture Explanation Videos

I recorded a quick video showing how the frontend application works, before diving deep into the backend implementation.

You can watch the video, shared on Google Drive, at the following URL:

demo video

In addition to the demo video, you can watch my explanation of the architecture design behind the chatbot at the following link:

architecture video

GitHub Repo

GitHub: https://github.com/wildomonges/christmas-theme-chatbot

This Is My Journey

I started by investigating the following concepts and technologies, which were new to me:

  • LLM (Large Language Model): Serverless Guru gave us a great introduction to LLMs by sharing a YouTube video. I learned that an LLM is an advanced machine learning model designed to understand and generate human-like text at scale.
  • AWS Bedrock: an AWS service for building and scaling generative AI applications with foundation models.

After familiarizing myself (at a basic level) with these technologies, I chose the stack to develop the application.

For the backend I decided to use:

  • AWS SAM (Serverless Application Model): to manage the AWS resource definitions for the serverless application
  • Ruby 3.2: the programming language for the business logic
  • Docker: to isolate my development environment from other projects on my local machine

Note: I chose this stack based on my experience building serverless projects, and because the limited time of the hackathon did not allow me to try another programming language such as JavaScript or Python, or another IaC (Infrastructure as Code) tool like the Serverless Framework or Terraform.

Before jumping straight into coding, I designed the application architecture using Lucidchart to get a general overview of the AWS services I needed to build a functional chatbot.

The first design was just a single Lambda function connected to AWS Bedrock:

first-design

Once I had that Lambda -> Bedrock integration in mind, I initialized a basic project structure and started coding the first sendMessageHandler:

christmas-theme-chatbot/
  backend/
    app/
      functions/
      layers/shared/ruby/
        gems
        lib
      template.yaml
      Gemfile

I filled template.yaml with the following code snippet:

template.yaml


AWSTemplateFormatVersion: "2010-09-09"
Transform: AWS::Serverless-2016-10-31
Description: Chatbot resource definition

Globals:
  Function:
    MemorySize: 256
    Runtime: ruby3.2
    Tracing: Active
    Timeout: 25

Parameters:
  Env:
    Type: String
    Default: dev
  Service:
    Type: String
    Default: christmas-theme-chatbot

Resources:
  sendMessageHandler:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: app/functions/send_message/
      Description: Handle the message received from the client side and forward to Bedrock
      Handler: handler.ChristmasThemeChatbot::Functions::SendMessage.handler
      FunctionName: !Sub "${Service}-${Env}-send-message-handler"
      Policies:
        - Statement:
          - Effect: Allow
            Action: 'bedrock:*'
            Resource: '*'

I placed the sendMessageHandler in:

- functions/
    - send_message/
         - handler.rb 

Phase 1 - Connecting to Bedrock from a Lambda function using Ruby

To interact with Bedrock I had to add the following gem to my Gemfile.

# frozen_string_literal: true

source 'https://rubygems.org'

ruby '>= 3.2.2'

gem 'aws-sdk-bedrockruntime'

With the dependencies in place, I started writing basic Ruby code to send messages to Bedrock and get a response.

# frozen_string_literal: true

require 'json'
require 'aws-sdk-bedrockruntime'

module ChristmasThemeChatbot
  module Functions
    # This class implements the handler function to receive the message
    # from the client side and forward it to the Bedrock API
    class SendMessage
      class << self
        def handler(event:, context:)
          client = Aws::BedrockRuntime::Client.new
          client.invoke_model_with_response_stream(
            # assumes a simple test event like { 'message' => 'Hi Santa!' }
            body: build_prompt(event['message']).to_json,
            model_id: MODEL_ID,
            content_type: 'application/json',
            accept: 'application/json',
            event_stream_handler: callback
          )
        end

        private

        ACT_AS_SANTA_CLAUS = 'You are Santa Claus, a friendly old man who talks with people about Christmas'
        MODEL_ID = 'anthropic.claude-v2'

        # Minimal Claude v2 request body (the full prompt builder lives in the repo's shared layer)
        def build_prompt(user_message)
          {
            prompt: "\n\nHuman: #{ACT_AS_SANTA_CLAUS}. #{user_message}\n\nAssistant:",
            max_tokens_to_sample: 300
          }
        end

        # Prints each chunk of the streamed completion as soon as Bedrock returns it
        def callback
          event_stream_handler = Aws::BedrockRuntime::EventStreams::ResponseStream.new
          event_stream_handler.on_chunk_event do |response_event|
            chunk_response = JSON.parse(response_event.bytes)['completion']
            print chunk_response
          end

          event_stream_handler
        end
      end
    end
  end
end

After experimenting with a few models, such as Meta Llama 2 and Anthropic Claude v2, I found that the latter gave me better dialogue for the text I provided to it.

Here is where the first challenge popped up:

  • Being able to send data to the Bedrock API using aws-sdk-bedrockruntime

  • Finding a good prompt to get a nice answer from the API. The one that worked well for me is the following:

ACT_AS_SANTA_CLAUS = 'You are Santa Claus, a friendly old man who talks with people about Christmas'
  • Being able to retrieve the response as a stream using the SDK, because I already knew I wanted to stream Santa's answer to the frontend app.

This first phase took me around two days, working 4-5 hours per day, but after that I was able to get a basic, coherent response from the API.

Phase 2 - Expose an API for the client to interact with my Bedrock handler code.

The simplest approach would have been to add an HTTP API Gateway connected directly to sendMessageHandler; however, this could lead to a bad user experience, since the client would have to wait for sendMessageHandler to receive the full Bedrock response before anything was sent back.

With that in mind, I decided to expose a WebSocket API, which allows the handler function to stream the Bedrock message to the client in near real time.
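With the WebSocket in place, the Phase 1 callback no longer needs to just print chunks; it can push each chunk to the connected client as soon as it arrives. Here is a minimal sketch of that idea using the API Gateway Management API client (names and details are illustrative; the exact implementation is in the repo):

require 'json'
require 'aws-sdk-bedrockruntime'
require 'aws-sdk-apigatewaymanagementapi'

# Streams each Bedrock chunk to the websocket client identified by connection_id.
# The endpoint is the https callback URL of the websocket stage.
def callback(connection_id, endpoint)
  api_client = Aws::ApiGatewayManagementApi::Client.new(endpoint: endpoint)

  event_stream_handler = Aws::BedrockRuntime::EventStreams::ResponseStream.new
  event_stream_handler.on_chunk_event do |response_event|
    chunk = JSON.parse(response_event.bytes)['completion']
    api_client.post_to_connection(connection_id: connection_id, data: chunk)
  end

  event_stream_handler
end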

I went back to Lucidchart and added the new components of the application, as follows:

second-design

I updated template.yaml to add the following new resources:

  • onConnectHandler: receives the $connect request and stores the connectionId in a DynamoDB table
  • onDisconnectHandler: receives the $disconnect request and removes the connectionId from the table
  • Connections table: stores the connection IDs so the backend can push messages to connected clients

Here is the relevant part of the updated template.yaml:

  ####################
  # Websocket API
  ####################

  webSocketApi:
    Type: AWS::ApiGatewayV2::Api
    Properties:
      Name: !Sub "${Service}-${Env}-websocket-api"
      ProtocolType: WEBSOCKET
      RouteSelectionExpression: "$request.body.action"

  webSocketApiLogGroup:
    Type: AWS::Logs::LogGroup
    Properties:
      LogGroupName: !Sub "/aws/apigateway/${webSocketApi}/${Env}"

  Stage:
    Type: AWS::ApiGatewayV2::Stage
    Properties:
      StageName: !Ref Env
      Description: !Sub "${Env} stage"
      DeploymentId: !Ref Deployment
      ApiId: !Ref webSocketApi
      AccessLogSettings:
        DestinationArn: !GetAtt webSocketApiLogGroup.Arn
        Format: '{"requestId":"$context.requestId","ip":"$context.identity.sourceIp", "requestTime":"$context.requestTime", "httpMethod":"$context.httpMethod", "routeKey":"$context.routeKey", "status":"$context.status","protocol":"$context.protocol", "responseLength":"$context.responseLength"}'

  Deployment:
    Type: AWS::ApiGatewayV2::Deployment
    DependsOn:
    - connectRoute
    - sendMessageRoute
    - disconnectRoute
    Properties:
      ApiId: !Ref webSocketApi

  ###################
  # Routes
  ###################

  connectRoute:
    Type: AWS::ApiGatewayV2::Route
    Properties:
      ApiId: !Ref webSocketApi
      RouteKey: $connect
      OperationName: connectRoute
      Target: !Join
        - '/'
        - - 'integrations'
          - !Ref connectIntegration

  connectIntegration:
    Type: AWS::ApiGatewayV2::Integration
    Properties:
      ApiId: !Ref webSocketApi
      Description: Connect Integration
      IntegrationType: AWS_PROXY
      IntegrationUri: !Sub "arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${onConnectHandler.Arn}/invocations"

  disconnectRoute:
    Type: AWS::ApiGatewayV2::Route
    Properties:
      ApiId: !Ref webSocketApi
      RouteKey: $disconnect
      OperationName: disconnectRoute
      Target: !Join
        - '/'
        - - 'integrations'
          - !Ref disconnectIntegration

  disconnectIntegration:
    Type: AWS::ApiGatewayV2::Integration
    Properties:
      ApiId: !Ref webSocketApi
      Description: Disconnect Integration
      IntegrationType: AWS_PROXY
      IntegrationUri: !Sub "arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${onDisconnectHandler.Arn}/invocations"

  #######################
  # Function Permissions
  #######################

  onConnectPermission:
    Type: AWS::Lambda::Permission
    DependsOn:
      - webSocketApi
    Properties:
      Action: lambda:InvokeFunction
      FunctionName: !Ref onConnectHandler
      Principal: apigateway.amazonaws.com
      SourceArn: !Sub "arn:aws:execute-api:${AWS::Region}:${AWS::AccountId}:${webSocketApi}/${Env}/$connect"

  onDisconnectPermission:
    Type: AWS::Lambda::Permission
    DependsOn:
      - webSocketApi
    Properties:
      Action: lambda:InvokeFunction
      FunctionName: !Ref onDisconnectHandler
      Principal: apigateway.amazonaws.com
      SourceArn: !Sub "arn:aws:execute-api:${AWS::Region}:${AWS::AccountId}:${webSocketApi}/${Env}/$disconnect"

This step is where the second challenge popped up! I had issues trying to invoke the onConnect and onDisconnect handlers when testing with wscat. This was due to a lack of permissions on the Lambda functions. Unfortunately, sam deploy does not automatically create the permissions shown in the Function Permissions section above, which is why I had to add them manually to the template.yaml file.

To test and debug the integration I used wscat, as follows:

wscat -c wss://WEBSOCKET_API_ID.execute-api.us-east-1.amazonaws.com/dev/
> connected

{"action": "sendMessage", "data": "Hello Santa!"}

Once I fixed the issue that was preventing API Gateway from invoking the onConnect function, I started phase 3.

Phase 3 - Saving and removing connections

To interact with the DynamoDB table I used a gem called dynamoid. I created a Connection model so I could use calls like Connection.create(connectionId: connection_id) and Connection.find(connection_id).delete.

# frozen_string_literal: true

require 'dotenv/load'
require 'dynamoid'

Dynamoid.configure do |config|
  config.namespace = nil # to avoid having the prefix dynamoid_ as part of the table name
end

module ChristmasThemeChatbot
  module Layers
    module Shared
      module Models
        # This class is used to interact with dynamodb connections table
        class Connection
          include Dynamoid::Document

          table name: ENV['CONNECTIONS_TABLE']

          field :connectionId

          validates_presence_of :connectionId
        end
      end
    end
  end
end


Once this model was defined, I used it as follows within the onConnect and onDisconnect handlers.
E.g. onConnectHandler:

# frozen_string_literal: true

require 'models/connection'

module ChristmasThemeChatbot
  module Functions
    # This class implements the handler function to connect to the websocket
    class OnConnect
      class << self
        include ChristmasThemeChatbot::Layers::Shared::Models  

        def handler(event:, context:)
          connection_id = event.dig('requestContext', 'connectionId')

          Connection.create(connectionId: connection_id)

          { statusCode: 200, body: 'Successfully created a new connection' }
        end
      end
    end
  end
end

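The onDisconnect handler is essentially the mirror image: it removes the stored connection. A minimal sketch, based on the Connection model usage described above (the exact implementation is in the repo):

# frozen_string_literal: true

require 'models/connection'

module ChristmasThemeChatbot
  module Functions
    # This class implements the handler function to disconnect from the websocket
    class OnDisconnect
      class << self
        include ChristmasThemeChatbot::Layers::Shared::Models

        def handler(event:, context:)
          connection_id = event.dig('requestContext', 'connectionId')

          # Remove the connection so we stop pushing messages to it
          Connection.find(connection_id).delete

          { statusCode: 200, body: 'Successfully deleted the connection' }
        end
      end
    end
  end
end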

Note: At this point I was able to connect to the WebSocket, invoke the Bedrock API, receive Santa's message, and disconnect from the API successfully.

Phase 4 - Improving the code

To have cleaner code, I started:

  • Refactoring the code, adding logging, and encapsulating methods so only the handlers are exposed. You can see the full code in my repository here
  • Adding some unit tests (see the example spec after this list)
  • Reusing code through Lambda Layers
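As an example of the unit tests, here is a minimal sketch of what a spec for the onConnect handler could look like, assuming RSpec (the require path and file layout are illustrative; the actual tests are in the repo):

# frozen_string_literal: true

require 'functions/on_connect/handler' # illustrative path, adjust to the project layout

RSpec.describe ChristmasThemeChatbot::Functions::OnConnect do
  describe '.handler' do
    let(:event) { { 'requestContext' => { 'connectionId' => 'abc123' } } }

    it 'stores the connection id and returns a 200 response' do
      # The Connection model is stubbed so the spec does not hit DynamoDB
      expect(ChristmasThemeChatbot::Layers::Shared::Models::Connection)
        .to receive(:create).with(connectionId: 'abc123')

      response = described_class.handler(event: event, context: nil)

      expect(response[:statusCode]).to eq(200)
    end
  end
end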

Phase 5 - Working on the frontend application: building the chat UI

I chose React as the library to implement the UI, using react-chatbot-kit for the chat component and react-use-websocket to manage the connection with the backend API.

It is a simple UI, which I created using the following command:

npx create-react-app christmas-theme-chatbot

frontend-app

The src folder contains the files that render the chatbot interface for chatting with Santa.
The main files are:

  • App.js: the main wrapper; it imports the Chatbot component provided by react-chatbot-kit.
  • MessageParser.js: handles the user's messages and triggers the action to execute for each message.
  • ActionProvider.js: using the react-use-websocket library, it creates a WebSocket connection to the serverless backend API and sends the input message. It receives the API response and updates the messages state to render Santa's reply in the UI.

ActionProvider implements the handleSendMessage function, which uses the sendJsonMessage function built into react-use-websocket to send the message to the WebSocket API, as follows:

const { sendJsonMessage } = useWebSocket(socketUrl, {
    onMessage: (event) => {
      setTokens((prevTokens) => [...prevTokens, event.data]);
    },
  });

As soon as a response is received, onMessage updates the tokens list, which automatically renders Santa's message in the UI.

Phase 6 - Deploying the React App on Amplify Hosting

I chose AWS Amplify to host the React application. I set it up manually in the AWS Console, connecting my GitHub account to Amplify and selecting the GitHub repository I wanted to build and deploy.

amplify-hosting-setup

The production way to do this would be to trigger the frontend deployment from the CI/CD pipeline when code is merged into the main branch, with a step like:

 aws amplify start-job --app-id <<parameters.amplify_app_id>> --branch-name <<parameters.amplify_branch_name>> --commit-id $CIRCLE_SHA1 --job-type RELEASE        

However, I decided to keep it simple and configure the Amplify application manually.

Phase 7 - Feature to discover the gift and the child's name and store them in a new table

After getting a basic version of the chat working, I thought of adding a new feature to help Santa Claus figure out the gifts requested by each child.

So I went back to Lucidchart and introduced new resources: messageAnalyzerHandler, giftsQueue, giftRegistrationHandler and giftsTable, as follows:

Gifts Analyzer Integration

How does it work?

  1. The input message is sent by sendMessageHandler to an SQS queue called messagesQueue, which stores messages in the following format:
{
  "message": "Hi Santa! My name is Wildo, and I would like  Tshirt for Christmas",
  "connection_id": "THE_CONNECTION_ID"
}

The code that sends the message from the handler is this:

SendQueueMessage.new(message: { message: user_message, connection_id: connection_id }.to_json, queue_url: ENV['MESSAGES_QUEUE']).call

I implemented a service class called SendQueueMessage, a wrapper that encapsulates the business logic around the send_message API call of the aws-sdk-sqs gem.
The service code can be found here.
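A minimal sketch of what such a wrapper can look like, matching the call above (the actual service class is in the repo):

# frozen_string_literal: true

require 'aws-sdk-sqs'

module ChristmasThemeChatbot
  module Layers
    module Shared
      module Services
        # Thin wrapper around SQS send_message so handlers don't deal with the SDK directly
        class SendQueueMessage
          def initialize(message:, queue_url:)
            @message = message
            @queue_url = queue_url
          end

          def call
            sqs_client.send_message(queue_url: @queue_url, message_body: @message)
          end

          private

          def sqs_client
            @sqs_client ||= Aws::SQS::Client.new
          end
        end
      end
    end
  end
end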

  2. The messageAnalyzer Lambda consumes the message from the SQS queue and invokes the Bedrock API with a special prompt:
 ACT_AS_GIFT_DISCOVER = 'Given the next message sent by a child to Santa, extract the name of the gift and the name' \
                               ' of the child  in the next format ' \
                               ' "the name of the gift is \"GIFT_NAME\" and the name of the child is \"CHILD_NAME\"'
 MODEL_ID = 'anthropic.claude-v2'

Bedrock responds with a completion like this:

{
    "completion": " the name of the gift is \"I would like to\" and the name of the child is \"Wildo\"",
    "stop_reason": "stop_sequence",
    "stop": "\n\nHuman:"
}

Running this completion through a regular expression, I was able to extract the child's and gift's names, as sketched below. For reference, the code is here.
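A minimal sketch of such a regular expression, assuming the completion follows the format requested in the prompt (the exact pattern used is in the repo):

# Example completion text returned by Bedrock (see the payload above)
completion = 'the name of the gift is "Tshirt" and the name of the child is "Wildo"'

# Named captures pull out the gift and the child's name
match = completion.match(/the name of the gift is "(?<gift>[^"]+)" and the name of the child is "(?<child>[^"]+)"/)

if match
  gift  = match[:gift]  # => "Tshirt"
  child = match[:child] # => "Wildo"
end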

  3. Then it sends a new payload to another queue called giftsQueue, in the following format:
{
    gift: 'Tshirt',
    username: 'Wildo',
    connection_id: 'THE_CONNECTION_ID'
}
  4. The previous message is consumed by giftRegistrationHandler, which uses a Gift model (sketched after the snippet below) to save the record into giftsTable:
Gift.create(connectionId: data['connection_id'], username: data['username'], gift: data['gift'])
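The Gift model mirrors the Connection model shown in Phase 3. A minimal sketch, assuming the same Dynamoid setup (the table environment variable name is illustrative; the field names come from the create call above):

# frozen_string_literal: true

require 'dynamoid'

module ChristmasThemeChatbot
  module Layers
    module Shared
      module Models
        # This class is used to interact with the dynamodb gifts table
        class Gift
          include Dynamoid::Document

          table name: ENV['GIFTS_TABLE'] # illustrative env var name

          field :connectionId
          field :username
          field :gift
        end
      end
    end
  end
end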

Phase 8 - Protecting the Websocket API

In this step I was stuck for two days (~10 hours) trying to implement a Lambda request authorizer to allow or deny access to the $connect route by checking the Authorization header. However, once I got it working on the backend side, I found out that react-use-websocket does not support setting custom headers (the browser WebSocket API does not allow them).

So I ended up removing all the resources and code provisioned for the authorization feature and implementing a workaround. It works as follows:

class SendMessage
  class << self
    include ChristmasThemeChatbot::Layers::Shared::Models
    include ChristmasThemeChatbot::Layers::Shared::Services

    def handler(event:, context:)
      Logger.instance.info("Handle sendMessage. Event: #{event.to_json}")

      data = JSON.parse(JSON.parse(event['body'])['data'])
      access_token = data['accessToken']
      request_context = event['requestContext']
      endpoint = "https://#{request_context['domainName']}/#{request_context['stage']}"
      connection_id = request_context['connectionId']

      if access_token != ENV['ACCESS_TOKEN']
        Logger.instance.info('Access unauthorized')

        Connection.find(connection_id).delete
        Logger.instance.info("Connection #{connection_id} deleted!")

        websocket(endpoint).delete_connection(connection_id: connection_id)

        return { statusCode: 401, body: 'Unauthorized' }
      end

      # Continue the normal flow
      user_message = data['message']
      prompt = build_prompt(user_message).to_json
      # ... rest of the handler omitted, full code in the repo

sendMessageHandler expects to receive the accessToken in the body. If a valid static token is provided, it continues the normal flow; otherwise, it finds the connection, deletes it from the database, and closes the client's connection using:

   websocket(endpoint).delete_connection(connection_id: connection_id)
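The websocket helper is not shown in the excerpt above; it simply returns the same API Gateway Management API client used for streaming, bound to the connection's callback endpoint. A minimal sketch (the actual helper lives in the repo):

require 'aws-sdk-apigatewaymanagementapi'

# Client bound to the websocket callback endpoint, used to push data to
# or drop a specific connection
def websocket(endpoint)
  Aws::ApiGatewayManagementApi::Client.new(endpoint: endpoint)
end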

Final Architecture Design

This is the final design

final-design

MVP - PoC

The current application is just a proof of concept to play with AWS Bedrock and see whether it is possible to build a Christmas-themed chatbot. Even though it is online here, it is far from being production-ready or something to share with the general public.

Improvements I would have liked to add

Frontend

  • Provide a better UI/UX (for multiple devices, like mobile and tablets)
  • Add a login page so Santa or an elf can act as an admin of the app
  • Add a page that allows Santa to view the list of gifts grouped by child

Backend

  • Add better unit tests
  • Add integration tests
  • Add CI/CD with multiple stages, like dev -> qa -> staging -> prod
  • Improve the prompt used to generate the conversation, as well as the one used for gift and child discovery
  • Improve the log format using JSON objects
  • Add Bugsnag or Rollbar to monitor errors in the application
  • Build monitoring dashboards to track information like the number of requests to the chatbot, the most requested gifts, etc.
  • Add an incident management system like PagerDuty
  • Add load testing
  • Provide Swagger documentation on how to interact with the WebSocket API
  • Add a better authentication and authorization mechanism
  • Set reserved and provisioned concurrency after analyzing the behavior of each Lambda function

Conclusion

It was a great journey coding a Christmas-themed chatbot. I learned how to use AWS services like the WebSocket API and AWS Bedrock at a basic level. I challenged myself to deliver a functional chatbot despite the limited time I had because of my full-time job as a Senior Lead Software Developer at Decisiv Inc. and other personal projects I had to maintain during the hackathon.
