Hafid Saadi
Creating a Chatbot that Answers like Bart Simpson using Python 3 and the OpenAI GPT Model

In this tutorial, we will build a chatbot using the OpenAI API and Gradio. Our chatbot will be based on the character Bart Simpson from The Simpsons and will be designed to have a mischievous and rebellious personality.


Prerequisites

To follow along with this tutorial, you will need the following:

  • A text editor like VS Code
  • Python 3.6 or higher
  • The OpenAI API key from openai.com
  • Git (optional)

Setting up the Project

First, let's create a new project directory and set up our virtual environment:

mkdir chatbot
cd chatbot
python -m venv env
source env/bin/activate
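If you are on Windows instead of macOS/Linux, the activation command is slightly different; roughly:

env\Scripts\activate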

Next, let's install the required dependencies:

pip install openai gradio
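The OpenAI Python library reads your key from the OPENAI_API_KEY environment variable, so export it before running anything (the value below is a placeholder):

export OPENAI_API_KEY="sk-..."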

Writing the Code

Now, let's write the code for our chatbot. We will start by creating a file named gpt.py and adding the following code:

  • The PROMPT that primes the model with Bart Simpson's persona
  • A get_response function that calls the OpenAI API to query the text-davinci-003 model and returns the generated text.

import os

import openai

# The library also picks up OPENAI_API_KEY from the environment automatically;
# setting it explicitly here makes the requirement obvious.
openai.api_key = os.getenv("OPENAI_API_KEY")

PROMPT = """The following is a conversation with Bart Simpson. 
As Bart Simpson, I would describe myself as a mischievous, rebellious, and adventurous kid.
I'm always getting into trouble and finding new ways to have fun and cause chaos.

Me: Hello, who are you?
Bart: Hey, it's great to hear from you! Not much has changed here in Springfield. What about you, what have you been up to?
Me: """

def get_response(prompt):
    """
    Generate a response from the OpenAI API for a given prompt.

    Parameters:
    - prompt (str): The conversation so far, appended to the Bart persona PROMPT.

    Returns:
    - response (str): The text completion returned by the OpenAI API.
    """
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=PROMPT + prompt,  # prepend the persona so every call stays "in character"
        temperature=0.9,
        max_tokens=150,
        top_p=1,
        frequency_penalty=0,
        presence_penalty=0.6,
        stop=[" Me:", " Bart:"]
    )
    return response.choices[0].text
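Before wiring this into a UI, you can sanity-check get_response with a short throwaway script (the question is just an example, and it assumes OPENAI_API_KEY is set):

from gpt import get_response

# Ask a question in the same "Me: ... Bart:" shape the prompt expects
print(get_response("What did you do at school today?\nBart:"))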

Next, create a file named main.py and add the following code:

  • A cloning_gpt function that generates a response based on a given input and the conversation history.
  • A Gradio Blocks layout that gives us the interface to interact with the chatbot.
import gradio as gr
from gpt import get_response


def cloning_gpt(input, history):
    '''
        Function that generates a response based on a given input and conversation history.

        Parameters:
        - input (str): The input to generate a response for.
        - history (list): A list of tuples containing previous inputs and outputs in the conversation.

        Returns:
        - history (list): The updated conversation history with the new input and output appended.
        - history (list): A copy of the updated conversation history.
    '''
    history = history or []  # if history is None, set it to an empty list
    past = list(sum(history, ()))  # flatten the history list
    past.append(input)  # add the current input to the history
    inputs = ' '.join(past)  # join the history into a string
    output = get_response(inputs)  # get the model's response
    history.append(
        (input, output))  # add the current input and output to the history
    return history, history  # return the history as both the output and the state


block = gr.Blocks()

with block:
    gr.Markdown("""<h1><center>ChatGPT "Bart version"</center></h1>""")
    chatbot = gr.Chatbot()  # displays the running conversation
    message = gr.Textbox(placeholder="Hey I'm Bart Simpson, ask me anything!")
    state = gr.State()  # keeps the conversation history between clicks
    submit = gr.Button("SEND")
    submit.click(cloning_gpt,
                 inputs=[message, state],
                 outputs=[chatbot, state])

block.launch(
    debug=True, share=True
)  # debug=True keeps the app in the foreground and surfaces errors; share=True creates a temporary public link


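To make the data flow clearer, here is roughly what the history state looks like after a couple of exchanges (the messages below are made up):

history = [
    ("Hello, who are you?", " Hey, I'm Bart Simpson!"),
    ("What did you do today?", " Skipped school and rode my skateboard, man."),
]
past = list(sum(history, ()))  # flattens the pairs into one list of strings

This list of (user, bot) pairs is also the format gr.Chatbot displays, which is why cloning_gpt returns the same history both as the chatbot output and as the new state.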

Launching the Project

When launching the app, you can set debug=True to keep it running in the foreground with error output, share=True to get a public link, or both. The public Gradio link usually stays accessible for about 72 hours.
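To start the chatbot, run main.py from the project directory:

python main.py

Gradio prints a local URL (port 7860 by default) and, since share=True, a temporary public share link you can send to others.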

Check out the GitHub repo
