Here is a post about creating a simple AI chatbot with Python 3 and the OpenAI library. You can find the full source code here: https://github.com/vietjovi/openai-chatbot
Build Your Own OpenAI-Powered Chatbot with Python: A Deep Dive into the OpenAI Chatbot Repository
Creating a chatbot that can converse with users intelligently has become easier than ever with OpenAI's API. In this guide, we'll explore the OpenAI Chatbot repository, which provides a structured way to build, customize, and deploy your own AI-powered chatbot. The code leverages OpenAI's ChatCompletion API to create dynamic conversations, retain context, and store past messages for a personalized experience.
Let’s dive into the code and see how you can set up, run, and expand upon this AI chatbot project.
Project Overview
This chatbot implementation is designed to respond naturally to user questions, with the flexibility to recall past interactions for continuity. The chatbot is built with Python, allowing for easy configuration, prompt customization, and adaptation to various applications.
Key Features
- Conversation Context: Tracks past messages to maintain context within a conversation.
- Simple Configuration: Allows easy setup and customization of OpenAI API parameters.
- Error Handling: Handles cases where API calls might fail, ensuring smoother user experiences.
Getting Started
Prerequisites
- OpenAI API Key: Obtain an API key from OpenAI.
- Python Environment: Ensure Python 3.7+ and pip are installed.
- Git: Clone the project repository.
Step 1: Clone the Repository
Begin by cloning the repository to your local machine:
git clone https://github.com/vietjovi/openai-chatbot.git
cd openai-chatbot
Step 2: Install Dependencies
This project requires specific libraries, which you can install from requirements.txt:
pip install -r requirements.txt
Step 3: Configure the Chatbot
Before running the chatbot, set up the configuration in the config.py file:
- API Key: Set api_key to your OpenAI API key.
- API Endpoint and Engine: Configure api_endpoint, api_type, api_version, and engine (e.g., gpt-3.5-turbo).
- Data Directory: Set data_dir to specify where past conversations are saved.
Here's an example of what your config.py might look like:
# config.py
api_key = 'your-api-key'
api_endpoint = 'https://xxxxxxxx.openai.azure.com/'  # Remove this setting if you use api.openai.com
api_version = "2023-03-15-preview"
api_type = "azure"  # Remove this setting if you use api.openai.com
engine = "gpt-3.5-turbo"
version = "v1.0"
data_dir = './data/'
past_message_included = "5"  # How many Q&A pairs of history to keep. More history means more tokens per request and a higher cost - https://openai.com/pricing
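If you call the standard api.openai.com endpoint instead of Azure, a minimal config might look like the sketch below. This is my own sketch, not a file from the repository; note that the functions shown later still read api_endpoint, api_type, and api_version from config, so if you drop those settings you would also need to remove or guard the matching openai.* assignments in the code.

# config.py - a minimal sketch for api.openai.com (not from the repository)
api_key = 'your-api-key'
engine = "gpt-3.5-turbo"     # model name
version = "v1.0"
data_dir = './data/'
past_message_included = "5"  # number of Q&A pairs kept as context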
Code Walkthrough
Let’s explore the main sections of the code, focusing on how it initializes, handles messages, and saves conversations.
1. The Core Functions
openAIRequest(question, userId)
The openAIRequest function sends a single question to the OpenAI API and receives a response. Here's a breakdown of the function:
def openAIRequest(question="", userId=""):
    # Point the OpenAI client at the endpoint defined in config.py
    openai.api_key = config.api_key
    openai.api_base = config.api_endpoint
    openai.api_type = config.api_type
    openai.api_version = config.api_version
    # Send a single question with a fixed system prompt
    response = openai.ChatCompletion.create(
        engine=config.engine,
        messages=[
            {"role": "system", "content": "You are a helpful AI assistant."},
            {"role": "user", "content": question},
        ],
        user=userId
    )
    return response['choices'][0]['message']['content']
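As a quick sanity check, you could call the function directly from a Python shell. This is just a sketch, assuming the function lives in chatbot.py, that config.py is already filled in, and that the script guards its entry point with an if __name__ == '__main__' block:

# A quick smoke test (assumes openAIRequest is defined in chatbot.py)
from chatbot import openAIRequest

print(openAIRequest("What is the capital of France?", "user_123"))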
openAIRequestWithPastMessages(pastMsg, question, userId)
This function builds on openAIRequest by including past conversation messages, providing continuity in responses. The caller retrieves stored messages with loadPastMessages and passes them in, and the function appends the current question before calling the API:
def openAIRequestWithPastMessages(pastMsg=[], question="", userId=""):
    openai.api_key = config.api_key
    openai.api_base = config.api_endpoint
    openai.api_type = config.api_type
    openai.api_version = config.api_version
    # Rebuild the conversation: system prompt, then past turns, then the new question
    msg = [{'role': 'system', 'content': 'You are a helpful AI assistant.'}]
    for m in pastMsg:
        msg.append(m)
    msg.append({'role': 'user', 'content': question})
    try:
        response = openai.ChatCompletion.create(
            engine=config.engine,
            messages=msg,
            user=userId
        )
        answer = response['choices'][0]['message']['content']
    except Exception as e:
        answer = "Sorry, I cannot answer your question. Please try again."
        print(e)
    # Persist the new question/answer pair so the next turn has context
    pastMsg.append({'role': 'user', 'content': question})
    pastMsg.append({'role': 'assistant', 'content': answer})
    savePastMessages(userId, pastMsg)
    return answer
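Tying the pieces together, a single stateful turn might look like this (again a sketch under the same assumptions as the earlier example):

from chatbot import loadPastMessages, openAIRequestWithPastMessages

uid = "user_123"
history = loadPastMessages(uid)   # returns [] on the very first run
answer = openAIRequestWithPastMessages(history, "Summarize what we discussed so far.", uid)
print(answer)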
2. Loading and Saving Past Messages
To retain the context across sessions, the chatbot stores past messages in a file unique to each user. This allows the chatbot to reference previous exchanges for more natural conversations.
loadPastMessages(uid)
This function retrieves past conversations from a file:
def loadPastMessages(uid):
    pastMessages = []
    dataPath = config.data_dir + "/" + uid
    if os.path.exists(dataPath):
        try:
            with open(dataPath, "r") as f:
                pastMessages = f.read()
        except:
            return []
    return ast.literal_eval(pastMessages) if pastMessages else []
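The file itself just contains the Python repr of the message list (written with str(msg) in savePastMessages below), which is why ast.literal_eval is enough to turn it back into a list of dicts. A tiny round-trip sketch:

import ast

# What a stored conversation file roughly looks like on disk:
raw = "[{'role': 'user', 'content': 'Hi'}, {'role': 'assistant', 'content': 'Hello!'}]"
messages = ast.literal_eval(raw)   # parses Python literals only, unlike eval()
print(messages[1]['content'])      # -> Hello!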
savePastMessages(uid, msg)
This function saves the current conversation messages, maintaining a rolling context by removing the oldest entries if a set limit is reached:
def savePastMessages(uid, msg):
    dataPath = config.data_dir + "/" + uid
    # Each Q&A turn is two messages, so the window holds past_message_included * 2 entries;
    # drop the oldest user/assistant pair once the limit is exceeded.
    if len(msg) > int(config.past_message_included) * 2:
        msg = msg[2:]
    try:
        with open(dataPath, "w") as f:
            f.write(str(msg))
        return True
    except:
        return False
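To make the trimming concrete: with past_message_included set to "5", the window holds five question/answer pairs, i.e. ten messages, and the oldest pair is dropped once a new pair pushes the list past that limit. A small standalone illustration (the messages here are stand-ins, not real conversation data):

limit = int("5") * 2                                              # 5 Q&A pairs = 10 messages
msg = [{'role': 'user', 'content': f'm{i}'} for i in range(12)]   # pretend history
if len(msg) > limit:
    msg = msg[2:]                                                 # drop the oldest pair
print(len(msg))                                                   # -> 10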
3. Running the Chatbot
The main() function runs the chatbot in the terminal, continuously prompting the user for input until they type "exit" or "bye":
def main():
    userInput = ''
    userId = "user_123"
    print(f'OPENAI CHATBOT {config.version} - Engine: {config.engine}')
    print("Enter 'bye' or 'exit' to quit")
    while ((userInput.lower() != 'exit') and (userInput.lower() != 'bye')):
        userInput = input("You: ")
        if userInput.lower() in ['exit', 'bye']:
            print('Goodbye. See you!')
            sys.exit(0)
        print("OpenAI-Bot: " + openAIRequestWithPastMessages(loadPastMessages(userId), userInput, userId))
    print('Goodbye. See you!')
To run the chatbot, execute the following command:
python chatbot.py
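A terminal session might look roughly like this, with the banner built from the version and engine values in config.py (the assistant's actual reply will of course vary):

OPENAI CHATBOT v1.0 - Engine: gpt-3.5-turbo
Enter 'bye' or 'exit' to quit
You: Hello!
OpenAI-Bot: <model reply appears here>
You: bye
Goodbye. See you!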
Customizing the Chatbot
This chatbot is highly customizable. Here are a few ideas:
- Personalize System Messages: Change {"role": "system", "content": "You are a helpful AI assistant."} to define a new persona.
- Limit Context for Efficiency: Modify past_message_included in config.py to change how much context is retained.
- Experiment with API Parameters: Adjust temperature, max_tokens, or other API settings to influence response style and length (see the sketch after this list).
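For instance, a variant of openAIRequest with explicit sampling parameters might look like the sketch below. The function name openAIRequestTuned and the specific values are my own illustration, not part of the repository; temperature and max_tokens are standard parameters of openai.ChatCompletion.create in the 0.x SDK the post uses.

import openai
import config

def openAIRequestTuned(question="", userId=""):
    # Hypothetical variant of openAIRequest - illustrative values only
    openai.api_key = config.api_key
    openai.api_base = config.api_endpoint
    openai.api_type = config.api_type
    openai.api_version = config.api_version
    response = openai.ChatCompletion.create(
        engine=config.engine,
        messages=[
            {"role": "system", "content": "You are a helpful AI assistant."},
            {"role": "user", "content": question},
        ],
        user=userId,
        temperature=0.7,   # higher values give more varied wording
        max_tokens=256,    # cap the length of each reply
    )
    return response['choices'][0]['message']['content']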
Deploying the Chatbot
Once tested, consider deploying the chatbot on a platform like Heroku or an internal server. Use environment variables to store your OpenAI API key securely instead of hardcoding it.
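One common approach is to keep the key out of config.py entirely and read it from the environment instead. A minimal sketch, assuming an environment variable named OPENAI_API_KEY (the name is my choice, not something the repository defines):

# config.py - reading the key from the environment (sketch)
import os

api_key = os.environ.get("OPENAI_API_KEY", "")
if not api_key:
    raise RuntimeError("Set the OPENAI_API_KEY environment variable before starting the bot.")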
Conclusion
This OpenAI Chatbot project is a straightforward yet flexible way to harness the power of OpenAI's language models. Whether for support, education, or entertainment, this chatbot provides a strong foundation for engaging, AI-driven conversations.
To dive into the code, explore the OpenAI Chatbot repository on GitHub, and start customizing your own AI assistant!
Happy coding!