DEV Community


How to create your own custom ChatGPT like chatbot in less than 5 minutes using your own data and no OpenAI API

Wilbert Misingo on May 19, 2023

INTRODUCTION In today's world, businesses are constantly looking for new ways to improve their customer service and engagement. One way to do this...
Jack

Hello!
I have a problem from step 06 onwards:
I set model_name = "facebook/opt-iml-max-1.3b", as 30b seemed too big.
When I run this piece of code it gives me this error: "The Kernel crashed while executing code in the current cell or a previous cell. Please review the code in the cell(s) to identify a possible cause of the failure. Click here for more info. View Jupyter log for further details."
I haven't found anything on the internet. At some point I re-ran the code and it seemed to work, but now it gives me this error again.
Can you please help me solve this problem?
Thank you and good day

Wilbert Misingo

Hello Jack, sorry for the late reply. I think the problem may be that your PC runs out of processing power or memory; for better results I would recommend running the script on Google Colab with the GPU runtime enabled.
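As a quick sanity check before loading the model, you can verify whether a CUDA GPU is actually visible (for example, on Colab after enabling the GPU runtime). This is a minimal sketch assuming PyTorch is the backend; it degrades gracefully when torch is not installed:

```python
# Check whether a CUDA GPU is visible (e.g. on Google Colab with GPU runtime on).
# Assumes PyTorch; falls back to False when torch is absent.
import importlib.util

if importlib.util.find_spec("torch") is not None:
    import torch
    has_gpu = torch.cuda.is_available()
else:
    has_gpu = False  # torch not installed in this environment

print("CUDA GPU available:", has_gpu)
```

If this prints False on Colab, switch the runtime type to GPU before re-running the notebook; a 1.3B-parameter model can still crash the kernel on a CPU-only machine with limited RAM.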

Mark Arb

Hi Wilbert,

I'm having an issue with "from llama_index import LLMPredictor, ServiceContext, QuestionAnswerPrompt": it fails with "cannot import name 'QuestionAnswerPrompt' from 'llama_index'". I've Google-searched for a solution with no luck. I've got Python 3.11 installed. Are you able to help with a work-around?

Thanks in advance.
Mark

Wilbert Misingo

Hello Mark,

If you are using the latest version of llama_index, this is because legacy prompt subclasses such as QuestionAnswerPrompt and RefinePrompt have been deprecated (they are now type aliases of PromptTemplate). You can now construct custom prompts directly with PromptTemplate(template). Just make sure the template string contains the expected parameters (e.g. {context_str} and {query_str}) when replacing the default question-answer prompt.
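For example, a replacement question-answer prompt might look like the sketch below. The wording of the template is illustrative, not the library default; the only hard requirement is keeping the {context_str} and {query_str} placeholders:

```python
# Sketch of a new-style custom QA prompt. With a recent llama_index you would
# wrap this string as:
#     from llama_index.core import PromptTemplate
#     qa_prompt = PromptTemplate(template_str)
# (QuestionAnswerPrompt is now just an alias of PromptTemplate.)
template_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information, answer the question: {query_str}\n"
)

# The placeholders behave like ordinary str.format fields:
filled = template_str.format(context_str="Some docs", query_str="What is this?")
print(filled)
```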

docs.llamaindex.ai/en/stable/modul...

Please check the link above for a quick overview of the changes and what the new code should look like.

I hope this helps; feel free to reach out again if anything comes up.

Cheers.

anumber8

Great article, congratulations.

As this is a very new concept, it would be good if you could kindly share your sample code on github. Thanks

Wilbert Misingo

Thanks!!

I am really glad that you have found it of great use.

About pushing the code to GitHub: sadly I didn't, since after the project downloaded the LLM from Hugging Face, I ended up with a very large project.

I also didn't think of sharing the code on GitHub, since I had already shared the project's code snippets in the article.

Dipankar Shaw

I was planning to create a personal chatbot assistant. This post will help me do that.

Wilbert Misingo

Thanks Dipankar, I am glad you have found this helpful

Aleksadnar Devedzic

Hello!
Amazing code!
One question: in what format should the files in the data folder be?

Like one CSV file, or multiple .txt files?

Wilbert Misingo • Edited

Hello Aleksadnar, regarding the format of the files in the data folder: the SimpleDirectoryReader('./data') function accepts files in formats such as .pdf, .txt, .docx, and .csv, although you can pass certain arguments to the function to restrict it to a single kind of file, e.g. .pdf only.
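For instance, the restriction amounts to a file-extension filter. With llama_index installed you would pass required_exts to the reader, e.g. SimpleDirectoryReader('./data', required_exts=['.pdf']); the plain-Python equivalent of that filter is roughly the sketch below (an illustration, not the library's actual implementation):

```python
from pathlib import Path

# Rough equivalent of SimpleDirectoryReader's required_exts filtering:
# keep only files whose extension is in the allowed set.
def matching_files(data_dir, required_exts):
    return sorted(p for p in Path(data_dir).iterdir()
                  if p.is_file() and p.suffix.lower() in required_exts)
```

So matching_files("./data", [".pdf"]) would list only the PDF files that the reader would ingest.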

You can learn more here: gpt-index.readthedocs.io/en/latest...