DEV Community


Building a RAG-Powered Support Chatbot in 24 Hours of Hackathon

Akshay Gupta on February 16, 2025

Coffee? ✅ Chai? ✅ Determination to automate admin support? Double ✅ In a recent 24-hour hackathon at the annual PeopleGrove offsite, my team tackled a...
Sohaib Ahmad Jamil

I would like to learn how to build this, but via Colab because my laptop is a potato. Is there a repo for this I can look at?

Akshay Gupta

Hi, I wish I could share the complete code, but it's private company code, so for obvious reasons I can't. You can, however, follow this article (marktechpost.com/2025/01/27/buildi...) that I mentioned as well.

And instead of running DeepSeek on your laptop, you can use Gemini or OpenAI on the free tier (just create a new Google Cloud account to get free credits for Gemini, or create a developer account with OpenAI and use the ChatGPT API). That way, your potato laptop won't have to do all the heavy lifting. I hope this helps. ✌️
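If it helps, here's a rough sketch of the hosted-API route in Python, using OpenAI's chat-completions REST endpoint with just the standard library. The model name and system prompt are placeholders I picked for illustration; check the official API docs for current model names:

```python
import json
import os
import urllib.request

# OpenAI's public chat-completions REST endpoint.
API_URL = "https://api.openai.com/v1/chat/completions"

def build_payload(question: str, model: str = "gpt-4o-mini") -> dict:
    """Assemble the JSON body for a single-turn support question."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful support assistant."},
            {"role": "user", "content": question},
        ],
    }

def ask(question: str) -> str:
    """Send the question to the hosted API; needs OPENAI_API_KEY in the env."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(question)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

The same shape works for Gemini too; only the endpoint, auth header, and payload format change.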

Sohaib Ahmad Jamil

No problem, still a big thanks! For now I'll use the Groq API or Gemini, and if I make it good enough, then the OpenAI API for sure. Thank you for the advice! 🫡

Lance Munyao

You have no idea how much I needed this, thank you so much

Akshay Gupta

That's amazing! So happy this could help you in some way! :D

Yash Patil

Loved your article! It gave me a clear roadmap for something similar I've been trying to build since last month. Super helpful.

I hope my free-tier quota won't get exceeded, because my context is pretty huge.

What would you suggest if I want to run smaller models locally? That would require training, but I'm a noob at Python.

Please share your thoughts.

Thanks in advance.

Akshay Gupta

Thanks so much for the kind words! 😊 I’m really happy you found the article helpful!

If you’re worried about your free-tier quota, running a smaller model locally is a great idea. I’d recommend checking out DeepSeek with Ollama — it’s super easy to get started. Just install Ollama, and you’ll be running a local model in no time.

It’s lightweight and perfect if you have a large context but want to stay within your limits. Let me know if you need any help — happy building!
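For anyone trying this route, here's a minimal sketch of querying a local Ollama server from Python with just the standard library. It assumes Ollama is installed, running on its default port 11434, and that you've already pulled a model; the model tag below is just an example:

```python
import json
import urllib.request

# Ollama listens on localhost:11434 by default.
# Pull a model first, e.g.: `ollama pull deepseek-r1:1.5b`
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "deepseek-r1:1.5b") -> dict:
    """Body for Ollama's /api/generate; stream=False returns one JSON object."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local(prompt: str) -> str:
    """POST the prompt to the local Ollama server and return its reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

No API keys, no quotas; everything stays on your machine.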

Yash Patil

Thanks for the quick reply.

I've tried t5-small and t5-base, but the results are not very promising. Do you have any ideas on custom model tuning to get better results?
I've also tried creating a small subset of queries and answers to train the model, but that doesn't seem very promising either.
I'll give DeepSeek a try, though I'm not sure my machine can handle it.

Akshay Gupta

You're welcome! Glad to help.

Yeah, t5-small and t5-base can be a bit limited, especially for complex tasks. If you're looking for better results, fine-tuning can help, but that requires a decent dataset and some tweaking. Since you've already tried training on a small subset, you might want to look into LoRA (Low-Rank Adaptation); I've read that it lets you fine-tune larger models efficiently without heavy GPU requirements.
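To give a feel for why LoRA is cheap, here's a toy parameter-count comparison. Instead of updating a full d × k weight matrix, LoRA learns two small matrices B (d × r) and A (r × k) with rank r much smaller than d and k, so the update is W + B·A. The 4096 × 4096 layer size below is just an illustrative assumption:

```python
# Full fine-tuning updates every entry of a d x k weight matrix.
def full_update_params(d: int, k: int) -> int:
    return d * k

# LoRA only learns B (d x r) and A (r x k), i.e. r * (d + k) parameters.
def lora_update_params(d: int, k: int, r: int) -> int:
    return r * (d + k)

# For a 4096 x 4096 projection with rank 8:
# full fine-tune touches ~16.7M params, LoRA only ~65K.
```

That's roughly a 250x reduction in trainable parameters for this layer, which is why it fits on modest hardware.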

For local models, DeepSeek is definitely worth a shot. If you're worried about hardware limits, you can try a lower-parameter variant like the 1.5B model. You could also try Mistral 7B, which is lightweight as well.

Naz Hussain

Thank you for this wonderfully insightful foray into the world of RAG - very useful. Obviously you can't share the code as it's proprietary, but can we access the app to see it working?

Akshay Gupta

Thank you for your kind words! I'm glad you found the article useful.

Unfortunately, I won’t be able to share the application either, as our sandboxed environments are part of the product and are proprietary as well. I hope you understand.

Naz Hussain

No worries, Akshay, all the best.