Hey folks! Today we're launching Gradio 5.0 on Product Hunt. If you've never heard of Gradio, let's fix that!
What is Gradio?
Gradio is an open-source library that allows you to create customizable web interfaces for your machine learning models with Python. Whether you have your own model, or you want to try out a cool new open-source LLM without any hassle, Gradio is the answer. It's designed to be intuitive for developers (both newbies and experienced) and accessible for users.
Quick Start Tutorial
Here's a super simple demonstration of how you can create and deploy a demo via Hugging Face Spaces with a Gradio Chatbot template and customize it to your liking.
Create a Space for your Gradio demo here, and select the Chatbot template.
If you'd rather just run a Gradio demo locally, you can skip the next couple of steps: just run
pip install gradio
and copy and paste the script further down.
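To check that your local install works before we get to the chatbot, here's a minimal sketch of a Gradio demo; the greet function and its pirate greeting are placeholders of my own, not part of the chatbot template:

```python
import gradio as gr

# Placeholder function for a minimal demo; swap in your own logic.
def greet(name):
    return f"Ahoy, {name}!"

# A simple text-in, text-out interface.
demo = gr.Interface(fn=greet, inputs="text", outputs="text")

if __name__ == "__main__":
    demo.launch()
```

Save it as app.py, run gradio app.py, and it will open in your browser.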
You'll see your Gradio demo building at first, and then voila! You should be able to try out your own chatbot.
What if we want to customize this a little bit?
Let's change "friendly chatbot" to "pirate chatbot". I also want to test out the Mistral 7B Instruct LLM instead. You can either edit the app.py file in your Space repo, or you can clone the repo and run it locally with the command gradio app.py.
Here's my script with the changes:
```python
import gradio as gr
from huggingface_hub import InferenceClient

client = InferenceClient("mistralai/Mistral-7B-Instruct-v0.3")


def respond(
    message,
    history: list[tuple[str, str]],
    system_message,
    max_tokens,
    temperature,
    top_p,
):
    # Rebuild the conversation in the format the chat-completion API expects.
    messages = [{"role": "system", "content": system_message}]

    for val in history:
        if val[0]:
            messages.append({"role": "user", "content": val[0]})
        if val[1]:
            messages.append({"role": "assistant", "content": val[1]})

    messages.append({"role": "user", "content": message})

    response = ""

    # Stream tokens back to the UI as they arrive.
    for message in client.chat_completion(
        messages,
        max_tokens=max_tokens,
        stream=True,
        temperature=temperature,
        top_p=top_p,
    ):
        token = message.choices[0].delta.content
        response += token
        yield response


demo = gr.ChatInterface(
    respond,
    additional_inputs=[
        gr.Textbox(value="You are a pirate Chatbot.", label="System message"),
        gr.Slider(minimum=1, maximum=2048, value=512, step=1, label="Max new tokens"),
        gr.Slider(minimum=0.1, maximum=4.0, value=0.7, step=0.1, label="Temperature"),
        gr.Slider(
            minimum=0.1,
            maximum=1.0,
            value=0.95,
            step=0.05,
            label="Top-p (nucleus sampling)",
        ),
    ],
)

if __name__ == "__main__":
    demo.launch()
```
If you're running this locally, simply run gradio app.py (or whatever you named this script file). If you edited this via the Hugging Face Spaces platform, you should just see it building and running in Spaces pretty quickly!
Now let's give it a go.
That's it! How cool is that? Plus, users of your Space can simply replace that pirate system message with anything they want via the Additional Inputs section in the UI.
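If you're wondering how those Additional Inputs reach your function, here's a minimal sketch (with a made-up echo function, not the real respond above): each component listed in additional_inputs is passed to your function as an extra argument after message and history.

```python
import gradio as gr

# Toy example: ChatInterface calls this with the chat message, the history,
# and then one argument per component in additional_inputs.
def echo(message, history, system_message):
    return f"[{system_message}] {message}"

demo = gr.ChatInterface(
    echo,
    additional_inputs=[
        gr.Textbox(value="You are a pirate Chatbot.", label="System message"),
    ],
)

if __name__ == "__main__":
    demo.launch()
```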
You can embed this Space anywhere, or share it via a Share link.
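You can also talk to the Space programmatically with the gradio_client library (pip install gradio_client). A rough sketch below; "your-username/pirate-chatbot" is a placeholder Space id, and view_api() will tell you the exact endpoints and parameters your demo exposes:

```python
from gradio_client import Client

# "your-username/pirate-chatbot" is a placeholder; use your own Space id
# or a share link URL instead.
client = Client("your-username/pirate-chatbot")

# Prints the endpoints the Space exposes and the parameters they accept.
client.view_api()
```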
So what's new in Gradio 5.0?
- Server-side rendering: massive performance improvements, especially on initial page load
- Refreshed UI components: we've updated our components and given them a sleeker look
- Major security enhancements: deploy your web app without worrying about web security
...and a lot more.
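On the server-side rendering point: it's on by default when you deploy to Spaces, and as far as I know you can toggle it locally via the ssr_mode argument to launch(). Treat the exact flag as an assumption and check the current docs; a minimal sketch:

```python
import gradio as gr

def greet(name):
    return f"Hello, {name}!"

demo = gr.Interface(fn=greet, inputs="text", outputs="text")

if __name__ == "__main__":
    # ssr_mode is, to my knowledge, the Gradio 5 flag for server-side
    # rendering; verify against the docs before relying on it.
    demo.launch(ssr_mode=True)
```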
Give it a try and see how quickly you can turn your ideas into shareable web apps! If you're stuck on how to start, play around with our Gradio Playground and let our LLM build a Gradio app for you.
Don't forget to check out Gradio 5.0 on Product Hunt, and if you find it useful, we'd love your support.
Happy coding!