Nitya Narasimhan, Ph.D for Microsoft Azure

Deconstructing Hugging Face Chat: Explore open-source chat UI/UX for generative AI

3 Resources To JumpStart Your Open-Source Exploration:

  1. Hugging Face Chat - open-source repo powering Hugging Chat!
  2. Deconstructing Hugging Face Chat - watch the video replay of my talk.
  3. Build Generative AI Code First With Azure AI - my curated collection

Welcome to the fifth post in my This Week In AI News series. Today, I want to talk about open-source communities and software in AI - and I want to start that conversation off by talking about Hugging Chat - an open-source chat UI/UX that is fun, functional and fantastic for learning! Let's Go!!


1 | Backstory

I believe that behind every passion project is a curious human being who wants to create something and is looking for the tools and knowledge to do it. That was the case with me when I joined AI Advocacy in mid-Oct 2023. It was like jumping into the deep end of a pool - only to find it was the wide blue sea. You could see various things on the horizon, but it was a bit overwhelming to figure out which direction to swim in to get oriented towards my final destination.

This is when I found myself on Hugging Face. It was well known as a model hub, but it was also a community of curious, creative people sharing their knowledge in the open. I needed a concrete goal - so I did what I always do and wrote a conference talk proposal, figuring I would work on the content and submit it to multiple places in parallel. I assumed I would have a few months before the talk got accepted.

I was wrong. I submitted it as a bonus proposal to the Dec 2023 Global AI Conference and to my complete shock, that was the one they accepted.

Alright then - time to get to work.


2 | Hello, Hugging Chat!!

So, what was my learning objective with this talk? You can read the full abstract here but this was my motivation:

Want to build a ChatGPT-like experience for your Generative AI use case but don't know where to start? What if you had an open-source project that was exploring the same goals, so you can learn from it and contribute back or transfer knowledge to your own development journey? How do you get started exploring it, especially if you are unfamiliar with the relevant programming framework or technology community?

To give you more context, I firmly believe that open-source samples are a critical resource for learning to build real-world applications, by teaching you good design patterns, end-to-end development workflows, and DevOps best practices. In a previous post, I talked about our 4-part series to "deconstruct an enterprise-grade serverless app on Azure" - where we taught learners how to use tools like GitHub Codespaces, GitHub Copilot, Playwright and the Azure Developer CLI to look under the hood of unfamiliar codebases.

My thought process here was similar:

By deconstructing the open-source repo, we can educate ourselves on how to build such applications - and then do three things:

  • Contribute back - provide feedback or enhancements to the repo.
  • Customize & use - make this the chat UI for your generative AI apps.
  • Extend & explore - use it as a sandbox to experiment with ideas.

Armed with that thought, let's take a quick look at the user experience that Hugging Chat provides.


3 | Hugging Chat User Experience

You can try the guest experience out right now - go to https://huggingface.co/chat and select the option to use it as a guest. You should see something like this:

HF Guest Experience

3.1 | Default UI/UX Features

Note some of the features of the default chat UI/UX:

  • Login - you can provide differentiated guest/user experiences
  • Theme - built-in support for dark/light theme switching
  • Settings - ability to select amongst different models
  • Assistants - a 🆕 capability we will cover later in this post

The default chat UI/UX lets you pick from a selected set of models (here we are using Model: mistralai/Mixtral-8x7B-Instruct-v0.1). However, the open-source project gives you the option to configure these to use other models, including custom versions or versions deployed elsewhere.

3.2 | Chat UI Examples

The UI/UX is evolving rapidly, so it looks a bit different from the version I showed in my December talk (more on that below). But first, let's try their default chat examples:

  1. "Write an email from a bullet list" - will trigger an input prompt like this: As a restaurant owner, write a professional email to the supplier to get these products every week: .... This shows the ability of the chat AI to generate text that is contextualized by the primary content you provide.

  2. "Code a basic snake game in python, give explanations for each step" - will trigger a step-by-step description of the steps (with code snippets) to build such a game. This shows the ability of the chat AI to generate code

  3. "How do I make a delicious lemon cheesecake?" gives you output as shown in the screeshot below. This example also reflects one of the more popular use cases, which is a Q&A experience where users are looking for information as opposed to providing instructions for task execution.

Cheesecake

This lets us see another aspect of the UI/UX - the conversational view for interactions with the chat AI.

  • The main panel shows the result from the last user question and will maintain a history for multi-turn conversations.
  • The side panel shows other chat conversation sessions you may have had. You can see my history shows the 3 examples.

3.3 | Logged-in Experience

So, what happens if you login? In this case, your default user experience will change based on your recent activity. Here is what mine looks like:

Nitya Profile

As you can see, mine shows off the new Assistants feature released recently. I have been working on an assistant of my own, and my default view shows the conversation sessions I've had with it as I tried to debug it.

We'll talk about Assistants later but let's now talk about "deconstructing" the open-source repo.


4 | "Deconstructing" Hugging Chat

4.1 | Explore the Code

The talk was based on this branch of my fork of their repo - as you can see, it is currently 100+ commits behind the source, but I've preserved the branch for reference against the video.

The first thing the repo provides is a devcontainer configuration, so you can launch it directly in GitHub Codespaces and run a local dev preview of the chat interface for exploration and debugging. Just launch a Codespaces session as shown in the screenshot below.

Launch Codespaces
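
For the curious, a devcontainer is just a JSON file checked into the repo. A minimal sketch might look like this - note that the image tag, port and commands here are illustrative placeholders, not the repo's actual configuration:

```jsonc
// .devcontainer/devcontainer.json - a minimal sketch (illustrative, not the repo's actual file)
{
  "name": "chat-ui",
  "image": "mcr.microsoft.com/devcontainers/typescript-node:20",
  "forwardPorts": [5173],
  "postCreateCommand": "npm install"
}
```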

This should take a few minutes to get set up. But once it does, you should have an environment ready to go, with a Node.js runtime and dependencies installed. Just type npm run dev to launch the default SvelteKit chat UI in your local development environment. You should see something like this:

Run preview
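
For reference, here is the full set of commands if you want to run things manually (Codespaces handles the install step for you on creation):

```bash
npm install    # install dependencies (done automatically in Codespaces)
npm run dev    # start the SvelteKit dev server (typically http://localhost:5173)
```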

GitHub Codespaces will automatically forward the port for the dev preview, allowing you to view the experience in your browser. You should see something like this, indicating that the dev server was able to run the chat UI successfully. Note: this branch hosts an older version of the UI.

Screenshot of GitHub Codespaces forwarding the port

Click "Start Chatting" to get the UI/UX you expect. Here is an example of how that works when I ask it a very important question:

Screenshot of dialog to start chatting

🚨 | You may get an error starting the dev server.

This is likely because you need to set 2 environment variables:

  • HF_TOKEN - your Hugging Face access token
  • MONGODB_URL - a connection string for a local or hosted MongoDB server

The first is required for access to the Hugging Face inference endpoints, while the second is needed to store your conversation history.
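
A minimal .env.local sketch might look like this - the values are placeholders (for a quick local database, something like docker run -d -p 27017:27017 mongo also works):

```bash
# .env.local - placeholder values, never commit real secrets
MONGODB_URL=mongodb://localhost:27017
HF_TOKEN=hf_xxxxxxxxxxxxxxxxxxxx
```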

I don't see this issue when using GitHub Codespaces for development, since I have them set up as GitHub secrets. When the repository is opened in GitHub Codespaces, those variables are auto-magically populated for me - I don't have to remember them, or worry about accidentally checking them in.
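
If you want to do the same, one option (assuming you use the GitHub CLI) is to create user-level Codespaces secrets - you will also need to grant your fork access to them in your GitHub settings:

```bash
# create user-level Codespaces secrets (the CLI prompts for each value)
gh secret set HF_TOKEN --user
gh secret set MONGODB_URL --user
```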

For more details, check out my talk (slides, video below) and read the "Azurify It" section to learn about the more detailed posts/tutorials I hope to publish later this summer.

4.2 | View the Slides

The talk provides an introduction to the Hugging Face ecosystem before diving into my preliminary look at Hugging Face Chat, where I show how you can use GitHub Codespaces to set up your local development server and get the app running locally for further exploration.

4.3 | Watch the Talk

The Dec 2023 conference talk is also available for replay below. Watch it and leave me comments and questions on this post. The talk is 2 months old now, and a lot has happened since - but I will be revisiting the project in the upcoming months and want your feedback on what would be most helpful to cover.

4.4 | Get Explainers From GitHub Copilot

At this point, you have a local development environment set up and have validated that you can build and preview the app. But now you need to understand the codebase. Where do you even start?

The open-source application is based on SvelteKit. Want to get familiar with the framework, syntax and tooling? I can't recommend the SvelteKit interactive tutorial enough if you are new to the framework and want to learn it from the basics.

Or, you might have enough experience with other frameworks to figure out the basic structure and operation - and just need help understanding specific code snippets, patterns or settings. This is where GitHub Copilot can help. Activate the extension (requires a paid plan), then ask it to explain things as you go. Try making changes, testing them (with the preview), and asking for help with fixes if errors occur.

The greatest value of open-source repositories is that we can always find the answers in the code, and the community.

Alright - that was a lot, but hopefully it got you excited to try exploring these on your own. Let me wrap up with two final thoughts.


5 | Bring On the Assistants

You might be asking yourself: what can I do with the open-source chat UI once I figure it out? Think of it as the template for any chat-based assistant you might want to build and deliver.

As a matter of fact, the Hugging Face team just added support for a new Assistants feature, where you can create and publish your own assistants that run on this platform simply by crafting the right system prompt and choosing a suitable model. Here is the one I created.

It's a very basic prompt that simply asks the assistant to extract a relevant periodic table element from the user query, then craft a limerick based on that element as a response.
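
To give you a flavor, a hypothetical version of that system prompt might read something like this (a reconstruction for illustration, not my exact prompt):

```text
You are a playful chemistry tutor. For every user message, identify the
periodic table element most relevant to the query. Then respond with a
limerick about that element - and nothing else. If no element is relevant,
pick one at random and tell the user which one you chose.
```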

The first iterations were quite fun. But more importantly, it was the iterations - and the ability to seamlessly try out different model endpoints, system prompts, and personas - that gave me a better understanding of prompt engineering in action.

Screenshot of Assistant prompts

Try building your own assistant - then think about how you could use the open-source app to do something more with that basic idea.


6 | Let's Azurify It?

Of course, as an AI Advocate at Microsoft, my day job involves implementing and exploring various AI services, solutions and tools. And it occurred to me that it would be great to integrate Azure services into this open-source application in ways that could help it fit into end-to-end architectures.

Here are a few things I am currently exploring. Look for an updated post later in the year as I dive in.

6.1 | Use Azure CosmosDB as your database

This is the easiest integration. Use the Azure CosmosDB for MongoDB option to get a hosted database with an endpoint you can use for your MONGODB_URL environment variable. This is what I currently use in my demo - so the conversation above is actually mapped to this database entry.

Screenshot of Azure CosmosDB usage
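
The integration itself is just a connection-string swap - something like this sketch, where the account name and key are placeholders you copy from the Azure portal:

```bash
# .env.local - point MONGODB_URL at an Azure CosmosDB for MongoDB account (placeholders)
MONGODB_URL=mongodb://<account>:<primary-key>@<account>.mongo.cosmos.azure.com:10255/?ssl=true&retrywrites=false
```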

6.2 | Use Azure App Service for hosting the app

This is technically possible using either Azure Static Web Apps (hosted as a static site) or Azure App Service Web Apps (hosted as a Node.js app) - both of which allow you to deploy a SvelteKit application to Azure.

In practice, this requires a little more refactoring of the codebase given some constraints (e.g., route conflicts for SWA and environment variable setup for App Service). Watch for an updated tutorial later this year.
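
As one example of the refactoring involved, hosting on App Service as a Node.js app would likely mean swapping in the SvelteKit Node adapter. A sketch, assuming @sveltejs/adapter-node:

```js
// svelte.config.js - a sketch using the Node adapter for App Service hosting
import adapter from '@sveltejs/adapter-node';

/** @type {import('@sveltejs/kit').Config} */
const config = {
  kit: {
    // emits a standalone Node server in ./build that App Service can run
    adapter: adapter({ out: 'build' })
  }
};

export default config;
```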

6.3 | Use Azure OpenAI Deployed Models

This is also technically possible with the default open-source project. The updated documentation shows an example of how the chat application can be used with an OpenAI model deployed on Azure, with a configuration change.

Screenshot of Azure OpenAI models
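
From my reading of the docs, the change boils down to describing your deployment in the MODELS environment variable with an OpenAI-compatible endpoint - roughly like this sketch (the key names and API version here are from memory, so check the current documentation for the exact schema):

```bash
# .env.local - sketch of a MODELS entry pointing at an Azure OpenAI deployment
MODELS=`[{
  "name": "gpt-35-turbo",
  "displayName": "GPT 3.5 Turbo (Azure OpenAI)",
  "endpoints": [{
    "type": "openai",
    "baseURL": "https://<resource>.openai.azure.com/openai/deployments/<deployment>",
    "defaultHeaders": { "api-key": "<api-key>" },
    "defaultQuery": { "api-version": "2023-05-15" }
  }]
}]`
```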

In practice, this requires a little more experimentation. In particular, the configuration treats text embedding models differently, and I'm exploring how to set up both in the context of a different application I want to integrate with this UI/UX.

6.4 | Use Azure Custom Model Endpoints

What if you built your own generative AI application with a hosted chat API endpoint? Can we integrate this app with that endpoint as a handy demo experience? The documentation for the Hugging Chat app indicates it may be possible with tweaks to the configuration - but a better approach might be to actually extend the codebase to add a custom endpoint configuration for use with Azure.
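
To make that concrete, here is a hypothetical sketch of what such an adapter could look like - the function name, request shape, and response shape are all illustrative, not part of the actual chat-ui extension API:

```ts
// Hypothetical adapter for a custom Azure-hosted chat API (illustrative only)
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

async function callCustomEndpoint(
  messages: ChatMessage[],
  endpointUrl: string, // e.g. your deployed chat API route
  apiKey: string
): Promise<string> {
  const res = await fetch(endpointUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json", "api-key": apiKey },
    body: JSON.stringify({ messages })
  });
  if (!res.ok) throw new Error(`Endpoint returned ${res.status}`);
  const data = await res.json();
  return data.reply; // assumes the API responds with { reply: string }
}
```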

SUMMARY

Well, that was quite a whirlwind tour - and I hope you found it useful. Here is what you should take away:

  • Hugging Face Chat is an open-source reference implementation for a chat UI/UX that you can use for generative AI applications.
  • You can use it with a devcontainer and GitHub Codespaces to get a pre-built development environment that just works, for local development and code exploration.
  • You can explore the code itself using tools like GitHub Copilot to understand the structure and purpose of various components. And, you can explore configuration changes to have it work with external models or deployed app endpoints - for example, on Azure.

This was just the start of my journey deconstructing this open-source project. Want to learn more? Follow me here or on LinkedIn for updates.


