Dany Paredes

Posted on • Originally published at danywalls.com

Gemini: ChatSession with Kendo Conversational UI and Angular

I continued experimenting with Gemini. After I showed @Jörgen de Groot my first chat demo built with Gemini and the Conversational UI, he asked how to maintain the chat history with Gemini, so we can avoid resending the initial prompt and preserve the context.

This is the second part of the article "Create Your Personalized Gemini Chat with Conversational UI and Angular".

This matters because resending the full prompt with every request costs tokens. Additionally, the first version neither maintains the initial context nor preserves the conversation history.

However, Gemini offers a chat feature that collects our questions and responses, enabling interactive and incremental answers within the same context. This is perfect for our Kendo ChatBot, so let's implement these changes.

The Chat Session

In the first version, we called the model directly to generate content. This time, we will call the model's startChat method to obtain a ChatSession object, which keeps a history seeded with our initial prompt and context. The ChatSession exposes a sendMessage method, so on each turn we only need to supply the user's new message.

First, declare a new object chatSession with the initial history, which should include the initial prompt and the initial answer, for example:

 #chatSession = this.#model.startChat({
    history: [
      {
        role: 'user',
        parts: [{ text: this.#prompt }],
      },
      {
        role: 'model',
        parts: [{ text: "Yes, I'm an Angular expert with Kendo UI" }],
      },
    ],
    generationConfig: {
      maxOutputTokens: 100,
    },
  });

Our next step is to use the chatSession instead of sending the role and parts directly to the model each time:

 const result = await this.#model.generateContent({
    contents: [{ role: 'user', parts }],
  });

Replace the model with the chatSession and utilize the sendMessage method:

 const result = await this.#chatSession.sendMessage(
    textInput.message.text,
  );
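To make the history mechanics concrete, here is a runnable TypeScript sketch using a mock session. The MockChatSession class and its canned reply are hypothetical, standing in for the SDK's real ChatSession; the sketch only illustrates why sendMessage needs just the new user text: the session object itself stores the growing history and resends it to the model for you.

```typescript
// Hypothetical mock mirroring the shape of a Gemini ChatSession,
// to show how history accumulates across sendMessage calls.
interface ChatTurn {
  role: 'user' | 'model';
  parts: { text: string }[];
}

class MockChatSession {
  private history: ChatTurn[];

  constructor(initialHistory: ChatTurn[]) {
    this.history = [...initialHistory];
  }

  // Accepts only the new user message; all prior turns come from history.
  async sendMessage(text: string): Promise<string> {
    this.history.push({ role: 'user', parts: [{ text }] });
    // A real session would send the full history to the model here;
    // we fake a reply to keep the example self-contained.
    const reply = `echo: ${text}`;
    this.history.push({ role: 'model', parts: [{ text: reply }] });
    return reply;
  }

  getHistory(): ChatTurn[] {
    return this.history;
  }
}

// Seed the session the same way startChat seeds its history.
const session = new MockChatSession([
  { role: 'user', parts: [{ text: 'You are an Angular expert with Kendo UI' }] },
  { role: 'model', parts: [{ text: "Yes, I'm an Angular expert with Kendo UI" }] },
]);

session.sendMessage('How do I handle the Kendo Chat sendMessage event?');
// The initial prompt is still the first turn; only the new text was passed in,
// so the session now holds 4 turns: prompt, reply, question, answer.
```

After one call, the seeded prompt remains turn zero and the new question/answer pair is appended, which is exactly what spares us from resending the full prompt on every request.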

Done! 🎉 Our chatbot now supports history and continuous interaction without resending the full prompt every time, saving our tokens 😊😁

Check out the demo: 👇

demo 2

Recap

Yes, it was quite easy to add history support to our chat, saving tokens and providing a significantly better experience for users interacting with our chat.

We learned how to improve the functionality of a Gemini Chatbot by maintaining chat history and preserving context, thus avoiding the repeated sending of initial prompts.

By using the ChatSession chat feature, which collects questions and responses for interactive, incremental answers, we give users a better experience and save tokens by not sending the full prompt every time. 💰🎉

Source Code https://github.com/danywalls/conversational-with-gemini/tree/feature/turn-to-chat
