
Holger Imbery

Originally published at the.cognitiveservices.ninja

The Chitchat of the other kind - ChatGPT in Power Virtual Agents


Note: a lot has changed since this article was first published in January 2023.

Everything described below still works (as of 2023-05-22), but I would now suggest using Azure OpenAI Service instead of OpenAI directly, since with Azure OpenAI Service you have control over your data.

Motivation

When building bots, you must consider two questions:

how to present the answer, and

where the answer comes from.

With all the articles you might have read recently about OpenAI's ChatGPT, you might think: why not integrate it into my installation as a last line of defense, answering when my own system cannot, at least for chitchat?

This article demonstrates the "how". But do read the entire article: some thoughts about the "why" are in the conclusion.

OpenAI

To use OpenAI, you need to create an account with them at www.openai.com and generate an API key.

After creating an account and adding some funds as pocket money, click your account in the top right corner, then "View API Keys".

Figure 1: Account Settings


Generate a new API Key


Figure 2: Create new secret key


Power Virtual Agents

Activate the Fallback topic in Power Virtual Agents

Power Virtual Agents has a built-in topic that can serve as a hook to the outside world: the Fallback system topic.

To activate the Fallback system topic, go to Settings within the editing canvas (click on the cog symbol), then System fallback, and then +Add.

Figure 3: activated fallback topic


Edit the fallback topic

Create a message box with a meaningful message and a new action.
Figure 4: Fallback topic with message and action node


In Power Automate, create a new text-based variable, "UnrecognizedUserInput"

Figure 5: text variable as input


and an HTTP Node.

The syntax for OpenAI is straightforward. We authenticate within the header and pass the model, the prompt, the temperature, and max_tokens in the body. [documentation].

temperature: Higher values mean the model will take more risks. Try 0.9 for more creative applications, and 0 (argmax sampling) for ones with a well-defined answer.

Figure 6: syntax sample


I translated this to Power Automate; it will look like the following in the HTTP node.

The token you created above goes directly after "Bearer" in the header.

Figure 7: complete HTTP Request

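Outside Power Automate, the same request can be sketched in Python. This is a minimal sketch, assuming the 2023-era /v1/completions endpoint and the text-davinci-003 model, with the key read from an environment variable (the variable name and the prompt are just placeholders):

```python
import json
import os
import urllib.request

# Placeholder: read your secret key from the environment
API_KEY = os.environ.get("OPENAI_API_KEY", "sk-...")

# Body mirrors the HTTP node: model, prompt, temperature, max_tokens
body = {
    "model": "text-davinci-003",
    "prompt": "What is the answer to life, the universe and everything?",
    "temperature": 0.9,   # higher = more creative, 0 = well-defined answers
    "max_tokens": 256,
}

# Header carries the Bearer token, exactly as in the HTTP node
request = urllib.request.Request(
    "https://api.openai.com/v1/completions",
    data=json.dumps(body).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)

# Uncomment to actually send the request (requires a valid key):
# with urllib.request.urlopen(request) as response:
#     print(json.load(response)["choices"][0]["text"])
```

The network call is left commented out; the point is the shape of the header and body, which match what the HTTP node sends.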

As a next step, we create a String variable.

Figure 8: Initialize a Variable


Create a "Parse JSON" node to analyze the body of the answer we get from OpenAI.

Figure 9: Parse JSON


You can use the following as a schema, or generate it from the body of the previous step's output.

{
    "type": "object",
    "properties": {
        "id": {
            "type": "string"
        },
        "object": {
            "type": "string"
        },
        "created": {
            "type": "integer"
        },
        "model": {
            "type": "string"
        },
        "choices": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "text": {
                        "type": "string"
                    },
                    "index": {
                        "type": "integer"
                    },
                    "logprobs": {},
                    "finish_reason": {
                        "type": "string"
                    }
                },
                "required": [
                    "text",
                    "index",
                    "logprobs",
                    "finish_reason"
                ]
            }
        }
    }
}


After this, we create an "Apply to each" node with a "Set variable" action inside,

and we close by creating an output value.

Figure 10: Last step

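The Parse JSON, Apply to each, and Set variable steps amount to the following. This is a minimal Python sketch; the response body is a made-up example in the shape the schema above describes, not real API output:

```python
import json

# Illustrative response body matching the schema (all values are made up)
raw_body = """
{
  "id": "cmpl-example",
  "object": "text_completion",
  "created": 1672531200,
  "model": "text-davinci-003",
  "choices": [
    {"text": "42.", "index": 0, "logprobs": null, "finish_reason": "stop"}
  ]
}
"""

# "Parse JSON" node: turn the HTTP body into structured data
parsed = json.loads(raw_body)

# "Apply to each" over choices with "Set variable" inside:
# the variable ends up holding the text of the last choice
answer = ""
for choice in parsed["choices"]:
    answer = choice["text"]

# Output value handed back to Power Virtual Agents
print(answer)
```

With the default request there is only one choice, so the loop simply copies choices[0].text into the output variable.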

Back in Power Virtual Agents, we adjust the created action to the correct input and output values and create a new message box to speak/display the answer from OpenAI.

Figure 11: Completing Action Node


Test the result

Let's test the result within our configured channels, as text chat or as voice via Dynamics 365 Customer Service or AudioCodes VoiceAI.

Figure 12: Output


Conclusion

ChatGPT gives us answers, most of the time correct ones, and the models are getting better and better.

Is there a downside? Why should we integrate it, and why might we not?

Will the user, or we ourselves, "like" the answer, and is it appropriate? Here we have no control; we do not own the model.

If you check the logs of your bots, you will see questions asked by users you would not even expect in your darkest dreams.

There are situations where a technically correct answer is not the answer you will give to your users when they are, e.g., in highly emotional situations.
