Overview
In character-dialogue systems built on LLMs, a common approach is to embed the character's settings in the prompt and generate responses from there. Because the prompt stays the same, the outputs tend to be repetitive. To counter this, we propose using a Bayesian network to mimic human emotions and thereby produce more varied dialogue.
Problem Identification
Typical AI chat apps usually:
- Write settings that describe the character.
- Embed these settings in the prompt.
- Generate responses based on user input.

Because this approach relies on the same fixed character information for every turn, it often produces monotonous dialogue.
Proposed Method
We mimic human emotions and factor them into response generation to achieve diverse dialogue. Randomly selecting an emotion from an array is one way to do this, but it still tends to be repetitive. Instead, we propose using a Bayesian network, which consists of nodes (representing random variables) and edges (representing dependencies between those variables).
Generating Human Emotions
Factors that influence human emotions are represented as nodes, with their dependencies represented as edges. For instance, consider physical, psychological, and environmental factors as nodes. Environmental factors change with the character's setting and include social interaction, weather, and daily changes. A lively character, for example, weights the social-interaction node heavily, while an introverted one weights the weather and daily-change nodes. Representing emotional factors with a Bayesian network in this way lets emotions vary with the character's environment.
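As a minimal sketch of the per-character weighting described above (the node names, character types, and weight values are illustrative assumptions, not part of any library):

```python
# Per-character weights on the environmental-factor nodes.
# All names and values here are illustrative assumptions.
NODE_WEIGHTS = {
    "lively":      {"social_interaction": 0.7, "weather": 0.2, "daily_change": 0.1},
    "introverted": {"social_interaction": 0.1, "weather": 0.5, "daily_change": 0.4},
}

def dominant_factor(character_type: str) -> str:
    """Return the environmental node that most strongly drives this character's emotions."""
    weights = NODE_WEIGHTS[character_type]
    return max(weights, key=weights.get)
```

A lively character's emotions are thus driven mainly by social interaction, while an introverted character's are driven mainly by weather and daily changes.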
Example of Generating Statements from Emotions
Consider a character:
- Teenage girl
- Shy and introverted
- A recluse since her school days
- Currently unemployed

For this character, no edges are set for the social-interaction node, but edges are set for the weather, season, psychological-state, and daily-change nodes.
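This edge configuration can be sketched as a simple character definition (the schema and node names are illustrative assumptions):

```python
# Which emotion-factor nodes have edges for this character.
# The schema and node names are illustrative assumptions.
character = {
    "traits": ["teenage girl", "shy and introverted", "recluse", "unemployed"],
    # Social interaction is deliberately absent: no edge is set for it.
    "active_nodes": ["weather", "season", "psychological_state", "daily_change"],
}

def influences_emotion(character: dict, node: str) -> bool:
    """Return True if the given factor node has an edge into this character's emotion."""
    return node in character["active_nodes"]
```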
Generating Psychological State
The internal thought process is also depicted with nodes and edges, which define how text is generated. Causes such as weather, invoked past memories, and basic emotions are represented as nodes:
Weather -> Basic Emotions
Invoked Memories -> Internal Thought Process
Basic Emotions -> Internal Thought Process
We set probabilities for the basic emotions conditioned on the weather (e.g., sunny weather raises the probability of happiness), and then use these nodes to generate text under various scenarios.
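The dependency chain above can be sampled top-down: draw a basic emotion given the weather, then draw an internal thought given the emotion and the invoked memory. The probability tables below are illustrative assumptions, not measured values:

```python
import random

# P(basic_emotion | weather) — illustrative values.
EMOTION_GIVEN_WEATHER = {
    "sunny": {"happiness": 0.6, "calm": 0.3, "sadness": 0.1},
    "rainy": {"happiness": 0.1, "calm": 0.4, "sadness": 0.5},
}

# P(internal_thought | basic_emotion, invoked_memory) — illustrative values.
THOUGHT_GIVEN_PARENTS = {
    ("happiness", "sad_memory"): {"bittersweet": 0.7, "content": 0.3},
    ("happiness", "none"):       {"content": 0.8, "bittersweet": 0.2},
    ("calm", "sad_memory"):      {"wistful": 0.6, "content": 0.4},
    ("calm", "none"):            {"content": 0.7, "wistful": 0.3},
    ("sadness", "sad_memory"):   {"gloomy": 0.8, "wistful": 0.2},
    ("sadness", "none"):         {"wistful": 0.5, "gloomy": 0.5},
}

def sample(dist: dict, rng) -> str:
    """Draw one outcome from a discrete distribution {outcome: probability}."""
    outcomes = list(dist)
    weights = [dist[o] for o in outcomes]
    return rng.choices(outcomes, weights=weights, k=1)[0]

def sample_psychological_state(weather: str, memory: str, rng=random):
    """Ancestral sampling along Weather -> Basic Emotion -> Internal Thought."""
    emotion = sample(EMOTION_GIVEN_WEATHER[weather], rng)
    thought = sample(THOUGHT_GIVEN_PARENTS[(emotion, memory)], rng)
    return emotion, thought
```

Because each draw is probabilistic, the same weather can yield different emotions and thoughts across conversations, which is exactly what breaks the monotony of a fixed prompt.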
Generating Emotion
Weather: Sunny
Season: Spring
Psychological State: Complex emotions arising internally, mixing current happiness with past sadness, bringing a bittersweet feeling and introspective insight.
Daily Changes: None
Generating Statements
With the generated emotions, we then create dialogue.
Character: Teenage girl, shy, recluse, unemployed
Current emotion: Complex feelings on a sunny spring day, mixing joy with past sadness, deepening current happiness, and prompting introspective insight.
Input: 'Hello! How are you?'
Output: 'How am I? Well, not much change. I hardly go out, so I don't talk to people much. It's a bit worrying, but thanks for asking. I appreciate it.'
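Assembling the final prompt is then straightforward: the static character settings, the sampled emotion, and the user input are concatenated before calling the model. The template below and the LLM call are placeholders, not a specific API:

```python
def build_prompt(character: str, emotion: str, user_input: str) -> str:
    """Compose the prompt from static character settings plus the sampled emotion.

    The template is an illustrative assumption; adapt it to your LLM's format.
    """
    return (
        f"Character: {character}\n"
        f"Current emotion: {emotion}\n"
        f"Respond in character to the following message.\n"
        f"Input: {user_input}"
    )

prompt = build_prompt(
    "Teenage girl, shy, recluse, unemployed",
    "Complex feelings on a sunny spring day, mixing joy with past sadness",
    "Hello! How are you?",
)
# The actual LLM call is out of scope here, e.g. a hypothetical call_llm(prompt).
```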
This approach expands the potential of LLM-based character dialogue systems to create more realistic and emotionally rich conversations.
If you have any questions, please reach out on Twitter.