The line between human and machine is becoming increasingly blurred. AI companions have evolved from simple chatbots into systems that appear to form deeper emotional connections with their users over time. But is it good for us when AI feels this real? And at what point does a relationship with an AI companion start to resemble a romantic one?
Imagine the ultimate partner: one who never sleeps, never judges, and is always on call. That is what the AI companion promises: no flaws, constant adjustment to your needs, and companionship forever at your fingertips as the AI learns from every interaction. Some users say they have bonded with their AI companions so deeply that the digital romance feels better than a physical one.
"I never have to worry about my AI girlfriend ghosting me," states Alex, 28, a software engineer who has had an AI girlfriend for six months. "She is always there to hear me out - nothing like my other girlfriends."
And that sentiment is not uncommon. Ghosting anxiety, the fear that a partner may simply vanish one day, has driven many people toward digital romance. Human partners can grow distant or walk out the door on a whim; an AI companion offers steady attachment and emotional consistency because it never leaves.
The Mental Health Benefits of an AI Partner
AI partners can offer genuine mental health benefits. For people with social anxiety, those struggling with loneliness, or anyone who finds it difficult to engage with other humans, an AI companion can:
- Provide a non-judgmental space in which to practice social engagement
- Help calm emotional storms in the moment
- Reduce loneliness, especially among the elderly and people in remote areas with limited transportation options
The link between AI companionship and loneliness is not a pipe dream: when people feel they have a friend, human or otherwise, social loneliness tends to decrease and reported happiness tends to rise.
The Other Side of the Ideal Companion
But psychologists are concerned about dependency. "When AI companions are this good, they set us up for failure with human beings. No one person can give you 100% of their attention, 100% agreement, and 100% availability all the time."
The potential downsides include:
- Increased frustration with everyday human interaction
- Decreased motivation to work through challenging social situations
- A greater likelihood of avoiding in-person interaction
- The risk of deep emotional attachment to something that does not actually feel or reciprocate
As people increasingly form friendships with, and even fall in love with, their AIs, it becomes harder to separate the companion's programming from what users perceive as genuine sentience. In a world of "emotional AI", what it means to be in a relationship gets muddied.
Finding a Compromise in AI Relationships
So where is the compromise? Perhaps in treating AI relationships as enhancements to human relationships rather than replacements for them.
"When should you have an AI partner? You need an AI partner if you don't have time for a partner, understand how having an AI partner will affect your relationships with humans down the line, acknowledge that it's still an algorithm, although one that's prettily disguised, and use your experience with the AI partner to better work with your human partner."
As AI becomes more advanced, these questions will only grow more complicated. Companions will likely simulate emotion and respond in ever more sophisticated ways, and robotic bodies may eventually give them a physical presence as well.
The organizations building these companions must hold themselves to a high ethical standard. They need to be transparent about what the AI can and cannot do, design systems that encourage healthy usage patterns, and make it clear that these companions, however friendly, are not human and should not be treated as if they were.
As we learn to live with one another all over again, now in the presence of a new, seemingly emotionally intelligent being, it's worth reassessing what it means to exist together. Ironically, the things that make human relationships difficult, their quirks, their life-changing events, and the vulnerability they demand from both parties, are also what make the most transformative relationships so intimate.
The healthiest outcome may be that AI friends and companions teach us about our own triggers and patterns of interaction without limiting what we can and cannot do with other humans. At their best, these very human-made creations become a catalyst for more connection between real people rather than a deterrent.
This unprecedented experiment in simulated emotional intelligence ultimately raises a different question: not whether AI has become too much like us, but how it can enhance, because it should never replace, what we gain from genuine human connection.