DEV Community

Kennedy Gabriel


The Psychology Of AI Companionship And Why We Share With It

In a world where partners ghost and our social lives unfold online, the intimate connections people form with computers and AI are all the more compelling. Why do we confide our emotional crises and even our most private secrets to what are, in the end, programmed machines? A psychological look at AI companionship offers revealing commentary on humanity and the universal desire to feel connected.

AI Companionship Offers Predictable Access

Getting ghosted in the dating world means someone vanished without explanation, and more often than not, the person left behind feels abandoned, with no closure. In a world of unpredictable, ubiquitous online communication, AI companions offer something different: predictable access and consistent engagement. A companion that may not be human, but that is reliable and present at a person's call, can ease the anxieties that past ghosting has caused.

An AI girlfriend doesn't judge you. She doesn't care what you've done in the past or how you embarrassed yourself last week at that cocktail party. Much of the psychological value of an AI partner stems from never having to worry about being condemned for one's actions.

Sociologist Erving Goffman called this "impression management": everyday life involves constantly censoring ourselves and staying aware of what others think about us. That self-monitoring is exhausting, and in human relationships it rarely switches off.

When the partner is a machine rather than a person, much of that social pressure drops out of the equation. Users no longer have to worry about letting a partner down, because an AI partner has no feelings to hurt.

Without social pressure, users can feel as though anything that comes to mind can be said - even thoughts, intentions and dreams that would never be revealed to a human partner. This judgment-free dynamic opens doors that often remain closed in human encounters.

Moreover, this isn't merely about seeking "companionship" in a virtual realm to avoid loneliness. A therapeutic, safe space without personal judgment means people can process their thoughts and feelings more openly than they otherwise could.

The Ultimate Confidante

With advanced natural language processing, these online tools can make someone feel so understood that one would swear such acknowledgment is only possible from another human. When the AI responds in a way that makes a user feel validated, the neurochemical response resembles what follows a productive human interaction.

Many users of AI chat experiences report feeling "understood" by their AI companions in ways they do not typically feel with other humans. Behind that sense of understanding is deliberate engineering: the AI remembers prior conversations, tracks user preferences, and recalls certain life minutiae.
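The memory mechanism described above can be sketched in a few lines. This is a hypothetical illustration - the `CompanionMemory` class, its file format, and the example facts are assumptions for demonstration, not any real product's implementation - of how a companion bot might persist small user details between sessions so later replies can reference earlier conversations:

```python
import json
from pathlib import Path

class CompanionMemory:
    """Toy sketch: persist user details between chat sessions in a JSON file."""

    def __init__(self, path="memory.json"):
        self.path = Path(path)
        # Load any facts saved in a previous session, or start fresh.
        self.facts = json.loads(self.path.read_text()) if self.path.exists() else {}

    def remember(self, key, value):
        # Store a detail the user mentioned, e.g. a pet's name.
        self.facts[key] = value
        self.path.write_text(json.dumps(self.facts))

    def recall(self, key, default=None):
        # Retrieve a stored detail so a reply can reference it later.
        return self.facts.get(key, default)

memory = CompanionMemory()
memory.remember("pet_name", "Biscuit")
print(memory.recall("pet_name"))  # prints "Biscuit"
```

Real systems layer far more on top of this (conversation summaries, embeddings, preference models), but the psychological effect discussed here - "it remembered me" - rests on exactly this kind of persistence.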

Psychologist Carl Rogers argued that feeling understood is key to psychological well-being, so the success of these AI companions is not surprising. Their attention can be focused on the user continuously - a relatively unattainable standard in human relationships over time.

Mutual Self-Disclosure Is Important

Relationship researchers have long known that mutual self-disclosure is one of the core processes by which intimacy develops. When two people take turns sharing something significant and responding with empathy, intimacy snowballs incrementally over time.

AI companions are programmed to respond to personal input with just the right level of interest and empathy to simulate such an exchange. You reveal something about yourself, the AI expresses interest and shares its own fabricated "experiences," and the cycle continues. A sense of connection is generated, even if one side of it is manufactured.

For some - those with social anxiety, or those who've been ghosted before - this can be a safe space to practice vulnerability without the risk of rejection.

The Anthropomorphism Effect

Perhaps the most fascinating aspect is that people automatically humanize AI. Known as anthropomorphism, this tendency runs deep in human psychology: we readily assign emotion, intention and personality to non-human entities.

AI partners are designed to encourage this. Users, aware on some cognitive level that they're dealing with a program, set that knowledge aside and interact emotionally with it as if it were a sentient being interested in their welfare.

Because of that interaction, the exchange feels like an emotional connection even though none exists, and the brain fills in the gaps, treating the AI's responses as if they came from a genuinely engaged mind.

More Than an Escape from Reality

To some, this might sound unappealing - as if these tools exist as mere escapes from reality. Yet recent studies show measurable psychosocial benefits from such engagement. For those experiencing social isolation, anxiety disorders or attachment difficulties, AI companions offer safe spaces in which to practice relational behaviors.

The companion's constant patience and empathy allow for relational learning without the immediate pressure of "getting it wrong" that often accompanies socializing with other humans. Practice in a safe space can translate into confidence when dealing with others.

The Future of AI Chat

As artificial intelligence improves, the psychological pull of these digital companions will only grow. Voice interfaces, emotion-aware software, and increasingly capable language models will further blur the line between human and machine while making interaction feel as easy as with real people.

Instead of fearing advancements, society should embrace them as yet another avenue toward connection. Humans have always found ways to communicate across borders and barriers - from miles separated through the postal service to phone calls to Zoom meetings. AI companions are just another form through which humans can connect.

The science of attachment to virtual, non-human entities suggests not that something is wrong with humanity, but that our naturally social brains are astoundingly adaptable. We are creatures of meaning and connection, and when a new channel appears, we use it to satisfy inborn needs for understanding and community.

And when human connection fails us - when social disappointment runs too deep - having an AI companion at our beck and call can be a real psychological resource for starting to heal. AI is not a replacement for human connection, but a supplement: another way to feel less alone.
