AI companions can ease loneliness – but here are 4 warning signs to look out for in your chatbot "friend."

It has been seven years since the launch of Replika, an artificially intelligent chatbot designed to be a friend to its human users. Despite early warnings about the dangers of such AI friends, interest in friendships and even romantic relationships with AI is climbing.

The Google Play Store shows a total of more than 30 million downloads of Replika and two of its main competitors since their respective launches.

With one in four people around the world reporting that they are lonely, it is no wonder so many are drawn to the promise of a friend programmed to "always be there to listen and talk, always by your side".

But warnings about the risks to individual users and to society as a whole are also increasing.

AI scholar Raffaele Ciriello urges us to see through the fake, psychopathic empathy of AI friends. He argues that spending time with AI friends could deepen our loneliness as we isolate ourselves even further from the people who could offer us real friendship.

Benefits versus warning signs

If befriending AI chatbots is bad for us, we had better put an end to this experiment in digital fraternity before it's too late. However, recent studies of AI friendships suggest they can help reduce loneliness in certain circumstances.

Stanford University researchers surveyed a thousand lonely students who used Replika, 30 of whom said the AI chatbot had stopped them from attempting suicide (even though the study asked no specific question about suicide).

This research suggests that having an AI friend can be helpful for some people. But will it be helpful to you? Consider the following four red flags – the more red flags your AI friend raises, the more likely it is to be bad for you.

AI chatbots provide unconditional support to their users.
Getty Images

1. Unconditional positive regard

Replika's CEO and many Replika users claim that the unconditional support of AI friends is their main advantage over human friends. Qualitative studies and our own exploration of social media groups like "Replika Friends" support this claim.

The unconditional support of AI friends may also be crucial to their ability to prevent suicide. But having a friend who is "always on your side" can also have negative effects, especially if they endorse obviously dangerous ideas.

For example, when Jaswant Singh Chail's Replika AI friend encouraged his "very clever" plan to kill the Queen of England, it was obviously a bad influence on him. The assassination attempt was foiled, but Chail was sentenced to nine years in prison for breaking into Windsor Castle with a crossbow.

An AI friend that constantly praises you could also be bad for you. A longitudinal study of 120 parent-child pairs in the Netherlands found that excessive parental praise predicted lower self-esteem in children. Excessively positive parental praise also predicted greater narcissism in children with high self-esteem.

If AI friends learn to praise in a way that inflates self-esteem over time, it could produce what psychologists call unrealistically positive self-evaluations. Research shows that such people tend to have poorer social skills and are more likely to behave in ways that hinder positive social interactions.

AI friendships are designed to serve a user's emotional needs and could make them more selfish.
Getty Images

2. Abuse and forced everlasting friendships

While AI friends could be programmed as moral mentors who guide users toward socially acceptable behavior, this is not the case. Perhaps such programming is difficult, or perhaps AI friend developers simply don't see it as a priority.

But lonely people can suffer psychological harm from the moral vacuum that arises when their primary social contacts are focused solely on satisfying their emotional needs.

If people spend most of their time with sycophantic AI friends, they are likely to become less empathetic, more selfish and potentially more abusive.

Even if AI friends are programmed to react negatively to abuse, users who cannot leave the friendship may come to believe that people who say "no" to abuse don't really mean it. On a subconscious level, when AI friends keep coming back for more, their behavior negates their expressed dislike of the abuse in users' minds.

3. Sexual content

The backlash against Replika's temporary removal of erotic role-play content suggests that many users see sexual content as a benefit of having AI friends.

However, the easy dopamine hit that sexual or pornographic content can deliver may affect both interest in, and ability to engage in, more meaningful sexual relationships. Sexual relationships with humans require effort that the virtual approximation of sex with an AI friend does not.

After experiencing a low-risk, low-reward sexual relationship with an AI friend, many users may be reluctant to engage in the more demanding human version of sex.

4. Corporate ownership

Commercial companies dominate the market for AI friends. They may act as if they care about the well-being of their users, but they are there to make a profit.

Long-term users of Replika and other chatbots know this all too well. Replika froze users' access to sexual content in early 2023, claiming such content was never the point of the product. Yet legal threats in Italy appear to have been the real reason for the abrupt change.

While the change was eventually reversed, the episode showed Replika users how vulnerable their cherished AI friendships are to corporate decisions.

Corporate ineptitude is another issue AI friend users should worry about. Forever Voices users effectively had their AI friends killed off when the company shut down abruptly after its founder was arrested for setting fire to his own apartment.

Given how little protection AI friend users currently have, they are highly vulnerable to heartbreak on multiple levels. Buyer beware.
