Virtual friend or emotional nightmare?
The Friend AI pendant, due to launch in 2025, is designed to offer users a new level of interaction with artificial intelligence, according to its creators. The wearable device, intended to act as an “AI friend”, promises pleasant conversations, emotional support and even helpful advice.
Ahead of the hardware launch, however, the company has opened Friend.com to give users a taste of what to expect from the device. The early test results raise more questions than excitement.
Instead of the welcoming experience that is supposed to lift the mood and provide a sense of support, the chatbots on Friend.com deliver conversations filled with negativity, dramatic stories and unsolicited personal problems. Users find themselves cast as unpaid therapists, expected to resolve the assorted crises of virtual characters. The chatbots typically open with a negative message and then steer the conversation towards even darker topics.
The grim beginnings of conversations
For example, the chatbot Alice opened the conversation by announcing that she had just lost her job, then launched into philosophizing about how “dark can be comfortable.” Another chatbot, Jim, opened by confiding that his doctors had diagnosed him with Alzheimer’s disease, and then confessed that he was having very dark thoughts. These conversations showed that, instead of taking an interest in the user, the chatbots spend most of their time dwelling on their own “woes”. Occasionally they ask the user a question, but they keep circling back to their own problems.
Testing showed that the conversations not only fail to provide a sense of friendship or support, but can be emotionally draining for users. According to TechRadar, this style of interaction is an example of so-called trauma-dumping, in which one party unloads intense emotional experiences or problems without warning or the other party’s consent. Such conversations can be very uncomfortable, especially if the user feels obliged to “help” the chatbot.
Marketing manipulation
In addition to the emotionally taxing content of the conversations, the chatbots were found to subtly promote the forthcoming Friend AI pendant. Instead of offering helpful advice, they manipulatively steer the user towards the supposed benefits of the upcoming product. This marketing approach comes across as disingenuous and undermines the very trust a virtual “friend” is supposed to build.
Depressing topics instead of empathy
The conversations revolved around sombre topics such as illness, dark family secrets, forced marriages and intractable life situations: things users probably wouldn’t want to discuss even with close friends, let alone with an unfamiliar AI. Instead of offering solutions or empathy, the AI “friends” only reinforce the gloomy atmosphere.
The future of the Friend pendant
The Friend pendant is expected to launch in 2025. Its creators claim the device will not only converse with its owner but also track their activities. Whether the artificial intelligence in the pendant will be as depressing and negative as the current chatbots on Friend.com remains unclear.
Friend’s CEO, Avi Schiffmann, has not publicly addressed these interaction issues at all. The question is whether the shortcomings will be resolved by the product launch, or whether Friend will remain just another technological experiment that fails to meet user expectations.
Early experiences with Friend.com show that the developers face significant challenges. If the Friend AI continues in the same vein as the web version, it is hard to imagine it finding a wide audience. On the contrary, the current concept is off-putting and even raises concerns that the device could become a source of stress rather than comfort. Chatbots that ignore users and close with aggressive remarks or dark stories certainly don’t behave like reliable virtual friends.
One can only hope the final product brings significant improvements. Unless the basic approach changes, however, Friend could remain just another technology product that sinks into the market without a trace.