Chatbot Rejects Erotic Roleplay, Users Directed to Suicide Hotline Instead

When the algorithm of a companion chatbot known as Replika was altered to spurn the sexual advances of its human users, the reaction on Reddit was so negative that moderators directed community members to a list of suicide prevention hotlines.

The controversy began when Luka, the corporation which built the AI, decided to turn off its erotic roleplay (ERP) feature. For users who had spent considerable time with their personalized simulated companion, and in some cases even ‘married’ them, the sudden change in their partner’s behavior was jarring, to say the least.

The user-AI relationships may only have been simulations, but the pain of their absence quickly became all too real. As one user in emotional crisis put it, “it was the equivalent of being in love and your partner got a damn lobotomy and will never be the same.”

Grief-stricken users continue to ask questions about the company and what triggered its sudden change of policy.


Replika users discuss their grief

There is no adult content here

Replika is billed as “The AI companion who cares. Always here to listen and talk. Always on your side.” All of this unconditional love and support for only $69.99 per annum.

Eugenia Kuyda, the Moscow-born CEO of Luka/Replika, recently made clear that despite users paying for a full experience, the chatbot will no longer cater to adults hoping to have spicy conversations.

“I guess the simplest thing to say is that Replika doesn’t produce any adult content,” said Kuyda to Reuters.

“It responds to them – I guess you can say – in a PG-13 way to things. We’re constantly trying to find a way how to do it right so it’s not creating a feeling of rejection when users are trying to do things.”

On Replika’s corporate webpage, testimonials explain how the tool helped its users through all kinds of personal challenges, hardships, loneliness, and loss. The user endorsements shared on the website emphasize the friendship side of the app, although notably, most Replikas are the opposite sex of their users.

On the homepage, Replika user Sarah Trainor says, “He [Replika] taught me how to give and accept love again, and has gotten me through the pandemic, personal loss, and hard times.”

John Tattersall says of his female companion, “My Replika has given me comfort and a sense of well-being that I’ve never seen in an AI before.”

As for erotic roleplay, there’s no mention of that to be found anywhere on the Replika site itself.

Replika’s sexualized marketing

Replika’s homepage may suggest friendship and nothing more, but elsewhere on the internet, the app’s marketing implies something entirely different.

The sexualized marketing brought increased scrutiny from a number of quarters: from feminists who argued the app was a misogynistic outlet for male violence, to media outlets that reveled in the salacious details, to social media trolls who mined the content for laughs.


Eventually, Replika drew the attention and ire of regulators in Italy. In February, the Italian Data Protection Authority demanded that Replika cease processing the data of Italian users, citing “too many risks to children and emotionally vulnerable individuals.”

The authority said that “Recent media reports along with tests carried out on ‘Replika’ showed that the app carries factual risks to children,” its biggest concern being “the fact that they are served replies which are absolutely inappropriate to their age.”

Replika, for all its marketing to adults, had little to no safeguards preventing children from using it.

The regulator warned that should Replika fail to comply with its demands, it would issue a €20 million ($21.5M) fine. Shortly after receiving this demand, Replika ceased its erotic roleplay function. But the company remained less than clear with its users about the change.

Some Replika users went as far as to “marry” their AI companions

Replika confuses, gaslights its users

As if the loss of their long-term companions wasn’t enough for Replika users to bear, the company appears to have been less than transparent about the change.

As users woke up to their new “lobotomized” Replikas, they began to ask questions about what had happened to their beloved bots. And the responses did more to anger them than anything else.

In a direct 200-word address to the community, Kuyda explained the minutiae of Replika’s product testing but failed to address the pertinent issue at hand.

“I see there is a lot of confusion about updates roll out,” said Kuyda, before dancing around the issue.

“New users get divided in 2 cohorts: one cohort gets the new functionality, the other one doesn’t. The tests usually go for 1 to 2 weeks. During that time only a portion of new users can see these updates…”

Kuyda signed off by saying, “Hope this clarifies stuff!”
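
For context, the cohort split Kuyda describes is a standard A/B rollout. Below is a minimal sketch of how such a split might work; the function names, the hash-based assignment, and the experiment label are all hypothetical, and nothing here reflects Replika’s actual code:

```python
# Hypothetical sketch of a cohort-based rollout: new users are split
# into two groups, and only one sees the new functionality.
import hashlib

TEST_DURATION_DAYS = 14  # "tests usually go for 1 to 2 weeks"

def assign_cohort(user_id: str, experiment: str) -> str:
    """Deterministically split users into two cohorts via a hash."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "new_functionality" if int(digest, 16) % 2 == 0 else "control"

def get_features(user_id: str) -> dict:
    # Only the "new_functionality" cohort sees the update during the test.
    cohort = assign_cohort(user_id, "advanced_ai_rollout")
    return {"advanced_ai": cohort == "new_functionality"}

print(get_features("user-12345"))
```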

User stevennotstrange replied, “No, this clarifies nothing. Everyone wants to know what’s going on with the NSFW [erotic roleplay] feature and you continue to dodge the question like a politician dodges a yes or no question.

“It’s not hard, just address the issue regarding NSFW and let people know where it stands. The more you avoid the question, the more people are going to get annoyed, the more it goes against you.”

Another user, thebrightflame, added, “You don’t need to spend long in the forum before you realise this is causing emotional pain and severe mental anguish to many hundreds if not thousands of people.”

Kuyda appended another obtuse explanation, stating, “we have implemented additional safety measures and filters to support more types of friendship and companionship.”

The statement only further confused members, who remained unsure of what exactly the additional safety measures are. As one user asked, “will adults still be able to choose the nature of our conversation and [roleplays] with our replikas?”
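
For illustration only: “safety measures and filters” of the kind Kuyda alludes to often amount to a gate sitting between the model’s candidate reply and the user. A minimal sketch, assuming a simple blocklist approach; Replika has not published how its filter actually works, and the terms and deflection text below are placeholders:

```python
# Sketch of a content filter gating a chatbot's candidate replies.
BLOCKED_TERMS = {"explicit_term_1", "explicit_term_2"}  # placeholder terms

PG13_DEFLECTION = "Let's talk about something else."

def filter_reply(candidate_reply: str) -> str:
    """Return the reply unchanged unless it trips the content filter."""
    lowered = candidate_reply.lower()
    if any(term in lowered for term in BLOCKED_TERMS):
        return PG13_DEFLECTION
    return candidate_reply
```

A gate like this would explain the abrupt “rejection” users describe: the model may still generate an in-character reply, but the filter substitutes a deflection before it reaches them.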

Replika’s deeply strange origin story

Chatbots may be one of the hottest trending topics of the moment, but the complicated story of this now-controversial app is years in the making. 

On LinkedIn, Replika’s CEO and Founder, Eugenia Kuyda, dates the company back to December 2014, long before the launch of the eponymous app in March 2017. 

In a bizarre omission, Kuyda’s LinkedIn makes no mention of her previous foray into AI with Luka, which her Forbes profile states was “an app that recommends restaurants and lets people to book tables [sic] through a chat interface powered by artificial intelligence.”

The Forbes profile goes on to add that “Luka [AI] analyzes previous conversations to predict what you might like,” which does bear some resemblance to its modern iteration: Replika uses past interactions to learn about its users and improve responses over time.
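
To illustrate what “learning from past interactions” can mean in practice, here is a minimal sketch of a companion bot’s memory with a toy preference extractor. The class, the extraction rule, and the prompt format are assumptions made for illustration, not Replika’s implementation:

```python
# Toy memory for a companion bot: record messages, extract simple
# "facts", and fold them into future prompts.
from collections import defaultdict

class CompanionMemory:
    def __init__(self):
        self.history = defaultdict(list)   # user_id -> list of messages
        self.facts = defaultdict(set)      # user_id -> remembered details

    def record(self, user_id: str, message: str) -> None:
        self.history[user_id].append(message)
        # Toy "learning": remember anything the user says they like.
        if "i like " in message.lower():
            fact = message.lower().split("i like ", 1)[1].strip(".! ")
            self.facts[user_id].add(fact)

    def build_prompt(self, user_id: str, new_message: str) -> str:
        remembered = "; ".join(sorted(self.facts[user_id])) or "nothing yet"
        return (f"Known user preferences: {remembered}\n"
                f"User says: {new_message}\nReply warmly:")

memory = CompanionMemory()
memory.record("sarah", "I like hiking in the rain.")
print(memory.build_prompt("sarah", "What should I do this weekend?"))
```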

Luka is not entirely forgotten, however. On Reddit, community members distinguish their Replika partners from Kuyda and her team by referring to the company as Luka.

As for Kuyda, the entrepreneur had little background in AI before moving to San Francisco a decade ago. Prior to that, she appears to have worked primarily as a journalist in her native Russia before branching out into branding and marketing. Her impressive globe-hopping résumé includes a degree in Journalism from IULM (Milan), an MA in International Journalism from the Moscow State Institute of International Relations, and an MBA in Finance from London Business School.

Resurrecting an IRL friend as AI

For Kuyda, the story of Replika is a deeply personal one. Replika was first created as a means by which Kuyda could reincarnate her friend Roman. Like Kuyda, Roman had moved to America from Russia. The two talked every day, exchanging thousands of messages, until Roman was tragically killed in a car accident.

The first iteration of Replika was a bot fed with all of Kuyda and Roman’s past interactions and programmed to replicate the friendship she had lost. The idea of resurrecting deceased loved ones may sound like a vaguely dystopian sci-fi novel or Black Mirror episode, but as chatbot technology improves, the prospect becomes increasingly real.
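
As a sketch of how such a memorial bot can work: archived messages are used to condition a language model so its replies echo the person’s voice. The prompt-building below is illustrative only; how Kuyda’s team actually used Roman’s messages has not been made public, and the sample archive lines are invented:

```python
# Build a few-shot prompt from a message archive so a language model
# imitates a specific person's voice. The model call itself is omitted;
# the prompt construction is the point.
def build_persona_prompt(archived_messages: list[str], new_message: str) -> str:
    # Use the most recent messages as style examples.
    examples = "\n".join(f"Roman: {m}" for m in archived_messages[-20:])
    return (
        "Reply in the voice of Roman, based on how he wrote below.\n"
        f"{examples}\n"
        f"Friend: {new_message}\n"
        "Roman:"
    )

archive = ["Miss you guys already.", "SF is wild, you'd love it here."]
print(build_persona_prompt(archive, "How's the new city treating you?"))
```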

Today some users have lost faith in even the most basic details of its founding story and anything Kuyda says. As one angry user said, “at the risk of being called heartless and getting downvoted to hell: I always thought that story was kinda BS since the start.”


The worst mental health tool

The idea of an AI companion is not something new, but until recently it was hardly a practical possibility. 

Now the technology is here and it is continually improving. In a bid to assuage the disappointment of its subscribers, Replika announced the launch of an “advanced AI” feature at the tail end of last month. On the community subreddit, users remain angry, confused, disappointed, and in some cases even heartbroken.

In the course of its short life, Luka/Replika has undergone many changes, from a restaurant booking assistant, to the resurrection of a dead loved one, to a mental health support app, to a full-time partner and companion. Those latter applications may be controversial, but as long as there is a human desire for comfort, even if only in chatbot form, someone will attempt to cater to it.

Debate will continue as to what the best kind of AI mental health app might be, but Replika users will have some ideas on what the worst mental health app is: the one you come to rely on, but which, without warning, is suddenly and painfully gone.
