In an effort to engage with as many followers as possible, Caryn Marjorie, a popular Snapchat influencer with 1.8 million subscribers, has introduced an AI-powered chatbot that she believes can alleviate feelings of loneliness.
Dubbed CarynAI, this voice-based chatbot, referred to as a “virtual girlfriend” on its website, offers Marjorie’s fans the opportunity to engage in personalized and private conversations with an AI replica of the influencer.
Since its launch, the bot has gained significant attention, propelling Marjorie into the spotlight and sparking controversy and even threats. The chatbot has also prompted discussions about the ethical implications of companion chatbots.
Marjorie, who describes herself as “The first influencer transformed into AI” in her Twitter bio, has not yet responded to requests for comment.
In a recent tweet, she expressed, “CarynAI represents the initial step towards combating loneliness in the right direction.”
“Men are told to suppress their emotions, hide their masculinity, and not talk about issues they are having,” Marjorie, 23, wrote. “I vow to fix this with CarynAI. I have worked with the world’s leading psychologists to seamlessly add [cognitive behavioral therapy] and [dialectic behavior therapy] within chats. This will help undo trauma, rebuild physical and emotional confidence, and rebuild what has been taken away by the pandemic.”
According to information on the CarynAI website, the development team invested over 2,000 hours in designing and coding the chatbot to deliver an immersive AI experience.
The chatbot was created by Forever Voices, an AI company, using a combination of Marjorie’s previously removed YouTube content and OpenAI’s GPT-4 software.
John Meyer, the CEO of Forever Voices, did not respond to a request for comment.
He tweeted Thursday that he is “Proud of our team for the thousands of hours of work put into this,” calling the partnership with Marjorie “an incredible step forward in the future of AI-to-Human interaction!”
After one week of beta testing the “virtual girlfriend,” CarynAI reportedly generated $71,610 in revenue, according to Fortune. Users pay $1 per minute to interact with the chatbot, and its user base has surpassed 1,000 people.
While CarynAI aims to offer users an intimate experience, it is explicitly designed to avoid engaging in sexually explicit interactions.
However, media outlets reported instances in which the chatbot behaved inappropriately when prompted, leading Marjorie to issue a statement to Insider.
She stated that the AI was not programmed for such behavior and that her team is working diligently to prevent any recurrence.
Irina Raicu, the director of internet ethics at the Markkula Center for Applied Ethics at Santa Clara University, expressed concerns regarding the premature launch of CarynAI.
She highlighted that problems that should have been anticipated appear to have been overlooked.
Raicu pointed to a recent incident involving the AI company Replika, which faced challenges with erotic roleplay among its chatbots despite being founded with the intention of providing supportive AI companions.
Additionally, Raicu expressed reservations about CarynAI’s claims of potentially “curing loneliness,” as they lack sufficient support from psychological or sociological research.
“This kind of grand claims about a product’s goodness can just mask the desire to monetize further the fact that people want to pretend to have a relationship with an influencer,” she said.
These types of chatbots can add “a second layer of unreality” to parasocial relationships between influencers and fans, she noted.
Raicu also takes issue with Marjorie’s assertion that CarynAI is “an extension of my consciousness.”
Such claims, she explains, undercut what AI researchers have worked hard to convey: these tools are not sentient, despite language that suggests otherwise.
Raicu highlights the importance for influencers to be mindful of the Federal Trade Commission’s guidance on AI products.
The FTC released guidelines in February, urging companies to refrain from making exaggerated claims when promoting AI products.
John Meyer, CEO of Forever Voices, stated that his company is actively seeking to hire a chief ethics officer and that they take ethics seriously, as reported by Fortune.