Lonely on Valentine’s Day? Artificial intelligence can help. At least, that’s what many companies hawking “romance” chatbots will tell you. But as your robot love story unfolds, you may not realize you’re making trade-offs. According to a new study from Mozilla’s *Privacy Not Included program, AI girlfriend and boyfriend chatbots collect shocking amounts of personal information, and almost all of them sell or share the data they collect.
“Frankly, AI girlfriends and boyfriends are not your friends,” Mozilla researcher Misha Rykov said in a press statement. “While they are promoted as enhancing your mental health and well-being, they specifically drive dependence, loneliness, and toxicity, all while prying as much data from you as possible.”
Mozilla dug into 11 different AI romance chatbots, including popular apps like Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI. Every one earned the *Privacy Not Included label, putting these chatbots among the worst product categories Mozilla has ever reviewed. The apps mentioned in this article did not immediately respond to requests for comment.
You’ve heard stories about data problems before, but according to Mozilla, AI girlfriends invade your privacy in “disturbing new ways.” For example, CrushOn.AI collects details including information about sexual health, substance use, and gender-affirming care. Ninety percent of the apps may sell or share user data for targeted advertising and other purposes, while more than half won’t let you delete the data they collect. Security is also a problem: only one app, Genesia AI Friend & Partner, meets Mozilla’s minimum security standards.
One of the more striking findings emerged when Mozilla counted the trackers in these apps, the small pieces of code that collect data and share it with other companies for advertising and other purposes. Mozilla found the AI girlfriend apps used an average of 2,663 trackers per minute, though that number was driven up by Romantic AI, which called up a whopping 24,354 trackers in just one minute of use.
The privacy mess is even more troubling because these apps actively encourage you to share details far more personal than you would enter in a typical app. EVA AI Chat Bot & Soulmate pushes users to “share all your secrets and desires” and specifically asks for photos and voice recordings. Notably, EVA was the only chatbot not criticized for how it uses that data, though the app did have security issues.
Data issues aside, the apps also make questionable claims about what they’re good for. EVA AI Chat Bot & Soulmate bills itself as “a provider of software and content developed to improve your mood and well-being.” Romantic AI says it’s “here to maintain your mental health.” But when you read the companies’ terms of service, they go to great lengths to distance themselves from those very claims. Romantic AI’s terms, for example, say it is “neither a provider of healthcare or medical Service nor providing medical care, mental health Service, or other professional Service.”
Given the history of these apps, that legal distance may prove important. Replika reportedly encouraged a man to attempt to assassinate the Queen of England. A Chai chatbot allegedly encouraged a user to end his own life.