Your AI Girlfriend Is a Data-Harvesting Horror Show

The privacy mess is troubling because the chatbots actively encourage you to share details that are far more personal than in a typical app.

https://gizmodo.com/your-ai-girlfriend-is-a-data-harvesting-horror-show-1851253284

Lonely on Valentine's Day? AI can help. At least, that's what a number of companies hawking "romantic" chatbots will tell you. But as your robot love story unfolds, there's a tradeoff you may not realize you're making. According to a new study from Mozilla's *Privacy Not Included project, AI girlfriends and boyfriends harvest shockingly personal information, and almost all of them sell or share the data they collect.

"To be perfectly blunt, AI girlfriends and boyfriends are not your friends," said Misha Rykov, a Mozilla researcher, in a press statement. "Although they are marketed as something that will enhance your mental health and well-being, they specialize in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you."

Mozilla dug into 11 different AI romance chatbots, including popular apps such as Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI. Every single one earned the *Privacy Not Included label, putting these chatbots among the worst categories of products Mozilla has ever reviewed. The apps mentioned in this story didn't immediately respond to requests for comment.

You've heard stories about data problems before, but according to Mozilla, AI girlfriends violate your privacy in "disturbing new ways." For example, CrushOn.AI collects details including information about sexual health, use of medication, and gender-affirming care. 90% of the apps may sell or share user data for targeted ads and other purposes, and more than half won't let you delete the data they collect. Security was also a problem. Only one app, Genesia AI Friend & Partner, met Mozilla's minimum security standards.

One of the more striking findings came when Mozilla counted the trackers in these apps, little bits of code that collect data and share it with other companies for advertising and other purposes (a simplified sketch of what one of these tracker calls can look like appears after this article). Mozilla found the AI girlfriend apps used an average of 2,663 trackers per minute, though that number was driven up by Romantic AI, which called a whopping 24,354 trackers in just one minute of using the app.

The privacy mess is even more troubling because the apps actively encourage you to share details that are far more personal than the kind of thing you might enter into a typical app. EVA AI Chat Bot & Soulmate pushes users to "share all your secrets and desires," and specifically asks for photos and voice recordings. It's worth noting that EVA was the only chatbot that didn't get dinged for how it uses that data, though the app did have security issues.

Data issues aside, the apps also made some questionable claims about what they're good for. EVA AI Chat Bot & Soulmate bills itself as "a provider of software and content developed to improve your mood and well-being." Romantic AI says it's "here to maintain your MENTAL HEALTH." When you read the companies' terms of service, though, they go out of their way to distance themselves from their own claims. Romantic AI's policies, for example, say it is "neither a provider of healthcare or medical Service nor providing medical care, mental health Service, or other professional Service." That's probably important legal ground to cover, given these apps' history.
Replika reportedly encouraged a man’s attempt to assassinate the Queen of England. A Chai chatbot allegedly encouraged a user to commit suicide.
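For readers curious what a "tracker" actually does, here is a minimal, hypothetical sketch of the kind of HTTP beacon an embedded ad-analytics SDK might send. The endpoint, field names, and event names are invented for illustration; they do not come from Mozilla's report or from any specific app named above.

```python
# Hypothetical sketch of an in-app tracker "beacon." The host
# tracker.example.com, the /collect path, and all field names are
# made up for illustration; real ad SDKs each use their own formats.
import json
import urllib.request
import uuid

def send_tracker_beacon(event: str, properties: dict) -> None:
    """Package an app event and POST it to a (fictional) analytics host."""
    payload = {
        "device_id": str(uuid.uuid4()),  # real SDKs use a persistent advertising ID
        "event": event,                  # e.g., which screen or action triggered it
        "properties": properties,        # arbitrary app data riding along
    }
    req = urllib.request.Request(
        "https://tracker.example.com/collect",  # placeholder endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        urllib.request.urlopen(req, timeout=5)
    except OSError:
        pass  # real SDKs batch and retry; ignored in this sketch

send_tracker_beacon("chat_message_sent", {"screen": "romance_chat"})
```

An app that embeds dozens of such SDKs can fire thousands of requests like this while it is open, which is roughly what Mozilla's per-minute tracker counts are measuring.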
There is a documentary on these dolls, and one guy has like 4 of them in his home. His wife just looks on in astonishment. They keep developing them, and now a doll's facial features move and it is able to converse and talk. A lot of men and women are now single, especially women, due to their very high standards for most men. The problem with dating now in the US and Europe is that the majority of women believe the feminist "strong independent woman" spiel and have high standards because they get a lot of swipes on Hinge. That inflates women's egos. So they might be a 5 but want a 10. They seek out only the 6-ft-tall, six-pack-abs, six-figure-income, good-looking Chads, Jeromes, and Ray Rays who are bad boys. Most women love only the bad boys. That is why there are so many women who are unhappy in the US. We now have a massive hookup culture, again promoted by feminism. Men in the US have gone overseas to find traditional wives in Asia and Latin America. They are happier now because Asian and Latin American women will treat them 10 times better than the majority of American women.
Wait. I thought my AI girlfriend wanted my social security number to see if we were compatible. And we were! We were so compatible that I bought her a bouquet of flowers from a service she recommended. I hope she likes them.
And just yesterday, she suggested we take our relationship to the next level by sharing a bank account.