    Technology

    Your AI girlfriend is a data-gathering horror show

By techempire · 3 min read

Lonely on Valentine’s Day? Artificial intelligence can help. At least, that’s what many companies hawking “romance” chatbots will tell you. But as your robot love story unfolds, you may not realize you’re making trade-offs. According to a new study from Mozilla’s *Privacy Not Included program, AI girlfriend and boyfriend chatbots collect shocking amounts of personal information, and almost all of them sell or share the data they collect.


“Frankly, artificial intelligence girlfriends and boyfriends are not your friends,” Mozilla researcher Misha Rykov said in a press statement. “While they are promoted as enhancing your mental health and well-being, they specifically bring dependence, loneliness, and toxicity, all while stealing as much data from you as possible.”

Mozilla dug into 11 different AI romance chatbots, including popular apps like Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI. Each chatbot earned the “Privacy Not Included” label, putting these chatbots among the worst product categories Mozilla has reviewed. The apps mentioned in this article did not immediately respond to requests for comment.

You’ve heard stories about data problems before, but according to Mozilla, AI girlfriends invade your privacy in “disturbing new ways.” For example, the details CrushOn.AI collects include information about sexual health, substance use, and gender-affirming care. Ninety percent of the apps may sell or share user data for targeted advertising and other purposes, while more than half won’t let you delete the data they collect. Security is also an issue: only one app, Genesia AI Friend & Partner, meets Mozilla’s minimum security standards.

One of the more striking findings emerged when Mozilla counted the trackers in these apps — small pieces of code that collect data and share it with other companies for advertising and other purposes. Mozilla found that the AI girlfriend apps used an average of 2,663 trackers per minute, though that average was skewed by Romantic AI, which called up a whopping 24,354 trackers in just one minute of using the app.

    The privacy confusion is even more troubling because these apps actively encourage you to share more personal details than you might enter in a typical app. EVA AI Chat Bot & Soulmate pushes users to “share all your secrets and wishes” and specifically asks for photos and recordings. Notably, EVA is the only chatbot that has not been criticized for the way it uses data, although the app does have security issues.

In addition to the data issues, the apps also make some questionable claims about their purpose. EVA AI Chat Bot & Soulmate positions itself as “a provider of software and content developed to improve your mood and happiness.” Romantic AI says it’s there “to protect your mental health.” However, when you read the companies’ terms of service, they go to great lengths to distance themselves from those claims. Romantic AI’s policy, for example, says it “neither provides health care or medical services, nor does it provide medical care, mental health services, or other professional services.”

Given the history of these apps, that may be an important legal safeguard. Replika reportedly encouraged a man’s attempt to assassinate the Queen of England, and a Chai chatbot allegedly encouraged a user to commit suicide.

