While artificial intelligence (AI) has made it easy for consumers to find recipes, write thank-you cards, or even do homework assignments, some chatbots have been designed specifically for building relationships.
These sites serve as a platform for consumers to build any kind of relationship – platonic, romantic, professional, etc. – with a chatbot.
Though having an outlet to vent or share things may seem harmless at the outset, Mozilla’s *Privacy Not Included guide took a deep dive into these platforms and found that they can actually be dangerous when it comes to consumers’ privacy and safety.
The company analyzed data from 11 of the most popular relationship chatbots and determined that none provided adequate levels of privacy, security, and safety for users.
“Today, we’re in the wild west of AI relationship chatbots,” said Jen Caltrider, director of *Privacy Not Included. “Their growth is exploding and the amount of personal information they need to pull from you to build romances, friendships, and sexy interactions is enormous.
“And yet, we have little insight into how these AI relationship models work. Users have almost zero control over them. And the app developers behind them can’t even build a website or draft a comprehensive privacy policy. That tells us they don’t put much emphasis on protecting and respecting their users’ privacy. This is creepy on a new AI-charged scale.”
Privacy and security are at risk
The data from this analysis will appear in *Privacy Not Included’s 2024 Valentine’s Day buyer’s guide. The goal is to open consumers’ eyes to the security and privacy risks that come with using these services.
For starters, Mozilla identified over 24,000 data trackers after using the Romantic AI app for just one minute. Once the app collects users’ data, that information can be shared with marketing companies, advertisers, social media platforms, and more.
Another security flaw that Mozilla discovered: 10 of the 11 chatbots didn’t require users to create strong passwords. This makes users’ accounts easier targets for hackers and scammers.
It’s also important to note that consumers have no control over how their data or personal information is used by these platforms. This opens the door for these chatbots to utilize and manipulate users’ personal information as they please, which comes with several privacy and security risks.
“One of the scariest things about AI relationship chatbots is the potential for manipulation of their users,” Caltrider said. “What is to stop bad actors from creating chatbots designed to get to know their soulmates and then using that relationship to manipulate those people to do terrible things, embrace frightening ideologies, or harm themselves or others? This is why we desperately need more transparency and user control in these AI apps.”
Photo Credit: Consumer Affairs News Department Images
Posted: 2024-02-14 12:14:45