Being single on Valentine’s Day can be depressing, but seeking comfort in conversations with an AI companion may be no better. Not only do these apps lack a personality, but their one true desire is your personal data.
Privacy experts at Surfshark found that four of the five most popular AI companion apps in the Apple App Store can track your personal data, potentially to profit from it.
“Instead of being there for us, they may feel more like surveillance tools,” said Surfshark’s cybersecurity expert, Miguel Fornés, pointing out how AI companion tracking can undermine users’ trust while invading their privacy.
AI companions: which are the most data-hungry?
The team at Surfshark carefully inspected the data collection practices of five AI companion services. The details were taken from the apps’ Apple App Store privacy listings and cover the number, type, and handling of the data types each app collects.
Among the analyzed apps – Kindroid, Nomi, Replika, EVA, and Character AI – 80% “can use data to track their users.”
Tracking, the experts explain, refers to linking user or device data collected from the app with user or device data collected from other apps and websites for targeted advertising purposes. Tracking also involves sharing user or device data with data brokers.
“Such detailed data can let companies influence your choices, with negative effects such as overwhelming ads, financial risks, or other unexpected problems,” said Surfshark’s cybersecurity expert.
Character AI was the service most in love with users’ data. While the apps collected an average of 9 of the 35 possible unique data types, Character AI towers over its competitors by collecting up to 15 of them. EVA was the second most data-hungry of the group, collecting 11 types. Worse, both of these apps collect users’ approximate location to deliver targeted ads.
Nomi was the only app to set itself apart by claiming not to collect data for tracking purposes.
The data collected by the services is not the only concern, however. App developers, Surfshark explains, could also access the data you willingly share during your conversations with the AI chatbot.
The danger here is that AI companion apps are designed to simulate human-like interactions such as friendship and romance. You may therefore be willing to share even more sensitive information than you would with ChatGPT-like chatbots.
“This can lead to unprecedented consequences, especially as AI regulation is only just emerging,” the experts note.
This is why Surfshark strongly advises taking some precautions when using AI companion services to keep your personal data secure and minimize the risk of misuse.
Fornés said, “Make sure you regularly check what permissions these apps have, and be aware of what information you share.”