The viral call-recording app Neon, which offered users cash for uploading call recordings to train AI models, has abruptly gone offline after a major security flaw exposed sensitive user data. According to TechCrunch, the bug allowed access to users’ phone numbers, call recordings, and transcripts without proper protections.
Neon quickly rose into the top five free iPhone apps after its launch, attracting thousands of downloads each day. But researchers discovered that Neon’s backend servers lacked basic access controls—meaning any authenticated user could pull another user’s call metadata, audio recordings, and full transcripts.
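The flaw as described is a classic broken object-level authorization bug: the server trusts whatever user ID the request names instead of checking it against the caller's own identity. The sketch below is purely illustrative and assumes nothing about Neon's actual backend; all names and data are hypothetical.

```python
# Hypothetical illustration of the flaw class described above
# (missing ownership check on a per-user resource). This is NOT
# Neon's code; every name and record here is made up.

RECORDINGS = {
    "user-1": {"phone": "+1-555-0100", "transcript": "..."},
    "user-2": {"phone": "+1-555-0199", "transcript": "..."},
}

def get_recording_vulnerable(caller_id, requested_id):
    # BUG: any authenticated caller can fetch any user's record,
    # because the requested ID is never compared to the caller's.
    return RECORDINGS.get(requested_id)

def get_recording_fixed(caller_id, requested_id):
    # FIX: verify the caller owns the resource before returning it.
    if caller_id != requested_id:
        return None  # a real API would return HTTP 403 here
    return RECORDINGS.get(requested_id)
```

With the vulnerable version, `get_recording_vulnerable("user-1", "user-2")` hands user-1 another user's phone number and transcript; the fixed version refuses the cross-user request.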
Using a network traffic analysis tool, TechCrunch confirmed that API responses from Neon’s servers included data belonging to other users. Once alerted, Neon’s founder Alex Kiam said the team had taken down its server infrastructure and temporarily removed the app while strengthening security. In an email to users, Kiam confirmed the shutdown but did not fully acknowledge how much data had been exposed.
The incident highlights ongoing concerns around privacy and app store vetting. Allowing a data-intensive app like Neon to reach mass adoption without robust safeguards raises serious questions about the security standards of platforms like the App Store.
If you downloaded Neon, check whether your phone number or recordings have surfaced online and monitor your accounts for unusual activity. The breach underscores the risks of trading personal data for rewards and the urgent need for stronger privacy protections in apps that handle sensitive information.