App Pays You For Your Phone Calls

Category: Software | Published: 2025-10-02

What Is Neon And Who Is Behind It?

Neon is a consumer app that pays users to record their phone calls and sells the anonymised data to artificial intelligence companies for use in training machine learning models. Marketed as a way to _“cash in”_ on phone data, it positions itself as a fairer alternative to tech firms that profit from user data without compensation. The app is operated by Neon Mobile, Inc., whose New York-based founder, Alex Kiam, is a former data broker who previously helped sell training data to AI developers.

Only Just Launched

The app launched in the United States in September 2025. According to app analytics tracking, Neon entered the U.S. App Store charts on 18 September, ranking 476th in the Social Networking category. Remarkably, by 25 September, it had climbed to the No. 2 spot in that category and reached the top 10 overall. On its peak day, it was downloaded more than 75,000 times. No official launch has yet taken place in the UK.

How Does The App Work?

Neon allows users to place phone calls using its in-app dialler, which routes audio through its servers. Calls made to other Neon users are recorded on both sides, while calls to non-users are recorded on one side only. Transcripts and recordings are then anonymised, with personal details such as names and phone numbers removed, before being sold to third parties. Neon says these include AI firms building voice assistants, transcription systems, and speech recognition tools.

Users are then paid in cash for the calls, credited to a linked account. The earnings model promises up to $30 per day, with 30 cents per minute for calls to other Neon users and lower rates for calls to non-users. Referral bonuses are also offered. While consumer data is routinely collected by many apps, Neon stands out because it offers direct financial incentives for the collection of real human speech, a form of data that is more intimate and sensitive than most.
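The payout structure described above can be sketched as a simple capped calculation. The $30 daily cap and the 30-cents-per-minute Neon-to-Neon rate come from the reporting; the lower non-user rate is not specified, so the figure below is a placeholder:

```python
# Sketch of Neon's reported payout model (illustrative only).
DAILY_CAP = 30.00      # reported maximum daily earnings, USD
RATE_NEON = 0.30       # USD per minute, calls to other Neon users
RATE_NON_NEON = 0.15   # hypothetical placeholder; actual rate unspecified

def daily_earnings(neon_minutes: float, non_neon_minutes: float) -> float:
    """Return daily earnings for the given call minutes, capped at $30."""
    gross = neon_minutes * RATE_NEON + non_neon_minutes * RATE_NON_NEON
    return min(gross, DAILY_CAP)

print(daily_earnings(60, 40))   # 60*0.30 + 40*0.15 = 24.0
print(daily_earnings(120, 0))   # 36.0 gross, capped to 30.0
```

Under this model, the cap is hit after 100 minutes of Neon-to-Neon calls in a day; every minute beyond that earns nothing.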

The Legal Language Behind The Data Deal

Neon’s terms of service give the company an unusually broad licence to use and resell recordings. This includes a worldwide, irrevocable, exclusive right to reproduce, host, modify, distribute, and create derivative works from user submissions. The licence is royalty-free, transferable, and allows for sublicensing through multiple tiers. Neon also claims full ownership of outputs created from user data, such as training models or audio derivatives. For most users, this means permanently giving up control over how their voice data may be reused, sold, or processed in future.

Why The App Took Off So Quickly

Neon’s rapid growth appears to have been driven by a combination of curiosity, novelty, and cash and referral-led incentives. Many users were drawn in by the promise of payment for something they do every day anyway: talking on the phone. The idea of monetising phone calls is also likely to have appealed particularly to users who are increasingly aware that their data is being collected and sold elsewhere.

Social media posts promoting referral links and earnings screenshots also appear to have helped fuel viral growth. At the same time, widespread interest in AI tools has normalised the idea of systems that listen, learn, and improve through exposure to large datasets.

What Went Wrong?

Shortly after Neon became one of the most downloaded apps in the U.S., independent analysis revealed a serious security flaw. The app’s backend was found to be exposing not only user recordings and transcripts but also associated metadata, including phone numbers, call durations, timestamps, and payment amounts. Audio files could be accessed via direct URLs without authentication, creating a significant privacy risk for anyone whose voice was captured.
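To illustrate the class of flaw involved (this is not Neon’s actual code), a recording served from a bare URL like `/recordings/12345.mp3` is readable by anyone who guesses or enumerates the ID. One common mitigation is an expiring, HMAC-signed URL, sketched below with a hypothetical signing key:

```python
# Illustrative sketch of expiring signed URLs, a common fix for
# unauthenticated direct-download links. Not Neon's actual backend.
import hashlib
import hmac
import time

SECRET = b"server-side-secret"  # hypothetical key, kept off the client

def sign_url(path: str, expires: int) -> str:
    """Append an expiry timestamp and an HMAC signature to a path."""
    msg = f"{path}?expires={expires}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{path}?expires={expires}&sig={sig}"

def verify(path: str, expires: int, sig: str, now: int) -> bool:
    """Reject expired links and links whose signature was tampered with."""
    if now > expires:
        return False
    msg = f"{path}?expires={expires}".encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

# A link that is only valid for the next five minutes:
url = sign_url("/recordings/12345.mp3", expires=int(time.time()) + 300)
```

Without a check of this kind (or session-based authentication), every stored recording is effectively public to anyone who can construct its URL.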

Neon’s response was to take the servers offline temporarily. In an email to users, the company said it was _“adding extra layers of security”_ to protect data. However, the email did not mention the specific details of the exposure or what user information had been compromised. The app itself remained listed in the App Store, but was no longer functional due to the server shutdown.

Legal And Ethical Concerns Around Recording

Neon’s approach raises a number of legal questions, particularly around consent and data protection. For example, in the United States, phone call recording laws differ by state. Some states require consent from all participants, while others allow one-party consent. By only recording one side of a call when the other participant is not a Neon user, the company appears to be trying to avoid falling foul of two-party consent laws. However, experts have questioned whether this distinction is sufficient, especially when metadata and transcript content may still reveal personal information about the other party.

In the UK, where GDPR rules apply, the bar for lawful processing of voice data is much higher. Call recordings here are considered personal data, and companies must have a lawful basis to record and process them. This could be consent, contractual necessity, legal obligation, or legitimate interest. In practice, UK organisations must be transparent, inform all parties at the start of a call, and apply strict safeguards around storage, retention, and third-party sharing. If the recording includes special category data, such as health or political views, the legal threshold is even higher.

Why The Terms May Create Future Risk

The app’s terms of service not only cover the use of call data for AI training, but also grant Neon the right to redistribute or modify that data without further input from the user. That includes the right to create and sell synthetic voice products based on recordings, or to allow third-party developers to embed user speech in new datasets. This means that, once the data is sold, users have no real practical way of tracking where it ends up, who uses it, or for what purpose. That includes the potential for misuse in deepfake technologies or other forms of AI-generated impersonation.

A Trust Issue For Neon?

The exposure of call data so early in the app’s lifecycle has, not surprisingly, caused a major trust issue. While the company has said it is fixing the security problem, it will now face much higher scrutiny from app platforms, data buyers, and regulators. If Neon wants to relaunch, it may need to undergo independent security audits, publish full transparency reports, and add explicit call recording notifications and consent features. Commercially, the setback may affect deals with AI firms if those companies decide to distance themselves from controversial datasets.

What About The AI Companies Using Voice Data?

For companies developing speech models, the incident highlights the importance of knowing exactly how training data has been sourced. For example, buyers of voice datasets will now need to ask more detailed questions about licensing, user consent, jurisdiction, and security. Any material flaw in the source of data can invalidate models downstream, especially if it leads to legal challenges or regulatory action. Data provenance and ethical sourcing are likely to become higher priorities in due diligence processes for commercial AI development.

Issues For Users

While Neon claims to anonymise data, voice recordings carry an inherent risk. For example, voice is increasingly used as a biometric identifier, and recorded speech can be used to train systems that replicate tone, mannerisms, and emotional expression. For individuals, this could lead to impersonation or fraud. For businesses, there is a separate concern: if employees use Neon to record work calls, they may be exposing client conversations and confidential business information without consent.