Mental health and productivity apps may be leaking your behavioral data

Productivity, wellness and focus apps are becoming increasingly popular. However, these tools, created for well-being, could also have a significant yet hidden data leak and privacy problem.

By Indrabati Lahiri

Published Jan 29, 2026

The use of productivity, wellness and focus apps has boomed in the last few years – a trend accelerated by the global shift to remote work, as well as a desire to digitally detox, minimise distractions and overcome social media addiction. 

However, these productivity and focus tools could also have a significant yet hidden data leak and privacy problem. They often collect and analyse far more personal data than is strictly required to perform their core functions, usually relying on uninformed user consent and feeding that data back into the same attention economy users are actively trying to escape. 

Every timer, blocked app, routine and habit entered into these tools can leave a trail of personal data that is quietly monitored. This data goldmine can then be shared with advertisers, employers and third-party analytics companies. 

The irony? This is exactly what social media giants such as Instagram, Facebook and TikTok, among others, do as well.

Recent data leakage incidents like the BetterHelp scandal have shed more light on this growing issue. 

“The reality with these apps is much darker. They didn’t lose the data. They sold it. The FTC found that BetterHelp wasn’t the victim of a breach; they were intentionally feeding private patient metadata to Facebook,” Joel Thomas Blackstock, clinical director of Taproot Therapy Collective, pointed out. 

“This allowed Facebook to target users with ads based on their specific mental health conditions. They monetized the mental illness of their users for profit.”

Mozilla Foundation’s ongoing research into mental health apps has also flagged apps including Abide, Mindshift CBT, Rainn, Headspace, BetterHelp and Cerebral as tools that track and share intimate user data such as moods, thoughts and biometrics with third parties and social media platforms for profit. 

Similarly, privacy concerns have been raised about productivity tools that major companies often require their staff to use, such as ChatGPT, Otter.ai and Time Doctor. 

So are focus- and productivity-geared apps really protecting their users better than social media? Or is it the same level of data collection, just disguised as self-care in the new wellness economy?

What’s driving the productivity and wellness boom? 

The COVID-19 pandemic was one of the key drivers of the global productivity boom in recent years, causing a surge in digital tools created to collaborate, manage tasks and maintain efficiency in remote work situations. 

It also led to increased mental health awareness, with wellness, therapy and mindfulness apps like Calm, BetterHelp and Headspace seeing major growth due to pandemic-induced stress, burnout and anxiety. 

The Global Wellness Institute estimates that the worldwide wellness economy will hit almost $9.8 trillion by 2029. 

This has supported the creation of the attention economy, where human attention, time, mental energy and focus are all seen as scarce, valuable assets to be tracked, managed and optimised – both by individuals themselves and by the companies harvesting, analysing and selling this data. 

Several productivity, focus and wellness tools now entice users with promises of “peak productivity”, while quietly gathering behavioral data and insights to sell downstream. 

Rising digital fatigue and growing interest in self-optimisation culture have further contributed to this shift, along with factors such as added gamification features and the integration of artificial intelligence into productivity tools. 

The personal data being collected without you realising

One of the most common types of data being collected by these tools is behavioral data, which is especially valuable in the new attention economy. This includes focus sessions, app usage, idle time, productivity scores, problem areas and much more. 

“When you wake up, when you're most and least productive, your social media habits, and even your emotional state are all inferred from your app usage,” Danie Strachan, senior privacy counsel at VeraSafe, pointed out. 

She added: “Imagine you hired a personal assistant who not only organizes your schedule but also secretly logs every conversation you have, every place you go, and every mood swing you experience throughout the day. You’d fire them immediately!”

Other key data, such as IP addresses, device identifiers, cross-device syncing information, location and time patterns, are also commonly collected. 

In workplace situations, companies can also require employees to install productivity trackers and agree to the collection of usage analytics – consent that can easily be overlooked in the chaos of onboarding at a new job. 

“In reality, the app logs keystrokes, captures active window titles, records idle time down to the second. This data feeds performance reviews, flags 'unproductive' employees and sometimes leaks through the same security holes we find in consumer apps,” Sergey Toshin, founder of Oversecured, noted. 

All this data, which is usually far more than is strictly required for core functions, can then potentially be shared with a huge and often unseen ecosystem of third parties. This includes data brokers, advertisers, analytics firms, employers, insurance companies, social media, associated product companies and others. 

“What does this mean in practice? A malicious app on the same device can read your 'focus time' logs and know exactly when you're distracted, stressed, or procrastinating. A data broker builds a profile showing you check Instagram 47 times a day during work hours,” Toshin explained. 

“That profile gets sold, ends up with insurers or employers. An attacker uses exposed API keys to pull usage data for thousands of users at once. Your 'digital well-being' app becomes a liability report on your habits,” he added. 

Why consent is a grey zone

Alarmed users may be wondering how these third parties even access all this data when they never consented to sharing it.

However, consent in the context of these productivity and wellness apps can be very murky, despite the implementation of privacy and data protection laws such as the GDPR and CCPA. 

One of the main disadvantages of free productivity and wellness apps is that they usually require a high level of access and broad default permissions, while in some cases making opting out needlessly complicated. 

App companies often operate with partial or non-existent transparency, and very few people thoroughly read privacy policies before rushing to consent to them. These policies may contain vague language about sharing data with “service providers” and “affiliates” for “research purposes”. 

“When users share their data with one of these apps, they’re implicitly trusting an entire supply chain they can't see or verify. It's like giving your house key to a contractor and finding out they've made copies for a dozen subcontractors you've never met,” Strachan noted. 

In workplace situations, consent is even more complicated, as it can be tied to mandatory tools, with refusal potentially jeopardising an employee’s position at the company. 

“The problem is that workplace consent isn’t truly voluntary. True consent must be freely given, but an employee cannot freely refuse a tool if their job depends on it. They’ll sign anything in order to land or keep a job,” Strachan explained. 

She added: “This creates a significant ethical problem. Legally, this form of consent is on very shaky ground under the GDPR.”

Many mindfulness, wellness and mental health apps also sidestep health regulations while still gathering enough habit and health data to build diagnostic profiles of their users. 

“From a clinical perspective, these apps are collecting unregulated health data. When an app tracks exactly when a user loses focus, how often they pick up their phone, and what sites they impulsively click, they are building a diagnostic profile for ADHD and executive dysfunction,” Blackstock pointed out. 

He added: “By labelling themselves as organisational tools rather than medical devices, they bypass HIPAA. This allows them to strip-mine behavioral health data that would be legally protected in a clinical setting and feed it into the algorithmic ad machinery.”

Is productivity being sold at the cost of privacy? 

While many productivity tools have undoubtedly helped people, especially younger generations, focus more and cut out the noise of social media and digital distractions, this growing data leak problem remains a serious concern. 

Users can attempt to protect themselves by choosing paid versions of apps and looking for tools that prioritise local storage over the cloud, both of which can improve data privacy. 

However, as the attention economy gathers speed, the real question remains: is privacy now going to be the true cost of focus, well-being and productivity? 
