Mental Health Month: Mozilla Data Shows Some Apps Fail To Protect Privacy

May is Mental Health Awareness Month. Mozilla released data Tuesday around mental health apps, citing survey results published last year that estimate 90% of people in the United States believe the country is facing a mental health crisis.

As the search for and demand for mental health services continue to rise, Mozilla’s latest *Privacy Not Included research reveals that mental health apps are failing to protect user privacy and security.

Fifty-nine percent of the top apps analyzed received a *Privacy Not Included warning label, and 40% have gotten worse over the past year, according to the research, timed to Mental Health Awareness Month.

Exposing shady data practices can motivate some tech companies to do better.

Nearly one-third of the apps analyzed made some improvements compared with their 2022 performance. PTSD Coach and the artificial intelligence (AI) chatbot Wysa received a Best Of citation from Mozilla, which uses a product page to spotlight apps that handle privacy and security well. Mozilla also maintains a comprehensive list of mental health apps, though it doesn’t endorse any of them.

More than 255 hours of research, including more than eight hours of research per product, went into creating the 2023 mental health app guide. During the past six years, Mozilla reviewed more than 100 apps and 300 internet-connected devices under the initiative. 

Several popular apps, including Youper and Woebot, made positive changes. Youper landed in the running for “most-improved app” by significantly strengthening both its password requirements and its privacy policy. Woebot updated its privacy policy to clarify that all users now have the same rights to access and delete their own data. Modern Health improved its policy’s transparency by stating clearly that it doesn’t “sell, disclose, and/or share personal information.”

Not all made positive changes. Mozilla named Replika: My AI Friend as one of the worst apps due to weak password requirements, sharing of personal data with advertisers, and recording of personal photos, videos, and voice and text messages consumers shared with the chatbot.

The app Cerebral was packed with trackers: researchers found about 799 within the first minute after download. Talkspace, Happify, and BetterHelp pushed consumers into taking questionnaires up front, without asking for consent or showing their privacy policies first.

The apps Mozilla investigates connect users with therapists, feature AI chatbots, run community support pages, and offer mood journals and well-being assessments, among other services. Despite dealing with depression, anxiety, suicidal thoughts, domestic violence, eating disorders, and PTSD, the worst of these apps routinely share data, target vulnerable users with personalized ads, allow weak passwords, and feature vague, poorly written privacy policies.
