Despite their sensitive content, therapy apps have some of the worst privacy protections, an investigation finds
Despite their warm fuzzy packaging and seemingly altruistic intentions, mental health and prayer apps are “worse than any other product” when it comes to user privacy and security, an analysis by browser firm Mozilla revealed on Monday.
“The vast majority of mental health and prayer apps are exceptionally creepy,” Mozilla’s Jen Caltrider, the primary creator of the firm’s “Privacy Not Included” guide, which evaluated 32 such apps on their respect for users’ personal data, told The Verge on Monday. Caltrider noted that the apps one might expect to recognize the sensitive nature of their data instead “track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data.”
Apps slurping up biometric data and other seemingly private information are nothing new, but given the assumption of privacy that comes with the real-life relationship between a therapist and patient, or a believer and religious institution, one might hope that app developers would at least attempt to replicate such safe spaces when taking the whole process online.
However, the Privacy Not Included guide found that, of the 32 apps reviewed, fully 29 earned a privacy warning label, meaning they either stored large amounts of personal data under vague privacy policies or otherwise maintained poor security practices, such as allowing weak passwords. One popular app, Talkspace, stores entire chat transcripts between user and therapist. Another, an AI therapy chatbot called Woebot, actually collects information about users from third parties and then shares it for advertising purposes, all while posing as their silicon shoulder to cry on.
Mozilla researcher Misha Rykov described the apps his team analyzed as “data-sucking machines with a mental health app veneer,” or “a wolf in sheep’s clothing.” With real-life therapy increasingly expensive and the process of finding a therapist who ‘matches’ a patient hit or miss, the lure of a friendly voice just a click away is difficult for some to resist. But given that these apps’ apparent core purpose is mining and selling users’ deepest, darkest secrets, it might be wiser to hold one’s tongue until one can meet with a qualified therapist, or at least a trusted friend – ideally in real life.