Report: Most Mental Health, Prayer Apps Have Abysmal Security And Privacy Standards
from the the-more-things-change dept
From the Internet of very broken things to telecom networks, the state of U.S. privacy and user security is arguably pathetic. It’s 2022 and we still don’t have even a basic privacy law for the Internet era, in large part because over-collection of data is too profitable to a wide swath of industries, which, in turn, lobby Congress to do either nothing, or the wrong thing.
Apps are routinely no exception. Mozilla’s latest *Privacy Not Included guide analyzed the privacy and security standards of 32 mental health and prayer apps, and gave 29 of them a “privacy not included” warning label indicating they failed to adhere to even basic user privacy standards:
“The vast majority of mental health and prayer apps are exceptionally creepy. They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data. Turns out, researching mental health apps is not good for your mental health, as it reveals how negligent and craven these companies can be with our most intimate personal information.”
The problems included the over-collection and sale of user data (including some mental health chat transcripts), poor password-creation standards, and nebulous, undercooked privacy policies. BetterHelp, Youper, Better Stop Suicide, Woebot, Talkspace, and Pray.com were deemed the worst offenders. Only three of the 32 app makers responded to a Mozilla request for comment.
The discovery shouldn’t be particularly surprising. Back in February, Politico revealed that a top suicide help hotline had been caught collecting and selling “anonymized” (a largely meaningless term) user data.
The U.S. isn’t known for quality mental health care, but online mental health apps and services are booming, with a particular focus on the sale of ketamine and psychedelics for therapeutic use. But many of these services have all the kinds of problems you might expect (shoddy therapy, incorrect doses) before you even get to the potential privacy problems that will ultimately and inevitably appear.
Again, abysmal federal security and privacy standards and feckless, under-resourced U.S. privacy regulators are an intentional feature, not a bug.
It’s not that difficult to pass a baseline privacy law for the Internet era that at least erects some basic guardrails and baseline accountability for bad actors and executives. But we have no such law because a huge array of industries have lobbied Congress into apathy and dysfunction, with the cost repeatedly borne by ordinary Americans.
It will keep happening until there’s a privacy and security scandal so idiotically ferocious that the problem will be impossible to ignore (probably involving either significant deaths, or the extremely sensitive and personal data of powerful people). Even then, there’s no guarantee a grotesquely corrupt U.S. Congress will be willing or able to respond competently to the challenge.