Suicide Hotline Collected, Monetized The Data Of Desperate People, Because Of Course It Did

from the money-matters dept

Another day, another privacy scandal that likely ends with nothing changing.

Crisis Text Line, one of the nation’s largest nonprofit support options for the suicidal, is in some hot water. A Politico report last week highlighted how the organization has been caught collecting and monetizing the data of the people who text it… to create and market customer service software. More specifically, Crisis Text Line says it “anonymizes” some user and interaction data (ranging from the frequency certain words are used, to the type of distress users are experiencing) and sells it to a for-profit partner named Loris.ai. Crisis Text Line has a minority stake in Loris.ai, and gets a cut of its revenue in exchange.

As we’ve seen in countless privacy scandals before this one, the idea that this data is “anonymized” is once again held up as some kind of get out of jail free card:

“Crisis Text Line says any data it shares with that company, Loris.ai, has been wholly ‘anonymized,’ stripped of any details that could be used to identify people who contacted the helpline in distress. Both entities say their goal is to improve the world; in Loris’ case, by making ‘customer support more human, empathetic, and scalable.’”

But as we’ve noted more times than I can count, “anonymized” is effectively a meaningless term in the privacy realm. Study after study after study has shown that it’s relatively trivial to identify a user’s “anonymized” footprint when that data is combined with a variety of other datasets. For a long time the press couldn’t be bothered to point this out, something that’s thankfully starting to change.
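
To make that concrete, here’s a minimal sketch of the kind of “linkage attack” those studies describe. It’s Python with entirely made-up records and field names (not anything from actual Crisis Text Line data): the “anonymized” rows carry no names, yet a couple of quasi-identifiers are all it takes to join them against an outside dataset that does.

    import pandas as pd

    # "Anonymized" records: no names, no phone numbers, just hypothetical
    # quasi-identifiers plus the sensitive detail we were promised was safe.
    anonymized = pd.DataFrame({
        "zip_code": ["90210", "10001", "60614"],
        "age": [34, 27, 41],
        "distress_type": ["self-harm", "panic", "grief"],
    })

    # An auxiliary dataset (voter rolls, a data-broker file, an old breach)
    # mapping the same quasi-identifiers back to real identities.
    auxiliary = pd.DataFrame({
        "zip_code": ["90210", "10001"],
        "age": [34, 27],
        "name": ["Alice Example", "Bob Example"],
    })

    # One join, and two of the three "anonymous" records have names again.
    reidentified = anonymized.merge(auxiliary, on=["zip_code", "age"])
    print(reidentified[["name", "distress_type"]])

Researchers have been demonstrating exactly this for decades; famously, ZIP code, birth date, and sex alone are enough to uniquely identify most Americans.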

Also, just like in most privacy scandals, the organization caught selling access to this data goes out of its way to portray it as something much different than it actually is. In this case, it’s acting as if it’s just being super altruistic:

“We view the relationship with Loris.ai as a valuable way to put more empathy into the world, while rigorously upholding our commitment to protecting the safety and anonymity of our texters,” Rodriguez wrote. He added that “sensitive data from conversations is not commercialized, full stop.”

Obviously there are layers of dysfunction that have helped normalize this kind of stupidity. One, it’s 2021 and we still don’t have even a basic privacy law for the internet era that sets out clear guidelines and imposes stiff penalties on negligent companies, nonprofits, and executives. And we don’t have a basic law not because it’s hard to write one (though writing any decent law certainly isn’t easy), but because a parade of large corporations, lobbyists, and revolving door regulators don’t want the data monetization party to suffer even a modest drop in revenues from the introduction of basic accountability, transparency, and empowered end users. It’s just boring old greed. There’s a lot of tap dancing that goes on to pretend that’s not the reason, but it doesn’t make it any less true.

We also don’t adequately fund mental health care in the States, forcing desperate people to reach out to startups that clearly don’t fully understand the scope of their responsibility. We also don’t adequately fund and resource our privacy regulators at agencies like the FTC. And even when the FTC does act (which it often can’t when it comes to nonprofits), the penalties and fines are often pathetic compared to the scale of the money being made.

Even before these problems are considered, you have to factor in that the entire adtech space reaches across industries from big tech to telecom, and is designed specifically to be a convoluted nightmare that makes oversight as difficult as possible. The end result is just about what you’d expect: a steady parade of scandals (like the other big scandal last week, in which gay/bi dating and Muslim prayer apps were caught selling user location data) that briefly generate a few headlines and furrowed brows without any meaningful change.

Companies: crisis text line, loris.ai


Comments on “Suicide Hotline Collected, Monetized The Data Of Desperate People, Because Of Course It Did”

32 Comments
This comment has been deemed insightful by the community.
That One Guy (profile) says:

There's indifference, there's evil, and then there's THIS...

One, it’s 2021 and we still don’t have even a basic privacy law

Typo or article just been sitting in the queue for a long time?

Moment of levity aside, it takes a truly spectacular type of scumbag to look at a suicide prevention hotline job and ask yourself ‘How can we not only monetize this, but do so in a way that has a very real chance of decreasing the willingness of people to contact us, potentially with lethal results?’, so it looks like, barely a month into the year, we’ve already got a two-for-one contender for ‘Biggest Asshole of 2022’.

Whatever greedy wastes of flesh were involved need to be publicly fired, with an announcement that such behavior is absolutely out of bounds, because as it stands they’ve just made clear that, as much as they might claim to care about the people who contact them, they have no problem exploiting those people for a quick buck.

That Anonymous Coward (profile) says:

Re: Re: There's indifference, there's evil, and then there's THIS...

It is a defense mechanism, not a poor word choice.

In the stream of horrible shit in the world, this company’s actions actually found something lower than we thought was possible.
My brain was trying to puzzle out what her next company would be, but even I couldn’t string together something like inflicting pain on small children to train an AI to tell if it’s a cry of pain or something else.

We run a free daycare!
Your child might come home with some scrapes and bruises, ignore those, it’s fine.
New from Nani.ai: smart devices that listen to make sure your nanny isn’t actually harming your child.

Samuel Abram (profile) says:

Re: There's indifference, there's evil, and then there's THIS...

It takes a truly spectacular type of scumbag to look at a suicide prevention hotline job and ask yourself ‘How can we not only monetize this, but do so in a way that has a very real chance of decreasing the willingness of people to contact us, potentially with lethal results?’, so it looks like, barely a month into the year, we’ve already got a two-for-one contender for ‘Biggest Asshole of 2022’.

I wouldn’t say whoever did such a thing is merely an "asshole"; what they really are is a monster.

Ninja says:

Re: There's indifference, there's evil, and then there's THIS...

March 2020 had a brief hiccup in the second half of 2021, when people got their vaccines and thought we could finally move on with our lives. But they forgot greed (African and other poor countries with almost no vaccination) and anti-vaxxers, so we are now stuck in the second half of 2021.

This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

O_O
I…just…fuck this.

"by making “customer support more human, empathetic, and scalable.""

We sell our product to Comcast so that we can tell when we’ve finally pushed the customer too hard & might end up being sued by a distraught family after we push them over the edge.

So let’s see…
Let’s cash in on people on one of the worst days of their lives.
Let’s record keywords and how many times they tell us they feel worthless & hopeless, and then SELL that data.
Let us TOTALLY betray any trust these people might have had in our desire to help them.
Did they keep me talking so they would have more data to sell?
(Look at what we KNOW they were doing & tell me you’ve never heard about phone bank workers milking customers on the phones.)
We’re here to help… as long as we can make a few bucks from your suffering.
You can trust us, we’re part owner of the company we sell your worst day to and boy this pandemic has made us rich!

This is one of those moments I regret not being the ELE required to cleanse the world of the human race in the past.

Once upon a time ‘bad guys’ had a code…
No women, no children.
They had respect for themselves.

Preying on suicidal people for a buck… they found a way to be worse than Prosperity Preachers.

Of course nothing will happen, except the next asshole doing this will keep it quiet longer & use more shell companies to distance themselves.

Loris.ai turning your suicidal thoughts into customer service.

I admit this is a moment when I wish I could still tweet, pretty sure the people I hung out with would be as horrified as I am & would have no problem finding every scrap of public information about the bastards who did this & making their lives real pleasant.

You thought Tucker Carlson’s advertisers fled quickly? Think how fast Loris.ai would be losing customers when a couple thousand people ask corporations if exploiting suicidal people to deliver better customer service was worth it.

You think the Wordle backlash was bad… you ain’t seen nothing yet.

That Anonymous Coward (profile) says:

I heart moderation…

Did we mention all of the counselors are volunteers?
30 hours of training & sometimes Nancy would offer suggestions like just listening to some Taylor Swift. o_o

Did we mention Nancy Lublin got run out of CTL?
Teen Vogue did.
google "nancy lublins crisis text line what happened"

Mentioning Weight Watchers to plump employees.
Running a Happiness Survey then suggesting those unhappy leave.

But it’s AI, and it trains CSRs how to handle hard calls.

something something lacking depth and warmth

This comment has been flagged by the community.

This comment has been deemed insightful by the community.
James Burkhardt (profile) says:

Re: Re:

I don’t care that it’s a joke. It’s deadly serious. Paranoia is a large driver of suicide, and fears over this approach are something the program has been working on for years. This revelation will completely undermine trust in the suicide prevention hotline program among groups that absolutely need it. Have a fucking flag.

Bruce C. says:

In the case of mental health...

Actually, there is a privacy law. This would seem to fall under HIPAA, as they are dealing with people who are suffering a mental health crisis. Just how the crisis line could get informed, rational consent to release health information from a person suffering a mental health crisis is questionable at best.

nasch (profile) says:

Re: In the case of mental health...

Just how the crisis line could get informed, rational consent to release health information from a person suffering a mental health crisis is questionable at best.

Suffering from a mental health crisis doesn’t indicate a lack of competence to consent to something. Such a person may or may not be able to give consent, and that is something that would have to be evaluated in each case, regardless of whether they called a hotline, walked into an emergency room, or whatever else. (Incidentally, refusing treatment also cannot be used to declare someone incompetent.)

Anonymous Coward says:

Re: In the case of mental health...

HIPAA is useless as a privacy law. I don’t think privacy was supposed to be a major element of HIPAA anyway. The HIPAA Privacy Rule doesn’t apply to employers, many government officials, the media, data brokers, FAANGM/MAMAA (Facebook, Google, etc.), sellers of wearables, and any manner of people/companies which should have no business with your medical information.

From the US Department of Health and Human Services:

The HIPAA Privacy Rule establishes national standards to protect individuals’ medical records and other individually identifiable health information (collectively defined as “protected health information”) and applies to health plans, health care clearinghouses, and those health care providers that conduct certain health care transactions electronically. [Emphasis added.]

Crisis Text Line is a non-profit organization which offers a texting service. It’s definitely not covered by HIPAA.

The US needs a privacy law which is stronger than the GDPR. The GDPR wasn’t enough. All it has done is increase the number of "consent to data collection" popups on websites. There’s no guarantee that the "reject" buttons do anything. The GDPR has also failed to stop Google’s deplorable real-time bidding (RTB).

Anonymous Coward says:

Re: Re: In the case of mental health...

HIPAA’s privacy provisions work just fine. People often misunderstand the scope and intent of the law. The law is, and only ever was, designed to cover healthcare providers and the associated admin infrastructure, and mostly deals with who, how, and when a party can access medical records. I can only speak for myself and what I’ve seen, but my healthcare organization was serious as a heart attack about protected health information.

Anonymous Coward says:

Re: Re: In the case of mental health...

The US needs a privacy law which is stronger than the GDPR. The GDPR wasn’t enough. All it has done is increase the number of "consent to data collection" popups on websites. There’s no guarantee that the "reject" buttons do anything.

No law can ever prevent anything. All the law can do is define rules and reparations for breaking those rules.

The only option is to never give the data to them in the first place. Period. Full stop. I still can’t figure out why, despite decades of everyone chanting "Only put it online if you are ready for everyone in the world to know about it," people still expect that putting crap on Facebook / YouTube / Twitter / Instagram / etc. is somehow exempt from the rule.

As for the compelled disclosure, read: mandatory surveillance, that has infected so much of the IT industry and everything else… that crap needs to be outlawed in its entirety, along with a mandatory minimum penalty of 10% of global revenue per instance for any company found doing it. This crap needs to die, and the only way that will happen is to poison the fountain of profits these companies have found for themselves.

The GDPR has also failed to stop Google’s deplorable real-time bidding (RTB).

The only option there is to install a JavaScript blocker. They can’t use your electricity and bandwidth if you deny them the CPU cycles they need to do it with. If they want to run the damn code, then they can pay for it themselves.

That Anonymous Coward (profile) says:

Re: In the case of mental health...

Hippo wouldn’t apply here even on a good day.
They are voluntarily contacting a volunteer, not an actual medical professional.
I’m sure buried in the fine print (which most people ignore on a good day, so we can’t fault someone in crisis for not checking it) is some obscure way of saying that maybe something might be used elsewhere, but we’ll totes keep your name out of it.

Anonymous Coward says:

The only way to achieve data anonymization

is to not collect data at all, or to collect as little data as possible to fulfill whatever service the client needs, and to purge that data as soon as possible. Without a federal general privacy law stronger than the GDPR, almost no company, organization, or government agency would do any of that voluntarily.
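
To illustrate, here’s a toy sketch of what "collect little, purge fast" could look like (Python; every name and number here is hypothetical, not any real service’s code):

    import time

    RETENTION_SECONDS = 3600  # hard cap: nothing lives longer than an hour

    # session_id -> (created_at, minimal_record)
    sessions = {}

    def start_session(session_id):
        # Keep only what routing the conversation requires: no transcript,
        # no phone number, no "distress keyword" counts to resell later.
        sessions[session_id] = (time.time(), {"active": True})

    def purge_expired(now=None):
        now = time.time() if now is None else now
        expired = [sid for sid, (created, _) in sessions.items()
                   if now - created > RETENTION_SECONDS]
        for sid in expired:
            del sessions[sid]

Nothing clever there, which is the point: data you never stored, and data you deleted on schedule, can’t be "anonymized" and sold later.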
