When Tech Companies Do It, The NY Times Calls It ‘Dark Patterns’; When The NY Times Does It, It’s Called ‘Being Smart’

from the seems-moderately-hypocritical dept

This post was inspired by a Benedict Evans tweet.

One of the many phrases that has become popular to an annoying degree over the last few years is the concept of “dark patterns.” These are, we’re told, the sneaky, ethically dubious ways in which companies — usually “big tech” — trick users into doing what the companies want. I’m not saying that companies don’t do sketchy stuff to try to make money. Lots of companies do. Indeed, we’ve spent decades calling out some pretty sketchy behavior by companies trying to get your money. But the phrase “dark patterns” carries such a sinister connotation, and it is now used in cases that, um, don’t even seem that bad (and, yes, I’ve used the term myself, once, but that was to describe specific behavior that was pretty clearly fraudulent).

Of course, the NY Times is among the media orgs that have really popularized the phrase. It wrote one of the earliest popular articles about the concept, and has called it out multiple times when talking about tech companies. One of those pieces is particularly notable because it was written by the NY Times’ own Greg Bensinger. He really, really doesn’t like “dark patterns” that manipulate users into… say, “signing up for things.”

These are examples of “dark patterns,” the techniques that companies use online to get consumers to sign up for things, keep subscriptions they might otherwise cancel or turn over more personal data. They come in countless variations: giant blinking sign-up buttons, hidden unsubscribe links, red X’s that actually open new pages, countdown timers and pre-checked options for marketing spam. Think of them as the digital equivalent of trying to cancel a gym membership.

He’s pretty sure we need legislation to take down dark patterns.

Companies can’t be expected to reform themselves; they use dark patterns because they work. And while no laws will be able to anticipate or prevent every type of dark pattern, lawmakers can begin to chip away at the imbalance between consumers and corporations by cracking down on these clearly deceptive practices.

Companies can’t be expected to reform themselves.

Anyway. That’s called foreshadowing.

Now, let’s talk about a new piece written by a data scientist at the NY Times, about “how the NY Times uses machine learning to make its paywall smarter.”

The company’s paywall strategy revolves around the concept of the subscription funnel (Figure 1). At the top of the funnel are unregistered users who do not yet have an account with The Times. Once they hit the meter limit for their unregistered status, they are shown a registration wall that blocks access and asks them to make an account with us, or to log in if they already have an account. Doing this gives them access to more free content and, since their activity is now linked to their registration ID, it allows us to better understand their current appetite for Times content. This user information is valuable for any machine learning application and powers the Dynamic Meter as well. Once registered users hit their meter limit, they are served a paywall with a subscription offer. It is this moment that the Dynamic Meter model controls. The model learns from the first-party engagement data of registered users and determines the appropriate meter limit in order to optimize for one or more business K.P.I.s (Key Performance Indicators).

Cool. (And, yes, it is legitimately cool to see how the company handles the meter in a more dynamic way — more companies should be similarly dynamic).
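For the curious, here’s a minimal sketch of the general idea behind a “dynamic meter.” To be clear, this is not the Times’ actual model; it’s an invented illustration in which a simple epsilon-greedy bandit learns which free-article limit converts registered users best. The candidate limits, the conversion-rate objective, and all the names are assumptions for the sake of the example.

```python
# Invented illustration only: a toy "dynamic meter" that serves each
# registered user some number of free articles, observes whether they
# subscribe, and gradually learns which meter limit converts best.
# None of this reflects the NY Times' actual implementation.
import random

CANDIDATE_LIMITS = [1, 3, 5, 10]  # hypothetical free-article limits
impressions = {limit: 0 for limit in CANDIDATE_LIMITS}
conversion_rate = {limit: 0.0 for limit in CANDIDATE_LIMITS}

def choose_meter_limit(epsilon: float = 0.1) -> int:
    """Mostly serve the best-performing limit; occasionally explore others."""
    if random.random() < epsilon:
        return random.choice(CANDIDATE_LIMITS)
    return max(CANDIDATE_LIMITS, key=lambda limit: conversion_rate[limit])

def record_outcome(limit: int, subscribed: bool) -> None:
    """Update the running conversion-rate estimate for the limit served."""
    impressions[limit] += 1
    previous = conversion_rate[limit]
    conversion_rate[limit] = previous + (float(subscribed) - previous) / impressions[limit]

# Usage: pick a limit when a registered user hits the decision point,
# then log whether that user went on to subscribe.
limit = choose_meter_limit()
record_outcome(limit, subscribed=False)
```

A production system presumably conditions on the first-party engagement data the writeup mentions (making it a contextual problem rather than one global bandit), but the core loop of choosing a limit, observing an outcome, and updating an estimate is the same.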

But, um, also, isn’t that… a dark pattern? I mean, it’s not clear to the end user. It’s designed to — and I’ll quote here — “get consumers to sign up for things, keep subscriptions they might otherwise cancel or turn over more personal data.”

The writeup by the data scientist is pretty clear about what they’re trying to do here:

Thus, the model must take an action that will affect a user’s behavior and influence the outcome, such as their subscription propensity and engagement with Times content.

I mean, this is all kind of interesting, but… it sure sounds like what the NY Times editorial side is complaining about as a dark pattern.

And that’s where some of the problem with the term comes into play. There’s a spectrum of behavior — some of which is just smart business and tech practices, and some of which is more nefarious. But using the term “dark patterns” to broadly describe anything we don’t understand, or can’t see, that is designed to get you to do something… becomes problematic pretty quickly. I don’t have a problem with the way the NY Times runs its paywall. It’s trying to make the paywall work in a reasonable way that converts more users into subscribers.

Clearly, this is not as nefarious as services that make it impossible to, say, cancel your subscription without first having to talk to a human (oh wait, the NY Times does that too?). Um, okay, it’s not as nefarious as literally tricking people into signing up for recurring payments rather than a one-time purchase. That’s even worse.

But it is still a spectrum. And when we refer to any optimization effort as a “dark pattern,” the genuinely problematic “dark patterns” lose their meaning. It becomes way too easy to smear perfectly reasonable efforts as somehow nefarious. It’s fine to talk about sketchy things that companies do, but we should be specific about what they are and why they’re sketchy, rather than just assuming anything designed to drive conversions is inherently problematic.

The fact that the NY Times’ tech/business side uses exactly the techniques the editorial side condemns, in order to pay the editorial side’s salaries, should at least inform the framing of some of this discussion.

Companies: ny times


Comments on “When Tech Companies Do It, The NY Times Calls It ‘Dark Patterns’; When The NY Times Does It, It’s Called ‘Being Smart’”

Kinetic Gothic says:

Here’s the thing: “dark patterns” isn’t just used for processes that manipulate behavior. If it were, every store offering members clubs or loss leaders would be engaging in dark patterns. It’s predominantly used for those that have an additional element of being deceptive as well, something they do mention in the quoted article. And here not only are the NYT’s paywalls fairly plain and straightforward, but THEY WROTE AN ARTICLE TELLING PEOPLE WHAT THEY DO

Mike Masnick (profile) says:

Re:

And here not only are the NYT’s paywalls fairly plain and straightforward, but THEY WROTE AN ARTICLE TELLING PEOPLE WHAT THEY DO

I don’t know about that. They don’t reveal how many free articles you will get to see before a paywall. That’s a secret that each individual will experience on their own. The fact that it is variable still strikes me as a dark pattern, one in which the system is secretly changing things based on the machine learning algorithm’s decision about what’s most likely to get you to subscribe.

How is that different from Amazon recommending what it thinks you’ll want to purchase or Google popping up ads?

PaulT (profile) says:

Re: Re: Re:

“Google doesn’t force me to log in to use it, so its adverts are untargeted and in Dutch.”

I use Google’s podcast app, not because it’s good but because it’s convenient and it’s “good enough” (I know I should shop around but I’m lazy, etc.). I’m in Spain, and sometimes when I listen to UK podcasts, the dynamic ads it inserts are in German. There’s no indication from anything I’ve ever done, apart from a week’s holiday in Berlin 10 years ago, that such a thing is relevant to me.

For all the panic about what they know about me and how they can control me, their core ad business not pushing something I can understand gives me hope.

Anonymous Coward says:

Re:

Yeah, that was noted in the article. What Mike didn’t note is that a lot of people, presumably including reporters, want to keep their jobs, and are therefore reluctant to criticize their employers too openly. So, they’ll write about “dark patterns” in a generic, abstract sense, and I guess not too many will attempt to write a headline like “Times hassles subscribers who wish to cancel; better regulation needed”. But you can bet that some of their employees feel exactly that way, and other people would be reluctant to work for the Times because of this. (Maybe there’s someone at the Times tracking “key performance indicators” of how many people have left the paper or turned down job offers for this reason. Maybe not, because why collect measurements that might make you look bad?)

Of course a data scientist is gonna be excited to talk about all the ways they use and examine data. It doesn’t imply anything about the employees that were not part of the decision to do that.

TKnarr (profile) says:

Not a dark pattern here

No, the NYT’s Dynamic Meter isn’t a “dark pattern”. A dark pattern is one designed and intended to obscure the choice being offered in order to entice the user into making a choice they wouldn’t make if it were offered plainly.

An example would be a service offering a free trial, but to get the free trial you have to provide payment information. Why? If it were a free trial, there’s no need for payment information because no payment is being asked for. If you read the fine print, the “free trial” will automatically convert into a paid subscription to the service unless the customer cancels before the free trial period is over. There will be no reminder sent before the conversion, only a notice that you’ve been charged afterwards. The service hopes that by the time the free trial expires most users will have forgotten about it and they’ll be able to start charging for the subscription. If users were presented with a clear choice, a notice at the end of the free trial that it was over and that if they wanted to continue using the service they’d have to pay for it, most users would turn it down. That is a dark pattern.

By contrast, the NYT offers a clear choice up front: if you want to see the content, you have to register. The amount of free content you get before being offered the choice varies based on what the Dynamic Meter thinks will best tempt customers to register, but there isn’t anything obscuring the choice being offered or registering users for taking actions they believe won’t result in registration.

PaulT (profile) says:

Re:

“If you read the fine print, the “free trial” will automatically convert into a paid subscription to the service unless the customer cancels before the free trial period is over.”

That’s just... standard, though? Who, in this day and age, signs up for something labelled as a trial without also knowing that you’ll get charged if you don’t cancel after the trial period? That’s how they got you with CDs in the 80s, so age isn’t an excuse.

btr1701 (profile) says:

Bypass

Thus, the model must take an action that will affect a user’s behavior and influence the outcome, such as their subscription propensity and engagement with Times content.

The Times model affects my behavior such that I know to quickly deactivate JavaScript before clicking on a Times link so that I can bypass all their ridiculous paywalls and dark patterns.
