Yet Again We Remind Policymakers That “Standard Technical Measures” Are No Miracle Solution For Anything

from the for-the-umpteenth-time-still-nope dept

I’m starting to lose count of how many regulatory proceedings there have been in the last 6 months or so to discuss “standard technical measures” in the copyright context. Doing policy work in this space is like living in a zombie movie version of “Groundhog Day” as we keep having to marshal resources to deal with this terrible idea that just won’t die.

The terrible idea? That there is some miracle technological solution that can magically address online copyright infringement (or any policy problem, really, but for now we’ll focus on how this idea keeps coming up in the copyright context). Because when policymakers talk about “standard technical measures” that’s what they mean: that there must be some sort of technical wizardry that can be imposed on online platforms to miraculously eliminate any somehow wrongful content that happens to be on their systems and services.

It’s a delusion that has its roots going back at least to the 1990s, when Congress wrote into the DMCA the requirement that platforms “accommodate and […] not interfere with standard technical measures” if they wanted to be eligible for its safe harbor protections against any potential liability for user infringements. Even back then Congress had no idea what such technologies would look like, and so it defined them in a vague way, as technologies of some sort “used by copyright owners to identify or protect copyrighted works [that] (A) have been developed pursuant to a broad consensus of copyright owners and service providers in an open, fair, voluntary, multi-industry standards process; (B) are available to any person on reasonable and nondiscriminatory terms; and (C) do not impose substantial costs on service providers or substantial burdens on their systems or networks.” Which is a description that even today, a quarter-century later, correlates to precisely zero technologies.

Because, as we pointed out in our previous filing in the previous policy study, there is no technology that could possibly meet all these requirements, even just on the fingerprinting front. And, as we pointed out in this filing, in this policy study, even if you could accurately identify copyrighted works online, no tool can possibly identify infringement. Infringement is an inherently contextual question, and there is no way to load up any sort of technical tool with the information it would need to correctly infer whether a work appearing online is infringing or not. As we explained, it is simply not going to know:

(a) whether there’s a valid copyright in the work at all (because even if such a tool could be fed information directly from Copyright Office records, registration is often granted presumptively, without necessarily testing whether the work is in fact eligible for a copyright at all, or that the party doing the registering is the party entitled to do it);

(b) whether, even if there is a valid copyright, it is one validly claimed by the party on whose behalf the tool is being used to identify the work(s);

(c) whether a copyrighted work appearing online is appearing online pursuant to a valid license (which the programmer of the tool may have no ability to even know about); or

(d) whether the work appearing online appears online as a fair use, which is the most contextual analysis of all and therefore the most impossible to pre-program with any accuracy – unless, of course, the tool is programmed to presume that it is.
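To make the gap concrete, here is a minimal, purely illustrative sketch (all the names are hypothetical, not a description of anyone's actual system): the first structure is the only thing a matching tool can produce on its own, while the legal conclusion turns on the second, which nothing in the upload itself supplies.

```python
from dataclasses import dataclass

@dataclass
class MatchResult:
    """All a fingerprinting tool can report: this upload resembles a reference file."""
    fingerprint_hit: bool

@dataclass
class LegalContext:
    """The context from (a)-(d) above, which no automated tool has access to."""
    copyright_is_valid: bool     # (a) is the work even eligible for copyright?
    claimant_owns_it: bool       # (b) does the party running the tool actually hold the rights?
    upload_is_licensed: bool     # (c) is the copy online under a license the tool knows nothing about?
    use_is_fair: bool            # (d) the most contextual question of all

def is_infringing(match: MatchResult, ctx: LegalContext) -> bool:
    """The actual legal question; ctx is exactly the input no filter can generate."""
    return (match.fingerprint_hit
            and ctx.copyright_is_valid
            and ctx.claimant_owns_it
            and not ctx.upload_is_licensed
            and not ctx.use_is_fair)
```

Any deployed filter has to hard-code assumptions for every field in that second structure, which is exactly the presumption problem described next.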

Because the problem with presuming that a fair use is not a fair use, or that a non-infringing work is infringing, is that proponents of these tools don’t just want to be able to deploy these tools to say “oh look, here’s some content that may be infringing.” They want those tools’ alerts to be taken as definitive discoveries of infringement that will force a response from the platforms to do something about them. And the only response that will satisfy these proponents is (at minimum) removal of this content (if not also removal of the user, or more) if the platforms want to have any hope of retaining their safe harbor protection. Furthermore, proponents want this removal to happen irrespective of whether the material is actually infringing or not, because they also want to have this happen without any proper adjudication of that question at all.

We already see the problem of platforms being forced to treat every allegation of infringement as presumptively valid, as an uncheckable flood of takedown notices keeps driving offline all sorts of expression that is actually lawful. What these inherently flawed technologies would do is turn that flood into an even greater tsunami, as platforms are forced to credit every allegation the tools automatically spew forth every time they find any instance of a work, no matter how inaccurate the resulting infringement conclusion actually is.

And that sort of law-caused censorship, forcing expression to be removed without there ever being any adjudication of whether the expression is indeed unlawful, deeply offends the First Amendment, as well as copyright law itself. After all, copyright is all about encouraging new creative expression (as well as the public’s access to it). But forcing platforms to respond to systems like these would be all about suppressing that expression, and an absolutely pointless thing for copyright law to command, whether in its current form as part of the DMCA or any of the new, equally dangerous updates proposed. And it’s a problem that will only get worse as long as anyone thinks that these technologies are any sort of miracle solution to any sort of problem.



Comments on “Yet Again We Remind Policymakers That “Standard Technical Measures” Are No Miracle Solution For Anything”

13 Comments
ECA (profile) says:

could YT be mean? YEP

Let's see.
A USA corp gets upset and files a DMCA claim.
Wellll, YT gets it and restricts USA distribution/showing of the vid.
COOL, everyone else can see it.
Other countries complain? What rights do they have? They are not in the USA. Even if it's the same Company over there. Only that country can complain??

Let's try this in court.
For every DMCA, YT can limit access. But those that WANT to see the movie WON'T get upset, and will allow it.
So, who has the power now?
Corps or Gov? Or the people?

This comment has been deemed insightful by the community.
That One Guy (profile) says:

The collateral damage is a feature, not a bug

The core problem I’d say is that those pushing for such measures either are indifferent to the collateral damage such a system would have or consider said damage a positive side-effect if not goal.

Add that on top of the overwhelming self-entitlement and ‘guilty until proven innocent’ mindset that infests all things copyright, and it’s unfortunately not at all surprising that you’d have maximalist people and groups acting as though they have a right to have every other industry bending over backwards for them, no matter the cost.

It’s still worthwhile to point out the damage that such systems and mindsets would cause, if only to prevent those pushing them from being able to control the narrative, but assuming honest ignorance is likely to just result in rude surprises and being on the back foot.

ECA (profile) says:

Re: This could be interesting

How much money are the Corps paying to keep things restricted and controlled?
Our corps have Spread around the world, and paid off enough Politicians who never figured those would be a REAL problem, but the USA keeps changing the TIME for holding Any rights.

That if the countries Don't let the Corps do their THING, the Trade agreements we have with them will be subject to cancellation.
(get that? TRADE AGREEMENTS)(we have had this Problem before.)

For all the money to Protect their rights, I would THINK, internationally, they are SPENDING TOO MUCH.

Anonymous Coward says:

One problem with filters is that they are, and will be, biased to support corporate publishers, and offer little to no protection to self publishers. This comes both from who is pushing the filters and from the volume difference between copyrights held by corporations and copyrights held by self publishers. The latter swamp the former by volume, and would require far more computing resources to protect than the corporate-held copyrights.

The other significant problem with strengthening the requirements for filters is that the legacy industry is asking for the impossible, possibly because they can use failures to eliminate safe harbors for sites that allow file uploads and self publishing. The threat to the legacy industry is not piracy, but rather the competition from self publishers. People watching YouTube are not watching cable and off-air TV, or a streaming service.

This comment has been deemed insightful by the community.
This comment has been deemed funny by the community.
Anonymous Coward says:

No.

Chairman: “No? But… we said the right words from the grimoire to perform the Subpoena! We arranged the chairs in the hearing room according to the principles of Feng Shui. We provided the Silver Microphone with the tasteful pentacle design. We even sacrificed a health-care bill on the Minority Whip’s altar! And you tell us you’re not going to grant our wish: ‘Standard Technological Measures’ to make our sponsor’s dreams of milk and honey come true?”

Speaker: True, your subpoena was flawless, your hearing room is appropriate, and the microphone is indeed tasteful. But had you summoned the Devil of the Details before calling upon me, that health care bill would still be alive in committee.

You have asked of me a task that is self-contradictory. You want all ‘copyrighted content’ blocked, yet you forbid solutions that would cause your children to wail, gnash their teeth, and blame you for their misfortune. You want to censor material, yet not have the law deemed ‘unconstitutional’. You offer to sell your souls to me for a solution, but your sponsors already hold a lien on them.

And one last thing, about that grimoire. Perhaps you wanted the next page over, to summon the Demon of Possible But Unsatisfying Technological Answers.

I, though, am the Demon of Logic. And it is my fate to be summoned to testify before you, and for you to ignore my words. Again. And again. And again.

This comment has been deemed insightful by the community.
That Anonymous Coward (profile) says:

“Standard Technical Measures”

These wouldn’t be a bad thing if not for…

Most of them aren’t based in reality
Most of them are things they imagine are possible without any understanding of reality
All of them are supposed to be built & paid for by anyone but them
All of them are supposed to be perfect 100% of the time
All of them are flawed & will be abused
None of them have any law actually preventing abuse
All of the penalties fall on platforms & users, and are abusive and unrealistic

If the Ice Block Cartel had as much pull as the Copyright Cartel has managed, we’d still be using ice boxes, the inventor of the fridge would be in prison, and it would be illegal to discuss refrigeration.

Perhaps it is time to stop coddling an industry that spends so much time chasing ‘lost dollars’ that they’ve never proven actually exist.
They treat consumers like crap, demand the world cater to them, & magically somehow make huge sums of money while claiming poverty.

Piracy at its core is a failure to give 2 shits about what consumers want.
It's not funding drugs, terrorism, or any other nefarious shadowy group.

It is far past time that so many industries in the nation stop being coddled & protected from competition, and be forced to focus on their customers.

Anonymous Coward says:

Right so. The copyright cultists are not going to win the War Against Piracy. The government thinks it is going to win? Has it not learned the lessons of the failed War Against Drugs? The War Against Drugs at least had moral legs to stand on. The War Against Piracy? Nothing moral about it. What are we fighting for but to serve greed? It will always be a “problem” until people finally realize the real problem lies with the cultists. Stop coddling them and you eliminate the “problem”. It's time to stop protecting legacy business models and let them go away, like ice boxes, in favor of more innovative and more society-friendly business models. It is time to dethrone industries and make the consumers the kings instead.

GHB (profile) says:

Same problem as the Scunthorpe problem

“context is everything”

– TorrentFreak article on Activision Blizzard demanding GitHub be like YouTube, or else it's encouraging piracy

What else requires context? Knowing if someone is saying a naughty word. Developing an algorithm to do this at a massive scale? Well, these happened.

You may be thinking “well we could just add additional checks, like additional letters around the flagged word, and make sure it doesn’t ignore spaces”. WRONG. People who are persistent about trying to sneak swear words through aren’t stupid, and will try alternative ways to bypass it. Like, if the filter only matches the word on its own, people can just write “youasshole” without being flagged.
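A minimal sketch of that tradeoff (the blocklist, strings, and function names are purely illustrative): match any substring and innocent words get flagged; match only standalone words and trivial concatenation slips through.

```python
import re

BLOCKLIST = ["ass"]  # toy example of a word a filter might flag

def substring_filter(text: str) -> bool:
    # Flags any text containing a blocked word anywhere (Scunthorpe-style false positives).
    return any(word in text.lower() for word in BLOCKLIST)

def whole_word_filter(text: str) -> bool:
    # Flags only standalone occurrences of a blocked word (trivially bypassed).
    return any(re.search(rf"\b{re.escape(word)}\b", text.lower()) for word in BLOCKLIST)

print(substring_filter("a classic assortment of cheeses"))  # True: innocent text gets flagged
print(whole_word_filter("youasshole"))                       # False: obvious abuse sails through
```

Either way the filter is wrong somewhere; the only question is which kind of error you prefer.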

If data can be manipulated or transformed in a way that it can be reverted back to its original form, then the very same thing is true for copyrighted material. This was already demonstrated when the famous 09 F9 key and youtube-dl were encoded as images after they were attacked with 1201 claims. Obfuscation and encryption are ways to trick the system into thinking these are different files. Archive formats like ZIP and 7zip have a feature to encrypt them with a password. Even without that, you can split your files into chunks and upload them separately, reverse the bytes, encode them, and so on, since everything digital is made up of 0s and 1s.
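As a minimal sketch of that point (the byte string and the exact SHA-256 hash are just stand-ins for whatever fingerprint a real filter computes, an assumption for illustration, not a description of any particular system), even a trivially reversible transformation makes the upload look like a different file:

```python
import hashlib

# Hypothetical stand-in for the bytes of a copyrighted work.
work = b"some copyrighted byte stream"

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint, the simplest thing a filter could compute."""
    return hashlib.sha256(data).hexdigest()

reversed_copy = work[::-1]                                   # reverse the bytes
chunks = [work[i:i + 8] for i in range(0, len(work), 8)]     # or split into pieces

print(fingerprint(reversed_copy) == fingerprint(work))           # False: looks like a different file
print(any(fingerprint(c) == fingerprint(work) for c in chunks))  # False: no chunk matches either
print(reversed_copy[::-1] == work and b"".join(chunks) == work)  # True: trivially restored by the recipient
```

Real matching systems are more sophisticated than an exact hash, but the underlying point stands: any filter keys off some representation of the file, and uploaders control the representation.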

The list of prohibited license plates in most US states is HUGE, with most of them being variations of curse words, phrases that sound like a swear word when pronounced, and so on.

This problem started in 1996 and is still happening even today (see several of Nintendo's kid-friendly games, like Mario Maker and the Pokémon games). All these recent examples indicate that this problem hasn't been solved.

I also really hate that “not interfere with standard technical measures” clause, because it means platforms aren’t even allowed to correct (as in, override) an auto-takedown on stuff they believe is legal, forcing all content on the internet to be “taken down first, ask questions later”. The current DMCA safe harbor regime is already “guilty until proven innocent”, but at least platforms may legally reject notices they deem invalid. A law that requires platforms to over-rely on, and be gullible about, technology to do the anti-piracy job? WTF.

The very worst-case scenario is when you go online, a search returns almost nothing but irrelevant, unrelated results, and several of the pages you visit are 404 errors or some other message telling you the content is blocked and inaccessible.

PaulT (profile) says:

Re: Re:

“Just to add, people will then use innocuous words to mean something else, or resort to dogwhistling.”

This is the main thing a lot of people tend to “forget” when claiming that something is easy to moderate – it all fails rather quickly when you account for human behaviour. People don’t like being blocked from doing something they want to do, so they will find ways to do it. You can’t show pictures of penises, so people post emojis of eggplants instead when discussing certain things. You could maybe start blocking that emoji and all references to eggplants, but you’re only going to cause a short delay until people come up with alternative euphemisms (and there are a lot of phallic symbols out there), with the only fixed end result being a lot of pissed off gardeners and cooks.

Same with copyright – YouTube has a lot of videos where people use tricks to try to bypass existing filters by messing with the aspect ratio, how the image is displayed, the framerate, etc. You can get a lot of stuff taken down, but you can't take everything down, and the harder you push the more collateral damage there will be.
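A minimal sketch of why that cat-and-mouse game works (a toy average-hash over a made-up 3x3 "frame" — an illustrative assumption, not how any real content-matching system is built): perceptual matching tolerates small re-encoding changes, but a simple transform like mirroring already breaks the match.

```python
def average_hash(pixels):
    # Toy perceptual hash: one bit per pixel, set if the pixel is brighter than the mean.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

frame = [[250, 240, 10], [230, 220, 20], [240, 250, 30]]         # hypothetical video frame
reencoded = [[min(p + 5, 255) for p in row] for row in frame]    # slight brightness shift from re-encoding
mirrored = [list(reversed(row)) for row in frame]                # simple display trick

print(hamming_distance(average_hash(frame), average_hash(reencoded)))  # 0: minor change still matches
print(hamming_distance(average_hash(frame), average_hash(mirrored)))   # 6: simple trick breaks the match
```

Make the matching more tolerant to catch the next trick and the false-positive rate goes up too, which is exactly the collateral damage being described.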

“These sorts of moderation calls have to be made by humans, not algorithms.”

Which goes back to the central problem here – the sheer amount of content uploaded to popular sites makes such a thing impossible to do with any sort of accuracy and consistency unless you're going to force people to wait weeks to have their content approved. Algorithms have to be involved at some point, and since such moderation depends on context, and there's no central repository of pre-approved works to query, even human moderation will have a margin of error.

Anonymous Coward says:

The problem is politicians don't understand technology; they follow what big corporations tell them to do.
You could take standard technical measures with the EU's magical filters, which will be able to examine every image, video clip, song, and audio program and tell if it's infringing.
Of course this will ignore fair use laws, or even the public domain.

Even if there was a central register of every TV program, film, video clip, and song ever made, we simply do not have the technology to do this without blocking a lot of legal content or content made by small creators.
Maybe we will see this disaster happen when the new EU laws go into force.
Another problem is it's likely only big corporations like TikTok, Facebook, and YouTube would be able to afford to build such large filters; small websites that maybe serve minorities might have to block all video and audio uploads.
The music industry solved this problem to a certain extent with the switchover to iTunes and music streaming services, which of course were invented by technicians and programmers from various tech companies.

If I was cynical I'd say the long-term goal is to force every media website to pay for music licences, maybe x dollars per user, or turn the web into some version of cable TV.

Any expert that looks at standard technical measures would say that even if we could do this, it would result in massive overblocking of legal content, which goes against the whole purpose of copyright, which is to promote the creation of art.

Anonymous Coward says:

Re:

If I was cynical I'd say the long-term goal is to force every media website to pay for music licences, maybe x dollars per user, or turn the web into some version of cable TV.

You are not being cynical enough. To the legacy copyright industries, self publishers and their supporting platforms are competition, and piracy is just an excuse to cripple self publishing.
