by Mike Masnick
Fri, Nov 7th 2014 7:39pm
Why We Can't Have Nice Things: Universal Music Takes Down Fun Mashup Of Taylor Swift's 'Shake It Off' And 1989 Aerobics Video
from the because-universal-music-sucks dept
As you may have heard, Taylor Swift recently came out with a new album, "1989," and she's at war with Spotify over it and making some statements about streaming royalties and such (which we've mostly been avoiding covering because this fight has gone on long enough already and it's silly and mostly misleading).
However, in the last day or so, someone tried to do the equivalent of the Soul Train/Get Lucky video above with Swift's song "Shake it Off." They matched the song to an aerobics competition video from (amusingly) 1989 -- and it worked quite well. It got lots of attention with the Huffington Post and Slate and others writing about it. So, I went to check out the video and got this instead:
It makes you wonder: what's the point here? Yes, Universal/Taylor Swift may have the legal right to pull the video down (though some could make a reasonable fair use argument), but it seems pretty futile. Here are people having fun with her music, doing something of their own free will to get it more attention (I hadn't heard the song at all before this), and then it gets pulled down, because copyright.
In many ways, this is the antithesis of how music worked for ages. Music was always about people sharing and building on the works of others. Someone would create a song, and others would take it, re-sing it, adapt it, change it, mix it up with other things. That's how culture works. But not so much in an era with strict copyright laws and automated takedown systems. What a shame.
by Tim Cushing
Fri, Nov 7th 2014 9:08am
Islamic Extremists Use YouTube's Automated Copyright Dispute Process To Access Critics' Personal Data
from the the-further-breaking-of-a-very-broken-system dept
YouTube's infringement reporting system is -- like many others around the web -- fundamentally broken. Making bogus copyright claims is still an easy way to get channels shut down or to siphon ad revenue from existing videos. It can also be used as a censor -- a cheap and dirty way to shut up critics or remove compromising video.
Apparently, Islamic extremists linked with Al-Qaeda have found another use for YouTube's mostly automated dispute process: low-effort doxxing. According to German news sites, a YouTube channel (Al Hayat TV) known for its criticism of Islam has had to send its listed contact person into hiding after bogus copyright claims filed by extremists led to the exposure of his personal information.
On September 25th, someone using the name "First Crist, Copyright" filed bogus copyright complaints against Al Hayat TV. In order to prevent the channel from being shut down for multiple "strikes," Al Hayat TV was forced to file a counter notification. But in order to do so, the channel operators had to expose sensitive information.
From the YouTube Help section on counter notifications:
After we receive your counter notification, we will forward it to the party who submitted the original claim of copyright infringement. Please note that when we forward the notice, it will include your personal information. By submitting a counter notification, you consent to having your information revealed in this way.

Some of the people behind the channel contacted YouTube and tried to explain the danger of releasing this personal information, especially considering a majority of its contributors operated anonymously for safety reasons. These pleas went unheeded, thanks to the automation of the copyright dispute process. Each request was greeted with pre-generated responses from YouTube support. Discussions with actual humans at YouTube only confirmed that the channel wouldn't be reinstated without following the counter notice procedure -- including handing over details on the channel's contact person.
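The asymmetry in that procedure -- a claim honored instantly and unverified, while a dispute requires handing over real personal information -- can be sketched as toy logic. Everything below (class names, the three-strike threshold, the returned dict) is a hypothetical illustration of the flow described above, not YouTube's actual implementation:

```python
from dataclasses import dataclass, field


@dataclass
class Claim:
    claimant: str   # free-text and unverified -- e.g. "First Crist, Copyright"
    video_id: str


@dataclass
class Channel:
    name: str
    strikes: int = 0
    live_videos: set = field(default_factory=set)


def process_claim(channel: Channel, claim: Claim) -> None:
    # The claim is honored immediately: no identity check on the claimant,
    # no review of the claim's merits before the content comes down.
    channel.live_videos.discard(claim.video_id)
    channel.strikes += 1
    if channel.strikes >= 3:
        channel.live_videos.clear()  # third strike: the whole channel goes dark


def file_counter_notice(channel: Channel, video_id: str,
                        name: str, address: str) -> dict:
    # Restoration comes at a price: the uploader's real name and address
    # are forwarded verbatim to whoever filed the original claim.
    channel.strikes = max(0, channel.strikes - 1)
    channel.live_videos.add(video_id)
    return {"forwarded_to_claimant": {"name": name, "address": address}}
```

Note the one-way cost structure: filing a claim requires nothing verifiable, while contesting one exposes exactly the information an anonymous critic most needs to protect.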
Unfortunately unaware of the fact that it could have used a legal representative to handle this, Al Hayat TV filed formal counter notices using one of its member's names. Shortly thereafter, it received threats from the supposed copyright holder warning the contact person to "watch your head" (a phrase basically understood to be a death threat in Arabic) and promising to spread this info across several extremist websites. The message also told the contact person to [paraphrased slightly] enjoy living in fear under police protection. The contact person has since gone into hiding.
The quid pro quo of the copyright dispute process netted Al Hayat TV death threats and a completely bogus "First Crist, Copyright" contact person: Samuel George of 245 George Street in Sydney, Australia. Google Street View shows this address to be right in the middle of some prime downtown shopping.
At this point, it would be beyond tedious to rehash the problems with these automated enforcement systems. But this story shows the system can be easily exploited to satisfy very twisted ends. YouTube's copyright dispute process is automated out of necessity. The fact that it instantly "sides" with the accuser contributes to the problem. Trying to sort out the legitimacy of copyright claims without chewing up thousands of man-hours would be a logistical nightmare and would quite possibly result in a system inferior to the irreversibly-broken one in place today. The unfortunate lesson to be drawn from this debacle is that those on the "inside" need to game the system as effectively as those on the "outside." If YouTube's going to treat copyright claims issued by "Crist" from the middle of the Establishment Bar in Sydney, Australia as wholly legitimate, Al Hayat TV should be shown the same disinterested "courtesy" and be allowed to issue a counter notice signed by an imaginary attorney residing at some random address. After all, if the dispute continues past this point, YouTube simply washes its hands of the entire situation and tells both parties to work it out themselves.
Copyright isn't really the culprit here. It's the systems that have been developed in response to rights holders' complaints. They're too easily gamed and offer little to nothing in the way of deterrents. Unfortunately, unlike incidents where copyright enforcement has been clumsily deployed as a censor, there's no Streisand Effect equivalent for those who greet speech they don't like with threats and violence. Extremists like this simply don't care what others think of their irrational hatred and colossally stupid worldview.
by Mike Masnick
Mon, Oct 27th 2014 9:02pm
from the good-to-see dept
Automattic's Wordpress.com and NameCheap were the only two companies to receive five out of five stars. However, two other companies were recognized for going the extra mile: Etsy, for providing educational guides, and Twitter, for publishing regular and thorough transparency reports. Overall, 10 companies did not publish adequate transparency reports, highlighting an information black hole for consumers. Additionally, four companies missed a star for their counter-notice practices—a critical procedure for restoring content that may have been taken down without cause.

Twitter lost a point for not documenting the counternotice process. Etsy lost a point for failing to have a transparency report (something I'm guessing the company will do before too long). Facebook also doesn't have a transparency report -- though it does have one for government requests, so hopefully it will expand that to copyright and trademark takedown requests as well. YouTube lost points for not requiring a DMCA notice (thank you ContentID) before taking down content. Imgur also doesn't require a DMCA notice (which surprised me).
The EFF's original "Who Has Your Back" effort really did help shame many companies into upping their game in protecting the privacy of users from government requests. Hopefully this new one will do the same for copyright and trademark takedowns.
by Mike Masnick
Wed, Oct 15th 2014 8:03am
YouTube Has Paid $1 Billion To Copyright Holders Via ContentID; What Happened To Stories About It Destroying Content?
from the curious... dept
However, reality is looking pretty different these days. A couple months ago, Businessweek had a big cover story about how YouTube has become Hollywood's "hit factory", and just this week, YouTube revealed that its ContentID program, which allows copyright holders to monetize unauthorized uses of their works, had paid out over $1 billion since its inception. This isn't to say there aren't problems with ContentID. We've noted in the past the problems with false flagging, revenue diversion and other issues -- but the simple fact is that it appears to be making money for content creators. Actually, quite a lot of money.
And this brings us back to a key point that we've hit on over and over and over again: given a chance to operate, these business models tend to come about without the need to pass draconian copyright laws and without the need to completely take down and destroy businesses. When allowed to thrive, innovate and experiment, it's only natural that workable business models develop. We've seen it over and over again in the industry. The recording industry insisted radio was going to kill the entire industry -- and then it made the industry into a massive juggernaut. The movie industry insisted that the VCR would be its "Boston Strangler," but four years later home video outpaced the box office in generating revenue for the studios.
The continuous claims of "Hollywood vs. Silicon Valley" on copyright issues are so clearly bogus. As we've argued for years, it's the innovations of the tech industry that keep saving the entertainment industry over and over and over again. There's no "war" between the two when it appears that Silicon Valley is the one supplying the "weapons" that are making Hollywood very, very wealthy.
But when will those folks in Hollywood learn this? Instead, they keep attacking these new services, demanding more copyright "enforcement" and blocking these forms of innovation. Who knows what other innovations might have occurred had the industry not shut down Veoh. Or Grokster. Before the US government completely shut down Megaupload, it was experimenting with new revenue models that were attracting the interest of lots of famous musicians. Imagine if that had been allowed to continue. Who knows what other kinds of cool business models would be in place today making more money for artists.
Attacking innovation seems to be the legacy entertainment industry's default position, no matter how many times that innovation actually opens up new markets, provides new revenue streams and makes pretty much everyone better off. Oh, except some of the gatekeepers. Those guys tend not to be able to keep quite as much of the revenue generated by these new platforms. And maybe, just maybe, that's the real reason they're so angry about innovation.
Fri, Sep 5th 2014 11:19am
from the getting-it-wrong dept
Following the horrific actions of ISIS/ISIL, in which the group beheaded American journalist James Foley and plastered the video in online forums like Twitter and YouTube, I argued that it is important that the American public be given the chance to repudiate the aim of the video: paralyzing us with fear. Adding to that thought, Glenn Greenwald argued that the reason one must fight against censorship in the most egregious of speech cases is that such cases are often where the limitation of speech is legitimized. While this may not be a First Amendment consideration, since those sites are not affiliated with the government, it would be a mistake to suggest that free speech is limited as a concept to that narrow legal definition. Free and open speech is an ideal, one that is codified into law in some places, and one which enjoys a more relaxed but important status within societal norms.
I can only assume it's a lack of understanding in both arguments above that has led one Forbes writer to rush to praise YouTube for taking down the latest ISIS/ISIL video. You've almost certainly heard that another American has been beheaded at the hands of civilization's enemy, yet you'll have a much harder time finding the video of Steven Sotloff's death on YouTube this time around. Jeff Bercovici suggests this is a good thing.
With 100 hours of new footage uploaded every minute, YouTube says it doesn’t, and couldn’t, prescreen content, relying on users to flag violations. In this case, its monitors were, unfortunately, expecting the Sotloff video to be posted after weeks of threats by his captors and a widely circulated video plea by his mother to spare his life. That readiness allowed them to remove the video and shut down the account that posted it within hours.

This is how you get an American public uninformed about the brutality of groups like ISIS/ISIL. It's how you legitimize terror groups who themselves wish to impose limitations on the types of things the people under their rule are allowed to see and do. It's the start of how the American public is refused the opportunity to witness the full story. And that last part is especially egregious in a time and place where images rule the news cycle. Here the public is, inundated with the story of an American journalist being murdered at the hands of a group that considers that public a target for violence, and the public isn't even given the opportunity to see the images at hand.
This, of course, isn't to argue that people should be forced to watch the brutality. But, as I argued before, denying the American people the opportunity to disabuse ISIS/ISIL of the notion that they can scare us into inaction is something we shouldn't stand for. YouTube can do this, but they shouldn't, and they certainly shouldn't be praised for it.
YouTube, on the other hand, has given itself more latitude to make judgement calls by basing its policies on common sense rather than First Amendment absolutism...For tech companies to embrace the principle of free expression is laudable — but they should also leave themselves the maneuverability to deal with bad actors who care nothing for that or any other civilized value.

This misunderstands the most important value of free speech: allowing the evil in the world to identify itself. Once we start down the road of disappearing the speech we deem to not have any value, we open the door for alternative interpretations of the value of a whole host of other speech. Censoring the bad actors doesn't make them go away; it only refuses to shine the public light on them. It keeps people from being able to confront the horrible reality that exists and the group that wants to do us harm. That can't be allowed to continue.
by Mike Masnick
Tue, Jul 22nd 2014 9:03am
from the because-that's-what-copyright-does dept
Jackson West attended one of the sessions and videotaped people protesting it at a seminar given by law firm Bornstein & Bornstein. You can see the video below via Vimeo:
In West's account (which is, obviously, just his side of the story), Bornstein doesn't seem to understand copyright laws:
...he began asking to meet in person in order to be “presented as human, multi-dimensional.” I pointed out that issuing a takedown notice without contacting me first didn’t really offer me that same benefit of the doubt. I asked if he’d actually watched the video, which he didn’t confirm but instead indicated that he’d objected to the characterization of the incident in the description, complained about other videos of the event (which can’t be found on YouTube, suggesting he may have issued additional claims) and asked to be sent a copy.

Just because you object to the "characterization" of the event, it doesn't magically give you the right to abuse copyright law.
Bornstein promised that if I agreed to meet he would consider dropping the matter, but when I made it clear that I reserved the right to publish a story before the meeting, he replied he’d then have to contact copyright counsel. While not directly stated, the implication was clear that if I agreed to hold the story until after meeting with him, he’d agree to drop the claim.

Later in the story, there's an "update" when West goes to meet with Bornstein. After a dispute about whether things are on or off the record, Bornstein trots out another non-copyright, but still bogus, reason for issuing the copyright takedown, claiming West is not "a legitimate reporter."
When I pointed out that a story was already online, along with the video, he rescinded the offer. However, seemingly confused over the difference between copyrights and privacy rights, he seemed intent on arguing that I wasn’t acting as a legitimate reporter for having attended the event and filmed the protest without notifying the firm first.

That doesn't really have anything to do with privacy rights either -- and even if it did, it still doesn't give Bornstein (a lawyer, remember) the right to abuse copyright law to take down the video.
Yet again, we see copyright being abused for the purpose of censoring content someone doesn't like.
Update: As noted in the comments, YouTube has put the video back up...
by Mike Masnick
Fri, Jul 18th 2014 12:16pm
from the disappearing-culture dept
by Mike Masnick
Tue, Jul 15th 2014 7:59am
from the fascinating dept
The Sunni Islamic State insurgents, now locked in a deadly struggle with Iraq’s Shiite majority, excel online. They command a plethora of official and unofficial channels on Facebook, Twitter, and YouTube. “And kill them wherever you find them,” commands one recent propaganda reel of firefights and bound hostages, contorting a passage from the Koran. “Take up arms, take up arms, O soldiers of the Islamic State. And fight, fight!” adds another, featuring a sermon from the group’s leader, Abu Bakr al-Baghdadi. The material is often slickly produced, like “The Clanging of Swords IV,” a glossy, feature-length film replete with slow-motion action scenes. Much of it is available in English, directly targeting the recruits with Western passports that have become one of the organization’s more dangerous assets. And almost all of it appeals to the young: Photoshops of Islamic State fighters and their grizzly massacres with video game-savvy captions like, “This is our Call of Duty.”

Of course, what Farrow ignores is that it's not at all difficult to find Americans using social media for similar calls to action. For example, how about a Fox News contributor announcing that "Muslims are evil. Let's kill them all." Or a Breitbart News contributor calling for people to "start slaughtering Muslims in the streets, all of them."
But officials at social media companies are leery of adjudicating what should be taken down and what should be left alone. “One person’s terrorist is another person’s freedom fighter,” one senior executive tells me on condition of anonymity. Making that call is “not something we’d want to do.”
I find both of those statements abhorrent, but the point is that idiots will make stupid incendiary statements on Twitter, Facebook and YouTube all the time -- and most people look at them and realize that they're ignorant crazy people talking. No one is actually incentivized to run out and follow those arguments. Yet Farrow seems to think that the people who follow those other groups on social media immediately accept what is said and follow through?
Just because people are saying stupid stuff on social media, doesn't mean internet companies should step in and decide what is and what is not appropriate. Where do you draw the line? Farrow breezily admits that it may be difficult to figure out what to take down and what to leave up, but... then just assumes it's kind of easy anyway... because child porn.
More troubling still is the fact that these companies already know how to police and remove content that violates other laws. Every major social media network employs algorithms that automatically detect and prevent the posting of child pornography. Many, including YouTube, use a similar technique to prevent copyrighted material from hitting the web. Why not, in those overt cases of beheading videos and calls for blood, employ a similar system?

See how limited types of censorship almost always lead to calls for greater and greater censorship? It's fairly amazing that an attorney, former State Department official and a reporter would so blatantly call for censorship, but that appears to be Farrow's bag. Besides, he's apparently rather clueless about why his call for censoring "terrorists" is so different from child porn (an absolute liability situation, where it's generally immediately obvious if something is illegal) and copyright (where the system is already quite problematic, and involves a detailed notice-and-takedown process that has massive dangerous unintended consequences). He also ignores the fact that all of these companies already do pull down extremist content (something many folks think already goes too far). Apparently, Farrow's not big on details.
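The distinction Farrow glosses over can be made concrete. Known-illegal files are caught by matching a fingerprint against a curated list -- a context-free, yes/no question. A minimal sketch follows, using a plain SHA-256 digest as a stand-in (real systems such as Microsoft's PhotoDNA use perceptual hashes that survive re-encoding and cropping; the blocklist contents here are invented for illustration):

```python
import hashlib


def fingerprint(data: bytes) -> str:
    # Stand-in for a perceptual hash: an exact content digest.
    return hashlib.sha256(data).hexdigest()


def matches_known_file(data: bytes, blocklist: set) -> bool:
    # This answers only "is this a file already on the list?" It cannot
    # answer "is this terrorism?" -- the same beheading footage is
    # propaganda in one upload and war-crimes evidence in a journalist's
    # report, and no digest encodes that difference in context.
    return fingerprint(data) in blocklist


# A hypothetical blocklist, which in practice would be populated from a
# trusted clearinghouse of known-illegal material:
blocklist = {fingerprint(b"known illegal file bytes")}
```

Hash matching works for child porn precisely because the illegality lives in the file itself; "extremist content" is a judgment about speaker and context, which is exactly what this class of system cannot compute.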
Farrow does mention Section 230 of the CDA, but apparently is ignorant of how that law actually works as well:
As always, beneath legitimate practical and ethical concerns, there is a question about the bottom line. Section 230 of the Telecom Act of 1996 inoculates these companies from responsibility for content that users post—as long as they don’t know about it. Individuals involved in content removal policies at the major social media companies, speaking to me on condition of anonymity, say that’s a driving factor in their thinking. “We can’t police any content ourselves,” one explains. Adds another: “The second we get into reviewing any content ourselves, record labels say, ‘You should be reviewing all videos for copyright violations, too.’”

First of all, this is wrong. The "as long as they don't know about it" is flat out wrong. Section 230 actually is explicit that if you do know about it, it's entirely the company's discretion whether or not to remove. If they do, that imposes no additional obligations on them to remove other content. However, the final comment is more accurate -- though, amusingly, it contradicts Farrow's own earlier statement about how these companies already know how to stop copyright-covered content from appearing.
The point is that determining who is and who is not a "terrorist" isn't so easy, and that slope is very slippery. Should those Fox News and Breitbart contributors be cut off as well for their "terroristic" threats? Remember that after then-Senator Joe Lieberman went on a similar crusade to get YouTube to take down "terrorist" videos, it resulted in YouTube disabling the YouTube channel of an important Syrian watchdog group that had been unveiling atrocities in that country.
Farrow keeps going back to the genocide in Rwanda to prove his point. But under his logic, anyone documenting that genocide and getting the news out to the world would likely be censored, allowing that kind of genocide to go on.
Yes, if you think simplistically about things, it must seem so easy to just say, "Well, censor the bad guys." But you'd think that someone with Farrow's training and background would actually know that simplistic solutions to challenging and nuanced questions often result in very dangerous policies with serious unintended consequences.
by Tim Cushing
Mon, Jul 14th 2014 2:14pm
from the please-file-counter-notice-in-nearest-recycle-bin dept
As noted recently, Soundcloud has given Universal Music Group the power to directly take down content, bypassing the site's internal takedown process as well as the few remedies it offers users who wish to dispute deletions. YouTube has also given UMG this same level of access, again bypassing the normal notice and takedown system.
While sites may claim (as Soundcloud did) that they need to give powerful rights holders direct access in order to comply with the DMCA, this simply isn't true. Utilizing the normal notice-and-takedown system would be more than adequate. The only thing giving a label direct access does is increase the amount of abuse.
It's long been noted that the DMCA takedown process tends to encourage rights holders to disregard fair use and fire off notices. The toothless "perjury" language included at the bottom of every takedown notice is almost never enforced, making false or bad claims painless for those sending takedowns.
UMG, with its direct access, certainly isn't going to consider fair use when it starts pulling the plug on content, as one YouTube remix artist discovered. Elisa Kreisinger has been remixing cultural touchstones for years. This video -- one that never even made it past YouTube's upload process -- was no different.
Last August, I saw Jay Z’s HBO mini-documentary commemorating his 6-hour “unprecedented performance experience” of “Picasso Baby” at Pace Gallery. The movie was ripe for remixing: Within the first minute, Jay Z reflects on the similarities between performance art and his usual concert performances, arguing that art galleries have separated art from mass culture, presumably unintentionally.

The reason was simply this: where fair use could be reasonably argued, UMG (and its bots/lawyers) saw nothing more than two of its videos being "stolen."
With one simple twist, I recontextualized select scenes of his performance and set it to Taylor Swift’s “22." The remix illustrates how both Jay Z and Swift use their status as outsiders to relate to audiences despite being very much insiders. [...] I uploaded the mashup to YouTube on Aug. 5, 2013, and it was immediately blocked globally. Ten months later, I finally uncovered the reason.
It turns out no defense would have revived my video. YouTube had cut a private deal that gave Universal Music Group the power to take down any video, even those videos (like mine and countless others, including creators as diverse as Patrick McKay and Megaupload) that didn’t require Universal’s permission in the first place.

The only plus side was that UMG's deletion didn't result in a strike against Kreisinger's account. But that's of little comfort when fair use is steamrolled by "contractual obligations" YouTube (or Soundcloud) really doesn't need to have in place to stay compliant with the DMCA.
When fair use gets damaged, so does free speech.
Fair use prevents rightsholders from silencing critics with the threat of a copyright infringement lawsuit. By giving UMG the ability to take down videos that use their content regardless of fair use, YouTube has given UMG sweeping power to control what is – and is not – said about UMG and UMG artists. UMG should not be asking for this kind of power, and YouTube should not be granting it.

Labels and copyright industry lobbyists love to call these sorts of deals "voluntary." But they aren't. They're coercive. Lobbyists lean on politicians and politicians lean on internet services to do more to help out struggling, billion-dollar industries. Rather than see IP laws get any worse (or target them any more specifically), the services comply with increasingly ridiculous demands. And the users pay the price by having their uploads deleted, their accounts closed and their mouths shut -- all without any genuine level of recourse.