Who should be directly liable for online infringement – the entity that serves it up or a user who embeds a link to it? For almost two decades, most U.S. courts have held that the former is responsible, applying a rule called the server test. Under the server test, whoever controls the server that hosts a copyrighted work—and therefore determines who has access to what and how—can be directly liable if that content turns out to be infringing. Anyone else who merely links to it can be secondarily liable in some circumstances (for example, if that third party promotes the infringement), but isn’t otherwise on the hook.
The test just makes sense. In the analog world, a person is free to tell others where they may view a third party’s display of a copyrighted work, without being directly liable for infringement if that display turns out to be unlawful. The server test is the straightforward application of the same principle in the online context. A user who links to a picture, video, or article isn’t in charge of transmitting that content to the world, nor are they in a good position to know whether that content violates copyright. In fact, the user doesn’t even control what’s located on the other end of the link—the person who controls the server can change what’s on it at any time, such as swapping in different images, re-editing a video, or rewriting an article.
But a news publisher, Emmerich Newspapers, wants the Fifth Circuit to reject the server test, arguing that the entity that embeds links to the content is responsible for “displaying” it and, therefore, can be directly liable if the content turns out to be infringing. If they are right, the common act of embedding is a legally fraught activity and a trap for the unwary.
The Court should decline, or risk destabilizing fundamental, and useful, online activities. As we explain in an amicus brief filed with several public interest and trade organizations, linking and embedding are not unusual, nefarious, or misleading practices. Rather, the ability to embed external content and code is a crucial design feature of internet architecture, responsible for many of the internet’s most useful functions. Millions of websites—including EFF’s—embed external content or code for everything from selecting fonts and streaming music to providing services like customer support and legal compliance. The server test provides legal certainty for internet users by assigning primary responsibility to the person with the best ability to prevent infringement. Emmerich’s approach, by contrast, invites legal chaos.
Emmerich also claims that altering a URL violates the Digital Millennium Copyright Act’s prohibition on changing or deleting copyright management information. If they are correct, using a link shortener could put users at risk of statutory penalties—an outcome Congress surely did not intend.
Both of these theories would make common internet activities legally risky and undermine copyright’s Constitutional purpose: to promote the creation of and access to knowledge. The district court recognized as much and we hope the appeals court agrees.
There have been some ongoing debates (going back many years) in the copyright space regarding whether or not embedding infringing content into a website could be infringing in and of itself. If you understand what’s happening technically, this seems ludicrous. An embed is basically the same thing as a link. And merely linking to infringing content is unlikely to be infringing itself. All embedding is really doing is taking a link, and showing the content from that link. If embedding were found to be infringing, then there’s an argument that linking is infringing, and (as we’re seeing with various link tax proposals) that would break a fundamental part of how the internet works.
Last year we discussed a case that was on appeal to the 9th Circuit, that asked a slightly different question: could a company providing embeddable content (in this case Instagram) be held liable for providing embedding tools that then allowed others to embed content from that website elsewhere. In this case, some photographers argued that by providing tools (i.e., a tiny snippet of code that basically says “show this content at this link”) for embedding, Instagram was unfairly distributing works without a license. Specifically, the photographers were upset that works that they uploaded to Instagram were showing up in news articles after the media orgs used Instagram’s embed tool to embed the original (non-infringing) images. The lower court had (thankfully) rejected that argument.
The 9th Circuit has now upheld that lower court ruling, protecting some important elements of the ability to offer and use embed codes. At issue in this case, really, was yet another attempt to stretch the already problematic Aereo copyright test, which we’ve described as the “looks like a duck” test (ignore the technical details and decide “this looks like something else that is infringing, therefore we will assume this is infringing, no matter what the underlying details show”). The argument here was that because embeds look like locally hosted content, courts should treat them as if they were locally hosted content.
Thankfully, the court rejects that line of argument, and makes it clear that the important Perfect 10 copyright case, which focused on who was actually hosting the material, was not overruled by Aereo.
This copyright dispute tests the limits of our holding in Perfect 10 v. Amazon, 508 F.3d 1146 (9th Cir. 2007) in light of the Supreme Court’s subsequent decision in American Broadcasting Companies, Inc. v. Aereo, 573 U.S. 431 (2014). Plaintiffs-appellees Alexis Hunley and Matthew Scott Brauer (collectively “Hunley”) are photographers who sued defendant Instagram for copyright infringement. Hunley alleges that Instagram violates their exclusive display right by permitting third-party sites to embed the photographers’ Instagram content. See 17 U.S.C. § 106(5). The district court held that Instagram could not be liable for secondary infringement because embedding a photo does not “display a copy” of the underlying images under Perfect 10.
We agree with the district court that Perfect 10 forecloses relief in this case. Accordingly, we affirm.
I’m actually somewhat impressed that the court’s discussion of embedding remote content vs. hosting local content is… pretty clear, correct, and understandable.
When a web creator wants to include an image on a website, the web creator will write HTML instructions that direct the user’s web browser to retrieve the image from a specific location on a server and display it according to the website’s formatting requirements. When the image is located on the same server as the website, the HTML will include the file name of that image. So for example, if the National Parks Service wants to display a photo of Joshua Tree National Park located on its own server, it will write HTML instructions directing the browser to display the image file, and the browser will retrieve and display the photo, hosted by the NPS server. By contrast, if an external website wants to include an image that is not located on its own servers, it will use HTML instructions to “embed” the image from another website’s server. To do so, the embedding website creator will use HTML instructions directing the browser to retrieve and display an image from an outside website rather than an image file. So if the embedding website wants to show the National Park Service’s Instagram post featuring Joshua Tree National Park—content that is not on the embedding website’s same server—it will direct the browser to retrieve and display content from Instagram’s server.
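The two scenarios the court describes can be sketched in HTML (the file path and URL below are hypothetical, for illustration only):

```html
<!-- Local image: the file lives on the same server as the page,
     so only a file path is needed -->
<img src="/images/joshua-tree.jpg" alt="Joshua Tree National Park">

<!-- Embedded image: the browser is told to fetch the file from
     another site's server via a full URL -->
<img src="https://photos.example.com/joshua-tree.jpg"
     alt="Joshua Tree National Park">
```

Either way, it is the reader’s browser that fetches and renders the image; the only difference is which server supplies it.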
It even includes an example of a full Instagram embed, which is not something you normally see in a judicial ruling.
The court does say that hyperlinking is different from embedding (though I’d argue it’s not really), but at least the court is clear that embedded content is not hosted by the website using the embed code:
As illustrated by the HTML instructions above, embedding is different from merely providing a hyperlink. Hyperlinking gives the URL address where external content is located directly to a user. To access that content, the user must click on the URL to open the linked website in its entirety. By contrast, embedding provides instructions to the browser, and the browser automatically retrieves and shows the content from the host website in the format specified by the embedding website. Embedding therefore allows users to see the content itself—not merely the address—on the embedding website without navigating away from the site. Courts have generally held that hyperlinking does not constitute direct infringement. See, e.g., Online Pol’y Grp. v. Diebold, Inc., 337 F. Supp. 2d 1195, 1202 n.12 (N.D. Cal. 2004) (“[H]yperlinking per se does not constitute direct infringement because there is no copying, [but] in some instances there may be a tenable claim of contributory infringement or vicarious liability.”); MyPlayCity, Inc. v. Conduit Ltd., 2012 WL 1107648, at *12 (S.D.N.Y. Mar. 20, 2012) (collecting cases), adhered to on reconsideration, 2012 WL 2929392 (S.D.N.Y. July 18, 2012).
From the user’s perspective, embedding is entirely passive: the embedding website directs the user’s own browser to the Instagram account and the Instagram content appears as part of the embedding website’s content. The embedding website appears to the user to have included the copyrighted material in its content. In reality, the embedding website has directed the reader’s browser to retrieve the public Instagram account and juxtapose it on the embedding website. Showing the Instagram content is almost instantaneous.
Importantly, the embedding website does not store a copy of the underlying image. Rather, embedding allows multiple websites to incorporate content stored on a single server simultaneously. The host server can control whether embedding is available to other websites and what image appears at a specific address. The host server can also delete or replace the image. For example, the National Park Service could replace the picture of Joshua Tree at that address with a picture of Canyonlands National Park. So long as the HTML instructions from the third-party site instruct the browser to retrieve the image located at a specific address, the browser will retrieve whatever the host server supplies at that location.
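In HTML terms, the hyperlink/embed distinction the court draws looks roughly like this (the URLs are hypothetical):

```html
<!-- Hyperlink: shows only the address; the user must click
     through to see the content on Instagram's own site -->
<a href="https://www.instagram.com/p/EXAMPLE/">See the post on Instagram</a>

<!-- Embed: the browser automatically fetches and renders the
     remote content in place, without the user navigating away -->
<iframe src="https://www.instagram.com/p/EXAMPLE/embed/"></iframe>
```

In both cases, nothing is stored on the embedding site’s server; whatever the host serves at that URL is what the reader sees.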
As the 9th Circuit notes, under the Perfect 10 ruling, the court has (rightly!) recognized that the Copyright Act’s “fixation” requirement means that the content in question has to actually be stored in a computer’s memory to be infringing, and embedding and other “in-line linking” don’t do that.
The court first rejects the argument that Perfect 10’s so-called “server test” only applies to search engines. As it notes, there’s no rationale for such a limitation:
Perfect 10 did not restrict the application of the Server Test to a specific type of website, such as search engines. To be sure, in Perfect 10, we considered the technical specifications of Google Image Search, including Google’s ability to index third-party websites in its search results. Perfect 10, 508 F.3d at 1155. We also noted Google’s reliance on an automated process for searching vast amounts of data: to create such a search engine, Google “automatically accesses thousands of websites . . . and indexes them within a database” and “Google’s computer program selects the advertising automatically by means of an algorithm.” Id. at 1155–56. But in articulating the Server Test, we did not rely on the unique context of a search engine. Our holding relied on the “plain language” of the Copyright Act and our own precedent describing when a copy is “fixed” in a tangible medium of expression. Id. (citing 17 U.S.C. § 101). We looked to MAI Sys. Corp. v. Peak Computer, Inc., for the conclusion that a digital image is “fixed” when it is stored in a server, hard disk, or other storage device. 991 F.2d 511, 517–18 (9th Cir. 1993). Applying this fixation requirement to the internet infrastructure, we concluded that in the embedding context, a website must store the image on its own server to directly infringe the public display right.
Then there’s the question of whether or not Aereo’s “looks like a duck” test at the Supreme Court effectively overruled the 9th Circuit’s server test. Thankfully, the 9th Circuit says it did not. The reasoning here is a bit complex (perhaps overly so), but basically the 9th Circuit says that the Perfect 10 “server test” applies to the display right under copyright, whereas the Aereo test applies to the transmission of content, or the public performance right. It’s true that these are different rights, but really all this should do is reinforce just how wrong (and stupid) the Aereo ruling was. But, alas:
The difference between these two rights is significant in this case. Perfect 10 and Aereo deal with separate provisions of the Copyright Act—Perfect 10 addressed the public display right, and Aereo concerned the public performance right. In Perfect 10, we analyzed what it meant to publicly display a copy in the electronic context. See Perfect 10, 508 F.3d at 1161. By contrast, in Aereo the Court did not address what it means to transmit a copy, because the public performance right has no such requirement. See Aereo, 573 U.S. at 439–44. In other words, regardless of what Aereo said about retransmission of licensed works, Perfect 10 still forecloses liability to Hunley because it answered a predicate question: whether embedding constitutes “display” of a “copy.” Perfect 10, 508 F.3d at 1160. Aereo may have clarified who is liable for retransmitting or providing equipment to facilitate access to a display—but unless an underlying “copy” of the work is being transmitted, there is no direct infringement of the exclusive display right. Thus, Perfect 10 forecloses Hunley’s claims, even in light of Aereo.
This is correct in paying attention to who is actually making copies of the underlying work, but it still highlights just how broken the Aereo ruling really was.
Either way, the 9th Circuit adds one more point here, which is that to violate copyright law, you have to show “volitional conduct,” and that can’t be done here:
There is an additional reason we cannot find liability for Instagram here. We held, prior to Aereo, that infringement under the Copyright Act requires proof of volitional conduct, the Copyright Act’s version of proximate cause. See Fox Broad. Co., Inc. v. Dish Network LLC, 747 F.3d 1060, 1067 (9th Cir. 2013); Kelly v. Arriba Soft Corp., 336 F.3d 811, 817 (9th Cir. 2003) (“To establish a claim of copyright infringement by reproduction, the plaintiff must show . . . copying by the defendant.”). And we are not alone, indeed, “every circuit to address this issue has adopted some version of . . . the volitional-conduct requirement.” BWP Media USA, Inc. v. T&S Software Assocs., Inc., 852 F.3d 436, 440 (5th Cir. 2017) (citing cases). The Court in Aereo did not address volitional conduct as such, although Justice Scalia did so in his dissent. See Aereo, 573 U.S. at 453 (Scalia, J., dissenting). But the Court did distinguish between those who engage in activities and may be said to “perform” and those who engage in passive activities such as “merely suppl[ying] equipment that allows others to do so.” Id. at 438–39. In any event, Perfect 10 was bound to apply our volitional-conduct analysis. When we applied our requirement that the infringer be the direct cause of the infringement, we concluded that the entity providing access to infringing content did not directly infringe, but the websites who copied and displayed the content did. Perfect 10, 508 F.3d at 1160.
Post-Aereo, we have continued to require proof of “causation [as] an element of a direct infringement claim.” Giganews, 847 F.3d at 666. In such cases we have taken account of Aereo and concluded that our volitional conduct requirement is “consistent with the Aereo majority opinion,” and thus remains “intact” in this circuit. Id. at 667; see Bell v. Wilmott Storage Servs., LLC, 12 F.4th 1065, 1081–82 (9th Cir. 2021); Oracle Am., Inc. v. Hewlett Packard Enter. Co., 971 F.3d 1042, 1053 (9th Cir. 2020); VHT, Inc. v. Zillow Grp., Inc., 918 F.3d 723, 731 (9th Cir. 2019). Our volitional conduct requirement draws a distinction between direct and secondary infringement that would likely foreclose direct liability for third-party embedders. And without direct infringement, Hunley’s secondary liability theories all fail. See Oracle Am., Inc., 971 F.3d at 1050.
So, even if Aereo overruled Perfect 10 (which it did not), this case was a loser.
Interestingly, the 9th Circuit also seems to throw some shade on the attempt by the plaintiffs in this case to try to stretch the “looks like a duck” test to apply here, and thankfully, the 9th Circuit basically says “don’t read too much into that test.”
We are reluctant to read too much into this passage. The Court commented on user perception to point out the similarities between Aereo and traditional cable companies. These similarities mattered because the 1976 Copyright Amendments specifically targeted cable broadcasts. See Aereo, 573 U.S. at 433. But the Court did not rely on user perception alone to determine whether Aereo performed. See id. The Court has not converted user perception into a separate and independent rule of decision.
While this again suggests that the 9th Circuit realizes the Aereo ruling is problematic, it also provides more examples of why, even with Aereo in place, the test does not apply to “perception” in other contexts.
There is a weird bit at the end of the ruling, responding to the plaintiff’s (ridiculous) claims that the server test undermines the policy purpose of copyright law (it does not, it upholds it…) by suggesting that plaintiffs apply for en banc review or petition the Supreme Court to review this as well. That… very well might happen, and would (yet again) put another important factor of the open web on trial.
Hunley, Instagram, and their amici have peppered us with policy reasons to uphold or overturn the Server Test. Their concerns are serious and well argued. Hunley argues that the Server Test allows embedders to circumvent the rights of copyright holders. Amici for Hunley argue that the Server Test is a bad policy judgment because it destroys the licensing market for photographers. On the other hand, amici for Instagram argue that embedding is a necessary part of the open internet that promotes innovation. As citizens and internet users, we too are concerned with the various tensions in the law and the implications of our decisions, but we are not the policymakers.
If Hunley disagrees with our legal interpretation—either because our reading of Perfect 10 is wrong or because Perfect 10 itself was wrongly decided—Hunley can petition for en banc review to correct our mistakes. But we have no right “to judge the validity of those [] claims or to foresee the path of future technological development.” Aereo, 573 U.S. at 463 (Scalia, J., dissenting). Most obviously, Hunley can seek further review in the Supreme Court or legislative clarification in Congress.
In other words, while this is a good ruling, there’s a good chance this issue is far from settled.
On Monday, I saw Elon Musk tweet the following, and initially thought that he might have actually made a good policy decision for once, and planned to write up something about Elon doing something right (contrary to the opinion of some, I’m happy to give him credit when it’s due):
Punching back against DMCA abusers is a good policy (and one that the old Twitter was willing to go to court over — though very early Twitter was less good about it). So, in theory, suspending accounts of those who engage in “repeated, egregious weaponization” of the DMCA seems like a good policy and Musk should be given kudos if that’s how the policy is actually put into operation.
Though, the actual details here are kind of a mess, and it’s possible that instead of putting in place a good policy, Musk might have (instead!) opened up Twitter to potentially massive liability.
This came about over a dispute between two Twitter users, but the details are now gone, as Twitter suspended one account, and it appears the other account deleted all the tweets about this dispute (though I’ve been able to dig up a few screenshots).
One account, @Rainmaker1973, is one of thousands of aggregator accounts that basically find other people’s content and post a constant stream of it to their feed. Rainmaker has 1 million followers, so is a pretty large account. Looking over Rainmaker’s feed, you can see that the account links to source material (through tracked buff.ly links). When it’s posting videos, it appears to embed the original video, rather than re-uploading it, though the way Twitter handles that is sometimes a little confusing. It just puts a little “from @OriginalAccount” in small letters underneath the video, with a link to that account’s profile page, but not to the tweet where the original video was. I’ve never quite understood why Twitter handles video embeds this way, but it does. Here’s one example, with the Twitter-appended attribution highlighted:
For photos, that’s not how it works. You basically have to reuse the photos (and if they’re hosted somewhere else, upload them to Twitter). That’s what the Rainmaker account did here, with a photo that originated on Facebook:
Is that infringing? Eh… I’d say that the Rainmaker account has pretty strong fair use claims much of the time. The account also appears to lean towards public domain images (such as from NASA) and some Creative Commons-licensed images. But fair use is always fact-specific, so it’s difficult to say that none of the account’s tweets violate copyright law.
What appears to have happened, based on what many others have written, is that the Rainmaker account posted a video from another account, @NightLights_AM, that specialized in images and videos of the northern lights. While these tweets have since been deleted, note that the video in the image does not show the little “from” line, as it would if it were embedded directly from Twitter.
Now, unfortunately, since it’s all deleted, we can’t see exactly how the video is embedded. Rainmaker says it’s embedded, not uploaded. It doesn’t have the “from” line at the bottom in that screenshot, but… it might still be from a Twitter embed, because Twitter (confusingly!) does not show that “from” line in the video if it’s being quote tweeted, as is the case here.
So, based on all that, there’s a decent chance that the DMCA notice was somewhat iffy. I recognize that lots of people don’t like aggregators like the Rainmaker account, but if he’s just using an existing upload from the official account as an embed, then it’s clearly not actually infringing.
It is quite possible, though, that most people don’t understand how video embeds of other Twitter videos work on Twitter (it’s confusing!), so it wouldn’t surprise me if the NightLights account didn’t even realize it was an embed of the original and, out of frustration at this large aggregator account getting all the traffic for its video, sent a good faith (but mistaken) DMCA notice.
In the now-deleted tweet you see above, the Rainmaker account says it reached out to the NightLights account, and NightLights asked for money (likely for a license). Again, assuming Rainmaker was just embedding, there is no need for a license. It’s literally just using Twitter in the way it was intended, and in a manner that NightLights already granted a license for. Somewhat confusingly, in a later tweet, the Rainmaker account claims that NightLights didn’t actually want money and instead said that NightLights was trying to shut down his account:
For what it’s worth, the guy behind NightLights told TorrentFreak that Rainmaker is misrepresenting their conversation, and that it was Rainmaker who first proposed paying, if NightLights would rescind the DMCA notice:
Mauduit informs TorrentFreak that after sending the DMCA notice to Twitter, Massimo initiated contact and suggested that he should pay an amount to have the report retracted “since the situation for him was so dire.”
Mauduit says that since the offer came from Massimo, that doesn’t constitute blackmail. A few hours later, Massimo accused Mauduit of blackmail on Twitter, Mauduit says.
“I asked him to compensate me fairly for the use of the material. So at that point, that was purely business related and politically correct,” Mauduit says.
Either way, Twitter’s head of trust and safety quickly told the Rainmaker account that, despite his fears, the company would not suspend his account:
This is also a good policy (so kudos to Irwin and Musk on continuing this aspect of old Twitter’s policies). She also noted in another tweet that the Rainmaker account “is not at any risk for suspension.”
Of course, “pirating / egregious illegal behavior” is somewhat in the eye of the beholder. And so is… “blackmail.” Yet, about an hour after Ella’s tweets, Musk himself noticed Rainmaker’s tweets and announced that he would “suspend” accounts for “blackmail.”
Again, in a vacuum, this could be good policy. Suspending egregious copyfraudsters who abuse copyright to shake people down or silence them makes sense. And DMCA abuse for extortionate behavior does happen unfortunately often. As does abusing the DMCA to silence others over non-infringing speech. We’ve covered many, many such cases over the years.
So, having a policy that pushes back on that abuse of copyright law is good —and another nice thing you can say about Elon Musk is that he’s been quite good about recognizing the problems associated with patent and copyright law. Other companies have pushed back on copyright abuses as well, such as how Automattic (the company behind WordPress, and also the company that hosts Techdirt) has spent years fighting back against DMCA abusers. But it has a clear process for doing so, rather than the whims of an impulsive owner.
In this case, though, Elon appeared to take Rainmaker’s (slightly confused) word for what happened, and flat out suspended (temporarily) NightLights for what appears to have been a good faith DMCA notice, followed by a discussion initiated by Rainmaker regarding payment.
As I was finishing up this article, the NightLights account was actually reinstated, though the guy says he’s now considering leaving the platform:
So, given all this, the concept behind the policy is good, but there’s not much evidence that NightLights was actually “blackmailing” Rainmaker. From what was public (and mostly now deleted), it looks more like the account mistakenly thought its content was used in an infringing manner, due in part to Twitter’s own confusing presentation of embedded videos, and filed a good faith, but mistaken, DMCA notice. When Rainmaker contacted NightLights to try to get the DMCA strike removed (out of fear that it would take down the account), the two began a discussion of a licensing fee, which again seems reasonable if NightLights actually thought the use was infringing.
Also, this seems to have angered others who were fans of the NightLights account:
Once again, content moderation at scale is impossible to do well because people are going to be mad at you on both sides of the equation.
In the end, this looks like a lot of miscommunication across the board, in part from people who aren’t fully aware of how Twitter or copyright law actually work. The end policy — don’t put up with shit from those who abuse the DMCA process — is actually great. But it really doesn’t look like NightLights was abusing the DMCA, just confused about how Twitter worked.
And because of the somewhat less-than-well-considered way in which Twitter under Elon is acting, if NightLights had a legitimate claim (and again, I don’t think it does in this case), quickly suspending an account for filing a real DMCA claim could open up Elon and Twitter to pretty significant liability. Contrary to popular belief, companies that receive a DMCA notice do not need to take down the content. But if they don’t, they can no longer rely on the DMCA’s 512(c) safe harbor if the case goes to court. So refusing to take down something upon notice is a legal risk, and the kind of thing a large company like Twitter would normally have a copyright lawyer review.
The other potential issue is that if Twitter makes it a habit to suspend accounts that send good faith or legitimate DMCA notices, it could very much open them up to claims that they do not have a valid “repeat infringer” policy, as required by 512(i). Suspending one account for sending a good faith DMCA almost certainly won’t trigger that issue, but having Elon flat out say that Rainmaker’s account “won’t be” suspended could be read to mean that Twitter is ignoring its repeat infringer policy with regards to at least that account. And, I could see copyright lawyers trying to argue that this is an example of how Musk is willing to ditch the 512(i) policy for accounts he likes. At the very least, you can bet that these kinds of impulsive policy decisions will be used in court by copyright litigants. Perhaps from Hollywood studios who noticed that, last fall, amidst all the turmoil, Elon’s Twitter seemed to be ignoring many DMCA notices about accounts posting entire Hollywood movies.
In short, impulsive decisions around DMCA policy, made without first going over things with an actual copyright lawyer, can open up a company like Twitter to quite a bit of liability. But this is the Elon Era, in which YOLO seems to be the general ethos, and if it happens to add to yet more legal liability? Well, just toss it on the pile.
Imagine waking up in the morning, grabbing your hot cup of coffee, and scrolling your favorite blog post or website, only to find it looking like this:
Images are missing. Video content is not there. Nothing but an empty black void staring back at you.
This is what could happen if a recent case brought to the 9th Circuit Court of Appeals, Hunley v. Instagram, brings about a change to how content can be shared around the Internet.
Through amicus briefs, the Internet Society can draw attention to issues or arguments that the parties involved in a lawsuit are unlikely to raise themselves, helping courts understand the potential impacts of their rulings on the digital world.
In Hunley v. Instagram, several photographers are suing Instagram for copyright infringement—that is, in the United States, copying content without a license or a fair use defense. Hunley and the other plaintiffs—the ones doing the suing—claim that Instagram is guilty of copyright infringement by allowing others to embed photos on other websites.
They are saying that in addition to grabbing the “embed code”—a short snippet of web code that allows embedding content into another page—website designers and users should also have to negotiate a copyright license to display the embedded content. This would drastically change how we build and use services online.
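For illustration (this is a simplified sketch, not Instagram’s actual snippet; the domain and identifiers are made up), an embed code is typically just a few lines of HTML pointing the reader’s browser at the host’s server:

```html
<!-- Hypothetical embed code a site hands out for one of its posts -->
<blockquote class="example-media"
            data-permalink="https://photos.example.com/p/POST_ID/">
  <a href="https://photos.example.com/p/POST_ID/">View this post</a>
</blockquote>
<!-- Script supplied by the host; it swaps the blockquote for the
     rendered post, fetched from the host's own server -->
<script async src="https://photos.example.com/embed.js"></script>
```

The embedding site never copies the photo; it simply republishes the host’s instructions for retrieving it.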
While this specific case is about embedding images around the web, a ruling in favor of the plaintiffs could easily make it more difficult to embed anything in other contexts around the Internet, not just the web.
The ability for a content creator to embed code or instructions for others’ web browsers to access images, videos, services, etc., from somewhere else online exemplifies the Internet’s generative and modular capacity. Embedding enables the creation of content that is more accessible to more people. Embedding is found across many different aspects of the Internet, with around 95% of all sites embedding third-party content (source: Web Almanac).
For example, the 9th Circuit Court’s own website embeds videos stored on YouTube servers. Below, you will see two screenshots of a typical oral argument page on the Court’s website, one with and one without embedding. Key features of the website clearly no longer work.
Even emails and Internet-based text message applications frequently embed content from third-party servers. Moreover, website creators may embed code to incorporate enhanced functionality into their websites, for instance, translation functionality, video captions, or CAPTCHA functionality intended to secure websites.
The Internet was built to be built upon, and assembling different components, known as modularity, and ensuring a common set of principles about how systems can be assembled to make new systems, known as generativity, are core aspects of our digital world. Our Internet experience would be a mere shadow of what it is today if we could not embed other content, services, and resources.
Do you believe in defending the Internet as we know it today, with easy access to content? Read the brief we filed, and help spread the word: embedding matters! #EmbeddingMatters
Joseph Lorenzo Hall is the Distinguished Technologist for a Strong Internet at the Internet Society. This post was reposted, with permission, from the Internet Society blog.
Five years ago, we wrote a post detailing the crazy permission-asking media scrum that forms on Twitter when people post photographic or video documentation of something major happening. Under such tweets, you’ll often see dozens of reporters asking for “permission” to use the images or videos in news reports. In many cases, fair use would likely cover the usage, but news organizations are understandably gun shy about copyright lawsuits from greedy lawyers who would be all too quick to sue them for merely embedding a tweet.
However, it appears that the Associated Press takes this to absolutely insane, and legally problematic, levels. And it appears that the AP would rather not talk about this.
You may have seen that, over the weekend, there was an explosion in downtown LA. People in Los Angeles were able to see the fire and posted images and videos on Twitter. One of these was Brian Magno, who tweeted a 21-second video of the fire:
In response, the usual media scrum descended. Among them was an editor for the Associated Press, named RJ Rico. Rico didn’t just ask for permission to use the video, like most other reporters, but with the request, he posted an image of what he called a “social media release form.” I can’t link you to the tweet because by Sunday evening Rico had protected his Twitter account (we’ll come back to that later). Thankfully, I got a screenshot before all that happened:
The “Social Media Release Form” is truly a piece of work. Here’s what it says:
Social Media Release Form
Please respond to this message with confirmation that you agree to the following:
The Associated Press and its subsidiaries and affiliates (“AP”) shall have the world-wide, non-exclusive right to (and all consents necessary to) use, reproduce, prepare derivative works of, edit, translate, distribute, publicly perform, and publicly display the content throughout the world in perpetuity by any and all means now known or hereafter created in all media known or hereafter created, and AP shall further have the right to license these rights to others; and
That you are the copyright owner or the copyright owner’s authorized agent and that you are full entitled to grant these rights in favor of AP and that there is no agreement or other restriction preventing this grant of rights. You agree to indemnify and hold harmless the AP and its licensees from and against any claims, losses, liability, damages, costs and expenses arising from any breach or alleged breach of these representations and warranties.
Okay, so, first things first, you should basically never, ever, ever just “agree” to a contract that someone tweets at you as an image. But there are some really problematic elements of this “release form.” First is the fact that it grants relicensing rights to the AP, so that they somehow can pass along this image to anyone they want without any restriction. From this, it certainly seems like the AP could then choose to sell the licensing to others and the original copyright holder couldn’t do anything about that even if he didn’t like it. This term probably isn’t quite as bad as some people are making it out to be, and is more in line with typical website terms of service licensing grants, that just cover that the rights can be passed on to future entities (should the organization come under control of another entity, for example). It’s still pretty broadly worded though, and anyone agreeing to it should be careful.
But the much bigger issue is the indemnity clause, which would mean that if there were some sort of legal dispute over the image, all of the costs would basically fall back on poor Brian Magno. And that includes if there were a legal dispute with one of those licensees down the road that he agreed to let the AP pass along the license to. Indemnification clauses are, tragically, common in many freelance contracts, and I’ve taken to simply crossing them out of any writing contract I’ve engaged in (and have not had anyone push back on that yet). Suffice it to say, no one should agree to these AP terms. If you want a thorough explanation of how these clauses could come back to bite you, this is a good thread:
"[I took the video, I own it, nobody will be sued] so there's no risk for me in granting an indemnity" or "they'll never insist on strict performance of that onerous term" are things I hear at the outset of advising on contracts all the time.
A bunch of folks started calling this nonsense out, including lawyer Jay Wolman, who worried that people were agreeing to the “release form” without speaking to a lawyer or understanding what they were agreeing to. In response, Rico blocked him on Twitter, which is pretty fucked up. Indeed, a bunch of law twitter started pointing out how ridiculous this policy is, and Rico started blocking more of them:
The fact that your reporters block en masse anyone who criticizes this disgusting business practice is more than a little concerning, too, @AP. pic.twitter.com/UuORJgKigI
In short, you never know if there might be a legal dispute down the road, and there are countless reasons why there might be, even if you don’t care about the image at all. And, yet, if there is, you’ve just agreed to pay a giant company’s legal fees. With a tweet. Think about that.
That’s bad. And, then on Sunday (as already noted), Rico took his entire account private.
I’m kind of wondering what that even means for anyone who agreed to Rico’s tweeted contract in the first place. Unless they took a screenshot of it, they can’t even prove that they ever saw it or what they agreed to — because it’s now hidden away.
Wolman began investigating this entire practice, and has traced it back at least to 2015 (coincidentally, a month after my article about how this Twitter media scrum descends on such things):
This is a horrific practice by the Associated Press, and it really should (1) apologize, (2) revoke those agreements for anyone who did agree to them, and (3) stop this practice going forward.
Last month, we wrote about a declaratory judgment lawsuit that had been filed against a client of Mathew Higbee. As we’ve discussed at length, Higbee runs “Higbee & Associates” which is one of the more active copyright trolls around these days, frequently sending threatening shakedown-style letters to people, and then having various “paralegals” demand insane sums of money. In some cases, it does appear that Higbee turns up actual cases of infringement (though, even in those cases, the amount he demands seems disconnected from anything regarding a reasonable fee). But, in way too many cases, the claims are highly questionable. The lawsuit mentioned last month represented just one of those cases — involving a threat against a forum because one of its users had deeplinked a photographer’s own uploaded image into the forum. There were many reasons why the threat was bogus, but as per the Higbee operation’s MO, they kept demanding payment and dismissing any arguments for why the use was not infringing (and, relatedly, why it was against the incorrect target).
Paul Levy and Public Citizen filed for declaratory judgment that the use was non-infringing, and in the process, pondered publicly whether or not Higbee had warned his various clients that they might end up in court in response to Higbee’s aggressive tactics. Apparently, in the case of photographer Quang-Tuan Luong, the photographer was not particularly happy about ending up in court, and Higbee and his client quickly agreed to cut and run, despite Higbee’s insistence that he was ready to take this matter to court.
I gave Higbee a chance to withdraw his client’s claims; however, Higbee had previously told me that my arguments about non-liability for infringement in an identical case were “delusional,” so we decided to give Higbee a chance to explain to a judge in what way these defenses were delusional, that is, in response to an action for a declaratory judgment.
I confess that, in filing that lawsuit, I wondered whether Higbee had ever warned Luong that he would not necessarily get to make the final decision whether his demand would end up in litigation, in that the very aggressiveness of Higbee’s demand letters, coupled with persistent nagging from paralegals to offer a settlement or face immediate litigation, sets up his clients to be sued for a declaratory judgment of non-infringement. That speculation proved prescient, because Higbee’s immediate response to the lawsuit was to offer to have his client covenant not to sue Schlossberg for infringement. Higbee also told me that he had offered to defend Luong against the declaratory judgment action for free. It appears, however, that even such a generous offer was not enough to hold onto Luong as a copyright infringement claimant in this case. A settlement agreement has been signed; because there is no longer a case or controversy, the lawsuit has now been dismissed.
Levy makes it clear, however, that he’s actively looking for other such cases to challenge in court in response to Higbee’s overaggressive demands:
Since that blog post, I have got wind of several other situations in which Higbee has claimed large amounts of damages against forum hosts. We are considering which ones would make the best test cases.
My last blog post about Higbee mentioned another case in which he had made a demand against the host of a forum about United States elections, where a user had posted a deep link to a photograph by another of Higbee’s stable of clients, Michael Grecco. Higbee has sued on Grecco’s behalf on a number of occasions, and Higbee told me that, unlike Luong, Grecco was a true believer who was looking for opportunities to pursue Higbee’s copyright theories in litigation. Higbee said that he was going to be talking to Grecco to confirm that he wanted to litigate against the election forum. I could not help suspecting at the time that Higbee was blowing smoke to show what a tough guy he is. That was a month ago, and yet so far as I can tell, Higbee has not yet got around to talking to his client about the subject. I have to wonder just who it is that wants to litigate Higbee’s legal theories.
Indeed, I have asked Higbee whether he warns his clients generally that they can be sued for a declaratory judgment of non-infringement even if they have never given Higbee authority to go to court on their behalf. He told me that he is too busy to address my questions.
He also notes that another such declaratory judgment filing has been made against the very same Michael Grecco:
That case involves another demand letter from Higbee, this time to an indigent young man named Lee Golden who lives in Brooklyn with his parents and blogs about action movies. Because Golden included a Grecco photograph of Xena the Warrior Princess, Higbee sent his typical aggressive demand letter, setting $25,000 as the required payment to avoid being sued. Golden responded with a plaintive email, apologizing profusely, saying that he had no idea about copyright issues, that he had taken down the photo…own, returning to its demand for $25,000 and threatening to seek $30,000 or even $150,000 if the case had to be litigated. Higbee even sent a draft infringement complaint, threatening to make Golden defend himself in the Central District of California even though many of Higbee’s actual lawsuits are filed in the jurisdiction where the alleged infringer lives, perhaps because Higbee wants to avoid having to litigate personal jurisdiction.
But Golden?s counsel likely did not know this, so Strupinsky and his partner Joshua Lurie have filed suit on Golden’s behalf in the Eastern District of New York, seeking a declaratory judgment of non-infringement. We will see how anxious Michael Grecco is to litigate this case.
We see this again and again with copyright trolling operations. They often promise potential clients that this is a “no risk” way to make money. Just sign up and they’ll scour the internet and you’ll just sit back and receive the payments. Indeed, Higbee’s site suggests just that:
Let a national copyright law firm take care of all of your copyright enforcement needs – from reverse image search to collecting payment. You pay nothing up front. We only get paid when you get paid. Best of all, by using us for reverse image search you will be eliminating the middle man and nearly doubling your profit.
His site also claims that he’ll go to court for you “assuming you want us to” — leaving out the risk of a declaratory judgment filing (and associated embarrassment for trying to shake down non-profits and personal websites of people with no money).
A little over a year ago, we wrote about a pretty bad ruling in NY, by Judge Katherine Forrest, arguing that merely embedding content on a site — even though it’s hosted elsewhere — could be deemed infringing. This went against what has been known as the “server test,” which says that the issue is where the content is actually hosted (which server it’s actually on), and that merely embedding the image shouldn’t lead to new claims of infringement. Considering that, technically, embedding an image is no different than linking to an image, saying that embedding an image that is hosted elsewhere is itself infringing could put much of the basic concept of how the internet works at risk.
This particular case involved a photo of quarterback Tom Brady that had been posted originally to Snapchat. The image, taken by photographer Justin Goldman, made its way from Snapchat to Reddit to Twitter. Some news organizations embedded tweets showing the photo, using Twitter’s native embed functionality. Goldman sued a bunch of them. Judge Forrest, citing the Supreme Court’s “looks like a duck” test from the Aereo ruling, said that embedding qualifies as displaying a work (even though the websites in question aren’t hosting anything other than a pointer telling users’ computers to go find that image). Even worse, Forrest explicitly rejected the server test, saying it was wrong.
This was poised to be a pretty big deal… except that it’s not, because the entire lawsuit has been settled, leaving the question of whether or not the server test is valid (especially in NY, where the case was filed) unanswered. While the Forrest ruling is on the books, since it comes from a district court it creates no official precedent that other courts need to follow (though that won’t stop it from being cited). However, as the linked article notes, there are some other cases challenging the server test and looking at the legality of embeds still going on, so perhaps we won’t have to wait long for the issue to bubble up again. One hopes that, this time, a court will accept the basic server test as the only reasonable interpretation of the law.
While much of the attention around French President Emmanuel Macron’s speech at the Internet Governance Forum (IGF) on Monday was focused on the so-called “Paris Call” agreement on cybersecurity, it was also an occasion for the French President to announce a plan to effectively embed regulators with Facebook to learn how to better censor the platform:
The French president announced on Monday a six-month partnership with Facebook aimed at figuring out how the European country should police hate speech on the social network.
As part of the cooperation — the first time that Facebook has teamed up with national politicians to hammer out such a contentious issue — both sides plan to meet regularly between now and May, when the European election is due to be held. They will focus on how the French government and Facebook can work together to remove harmful content from across the digital platform, without specifying the outcome of their work or if it would result in binding regulation.
Facebook’s press people have pushed back on the claim that this is a program to “embed” government censors within Facebook, saying it’s more just about showing them how Facebook manages content moderation:
It’s a pilot program of a more structured engagement with the French government so that both sides can better understand the other’s challenges in dealing with the issue of hate speech online. The program will allow a team of regulators, chosen by the Elysee, to familiarize [itself] with the tools and processes set up by Facebook to fight against hate speech. The working group will not be based in one location but will travel to different Facebook facilities around the world, with likely visits to Dublin and California. The purpose of this program is to enable regulators to better understand Facebook’s tools and policies to combat hate speech and, for Facebook, to better understand the needs of regulators.
While many people may have the instinctual reaction that having government regulators coming in to see how to “better” censor speech on your platform is inherently a problem, one hopes that the end result of this is influencing things in the other direction. A bad outcome would be French regulators deciding that this experience gives them enough info to craft impossible regulations to wave digital magic wands and “make the bad stuff disappear.” But a more optimistic argument would be that it gives these French regulators a chance to get some first hand knowledge of (1) how seriously Facebook takes this issue (don’t laugh, because the company absolutely does take this issue seriously now, even if it didn’t in the past) and (2) just how impossible it is to do a particularly good job at it (even as Facebook has gotten much better in the past year).
So while I’m always a little concerned about the idea of having government regulators come into a company when the upfront stated objective is about more content moderation demands, it certainly would be beneficial for French officials not to be so incredibly ignorant about how content moderation at scale truly works, and why the easy solutions they always seem to propose won’t help (and could make problems significantly worse).
Back in February we wrote about an absolutely horrible ruling out of a New York court by Judge Katherine Forrest that argued embedding an infringing tweet could be an act of infringement on its own. As we pointed out, if this ruling holds, it would undermine some of the basis of how the internet itself works. The issue here gets a bit into the weeds of both how the internet and how copyright law works. Embedding something on the internet, at a technical level, is really no different than how linking on the internet works. And it’s long been established that if you link to infringing content, that alone should not be considered a separate act of infringement. But is embedding? At a very basic level, this is the difference between the two:
A link:
<a href="http://www.somedomain.tld/image.png">An image!</a>
An embed:
<img src="http://www.somedomain.tld/image.png" title="An image!">
Everyone agrees that the first one is not infringing by itself (the original site hosting it, or the person who uploaded it, may be infringing, but not the person linking to it). Most courts have used the “server test” on this question, saying that if you merely embedded the image, a la what’s above, it’s not infringing for the person who used the embed code. This makes sense for a fairly important reason: if you use an embed code on your site, you never actually have the image on your site. Even if it appears on the site, that is merely because the end user’s browser pulls that image in and displays it — which is exactly how the web was designed to work, with the ability to pull in content from many different places and show it all together.
But Judge Forrest decided to throw everyone for a loop and toss that whole idea out the window:
The Court declines defendants’ invitation to apply Perfect 10’s Server Test for two reasons. First, this Court is skeptical that Perfect 10 correctly interprets the display right of the Copyright Act. As stated above, this Court finds no indication in the text or legislative history of the Act that possessing a copy of an infringing image is a prerequisite to displaying it…
Perhaps more troubling is that Forrest cited the silly Aereo “looks like a duck” test to argue that even though it’s technically no different than linking, and even though the defendants in this case don’t actually host or distribute the image, because it looks like they are hosting it, they can be liable for infringing the display right.
In this particular case, photographer Justin Goldman sued a bunch of media sites for embedding a photo that others had uploaded to Twitter (Goldman had originally posted it to Snapchat, and someone else took it to Reddit, where someone else brought it to Twitter). A bunch of media sites then embedded the tweet, and Goldman sued them all more or less, even though such embeds that show the associated media are a key feature of Twitter.
Judge Forrest allowed the defendants to do an interlocutory appeal, which basically puts the rest of the case on hold to allow a certain part of the case to be appealed to make sure the district court got it right. Interlocutory appeals aren’t always allowed and some courts don’t really like them very much. In this case, Judge Forrest allowed it to go up to the 2nd Circuit appeals court, but that court has said it won’t review the ruling… for now.
Depending on where you stand this may or may not be a good thing. The case now moves back to the lower court (though, potentially with a different judge as Forrest just announced she’s leaving the bench at some point “later this year.”). It may go to trial, or the remaining defendants may decide to just settle the case and not have to deal with it. If the case does move forward, there are other potential reasons why Goldman may have difficulty winning, including the lack of actual knowledge of infringement by the publishers embedding the tweets.
In either of those situations, Forrest’s odd decision is then rendered less impactful. Since it’s in the district court, it has no direct precedential value on other cases (though can be cited). And that’s at least preferable to the 2nd Circuit blessing Forrest’s dismissal of the server test… though not as good as if the 2nd Circuit decides to bless the server test. It’s also possible that the issue could come up on appeal later (i.e., not as an interlocutory appeal, but after the case reaches a conclusion in the lower court). Either way, this case is still a bit of a mess, and is yet another example of how bad the law is at dealing with technology.
Just earlier this week we noted that a judge easily laughed Playboy’s silly lawsuit out of court because merely linking to infringing content is not infringing itself. But a judge in New York, Judge Katherine Forrest, has ruled on a different case in a manner that is quite concerning, which goes against many other court rulings, and basically puts some fundamental concepts of how the internet works at risk. It’s pretty bad. In short, she has ruled that merely embedding content from another site can be deemed infringing even if the new site is not hosting the content at all. This is wrong legally and technically, and hopefully this ruling will get overturned on appeal. But let’s dig into the details.
The case involved a photographer, Justin Goldman, who took a photograph of quarterback Tom Brady and posted it on Snapchat. Somehow that image made its way from Snapchat to Reddit to Twitter. The photo went a bit viral, and a bunch of news organizations used Twitter’s embed feature to show the tweet and the image. Goldman sued basically all the news publications that embedded the tweet — including Breitbart, Vox, Yahoo, Gannett, the Boston Globe, Time and more. Now, multiple different courts around the country have said why this should not be seen as infringing by these publications. It’s generally referred to as “the server test” — in which to be direct infringement, you have to host the image yourself. This makes sense at both a technical and legal level because “embedding” an image is no different technically than linking to an image. It is literally the same thing — you put in a piece of code that points the end user’s computer to an image. The publisher’s server at no point hosts or displays the image — it appears only on the end user’s computer. In the 9th Circuit, the various Perfect 10 cases have established the server test, and other courts have adopted it or similar concepts. In the 7th Circuit there was the famous Flava Works case, where Judge Posner seemed almost annoyed that anyone could think that merely embedding infringing content could be deemed infringing.
But Judge Forrest has decided to carve a new path on this issue in Southern New York, teeing up (hopefully) an opportunity for the 2nd Circuit to tell her why she’s wrong. Even more troubling, she actually relies on the awful Aereo “looks like a duck” test to come to this conclusion. Let’s dig into her reasoning. The key issue here is the exclusive right to “display” a work, found in Section 106(5) of the Copyright Act.
It’s also important to note that this ruling is just at the summary judgment stage, and doesn’t mean that the various publications will be found to have infringed — it just means that the court is letting the case go forward, meaning that the various publications might now raise various defenses as to why their embedding is not infringing. It’s still concerning, because given the “server test” in other jurisdictions, such a case would easily be tossed on a motion to dismiss or summary judgment because there’s no legitimate claim of copyright infringement if no direct infringement can be shown. But here, Judge Forrest argues that because an embed leads an end user’s computer to display an image, that somehow makes the publisher who included the embed code possibly liable for infringing the display right. Because it looks like a duck.
This is not a new issue by any means. I found a story from over a decade ago in which I warned that we’d see a lot more stupid lawsuits about embedding content from platforms, and have to admit I’m a bit surprised we haven’t seen more. The reason that’s the case is almost certainly because of the reliance of many courts on the server test, leading many to realize such an argument is a non-starter. Until now.
Forrest basically says that even though the image never touches the publisher’s server, and the only thing the publisher is doing is linking to an image in a manner that makes the end-user’s browser grab that image from another location and display it, it still counts as infringement — because of the Aereo ruling. If you don’t recall, Aereo involved a creative (if technically stupid) method for streaming over-the-air broadcast TV to users by setting up many local antennas that were legally allowed to receive the signals, and then transmitting them over the internet (which is also legal). But, the Supreme Court came up with a brand new test for why that’s not allowed — which we’ve called the “looks like a duck” test. The ruling found that because Aereo kinda looked like cable to the end user, the technical rigamarole in the background to make it legal simply doesn’t matter — all that matters is how things looked to the end user. Forrest argues the same is true here:
Moreover, though the Supreme Court has only weighed in obliquely on the issue, its language in Aereo is instructive. At heart, the Court’s holding eschewed the notion that Aereo should be absolved of liability based upon purely technical distinctions – in the end, Aereo was held to have transmitted the performances, despite its argument that it was the user clicking a button, and not any volitional act of Aereo itself, that did the performing. The language the Court used there to describe invisible technological details applies equally well here: “This difference means nothing to the subscriber. It means nothing to the broadcaster. We do not see how this single difference, invisible to subscriber and broadcaster alike, could transform a system that is for all practical purposes a traditional cable system into a ‘copy shop that provides patrons with a library card.’”
We were worried about the wider impact of the Aereo “duck” test — and people told us it wasn’t that big a deal. Indeed, until this ruling, Aereo hasn’t been (successfully) cited very often. Many thought that the very specific nature of Aereo might limit that precedent to a very specific situation involving cable TV. This ruling suggests that the silly “duck” test may be spreading. And that’s bad, because it’s based on ignoring what’s actually happening at the technological level, in which the technology may be designed specifically to not violate any of the exclusive rights of copyright law.
Also, it should worry people greatly that courts are using this “we don’t care about what’s actually happening, we just care what it looks like” standard for judging infringement. Because to infringe on a copyright requires a very specific set of facts. And here (as with Aereo) the court is saying “we don’t care about whether or not it actually violates one of the exclusive rights granted under copyright, we only care if it looks like it infringes.” That’s… a huge change in the law, and it’s not at all how copyright law has been judged in the past. It can and will be used to hamstring, limit, or destroy all sorts of unique and useful technological innovations.
Forrest also tries to distinguish this ruling from the Perfect 10 cases and the Flava Works case — even admitting that other 2nd circuit courts have used the server test. But, she says, they were all different — doing things like only using the server test for the distribution right, but not the display right, or not really endorsing the server test and ruling on other reasons.
Forrest also points to a trademark case that involved an embedded image which was found to be infringing — but that’s entirely different. The rules for trademark infringement are completely different than the exclusive rights related to copyright. With trademark, it’s not as specific, and the use of someone else’s logo broadly (as happened in the case cited) could easily be infringing on the trademark, but that doesn’t get to the copyright question which involves much more carefully limited rights.
But, most troubling of all, Forrest argues that the server test… is just wrong:
The Court declines defendants’ invitation to apply Perfect 10’s Server Test for two reasons. First, this Court is skeptical that Perfect 10 correctly interprets the display right of the Copyright Act. As stated above, this Court finds no indication in the text or legislative history of the Act that possessing a copy of an infringing image is a prerequisite to displaying it. The Ninth Circuit’s analysis hinged, however, on making a “copy” of the image to be displayed – which copy would be stored on the server. It stated that its holding did not “erroneously collapse the display right in section 106(5) into the reproduction right in 106(1).” Perfect 10 II, 508 F.3d at 1161. But indeed, that appears to be exactly what was done.
The Copyright Act, however, provides several clues that this is not what was intended. In several distinct parts of the Act, it contemplates infringers who would not be in possession of copies—for example in Section 110(5)(A) which exempts "small commercial establishments whose proprietors merely bring onto their premises standard radio or television equipment and turn it on for their customer's enjoyment" from liability. H.R. Rep. No. 94-1476 at 87 (1976). That these establishments require an exemption, despite the fact that to turn on the radio or television is not to make or store a copy, is strong evidence that a copy need not be made in order to display an image.
Except… that's still very different. That's still a case where the "small commercial establishments" are showing the work on their premises. In this case — and this is the very reason the server test is so important — the content in question is never on the publisher's premises or server. It only appears in the end user's browser, because that browser goes and fetches it.
Even more bizarrely, Forrest argues that Perfect 10 and the server test are different because the image is displayed on the end user's computer:
In addition, the role of the user was paramount in the Perfect 10 case—the district court found that users who view the full-size images "after clicking on one of the thumbnails" are "engaged in a direct connection with third-party websites, which are themselves responsible for transferring content." Perfect 10 I, 416 F. Supp. 2d at 843.
In this Court's view, these distinctions are critical.
While this doesn’t involve the end user “clicking” first to get the display, it’s really no different. It is the end user who has the allegedly infringing content displayed on their computer, not the publisher. A direct connection is made between the end user and the hosting provider (in this case Twitter). The publisher never touches the actual content. Yet, Forrest argues that they can be direct infringers.
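The technical point here is worth making concrete: when a publisher embeds an image or tweet, its server transmits only markup containing a URL pointing at the third party's server — never the content itself. A minimal Python sketch (the markup and URL are hypothetical, for illustration only) of what the publisher's server actually holds and serves:

```python
import re

# Everything the publisher's server stores and transmits: a pointer, not the image.
# The image bytes live on, and are served by, the third party's server.
publisher_html = '<img src="https://pbs.example-cdn.com/media/photo.jpg">'

def extract_embed_targets(html: str) -> list[str]:
    """Return the external URLs a reader's browser would fetch to render this markup."""
    return re.findall(r'src="([^"]+)"', html)

# Each reader's browser makes its own HTTP request directly to example-cdn.com;
# the "display" is assembled on the user's machine from the host's copy.
print(extract_embed_targets(publisher_html))
# → ['https://pbs.example-cdn.com/media/photo.jpg']
```

Note that the host controls what that URL resolves to: it can swap, edit, or delete the content at any moment, and the publisher's markup won't change at all. That asymmetry of control is exactly what the server test tracks.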
That’s… wrong.
Despite the fact that EFF and others warned the court that this ruling would massively upset the way the internet works, Forrest doesn't seem to believe them (or care)… because maybe fair use will protect people.
The Court does not view the results of its decision as having such dire consequences. Certainly, given a number as of yet unresolved strong defenses to liability separate from this issue, numerous viable claims should not follow.
In this case, there are genuine questions about whether plaintiff effectively released his image into the public domain when he posted it to his Snapchat account. Indeed, in many cases there are likely to be factual questions as to licensing and authorization. There is also a very serious and strong fair use defense, a defense under the Digital Millennium Copyright Act, and limitations on damages from innocent infringement.
That's… also wrong. Yes, publishers may be protected by fair use or other defenses. But it's much harder to get a fair use ruling at an early stage of a case — typically not before summary judgment (a few courts are starting to consider it earlier, but that's not at all common). Keeping the server test as good law would prevent a flood of these kinds of cases from being filed in the first place. Without it, people can troll media sites that embed tweets, dragging them into long and costly litigation even when those sites have strong fair use defenses. Also, the reference above to releasing the image "into the public domain" is nonsensical. No one is arguing that the image was in the public domain; it is clearly covered by copyright.
Given what a total and complete mess this ruling will cause on the internet should it stand, I fully expect a robust appeal. The 2nd circuit can be a mixed bag on copyright, but often does a pretty good job in the end. One hopes that the 2nd circuit reverses this ruling, endorses the server test, and keeps the internet working as it was designed — where embedding and linking to content doesn’t magically make one liable for infringement.