Bizarre Magistrate Judge Ruling Says That If Facebook Deletes An Account, It No Longer Needs To Keep Details Private

from the that-doesn't-make-any-sense dept

There have been a bunch of slightly wacky court rulings of late, and this recent one from magistrate judge Zia Faruqui is definitely up there on the list of rulings that make you scratch your head. The case involves the Republic of Gambia seeking information on Facebook accounts that were accused of contributing to the genocide of the Rohingya in Myanmar. This situation was -- quite obviously -- horrible, and it tends to be the go-to story for anyone who wants to show that Facebook is evil (though I'm often confused about how people often seem more focused on blaming Facebook for the situation than the Myanmar government which carried out the genocide...). Either way, the Republic of Gambia is seeking information from Facebook regarding the accounts that played a role in the genocide, as part of its case at the International Court of Justice.

Facebook, which (way too late in the process) did shut down a bunch of accounts in Myanmar, resisted demands from Gambia to hand over information on those accounts, noting, correctly, that the Stored Communications Act likely forbids it from handing over such private information. The SCA is actually pretty important in protecting the privacy of email and messages, and is one of the rare US laws on the books that is actually (for the most part) privacy protecting. That's not to say it doesn't have its own issues, but the SCA has been useful in the past in protecting privacy.

The ruling here more or less upends interpretations of the SCA by saying that once an account is deleted, it's no longer covered by the SCA. That's... worrisome. The full ruling is worth a read, and you know you'll be in for something of a journey when it starts out:

I come to praise Facebook, not to bury it.

Not quite what you expect from a judicial order. The order lays out the unfortunately gory details of the genocide in Myanmar, as well as Facebook's role in enabling the Myanmar government to push out propaganda and rally support for its ethnic cleansing. But the real question is how all of this impacts the SCA. As the judge notes, since the SCA was written in 1986, it certainly didn't anticipate today's social media, or the questions related to content moderation, so this is a new issue for the court to decide. But... still. The court decides that because an account is disabled... the communications are no longer "stored." Because [reasons].

The Problem Of Content Moderation

At the time of enactment, Congress viewed ECS and RCS providers as mail/package delivery services. See Cong. Rsch. Serv., R46662, Social Media: Misinformation and Content Moderation Issues for Congress (2021). This view failed to consider content moderation; mail/package delivery services have neither the ability nor the responsibility to search the contents of every package. Yet after disinformation on social media has fed a series of catastrophic harms, major providers have responded by taking on the de facto responsibility of content moderation. See id. “The question of how social media platforms can respect the freedom of expression rights of users while also protecting [users] from harm is one of the most pressing challenges of our time.” ...

This Court is the first to consider the question of what happens after a provider acts on its content moderation responsibility. Is content deleted from the platform but retained by the provider in “backup storage?” It is not.

That obviously seems like a stretch to me. If the company still retains the information, then it is clearly in storage. Otherwise, you've just created a massive loophole: any platform can expose someone's private communications so long as it first disables their account.

The court's reasoning, though, gets at the heart of the language of the SCA, which protects both "any temporary, intermediate storage of a wire or electronic communication incidental to the electronic transmission thereof" and "any storage of such communication by an electronic communication service for purposes of backup protection of such communication." The court says the first prong can't apply because these communications had reached their "final destination" and were no longer temporary. And the storage can't be "backup" since the original content had been deleted, so there was nothing left to back up.

Congress’s conception of “‘backup’ necessarily presupposes the existence of another copy to which this [backup record] would serve as a substitute or support.” Id. Without an original, there is nothing to back up. Indeed “the lifespan of a backup is necessarily tied to that of the underlying message. Where the underlying message has expired . . . , any copy is no longer performing any backup function. An [ECS] that kept permanent copies of [deleted] messages could not fairly be described as ‘backing up’ those messages.”

But... I think that's just wrong. Facebook retaining this data (while blocking the users from accessing it themselves) is clearly a "backup." It's a backup in case there is a reason why, at some future date, the content needs to be restored. Under the judge's own interpretation, if you back up your hard drive and then the drive crashes, your backup is no longer a backup, because there's no original. But... that's completely nonsensical.

The judge relies on (not surprisingly) a case in which the DOJ twisted and stretched the limits of the SCA to get access to private communications:

Nearly all “backup storage” litigation relates to delivered, undeleted content. That case law informs and supports the Court’s decision here. “Although there is no binding circuit precedent, it appears that a clear majority of courts have held that emails opened by the intended recipient (but kept on a web-based server like Gmail) do not meet the [backup protection] definition of ‘electronic storage.’” Sartori v. Schrodt, 424 F. Supp. 3d 1121, 1132 (N.D. Fla. 2019) (collecting cases). The Department of Justice adopted this view, finding that backup protection “does not include post-transmission storage of communications.” U.S. Dep’t of Just., Searching and Seizing Computers and Obtaining Electronic Evidence in Criminal Investigations, 123 (2009). The Gambia argues for following the majority view’s limited definition of backup storage. See Sartori, 424 F. Supp. 3d at 1132; ECF No. 16 (Pet’r’s Resp. to Surreply) at 5–6. If undeleted content retained by the user is not in backup storage, it would defy logic for deleted content to which the user has no access to be in backup storage.

As for Facebook's argument (which makes sense to me) that its entire reason for retaining the account data shows that it's a backup, the judge just doesn't buy it.

Facebook argues that because the provider-deleted content remains on Facebook servers in proximity to where active content on the platform is stored, both sets of content should be protected as backup storage. See Conf. Tr. at 76. However, the question is not where the records are stored but why they are stored. See Theofel, 359 F.3d at 1070. Facebook claims it kept the instant records as part of an autopsy of its role in the Rohingya genocide. See Conf. Tr. at 80–81. While admirable, that is storage for self-reflection, not for backup.

The judge also brushes aside the idea that there are serious privacy concerns with this result, mainly because the judge doesn't believe Facebook cares about privacy. That, alone, is kind of a weird way to rule on this issue.

Finally, Facebook advances a policy argument, opining that this Court’s holding will “have sweeping privacy implications—every time a service provider deactivates a user’s account for any reason, the contents of the user’s communications would become available for disclosure to anyone, including the U.S. government.”.... Facebook taking up the mantle of privacy rights is rich with irony. News sites have entire sections dedicated to Facebook’s sordid history of privacy scandals.

So... because Facebook doesn't have a great history regarding the protection of privacy... we should make it easier for Facebook to expose private communications? What? And even if it's true that Facebook has made problematic decisions in the past regarding privacy, that's wholly separate from the question of whether it has a legal obligation to protect the privacy of messages now.

Furthermore, the judge insists that even if there are privacy concerns, they are "minimal":

The privacy implications here are minimal given the narrow category of requested content. Content urging the murder of the Rohingya still permeates social media. See Stecklow, supra (documenting “more than 1,000 examples . . . of posts, comments, images and videos attacking the Rohingya or other Myanmar Muslims that were on Facebook” even after Facebook apologized for its services being “used to amplify hate or exacerbate harm against the Rohingya”). Such content, however vile, is protected by the SCA while it remains on the platform. The parade of horribles is limited to a single float: the loss of privacy protections for de-platformed content. And even that could be mitigated by users joining sites that do not de-platform content.

Yes, in this case. But this could set a precedent for accessing a ton of other private communications as well, and that's what's worrying. It's absolutely bizarre and distressing that the judge doesn't bother to think through the implications of this ruling beyond just this one case.

Prof. Orin Kerr, one of the foremost experts on ECPA and the SCA, notes that this is both an "astonishing interpretation" and "stunning."

The entire ruling is concerning -- and feels like yet another situation where someone's general disdain for Facebook and its policies (a totally reasonable position to take!) colored the analysis of the law. And the end result is a lot more dangerous for everyone.


Filed Under: backup, deleted profiles, ecpa, gambia, myanmar, privacy, sca, stored communications act, zia faruqui
Companies: facebook

Reader Comments
Dallas Wonder (profile), 8 Oct 2021 @ 4:38pm

Why is the SCA being applied to a crime that happened in Myanmar and a law enforcement agency in Gambia?
