Social Responsibility Organization Says Meta’s Embrace Of Encryption Is Important For Human Rights
from the encryption-protects-human-rights dept
Encryption is under attack from all over the world. Australia already has a law on the books trying to force companies to backdoor encryption. The UK is pushing its Online Safety Bill, which would be an attack on encryption (the UK government has made it clear it wants an end to encryption). In the US, we have the EARN IT Act, whose author, Senator Richard Blumenthal, has admitted he sees it as a necessary attack on companies who “hide behind” encryption.
All over the world, politicians and law enforcement officials insist that they need to break encryption to “protect” people. This has always been false. If you want to protect people, you want them to have (and use) encryption.
Against this backdrop, we have Meta/Facebook. While the company has long supported end-to-end encryption in WhatsApp, it’s been rolling it out on the company’s other messaging apps as well. Even if part of the reason for enabling encryption is about protecting itself, getting more encryption out to more people is clearly a good thing.
And now there’s more proof of that. Business for Social Responsibility is a well-respected organization, which Meta asked to do a “human rights assessment” of its expanded use of end-to-end encryption. While the report was paid for by Meta, BSR’s reputation is unimpeachable. It’s not the kind of organization that throws away its reputation because a company paid for some research. The end result is well worth reading, but, in short, BSR finds that the expansion of end-to-end encryption is an important step in protecting human rights.
The paper is thorough and careful, details its methodology, and basically proves what many of us have been saying all along: if you’re pushing to end or diminish end-to-end encryption, you are attacking human rights. The key point:
Privacy and security while using online platforms should not only be the preserve of the technically savvy and those able to make proactive choices to opt into end-to-end encrypted services, but should be democratized and available for all.
The report notes that we’re living in a time of rising authoritarianism, and end-to-end encryption is crucial in protecting people fighting back against such efforts. The report is careful and nuanced, and isn’t just a one-sided “all encryption must be good” kind of thing. It does note that there are competing interests.
The reality is much more nuanced. There are privacy and security concerns on both sides, and there are many other human rights that are impacted by end-to-end encrypted messaging, both positively and negatively, and in ways that are interconnected. It would be easy, for example, to frame the encryption debate not only as “privacy vs. security” but also as “security vs. security,” because the privacy protections of encryption also protect the bodily security of vulnerable users. End-to-end encryption can make it more challenging for law enforcement agencies to access the communications of criminals, but end-to-end encryption also makes it more challenging for criminals to access the communications of law-abiding citizens.
As such, the report highlights the various tradeoffs involved in encrypting more communications, but notes:
Meta’s expansion of end-to-end encrypted messaging will directly result in the increased realization of a range of human rights, and will address many human rights risks associated with the absence of ubiquitous end-to-end encryption on messaging platforms today. The provision of end-to-end encrypted messaging by Meta directly enables the right to privacy, which in turn enables other rights such as freedom of expression, association, opinion, religion, and movement, and bodily security. By contrast, the human rights harms associated with end-to-end encrypted messaging are largely caused by individuals abusing messaging platforms in ways that harm the rights of others—often violating the service terms that they have agreed to. However, this does not mean that Meta should not address these harms; rather, Meta’s relationship to these harms can help identify the types of leverage Meta has available to address them.
The report notes that concerns that Meta, by enabling end-to-end encryption, will empower more bad actors do not seem to be supported by evidence, since bad actors already have a plethora of encrypted communications channels at their disposal:
If Meta decided not to implement end-to-end encryption, the most sophisticated bad actors would likely choose other end-to-end encrypted communications platforms. Sophisticated tech use is increasingly part of criminal tradecraft, and the percentage of criminals without the knowledge and skills to use end-to-end encryption will continue to decrease over time. For this reason, if Meta chose not to provide end-to-end encryption, this choice would likely not improve the company’s ability to help law enforcement identify the most sophisticated and motivated bad actors, who can choose to use other end-to-end encrypted messaging products.
While the report notes that things like child sexual abuse material (CSAM) are a serious issue, focusing solely on scanning everything and trying to block it is not the only (or even the best) way of addressing the issue. Someone should send this to the backers of the EARN IT Act, which is predicated on forcing more companies to scan more communications.
Content removal is just one way of addressing harms. Prevention methods are feasible in an end-to-end encrypted environment, and are essential for achieving better human rights outcomes over time. The public policy debate about end-to-end encryption often focuses heavily or exclusively on the importance of detecting and removing problematic, often illegal content from platforms, whether that be CSAM or terrorist content. Content removal is important, but so too is preventing harm from occurring in end-to-end encrypted messaging through the use of behavioral signals, public platform information, user reports, and metadata to identify and interrupt problematic behavior before it occurs.
The report also, correctly, calls out how the “victims” in this debate are most often vulnerable groups — the kind of people who really could use much more access to private communications. It also notes that while some have suggested “technical mitigations” that can be used to identify illegal content in encrypted communications, these mitigations are “not technically feasible today.” This includes the much discussed “client-side” scanning idea that Apple has toyed with.
Methods such as client-side scanning of a hash corpus, trained neural networks, and multiparty computation including partial or fully homomorphic encryption have all been suggested by some as solutions to enable messaging apps to identify, remove, and report content such as CSAM. They are often collectively referred to as “perceptual hashing” or “client-side scanning,” even though they can also be server-side. Nearly all proposed client-side scanning approaches would undermine the cryptographic integrity of end-to-end encryption, which because it is so fundamental to privacy would constitute significant, disproportionate restrictions on a range of rights, and should therefore not be pursued.
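To see why client-side scanning undermines end-to-end encryption even though the ciphertext itself stays intact, consider this deliberately simplified sketch. It is a hypothetical illustration, not a description of any real system: actual proposals use perceptual (similarity-based) hashes rather than the exact cryptographic hash used here, and the function names are invented for the example. The point is structural: the scan runs on plaintext before encryption, so any match report leaks information about message content outside the encrypted channel.

```python
import hashlib

# Hypothetical blocklist of known-bad content hashes (illustrative only;
# real systems use perceptual hashes, not exact SHA-256 matches).
BLOCKLIST = {hashlib.sha256(b"known-bad payload").hexdigest()}

def client_side_scan(plaintext: bytes) -> bool:
    """Return True if the plaintext matches the hash corpus."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def send_message(plaintext: bytes, encrypt, report):
    # The scan sees the plaintext, so the report() call reveals content
    # information to a third party -- and the same code path could just
    # as easily transmit the plaintext itself. This is the "backdoor"
    # structure critics object to.
    if client_side_scan(plaintext):
        report(plaintext)
        return None
    return encrypt(plaintext)
```

The sketch also shows why "expand the blocklist" is the feared failure mode: nothing in the architecture limits `BLOCKLIST` to CSAM hashes, so a government mandate could repurpose the same mechanism for any content it dislikes.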
The report also notes that even if someone came up with a backdoor technology that allowed Meta to scan encrypted communications, the risks to human rights would be great, given that such technology could be repurposed in dangerous ways.
For example, if Meta starts detecting and reporting universally illegal content like CSAM, some governments are likely to exploit this capability by requiring Meta to block and report legitimate content they find objectionable, thereby infringing on the privacy and freedom of expression rights of users. It is noteworthy that even some prior proponents of homomorphic encryption have subsequently altered their perspective for this reason, concluding that their proposals would be too easily repurposed for surveillance and censorship. In addition, these solutions are not foolproof; matching errors can occur, and bad actors may take advantage of the technical vulnerabilities of these solutions to circumvent or game the system.
The report notes that there are still ways that encrypted communications can be at risk, even name-checking NSO Group’s infamous Pegasus spyware.
How about all the usual complaints from law enforcement about how greater use of encryption will destroy their ability to solve crimes? BSR says “not so fast…”
While a shift to end-to-end encryption may reduce law enforcement agency access to the content of some communications, it would be wrong to conclude that law enforcement agencies are faced with a net loss in capability overall. Trends such as the collection and analysis of significantly increased volumes of metadata, the value of behavioral signals, and the increasing availability of artificial intelligence-based solutions run counter to the suggestion that law enforcement agencies will necessarily have less insight into the activities of bad actors than they did in the past. Innovative approaches can be deployed that may deliver similar or improved outcomes for law enforcement agencies, even in the context of end-to-end encryption. However, many law enforcement entities today lack the knowledge or the resources to take advantage of these approaches and are still relying on more traditional techniques.
Still, the report does note that Meta should take responsibility in dealing with some of the second- and third-order impacts of ramping up encryption. To that end, it does suggest some “mitigation measures” Meta should explore — though noting that a decision not to implement end-to-end encryption “would also more closely connect Meta to human rights harm.” In other words, if you want to protect human rights, you should encrypt. In fact, the report is pretty bluntly direct on this point:
If Meta were to choose not to implement end-to-end encryption across its messaging platforms in the emerging era of increased surveillance, hacking, and cyberattacks, then it could be considered to be “contributing to” many adverse human rights impacts due to a failure to protect the privacy of user communications.
Finally, the paper concludes with a series of recommendations for Meta on how to “avoid, prevent, and mitigate the potential adverse human rights impacts from the expansion of end-to-end encryption, while also maximizing the beneficial impact end-to-end encryption will have on human rights.”
The report has 45 specific (detailed and thoughtful) recommendations to that end. Meta has already committed to fully implementing 34 of them, partly implementing four more, and assessing six others. Meta rejected only one recommendation, and it concerns “client-side scanning,” which the report itself was already nervous about (see above). That recommendation suggested that Meta “continue investigating” client-side scanning techniques in case a method is eventually developed that avoids the problems detailed above. Meta, however, says it sees no reason to continue exploring such a technology. From Meta’s response:
As the HRIA highlights, technical experts and human rights stakeholders alike have raised significant concerns about such client-side scanning systems, including impacts on privacy, technical and security risks, and fears that governments could mandate they be used for surveillance and censorship in ways that restrict legitimate expression, opinion, and political participation that is clearly protected under international human rights law.
Meta shares these concerns. Meta believes that any form of client-side scanning that exposes information about the content of a message without the consent and control of the sender or intended recipients is fundamentally incompatible with an E2EE messaging service. This would be the case even with theoretical approaches that could maintain “cryptographic integrity” such as via a technology like homomorphic encryption—which the HRIA rightly notes is a nascent technology whose feasibility in this context is still speculative.
People who use E2EE messaging services rely on a basic premise: that only the sender and intended recipients of a message can know or infer the contents of that message. As a result, Meta does not plan to actively pursue any such client-side scanning technologies that are inconsistent with this user expectation.
We spend a lot of time criticizing Facebook/Meta around these parts, as the company often seems to trip over itself in trying to do the absolutely wrongest thing over and over again. But on this it’s doing a very important and good thing. The BSR report confirms that.
Filed Under: client-side scanning, csam, encryption, end to end encryption, human rights, messenger
Companies: facebook, instagram, meta, whatsapp
Comments on “Social Responsibility Organization Says Meta’s Embrace Of Encryption Is Important For Human Rights”
Client-side scanning is a backdoor in an encryption system, and has all the problems of any backdoor. If the software can look at contents and send a hash, it can also be easily modified to send the actual contents.
Re: What a sad state of affairs
Not that long ago, there was the idea that E2E would become standard after the Snowden revelations, but it seems we’re on course to be right back where we started, with no encryption and everyone less safe.
What really makes me angry though is that most of these governments are using the idea of child safety to justify making everyone less safe including children who arguably need it the most. It’s a hollow and disingenuous means to justify mass surveillance.
There are ways to still tackle illegal content in E2E systems according to actual tech experts in a more surgical manner while keeping everyone safe, however it seems governments have taken the blunt force and privacy-invasive approach of SCAN ALL THE THINGS.
Re: Re: sad state...
…. and then blow the scan(s) all out of proportion.
I mean, if they want someone badly enough, they can simply substitute their own “decryption” of what was allegedly sent by the alleged bad guy (intentional), or they can actually decrypt it, but with poor results (not quite intentional, though plausible, given that “there’s no budget to purchase better tools”).
Of course, there’s always Option C…. simply ‘disappear’ the bad guy on mere suspicion (aka “upon information and belief”). But who’s going to go that far, eh?
It’s like they can’t grasp, or maybe don’t want to understand, that if you make ANY means by which the contents of an encrypted communication can be seen, it is by DEFINITION a backdoor and not E2E, and as we’ve seen time and time again, if a backdoor exists, it WILL be exploited by malicious actors (and that’s not even getting into the uses by supposedly “trusted” people. LOVEINT, anyone?).
There is no such thing as a backdoor that can ONLY be used by the “Good Guys”.
Politicians and security services want backdoors into everything so they know what ordinary people have found out about what these supposed top tiers are up to: what they are getting away with, what laws and bills they have sorted or are sorting out so as to continue having advantages in everything over us mere mortals. It keeps us basically enslaved, even if it isn’t through law; the actions used on us will achieve what they want: knowledge of all we know and control of how it’s used. That means our fate will be whatever they deem it will be, and whatever is to their greatest advantage.

Just look at what’s happening worldwide at the moment. Are any money people struggling to live? Of course not! Are any money people having to choose between heat and eat? Of course not! And are any money people having to constantly switch energy companies, or change from one shop to another, to reduce their outgoings because their incomes have plummeted? Of course not!

And how come the record-breaking price increases for energy, gas, and fuel are constantly blamed on shortages and on the producing countries intentionally holding supply back? How come there’s never even a hint that demand is reduced because of the greater use of recycled items and the vastly cheaper ways energy is now being produced, so that reliance on traditional production methods is less, just as our reliance on traditional carbon fuels is less? The problem then arises that those producing the traditional fuels in the traditional ways still want the same payments even though usage is much, much less. Therefore, we have to pay 50%+ extra for using 50% less, just so those massive earners can carry on being massive earners, and as usual, we’re the only ones who get hit in the pockets and lose out on everything, while none of the money people have to change a thing in their lives, go without anything, or even start any form of rationing!
Despite offering end-to-end encryption, WhatsApp collects a ton of metadata outside of that which is strictly necessary for transporting the communications. I’m sure that Facebook’s/Meta’s other messaging apps will continue to collect a ton of metadata after E2EE rolls out to them.
“Infer” is meaningless here because the Business for Social Responsibility’s report also recommends a ton of metadata collection and inference.
Behavioral signals? Metadata? To protect privacy, the implementation of E2EE must be combined with other practices, especially minimization of data collection and minimization of data retention.