from the not-good,-not-good-at-all dept
As the big push is on to approve two internet-focused antitrust bills, the American Innovation and Choice Online Act (AICOA) and the Open App Markets Act, we’ve been calling out that while the overall intentions of both may be good, there are real concerns with the language of each bill and how it could impact content moderation debates. Indeed, it seems pretty clear that the only reason these bills have strong support from Republicans is because they know the bills can be abused to attack editorial discretion.
There have been some other claims made about problems with these bills, though some of them seem overblown to me (for example, the claims that the Open App Markets bill would magically undermine security on mobile phones). However, Bruce Schneier now points out another potential issue with both bills that seems like a legitimate concern. They both could be backdoors to pressuring companies into blocking encryption apps. He starts by highlighting how it might work with AICOA:
Let’s start with S. 2992. Sec. 3(c)(7)(A)(iii) would allow a company to deny access to apps installed by users, where those app makers “have been identified [by the Federal Government] as national security, intelligence, or law enforcement risks.” That language is far too broad. It would allow Apple to deny access to a third-party provider of encrypted cloud backups (a service Apple does not currently offer). All Apple would need to do is point to any number of FBI materials decrying the security risks of “warrant-proof encryption.”
Sec. 3(c)(7)(A)(vi) states that there shall be no liability for a platform “solely” because it offers “end-to-end encryption.” This language is too narrow. The word “solely” suggests that offering end-to-end encryption could be a factor in determining liability, provided that it is not the only reason. This is very similar to one of the problems with the encryption carve-out in the EARN IT Act. The section also doesn’t mention any other important privacy-protective features and policies, which also shouldn’t be the basis for creating liability for a covered platform under Sec. 3(a).
It gets worse:
In Sec. 2(a)(2), the definition of business user excludes any person who “is a clear national security risk.” This term is undefined, and as such far too broad. It can easily be interpreted to cover any company that offers an end-to-end encrypted alternative, or a service offered in a country whose privacy laws forbid disclosing data in response to US court-ordered surveillance. Again, the FBI’s repeated statements about end-to-end encryption could serve as support.
Finally, under Sec. 3(b)(2)(B), platforms have an affirmative defense for conduct that would otherwise violate the Act if they do so in order to “protect safety, user privacy, the security of nonpublic data, or the security of the covered platform.” This language is too vague, and could be used to deny users the ability to use competing services that offer better security/privacy than the incumbent platform—particularly where the platform offers subpar security in the name of “public safety.” For example, today Apple only offers unencrypted iCloud backups, which it can then turn over to governments who claim this is necessary for “public safety.” Apple could raise this defense to justify blocking third-party services from offering competing, end-to-end encrypted backups of iMessage and other sensitive data stored on an iPhone.
And the Open App Markets bill has similar issues:
S. 2710 has similar problems. Sec. 7(6)(B) contains language specifying that the bill does not “require a covered company to interoperate or share data with persons or business users that…have been identified by the Federal Government as national security, intelligence, or law enforcement risks.” This would mean that Apple could ignore the prohibition against private APIs, and deny access to otherwise private APIs, for developers of encryption products that have been publicly identified by the FBI. That is, end-to-end encryption products.
Some might push back on this by pointing out that Apple has strongly supported encryption over the years, but these bills open up some potential problems, and, at the very least, might allow companies like Apple to block third-party encryption apps — even though the stated purpose of the bills is the opposite.
As Schneier notes, he likes both bills in general, but this sloppy drafting is a problem.
The same is true of the language that could impact content moderation. In both cases, this appears to be messy drafting (though in the content moderation case, Republicans have seized on it and made it the main reason they support these bills, beyond general populist anger towards “big tech”).
Once again, the underlying thinking behind both bills seems mostly sound, but these problems again suggest that these bills are, at best, half-baked, and could do with some careful revisions. Unfortunately, the only revisions we’ve seen so far are those that carved out a few powerful industries.