Maira Sutton's Techdirt Profile


Posted on Techdirt - 4 March 2020 @ 01:43pm

Defeating Tech Giants With Open Protocols, Interoperability, And Shared Stewardship

Across the ideological spectrum, there seems to be a consensus that something must be done about the biggest tech companies — that the legal mechanisms we currently have to address monopolization in the United States are inadequate to deal with the realities of the digital market. Even as our institutions appear powerless in the face of Big Tech’s massive lobbying power, an idea is gaining traction as a viable approach to curbing the societal and economic impacts of tech monopolies. The idea is to restore the core of a healthy internet ecosystem: interoperability and the revival of open protocols.

Lawmakers, policy experts, and even Twitter are advocating for tech companies to open up their platforms so that other services and start-ups can enter the playing field. There are different approaches to doing so, and varying layers of interoperability are possible. The overarching goal, however, is to get rid of the worst aspects of tech monopolization and bring about a new era of competition and innovation.

As a writer and nonprofit tech entrepreneur who has focused on projects promoting digital justice and community networks, I became interested in these ideas. Why do we need to extend interoperability into the application layer? How do we create new Internet standards that open up our networks and platforms in ways that invite new features and applications that better respect our individual and collective rights online? As I examined the three most likely scenarios being discussed, I realized that we had much to learn from the past. We need to revive the power of standards bodies, and ensure that they stay relevant and effective by observing known principles about how to successfully govern a commons.

A Brief Overview of Interoperability and Competition

What made the early Internet so exciting was how quickly it changed. Different services like bulletin board systems (BBS), email, and Internet relay chat (IRC) came about and allowed people to communicate in ways that were impossible before. That rich ecosystem of tools and services was enabled by downstream innovation. New applications and features could be built on existing technologies, with or without permission from the prevailing tech companies. Yes, there were plenty of lawsuits against these start-ups back then. But people were still willing to take the risk, and there were investors willing to back them. And there were fewer onerous laws hindering experimental technologies.

Perhaps most importantly, much of the Internet ran on open protocols and standards. The academics and others who initially designed the protocols wanted to build a relatively free ecosystem, so they made it possible for services to interoperate with each other. Standards bodies like the World Wide Web Consortium (W3C) established shared protocols in the name of the collective interest. These institutions have helped companies and organizations come together and set rules based on agreed-upon needs, making them transparent and representative of the interests of more than one stakeholder. At standards bodies, companies sit alongside non-profit organizations, educational institutions, policy experts, and academics.
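Email remains the canonical example of what a shared, openly specified format buys: a message built by one program can be read by any other, with no coordination between vendors. A minimal sketch using Python's standard library (the addresses are placeholders):

```python
from email import message_from_string
from email.message import EmailMessage

# Because email is defined by open standards (RFC 5322 for the message
# format, SMTP for transport), a message produced by any conforming
# program can be read by any other.
msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.net"
msg["Subject"] = "Interoperability"
msg.set_content("Any standards-compliant client can parse this message.")

raw = msg.as_string()  # the on-the-wire RFC 5322 form

# A completely different program can reconstruct the message from the
# raw text alone: no shared vendor, just a shared standard.
parsed = message_from_string(raw)
print(parsed["Subject"])  # -> Interoperability
```

No platform owns this format, which is precisely what lets competing clients and servers coexist on the same network.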

But standards bodies have grown increasingly inefficient and exploitable. Not only were they always slow and under-resourced; tech companies also grew powerful enough to bend them to their will or ignore them altogether, building walled gardens with no interoperability built into their platforms (beyond some public APIs of varying consistency). In the era of Move Fast and Break Things, there was little patience for the kind of multi-stakeholder dialogue and decision-making required to build and conform to shared technical standards.

There has been little incentive for tech companies to play well with others. Not only that, it’s become the norm for tech monopolies to destroy any competition. Laws that regulate the internet, such as the Digital Millennium Copyright Act (DMCA) and the Computer Fraud and Abuse Act (CFAA), have had a chilling effect on the types of innovation that were characteristic of the early internet. These regulations can be weaponized by big players to crush new start-ups over even the most trivial violations. And when they don’t sue them, tech companies can easily buy out smaller competitors, or throw all their resources into imitating their services until they crush them.

Now most people communicate, get news, and publish their work through closed platforms run as web services. When people think of the Internet, they think about the platforms, not the protocols that run beneath them and make them work. To many, email is Gmail, chat is Slack, and discussion forums are Facebook.

Of course the underlying protocols are still core to the Internet’s functionality. But these closed platforms severely lack the traits of interoperability. As we’ve become dependent on them, our digital lives have been left at the mercy of companies whose primary goal is to enclose as much of the Internet’s infrastructure as they can get away with. Especially when it comes to social networks, their ability to mediate every aspect of our relationships and interactions online has come at an immense cost to our right to free expression, privacy, and access to knowledge.

Possible Paths Towards an Interoperable Internet

There are those who are calling for a revival of antitrust enforcement to break up the tech monopolies. But federal agencies in the U.S. such as the Federal Trade Commission (FTC) move too slowly and are under-resourced. And then there are others who say that breaking up the tech companies is entirely the wrong approach — that we need to build protocols to again make the Internet more interoperable as it was in the early days.

The European Commission, the Electronic Frontier Foundation, the University of Chicago Booth School of Business, Mozilla, Twitter CEO Jack Dorsey, and others are calling for a revival of interoperability as a means to address Big Tech’s dominance over the Internet. Among them they present three possible ways this could come about, with or without state intervention.

1) State Antitrust Enforcement

Through litigation or legislative action, the state could require companies to make their platforms more open and interoperable. Mozilla’s Chris Riley asserts that the agency best suited to take this on would be the FTC, which has an explicit mandate to protect consumers and enforce U.S. antitrust laws. Harold Feld of Public Knowledge calls for an entirely new agency empowered to oversee the implementation of any proposed law enforcing digital platform competition, given the technical complexities of enforcing such a law.

There is precedent for this in Europe. The European Commission brought a case against Microsoft in the early 2000s that resulted in the company being required to release information enabling competing software to interoperate with Windows desktops. The U.S. and Europe have their own approaches to antitrust, of course. Interoperability enforcement would look very different depending on which state(s) had the mandate to move forward with this type of action.

2) Established Platform Companies Seek Standardization

One of the big players could willingly embark on a path to build open protocols. In December, Twitter CEO Jack Dorsey announced Blue Sky, an initiative to help develop an open and decentralized standard for social media. In his Twitter thread about the project, Dorsey says that Twitter would fund further development of an existing decentralized standard or as he says, “create one from scratch”.

Many responded to him asking about ActivityPub — the protocol behind Mastodon, the federated alternative to Twitter. Why wouldn’t Twitter invest its resources into that? Dorsey responded that it might be possible, but that it’s up to the Blue Sky team to decide whether that protocol would be best. It’s worth pointing out that ActivityPub has already gone through the W3C process and is an official W3C Recommendation.
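To make the discussion concrete, here is what a hypothetical ActivityPub "Create" activity wrapping a Note looks like, following the shapes defined in the ActivityStreams 2.0 vocabulary; the server name and content below are invented for illustration:

```python
import json

# A minimal ActivityStreams 2.0 "Create" activity, the kind of JSON
# object federated servers exchange under ActivityPub. The actor URL
# and note text are made-up examples.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://example.social/users/alice",
    "object": {
        "type": "Note",
        "content": "Hello, fediverse!",
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
    },
}

# Any server that speaks the protocol can serialize and parse this,
# regardless of who built it.
payload = json.dumps(activity)
parsed = json.loads(payload)
print(parsed["object"]["type"])  # -> Note
```

Because the vocabulary is a published standard rather than a company's private API, a Mastodon instance, or any new entrant, can interoperate without asking anyone's permission.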

It makes sense that a major platform company would want to decentralize its platform, the most obvious reason being to relieve itself of responsibility over content moderation. The second reason is to fortify itself against even bigger competitors, like Facebook, that threaten to enclose even more of the Internet.

3) Building Open Protocols from Scratch

Within the last seven years there’s been an explosion of decentralized protocols, dealing with everything from currency and commerce to social media and decision-making. We are way beyond the proof of concept stage. There are all kinds of ways to build decentralized protocols — based on gossip, distributed files, blockchains, or federated databases. The issue isn’t whether decentralization is technically feasible. The issue is that there are so many ways to do it, and each protocol suits different use cases.

Developer and writer Jay Graber compared a few of the most well-known decentralized social network protocols. She explains the pros and cons of each protocol and how they operate. Protocols that put users in full control over their data and identity in a network can be too technically challenging for the average user. Protocols that rely on append-only logs, such as Secure Scuttlebutt, make it impossible to edit or delete posts. Federated networks can carry many of the same user-friendly features as centralized networks, but still leave the server administrators hosting the network with the same challenges — such as overseeing content moderation and platform security. So while protocols can be more neutral than platforms, they still contain biases.
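The append-only trade-off can be sketched with a toy hash-chained log. This is only illustrative (Secure Scuttlebutt's real feed format also involves author keys, sequence numbers, and signatures), but it shows why retroactive edits break the chain:

```python
import hashlib
import json

def entry_hash(entry):
    # Deterministic hash of one log entry.
    return hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()

def append(log, content):
    # Each new entry commits to the hash of the previous one.
    prev = entry_hash(log[-1]) if log else None
    log.append({"prev": prev, "content": content})

def verify(log):
    # The chain is valid only if every entry points at its predecessor.
    for i in range(1, len(log)):
        if log[i]["prev"] != entry_hash(log[i - 1]):
            return False
    return True

log = []
append(log, "first post")
append(log, "second post")
assert verify(log)

# "Editing" an old post invalidates every entry that follows it.
log[0]["content"] = "revised post"
assert not verify(log)
```

That immutability is what lets peers replicate each other's feeds without trusting a central server, and it is also exactly why deleting a regretted post is so hard in such systems.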

This Is a Human Problem, Not a Technical One

If we’re talking about interoperability, we’re talking about public Internet infrastructure. Open protocols and standards are part of a digital commons, and a commons thrives when people use and maintain it together.

As economist Elinor Ostrom demonstrated in her Nobel Prize-winning work, commoning is a social practice. What Ostrom asserted was that the success or failure of a commons, or what she called a “common-pool resource”, rested on “how a group of principals who are in an interdependent situation can organize and govern themselves to obtain continuing joint benefits when all face temptations to free-ride, shirk or otherwise act opportunistically.” For any type of commons, it’s the relationships between those governing and relying on shared resources together that’s central to its success.

Standards bodies were in many ways an implementation of Ostrom’s eight principles for a commons — what she found were the basic elements needed for a commons to be governed sustainably and equitably. Thus Dorsey’s call to build an “open community” around a new social media protocol is encouraging. It suggests the need to build organizations that keep various stakeholders engaged in an open dialogue about how we make social networks open and interoperable. This is the promise of a functional standards body. When they’re robust and effective, they can play a critical role in ensuring that the Internet remains free, open, and equitable.

Overcoming the Challenges

It’s exciting to see this resurgence of energy for greater interoperability. But I’m not going to let myself get too hopeful. Anyone involved in this project needs to prove that it will be done right. Whether it comes about through state-driven antitrust enforcement, tech companies’ own initiative, or bottom-up development of a new decentralized protocol, the manner in which protocols come about will be critical.

There are foreseeable issues with all three paths towards interoperability. Even if the government were to regulate companies, the state is feeble in the face of overpowering influence from corporate lobbyists. The revolving door that exists between private industry and public oversight bodies nearly guarantees a compromised process. Companies can’t be trusted either. Independent developers have been continually burned by companies being unpredictable and negligent regarding the availability of public APIs — not to mention the hundreds of other ways they’ve violated public trust. Finally, nearly all of the decentralized web protocols were built by lone geniuses who collectively represent one demographic. If new protocols are to address the needs of diverse online communities, more types of people will need to be involved in their development.

Internet interoperability cannot be a project embarked on for the sake of profit, power, or someone’s ego. Who is or is not in the room when critical decisions are being made about the protocol will make or break whether it will succeed in bringing about a more interoperable internet. We ought to learn from experts who know what it takes to govern shared resources together. If we’re serious about re-building the Internet as public infrastructure, we need to be prudent enough to assemble the types of organizations that can steward the Internet protocols of the future.

Further Reading

Posted on Techdirt - 6 March 2015 @ 04:02pm

The White House Has Gone Full Doublespeak On Fast Track And The TPP

Sen. Ron Wyden and Sen. Orrin Hatch are now in a stand-off over a bill that would put secretive trade deals like the Trans-Pacific Partnership (TPP) agreement on the Fast Track to passage through Congress. The White House, meanwhile, has intensified its propaganda campaign, going so far as to mislead the public about how trade deals like the TPP and its counterpart, the Transatlantic Trade and Investment Partnership (TTIP), will affect the Internet and users’ rights. They have created videos, written several blog posts, and this week even sent out a letter from an “online small business owner” to everyone on the White House’s massive email list, to further misinform the public about Fast Track.

In a blog post published this week, the White House flat out uses doublespeak to tout the benefits of the TPP, even going so far as to claim that without these new trade agreements, “there would be no rules protecting American invention, artistic creativity, and research”. That is bogus, much like the other falsehoods the White House has recently been telling about its trade policies. Let’s look at the four main myths it has been using to sell lawmakers and the public on Fast Track for the TPP.

Myth #1: TPP Is Good for the Internet

First, there are the claims that this agreement will create “stronger protections of a free and open Internet”. As we know from previous leaks of the TPP’s Intellectual Property chapter, the complete opposite is true. Most of all, the TPP’s ISP liability provisions could create greater incentives for Internet and content providers to block and filter content, or even monitor their users in the name of copyright enforcement. What the White House presents as efforts to protect the future of the Internet are provisions it is advocating in this and other secret agreements on the “free flow of information”. In short, these are policies aimed at subverting data localization laws.

Such an obligation could be a good or a bad thing, depending on what kind of impact it could have on national censorship, or consumer protections for personal data. It’s a complicated issue without an easy solution, which is exactly why this should not be decided through secretive trade negotiations. These “free flow of information” rules have likely been lobbied for by major tech companies, which do not want laws to restrict them on how they deal with users’ data. It is dishonest to say that what these tech companies can do with people’s data is good for all users and the Internet at large.

Myth #2: Fast Track Would Strengthen Congressional Oversight

The second, oft-repeated claim is that Fast Track would strengthen congressional oversight, which is again not true. The U.S. Trade Representative has made this claim repeatedly over the past couple of months, including at a Senate Finance Committee hearing in January, when he said:

TPA puts Congress in the driver’s seat to define our negotiating objectives and strengthens Congressional oversight by requiring consultations and transparency throughout the negotiating process.

Maybe we could believe this if the White House had fought for Fast Track before delegates began negotiating the TPP and TTIP. Maybe it could also have been true if the bill had ensured that Congress members had easy access to the text and kept a close leash on the White House throughout the process, to ensure that the negotiating objectives they had outlined were in fact being met in the deal. However, we know from the past several years of TPP negotiations that Congress has largely been shut out of the process. Many members of Congress have spoken out about the White House’s strict rules that have made it exceedingly difficult to influence or even see the terms of these trade deals.

The only way Fast Track could really put “Congress in the driver’s seat” over trade policy would be if it fully addressed the lack of congressional oversight over the TPP and TTIP thus far. Lawmakers should be able to hold unlimited debate over the policies being proposed in these deals and, if it comes to it, to amend their provisions. A new Fast Track bill that enabled more congressional oversight would be meaningless if it did not apply to agreements that are ongoing or nearly complete.

Myth #3: Small Online Businesses Would Benefit from Fast Track

Then the third misleading claim is that Fast Track would help small businesses. Their repetition of this has become louder amid increasing public awareness that the TPP has primarily been driven by major corporations. What may be good for established multinational companies could also benefit certain small online businesses. The White House says that tariffs are hindering small online businesses from selling their products abroad, but research has shown that the kinds of traditional trade barriers, like tariffs, that past trade agreements were negotiated to address are already close to non-existent. It is therefore unclear what other benefits online businesses would see from the TPP.

Even if there were some benefits, there are many more ways that the TPP could harm small Internet-based companies. The TPP’s copyright provisions could lead to policies where ISPs would be forced to implement costly systems to oversee all users’ activities and process each takedown notice they receive. They could also discourage investment in new innovative start-ups, even those that plan to “play by the rules”, due to the risk that companies would have to sink significant resources into legal defenses against copyright holders, or face heavy deterrent penalties for infringement established by the TPP.

Myth #4: TPP and Other Secret Trade Deals Are a National Security Issue

The last, and most confounding, of the White House’s assertions is that the TPP and TTIP are an “integral part” of the United States’ national security strategy, because its “global strategic interests are intimately linked with [its] broader economic interests.” As we have seen with the U.S. government’s expansive surveillance regime, “national security” is often invoked for policies even when they directly undermine our civil liberties. It is hard to argue with the administration about whether the TPP and TTIP are in fact in the United States’ economic or strategic interests, since only they are allowed to see the entire contents of these agreements. Either way, it seems like a huge stretch to say that we can trust the White House and major corporate representatives to determine, in secret, what is in fact good digital policy for the country and the world. We may be hearing this line more and more in the coming weeks as the White House becomes more desperate to legitimize the need for Fast Track to pass the TPP and TTIP.


The fact that the White House has resorted to distorting the truth about its trade policies is enough to demonstrate how little the administration values honesty and transparency in policy making, and how much the public stands to lose from these agreements negotiated in secret. The more they try to espouse the potential gains from Fast Track, while the trade agreements this legislation would advance remain secret, the more reason we have to be skeptical. If the TPP is so great, and if Fast Track would in fact enable more democratic oversight, why are the contents of either of them still not public?

Reposted from the Electronic Frontier Foundation Deeplinks blog
