from the direct-sharing-files-is-hard dept
Parker Higgins has a great opinion piece over at Wired, which is ostensibly about the recent release of OnionShare, a tool for sharing large documents directly and securely between two individuals, but which looks deeper into the question of why, in 2014, sharing such large files directly without intermediaries is still such a challenge. And, as Higgins notes, a big part of that goes right back to… the copyright wars.
Groups like the Motion Picture Association of America (MPAA), the Recording Industry Association of America (RIAA), and others that make up the copyright lobby have actively campaigned against the kinds of tools that address these aims.
OnionShare creates direct connections between users, making it an example of peer-to-peer network architecture. The copyright lobby's got a long history with peer-to-peer: at least since Napster emerged a decade and a half ago, corporate copyright holders have endeavored to destroy examples of the tech. We live today with the disastrous results.
After 15 years of being attacked, villainized, and litigated over, peer-to-peer programs and protocols have become a hard sell for investment and development. And as centralized products have gotten a lion's share of the attention, their usability and market share have increased as well.
The simple fact is that the fight to protect one business model (out of many possible business models) for the entertainment industry has clearly had a pretty big negative impact on the development of new tools and services that would lead to greater privacy and security (and a more functioning free press):
The qualities that the copyright lobby dislikes about peer-to-peer are precisely the ones that make it a powerful choice for defenders of press freedom and personal privacy. Namely, peer-to-peer offers no convenient mechanism for centralized surveillance or censorship. By design, there's usually no middleman that can easily record metadata about transfers (who uploaded and downloaded what, when, and from where) or block those transfers.
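That direct-connection property can be illustrated with a small sketch: one peer serves a file over a plain TCP socket and the other connects straight to it, so no third-party server ever handles the bytes or logs the transfer. This is only an illustration of the peer-to-peer idea, not how OnionShare itself works — OnionShare additionally routes the connection through a Tor onion service and encrypts it, while this sketch uses a bare localhost socket.

```python
# Minimal sketch of a direct peer-to-peer transfer (illustrative only):
# one peer listens and serves a file, the other connects directly to it.
# No intermediary sits between them to record who sent what to whom.
import socket
import threading


def serve_file(data: bytes, host: str = "127.0.0.1") -> int:
    """Listen on an OS-chosen port and send `data` to the first peer."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))          # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]

    def handle():
        conn, _ = srv.accept()   # the downloading peer connects directly
        conn.sendall(data)
        conn.close()
        srv.close()

    threading.Thread(target=handle, daemon=True).start()
    return port


def fetch_file(host: str, port: int) -> bytes:
    """Connect straight to the serving peer and read until EOF."""
    chunks = []
    with socket.create_connection((host, port)) as conn:
        while chunk := conn.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks)


payload = b"the document travels peer to peer, with no third party between"
port = serve_file(payload)
received = fetch_file("127.0.0.1", port)
assert received == payload
```

The point of the sketch is simply that the transfer metadata exists only at the two endpoints; contrast that with uploading to a cloud service, where the operator necessarily sees (and can log) every transfer.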
So, if you’re concerned about how much metadata the NSA is scooping up from online services, you have the MPAA and RIAA and their legal fights partially to blame for that. The demonization of distributed, private peer-to-peer applications and protocols has driven users increasingly toward more centralized offerings. As Higgins further highlights, the third party doctrine, which gives less privacy protection to information held by third parties, makes this situation even worse.
The distinction is further reflected in the U.S. legal system, which often offers data that goes through a third party reduced protection. That premise, the "third party doctrine," is badly out-of-date, and produces counter-intuitive results in an era where the location of data storage is otherwise abstracted away. Already one Supreme Court Justice, Sonia Sotomayor, has called for reconsidering it. But as long as the third party doctrine exists, architectures like peer-to-peer that allow for direct communication, broadly speaking, provide more privacy protection against invasive government requests.
In short, the government wants ever more access to information, and centralized systems make that easy. Combine that with the RIAA/MPAAs of the world fighting to outlaw decentralized systems, or to chill investment in them, and you have a recipe for easy mass surveillance. A decentralized world is important for the internet to work correctly, but we’ve been increasingly pushed away from that.
The good news is that, with all the discussions of surveillance lately, a renewed push is being made for more decentralized systems. The success of decentralized cryptocurrencies like Bitcoin is helping things along as well. And there are a large number of other projects, each trying to decentralize a different aspect of today’s centralized systems. Hopefully, they won’t be deterred by litigation focused merely on preserving a particular business model.