Can Peer To Peer Make The Internet More Reliable?
from the networks-that-can't-be-shut-down dept
While so much of the focus on peer-to-peer networks is on file sharing, Simson Garfinkel thinks that’s something of a distraction. The broader benefits of peer-to-peer computing mean we should be looking at using it to make the internet more reliable. As he points out in the article, part of the recording industry’s fear of peer-to-peer networks is that they can’t be shut down. But couldn’t we take that same staying power and put it to good use? For example, if DNS ran over a peer-to-peer system, we wouldn’t have to worry about a denial of service attack (like the one that took down a bunch of root servers last year) completely killing DNS. Likewise, if websites were served from a peer-to-peer system, denial of service attacks or even “the Slashdot effect” would have very little impact. He even suggests it could be a way to avoid site defacements, though if an attacker could propagate a defacement across the peer-to-peer nodes, that protection would fail. Of course, he also points out that none of this is particularly easy to do. There certainly are people working on it, but that doesn’t mean we’ll have it any time soon.

I think this is all part of the pendulum that keeps swinging back and forth on the pros and cons of centralized vs. distributed data. If we really moved to a truly distributed P2P world, there would suddenly be articles about how inefficient it is to have so much redundant data out there, recommending a “new” and “innovative” system for corralling all that distributed data into a “centralized” database, and everyone would suddenly think that’s the next big thing. I don’t deny the benefits of distributed and redundant data. But both sides have pros and cons, and people seem to get pretty enamored with the potential of whichever one they’re not using.
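To make the DNS idea concrete: the usual approach is a distributed hash table, where each name maps to a point on a hash ring and the record is copied to the few nodes nearest that point, so knocking out some nodes doesn’t kill the lookup. Here’s a minimal toy sketch of that replication idea; the class and method names are hypothetical, and real systems (routing, caching, authentication) are far more involved than this.

```python
import hashlib

def ring_pos(key: str) -> int:
    """Hash a string onto a 32-bit position on the ring."""
    return int(hashlib.sha256(key.encode()).hexdigest(), 16) % (2**32)

class ToyP2PDNS:
    """Toy DHT-style name store (hypothetical): each record is
    replicated on the REPLICAS nodes clockwise from its hash,
    so it survives as long as any one replica is still up."""
    REPLICAS = 3

    def __init__(self, node_ids):
        # node id -> that node's local record store
        self.nodes = {nid: {} for nid in node_ids}

    def _ring(self):
        # Surviving nodes, sorted by their position on the ring.
        return sorted(self.nodes, key=ring_pos)

    def _owners(self, name):
        # The first node clockwise from the name's hash, plus successors.
        ring = self._ring()
        pos = ring_pos(name)
        start = next((i for i, n in enumerate(ring) if ring_pos(n) >= pos), 0)
        count = min(self.REPLICAS, len(ring))
        return [ring[(start + k) % len(ring)] for k in range(count)]

    def publish(self, name, address):
        # Copy the record to every replica node.
        for node in self._owners(name):
            self.nodes[node][name] = address

    def resolve(self, name):
        # Ask the current owners in order; any surviving replica answers.
        for node in self._owners(name):
            if name in self.nodes[node]:
                return self.nodes[node][name]
        return None

    def kill(self, node_id):
        # Simulate a node taken down by an attack or outage.
        del self.nodes[node_id]

# A record stays resolvable even after one of its replicas is killed:
dns = ToyP2PDNS([f"node{i}" for i in range(10)])
dns.publish("example.com", "93.184.216.34")
dns.kill(dns._owners("example.com")[0])
print(dns.resolve("example.com"))
```

This is only the replication half of the story; a real peer-to-peer DNS would also need distributed routing to find the owner nodes without a central directory, which is exactly the hard part Garfinkel alludes to.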