YouTube is catching a lot of flak for suspending the account
of an Egyptian activist who had captured evidence of police brutality on video and uploaded it to the site. YouTube's "community guidelines" state that "graphic or gratuitous violence is not allowed" on YouTube. Apparently, that includes graphic or gratuitous violence perpetrated by governments against innocent civilians.

I have to say I don't understand why YouTube goes to so much trouble to censor "objectionable" content. If the goal is to keep such material away from children, there are effective ways to do that without censoring it altogether. Flickr, for example, permits pornographic photographs to be uploaded to its site, but it restricts access to them in various ways that help prevent children from inadvertently stumbling across them. YouTube should be able to implement a similar system. Instead of deleting objectionable content, it should flag it as objectionable. Flagged content might not show up on the home page or in the default search results, and viewing it might require clicking through a warning page first.

But it's hard to see what purpose is served by deleting the content entirely. The content will simply be posted somewhere else, where someone else will derive advertising revenue from it. And in the process, YouTube is inadvertently giving the impression that it is helping oppressive governments squelch criticism of their regimes.