I disagree. If you're not going to allow proper HTTPS functionality, the correct best practice is to have the HTTPS version automatically redirect to the non-HTTPS version via a 301 server redirect.
This ensures that anyone getting to the page will actually get a page.
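A minimal sketch of what that could look like, assuming a site willing to run a small Go listener on the HTTPS port (the certificate paths here are placeholders, not anyone's actual setup):

    package main

    import "net/http"

    func main() {
        // Send every HTTPS request to the plain-HTTP version of the same
        // URL with a permanent (301) redirect, so visitors always land on
        // a working page instead of a connection error.
        handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            target := "http://" + r.Host + r.URL.RequestURI()
            http.Redirect(w, r, target, http.StatusMovedPermanently)
        })
        // cert.pem and key.pem are placeholder paths for the site's TLS files.
        http.ListenAndServeTLS(":443", "cert.pem", "key.pem", handler)
    }

In practice most sites would do this in their web server config rather than in application code, but the effect is the same: the HTTPS URL answers with a 301 pointing at the HTTP one.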
"We will make use of whatever technology is available to preserve evidence on cell phones while seeking a warrant, and we will assist our agents in determining when exigent circumstances or another applicable exception to the warrant requirement will permit them to search the phone immediately without a warrant," Canale said. (Canale is from the DOJ)
There you go. Yeah, fine. I read that as meaning "We're going to find ways to help law enforcement get around this."
So as far as I'm concerned, the DOJ still does what it wants and conspires with others to ignore the constitution.
"permanent secretary of the Information and Communications Technology Ministry"
So exactly how permanent is such a position in a country like this? I mean is it "permanent as long as this coup lasts" permanent? Or is it "permanent even if some other group stages a coup later" permanent?
I think before they shut down all social media entirely (because you know - fomenting unrest during such an important period of unrest is wrong), they should be more clear in the titles they issue to people...
Matt Cutts, head of Google's Search Spam unit, stated on the record over on TWIT.TV the intent behind the way they go after spammers:
"If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is sort of break their spirits. There are lots of Google algorithms specifically designed to frustrate spammers. Some of the things we do is give people a hint their site will drop and then a week or two later, their site actually does drop. So they get a little bit more frustrated. So hopefully, and weíve seen this happen, people step away from the dark side and say, you know what, that was so much pain and anguish and frustration, letís just stay on the high road from now on."
So, based on this statement, my position that it IS punishment is, in my opinion, correct. Punishment is designed to break people of a bad habit.
Okay, as an actual SEO professional (one who only advocates real, sustainable SEO, and not crappy spam tactics), I will add this:
1. It IS punishment. Spam has gotten so far out of hand over the years that previous efforts to discourage it and keep undeserving results out of the organic listings were not getting the message across. Spam just became a massive business.
So to really send the message, Google is now much more SEVERELY penalizing sites that use spam tactics, one of which is crappy link techniques. The notion being that once a site gets a manual penalty for crap links, cleaning up becomes a very daunting task.
Couple that with most of those sites then needing to re-earn rankings (or, really, earn them for the first time in legitimate ways), and more sites are doing all they can to become good netizens.
2. Leaving spam comments up just to spite foolish site owners is NOT helpful to TechDirt. And it doesn't contribute to punishing those site owners because they'll just disavow the links if you leave them up.
In fact, here is where it CAN be a problem for TechDirt: if Google's system detects too many spam comments, this site WILL be penalized.
I doubt there are that many on TD, so it's highly unlikely that this scenario would happen (as compared to sites like Mashable, or others with free-for-all comment spam, which are more likely to see some sort of hit).
3. Charging site owners to remove their links is a possible revenue stream; however, the overwhelming majority of site owners or link-clean-up providers who encounter a fee situation ignore it and just disavow those links.
And for those site owners who come to me for an audit after they've been penalized, that's exactly what I recommend. Along with noting, in their clean-up tracking, which sites attempted to charge for the service. Because that's potentially subject to being viewed as an extortion scheme under some circumstances (though not a TD scenario either).
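(For reference, the disavow file Google accepts is just plain text uploaded through its disavow-links tool: one URL or domain: rule per line, with lines starting with # treated as comments. A hedged sample, with the domains invented for illustration:

    # Sites that demanded a fee to remove links; disavowing instead.
    domain:link-seller.example.com
    http://spam-directory.example.org/links/page1.html

Recording the fee-demanders in comments like that keeps the audit trail in one place.)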
How many more sleazy lowlife lawsuits are you intending to file in your lifetime? I mean, isn't the impending doom from the fallout of your shenanigans with John enough to get you to run for cover at this point? Do you believe you can continue to trash the American legal system for every penny you can squeeze out of unsuspecting American citizens who are tricked by your tactics?
Do you feel no remorse? Do you completely disregard human decency? Do you believe you and John are so bullet-proof against the long-term legal process? Or do you have tickets out of the country sitting on your nightstand awaiting that fateful day in the near future when a warrant will be issued for your arrest?
I ask out of simple human fascination.
And of course, I state here, for the record, that my views are purely my personal opinion.
Your honor, my client informs me that they cannot, unfortunately, produce such records. Apparently all records were turned over to Salt Marsh. [communicated as the smell of burning paper wafts across the offices of Steele, Hansmeier and Duffy...]
It sure as hell can be both. While robots.txt is, by its original nature, related to search engines rather than a means of security, Google has the power and resources to respect it for the sake of security. If you don't grasp that, not my problem.
Because it's an opportunity for Google to help improve the security of private information on the web. Since they already take proactive steps in other areas to improve security online, why not here?
For example - they proactively block sites their system detects as having malware or viruses. They don't have to. It's the responsibility of site owners to ensure their sites don't have malware or viruses baked in. Yet Google has chosen to help.
No, Google is NOT saying that. Yet their system is more than capable of keeping URLs that are listed in the robots file out of its index, so there's no excuse for them, as a supposed security advocate, not to honor robots.txt instructions. "Disallow" is pretty clear in its definition.
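For anyone who hasn't seen one, a minimal robots.txt makes the point (the paths are invented for illustration):

    User-agent: *
    Disallow: /private/
    Disallow: /admin/

That's the entire format: a couple of lines of plain text per rule. Any crawler can parse it, so honoring it is purely a policy choice.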