Google Fighting Back Against Optimized Pages

from the interesting... dept

I’ve been hearing more and more people say that Google’s algorithm is “broken”. People are finding that the search engine optimization people have figured out ways to game the PageRank system to push their pages to the top of Google’s results, crowding out more relevant pages. Mostly this is true of commercial sites. I ran into this last week when I tried to find the phone number for a particular hotel, and, instead, Google gave me listing after listing trying to help me book hotel rooms. Of course, Google is not going to stand still, and it appears that their latest adjustment is targeted directly at Google optimizers. The article is not particularly clear (though maybe it makes sense to the search engine optimizers it’s aimed at), but the summary appears to be that Google has added some sort of filter: if the filter decides a site is too optimized, it excludes the page from its results. In other words, it’s trying to guess which sites are trying too hard to move up the Google ranks and punishing them for it. That’s a pretty tight line to walk, and the writer of the article (a search engine optimizer) seems to think that there will be ways to beat it, and that Google will eventually have to ditch the plan. It does seem pretty difficult to figure out which sites look “too good” to Google without blocking out plenty of legitimate sites as well.
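To make the idea concrete, here is a minimal sketch of what an “over-optimization” filter of the kind the article describes might look like. Google’s actual signals and thresholds are not public; the keyword-density and anchor-text heuristics and the cutoff values below are purely illustrative assumptions, not Google’s algorithm.

```python
# Hypothetical over-optimization filter: flag pages whose on-page and
# link signals look "too perfect". All names and thresholds are made up
# for illustration.

def keyword_density(page_text: str, keyword: str) -> float:
    """Fraction of the page's words that are the target keyword."""
    words = page_text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def looks_over_optimized(page_text: str, keyword: str,
                         exact_match_anchors: int, total_inbound_links: int,
                         density_limit: float = 0.08,
                         anchor_limit: float = 0.9) -> bool:
    """Return True when the page trips either of the (made-up) thresholds.

    Tuning these limits without catching legitimate sites is exactly the
    tight line the article says Google has to walk.
    """
    density = keyword_density(page_text, keyword)
    anchor_ratio = (exact_match_anchors / total_inbound_links
                    if total_inbound_links else 0.0)
    return density > density_limit or anchor_ratio > anchor_limit

# Example: a page that repeats "hotel" in 12% of its words, or gets 95% of
# its inbound links with the exact anchor text "cheap hotel rooms", would
# be excluded from results under this sketch.
```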



Comments on “Google Fighting Back Against Optimized Pages”

2 Comments
Gluefreak says:

No Subject Given

Another thing Google needs to filter for is Amazon Web Services. I published a book two years ago and when you search for it, you get tons of results for people’s dynamically generated AWS sites. (Not that I object to the book appearing on those sites, since presumably more exposure = more sales.) But certainly those sites should not be ranking higher than legitimate book reviews, blogs in which the book is mentioned, and so forth. You’d think it would be a relatively simple thing to score AWS pages in relation to other pages, especially since they all have the exact same Amazon-provided content (which is just a mirror of what’s on amazon.com anyway).
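The scoring the commenter suggests could, in principle, be as simple as a duplicate-content check: compare a page’s text to the canonical Amazon product page and demote near-copies. The shingling/Jaccard approach and the threshold below are illustrative assumptions, not anything Google or Amazon has documented.

```python
# Hypothetical near-duplicate detector for pages that mirror Amazon's
# product text verbatim. Approach and threshold are assumptions.

def shingles(text: str, k: int = 5) -> set:
    """Set of k-word shingles (overlapping word windows) from the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def is_near_duplicate(page_text: str, canonical_text: str,
                      threshold: float = 0.8) -> bool:
    """True when the page's shingles mostly overlap the canonical page's."""
    a, b = shingles(page_text), shingles(canonical_text)
    if not a or not b:
        return False
    jaccard = len(a & b) / len(a | b)
    return jaccard >= threshold
```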
