I have some thoughts in response to Douglas Bowman’s “The Cost of PageRank” article. In short, it appears Google is becoming too powerful, and some sites are suffering for it. However, Bowman seems to have taken it upon himself to devalue the rank of his own pages. Is this right? Should we censor ourselves for fear of being found for the wrong reasons? Or for fear of being associated with the wrong titles?
The brilliant John Gruber wrote ‘Writing for Google’ earlier this month, in which he discusses tactics for getting a solution into the biggest troubleshooting database there is. However, as Bowman discusses, this tactic is backfiring. Bloggers post something about a subject and inadvertently end up very high in Google’s results for that query. That would normally be a good thing, but sometimes the author receives comment after comment requesting help with the subject, even though the author has nothing to do with it. As Bowman says,
Matt has a theory why so many people blindly believe they’ve found the right place to leave comments, or get in touch with some person or entity. And I agree with him. In fact, I’d take his theory a step further. To many people, the Google search box is synonymous with the browser’s location field. It’s the same thing to them. With the Google Toolbar installed in IE, (or because it pre-exists in other browsers) the search field is almost indistinguishable from the browser’s location field. “I type something into that field, and Google takes me where I want to go.”
So there’s the problem: comment spam originating from a misunderstanding of the weblog concept. Bowman then goes on to say that a writer can’t possibly control how their words will be found,
But a single writer can’t always predict (or be responsible for) existing uses of every noun, verb, adjective, and adverb combination. If someone kept a popular personal blog, and wanted to write about the victories of a family member previously afflicted with a terminal illness, does this mean using the word “Survivor” in the title of a post needs to be avoided, lest that someone be inundated with insensitive folk trying to contact Rupert or Jeff Probst? Or worse yet, begging for the opportunity to be the next Survivor?
I totally agree with this. It’s no fault of the author’s that Google ranks their pages over more useful ones. I feel no obligation to censor myself based on one web tool, regardless of its power. But if we feel a need for finger-pointing, shouldn’t we blame Google? Shouldn’t we complain that it’s PageRank’s fault those pages appear relevant to the hapless searcher?
But then we can’t really blame Google either. A computerised algorithm will always love cleaner source copy, and thanks to the wonders of semantic markup, that’s what gets indexed. Google gives blogs a higher weighting due to their cleaner code (and their frequent updates). So, as a result, we get this problem.
Now to refer back to the title of this entry: is Google overpowered? Admittedly, this problem would be smaller if the search engine market were more evenly spread. Different algorithms would be indexing our pages, and we could worry less about the results of any single one. But then again, no one can be blamed for the popularity of Google, only congratulated. It’s a triumph of supply and demand, meeting precisely the needs of the market. Let’s not nitpick the side-effects.