Site owners who want to earn any substantial revenue from ad traffic have to play by Google's rules. Go black hat, hide text or otherwise try to game the system and you'll get banned or penalized--and watch the money stream dry up. But for many publishers, the problem is not that they're deliberately running afoul of Google; it's that the rules aren't perfectly clear--and they're sometimes even arbitrary.
Take Techmeme, the tech industry's blog aggregator, for example. The site is relatively authoritative, doesn't spam readers and explicitly marks its sponsored posts as ... well, sponsored posts. But Andy Beard noticed that Techmeme's PageRank stands at a relatively low PR 4, a three-point drop from the PR 7 it held just a few months ago.
Digging deeper, Beard finds that Gabe, Techmeme's webmaster and founder, uses 301 redirects, as well as the robots.txt file, to keep those sponsored posts from passing PageRank (or authoritative "juice")--but unfortunately there's no way to know whether these methods actually work for that purpose. Using the robots.txt file to stop Google's crawler from even visiting those pages seems like common SEO sense, but the result is ambiguous at best--or, in Techmeme's case, clearly not good enough.
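For context, the robots.txt side of that approach usually amounts to a directive like the sketch below; the path shown is hypothetical, not Techmeme's actual URL structure. The catch is that Disallow only asks well-behaved crawlers not to fetch those pages--it doesn't dictate how Google treats links pointing at them, which is exactly the ambiguity Beard describes.

    # Hypothetical robots.txt sketch -- not Techmeme's actual file or paths
    User-agent: *
    Disallow: /sponsored/   # keep crawlers out of the sponsored-post section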
Beard, like many other SEO pros, argues that this ambiguity is what makes
Google seem like a bully when it comes to publishers trying to monetize their sites. Gabe agrees. "Yes, we all know Google is free to set the rules for its search engine," he says. "But I'd like to
submit this: Google wants me to learn these arcane rules, apply them and then prostrate myself for 'reconsideration.' Is this a winning long term strategy for Google?"
Read the whole story at Andy Beard »