Think You’re Successfully Flying Under Google’s Radar? Think Again.

January 24th, 2008


Originally published in Search Engine Land

Have you been trying to “fly under the radar,” engaging in activities outside Google’s guidelines but doing so subtly enough not to get caught? More and more SEOs are moving into this dangerous territory as the guidelines continue to broaden (prime examples being the expanded definition of doorway pages and the addition of link buying to the list of no-nos). Buying links in “stealth” mode still works, as many SEOs will attest. But what if Google is archiving your efforts for future review, to uncover what it can’t detect right now due to current limitations? Do you really want to be profiled retroactively as a spammer?

“You made your bed, now sleep in it.” You may have heard that admonition from your parents at some point. As children, we learned about consequences. Sometimes those consequences were immediate. Those were the easy ones from which to learn. Sometimes, however, the repercussions were slow to manifest, but severe and long-lasting. This can be especially hard to swallow if we don’t see it coming.

Perhaps you aren’t “squeaky clean” in terms of your SEO practices. And perhaps you believe this to be a sound financial decision—at least for the time being. But would you still use these gray-hat or black-hat tactics if you knew you’d get penalized by Google somewhere down the line—even if you had ceased using them once they became too dangerous? Cumulative penalties could potentially be instituted years after the initial infraction. Seem outlandish? I don’t think so. As the price of storage plummets and as computing power continues to double every eighteen months thanks to Moore’s Law, the likelihood of such a scenario increases. If the Internet Archive can archive the Web, so can Google. With such a repository of historical web data (including your own websites) at its fingertips, Google could return to that data later—with new and improved guideline infringement detection algorithms—scanning for anomalies, patterns, and trends in your SEO behavior that demonstrate a proclivity for rule-breaking. And even if Google can’t yet archive the Web to the degree of the Internet Archive, it can certainly store various scores and checksums, which it can mine and further analyze at a later date, allowing Google to effectively reach back in time to smack you for recurring bad behaviors.
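To make the checksum idea concrete: an archiver need not keep full copies of every page it has ever crawled; a compact fingerprint per crawl is enough to prove, much later, that a page changed between two dates. Here is a minimal sketch in Python—all names and data are hypothetical, and this is an illustration of the general technique, not a description of anything Google actually runs:

```python
import hashlib
from datetime import date

# Hypothetical archive: {url: [(crawl_date, checksum), ...]}
# Storing a 32-byte digest per crawl is vastly cheaper than storing the page.
archive = {}

def record_crawl(url, html, crawl_date):
    """Store only a compact fingerprint of the page, not the page itself."""
    checksum = hashlib.sha256(html.encode("utf-8")).hexdigest()
    archive.setdefault(url, []).append((crawl_date, checksum))

def changed_between(url, d1, d2):
    """Did the archived page content differ between two crawl dates?"""
    sums = dict(archive.get(url, []))
    return sums.get(d1) != sums.get(d2)

# A page that quietly picked up paid links between two crawls:
record_crawl("http://example.com/", "<html>clean page</html>", date(2007, 1, 1))
record_crawl("http://example.com/", "<html>page with paid links</html>", date(2007, 6, 1))
print(changed_between("http://example.com/", date(2007, 1, 1), date(2007, 6, 1)))  # True
```

The point of the sketch is the asymmetry it illustrates: fingerprints are cheap to store indefinitely, so the decision about *what* to look for can be deferred until better detection algorithms exist—exactly the retroactive scenario described above.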

For example, suspicious-looking keyword-rich backlinks combined with a prior record of repeated on-page offending could help implicate you as a link buyer—as it demonstrates intent to manipulate rankings and thus effectively removes the “But I was Google-bowled by my competitor!” argument as the probable explanation.

The Wikiscanner should serve as a lesson in how clever manipulations done in the past can eventually bite you in the backside. Through some straightforward forensics, this tool makes it easy to connect an anonymous Wikipedia editor to his/her employer, and it can be done years after the fact. This list of salacious edits offers some telling examples. Suddenly, what these employees thought were “anonymous” edits to help their employer are laid bare, potentially damaging their company’s credibility or reputation in the process. Ironically, using Wikiscanner, instances of Wikipedia vandalism have even been traced to within Google (such as in this example).

I think it’s inevitable that a historical webspam forensics tool will emerge to root out naughty SEOs. Whether that tool will be public, like Wikiscanner, or available only within Google’s VPN (and on Matt Cutts’ laptop during PubCon and SMX site clinics) is anybody’s guess.

In years to come, it may be that the “pearly white hats”—the “goody two shoes” of the SEO world who have always played by Google’s rules—are the only ones left standing, because they protected their pristine reputations by keeping a zero historical spam footprint over the years.