Wiep is a cool blogger. About a year ago he experimented on his SEO blog with something I was intrigued by. He linked to a Matt Cutts blog post that was playing with the spammy phrase buy cheap viagra online to make a point. The post wasn’t exactly optimizing for this term – Matt was using it as a sample, and other commenters were having fun with it. In the end, the phrase was repeated several times. In Wiep’s post from a year ago, Trust + keywords + link = Good ranking (or: How Matt Cutts got ranked for “Buy Cheap Viagra”), he noticed this and decided to link to the page in his blogroll with that exact phrase as his anchor text.
Damned if Matt Cutts didn’t rank third for buy cheap viagra online. Briefly.
The basics of SEO sort of explain this. Authoritative sites, with trust, reputation, etc., and PR from external sources gave Matt this ranking. But where’s the relevancy? What about those other 200+ factors that we webmasters/SEOs don’t know about? They don’t seem to be in play here, unless there’s something about Matt Cutts and Viagra that we don’t know about either.
Maybe this is isolated to a small percentage of fringe cases, but with all the webspam still out there (even though it has gotten much better, in my opinion), you’d think Google would have a catch for this sort of thing. Something’s clearly not working. In the original experiment, one link pushed this ranking to #3. Now, with at least one current link still pointing to this page, is that really enough PR to rank for the term? Does this mean external links might just have too much power?
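To see why one link from a trusted page can carry so much weight, here’s a toy power-iteration sketch of PageRank-style scoring. This is a simplified illustration, not Google’s actual algorithm – the page names, damping value, and graph are all hypothetical – but it shows how a single link from a well-linked authority can lift an otherwise unlinked target above pages with no in-links at all, much like Wiep’s lone blogroll link did.

```python
# Toy PageRank power iteration. Purely illustrative: the damping factor,
# iteration count, and example graph are assumptions, not Google's setup.

DAMPING = 0.85

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page gets a small baseline, then receives shares from linkers.
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = DAMPING * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank

# Twenty small pages link to one authority; the authority links exactly
# once to a target with no other in-links (the single-blogroll-link case).
graph = {f"small{i}": ["authority"] for i in range(20)}
graph["authority"] = ["target"]

ranks = pagerank(graph)
# The lone link lifts the target well above any of the small pages.
print(ranks["target"] > ranks["small0"])
```

The point of the sketch: the target page did nothing to earn relevance, yet it inherits most of the authority page’s score through one link – which is roughly the dynamic the Matt Cutts experiment exposed.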
I remember rumors of PageRank being devalued even more. I forget where I heard it – it was months ago. I thought it was a good idea, and this experiment reinforces it. Trim back the external, and maybe turn up the internal. Here’s why I think this:
1. Link spam, which is now out of hand alongside all the other spam, could drop. So could bad link bait, comment spam, and pay-per-posts. Ugh. This won’t go away on its own as social media gets bigger – the reverse will happen.
2. Engines would be forced to retune their overall algorithms, instead of putting thumbs in the dike (hey, Microsoft – this isn’t working for Windows, either, by the way…). I think a retuning could lead to a whole new property value. I think the web is just about done with Google 1.0, and demanding Google 2.0. It’s going to happen, so let’s get to it.
3. Engines could get more semantic. If they’re ever going to start serving human language outside of the box, they need to start reading human language.
4. Speaking of human – the human element may come into play even more than before (c’mon Google, you can afford it… I’m not talking Mahalo, but kick up the hand-work a few more notches, if only to catch these kinds of algorithmic slip-ups).
5. Not all webmasters are SEOs (most of them aren’t) – they’re trying to create sites aimed at users, even if they’re not exceptionally good at it. So give extra attention to internal linking in order to surface the pages that webmasters themselves think are good. Aside from splogs, trash affiliate sites, etc., most spam tactics live outside of internal linking. Even if spammers start working internally, Google’s algorithms should still be able to discount thin value.
Ex-big agency guy, now focused on helping small and medium sized businesses. I've been practicing SEO since 1998. I started the SEO practice at a major digital agency owned by eBay and helped develop SEO products for one of the largest ecommerce platforms. I'm a proud member of the Philadelphia SEO scene. I'm passionate about search, writing, UX, CRO, and psychology in marketing.