In any case, the spam you’re seeing may not be doing what you think.
So what prompted my colleague’s email? He was calling out a site called Moosejaw (I really don’t want to give them a link… just add a ‘.com’, and you’ll get there). This ecommerce site seems pretty legit at first glance: decent products, good-enough design, and a sense that your order is secure. Scroll to the bottom, though, and you’ll see obvious SEO influence in the form of huge blocks of keywords and anchor text.
The site has healthy backlinks, a good internal structure, and other legit factors that I believe offset any negative judgement this anchor-text stuffing might have caused, even though it is an obvious SEO effort. On seeing this site, my first thought was, “eh… fine, I’m sure it’s only ranking for some really long-tail, obscure queries”. Nope – it ranks pretty well for some popular product queries, too. It looks like a quality retailer, but Google’s whole stance is that they don’t want to serve sites that are obviously manipulating the algos. If this site were adding user value instead of search engine value, I’d be all for its high rankings. But really, what customer on the planet is going to spend even a millisecond on that bottom text? Are they going to follow any one of the 50 links there? No. Hell no.
Google has cleaned up the index a lot since I was posting to my last blog. I remember writing a post calling out lots of front-page results for simple queries, like “photoshop tricks”, “mixed drink recipes”, and “video game cheats”. There was a lot of affiliate trash and ‘made-for-AdSense’ sites. Recently there was an illness in my family, and I found my results pretty fulfilling, unlike previous years when I just got frustrated. The webspam still shows up, but most often for more obscure, longer-tailed keywords. I still believe the engines know when a page has been excessively SEO’d, and may flag certain heavy-handed techniques (weighing those flags against the determined value of the page makes it hard to figure out exactly which heavy-handed actions were too much). In the past, when I’ve tried breaking semantic markup by putting header tags around keywords on my smaller sites, or getting crazy with nofollows, I didn’t see a gain – in fact, I think I lost a step or two. When Overstock.com does it (they even put an <h2> on their breadcrumb trail), it clearly doesn’t hurt their rankings.
I’ve had discussions with other colleagues about keyword density. The SEO community generally agrees that it doesn’t exist as a ranking factor, at least not at some absolute level that applies across all pages. One conversation I remember: a colleague Googled a keyword, and the snippet showed a handful of those terms bolded. Clearly, the snippet was just showing keyword relevance to the user – in fact, all the SERP results had bolded keywords – but that doesn’t mean the bolded words had any effect on the ranking. Again, it seems obvious, but based on what SEOs have learned, it just doesn’t appear to be the case. In reviewing some of these served pages, as suggested by the snippets, there was definitely some keyword stuffing in these authoritative pages. Was it enough to punish the page? Apparently not – they ranked great. Did it help them? Probably not – the rankings were more likely due to everything else the site was doing right, plus the page authority. So it looks like Google ignored the stuffing and judged the page as valuable overall. All part of the spam score, I imagine.
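For anyone unfamiliar with the metric being dismissed here: keyword density is just occurrences of a term divided by the total word count of the page. Here’s a minimal sketch of that calculation (the function name and sample text are mine, purely for illustration – nobody’s claiming any engine computes it this way):

```python
import re
from collections import Counter

def keyword_density(text, keyword):
    """Naive keyword density: occurrences of `keyword` / total words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

# "cheap" appears 3 times out of 9 words -> 0.33
page = "cheap hiking boots and more cheap boots for cheap"
print(round(keyword_density(page, "cheap"), 2))
```

The point of the paragraph above is that hitting some “ideal” value of this number doesn’t appear to help or hurt on its own – it’s swamped by everything else the page and site are doing.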
Is this what is happening with Moosejaw? Are they flagged for the heavy-handed links? Yeah, possibly. But maybe the overall value of the site offsets that spam. I’m more in agreement with that theory. Ultimately, I don’t think Google loves spam – I just think you get away with spam a little more if you’re doing everything else right. I believe that’s what’s giving Moosejaw its rankings, even though it’s easy to call out the obvious spam techniques.