Note: The title is not How To Trick Google. I am not a spammer – not in the slightest. It’s just not the side of the fence I reside on. But, as someone who breathes SEO, I do get curious about understanding blackhat techniques from time to time.
With all the technology that makes up webpages, and incredibly smart techies working as SEOs, it’s interesting to see what clever things SEOs still come up with. Obviously Google engineers eventually learn all these new tactics, but are they really able to defend against them? They provide guidelines on their Google site, but these guidelines are usually written loosely. They often raise more questions than they answer. And per the tactic above, I’m pretty sure that’s why: by keeping the guidelines loose, Google can take a stance against a tactic without addressing (or even knowing about) it.
As a whitehat SEO, I talk about link building tactics that “are against Google’s guidelines,” or CSS tricks that “are against Google’s guidelines.” Not because I think that Google is definitely able to catch them automatically, but because there’s a possibility. There are humans behind Google’s rankings – they might hear about it. A competitor might report you. Google’s toolbar, installed in one of your visitors’ browsers, may report back a different experience than the Google spiders do.
Even though I fall back on the loose guidelines myself, it does sound like a big if. If Google wants to thwart spammers, maybe it’s time to get clearer. Spend the time specifying the guidelines. Is it fear that specific guidelines will act as blueprints for spam techniques? Maybe – but it also might deter SEOs from walking in the gray.
It has begun.
If you didn’t hear, Bing and Yahoo have merged to a degree. Bing search will begin powering Yahoo.com’s search function. This merge also includes paid search (which is the real monetary motivator for this merger). The transition timelines are now out there.
Apparently it should be done between August and September.
Read more at Search Engine Roundtable.
Google recently announced that loading speed is a signal to improved rankings. They even gave us a free tool to gauge it.
It’s always been assumed, but with this announcement it was made truly official. Did we really need Google to tell us? Couldn’t we figure out that Google wants to serve fast loading sites? Of course we could. Just about anything that is good for a user is good for Google.
But before you go running tests in a panic, think about your site. Have you been good to your visitors? Have you been sensible with redirects? Have you been mindful of bloated code and huge files (image, Flash, and otherwise)? Do you have a good webhost? If all of this is true, you’re probably fine. Sure, run the test, but I wouldn’t panic and launch fix-it projects prematurely.
Another note – above I recommended watching your page weight. I like a good site validated to standards, but I’m not that kind of SEO anymore. I’m quite sure it’s OK to be a little noisy in code. Semantic markup is great, but don’t sweat a little bloat. Again, run the speed test. See if Google thinks you have a problem. It probably won’t be from tag soup.
The idea for this post came from a thread I just read on Webmaster World. A webmaster there talked about recoding his site to err on the side of caution. Many agreed without running the test, simply because they got swept up in Google’s announcement rather than reality. I thought that was a useless waste of time without some sort of real reason. The bottom line – if your page is chunking while it’s loading, on several fast connections, you have a problem. But I’m sure you knew that.
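If you want a quick sanity check before reaching for Google’s tool, a rough response-time probe is easy to script. This is a minimal sketch (the URL and function names are placeholders of my own, and it measures only server response plus transfer – not rendering, so it’s no substitute for a real page-speed audit):

```javascript
// Rough page-fetch timing -- a sanity check, not a substitute for
// Google's Page Speed tool (no render time, no asset waterfall).
async function pageLoadSeconds(url) {
  const start = Date.now();
  const resp = await fetch(url);   // Node 18+ has fetch built in
  await resp.text();               // include body transfer in the timing
  return (Date.now() - start) / 1000;
}

// Average a few runs to smooth out network jitter.
async function averageLoadSeconds(url, runs = 3) {
  let total = 0;
  for (let i = 0; i < runs; i++) total += await pageLoadSeconds(url);
  return total / runs;
}
```

Run it against your own pages from a couple of fast connections; if the numbers are consistently high, that’s the kind of chunking the thread was worried about.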
I just got back from SES New York. It was my first one. I’ve been in this game for 10 years, and always wanted to experience SES. Cost and timing always got in the way. I figured all the conference goodness was getting live blogged out anyway. But I was wrong. With more than 80 sessions, I hit about 12 of them. There was incredible value in the side conversations. It’s a social experience.
It was really interesting to be with rooms full of peers, all with different levels of experience and philosophies. No matter how confident (or cocky) you are, it still pulls you out of your comfort zone. You start questioning everything. You’re exhausted by the end.
One topic that routinely came up was the value of search for business. It’s an old topic, but it finds new life each year. Is PPC or SEO better for your ROI? That’s such an executive-level question, but to the chagrin of the asker, it doesn’t get a straightforward answer. It can’t. The answer is “it depends.”
I did attend one SEO-only session where PPC came up. The feeling among the panelists was that since everyone expects PPC to be only an ROI channel, you should compensate with SEO. Compensate? As in, clean up with SEO? It was a bit of a knock against the PPC definition of today.
Wait… It’s all coming back to me.
When PPC first came out, it was a way to help searchers, who were looking for your offerings, find your offerings. Period. Traditional marketing. PPC offered visibility into your efforts unlike any form before it. Television couldn’t tell you exactly who watched or acted on your ad. Billboards couldn’t do it. Magazines, bus wraps – everything – failed at providing hard data. So naturally, this amazing technology wowed us. A little later, Google put out conversion tracking code. Now we could tell how much money we made off an ad. We could report return on ad spend. A version of interactive ROI became more and more of the focus. Soon, in some circles, it wasn’t about connecting with users first. It was about buying revenue. The C-level was ecstatic. Marketers who got good at PPC looked like rock stars and happily spent ad budgets. Life got easy.
Avinash Kaushik gave a keynote and suggested we think not just about the micro side but about the macro side. What happened to marketing in search? What happened to the value of creating return visits, lifetime value, and brand awareness? Demanding ROI alone means a lot of bidding on brand terms, a massive percentage of which would be picked up by SEO anyway (since you almost always rank for your own terms).

I know there are exceptions, and you need to compete against competitors who are going after your brand terms. But if you’re a shop selling dog gift items, and you sell dog sweaters, bid on the terms even if they don’t convert. Let people know you exist for when they want to buy dog collars or leashes. Get on their radar. That’s worth the cost of a click. You used to think your customers were worth more when you were blindly paying for billboards on the highway. Today, with Google being the most visited website in the world, I would put my money in search first and foremost. Google owns your brand. Not you. Not your customer. Customers find you first through Google 9 times out of 10. If Google is your portal, advertise on it. Don’t be cheap.
I think the SEOs on this panel saw the miss on PPC and suggested SEO to the rescue. If you can’t (or more likely, won’t) invest in PPC for the non-ROI value of it, then maybe you’ll have some luck with the cheaper alternative – SEO. Branding, reputation management, controlling search engine real estate, and marketing can all be done with SEO. Though SEO is not an ROI channel (no dollar spent can guarantee anything the way it can in PPC), it is certainly valuable as an avenue to reach larger streams of qualified visits.
When I do PPC for my clients, I explain this value. Sometimes they get it, sometimes they don’t. But at least I tried. Technology spoiled us. We lost the real value of marketing and advertising. The 10:1 ROI focus is not the solution to online success. If you think it is, it’s time to look at the big macro picture.
When Corey Haim died a few days ago, TMZ put up a page with a pretty spammy URL:
Dead, died, and death? Corey Feldman? Really???
Maybe it’s an overzealous SEO or overzealous logic in the CMS, but this goes pretty much straight against Google’s guidelines (if it were in a title tag, I think it would raise more of a flag). But my question is, will Google penalize TMZ? They’re a pretty big property. And yes, Google has penalized – even banned – big properties before, but usually for shadier practices. Does Google’s algorithm have a way of forgiving this spam because of TMZ’s popularity? Personally, I think so. But what kind of message does that send? If TMZ gets away with it, other webmasters will try it too.
Unfair for those who follow the rules.
On his blog, Matt Cutts introduced a form to report websites/webmasters who are spamming with links. Link spam (according to Google) is pretty much anything from paid links to comment spam (where people or programs submit hundreds of random comments to blogs). The link is http://goo.gl/linkspam.
“Be sure to include the word “linkspam” (all one word, all lower-case) in the textarea (the last field in the form).”
I’m all for keeping the web clean. If Google doesn’t want link spam tripping up their algorithm, I actually think they have the right to ask for help. It’s their engine – they can do what they want. But the beauty has always been that Google was supposed to be good at figuring out link spam on its own. I mean, I never worried about link farms linking to my sites. It happens! So now should I worry that Google will think it’s link spam from my site? If there’s a lot of it, will they react differently than they used to (which was to just cut off the link juice flow)?
I guess if you have a competitor who is already killing you in the rankings, why not buy an automatic blog commenter and link spam the hell out of the blogosphere? Point the links to your competitor. Then call Google and call the competitor out as spammers. Not that I recommend it – but it’s something I’m sure is being tried.
And they did. The tests in the footer of this site showed it. Not only could Google index my pages that had the anchor text in them, but they could also index the thin destination pages. And they did so within 3 hours! Hey, they did say they’re obsessed with speed this year.
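For context, a JS-generated link test along these lines might look like the following. This is a hypothetical sketch (the page name, element id, and anchor text are placeholders of my own, not the actual test code from this site’s footer):

```javascript
// Hypothetical sketch of a JS-generated footer link.
// A crawler only discovers the destination page if it executes this script.
function buildFooterLink(href, anchorText) {
  return '<a href="' + href + '">' + anchorText + '</a>';
}

const testLink = buildFooterLink('/js-only-page.html', 'unique-test-anchor');

// In the live page this markup would be injected after load, e.g.:
//   document.getElementById('footer').innerHTML += testLink;
```

The point of the test: the destination page is linked nowhere else, so if it shows up in the index, the engine must have executed the JavaScript.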
I also tried:
Finally I tried:
Again, success! I’m pretty satisfied with Google here. Bing and Yahoo? Not so much – they were only able to index the page with the anchor text. And even then, not every time. But they never claimed to be able to (that I’m aware of).
Now this isn’t surprising to some groups of SEOs, but it is interesting how often I still hear old SEO recommendations presented as critical today. Granted, this test isn’t exhaustive (the actual PageRank passed through JS links wasn’t tested – just crawlability), but it’s valid. I think some SEOs really need to catch up to Google, and start implementing what really matters – user value, context, authority, recommendation, and community. Whatever you want to call it (SEO 2.0 or not), the wave is starting to build right now – get in front of it, and downshift on the old-school SEO tactics.