How Do You Trick Google?

Follow me on Twitter (@billsebald)


Note: The title is not How To Trick Google. I am not a spammer – not in the slightest. It's just not the side of the fence I reside on. But as someone who breathes SEO, I do get curious about understanding blackhat techniques from time to time.

I followed a thread on Webmaster World (via SERoundtable), where an SEO was requesting a direct answer about a tactic he uses. His tactic uses JavaScript to hide affiliate links from Google. His external .js file inserts the links and content into the webpage just after it loads. But he blocks the external file with robots.txt, so Google never sees the final page with the JavaScript-inserted content.
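To make the mechanics concrete, here is a minimal sketch of how a tactic like that would work. This is my own illustration, not the thread poster's actual code; the file name, affiliate URL, and function name are all hypothetical, and it's shown only to understand the technique, not to endorse it.

```javascript
// affiliate-links.js (hypothetical file name) -- the SEO blocks this very
// file in robots.txt, e.g.:
//
//   User-agent: *
//   Disallow: /affiliate-links.js
//
// A crawler that obeys robots.txt never fetches the script, so it only
// indexes the pre-injection HTML. Visitors' browsers, which ignore
// robots.txt, run the script and see the affiliate links.

function injectAffiliateLink(doc) {
  // Build the affiliate link (hypothetical URL) and append it to the page.
  var link = doc.createElement('a');
  link.href = 'https://example.com/product?aff=12345';
  link.textContent = 'Recommended product';
  doc.body.appendChild(link);
  return link;
}

// In a real page this runs once the HTML has finished loading:
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', function () {
    injectAffiliateLink(document);
  });
}
```

The whole trick rests on the gap between what the crawler is allowed to fetch and what the browser actually executes: robots.txt governs the former but has no effect on the latter.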

With all the technology that makes up webpages, and incredibly smart techies working as SEOs, it's interesting to see what clever things SEOs still come up with. Obviously Google engineers eventually learn all these new tactics, but are they really able to defend against them? They provide guidelines on their Google site, but these guidelines are usually written loosely. They often raise more questions than they answer. And I suspect that's deliberate – written loosely, the guidelines let Google take a stance against a tactic like the one above without ever addressing (or even knowing about) it specifically.

As a whitehat SEO, I talk about link building tactics that "are against Google's guidelines," or CSS tricks that "are against Google's guidelines." Not because I think Google is definitely able to catch them automatically, but because there's a possibility. There are humans behind Google's rankings – they might hear about it. A competitor might report you. Google's toolbar, sitting in one of your visitors' browsers, may report back a different experience than the Google spiders do.

Even though I defer to the loose guidelines, it does sound like a big if. If Google wants to thwart spammers, maybe it's time to get clearer. Spend the time specifying the guidelines. Is it fear that specific guidelines will act as blueprints for spam techniques? Maybe – but specificity might also keep SEOs from walking in the gray.
