Relatively recently, Google started to incorporate ‘memory’ into their AdWords system. A user who first searched for “hockey”, then for “sticks”, would ultimately be served ads for hockey sticks. Not a bad idea, since about 80% of all unsuccessful searches result in keyword refinement (i.e., going back and trying a different query). Since the first and second searches are probably related, this looks good for publishers and gives the keywords more play.
In April, at SMX Sydney, Marissa Mayer (VP of user experience – don’t know who she is? Look her up. She’s good people!) relayed that this memory is coming to natural search as well. It was even given a name – “Previous Query”. Will this take the form of personalized search when you’re logged in?
Sounds good. I’m all for a better search experience. Google is the bandleader, but other technologies and verticals are closing in. If Google can make their ‘best-of-breed’ general engine better, they had better do it fast.
I have a catch-phrase. It’s not really sexy, but I say it a lot: “SEO has a different definition depending on where you stand”. This harsh reality makes SEO really tough to sell to clients, both big and small. Not only does an SEO need to figure out the definition that works for the client; we also have to untangle the misconceptions and expectations the client already holds. If you can connect with your client on a more informal level, you’re going to have a much better chance at shaping the engagement.

Still, very few clients across the board seem to fully accept that SEO is not a particular forest, but the entire wilderness. Yes, it is a marketing channel, but it certainly doesn’t live in a box. Understandably, this truth is not what many clients want to hear; they do not want to be told that the best SEO campaigns take time and are differentiated, or that there are plenty of hits and misses on the way to pure optimization. Many clients want quick hits. They’ve seen occasional organic traffic streams draw thousands of visits and dollars, and they just want it again… though not by any means necessary. This is direct internet marketing, right? It works for paid search, right? It would obviously work the same for SEO, right?
This puts a marketing agency in a tough position. They want the business, so many agencies unfortunately feel forced to carve a measurable, generic, safe “acre” out of the wilderness and shape the client’s expectations to fit it. They call this “SEO” and commit the client. The client never really knows that all they got was a decade-old SEO marketing model. Because the expected lift is scaled down from the start, the relationship ends on a positive note, and the client is none the wiser about how much SEO opportunity has sailed by in the interim – enough missed opportunity to retire on.
One of the greatest (mostly) unknown abilities of robots.txt is wildcard pattern matching. We all know robots.txt can block files and directories from being crawled, but for URLs with unique parameters and the duplicate content issues they cause, did you know that Google and Yahoo respect wildcards? (This was verified by connections at the engines – though MSN said they do not respect pattern matching “at this time”.)
If you have URLs with unique parameters – for example, UTM tags from Google Analytics, paid search tags, and so on – you can create a robots.txt entry like this:
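(An illustrative sketch – substitute your own parameter names; “sid” below is a hypothetical paid search tag, while “utm_” is the prefix Google Analytics campaign tags actually share.)

User-agent: *
# Block any URL containing a Google Analytics campaign tag (all begin with "utm_")
Disallow: /*utm_
# Block a hypothetical paid search tracking parameter
Disallow: /*sid=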
How cool is that? Remember, this should only be employed if your parameters are truly unique. If your parameters are keyworded and that keyword also appears in directory or page names, those will get blocked too… quite possibly to your dismay.
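To make the pitfall concrete, here is a hypothetical over-broad pattern:

# Intended to block tracking URLs like /page.html?promo=spring...
Disallow: /*promo
# ...but it also matches legitimate pages such as /promotions/ and /promo-codes.html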
More from Google’s Webmaster Blog.