So I’m not sure if this made the SEO rounds (possibly I’m just too busy hiding from my Christmas shopping responsibilities), but it was news to me: Google is experimenting with removing the displayed URL and replacing it with a breadcrumb-trail-style display.
It doesn’t matter whether you’re logged in or not, and it doesn’t only happen when you rank in the top position (like sitelinks). Honestly, it seems pretty random. I haven’t figured out what triggers it, but it does look like a useful variation on the horizontal sitelinks. Is it better than the usual display URL? Maybe, since keyworded URLs still aren’t completely prevalent on the web. Function over form?
Try it – I Googled transformers toys, but it’s happening elsewhere too. All the more reason to have a good site hierarchy… now it’s actually marketing for you in the search engine result pages.
Google let us know this week that the canonical tag now works across domains. I think that’s a fine feature (not sure why it didn’t roll out that way the first time). I personally don’t have any reason to control cross-domain canonical issues, but I can imagine several different applications. Maybe it’s an offering for the spammer who wants to go straight! Nah.
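For anyone who hasn’t used it, the cross-domain version looks just like the regular canonical tag – a link element in the head of the duplicate page pointing at the preferred URL, which can now live on a different domain. A minimal sketch (the domains and path here are hypothetical, just for illustration):

```html
<!-- Placed in the <head> of the syndicated copy on partner-site.example,
     pointing back to the original article on www.example.com.
     Both URLs are made-up examples. -->
<link rel="canonical" href="http://www.example.com/articles/original-article" />
```

The obvious use case is syndicated content: the partner site keeps the page live for its readers, but signals to Google that ranking credit should consolidate on the original.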
But it made me want to look at the canonical tag today. About 4-5 months ago I checked, and didn’t see an effect. I heard online buzz that it wasn’t doing much for anyone at that point. OK – it’s new, I’ll wait.
But now it’s been long enough. I was prompted to check again, but alas, the canonical tag still hasn’t been shown to do anything. It’s been months (February???) since this thing was put out and touted as the end of duplicate content issues, but I haven’t seen any decrease in my indexed pages. In fact, I’m up about 10,000 pages in Yahoo and about the same in Google (in Bing I’m up off the charts, but that’s Bing for you). I just reviewed 2008 notes from a site with more than 30,000 pages; the site’s actual page count hasn’t increased or decreased drastically in a year. This is pretty annoying.
Is it just that the site: operator is that inaccurate? Or is the canonical algorithm run so infrequently that it hasn’t permeated my client’s site yet (unlikely – it’s a hugely popular commerce site, but maybe there are just too many pages to consolidate)? It’s hard to help search engines with duplicate content issues when things aren’t working or reported accurately. It makes me want to recommend hashes (#) in URLs, expensive meta robots implementations, or other nofollow tricks.
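The meta robots route mentioned above is the blunt-instrument alternative: instead of asking the engines to consolidate duplicates, you keep the duplicate URL variants out of the index entirely. A minimal sketch of what that looks like on a duplicate page (the tag itself is standard; where you’d put it depends on your templates, which is why the implementation gets expensive):

```html
<!-- Placed in the <head> of a duplicate URL variant, e.g. a sort-order
     or session-parameter version of a product listing page.
     "noindex, follow" keeps the page out of the index while still
     letting crawlers follow its links and pass value through. -->
<meta name="robots" content="noindex, follow" />
```

Unlike the canonical tag, this is a directive rather than a hint, which is exactly why it’s tempting when the hint doesn’t seem to be doing anything.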
This is cool! Jeff Louella was able to get the real-time search and posted a video of it on YouTube. Apparently it’s only available when you’re logged in, and it’s still rolling out, so not everyone will get it yet (even though I sit less than 5 feet from him in the office… dammit).
Update: I finally got it, but I couldn’t get my tweets to trigger in it. I’m wondering if it has some sort of Klout feature. I also didn’t find it all that useful – it jammed up a lot. Was this thing ready for prime time? Does it make it easier to spam Google, or did Google put the right precautions in place?