According to Hitwise, 81% of searches done on Bing and Yahoo resulted in an actual visit to a website. Google showed only a 65% rate. This suggests either that Bing/Yahoo is more relevant, serving the best results more often for the bulk of users, or that people search differently on Google. I'm assuming the latter.
I think most people who use Google expect to do a little digging. The results you're given often require you to refine your search, and as a Google user, you're used to that. You've come to expect it.
Andy Beal at Marketing Pilgrim says, "Google offers more opportunities right upfront to refine the search by time, type of result, even result locations. Because of this, I'd bet many people take a second or third try at finding exactly what they want before they start clicking through." That makes sense. I also think that even without those options, Google users would be more apt to refine their searches anyway.
I believe Google's results are more detailed in nature and require your queries to be more specific as well. I feel like I get broader, safer results out of Bing. That's what they're going for, per their marketing, but it feels a little "Fisher Price" to me. Not my style. Maybe Bing users are more casual.
Google and Bing have segmented the search audience. Like Democrat and Republican, NFL and MLB, or beer and wine, the two camps are different, and will continue to be shaped by the structure of the engine to some degree. It's interesting, really, just how big a role search engines play, and what we can tell about the people who use them. A search engine isn't just an information retrieval system, but an extension of your brain. Much like a car.
I found a new feature on Google. If you click advanced search on the search page, you have a ‘reading level’ option. In the drop down, you can choose to annotate your search results. When you search, you can now see the reading level Google thinks these (and your) pages are at.
As the father of a six-year-old, this seems interesting. It could potentially help me find pages that he would understand and enjoy. Unfortunately, it doesn't seem to work as I had hoped. Apparently my About Me page is at an intermediate reading level. Hardly. So is sesamestreet.com. Wow.
Maybe this is just more Google fluff, and maybe it will improve. But it has me wondering about the signals and algorithm that determine this labeling. Are there any clues here for the reverse engineers to understand more about how Google thinks? Likely, Google was very careful putting this out… but still.
Update: Google gave us a little insight. It looks like a model was built off decisions made by teachers. I'm now thinking this is simply another algorithm strand layered into the Google rope.
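Google hasn't published how the classifier works, but classic readability formulas give a feel for the kind of surface signals such a model might start from. Here's a sketch of the Flesch-Kincaid grade-level formula, which is a standard readability measure and purely my own illustration, not Google's actual algorithm:

```python
import re

def flesch_kincaid_grade(text):
    """Approximate U.S. reading grade level using the Flesch-Kincaid formula.
    Syllables are estimated by counting vowel groups - a rough heuristic."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(
        max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words
    )
    return 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59

# Simple sentences score low; dense vocabulary scores high.
print(flesch_kincaid_grade("The cat sat on the mat. It was warm."))
print(flesch_kincaid_grade(
    "Epistemological considerations necessitate comprehensive reevaluation."))
```

A real classifier trained on teacher judgments would combine many more signals than sentence and syllable counts, but the intuition is the same: longer sentences and longer words push the estimated level up.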
ATG (a large commerce platform) just put out some interesting studies. 53% of the 1,002 people surveyed cited search engines as their key source for discovering new products.
Is this news? Not really. But I was interested to see how competitive email still is. I was also interested to see where social media (as a channel) resides. Social sits below in-store displays and offline signs. Wow. Even though it's fertile ground, this is a reminder that social still has a long road to full maturity.
Check out Search Engine Land for more stats.
Many business owners ask the common question, “Do I need SEO?” When I’m asked, I’m likely to recite any of the following.
In the meantime I’m working on a case study with a family member’s family law office in Reading, PA. Should have some data soon to really show the before and after of a 6 month SEO campaign. So far it’s pretty compelling.
Here are a few little tricks you can do to customize or filter Google results. These 4 are clutch tricks for me. I end up using these more than most other tricks in my arsenal (oh, there are plenty…):
Enter -site: to remove sites from the SERPs: If you're looking for competitors for a popular product, and keep seeing the big players, comparison shopping engines, or affiliates, and would like to get a better feel for the other players in the landscape, this trick works well. To see it work, search for a key phrase like Wilson Official NCAA Football. You may see sites like Amazon.com, Nextag.com, and Bizrate. Try the search again like this: -site:www.amazon.com -site:www.nextag.com -site:www.bizrate.com Wilson Official NCAA Football. See the difference? There are several ways you can use this filtering for your competitive education.
Discover related keywords: Google has the ability to show pages with keywords related to the actual keywords you searched. They'll do this when their algorithms suggest it's a better result. To get a feel for which keyword variations Google is considering, add a tilde (~) to the query. For example, Google ~sofa. At the very least this can inspire your keyword research.
Find File Types in a site: Doing a quick audit and want to see if a site is using a particular file type (like Flash)? This will give you some insight: site:www.nike.com filetype:swf
Figure out where those indented links really rank: Today a Google search (on my computer) for Frank Zappa will show you Zappa.com with an indented link for Zappa.com/whatsnew in the #2 position. Indented links are pages from the same domain that can show up anywhere in the bracket of 10 results, except Google groups them together for user value. In other words, although Zappa.com/whatsnew is shown at #2, it's not really the second result. It could be the fifth, or the seventh, or the tenth. When working towards SERP domination, it's important to know exactly where all the pages lie so you have a better idea of who you need to beat. Add &num=x to the end of the Google search query URL, where "x" is a number less than 10 (remember: without using Advanced Search, there are only 10 true listings in natural results on any given SERP). Keep experimenting with lower numbers for "x" until the indented link is gone. Once it's gone, you'll be able to surmise the actual position of the listing.
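These operators are all just text in the query, so they're easy to script. Here's a minimal sketch that builds a Google search URL with -site: exclusions and the num parameter from the tricks above (the endpoint and parameter names reflect Google's public search URL as I've seen it; they could change at any time):

```python
from urllib.parse import urlencode

def google_query_url(phrase, exclude_sites=(), num=None):
    """Build a Google search URL, prepending -site: operators to
    exclude domains, and optionally capping results per page with num."""
    exclusions = " ".join(f"-site:{site}" for site in exclude_sites)
    query = f"{exclusions} {phrase}".strip()
    params = {"q": query}
    if num:  # handy for hunting down where an indented link really ranks
        params["num"] = num
    return "https://www.google.com/search?" + urlencode(params)

url = google_query_url(
    "Wilson Official NCAA Football",
    exclude_sites=["www.amazon.com", "www.nextag.com", "www.bizrate.com"],
)
print(url)
```

The same helper covers the &num=x experiment: call it with num=5, num=3, and so on, and watch for the indented link to drop out.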
Today I was asked to look at a site and explain why it’s not ranking. The answer… the site was whispering.
If you don’t have content, Google won’t know what your site is about. But I don’t mean any old content. I mean HTML text.
Oh… you say you have HTML content? Let’s see if Google can hear it.
1. Perform a search in Google to get your page to show up.
2. Click the ‘cached’ link.
3. Click the ‘text-only version’ link.
4. Find a sixth grader and ask them to explain what this page is about.
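If you'd rather not click through the cache every time, you can approximate that text-only view yourself by stripping the markup. Here's a rough sketch using Python's standard-library HTML parser; it skips script and style content, which is a crude stand-in for what a crawler's text extraction does, not a replica of it:

```python
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    """Collect visible text, skipping <script> and <style> content."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth and data.strip():
            self.parts.append(data.strip())

def text_only(html):
    parser = TextOnly()
    parser.feed(html)
    return " ".join(parser.parts)

page = ("<html><head><style>p{color:red}</style></head>"
        "<body><h1>Sofas</h1><p>Hand-built leather sofas.</p>"
        "<script>var x=1;</script></body></html>")
print(text_only(page))  # prints: Sofas Hand-built leather sofas.
```

If what comes out of this is a thin trickle of navigation labels and little else, your site is whispering.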
I once heard that Google has the reading comprehension of a sixth grader. If that's true, then you need to speak to Google like a sixth grader. Give simple context, but be specific. Speak up! Promote your message and hammer it home. Just don't mumble (and don't spam your pages with junk content).
Granted, there are several ways to add contextual relevance to a site; it doesn't all need to be in the body. Tags and links still play a big part, sure. But why be shy in the body of your website? Is it that "text is ugly"? Is it that "people don't read online"? All untrue. You read this post, and frankly, I think it looks rather beautiful.
Form vs. function, my friends. Form vs. function.
Note: The title is not How To Trick Google. I am not a spammer, not in the slightest. It's just not the side of the fence I reside on. But, as someone who breathes SEO, I do get curious about understanding blackhat techniques from time to time.
With all the technology that makes up webpages, and incredibly smart techies working as SEOs, it's interesting to see what clever things SEOs still come up with. Obviously Google engineers eventually learn all these new tactics, but are they really able to defend against them? They provide guidelines on their Google site, but those guidelines are usually written loosely. They often raise more questions than they answer. And I'm pretty sure that's why: by keeping the guidelines loose, Google can take a stance against a tactic without ever addressing (or even knowing about) it.
As a whitehat SEO, I talk about link building tactics that "are against Google's guidelines," or CSS tricks that "are against Google's guidelines." Not because I think Google can definitely catch them automatically, but because there's a possibility. There are humans behind Google's rankings, and they might hear about it. A competitor might report you. The Google toolbar, installed in a visitor's browser, may report back a different experience than the one Google's spiders see.
Even though I fall back on the loose guidelines myself, it does sound like a big "if." If Google wants to thwart spammers, maybe it's time to get clearer. Spend the time specifying the guidelines. Is the fear that specific guidelines would act as blueprints for spam techniques? Maybe, but they might also stop SEOs from walking in the gray.
Update (12-10-2010) – So today I’m not the Firefox fanboy I used to be. I’ve moved into Chrome. Check out Chrome Extensions That Make SEO Easier.
The Firefox browser is an amazing, innovative browser. It's fun watching IE copy its features (well, as many as its architecture can allow, which isn't many; MS doesn't rebuild, so Firefox should enjoy its notoriety for a long time to come). I was an early adopter, but it's pretty amazing how many people use this browser now; it's not just advanced web surfers anymore. I was helping my 60-year-old mother install a webcam and saw the Firefox browser. Impressed, I asked her how she heard about it. She said, "well, I don't want Spyware." Wow.
For those who still don't use Firefox, here are some reasons you should take the plunge. If you're a traditional IE user, believe me, learning this browser is a piece of cake.
Greasemonkey is a Firefox extension that allows for sub-extensions (called scripts, also found by Googling 'greasemonkey scripts' or something similar). Search Engine Journal just posted 14 Essential Greasemonkey Scripts for Google Searching, which includes a few I didn't know about. Some of these scripts are useful to the average searcher. They do a great job of summarizing each script, so take a look.
To use these scripts, you just have to install the Greasemonkey extension first, then go to the script pages and click INSTALL. It couldn't be easier.
There are plenty of Firefox extensions for search engine optimization, allowing for quick site audits, spider emulation, NoFollow checkers, user-agent switchers (view a site as Google), and code viewers.
Again, these sites do a great job describing and sending you to the tools. Tackle these after lunch for an hour, and I guarantee the web will look a lot better. Enjoy -