I remember when Google rolled out the Query Deserves Freshness algorithm over a year ago. That update gave fresher content a better chance at ranking. For example, if you’re looking for information on SMX East, you’d want to be served the 2011 conference… not the 2010 conference.
Well sometimes it worked, sometimes it didn’t. Like many people, we were directed to the 2010 SMX East page in September.
It makes sense that Google would reintroduce this now following so many Panda updates. It says to white hats, “go forth and write more. We appreciate it.” It says to black hats, “don’t quickly write crap as a spam tactic. Panda will get you.” Whether that’s true or not, I’m sure the timing of this update was by design.
The potential issue with this in my mind is how domain authority may play into it. Is this a factor that will weigh more heavily toward big websites (high domain authority), which usually have the budget to hire writers to crank out new content more regularly? I still like the old feeling of natural search as a ground where the small guys play on a level field with the rich dudes.
Need proof? Ask your mom to name her favorite store in the mall. Then ask her to find their website online. If she doesn’t use the search bar in her browser or go directly to Google.com, I’ll give you a dollar.
We know people are creatures of habit. Search engines have become the main touch point in their online day-to-day. For many people, if you restrict their access to a search engine, they’ll fall apart like I do when I’m handed an O’Doul’s at a party. Why didn’t your mother try to type the website directly into the URL bar?
On the bright side, when 50% of your natural search visits are from variations of your brand name or URL spelling, at least your numbers look good – that may give you some job security if you spin it right.
But Google is a pain in the ass because with all this ownership and inside data, they don’t really play well with others. They don’t share. Take WMT for example – it’s very thin data in the grand scheme of things. It’s like giving a free sample, but never letting you buy the product.
And what’s with the rich snippets? I give you the content, and you post it in your result pages? I want people to click through. How do I know they’re not clicking through? Because I’m not even clicking through on my own site. I’ve been sucked into the Google vortex.
Don’t get me started on feeds, schema.org, or Google Places.
Granted, there’s a lot to gripe about. When I go to a search convention, I see I’m not alone. I never skip a panel with a Googler – it’s always a fun beatdown. But the same thing happens every time. By the end of the scolding, the Googler – be it an engineer or a product manager – essentially commits social suicide, throwing his hands in the air with the answer, “I don’t know.” As hard as he tries, he can’t answer all the questions he gets. It’s not always because he’s not allowed to; I think it’s because he doesn’t know how to.
I think there are ghosts in the machine. I envision a Dali painting with thousands of gears all clumped together. Turn one, and there’s no telling how many others will move. At this point, I don’t think Google knows either. They can try to reshape the monstrosity, but by now the algorithm has to be pretty unwieldy. On top of this, there are several other algorithms running different Google properties that are probably comparably unruly.
But let’s face it. Google created this mess, and search engines created the SEO – both the good ones and the bad ones. Whatever hat you wear, we’re all dealing with our own KPIs against this mutant algorithm. So are our competitors. Trial and error, testing, and patience are the keys to building your experience. When you put together a marketing strategy, you’re typically trying to overcome an obstacle – putting together a plan to move past the immovable objects. Google is an immovable object, so the SEO needs to strategize with that in mind. That’s far different from what many SEOs do when they try to defeat it or complain about it. Since goal setting is the key to proper strategy, the SEO (and the employer of the SEO) need to plan touchpoints that accept these realities.
Things to ask yourself when choosing your tactics:
With this clarity comes opportunity. Again, many of our competitors are dealing with Google head on, and trying to plow through the algorithm instead of dance with it. They’re probably granted the same amount of time and budget as you. SEO is a household word in business, but it’s still rarely done right in the grand scheme of things. Spend some time with a clear understanding of what Google really is to your website, and spend more time in the planning stage.
If you haven’t heard, about a week ago Google rolled out a change that affects your data in analytics. I started seeing (not provided) as a natural search keyword. Google has decided to go all SSL on those of us who log into Google, and based on how they built it, that hides the keyword data from our analytics. They’re telling us it’s for privacy.
Let’s put it this way – if you have 100 natural search visitors in a month, and they all come to your site by Googling a different keyword, you’d expect to see 100 different keywords in your natural search keyword report. Now, if 50% of those 100 people were logged into Google (i.e., logged into Google Plus, or Gmail, or Docs, etc.), you’d see 50 different keywords, and one (not provided) entry showing 50 visits.
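If you want to put a number on it yourself, the arithmetic is simple enough to script. Here’s a minimal sketch of computing the hidden share from a keyword report – the keywords and visit counts are made up for illustration:

```python
# Rows as they might appear in a natural-search keyword report:
# (keyword, visits). All data here is hypothetical.
keyword_report = [
    ("(not provided)", 50),
    ("plastic sneakers", 20),
    ("cheap plastic sneakers", 18),
    ("sneaker blisters", 12),
]

total_visits = sum(visits for _, visits in keyword_report)
hidden_visits = sum(
    visits for keyword, visits in keyword_report
    if keyword == "(not provided)"
)

share = hidden_visits / total_visits
print(f"{share:.0%} of organic visits have no keyword data")
```

Run that against a real export and you can track the (not provided) share month over month as the rollout spreads.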
When I first heard the news I looked at my Google Analytics. The (not provided) entry represented only 0.1% of my natural search visitors. I checked again today, and it’s up to 5% of my visitors. I expect it to grow as it continues to roll out through data centers, and as more people join up with the Google products that make them log in. Does this roll into mobile too? I assume so.
I don’t know. Right now it only affects natural search. If a user clicks an AdWords ad from Google.com, the keyword referral data is still passed through whether the user is logged into Google or not. Speculation is that display companies are using natural search data to better target their ads, and since Google is focused on the display game now (trying to own it… which they’re on pace to do), they’re possibly trying to lock away some of their keyword data. But those same companies can normalize the same keyword data from Yahoo/Bing and still be close.
If (not provided) grows, and a percentage of your keyword referral data is lost, will people start getting “rank crazy” again? Will people start scraping Google for rankings they think they should rank for, versus knowing they should (or shouldn’t) rank for with traditional ranking reports? Google hates when we scrape them and inflate their AdWords numbers.
But what really ticks me off is that I use my keyword data to better my visitors’ experience. With personalized search, social search, and all the cute little things Google does now, I get a lot of interesting queries in my keyword report. Sometimes they’re things I wouldn’t normally rank well for, but because there’s “some relevance” with my site, I get these rare keyword entries. They often inspire me to create content.
If I had a site for plastic sneakers, and I got a one time natural search keyword visit for “how to run with plastic sneakers and not get blisters,” I might assume there’s a pocket of people with that same question. I might write a blog post and answer the question. I might put an article on my main site to attract visitors. In the end, this might provide a great value to searchers, and my own website. But now, if the user who entered this query was logged in, I’d never see it in analytics. Inspiration may never hit. Everyone loses.
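While we still have the data, a crude filter can surface those inspiration queries from a keyword export. Here’s a sketch – the queries are made up, and the heuristics (rarity, length, question words) are purely my own assumptions, not any kind of rule:

```python
# Hypothetical (keyword, visits) rows from an analytics export.
queries = [
    ("plastic sneakers", 120),
    ("how to run with plastic sneakers and not get blisters", 1),
    ("are plastic sneakers waterproof", 2),
    ("sneakers", 300),
]

QUESTION_WORDS = ("how", "why", "what", "are", "can", "should")

ideas = [
    q for q, visits in queries
    if visits <= 3                      # rare, long-tail entries
    and len(q.split()) >= 4             # long enough to carry intent
    and q.split()[0] in QUESTION_WORDS  # phrased as a question
]

for idea in ideas:
    print(idea)
```

Each surviving query is a candidate blog post or article topic.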
Ok, maybe right now it’s not something to freak out about. It’s another “Google wait and see” game, but we’re used to that now, aren’t we? This is just an odd one. Data is so important to content providers.
They say you ground your current experiences in past experiences. I worked in the music industry in the ’90s. Think Napster, Chemical Brothers, and music festivals. For me, the SEO blogosphere is reminiscent of that time.
I’ve been doing SEO for 11 years. There have always been SEO rock stars. Like Hendrix, many of them were pioneers of a new frontier. These SEOs are still around, but for one reason or another, many seem to have gone the way of Foreigner.
But today it’s a much different scene. We have a much bigger industry and a heap of digital communication platforms. We’re so much more than just the HighRankings forum now. Still, I continue to see an odd centralization on today’s perceived rock stars. Almost as if there’s a (gasp) mainstream. It’s amazingly cool to watch people sign autographs at SMX, even if these people won’t reply to you on Twitter. It’s also funny to see the egos on some of these peeps, the likes of which I haven’t seen since the singer of Everclear (that’s right, whatsyourname singer from Everclear… Took me 10 years but I’m finally calling you out! I didn’t forget our fight!).
How can there even be a mainstream? There are hundreds more verticals than styles of music, hundreds more strategies than pop song formulas, and an endless need for experimentation. When’s the last time a rock star in the mainstream did anything new? And I’m not counting a meat-dress as experimental.
Looking back, Sphinn was pretty bad. Some of the most useless SEO content was sphunn up because of the name of the author or curator. But if you bothered to dig deeper, there was some great indie stuff. Google Plus is better because of the difference in interaction, but can be just as bad. In this case the curator (or DJ???) gets more rock god status. Twitter is the wild west, but my choice for really digging deep.
With all that said, there are still great, “followable” people who have achieved rock star status. Rand Fishkin’s team at SEOmoz is still making hit songs. My friend Wil Reynolds and the SEER team still teach me actionable stuff weekly. They’re still highly relevant for the style of SEO I do. Alternatively, other friends like Eppie Vojt, Ian Howells, John Doherty, and Mike King make me take notes – these guys may not be on the Billboard Top 20, but they’re brilliant players. That’s who’s in my feed reader and on my Twitter list. I have a lot more of these indie rock guys than the mainstream players (with notable exceptions).
I’m not saying you need to go alternative. I’m saying you should check to see if you’re looking deeply enough for your taste in SEO. And if you’re not stealing licks (in other words, actively applying what you learn), you may not be following the real artists of today’s SEO scene.
SEO is about searching; it may take a while longer to uncover some new personal rock stars, but so what? Is this your passion? Rawk on!!!
In a previous post I was ranting about how the marketing industry seems to write for the sake of being noticed, but it’s akin to millions of people throwing confetti in the air. We’re taught that content is king. We’re told it will help us stand out and/or get noticed by the search engines. Here’s the reality – nobody really stands out if they’re just writing small, nondescript content. It’s just noise. You’re actually doing more harm than good by wasting readers’ time. Usually a single post determines whether you get added to an RSS feed, retweeted, liked, or linked. It’s likely not going to make much of a dent in Google either.
I suggested not rehashing someone else’s opinion, but coming up with your own. Simple enough, right? If you don’t have a take, or something to offer, maybe don’t write.
So what does that leave? What do you write about? Where do you start? Hell if I know – I’m not in your industry and haven’t taken the time to think through your industry. But I do have a few places you might think about going (these are my sources of inspiration).
1. Technorati – If you use Technorati wrong, you stand the chance of writing duplicate content. But if you search for a keyword (use the advanced search), and scan the resulting topics alone, you should have your first ingredient for an angle. What’s missing? Write it.
2. Your Sales Team Or Customer Service – These guys know the ins and outs of customers’ needs and problems. Have a meeting, and figure out what the issues are. Write something that gets ahead of these issues. This alone might be content for years.
3. Q&A Sites – From Quora to Yahoo Answers, people are asking questions. If you don’t like the answers you see, add your own answers there – and on your own website.
4. Use Social Mention – Conversations are happening, and even though you’re not in them, you can facilitate the conversation with something well written and targeted. Social Mention will help you track those conversations on topics you’re an expert on.
What are your sources of inspiration?
I’m begging the marketing community – please drop “content is king” and replace with “unique, relevant opinions are king.” I can’t take the flurry of noise in my inbox, reader, and twitter stream anymore.
I love when clients ask, “What keywords am I ranking for?”
Well, unless you’re ranking for a keyword, and people are clicking through (where I can see it in your analytics), I have no idea. I suppose I could do a few hours of keyword research and run that through a rank checker to see if you’re in contention. Or I could use SEMrush.
SEMrush is a great tool, with a lot of keyword data for both PPC and SEO. Basically they create their own ranking index of the web. And, they’re pretty accurate. It’s great as a competitive tool – enter in a competitor and check out what they’re ranking for. This is eye-opening for several reasons. With estimated volume along with each keyword, it’s great to illustrate keywords you’re not optimizing for (and maybe should be).
It does a few other “nice to have” things, like providing quick snapshots of competitors in Google. Granted, you could do this yourself pretty quickly with your own Google search, but SEMrush gives you a decent .csv output. I also like the competitive ad review feature (helps you figure out how your ad copy stacks up with your competitors).
This tool has been out for a while, though I’m always surprised how underutilized it is. Give it a spin.
You know how Eskimos supposedly have more words for “snow” than any other language? OK, maybe that’s an urban legend, but as an SEO, how would you optimize a site for that kind of redundancy? If you’re working on an eCommerce site, you probably feel like you have that problem all the time.
“I have 30 collection pages here that would be a good landing page for shirts!!! What do I pick? Where do I start?”
Do you decide to try to get all the pages to rank for shirts, and hope that one comes up well in the rankings? Do you pray to the duplicate content filter gods and just wait and see what ranks? Do you beg the catalog managers to change the collections (which is probably an uphill battle that could have a dangerous effect on usability and sales)? Or do you get more proactive in your SEO efforts?
Having 30 pages that speak to the same thing gives you a lot of opportunity, after you trim the fat. How are the duplicate pages created? Are they based on parametric filters? Sorting? Internal search queries? Which pages get the lion’s share of the traffic? Which have the best conversion rate or the lowest bounce rate? Which are really the important pages in your pack? Determine the fat and the lean; make this step one.
Once you figure this out, you should work to close these paths. Granted, on a dynamic platform you may not have the ability to add per-page meta robots, nofollows, or a tightly tuned canonical tag. What about Webmaster Tools, and their cool parameter-blocking feature? It might be an option.
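As a sketch of that first step, here’s one way to group URLs by a normalized form so you can see which parameters are spawning the duplicates. The parameter names (`sort`, `sessionid`, etc.) are assumptions – swap in whatever your platform actually generates:

```python
from collections import defaultdict
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical "noise" parameters that create duplicates without
# changing the page's content.
NOISE_PARAMS = {"sort", "order", "page_size", "sessionid"}

def canonical_form(url):
    """Strip noise parameters and sort the rest, so duplicates collide."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in NOISE_PARAMS]
    return urlunparse(parts._replace(query=urlencode(sorted(kept))))

# Sample crawled URLs (made up for illustration).
urls = [
    "http://example.com/shirts?color=red&sort=price",
    "http://example.com/shirts?sort=name&color=red",
    "http://example.com/shirts?color=blue",
]

clusters = defaultdict(list)
for url in urls:
    clusters[canonical_form(url)].append(url)

for canon, dupes in clusters.items():
    if len(dupes) > 1:
        print(f"{canon} <- {len(dupes)} duplicate URLs")
```

Every cluster with more than one member is a candidate for a canonical tag, a meta robots directive, or a parameter-blocking rule.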
Let’s be honest – in enterprise SEO, it’s really not likely that you’ll be able to tighten the site to your exact specifications, no matter how big (and hard working) your team is. In enterprise SEO, it’s about locating the biggest holes and plugging them first. It might be more of the 80/20 rule, or a matter of specific initiatives to tie with other channels. Just don’t expect to eat the whole pie.
So back to our opportunity. Once you’ve cut the fat and done your best to get it off your plate (which may require you to monitor Google to see if their internal duplicate content consolidation is working), you get to have fun. Let’s say you’ve been able to identify five pages that would be good candidates for optimizing with the term “shirt.” Let’s go a step further and pretend they’re the same types of shirts – tee shirts. Do your keyword research. Grab up the best terms. Compare your current placement for each term, against your competitors. Look at PPC data (helps you understand demand and opportunity). Look at each page’s back link portfolio (or hit up Linkscape) to get a sense of what page is likely already juiced up.
Pretty soon you’ll find that for your five pages, you have a handful of similar terms that need a home: tee shirts, printed tee shirts, cotton tee shirts, etc. That’s right, folks. You’re probably going to be playing with the long tail… the SEO devil’s playground. Expect to keep notes while you plug some content in, and fuel with backlinks. Wait a few weeks (or months, depending on how strong your site is), and measure. Was there MoM/YoY growth (a lot of eCommerce is seasonal)?
The goal here isn’t just to avoid cannibalizing terms, but to avoid cannibalizing themes as well. You really want to theme these five pages out. For duplicate page A, start theming it about printed tees. The cool style, what people like about them, the variety in full body prints, etc. For duplicate page B, start theming out the cotton blends and how they’re durable. Get creative, and start dabbing your uniqueness all over each canvas, using different colors.
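One way to keep yourself honest while theming is to maintain a simple page-to-terms map and flag any term you’ve accidentally assigned to two pages. A toy sketch – the page paths and terms are illustrative, not from any real catalog:

```python
# Hypothetical theme map: each page owns its own set of terms.
theme_map = {
    "/shirts/tees": ["tee shirts"],
    "/shirts/printed-tees": ["printed tee shirts", "full body print tees"],
    "/shirts/cotton-tees": ["cotton tee shirts", "cotton blend tees"],
}

# Flag any term assigned to more than one page before you start optimizing.
seen = {}
conflicts = []
for page, terms in theme_map.items():
    for term in terms:
        if term in seen:
            conflicts.append((term, seen[term], page))
        seen[term] = page

print("conflicts:", conflicts)
```

An empty conflicts list means no two pages are chasing the same term; anything in it is a cannibalization risk to resolve before writing content.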
Once you’re done, sit back and watch. Start measuring. Keep a log. You put in a lot of work, and how well it pays off depends on the decisions you made at step one. Maybe you find it didn’t pay off as well as you’d like, but I assure you, after doing eCommerce SEO for 10 years, it takes practice getting your methods down. And because ROI is your challenge, you might find it’s also your best friend, since it’ll keep you happily employed.
Google, always trying to push social search (despite early missteps), has a good one here. The new +1 button, rolling out now (when you’re logged into your Google account and choose to opt into the experiment), lets searchers know how popular a listing is, à la StumbleUpon’s plugin.
After speaking with Google, I can tell you this is not yet affecting rankings or quality score – but they said it will. Straight from a product manager’s mouth. He couldn’t tell me when – they need to evaluate, I’m sure.
But this is a useful thing for reinforcing CTR on a brand, not to mention owning more search engine result page real estate. Thanks to social additions, my listing now pushes other sites below the fold even more than before. If I own a few top listings, I can ultimately “black out” the results, pushing a fourth or fifth listing below the fold.
Is it like Facebook’s Like button? Definitely, especially since soon you’ll be able to add a +1 widget to your websites. But this has something Facebook doesn’t have… the ability to “like” something in Google results. That’s pretty big.
More info here.
Updated: Feb 23 2011
Customers still rely heavily on search engines to find web-based mobile sites. It’s not unlike traditional SEO in many technical ways (Google still cares about the keywords and the links), but is very different when optimizing for user or customer value. To optimize for search engines on behalf of the mobile user or customer, you have to think about what the mobile searcher is looking for when searching on a phone. The answer: relevancy, speed, and good usability. Identify your landing pages that are best suited for them and think about how you can optimize for the phone. We’re trying to attract mobile users in addition to desktop/laptop users. But mobile users have a larger sense of urgency.
Phones are not used like desktops and laptops. They’re not even like iPads. Customers on mobile are on the move. Assume they’re short on time; they may quickly be approaching a bus stop, or walking into a store. Maybe the light just turned green (scary but true) and they need to get back to driving. When we optimize a mobile page, we need to identify and provide the key answer in the title, meta description, and body copy with as few words (and keywords) as possible. We need to be much more concise and specific so the mobile user can identify the best results faster. We need to pay closer attention to the query intent. If that means more specific mobile landing pages (and fewer general, high-keyword-frequency pages), so be it. Granted, that goes against some traditional SEO strategies. From the little data Google has revealed about differences in the way they approach mobile sites, our best hypothesis is that they’ll continue reevaluating your keyword choice from a mobile perspective. You already get personalized, GPS-powered mobile results from Google sniffing your smartphone browser now, so this isn’t really a stretch.
The mobile searcher is likely searching for a quick one-sentence answer. Or a price. Or a location. Or a quick review. Microformats and location tagging will likely take a larger role. Mobile users don’t want to zoom in and out of a page all the time (if their phone even enables it); they’ll often back out and view other Google results for the best visual snapshot (even if it’s not the most relevant page to the query). Usability plays a different, but equally important role as it does now. In general, if our goal as SEOs is driving qualified traffic from the query all the way to the shopping cart, sometimes we need to be focused on design and usability.
Old school technical SEO still needs to be a factor. Most developers create a different URL for mobile sites when it’s not necessary. I see the “m.” subdomain used. If you share your mobile link through an online social channel, you’re sharing the m. version. If your logic properly redirects a user through that link to your desktop version, the user is still being served a redirect. There’s some loss in link juice there even if it’s a 301. At least use an /m/ directory and turn off user-agent switching on internal links, so you can get some links that help your overall domain authority. Currently Google has its normal Googlebot, and Googlebot-Mobile, which crawls content for traditional phones – not smartphones (with the exception of a recent iPhone Googlebot that’s been testing). Google believes smartphones can see the web just fine and don’t need their own bot. If that’s the case, there really isn’t much reason to create a new URL anyway if the content you want a person to see on the phone is the same as the content on the desktop. Just create a different CSS stylesheet for a more mobile layout.
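To make the user-agent switching concrete, here’s a rough sketch (not production code) of the kind of 301 logic involved. The UA patterns are assumptions, and a real implementation would usually live in your web server config rather than application code:

```python
import re

# Hypothetical mobile user-agent patterns -- tune against real traffic.
MOBILE_UA = re.compile(r"iPhone|Android|BlackBerry|Windows Phone", re.I)

def mobile_redirect(path, user_agent):
    """Return a (status, location) pair for a 301, or None to serve as-is."""
    if MOBILE_UA.search(user_agent) and not path.startswith("/m/"):
        return (301, "/m" + path)
    return None

print(mobile_redirect("/shirts", "Mozilla/5.0 (iPhone; CPU iPhone OS 4_0)"))
print(mobile_redirect("/shirts", "Mozilla/5.0 (Windows NT 6.1)"))
```

Using an /m/ directory on the same domain (rather than an m. subdomain) keeps any links the mobile URL earns consolidated under one host.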
Mobile will only continue to grow. Additionally, more iPad-like tablets are slated to come out, which blurs the lines a little more between what is a mobile device and what is a desktop device. Google will continue to take the non-desktop search and web experience seriously. So should we.