Update: 1/22/2013 – This post was written around the time Panda and Penguin were starting to make huge waves. We didn’t really have our hands around their targets yet. Although this is an old tactic that probably doesn’t have legs anymore, and could get you a date with a Google human reviewer if you abuse it, it’s still somewhat valid, at least as a general marketing play. If you are a full-time content marketer, you’re probably still talking about comment marketing in your circles. Many marketers I know still claim huge value in comment marketing as a source of new relationships. So I still use this tactic – more to find places where a good conversation may exist (and where I can leave a link that helps my SEO). That’s not to say I won’t / don’t comment on nofollow blogs. I go where the conversation is (with an eye to where my editorial comment could add some trust in Google’s algorithm). Google would be asinine to remove comment value altogether. For those who play by the rules, that’s about as editorial as you get. Killing comments would be cutting off their nose to spite their face.
I submitted this tip for a chance to win an 8 minute presentation at the Search Church through SEOmoz. I didn’t win. Am I bitter? Hell yeah I’m bitter, but instead I’ll probably be sitting in the audience with a basket of tomatoes ready to peg anyone with a worse tip than this.
As a link builder doing white hat work, you know it’s about PR, the pitch, and the R&D (researching prospects and developing relationships). It’s time consuming, and takes a lot of organization. “Did I follow up with that prospect? Did I just email him twice? Damn!” Sometimes you just want an easy way to get a few links. There’s always blog link networks… wait, scratch that. Well, there’s also CommentLuv.
What is CommentLuv?
CommentLuv is a WordPress plugin. With 73 million WordPress blogs out there, there are plenty of people who might be using this relatively popular plugin. To get the benefit of this tip, you need to be visiting a site with CommentLuv installed. You don’t need to have CommentLuv installed on your own WordPress installation, but you do need a WordPress installation. Yup – it’s a WordPress thing. Edit: Apparently CommentLuv will pull in posts from other non-WordPress sites. I didn’t know that. More joy!
Check it out on YouTube.
A site that has CommentLuv installed looks something like this, typically after the standard WordPress comment box.
Now watch what happens when I enter a comment, and use my GreenlaneSEO site as the website (that’s right kids, I use WordPress. Now go ahead and hack me. I have nothing left to lose!)
Boom. CommentLuv reached out to my WordPress blog, saw the last post I had created, and created this link. I can’t wait to see what it looks like when the blogger approves this comment. And since I’m a white hat SEO (for this post), I actually took the time to read the blog post and comment with something that added to the conversation. I used my real email address so I don’t look spammy to the blogger. I even used my real name because I’m, well, cocky.
Why are some of the CommentLuv links nofollowed, while others aren’t? It’s blogger preference – a setting in their plugin. If the blogger opted for the paid / pro version of CommentLuv, you’ll get even more options available to the commenter, notably the choice of a few other recent blog posts to link. You’ll see that from time to time. That’s right – some bloggers are actually OK giving you PageRank in your comment links (hint – I’m one!!!).
A side note on submitting comments on WordPress. Some installations will clear the comment once you submit, and some will show your comment in a moderation state. I’ve even seen the CommentLuv link get nofollowed while in the moderation state. Don’t panic. Drink a beer and relax. You have to be patient and wait for the blogger to approve your comment. If you leave a stupid comment, prepare to get canned.
How Do You Find These NoFollowed CommentLuv Blogs?
Google, of course. Just ask the all-powerful, all-wise, all-knowing algorithm who seems to get everything right but how to make use of social signals. Enter this query into Google’s search box:
inurl:"2012"+intext:"CommentLuv badge"+"recently posted"+"keyword"
Two of the quoted items – “2012” and “keyword” – are knobs, meaning you can change them. So what are we looking at here? The inurl operator tells Google to return pages with 2012 in the URL string. WordPress, by default, likes to include dates in your post URLs, and since you’d prefer more recent posts (so you know the blogger is still alive), you can enter 2012. Or, you can enter a keyword that you think will be in the permalink. For example, if I just wrote a post about Stratocaster guitars, I’d probably want my link to appear in a blog post about Stratocasters. More relevant link juice could get passed. It’s possible I’d enter “stratocaster” where “2012” currently is; that way I could bring up posts like http://www.jag-stang.com/faq/general/will-a-mustang-neck-fit-on-a-stratocaster/.
The next knob is the “keyword” knob. You want to dig up some posts with some similar keywords not just in the URL, but in the body as well. Enter that here. In the case where my inurl is “stratocaster,” my keyword might be “guitar” or “Fender” or “Eric Clapton.”
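If it helps to see the knobs spelled out, here’s a minimal sketch that assembles the query string. The footprint pieces come straight from the query above; the URL helper is my own convenience, not something the tactic requires.

```python
# Sketch: build the CommentLuv footprint query from the two "knobs"
# described above. The footprint strings come from the post itself;
# google_search_url is a hypothetical convenience helper.
from urllib.parse import quote_plus

def commentluv_query(inurl_knob: str, keyword_knob: str) -> str:
    """Assemble the Google query with the two changeable knobs filled in."""
    return (
        f'inurl:"{inurl_knob}" '
        f'intext:"CommentLuv badge" '
        f'"recently posted" '
        f'"{keyword_knob}"'
    )

def google_search_url(query: str) -> str:
    # Turn the query into a pasteable search URL.
    return "https://www.google.com/search?q=" + quote_plus(query)

print(commentluv_query("stratocaster", "Fender"))
```

Swap the two arguments per campaign – the footprint strings stay fixed, the knobs change.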
But we’re not done. By doing this you’ll get lots of potential (and relevant) CommentLuv pages. But are they fresh? Sure, Caffeine is supposed to make the results fresher, but not as fresh as humanly possible. But don’t worry. We have a filter for that as well.
1. Install a nofollow checker into your browser.
2. Use the search query I provided – tweak as necessary to find relevant blogs.
3. Scan the page quickly to see if the CommentLuv links are followable. If not, go back to the SERPs and pick the next link.
4. Once you find a good page, read (scan) the post and leave a thoughtful contribution.
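If you’d rather not eyeball every page for step 3, the nofollow check is easy to script. Below is a sketch using only Python’s standard library; fetching the page (urllib, requests, whatever you like) is left out, and the sample HTML is hypothetical.

```python
# Sketch of step 3: scan fetched HTML and keep only the followable links.
# Standard library only; fetching the page is left to you.
from html.parser import HTMLParser

class LinkScanner(HTMLParser):
    """Collect (href, is_nofollow) for every anchor on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel_tokens = (attrs.get("rel") or "").lower().split()
        self.links.append((href, "nofollow" in rel_tokens))

def followed_links(html: str):
    scanner = LinkScanner()
    scanner.feed(html)
    return [href for href, nofollow in scanner.links if not nofollow]

sample = (
    '<a href="http://example.com/a" rel="nofollow">nofollowed</a>'
    '<a href="http://example.com/b">followable</a>'
)
print(followed_links(sample))  # only /b survives
```

A browser nofollow-checker extension does the same thing interactively; this just lets you triage a whole SERP of candidates at once.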
Will this get you the ban hammer? It shouldn’t if you play it right. If there’s one thing I know, it’s how to get banned from Google (hey, everyone needs a hobby). But going into this with the actual intent of adding editorial value is what Google’s vague Webmaster Guidelines want.
If you wanted to go a little gray / black on this, there are scripts that can overwrite your RSS feed (where this info is being pulled from) and change your titles so that the anchor text that shows in the links is more of a keyword. But if an exact match anchor text is what you want in your link, I suppose you could also just name your post with that keyword. Done deal.
I actually find this to be a pretty fun tactic. Not only do I discover good content, but I get to engage in conversations that are relevant to my site and interests. I get to find content ideas, and I get to archive some new potential link prospects for guest posts. I’ve created a shared Google Doc with a few of my close SEO friends where we share a bunch of the good sites that we found using this technique.
Happy link building.
What a wild 2012 so far for Google. Through all my years in this game I’ve never seen them so aggressive in updating the organic search algorithm. If we’re entering a post-Panda era, it appears to be in name only. Labels aside, they’re still reinforcing the same fundamental pillar – cleaning the index of low-relevance litter.
In March, Google stopped sweeping nearly everything into the supplemental index (as Panda would do). Now they’re actually taking some of the trash bags they stored in the garage out to the curb. Instead of just the traditional filtering we’ve become accustomed to, tens of thousands of sites have been deindexed in March alone. It’s not clear whether it’s a penalty or just a new way of looking at quality.
What? You didn’t hear about it? Well, it’s not getting a lot of mainstream attention, possibly because we’re getting numb to all the flux. But if you play with link building blog networks (like Build My Rank or Authority Link Network), you’re probably in the know. These networks, which operate through thousands upon thousands of interrelated blogs, are now reporting a huge chunk of deindexed inventory – well over 50% of their sites. These are typically thin sites on flimsy domains with junky spun posts. But managing that was supposed to be a job for Panda – not a rabid grizzly bear.
Maybe the jail is just too full of criminals, and they’re starting to escape through leaks in each algorithmic update? I guess if that were the case, you need to eventually get the big guns out. In my opinion they’re a little trigger happy now.
I personally have a stable of “experimental” sites where three were kicked out last week. They weren’t the best sites, but they had original content and were true passion pieces. They were about things I actually cared about, and did have some expertise and strong points-of-view posted on them. They weren’t big money makers, and didn’t have much of an audience, but they did link (in some cases) to affiliate landing pages – in a tasteful and useful way. They weren’t over SEO’d, and they didn’t have too many ads above the fold. They had nothing to do with a link building blog network. If these sites got swept up in an aggressive algorithm, I’d say Google grossly miscalculated their value, or identified them incorrectly. To a few very niche visitors, I’m sure these sites were a decent pit stop on the world wide web. Hardly kickworthy.
With all the sites being cut I wonder how many innocent bystanders are getting caught in the net? Granted, that’s always been an issue with a search engine filter, but I’ve accepted them as the cost of working with Google. A lot of the “healthy” medicine that Google gives us winds up having side-effects. But a deindexing is a much more complex road to recovery. Even if I could turn these zombies back into people, how long would the scarlet letter be on them?
At the end of the day, Google’s spring cleaning might be a great thing, or a terrible thing for your site. If your competition is using these thin sites for links, or if these thin sites are blocking you out, you’ll likely see a bump in the SERPs. But if you’ve been working with some of these thinner sites (including sites like mine that really didn’t go against what is acceptable by Google), then it looks like a lot of your effort may have just been flushed.
I’m an SEO for a living, but even I get pitched. I don’t know if it’s a stale lead list, business directories, or random phone dialers, but yes – I get calls too. They always seem to know what’s wrong with my site – not just this SEO site, but my other websites as well. They’re clearly not always paying attention to who they’re pitching.
Oddly enough, every site I own seems to have the same issues according to the canned emails. Absurd.
Legit SEOs know they have to deal with the snakes in our industry, and if the snakes weren’t successful, they probably wouldn’t still be trolling businesses with mass-scale SEO offers. For the millions of spammy emails they send, they do hit some targets in the form of small to medium sized businesses. I’ve gotten many calls from old friends who work at, or own, a business that got hit with this pitch. “Bill – we just got this email. Have you ever heard of them? This seems too good to be true!” It is.
However, good SEO companies do outreach too. You can get pitched by good, reliable, talented SEO companies. But you’ll benefit by asking them the right questions, to make sure you’re weeding out the snakes. To my SEO readers, what would you ask and why? Add it to the comments.
- Can you guarantee rankings?
If they say yes, be suspicious. Any SEO can get you some rankings, but even the best SEOs can’t guarantee every ranking. I’d warn you against any SEO who says they can guarantee a top ranking for any term. You can’t make Google do anything (SEOs are influencers, not negotiators), and Google’s own favored rankings and universal results may simply be immovable.
- What would you consider a successful campaign?
If they say “lots of traffic,” ask them what they think about being held to conversions. Any SEO can get you traffic. They can spam through Twitter, or create irrelevant doorways. But what good is traffic that doesn’t convert? The great thing about SEO is it’s inbound by nature. It pulls in the searchers who are looking for you. People can be tricked into visiting your page, only to realize that they’re not where they wanted to be. They’ll bounce in seconds, but your SEO serpent will strut around like a hero. Hold them to conversions, and you’ll find out pretty quickly how confident your SEO is.
- What tactics do you use?
Some SEOs focus on link building, some focus on on-page contextual work, some focus on content marketing, and some focus on sitewide technical optimizations. If you have an SEO that only does one, you may be missing some necessary ingredients. Very few sites can be fully successful without all these ingredients – though it’s possible. If an SEO approaches you without giving you a real sense of skill in those areas, you may have someone who’s creating a quick, automated package that doesn’t have a lot of substance.
- What does your reporting look like?
The reason to ask this is obvious, but sometimes the obvious escapes us in times of need. Once you see a report, you’ll start to ask a lot of questions. You’ll be inspired. The reports an SEO provides can be a real indicator of how deep their engagement is going to be.
- Can you provide some references?
It’s awkward to call a stranger, but this is really important if you’re banking on an SEO. You need reviews of the SEO’s work. If they can’t point to happy clients who are willing to share their experience with you, consider that a red flag.
If you’re a seasoned SEO, you’ve probably heard about one of Google’s latest rich snippets – the “author image” snippet. Well, I finally got mine. And since there’s still a dust-up on how to get it, hopefully these super simple instructions will work for you.
To see it in action, you can Google the term “free seo audit”. You should see me in position 4.
The image appears to have had a nice impact. Not on rankings or impressions, but click-throughs. And that’s what I would expect. Typically in my days of ecommerce, the reviews and price rich snippet showed a much larger interaction rate than the listings without it. Why wouldn’t it be the same for a picture? It helps the listing stand out in a more profound way, and draws the eye to the title and meta description. Thus, a higher likelihood for a click.
From what I noticed, I got this image on February 4th. I was lucky enough to catch it on the day it went live.
Traffic for the first position, non-image listing was an average of about 15-20 visits a week. Now, in my two weeks since having the image, I’m averaging 30-40. I’m assuming “free seo audit” didn’t become a new trending topic in the last two weeks. I believe it was the image, and I’ll continue to monitor. I’ll let you know if anything changes.
How Do I Get The Author Image?
Here are the exact steps I took to implement Google’s authorship markup. I’ve seen a few pages that either got it wrong or overly complicated it. All you need is an “about me” page on your blog and a Google Plus profile. That’s right, folks – it’s a perk of buying into Google’s “Facebook Killer.”
- Create a link from your Google Plus profile to the “About Me” author page on your blog (Google will automatically add the proper “me” microformat to the link). For example, I created a link from my profile to my author page – greenlane.wpengine.com/blog/about. This page has all sorts of information on me as a human being on the internet.
- Next, I created a link from that about me page back to my Google+ profile (I used a graphic, but it doesn’t need to be one). Important: the link must carry the microformat rel=”author”. In my case, that’s just a standard anchor tag pointing at my Google+ profile URL with rel=”author” added.
- Now, this is the part I’ve seen done wrong on other sites. As you start to write for your blog, you need to add rel=”author” to the link on your name on each post. Nowhere else. I use WordPress, and was able to do it in the single-post.php template, on the variable that dynamically pulls in my name. Unfortunately, if you’re using another blogging platform, you’ll need to do a little homework on the web to see how to get rel=”author” into the link for your name. Maybe you’ll be lucky and your blog already does it by default. Update: 6/8/2012 – Yoast says Google reads it in the <head> tag, if that makes it easier to add for you.
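If you want to sanity-check that step three actually landed in your post templates, here’s a sketch that scans a post’s HTML for anchors carrying rel=”author”. Standard library only; the byline HTML is a hypothetical example, not my real markup.

```python
# Sketch: confirm a post's HTML carries a rel="author" link (step 3 above).
# The byline below is a placeholder example, not real markup.
from html.parser import HTMLParser

class AuthorLinkFinder(HTMLParser):
    """Collect hrefs of anchors whose rel attribute includes "author"."""
    def __init__(self):
        super().__init__()
        self.author_hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel_tokens = (attrs.get("rel") or "").lower().split()
        if "author" in rel_tokens and attrs.get("href"):
            self.author_hrefs.append(attrs["href"])

def find_author_links(html: str):
    finder = AuthorLinkFinder()
    finder.feed(html)
    return finder.author_hrefs

byline = '<p>By <a rel="author" href="http://example.com/blog/about">Bill</a></p>'
print(find_author_links(byline))  # ['http://example.com/blog/about']
```

An empty list back from your own post template means the markup didn’t make it into the page, and the image isn’t coming.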
Once the changes were made, it took about three weeks for the image to appear. Your mileage may vary. I’m not even positive Google will show these icons for everyone, or every post (as I’ve heard some people say). I’m not particularly active on Google Plus, so I would venture to guess that being a power user isn’t required (as some people have suggested).
Updated fun fact 6-8-2012: At first, if you had several pages rank at once for a keyword, you’d get your mug on every one of the listings. Looks like Google just updated this – now it appears your image will only show for your first listing.
I’ve given up reading Top 10 lists.
We’re an industry that taught the world “content is king,” and we certainly practice what we preach. We’ve also read a million times that a successful way to draw a reader or search engine spiders is to use something kitschy like a top 10 list (or top 20, or top 30). Clearly it’s worked for Billboard and Mashable as entertainment. I’ve certainly been swept up in the hype and recommended it to my clients more than once. I actually have a “Top X” list somewhere on this blog. But now I find myself ignoring tweets after tweets promoting another “brilliant” top 10 list. I’ve seen a million white papers in the last year that have promoted the “Top 10 Best Landing Page Tips,” or “Best Social Media Tips,” “Best SEO Tips,” etc. I’ve also seen the same posts again just using different words, almost as if it was spit out of The Best Spinner.
We’re talking about online marketing. It’s bigger than 10 – it’s bigger than a million – and these fluffy pieces tend to make people forget that advice is only as applicable to your marketing campaign as it is relevant to it.
Today I broke my rule. I just read a Top 10 from a popular search company, put out as a downloadable white paper (I know it’s a lead generation trick – I’m expecting to be ignoring a call any moment now). This document was clearly written to be generic “industry” fodder.
On this list, number four definitively suggested the best marketing landing page is bare-bones, one font page, with very little content, functionality, or design. Sure, you’ve seen that before, but you’ve also seen others say the complete opposite – that a long, content rich page is the way to go.
In our industry, for every expert opinion, there’s an expert opposing-opinion. But not everyone takes it with a grain of salt.
Both of these design “tips” are general, and don’t know a thing about your vertical, customer or visitor habits, business goals, products, brand history, or your own company experience. To me, that makes a lot of Top 10 lists nothing more than noisy fluff.
For example, are you running an inbound marketing campaign, where your top keywords are for a term or concept that the public isn’t really familiar with? Do you need to be brief because your searchers are qualified, or do you need to provide options or funnels to support further information gathering? In this case I’d have to think about what kind of landing page I’d want to create, but I’m fairly sure I’d be misled if I blindly followed this particular Top 10 list.
Personally, I think these lists need context, and need to be way more granular. Granted, they wouldn’t have as sexy a headline or as wide an audience appeal, but they’d be targeted and, well, useful. They’d actually provide content that is capable of moving the reader forward in their own goals. If these lists exist, then I’d be all for them, but right now they’re as real as unicorns. Maybe they’re just off my radar. My Twitter stream may just be too polluted with fluff.
In early November Google promoted their ability to execute AJAX/JS to index some dynamic comments. A few years ago, Jed Singer and I did some digging to see just how well Facebook pages were crawled and indexed. The answer – not very well, but Facebook still enjoyed decent rankings for profile and brand pages alike, despite spidering issues. Our review suggested a heavy dose of domain authority and backlinking signals, and not necessarily on-page relevance.
Then, suddenly, Facebook pages started to show up less and less (somewhere around the time “Google Me” was the rumor) except for specific people and brand searches. I assumed a manual algorithm tweak to clean up the search engine result pages, and make general Facebook pages less of a player. The same kind you saw with Digg pages, Amazon subdomains, etc.
But when I read that Google is getting better at interpreting Facebook comments, I assumed they also got better at reading all Facebook’s public tabbed content. Still, I assumed they wouldn’t change their algorithm suppressing Facebook rankings.
This is one of several examples I found. It worried me about the long tail for my own websites, and sure enough, there was Facebook SERP spam there too. I don’t know if Google took their eye off their algorithm and made some changes without considering their prior intention, or if this was a deliberate decision (can’t imagine why, though). I expect it won’t last long.
In the meantime, black hats will go at it, and white hats (and shoppers) will need to be annoyed by it. What a thoughtful holiday gift, Google!
If you work in an agency or run your own consulting business, I know you’ve been frustrated. So put your tongue in your cheek, and have fun venting with me.
By the way, this post was admittedly written with a few past clients in mind, in a previous life. I’ve since put myself in a position to only work with clients that match with my ideals. In consulting, that’s how it should be. I highly recommend happy relationships. It makes SEO fun.
I remember the Query Deserves Freshness (QDF) algorithm from over a year ago. This update gave fresher content a better chance at ranking. For example, if you’re looking for information on SMX East, you’d want to be served the 2011 conference… not the 2010 conference.
Well sometimes it worked, sometimes it didn’t. Like many people, we were directed to the 2010 SMX East page in September.
It makes sense that Google would reintroduce this now following so many Panda updates. It says to white hats, “go forth and write more. We appreciate it.” It says to black hats, “don’t quickly write crap as a spam tactic. Panda will get you.” Whether that’s true or not, I’m sure the timing of this update was by design.
The potential issue with this, in my mind, is how domain authority may play into it. Is this a factor that will weigh more heavily for big websites (high domain authority), which usually have the bigger budgets to hire writers to crank out new content regularly? I still like the old feeling of natural search as a ground where the small guys are on a level playing field with the rich dudes.
Need proof? Ask your mom to name her favorite store in the mall. Then ask her to find their website online. If she doesn’t use the search bar in her browser or go directly to Google.com, I’ll give you a dollar.
We know people are creatures of habit. Search engines have become the main touch point in their online day-to-day. For many people, if you restrict their access to a search engine, they’ll fall apart like I do when I’m given an O’doul’s at a party. Why didn’t your mother try to type the website directly into the URL bar?
On the bright side, when 50% of your natural search visits are from variations of your brand name or URL spelling, at least your numbers look good – that may give you some job security if you spin it right.
But Google is a pain in the ass because with all this ownership and inside data, they don’t really play well with others. They don’t share. Take WMT for example – it’s very thin data in the grand scheme of things. It’s like giving a free sample, but never letting you buy the product.
And what’s with the rich snippets? I give you the content, and you post it in your result pages? I want people to click through. How do I know they’re not clicking through? Because I’m not even clicking through on my own site. I’ve been sucked into the Google vortex.
Don’t get me started on feeds, schema.org, or Google Places.
Granted, there’s a lot to gripe about. When I go to a search convention, I see I’m not alone. I never skip a panel with a Googler – it’s always a fun beatdown. But the same thing happens every time. By the end of the scolding, the Googler – be it an engineer or product manager, essentially commits social suicide, as he throws his hands in the air with the answer, “I don’t know.” As hard as he tries, he can’t answer all the questions he gets. It’s not always because he’s not allowed to, but I think it’s because he doesn’t know how to.
I think there are ghosts in the machine. I envision a Dali painting with thousands of gears all clumped together. Turn one, and there’s no telling how many others will turn with it. At this point, I don’t think Google knows either. They can try to reshape the monstrosity, but by now the algorithm has to be pretty unwieldy. On top of that, there are several other algorithms running different Google properties that are probably comparably unruly.
But let’s face it. Google created this mess, and search engines created the SEO – both the good ones and the bad ones. But whatever hat you wear, we’re all dealing with our own KPIs against this mutant algorithm. So are our competitors. Trial and error, testing, and patience are the keys to building your experience. When you put together a marketing strategy, you’re typically trying to overcome an obstacle. You’re putting together a plan to move past the immovable objects. Google is an immovable object, so the SEO needs to strategize with that in mind. That’s far different from what many SEOs do by trying to defeat it or complain about it. Since goal setting is the key to proper strategy, the SEO (and the employer of the SEO) need to plan touchpoints that accept these realities.
Things to ask yourself when choosing your tactics:
- Why do I think this tactic will work? (What experience or data do you have)
- What will this affect if it does not work? (Will your scheduled touchpoints be affected)
- What is this tactic’s plan B? (To stay on schedule, what is your alternative tactic and bandwidth)
- What are the points where we measure and analyze? (When do you take a break and make sure you’re on track)
- When is it time to abandon the strategy? (Sometimes your best plans fall flat – be willing to accept that)
With this clarity comes opportunity. Again, many of our competitors are dealing with Google head on, and trying to plow through the algorithm instead of dance with it. They’re probably granted the same amount of time and budget as you. SEO is a household word in business, but it’s still rarely done right in the grand scheme of things. Spend some time with a clear understanding of what Google really is to your website, and spend more time in the planning stage.
If you haven’t heard, about a week ago Google rolled out a change that affects your data in analytics. I started seeing (not provided) as a natural search keyword. Google has decided to go all SSL on those of us who log into Google, and based on how they built it, that hides the keyword data from our analytics. They’re telling us it’s for privacy.
Let’s put it this way – if you have 100 natural search visitors in a month, and they all come to your site by Googling a different keyword, you’d expect to see 100 different keywords in your natural search keyword report. Now, if 50% of those 100 people were logged into Google (i.e., logged into Google Plus, or Gmail, or Docs, etc.), you’d see 50 different keywords, plus one (not provided) entry showing 50 visits.
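The arithmetic above can be sketched in a couple of lines. The 100 / 50 numbers are just the hypothetical example from the paragraph:

```python
# Sketch of the (not provided) math: total visits, each on a unique
# keyword, with some share of visitors logged into Google.
def keyword_report(total_visits: int, logged_in_visits: int):
    """Return (distinct keywords still visible, visits lumped into '(not provided)')."""
    visible_keywords = total_visits - logged_in_visits
    return visible_keywords, logged_in_visits

visible, not_provided = keyword_report(100, 50)
print(visible, not_provided)                    # 50 50
print(not_provided / (visible + not_provided))  # share of keyword data hidden: 0.5
```

The second print is the number to watch in your own reports – the fraction of keyword data Google has taken off the table.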
When I first heard the news I looked at my Google Analytics. The (not provided) entry represented only 0.1% of my natural search visitors. I checked again today, and it’s up to 5%. I expect it to grow as the change continues to roll out through data centers, and as more people join the Google products that require a login. Does this roll into mobile too? I assume so.
Why Is Google Really Doing This?
I don’t know. Right now it only affects natural search. If a user clicks an AdWords ad from Google.com, the keyword referral data is still passed through whether the user is logged into Google or not. Speculation is that display companies are using natural search data to better target their ads, and since Google is focused on the display game now (trying to own it… which they’re completely on pace to do), they’re possibly trying to lock away some of their keyword data. But those same companies could normalize the same keyword data from Yahoo/Bing and still get close.
If (not provided) grows, and a percentage of your keyword referral data is lost, will people start getting “rank crazy” again? Will people start scraping Google for rankings they think they should rank for, versus knowing they should (or shouldn’t) rank for with traditional ranking reports? Google hates when we scrape them and inflate their AdWords numbers.
But what really ticks me off is that I use my keyword data to better my visitors’ experience. With personalized search, social search, and all the cute little things Google does now, I get a lot of interesting queries in my keyword report. Sometimes they’re things I wouldn’t normally rank well for, but because there’s “some relevance” with my site, I get these rare keyword entries. They often inspire me to create content.
If I had a site for plastic sneakers, and I got a one time natural search keyword visit for “how to run with plastic sneakers and not get blisters,” I might assume there’s a pocket of people with that same question. I might write a blog post and answer the question. I might put an article on my main site to attract visitors. In the end, this might provide a great value to searchers, and my own website. But now, if the user who entered this query was logged in, I’d never see it in analytics. Inspiration may never hit. Everyone loses.
Ok, maybe right now it’s not something to freak out about. It’s another “Google wait and see” game, but we’re used to that now, aren’t we? This is just an odd one. Data is so important to content providers.