Updates to this post here: Maybe Authorship Is A Defensive Play
I buy into authorship / author rank as a ranking signal. I can envision the Googlers at a whiteboard, drawing out the web of connections behind making this one supplemental ranking algorithm. The social signals we expected to take dominance seem to have flamed out before they became a product release, while the spammer market continues buying and selling "likes" and "pluses" like blood diamonds.
The question I see bloggers ask is, "how can Google favor sites fairly through authorship when some high-profile domains don't really promote authors, or don't have enough concentrated authors?" Think through this. Here's my WAG on what's going to happen (I've actually been pretty accurate so far… I know, I'm surprised too!).
First, this is likely going to be one algorithm in the pack with all the others. While some algos routinely get turned down as others get turned up (remember, "the natural search algorithm" we refer to is really more like a rope woven from many smaller algorithms), this one would stay dialed in. It will influence rankings, but not at the cost of domain authority. It would have to be a conditional algorithm, maybe based on the type of sites it ranks against. It's not going to be a game changer that makes everyone change the way they produce content on the web. As crazy as Google can be, they're not trying to turn users into their dogs.
Whether or not it's an intended component of author rank, I see it as a counter-balance to the "Google favors big brands" complaint they weathered in the past (and that I accuse them of again now, thanks to Penguin). Authority (using this not in the more common off-page sense, but as on-page authority) could previously only be counted through a brand's impact on the web. Now Google is digging deeper to tease the experts and relevance apart, as opposed to the former approach of aggregating everything. Google makes things we think aren't scalable just that.
In this there’s a chance that good writers can get the chance to beat, say, Mashable calibur sites, and give their little webpagse the chance for its 15 minutes of fame, because Google could potentially see it was deserving of it based on the writer’s past proving of themselves. Here’s an example – once upon a time I wrote a post about the link shortener services that passed SEO value. I went through every one I could find and tested each one. I got some traffic because I tapped into an interest before anyone else (it seemed). But then, the might Search Engine Land posted a very similar article, and mine got crushed. Still some first page rankings, but ultimately it faded. In this new model, despite my GreenlaneSEO having a PR of 2, I might actually have a chance to beat SEL and maintain some rankings for my own authority thanks to the added push of my personal brand equity.
Maybe I got authorship all wrong, but it makes sense to me. This is an optimization Google needs to make, and it looks to me like they know it. What do you think?
A clarification before we begin. I don’t think Penguin will be eradicated or the name becomes a distant memory like Joe Piscipo (spelled wrong in case Joe has himself on Google Alerts – I’ve seen his arms!), but I am referring to pulling back on some of the areas that are overclocked, and reversing SOME of Penguin. There was some confusion by people who jumped at the title without reading the post. Fair enough, so I’ve tweaked the title for those I’ve infuriated. You’ve made my day.
I don't know if Penguin is a penalty or a tweak (Update 7.28.2012 – you're reading a post that's a bit outdated; Matt Cutts said this is not a manual penalty) – I don't care. At the end of the day it is a further attempt to organize the index to show less webspam. Either by downplaying some factors or emphasizing others, it's a calibration nonetheless, and one that has thrown more babies out with the bathwater than I have to believe was intended.
The best time to go on vacation is when Google makes a big algorithm adjustment. Ignore the posts for a couple weeks. When the dust starts to settle, and you see the end result publicly declared "bad rankings" across the board, it's pretty hard for a company not to be reactionary. Google, which is usually pretty staunch, has to be listening to this one.
The SERPs look like they did a few years ago, when Google was getting heat for favoring big brands, which ultimately came from high domain authority. That didn’t bode well for them then, and it will be worse now.
These Google engineers are smarter than I’ll ever dream to be, but I truly believe that the algorithm they created is a monster. A series of thousands of gears built upon each other, so deep and complex that a master blueprint doesn’t even contain it. Until a Googler tells me otherwise (and even then I’m sure I’ll doubt), I think a lot of their search quality meetings end with, “Ok – let’s make that change and see what the hell happens.” I don’t think they will ever understand the true extent of what even a simple tweak will do. Forget this 3% or 7% shit – it’s clearly been a variable number with a huge “give or take.”
So, I think Google will silently develop another tweak and pull this one back. Maybe the requirement will be to pull back an offending Panda update that just isn’t meshing well anymore in this jumble, or pulling back the scrutiny lens on anchor text (including internal). I don’t know. But I do believe that headlines like As Google Tweaks Searches, Some Get Lost in the Web, from the Wall Street Journal, get passed around in the C-suite pretty quickly. Google knows perception is reality, and doesn’t want to be seen beating up the little guy.
If Google does reverse Penguin – and by "reverse" I mean pull back some of the overclocking errors – I have to think (and hope) there's a down-turn in current domain authority factors, and a real algorithm thread that truly values this "quality content" every SEO post tells us to create. But without parameters, who the hell knows what a confused algorithm will consider quality.
I’m going back on vacation. Let’s see if Sebaldamus is right on this one!
Update 7.28.2012 – They still haven’t. Damn. The Google Dance is still on high. Results are still favoring brands, and SEOs are scrambling to develop more link tools to make a quick buck (like link removal tools). Penguin is starting to settle in, and as Wil Reynolds said at Mozcon 2012, it might have been the best thing to happen to our industry, and might actually improve the reputation of the SEO industry while allowing us to benefit Google. If they can just make the rankings good, well, that would be super.
Like this post? Vote for it on inbound.org.
Google Analytics is rolling in reports to help you answer this question. Well… kind of.
Check out the Social Sources report:
The first thing you'll notice is two graphs to compare against each other. The top is your social referrers (that is, traffic from all the sites that Google buckets as social), which is broken out in a detailed list further down the page.
Let's drill in one step deeper. Click one of the listings (e.g., Twitter, Facebook, Google+, etc.).
Clicking the social platform you want to compare takes you into that profile. You can now change between different pages with a new selector that appears above, which looks like this:
So what are we comparing?
We’re comparing Visits via Social Referral (blue) with All visits (orange). So, it’s a quick view of how much social traffic contributed to your overall traffic. Are you doing a lot of social media work? Did you have a bump on a Friday, and wanted to see where it came from? Go to this report. Set your date range and you’ll be able to see pretty quickly.
But it gets more interesting. Click the Activity Stream tab:
Now the comparison changes to show Data Hub Activities (blue) vs. Visits (orange). These are the same “visits via social referrals” that were in the first snapshot. So what’s this Google Data Hub? Google says, “The social data hub is a free platform that social networks and other social platforms can use to integrate their activity streams.” Sounds like Google’s version of Facebook’s social graph.
So this makes sense. If you look at the Activity Stream, there are far fewer sites than Google was originally reporting. Missing for me are Twitter, LinkedIn, Facebook, and others. Why? Because they're not playing ball with Google and the data hub. Google doesn't have information about shares and retweets here. In other words, those networks are outside the wall. Google knows Twitter is a social network, and buckets it that way, but it doesn't have accurate data out of the Twitter firehose.
But what we can see from the sites participating with Google is what traffic you receive from social engagement. For example, here’s what it looks like filtered to Google+.
By looking at the graph above, I can see that on Monday, May 7, a link from my site was interacted with 4 times (blue), and led to 2 Google+ referrals (orange). For you data junkies, if you have enough data you can put together your own value of social with your own KPIs per platform. You can determine that spending most of your time on one network, vs. another, is a wise or dumb move. Or, you can rely on the “conversions” report right below the sources report (if you use “goals”). Do you have to be more social for your KPIs? Or do your current circles, say Google+, just not give a damn about the latest kind of content you’ve been sharing? […]
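If you export the underlying numbers, the comparison this report draws is really just a share calculation you can run yourself. Here's a minimal sketch – the platform names and visit counts below are invented for illustration, not from a real account:

```python
def social_share(social_visits, all_visits):
    """Percent of total visits each social platform contributed --
    the blue-vs-orange comparison the report draws."""
    return {platform: round(visits / all_visits * 100, 1)
            for platform, visits in social_visits.items()}

# Invented figures for illustration
visits_by_platform = {"Google+": 120, "Reddit": 85, "Pocket": 40}
print(social_share(visits_by_platform, all_visits=2400))
# → {'Google+': 5.0, 'Reddit': 3.5, 'Pocket': 1.7}
```

From there you could weight each platform's share by your own per-platform KPIs to decide where your social time is best spent.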
Once upon a time, the prince of SEO, Matt Cutts, said that only a small percentage of web links are nofollowed, and we shouldn’t pay much attention to it.
I must only be surfing that small portion of the web, because I rarely see external "editorial" links that are followable anymore. Or it's media sites creating entity profiles so they never have to link out externally (à la Wikipedia). I think it's sad. Many of these are the editorial links Google originally built an algorithm around; Google simply failed to measure the link graph for a publisher's intent.
Google. What have you done???
So I have questions for all you bloggers, webmasters, spammers, etc:
The major blog and social media platforms nofollow all posted links by default. They blame spam, but in this automated world, where spammers press a button and blast out 10k blog comments, is the nofollow really deterring anyone?
We've been told that PageRank sculpting doesn't work anymore, but are some of us still concerned with leaking PageRank? Or have other signals stepped up to pick up where PageRank leaves off?
Does having a lot of nofollows signal to Google that you care about them not misunderstanding your endorsement, or does it signal that you really don’t care who you link to?
Or do you think the nofollow is being counted (somewhat) by Google now anyway, and it doesn’t really matter?
Personally, I leave this blog dofollow. I get a lot of spam, but it gets caught either by my spam script or by my own eye. It's not difficult to moderate – in fact, it's actually fun. I see the comments and get a chance to contribute to the conversation. My old company used to moderate comments for the NFL and other leagues; it was quite manageable. In the past I had clearly marked rules and regulations on my own sites, stating what kind of comments and guest posts I would allow (or turn the "dofollow" on for). If someone gave enough of a damn to leave me a comment and engage me, I'd like to see them get a little token of my appreciation.
I think the whole nofollow thing is a Google protocol that has gotten out of hand, and in light of Pandas and Penguins, I think we need these good editorial links back. I think we need a fundamental shift in this industry, but I don’t have the voice to declare it.
What do you think?
Well, since I slept through April in its entirety, and missed April Fool’s Day, I’m dedicating May 1st as its make-up day. Yup – that just happened.
We all know about the Miserable Failure Google Bombs, but I started to think about other pranks. There had to be more, right? Yup.
Smack The Link Finding Tools
Update your website's terms and conditions to include a "service fee" for automated scraping without written consent. Then send Open Site Explorer, Majestic, etc. an invoice referencing the T&Cs. Just don't stand by your mailbox waiting for a check.
Share Analytics Code
Copy the Google Analytics code from the source of a website, and paste it onto one of your crap spam sites. Hilarity ensues as they start to notice traffic for Viagra terms. (OK, I don't know if this really works, but I've been told it does, and if so, that's just ridiculous on Google's part.)
Mess With Keyword Reports
Not unlike the last prank, start Googling terms that will force the target website to appear, appending funny movie quotes. When the website shows, click the listing, and laugh at the thought of the SEO looking at their keyword report and seeing "target.com Do You Like Movies About Gladiators?"
Have any of your own pranks to share?
Disclaimer – This post is for entertainment purposes only. If you actually do this stuff, you have way too much time on your hands, and probably need to find a relationship. Quick.
Today was interesting. I'm visiting beautiful Mexico on vacation, and while shopping at a local flea market, I was approached – practically all at once, after distinctly hearing one merchant yell out "Americans!" – by over 30 different people selling random products. They were aggressive in a "buy my shit or I'll make sure you don't leave here alive" kind of way, but some were pretty good at targeting my interests. My first feeling was that of being overwhelmed, but slowly I got my balance back.
One merchant said, “I have best price on Harley Davidson shirt. Almost free!” Interesting. I wasn’t wearing any Harley stuff, but I do ride a Harley. I am fond of them. Maybe it was my 2 week old scruff?
“We have 50 different kinds of tequila here,” said another. Again, brilliant relevance. I have a soft spot for hard liquor.
“Buy your lady a hat!” Shit, that guy tried to chump me right in front of my girlfriend, so of course that would make me bite.
But then the last one said, “Blow? Weed? Viagra?”
I’m not a drug guy, and don’t need… umm… Viagra (yet). So maybe since I looked like a drunk hippie biker, I looked like a good prospect? Either way, it was off target, and I got the hell out of that part of the market.
The digital world can be like a flea market. We've gotten better at yelling more relevant things at prospects thanks to remarketing cookies, analytics, and so on. But with all the growing noise, it's still really hard to tune into any one voice – especially if you're not in a buying mood. We know the majority of people in a buying mood are using search. So, the inbound marketers try to create more relevant landing pages, but even we can miss our target if we assume we're being heard correctly. In this case, a landing page about Harley Davidson products, tequila, and Mexican hats for my girlfriend would have probably held my attention perfectly, and pushed me towards a conversion. But one awry signal too many and a red flag goes up. We wind up punching out, going back to Google, and hitting the second listing… if not refining our search.
What Can You Do To Convert In Your Marketplace The First Time?
Only hit people with exactly what they want, and don’t hit too hard. It’s my belief that people really don’t want to search. They use Google because they have to, but if they have to search your site (or even your landing page) once they click off Google, you’re risking a bounced visit. Make the items worth highlighting big, bright, and bold. Assume you have 3 seconds to lock them in before they retreat back to the SERPs.
Make their lives easier. Just give them the offers or content, with as little fluff or obvious funneling as possible. For most of us who aren't major brand stores or news outlets, less is more. You have a better chance at being a "convenience store" than going up against an Amazon, and if your landing pages are uber-niche to boot, you'll be more successful.
Finding out exactly what to write about isn’t too difficult. I look at Ubersuggest, a great keyword research tool run off Google Suggest. Run some queries relevant to your topic idea, and jot down a few that seem like potentials. Do they inspire buckets? Or do they inspire a single paragraph, or maybe a whole stand-alone page?
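Once you've pulled a pile of suggest completions, the bucketing question can even be answered mechanically. Here's a rough sketch – the completions below are invented for illustration; in practice they'd come from Ubersuggest or a similar tool:

```python
# Group suggest-style completions into topic buckets by the first word
# that follows the seed phrase. Sample data is invented.
from collections import defaultdict

seed = "harley davidson"
suggestions = [
    "harley davidson hats for women",
    "harley davidson hats amazon",
    "harley davidson gifts for her",
    "harley davidson gifts under 50",
    "harley davidson boots",
]

buckets = defaultdict(list)
for phrase in suggestions:
    modifier = phrase.removeprefix(seed).strip().split()
    if modifier:
        buckets[modifier[0]].append(phrase)

for topic, phrases in buckets.items():
    print(topic, "->", len(phrases), "phrases")
```

A bucket with several phrases probably deserves its own niche page; a bucket of one might only warrant a paragraph.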
Also look at the new attribution feature in Google Analytics. Do you see any back to back searches that suggest what a user is really looking for? Did they search Girl’s Harley Hat, then Gifts For Women? If I saw that I’d sure be wishing I had a page titled “Harley Davidson Gifts For Women – Harley Hats” (or something like that). What I wouldn’t want to do is add men’s hats, or other motorcycle hats on this page (except maybe in the navigation). Too much noise and irrelevance doesn’t make your landing page convenient. I’d even try to find a way to write something interesting about women bikers who are passionate about the Harley brand and collect hats (yes, these women do exist). Do this, and you’d have just created a small, relevant, clean, clear, niche page that may not get a huge amount of traffic, but can get a few interested shoppers who intend to stick around. Better yet, you won’t have to chase them down and yell at them. Great success!
Oh, and some proof that this is a true story, here’s my girlfriend walking out with the hat I got suckered into buying:
If you haven’t visited inbound.org, try it. It’s a nice aggregator of digital marketing and design news. User submitted, and user promoted. Sound like Sphinn? Yup.
I bailed on Sphinn pretty early. Back then the SEO rockstar thing wasn’t as big as it is today, but it was still in play. I’ve been blogging since 2008, and never got anything “sphunn” up to a visible level, even though I had a few really good articles. I received thousands of visits off Google for a post I did about SEO friendly link shorteners (before SEL came and did the same article… bastards!) It was a highly searched topic, and I was first to market, but on Sphinn, I was shit. I pretty much determined it was because I wasn’t endorsed by a regular Sphinner.
This happened a few more times. I couldn’t break in to get any traffic. I couldn’t get any endorsements. Now I don’t have heaps of empirical data, but I have come to the conclusion that it became a popularity contest. That reminds me of High School, and I hated High School. More bastards.
Let’s not let this happen again. Here’s what you – the community – can do to prevent it.
1. Click the "Incoming" button. Don't just troll "What's Hot" – I promise you that plenty of awesome content lives in Incoming, and you'll find so many more posts relevant to your interests. Give them a vote. Unlike the SERPs, there is life on the second pages. In my opinion, the "Incoming" page should be the homepage. How's that for a twist? Give all the people the same power!
2. Don't submit low-quality content. If it's not something new, or a fresh perspective, pass on it. Even if it's written by your favorite repeat SMX speaker. Is it actionable? Is it something that's going to get people thinking? Is it something that will garner a lot of comments? A lot of the rock star SEOs post the same generic stuff over and over because they're fresh out of ideas. Being selective is a great way to build your real-life authority as a curator.
3. On the same tip… don’t vote something because the person who submitted it is a rock star. It doesn’t make you a rock star by default. It makes you a sheep.
4. Please don't try to game it. It's not a sophisticated system. History shows that all these "gamed" voting sites end up blowing up after they're manipulated too hard.
5. Please don’t spam it. I’m pretty sure I don’t need to say this, but my OCD wanted me to make this a Top 5 list. So there you go.
Update: 1/22/2013 – This post was written about the time Panda and Penguin were starting to make huge waves, before we really had our hands around their targets. Although this is an old tactic that probably doesn't have legs anymore, and could get you a date with a Google hand editor if you abuse it, it's still somewhat valid at least as a general marketing play. If you are a full time content marketer, you're probably still talking about comment marketing in your circles. Many marketers I know still claim huge value in comment marketing as a source of generating new relationships. So I still use this tactic – more so to find areas where a good conversation may exist (and where I can leave a link that can help my SEO). That's not to say I won't / don't comment on nofollow blogs. I go where the conversation is (with an eye to where my editorial comment could add some trust to Google's algorithm). Google would be asinine to remove comment value altogether. For those who play by the rules, that's about as editorial as it gets. Killing comments would be cutting off their nose to spite their face.
I submitted this tip for a chance to win an 8 minute presentation at the Search Church through SEOmoz. I didn’t win. Am I bitter? Hell yeah I’m bitter, but instead I’ll probably be sitting in the audience with a basket of tomatoes ready to peg anyone with a worse tip than this.
As a link builder doing white hat work, you know it’s about PR, the pitch, and the R&D (researching prospects and developing relationships). It’s time consuming, and takes a lot of organization. “Did I follow up with that prospect? Did I just email him twice? Damn!” Sometimes you just want an easy way to get a few links. There’s always blog link networks… wait, scratch that. Well, there’s also CommentLuv.
What is CommentLuv?
CommentLuv is a WordPress plugin. With 73 million WordPress blogs out there, there’s plenty of people who might be using this relatively popular plugin. To get the benefit of this tip, you need to be visiting a site with CommentLuv installed. You don’t need to have CommentLuv installed on your WordPress installation, but you do need a WordPress installation. Yup – it’s a WordPress thing. Edit: Apparently CommentLuv will pull in posts from other non-WordPress sites. I didn’t know that. More joy!
Check it out on YouTube.
A site that has CommentLuv installed looks something like this, typically after the standard WordPress comment box.
Now watch what happens when I enter a comment, and use my GreenlaneSEO site as the website (that’s right kids, I use WordPress. Now go ahead and hack me. I have nothing left to lose!)
Boom. CommentLuv reached out to my WordPress blog, saw the last post I had created, and created this link. I can’t wait to see what it looks like when the blogger approves this post. And since I’m a white hat SEO (for this post), I actually took the time to read this blog post and comment with something that added to the conversation. I used my real email address so I don’t look spammy to the blogger. I even used my real name because I’m, well, cocky.
Why are some of the CommentLuv links nofollowed, while others aren't? It's blogger preference – a setting in the plugin. If the blogger opted for the paid / pro version of CommentLuv, you'll get even more options available as the commenter, notably the choice of a few other recent blog posts to link. You'll see that from time to time. That's right – some bloggers are actually OK giving you PageRank in your comment links (hint – I'm one!!!).
A side note on submitting comments on WordPress. Some installations will clear the comment once you submit, and some will show your comment in a moderation state. I’ve even seen the CommentLuv link get nofollowed while in the moderation state. Don’t panic. Drink a beer and relax. You have to be patient and wait for the blogger to approve your comment. If you leave a stupid comment, prepare to get canned.
How Do You Find These NoFollowed CommentLuv Blogs?
Google, of course. Just ask the all-powerful, all-wise, all-knowing algorithm who seems to get everything right but how to make use of social signals. Enter this query into Google’s search box:
inurl:"2012"+intext:"CommentLuv badge"+"recently posted"+"keyword"
The quoted "2012" and "keyword" items are knobs, meaning you can change them. So what are we looking at here? The inurl operator tells Google to return pages with 2012 in the URL string. WordPress by default likes to put dates in your post URLs, and since you'd prefer more recent posts (so you know the blogger is still alive), you can enter 2012. Or, you can enter a keyword that you think will be in the permalink. For example, if I just wrote a post about Stratocaster guitars, I'd probably want my link to appear in a blog post about Stratocasters. More relevant link juice could get passed. It's possible I'd enter "stratocaster" where "2012" currently is, so I could bring up posts like http://www.jag-stang.com/faq/general/will-a-mustang-neck-fit-on-a-stratocaster/.
The next knob is the “keyword” knob. You want to dig up some posts with some similar keywords not just in the URL, but in the body as well. Enter that here. In the case where my inurl is “stratocaster,” my keyword might be “guitar” or “Fender” or “Eric Clapton.”
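If you're prospecting a lot of keyword combinations, filling the knobs is trivial to script. A quick sketch – the helper name is mine, not from any tool:

```python
def commentluv_query(url_keyword, body_keyword):
    """Assemble the CommentLuv footprint query from the two knobs.
    The quoted footprint strings are taken from the query above."""
    return (f'inurl:"{url_keyword}"'
            f'+intext:"CommentLuv badge"'
            f'+"recently posted"'
            f'+"{body_keyword}"')

# Example: prospecting for a Stratocaster post
print(commentluv_query("stratocaster", "Fender"))
# → inurl:"stratocaster"+intext:"CommentLuv badge"+"recently posted"+"Fender"
```

Loop it over a list of keyword pairs and you have a ready-made prospecting sheet.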
But we're not done. This query will turn up lots of potential (and relevant) CommentLuv pages. But are they fresh? Sure, Caffeine is supposed to make the results fresher, but not as fresh as humanly possible. Don't worry – we have a filter for that as well.
1. Install a nofollow checker into your browser.
2. Use the search query I provided – tweak as necessary to find relevant blogs.
3. Scan the page quickly to see if the CommentLuv links are followable. If not, go back to the SERPs and pick the next link.
4. Once you find a good page, read (scan) the post and leave a thoughtful contribution.
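Step 3 can even be scripted rather than eyeballed. Here's a minimal sketch using only Python's standard library – the sample HTML is invented stand-in markup, and real CommentLuv output will differ:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Tallies followed vs. nofollowed <a href> links in an HTML page."""
    def __init__(self):
        super().__init__()
        self.followed = 0
        self.nofollowed = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if "href" not in attrs:
            return  # anchors without href aren't links
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollowed += 1
        else:
            self.followed += 1

# Invented snippet standing in for a fetched comment section
sample = '''
<p><a href="http://example.com/a">editorial link</a>
<a rel="nofollow" href="http://example.com/b">comment link</a>
<a rel="nofollow" href="http://example.com/c">another comment</a></p>
'''
audit = LinkAudit()
audit.feed(sample)
print(audit.followed, audit.nofollowed)  # 1 followed, 2 nofollowed
```

If the followed count is zero in the comment area, move on to the next SERP listing.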
Will this get you the ban hammer? It shouldn’t if you play it right. If there’s one thing I know, it’s how to get banned from Google (hey, everyone needs a hobby). But going into this with the actual intent of adding editorial value is what Google’s vague Webmaster Guidelines want.
If you wanted to go a little gray / black on this, there are scripts that can rewrite your RSS feed (where this info is being pulled from) and change your titles so that the anchor text showing in the links is more of a keyword. But if an exact match anchor text is what you want in your link, I suppose you could also just name your post with that keyword. Done deal.
I actually find this to be a pretty fun tactic. Not only do I discover good content, but I get to engage in conversations that are relevant to my site and interests. I get to find content ideas, and I get to archive some new potential link prospects for guest posts. I’ve created a shared Google Doc with a few of my close SEO friends where we share a bunch of the good sites that we found using this technique.
Happy link building.
What a wild 2012 so far for Google. Through all my years in this game I’ve never seen them so aggressive in updating the organic search algorithm. If we’re entering a post-panda era, it appears only to be in name. Labels aside, it’s still ratifying one of the same fundamental pillars – clean the index of low relevance litter.
In March, Google stopped sweeping nearly everything into the supplemental index (as Panda would do). Now they're actually taking some of the trash bags they stored in the garage out to the curb. Instead of just the traditional filtering we've become accustomed to, tens of thousands of sites have been deindexed in March alone. It's not clear whether it's a penalty or just a new way of looking at quality.
What? You didn't hear about it? Well, it's not getting a lot of mainstream attention, possibly because we're getting numb to all the flux. But if you play with link building blog networks (like Build My Rank or Authority Link Network), you're probably in the know. These networks, which operate through thousands upon thousands of interrelated blogs, are now reporting a huge chunk of deindexed inventory – well over 50% of the sites. These are typically thin sites on flimsy domains with junky spun posts. But managing that was supposed to be a job for Panda – not a rabid grizzly bear.
Maybe the jail is just too full of criminals, and they’re starting to escape through leaks in each algorithmic update? I guess if that were the case, you need to eventually get the big guns out. In my opinion they’re a little trigger happy now.
I personally have a stable of “experimental” sites where three were kicked out last week. They weren’t the best sites, but they had original content and were true passion pieces. They were about things I actually cared about, and did have some expertise and strong points-of-view posted on them. They weren’t big money makers, and didn’t have much of an audience, but they did link (in some cases) to affiliate landing pages – in a tasteful and useful way. They weren’t over SEO’d, and they didn’t have too many ads above the fold. They had nothing to do with a link building blog network. If these sites got swept up in an aggressive algorithm, I’d say Google grossly miscalculated their value, or identified them incorrectly. To a few very niche visitors, I’m sure these sites were a decent pit stop on the world wide web. Hardly kickworthy.
With all the sites being cut I wonder how many innocent bystanders are getting caught in the net? Granted, that’s always been an issue with a search engine filter, but I’ve accepted them as the cost of working with Google. A lot of the “healthy” medicine that Google gives us winds up having side-effects. But a deindexing is a much more complex road to recovery. Even if I could turn these zombies back into people, how long would the scarlet letter be on them?
At the end of the day, Google’s spring cleaning might be a great thing, or a terrible thing for your site. If your competition is using these thin sites for links, or if these thin sites are blocking you out, you’ll likely see a bump in the SERPs. But if you’ve been working with some of these thinner sites (including sites like mine that really didn’t go against what is acceptable by Google), then it looks like a lot of your effort may have just been flushed.