Founded in 2005, we're a boutique SEO consulting group with deep experience and
industry recognition. We help companies reach their inbound marketing goals through
education and strategy development.
 
 
 

How Entities (and Knowledge Cards) Can Help With Query Intent


Entity optimization as a big SEO play isn’t quite upon us yet. It’s a slow-growing Google addition. I know – it frustrates me too. There’s so much potential here, and I believe it will greatly improve search results in the future. Google isn’t showing nearly all the fruits of what it knows through entities, whether in cards or in search results – at least not relative to the way pages rank on keywords alone.

But can knowledge cards help bring qualified traffic while considering searcher intent? SEOs always talk about searcher intent. Anyone who’s been doing SEO for a while knows that building for intent can be a challenge.


Take a query like “batman the dark knight”. Was the searcher looking for the 2008 movie? The graphic novel? The upcoming game? Were they looking to buy something, or just curious about a release date? What the hell were these people thinking? This is certainly very top of the funnel stuff, and would normally yield lower conversions, but it is where many Google non-power users would start.

Google knows these searchers expect them to be mind readers. They’re keenly aware of this. They may be working on mind-reading devices in their labs (at which point I will finally invest in the tin-foil hat – I’ve got a lot of junk swimming in my head that should stay hidden). But in the meantime, through their results they give us personalized search, or this cute little cluster of links, though I doubt many click on anything here:


But if you properly create an entity, you can get better “related results” in the knowledge graph:

Related Results

Pop into Freebase and look up either of these entities, and you’ll see the details above listed out. Coincidence? Probably not. The data could have come from there. We know the Google-owned Freebase is part of their brain now. But unfortunately, that huge database of great information (which, granted, needs to be checked against other sources) simply isn’t producing results yet. Whether that’s a limitation of the knowledge card product or a limitation in processing the data, I’m not sure – but I’m always hopeful Google steps it up soon.

Of course I recommend optimizing now and getting your entities in place for when Google pushes the pedal to the metal.

But for those who are working on campaigns where entities are being shown, you’re in luck. Google’s using your search history and their knowledge cards to personalize the results – sometimes in a more valuable way than the general results.

The Jaguar Example

If I were doing SEO for Jaguar, the well-known luxury car brand, I’d already have the benefit of Google knowing what my product is. They show some of it in their knowledge card on a simple “jaguar” search:

Jaguar Result

Obviously this isn’t all Google knows – just what they feel like showing at the present time. They’re getting this from Google+, Wikipedia and Freebase at a minimum.

Since Ralph Speth can’t go back in time and choose a new name for the company, they have to compete for search result real estate and millions of monthly searches for the term “Jaguar” against other pages that want to rank – like the Jacksonville Jaguars, the animal, the Atari Jaguar, comic book characters, and movie titles.

Now, if I were doing SEO for Defenders of Wildlife, and I wanted this top-of-the-funnel term to potentially bring me traffic and awareness, the default results (above) suck for me. It’s all cars, football teams, or pictures.

But Google does something cool…

Search history plays a role in results. Google uses keywords, and ideally entities, to see relationships between queries. Queries like “animal,” “panthera,” and “wild animal” are related to “jaguar.” Specifically, a query for “panthera,” followed by a new search for “jaguar,” gives a different result. The Jaguar car listings, ads, and knowledge card are suppressed in favor of an option to refine the search. This isn’t even slightly hidden. See the difference between the results below and the example above?

Related Searches

Clicking the link (pointed to by the red arrow) shows a new refined search where defenders.org has a listing (at the time of this writing). The query has been changed to “jaguar animal” but, through a new click-path, defenders.org has the opportunity to benefit from this “jaguar” head term. I believe this is at least partially entity driven. And, I believe this is a small example of how entities can be used in the future as Google’s products become more robust.

What do you think?  Am I seeing a connection where there isn’t one?



The New Google Doesn’t Like Old SEO

I read – and commented on – a great post called Panda 4.0 & Google’s Giant Red Pen by Trevin Shirley. Panda 4.0 just hit; the SEO space is hiding under its desk, with some reacting out of panic and others for show.

It’s definitely news, but at this point, I don’t see any reason to scream from the rooftops at Google. It’s what we should be expecting by now.

In 2011, the first Panda showed us Google is not afraid to drop atom bombs. Panda opened the door for Penguin, and many updates have come since. Matt Cutts said he wished Google had acted sooner, and in his shoes, I’d probably agree.

Let’s not forget how spammy the results used to be:

Google junk

I can imagine the conversation at the Googleplex between the webspam and search team:

“Man, how did you let this get so bad?”

“Me? I thought you were paying attention…”

“Look – we need to fix this. But the algorithm can only be tweaked so hard. I mean, it’s not Skynet yet.”

“But people think it is…”

“We’re going to lose our shirts if we don’t act quick. How about we take drastic measures.”

“But the SEO community will have a cow.”

“But hopefully the rest of the world won’t notice and just start loving, trusting, and using a cleaner Google!”

“Agreed. Hey Navneet Panda… do you have any ideas?”

It’s A New Google – We Need To Accept It, Rebuild

gojira

Maybe they should have named these things Godzilla instead of Panda or Penguin. The battles that ensued after the birds and the bears arrived were nasty. Some search results were leveled. I’m not being dramatic for the sake of a metaphor – I’m pretty sure we can all agree the results have never been the same. Some SEOs were/are slow to give up the fight. Some agencies still sell SEO that doesn’t work. Others, however, have realized the new rules – while different – still offer great opportunity.

Google declares their war on spammers a victory, noting black hat forums have slowed down. They’ve admitted to throwing some FUD into the mix like Kim Kardashian’s publicist might do, but for the greater good of their mission – to fix the results and uphold their “reputation.” All the hatemail and tweets to Matt Cutts aren’t going to change this. I’m pretty sure he’s holding steadfast. While Google won’t nod to the fact that some good got swept up in the bad, they obviously know it.

But honestly, I think it works for me. I think the changes, and casualties, were necessary. Were they supposed to wait until they were perfect? Plus I was getting tired of the lack of imagination… not that some of the dark arts weren’t brilliantly designed and executed. But in some sectors, SEO is very slow to change.

What I mean is, I was missing the marketing. In 2007 I was in a full-service agency’s marketing department doing SEO. Yet, SEO didn’t feel like marketing then. It was still firmly planted in web development. But in my situation, marketing and web development were siloed. Our departments weren’t friends (some internal politics at play). As asinine as that sounds now, I learned it wasn’t uncommon in big agencies back then. So, to make our SEO offering work, I had to tie “marketing” and “technical” together.

As evolution would have it, there’s no doubt that SEO is a marketing channel now… so I kind of lucked out by getting an early jump on it. The more I tied the two together, the more long-lasting the results were. Even today. It’s the only real Panda/Penguin proof strategy I’ve seen.

Ch-Ch-Ch-Ch-Changes

Like many rock bands, Google has changed their formula. I agree – relatively speaking, Google now works pretty well. Or at least they’re finally poised to substantially improve. And that’s from me – a guy who hates change. Update your website or UI and I throw a temper tantrum. But realistically, has anything ever stayed the same? Did David Bowie not continue to produce great music, albeit different? Did Empire Strikes Back not kick more ass after changing directors? Did Windows 8 not improve upon Windows 7?

Granted, it’s still Google’s property, and they can do with it as they please, so if they only want to represent a portion of the web, I suppose they have that right. Maybe in hindsight it was kind of ambitious to attempt to organize all the world’s webpages. Ah, the dreams of two bright-eyed Stanford students.

In his post, Trevin quoted something from Hacker News that I found very interesting: “We are getting a Google-shaped web rather than a web-shaped Google.” I sat with this for a few days. Ultimately I don’t think we’re getting a Google-shaped web or a web-shaped Google. I understand the concern, especially when Google is a massive part of discovering new content and a provider of big revenue. But the web is much larger than Google. The citizens that create on the web, outside of the SEO bubble, are very much their own people, inspired by anything and everything. Alternatively, a web-shaped Google – which I argue was their first attempt – was a bit unrealistic.

When I worked with a client who was an innocent casualty of an update, I used to get angry. I used to think Google was a bunch of jerks. Then, I got creative, and found ways to get the client back onto Google’s radar – usually with an even larger increase in traffic and brand recognition. Plus, I started relying on some of the other valuable internet marketing tools and channels. Talk about silver linings.

But honestly, no client I’ve ever had who got hurt by a Google update was a true victim. Google always told us they wanted to rank the best, most useful content for their users. I’ve worked with some clients who got the traffic, but only because Google didn’t realize they weren’t the best. I’ve seen sub-par, homogenized content ranking well, and thought, “meh – might as well ride it while Google is still dumb.”

Now looking back, if they got swept up in an update, it’s because they really weren’t doing more than the bare-bones basics – Google simply stepped up its game. These sites weren’t the originators of content, topics, and incredible ideas. They were just going through the motions.

Maybe it’s time to accept Google has graduated from grade-school.

Conclusion

In another post I wrote about lazy SEO. The more I think about it, old-school SEO is lazy SEO because it simply doesn’t move the needle enough to justify hitching your wagon to it. I truly think if you haven’t moved on by now, you’re only going to be playing catch-up over the next couple of years.

So what do you think? Am I right? Or have I misguided myself?



Failing Reinclusion Requests? How To Uncover Those “Harder To Find” Links.

Sometimes desperate times call for desperate measures. This post is about a desperate measure.

We had a client with a manual link penalty. We did some work (using my outline from this post). Rankings started going up and traffic/conversions started climbing. Then, a few days later, the next Google notification came in. It’s like playing digital Russian roulette with those things – you’ll either be thrilled or be in a lot of pain.

This time Google said they “changed” our penalty, as there were still some spammy links out there.

Remember, not all penalties have the same impact. Clearly ours was lessened (which was continually proven in the weeks to follow), but our client – rightfully so – wanted the whole penalty removed. The problem was we couldn’t find any more bad links. Everything from Ahrefs, OSE, Google Webmaster Tools, Bing Webmaster Tools, Majestic, etc., was classified and handled appropriately.


Google’s notifications sometimes show additional samples of poisonous links. This time we were shown only two forum spam links, something we had found zero instances of previously. Old-school, dirty forum spam is usually belched out in huge, automated waves. We asked the client, who asked their previous vendors, whether they had any knowledge of the link spamming. Nobody knew anything about it, so any chance of getting a list of these URLs (which was probably very low anyway) was now nil. But how did we miss all of it?

The problem was that this forum spam was so deep in the index that the major tools couldn’t find it. Even Google’s Webmaster Tools report didn’t reveal it. That’s right – Google’s notification was showing us that links existed, but wasn’t giving us insight into those links through Webmaster Tools. They never got any clicks, so we weren’t finding them in Google Analytics. Google’s vague link reporting and vague, boilerplate notifications weren’t helping us help them.

Matt Cutts Facepalm - Google

The only way to find these deep links was through Google’s own search engine. Unless you have a staff of hundreds and nothing but time to manually pull results and analyze them one by one, this didn’t seem possible. But we came up with a reasonably easy process using Cognitive SEO, Scrapebox, Screaming Frog, and good old Excel to emulate this activity with at least some success.

Note: I feel obligated to tell you that this is not going to be an exhaustive solution. I don’t think there is one. There are limitations to what Google will actually serve and what the tools listed in this post can actually do. To give you some good news, Google will likely release you from a penalty even if you didn’t clean up every single spammy link. All the clients I’ve gotten out of the doghouse still had some spam out there we weren’t able to find. To Google’s credit, at least they seem to understand that. Hopefully this process will help you enough to get the job done when your repeated reinclusion requests are denied (even after really, really trying).

Determining the footprints

We’re going to have to beat Google into giving us opportunity. The problem is, we’re going to get a serious amount of noise in the process.

We know the inanchor: operator can be helpful. It’s not as powerful as we’d like, but it’s the best we have. A search in Google like inanchor:”bill sebald” asks Google to return sites that link using “bill sebald” as anchor text. This will be very valuable… as long as we know the anchor text.

inanchor

Step 1. Get the anchor text

This can be done in a few ways. Sometimes your client can reveal the commercial anchors they were targeting, sometimes they can’t. All the major backlink data providers give you anchor text information. My favorite source is Cognitive SEO, because they give you a nice Word Cloud in their interface right below their Unnatural Link Detection module (see my previous post for more information on Cognitive).

word cloud

Collect the anchor text, paying special attention to any spammy keywords you may have. I would recommend you review as many keywords as possible. Jot them down in a spreadsheet and put them aside. Don’t be conservative here.

You also want to collect the non-commercial keywords: your brand name, variations of your brand name, your website URL variations, etc. Anything that would be used in a link to your website referencing your actual company or website.

Together you’ll get a mix of natural backlinks and possibly over-optimized backlinks for SEO purposes. We need to check them all, even though the heavily targeted anchors are probably the main culprit here.
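
If you end up with a long list of anchors, a tiny script can save you some copy-and-paste time when building the queries for the next step. This is just my own sketch, not part of the original workflow: it assumes a hypothetical anchors.csv export with an “anchor” column, and it produces one inanchor: and one intext: query per anchor.

import csv

def build_queries(csv_path):
    """Read anchor texts from a CSV and build inanchor:/intext: footprint queries."""
    queries = set()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = row["anchor"].strip()
            if not anchor:
                continue
            queries.add('inanchor:"%s"' % anchor)
            queries.add('intext:"%s"' % anchor)  # also catches unlinked brand mentions
    return sorted(queries)

if __name__ == "__main__":
    for query in build_queries("anchors.csv"):
        print(query)

Paste the output straight into Scrapebox as your keyword list.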

Get The Results

This is where Scrapebox comes in. I’m not going to give you a lesson (that’s been done quite well by Matthew Woodward and Jacob King). But if you’re not familiar, this powerful little tool will scrape the results right out of Google, and put them in a tabular format. You will want proxies or Google will throw captchas at you and screw up your progress. Set the depth to Scrapebox’s (and Google’s) max of 1,000, and start scraping.

Step 1: Enter in your queries

In the screenshot example below, I entered one. Depending on the results, and how many commercial anchor text keywords you’re looking for, you’ll want to add more. This might require a bunch of back and forth, and exporting of URLs, since there’s a limit to how much you can pull at once. I like small chunks. Grab a beer and put on some music. It helps ease the pain.

But don’t just do inanchor: queries. Get creative. Look for your brand names, mentions, anything that might be associated with a link.

Step 2: Choose all the search engines as your target

In most cases you’ll get a lot of dupes, but Scrapebox will de-dupe for you. In the errant case where Bing might have some links Google isn’t showing, it may come in handy. Remember – Google doesn’t show everything it knows about.

Step 3: Paste in your proxies

It seems Google is on high alert for advanced operators en masse. I recommend getting a ton of proxies to mask your activities a bit (I bought 100 from squidproxies.com, a company I’ve been happy with so far.  H/T to Ian Howells)

scrapebox graphic

Step 4: Export and aggregate your results

After a few reps, you’re going to get a ton of results. I average about 15,000. Scrapebox does some de-duping for you, but I always like to spend five minutes cleaning this list, filtering out major platforms like YouTube, Yahoo, Facebook, etc., and removing duplicates. Get the junk out here and have a cleaner list later.
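
If you’d rather script that five-minute cleanup, here’s a rough sketch of the same idea in Python. The filename and the skip-list are assumptions – swap in your own Scrapebox export and whatever platforms you don’t want to review.

from urllib.parse import urlparse

SKIP_DOMAINS = {"youtube.com", "yahoo.com", "facebook.com", "twitter.com"}

def clean_url_list(in_path="scrapebox_export.txt", out_path="cleaned_urls.txt"):
    """De-dupe the scraped URLs and drop hosts on the skip-list."""
    seen, kept = set(), []
    with open(in_path, encoding="utf-8") as f:
        for line in f:
            url = line.strip()
            if not url or url in seen:
                continue
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host in SKIP_DOMAINS:
                continue
            seen.add(url)
            kept.append(url)
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(kept))
    return len(kept)

if __name__ == "__main__":
    print(clean_url_list(), "URLs kept")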

Find The Links

Got a huge list of webpages that may or may not have a link to you? Wouldn’t it be great to find those links without checking each page one by one? There’s a way. Screaming Frog to the rescue.

Copy and paste your long list out of Excel and into a notepad file. Save as a .txt file. Then, head over to Screaming Frog.

Choose: Mode > List

Upload your recently created .txt file.

Screaming Frog 1

Then choose: Configuration > Custom

Enter in just the SLD and TLD of your website. See below:

Screaming Frog 2

Now when you click start, Screaming Frog will only crawl the exact URLs in your text file, and check the source code for any mention of yoursite.com (for example). In the “custom” tab, you can see all the pages where Screaming Frog found a match. Be careful – sometimes it will find mentions that aren’t actually hyperlinks, email addresses on your domain, or hotlinked images.

Boom. I bet you’ll have more links than you originally did, many of which are pulled from the supplemental hell of Google’s index. Many of these are in fact so deep that OSE, Ahrefs, Majestic, etc., don’t ever discover them (or they choose to suppress them). But, odds are, Google is counting them.
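
If you don’t have a Screaming Frog license handy, the same check can be approximated with a short script: fetch each URL from your cleaned list and flag any page whose source mentions your domain. This is only a sketch of the idea – it assumes the requests library, it won’t render Javascript, and like Screaming Frog it will also match email addresses and hotlinked images, so the hits still need a human look.

import requests

def find_linking_pages(url_file, target="yoursite.com"):
    """Return the URLs whose HTML source contains the target domain."""
    hits = []
    with open(url_file, encoding="utf-8") as f:
        for url in (line.strip() for line in f if line.strip()):
            try:
                html = requests.get(url, timeout=10).text
            except requests.RequestException:
                continue  # dead or unreachable page - skip it
            if target.lower() in html.lower():
                hits.append(url)
    return hits

if __name__ == "__main__":
    for page in find_linking_pages("cleaned_urls.txt"):
        print(page)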

The (Kinda) Fatal Flaw With This Procedure

Remember earlier when I said this wasn’t a perfect solution? Here’s the reason. Some of the pages that Google shows for a query are quite outdated, especially the deeper you go in the index. In many cases you could grab one of the URLs that did not have a link to your site (according to Screaming Frog), look at the Google cache, and find the link there. Did Screaming Frog fail? No. The link has vanished since Google last crawled the URL. Sometimes these deeply indexed pages don’t get crawled again for months. In a month the link could have been removed or paginated onto another URL (common in forum spam). Maybe the link was part of an RSS or Twitter feed that once showed in the source code but has since been bumped off.

The only way I know to overcome this takes a lot of processing – more than my 16GB laptop had. Remember the part where you upload the full list of URLs into Screaming Frog in list mode? Well, if you wanted to pull off the governors, you could actually crawl these URLs and their connected pages as well by going to Configuration > Spider > Limits and removing the “limit search depth” tick, which applies a crawl depth of ‘0’ automatically when switching to list mode. I was able to find a few more links this way, but it is indeed resource intensive.

Limit Search Depth

Has It Really Come To This?

This is an extreme approach for rare cases.

Yesterday a prospect called our company looking for a second opinion. Their site had a penalty from some SEO work done previously. The current SEO agency’s professional opinion was to burn the site. Kill it. Start over. My gut second opinion was that it should (and could) probably be saved. After all, there’s branding on that site. The URL is on their business cards. It’s their online identity and worth a serious attempt at rescue. In this case I think extra steps like the above might be in order (if it should come to that). But if it’s a churn-and-burn affiliate site, maybe it’s not worth the effort.

Post-Penguin, we find that removing the flagged links – combined with links simply becoming less and less valuable as the algorithm refines itself – does keep rankings from bouncing completely back to where they were before, in most (but not all) cases. That’s a hard pill for some smaller business owners to swallow, but I have never seen a full penalty removal – where every rank-affecting penalty was lifted – keep a site from succeeding again in time. Time being the key word.

So yeah, maybe it really has “come to this” – if your site is worth saving. At the very least you’ll be learning your way around some incredibly powerful tools like Scrapebox, Cognitive SEO, and Screaming Frog.

I’m excited to see if anyone has a more refined or advanced way to achieve the same effects!

 



Review of Linkody (Daily Link Discovery)

There must be thousands of SEO tools. While many tools are junk, a few great tools rise up each year and grab our attention. They’re often built for some very specialized needs. Of all the industries these brilliant developers could build in, they chose SEO. I’m always thankful and curious. As a fan of SEO tools, both free and paid, I’m excited to learn about new ones.

A few months ago I got an email from François of Linkody asking for some feedback. It does a nice job of link management and monthly ‘new link’ reporting. Pricing is very low, it’s completely web-based, and it’s very simple and clean. It pulls from the big backlink data providers, and even has a free client-facing option (exclusively using Ahrefs) at http://www.linkody.com/en/seo-tools/free-backlink-checker. Great for quick audits. I’ve used it quite a bit myself, and was happy to give a testimonial.

The link management function isn’t new to the SEO space. Many tools do it already, like Buzzstream and Raven – and they do it quite well. Additionally, link discovery is an existing feature of tools like Open Site Explorer, yet this is an area where I see opportunity for growth. I love the idea of these ‘new link’ reports, but honestly, haven’t found anything faster than monthly updates. I know it’s a tough request, but I mentioned this to François. By tracking “as-it-happens” links, you can jump into conversations in a timely manner, start making relationships, and maybe shape linking-page context. You might even be able to catch some garbage links you want to disassociate yourself from quicker.

The other day I received a very welcome response: “I wanted to inform you of that new feature I’ve just launched. Do you remember when you asked me if I had any plan to increase the (monthly) frequency of new links discovery? Well, I increased it to a daily frequency. Users can now link their Linkody account with their Google Analytics account and get daily email reports of their new links – if they get any, of course.”

Sold. That’s a clever way to report more links, and fill in gaps that OSE and Ahrefs miss.


Linkody

Upon discovering the new URL, you can choose to monitor it, tag it, or export.

The pros: Linkody picks up a bunch of links on a daily basis that some of the big link crawlers miss. You can opt for daily digest emails (think, Google Alerts). Plus it’s pretty cheap!

The cons: It needs Google Analytics. Plus, for the Google Analytics integration to track the link, the link has to actually be clicked by a user. However, for those who have moved to a “link building for SEO and referral traffic generation” model (like me), this might not be much of a con at all.

What’s on the roadmap?

As François told me, “next is displaying more data (anchor text, mozrank…) for the discovered link to help value them and see if they’re worth monitoring. And integrating social metrics.” Good stuff. I’d like to see more analytics packages rolled in, and more data sources. Maybe its own spider?

Conclusion

If you’re a link builder, in PR, or a brand manager, I definitely recommend giving Linkody a spin. It’s a great value. Keep your eye on this tool.

 

 

 



Optimize NOW For Entities and Relationships

I remember a few years ago blowing the mind of a boss with a theory that Google would eventually rank (in part) based on their own internal understanding of your object. If Wikipedia could know so much about an object, why couldn’t Google? In the end, I was basically describing semantic search and entities, something that had already lived as a concept on the fringe of the mainstream.

Sketching It Out

Sketching out relationships on a whiteboard

In the last year Google has shown us that they believe in the value of a semantic web and semantic search engines. With their 2010 purchase of Metaweb (the company behind Freebase), the introduction of the knowledge graph, the creation of schema, and the sudden delivery of a new algorithm called Hummingbird, Google is having one hell of a growth spurt. It’s not just rich snippets we’re talking about, or results that better answer Google Now questions.

We used to say Google had an elementary school education. They understood keywords and popularity. Now it can be argued Google has graduated, and is now enrolled in Silicon Valley Jr. High School. Comprehension has clearly improved. Concepts are being understood and logical associations are being made. A person/place/thing, and some details about them (as Google understands it), are starting to peek through in search results.

Yesterday was my birthday. Yesterday was also the day I became Google famous – which to an SEO geek is kind of awesome. I asked Google a couple questions (and some non-questions), and it showed me I’m an entity (incognito and logged in):

  • how old is bill sebald
  • what is bill sebald’s age
  • bill sebald age
  • birthday of bill sebald

This produced a knowledge result (like we’ve seen a couple times before). Details on how I got this are illustrated deeper in this post:

sebald-age

The comprehension level has its limits. Ask Google “when was bill sebald born” or “what age is bill sebald” or “when is bill sebald’s birthday,” and no such result appears. For some reason an apostrophe throws off Google – query “bill sebald’s age” instead of the version bulleted above, and there’s no knowledge result. Also, reverse the word order of “bill sebald age” to “age of bill sebald” and there’s no result.

Then, ask “bill sebald birthday” and you’ll get a different knowledge result apparently pulled from a Wikipedia page. This doppelganger sounds a lot more important than me.

 

result-2

We know Google has just begun here, but think about where this will be in a few years. At Greenlane, we’re starting entity work now. We’re teaching our clients about semantic search, and explaining why we think it’s got a great shot at being the future. Meh, maybe social signals and author rank didn’t go the way we expected (yet?), but here’s something that’s already proving out a small glimpse of “correlation equals causation.” It doesn’t cost much, it makes a lot of sense for Google’s future, and seems like a reasonable way to get around all the spam that has manipulated Google for a decade.

A new description of SEO services?

I’m not into creating a label. Semantic SEO isn’t a necessary term. You might have seen it in some recent presentations or blog post titles, but to me this is still old-fashioned SEO simply updating to Google’s growth. This is the polar opposite to the “SEO is dead” posts we laugh at. Someone’s probably trying to trademark the “semantic SEO” label right now, or at least differentiate themselves with it. To me, as an SEO and marketer, we always cared about the intent of a searcher – semantic search brings us closer to that. We always cared about educating Google about our values, services, and products. We always wanted to teach Google about meaning (at least for those who were doing LSI work and hoping it would pay off). If this architecture becomes commonplace, it becomes part of any regular old SEO’s job duties. Forget a label – it’s just SEO.

The SEO job description doesn’t change. Only our strategies, skills, and education. We do what we always do – mature right along with the algorithms. We will optimize entities and relationships.
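
To make that a little more concrete, here’s one small example of what “optimizing an entity” can look like in markup terms – my own sketch, not a prescription from Google or from this post. It builds schema.org Person data as JSON-LD; every name and URL below is a placeholder, and the point is the structure (especially sameAs, which ties the entity to profiles that help validate it).

import json

# All values below are placeholders - the structure is the point.
person = {
    "@context": "http://schema.org",
    "@type": "Person",
    "name": "Jane Example",
    "jobTitle": "SEO Consultant",
    "worksFor": {"@type": "Organization", "name": "Example Agency"},
    "sameAs": [  # profiles that help validate the entity
        "https://plus.google.com/+JaneExample",
        "http://www.freebase.com/m/0example",
    ],
}

# Render it as a JSON-LD block you could drop into a page's <head>.
print('<script type="application/ld+json">\n%s\n</script>' % json.dumps(person, indent=2))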

Where have we come from, and where are we going?

Semantic search isn’t a new concept.

I think the knowledge graph was one of the first clear indications of semantic search. Google is tipping its hand and showing some relationships it understands. Look at the cool information Google knows about Urban Outfitters. This suggests they also know, and can validate this information – like CEO info, NASDAQ info, etc. Google’s not quick to post up anything they can’t verify.

urban

Click through some of the links (like CEO Richard Hayne) and you’ll get more validated info.

hayne

These are relationships Google believes to be true. For semantic search to work, systems need to operate seamlessly across different information sources and media. More than just links and keywords, Google will have to care about citations, mentions, and general well-known information in all forms of display.

Freebase, as expected, uses a triple store. This is a great user-managed gathering of information and relationships. But like any human-powered database or index, bad information can get in – even with a passionate community policing the data. Thus, Google usually wants other sources. Wikipedia helps validate information. Google+ helps validate information.

The results I got for my age (from Google above) probably came from an entry I created for myself in Freebase. The age is likely validated by my Google+ profile, where I listed my birthdate. Who knows – maybe Google also made note of a citation on Krystian Szastok’s post about Twitter SEO’s Birthdays, where I’m listed too. I’m sure my birthday is elsewhere.

But what about my height? Google knows that too, and oddly enough, I’m fairly sure the only place on the web I posted that was in Freebase:

bill sebald height

But I also added information about my band, my fiance, my brother and sister – none of which I can seem to get a knowledge listing for. However, Google seems to have arbitrarily given one for my parents, who as far as I know are “off the grid.”

parents

Another knowledge result came in the form of what I do for a living. This one is easy to validate (in this case helped along by several relevant links I submitted through Freebase):

profession

This is just what Google wants to show, not all it knows

This is really the exciting part for me. When I first saw the knowledge graph in early 2013, it wasn’t just a “that’s cool – Google’s got a new display interface” type of thing. It was my hope that my original theory might be coming true.

In fact, in a popular Moz Whiteboard Friday from November 2012 called Prediction: Anchor Text is Weakening…And May Be Replaced by Co-Occurrence, I was hopeful again. There was a slight bit of controversy over how a certain page was able to rank for a keyword without the traditional signs of SEO (the original title mentioned co-citation, and Bill Slawski and Joshua Giardino brought some patents to light – see the post for those links). My first thought – and I can’t bring myself to rule it out – was that it might be none of the above; instead, this is Google ranking based on what it knows about the relationships of the topic. Maybe this was a pre-Hummingbird rollout sample? Maybe this is the future of semantic search? Certainly companies buy patents to hold them hostage from competitors. Maybe Google was really ranking based on internal AI and known relationships?

Am I a fanboy? You bet! I think the idea of semantic search is amazing. SEO is nothing if not fuzzy, but imagine what Google could do with this knowledge. Imagine what open graph and schema can do for feeding Google information and creating deeper relationships. Couldn’t an expert (a la authorship) feed trust in a certain product? Couldn’t structured data improve Google’s trust of a page? Couldn’t Google more easily figure out the intent of certain searches, and provide more relevant results based on your personalization and those relationships?

What if it got to the point where I could simply Google the term “jaguar”? Google could know I’m a guitarist, I like Fender guitars, and I’m a fan of Nirvana (hell – it’s a lot less invasive than the data Target already has on me). Google could serve me pages on the Fender Jaguar guitar, the same guitar Kurt Cobain played. Now think about how you could get your clients in front of search results based on their relationships to your prospective searchers’ needs. Yup – exciting stuff.

Google is just getting started

An entity is an entity. Do this for your clients as well. The entries in Freebase ask for a lot of information that could very well influence your content production for the next year. Make your content and relationships on the web match your entries. At Matt Cutts’ keynote at Pubcon, he mentioned how they’re just scratching the surface on authorship. But I think authorship is just scratching the surface on semantic search. I think the big picture won’t manifest for another few years – but, no time like the present to start optimizing for relationships. At Greenlane we’re pushing all our chips in on some huge changes this year, and trying to get our clients positioned ASAP.

On a side note, I have a pretty interesting test brewing with entities, so watch this spot.



Step-By-Step Google Disavow Process (Or, How To Disavow Fast, Efficiently, and Successfully)

For one reason or another, plenty of sites are in the doghouse. The dust has settled a bit. Google has gotten more specific about the penalties and warnings through their notifications, and much of the confusion is no longer… as confusing. We’re now in the aftermath – the grass is slowly growing again and the sky is starting to clear. A lot of companies that sold black hat link building work have vanished (and seem to have their phone off the hook). Some companies who sold black hat work are now even charging to remove the links they built for you (we know who you are!). But at the end of the day, if you were snared by Google for willingly – or maybe unknowingly – creating “unnatural links,” the only thing to do is get yourself out of the doghouse.

Occasionally we have clients that need help. While it’s not our bread and butter, I have figured out a pretty solid, quick, and accurate method for when I do need to pry a website out of the penalty box. It requires some paid tools, diligence, a bit of Excel, and patience, but can be done in a few hours.

The tools I use (in order of execution):

  1. Open Site Explorer
  2. Cognitive SEO
  3. Buzzstream

To get the most out of these tools, you do need to pay the subscription costs. They are all powerful tools. They are all worth the money. For those who are not SEOs, reading this post for some clarity, let me explain:

To be truly accurate about your “bad links,” you need as big a picture as possible of all the links coming to your site. Google Webmaster Tools will give you a bunch for free. But, in typical Google fashion, they never give you everything they know about in a report. Hell – even their Google Analytics data is interpolated. So, to fill in the gaps, there are three big vendors: Open Site Explorer by Moz, Majestic SEO, and Ahrefs.

Wait – so why aren’t Ahrefs and Majestic SEO on my numbered list above? Because Cognitive SEO uses them in their tool. Keep reading…

Note: Click any of the screenshots below to get a larger, more detailed image.

Step 1 – Gather The Data

1. Download the links from Google Webmaster Tools.

Click Search Traffic > Links To Your Site > More > Download More Sample Links.   Choose a CSV format.

Google Webmaster Tools Links - Step 1

Google Webmaster Tools Links - Step 2

Don’t mess with this template. Leave it as is. You’re going to want to upload this format later, so don’t add headers or columns.

2. Download all individual links from Open Site Explorer to a spreadsheet.

3. Copy only the links out of OSE, and paste under your Webmaster Tools export.

4. Remove any duplicate URLs.

At this point you should have a tidy list of each URL from Google Webmaster Tools and Open Site Explorer. Only one column of links. Next, we head over to Cognitive SEO.
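
If you’d rather not do the combining by hand, steps 2–4 can be scripted. This is a sketch under a couple of assumptions: the Webmaster Tools export is a single column of links (with its header row left in place), and the Open Site Explorer export has a “URL” column – adjust the filenames and column name to match your actual files.

import csv

def merge_link_exports(gwt_csv, ose_csv, out_csv="combined_links.csv"):
    """Stack the GWT and OSE links into one de-duped, single-column CSV."""
    urls = []
    with open(gwt_csv, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if row and row[0].startswith("http"):  # skips the header row
                urls.append(row[0].strip())
    with open(ose_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            urls.append(row["URL"].strip())
    unique = sorted(set(urls))
    with open(out_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        for url in unique:
            writer.writerow([url])  # one column, no headers, per Step 1
    return len(unique)

if __name__ == "__main__":
    print(merge_link_exports("gwt_links.csv", "ose_links.csv"), "unique URLs")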

Step 2 – Cognitive SEO Unnatural Link Detection

There are a number of SaaS tools out there to help you find and classify URLs, and create disavow lists. I’ve heard great things about Sha Menz’s rmoov tool. There’s also SEO Gadget’s link categorization tool (everything they build is solid in my book). I once tried Remove’em with OK results. Recently Cognitive SEO entered the space with their Unnatural Link Detection tool. With a little bit of input from you, it runs its own secret-sauce algorithm. I found the system to be quite accurate in most cases, classifying links into three buckets: OK, suspect, and unnatural. More info on the AI here. Also, if you read my blog regularly, you might remember my positive review of their Visual Link Explorer.

First you tell Cognitive what your brand keywords are. Second, you tell it what the commercial keywords are. Typically, when doing disavow work for a client, they know what keywords they targeted. They know they were doing link building against Google guidelines, and know exactly what keywords they were trying to rank for. If the client is shy and doesn’t want to own up to the keywords – or honestly has no idea – there’s a tag cloud behind the form to help you locate the targeted keywords. The bigger the word, the more it was used in anchor text; thus, it’s probably a word Google spanked them over.

A note about the links Cognitive provides: Razvan from Cognitive tells me the backlink data is aggregated mainly from Majestic SEO, Ahrefs, Blekko, and SEOkicks. That’s a lot of data alone!

Below I’ve used Greenlane as an example. Other than some directory submissions I did years ago, unnatural link building wasn’t an approach I took. But, looking at my keyword cloud, there are some commercial terms I want to enter just to see what Cognitive thinks. Note: the more you fill in here, the better the results. The system can best classify when at least 70% of the anchor text is classified as brand or commercial.

Cognitive SEO screenshot 1

Click submit, and Cognitive quickly produces what it thinks are natural and unnatural links.

Cognitive SEO Screenshot 2

Cognitive produces nice snapshot metrics. I can quickly see what links I need to review (if any). In my case, Cognitive marked the directory work I did as suspect. Since I don’t have a manual or algorithmic penalty, I’m not going to worry about work I did when I was a younger, dumber SEO.

But, for a client who has a high percentage of bad links, this is super helpful. Here’s an example of results from a current client:

Cognitive SEO Screenshot 3

“This site has a highly unnatural link profile and it’s likely to be already penalized by Google.” This happens to be an all-too-true statement.

Next, Cognitive added a layer of usability by extending with the Unnatural Links Navigator.

 Cognitive SEO screenshot 4

This tool basically creates a viewer that lets you quickly toggle through all your links and (with some defined hotkeys) tag a site as “disavow domain” or “disavow link”. You get to look at each site quickly and make a judgement call on whether you agree or disagree with Cognitive’s default classification. Nine times out of ten I agree with what Cognitive thinks. Once in a while I’ll see a URL labeled “OK” where it really isn’t, and I simply mark it to disavow.

What should you remove? Here’s a page with great examples from Google. Ultimately, though, this is your call. I recommend to clients that we do the more conservative disavow first, then move to a more liberal one if the first fails. Typically I remove links that look like they belong on that previously linked examples page. I also remove pages with spun content, forum board spam, xrumer and DFB stuff, obvious comment spam, resource page spam, and completely irrelevant links (like a viagra link on a page about law). PR spam, directories, and those sites that scrape and repost your content and server info have been around forever – currently I see no penalty from these kinds of links, but if my conservative disavow doesn’t do the job, then my second run will be more liberal and will contain them. Nine times out of ten my conservative disavow is accepted.

This part of the process might take a couple hours depending on how many links you need to go through, but it’s obviously much faster than loading each link manually, and a lot more thorough than not loading any links at all. I believe if you’re not checking each link yourself, you’re doing it wrong. So turn on some music or a great TV show, grab a beer, tilt your chair back, and start disavowing.

Once complete, you’ll have the option to export a disavow spreadsheet and a ready-made disavow .txt file for Google.

Here are the full steps to make the most out of Cognitive SEO.

  1. Create a campaign with your site or client’s site.
  2. Once in the inbound link analysis view, click IMPORT on the top right. Choose Google Webmaster Tools as the Import Format. Choose the Google Webmaster Tools / Open Site Explorer .csv file. Click Import.
  3. Once links are appended, click “start the automatic classification now” and follow the steps.
  4. Click “Launch The Unnatural Links Navigator”, and click the “link” column to sort alphabetically.
  5. Toggle on each link to disavow individually, or choose one link per domain and disavow the domain. This will make sense once you’re in the tool.

Step 3 – Submit Disavow To Google – or – Do Outreach To Remove Links

Google wants you to make an effort and reach out to the sites to try and get the link removed. Painful? You bet. But some SEOs swear it doesn’t need to be done (exclaiming that a simple disavow is enough).

To disavow, take the .txt file you exported from Cognitive, and add any notes you’d like for Google. Submit through your Google Webmaster Tools at https://www.google.com/webmasters/tools/disavow-links-main
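
If you’re assembling or tweaking the file yourself rather than using Cognitive’s export, the format Google expects is simple: “#” lines are comments, “domain:” lines disavow a whole domain, and bare URLs disavow individual links. Here’s a small sketch that writes one – the example domains and note are made up.

def write_disavow(domains, urls, note, path="disavow.txt"):
    """Write a disavow file: # comments, domain: lines, then individual URLs."""
    with open(path, "w", encoding="utf-8") as f:
        f.write("# %s\n" % note)
        for domain in sorted(set(domains)):
            f.write("domain:%s\n" % domain)
        for url in sorted(set(urls)):
            f.write("%s\n" % url)

if __name__ == "__main__":
    write_disavow(
        domains=["spammy-directory.example"],
        urls=["http://forum.example/thread?p=123"],
        note="Outreach attempted; no response. See documentation.",
    )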

But, if you want to attempt to get the links removed, Buzzstream can help you! Buzzstream is like a CRM and link manager tool for inbound marketers. Easily in my top 3 SEO tools. For prospecting, one (of several) things Buzzstream can do is scan a site and pull contact information. From an email that appears deep in the site, to a contact form, Buzzstream can often locate it.

By creating an account with Buzzstream, you can upload your spreadsheet of links into it, forcing Buzzstream to try to pull contact information. Choose “match my CSV” in the upload, and tell Buzzstream your column of links should be categorized as “linking from.”

Here’s a sample. Notice the email, phone, and social icons? This is a huge help in contacting these webmasters and asking for the link to be removed.

buzzstream screenshot

Conclusion

That’s all there is to it. For anyone who has done disavows in the past, and found it excruciating (as I used to), this will hopefully give you some tips to speed up the process. Of course, if you’re not in the mood to do any of this yourself, there are certainly SEO companies happy to do this work for you.

Any questions with these steps? Email me at bill@greenlaneseo.com or leave a comment below.

 

 

 



Old School SEO Tests In Action (A 2014 SEO Experiment)

Ever wonder how powerful some of the oldest SEO recommendations still are?  With the birds and the bears (and a little caffeine) changing so much in SEO since 2011, I wanted to see firsthand some of the results we can get from moves like internal linking and title tag optimization.  Using my own site as the proving ground, and moving quickly between tweaks and first results to try to exclude any other ancillary update or change, I decided to test some optimizations I still see recommended or used in the field.  The set of competing pages I chose below don’t move very often, so I thought this might be a good group to experiment with.

Note: It’s important to understand that this is not a controlled test at all.  Any single domain I’m competing against could be making some changes at the same time which would naturally skew my results.  Let’s take this with  a grain of salt and consider all of this directional. This is not advice, this is merely my experience and thoughts. If I get hammered on this in the comments, so help me…

The tests were run at various times between November 17, 2013 and January 12, 2014.  Just want the results?  Click for the result summaries: #1 and #2.

Truthfully I think the tl;dr can be summed up pretty well in a single statement:

stop lazy seo

[rant] See, the results of these tests turned out as I (and probably most of you) expected. Virtually no gains on the thinnest of tests. There were very few surprises below.  Yet these tactics still draw related recommendations all the time from lesser-quality blogs – or worse, sometimes agencies and consultants.

Last week I walked into a pitch where the prospect showed me some of the projects his current neighborhood SEO company is working on.  He candidly told me he didn’t know what the SEO company was doing for him (which is why he was entertaining new vendors).  With the draft of this post in my head, he started sharing some of the recommendations he was given – some of which coincidentally are listed below. Other recommendations included press release links and quickly churned video production.

Now I’m not one to “negative sell” against a competitor (i.e., downplay someone else’s service to promote my own), and I was extremely respectful to this vendor, but I left the meeting really frustrated for this business owner. It took everything I had to keep from blasting this vendor. The business owner is clearly the victim of lazy SEO.  He was a great guy trying to run a business and relied on the company to be his SEO hero. I respectfully gave him my different opinion on tactics and strategies without truly speaking my mind.  I’m still not sure I shouldn’t have been more truthful.

In case you’re wondering, none of the local services in my screenshots below is the vendor I’m reluctantly protecting. [/rant]

Updated 2-20-2014: Lia Barrad made a great point in the comments that I feel should be added here. Unfortunately I couldn’t persuade a client to allow me to display bigger data. As a result I was limited to doing the tests on our own site. The amount of traffic and testing options I had on this relatively small Greenlane site didn’t give me much opportunity to also show a lift/loss in traffic. I really wanted to share that as well, because I truly think qualified and converting traffic is way higher on the list of valuable SEO KPIs. Instead I was relegated to using garbage keywords like “Philadelphia SEO” that don’t bring much good traffic (I used to rank extremely well for the term and eventually abandoned it because it wasn’t worth the effort in my case).

Enjoy the test!

Exact Match Internal Anchor Text Optimization

Test 1

Situation: On November 17 2013, using Chrome Incognito, my site ranks #11 for a geo-targeted keyword (see graphic below – in this case I don’t want to muddy this test by adding the keyword anywhere on this website except in the testing page).


 

The strongest page on my site is my homepage (which is currently ranking for the keyword above).  It has a PA of 52.58, with 420 external links passing link equity from 51 external domains.

My second strongest page is my Outdated Content Finder tool.  It got mentions in Moz, Search Engine Land, and Search Engine Journal, and was picked up from mentions at Mozcon.   It has a page authority (PA) of 49.07, with 89 linking root domains, for a total of 100 external, equity-passing links.  There are already 40 outbound links from this page, with two being to external domains.

On my Outdated Content Finder page, there isn’t a reference to the homepage using any anchor text but “home” in the navigation.

Test: To see if I could pass better PageRank to my homepage, using an exact match anchor text, I implemented the following:

  1. Added a link onto my second most valuable website page (the OCF tool).
  2. Used Webmaster Tools to “Fetch As Google”, and submit to index (for faster crawling).

Expectation:  In many cases, the Fetch As Google URL submission works really fast (I’ve seen it add a new URL in less than 10 minutes), but I’m not really expecting a jump in rank.  I think because of the sitewide navigation, where there’s a home link already embedded, this second link may not have much power.

Result:  It took a few days, but there was a single-position gain on 11/18 (same as the new cache date).  The bump went from position 11 to position 10.  Nothing to hang my hat on normally, but for a page jump, I’m somewhat satisfied in this case.

 

Test 2

To push the rankings a little higher, let’s try a partial-sitewide, exact anchor link to the same homepage.

Test:  11/19 – My blog has a different sidebar than my non-blog pages.  With a widget in WordPress I can add a simple piece of copy with an exact anchor text link:

copy

This isn’t a true fully-sitewide link, and is all one level deeper into the site (http://www.greenlaneseo.com/blog/) but for this experiment I think it’s good enough.

Expectation:   I have a number of blogs with a wide variety of backlinks.  I still believe sitewide links have power (though limited), and expect to possibly see another position bump.

Result: On 11/24 (6 days after the change), the keyword actually dropped two spots to position 12 (page 2).  From what I can observe, no new sites have entered the set.

Test 2.1

Since that sitewide link didn’t work too well, I reversed it.  Actually, I updated it to push all the links into the Outdated Content Finder page.  Maybe if we consolidate into my second most powerful page, it might have a positive effect on the same target keyword.

Test:  11/24 – Updated the site-wide copy as follows:

updated-link

Expectation:  Truth is, I expected more from Test 2.  With Test 2.1, I’m even less optimistic there will be a positive change.  At the least, I’m expecting my target keyword to fall back to position #10.

Result: Apparently better than expected.  Now appearing in position 9 for my target keyword since 11/27.

Test 2.1 update

Results Summary

The domains in this set stayed relatively constant throughout this 10 day experiment.  Again, I make no claim to this being the results everyone should expect, since we must consider competition, possible backend algorithm changes, and (especially since these are all SEO companies) possible changes by the websites themselves.  But, my theories are as follows:

  • Direct internal linking with specific anchor text still has a little bit of value, especially if you direct it through your best pages.  A single ranking bump from second to first page may be larger if it were a result deeper in the rankings (my guess!).  So, very little gain, and definitely a small recommendation, but there are much bigger SEO fish to fry than this change.  If a client had to pay to have this change done, I’m not sure I would ever put it at the top of the list.
  • We’ve heard sitewide links may be scrutinized by Google, and it may be true after all, with any direct keyword impact dampened.  But, while keyword value may not pass directly, efficient PageRank still may. Some clever “PageRank sculpting” may still have minor value. Key word: minor.  Once again, recommendations for this kind of result won’t be moving higher on my list any time soon.

 

Title Tag Optimization

Test 1

Situation: Thousands of SEOs, websites, and audit tools suggest these two best practices for title tags:

  • Target keyword should be the first word
  • Title tag must be under 70 characters

Personally, I’ve rejected this for the last 8 years.  Here’s why – I believe Google is more sophisticated, and realizes the target keyword being first in the title isn’t always natural.  In Google’s younger days, sure – it’s a signal they could code to capture, but I think it’s too limiting to be a signal today.  It’s a usual SEO recommendation that surely Google knows about.  Second, if a title tag is too long, it gets truncated.  That’s not a great user experience, but I’ve never seen evidence of the truncated text not helping rank.  I’ve only seen the opposite.
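
If you want to sanity-check those two “best practices” on your own pages before rewriting anything, a quick script can pull each title tag and report its length and where the target keyword sits. This is just my own spot-check sketch – it assumes the requests library, and the crude regex is fine for this purpose but is not a real HTML parser.

import re
import requests

def audit_titles(urls, keyword):
    """Print each page's title length and the keyword's position within it."""
    for url in urls:
        html = requests.get(url, timeout=10).text
        match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
        title = match.group(1).strip() if match else ""
        position = title.lower().find(keyword.lower())  # -1 means not present
        print("%s | %d chars | keyword at index %d" % (url, len(title), position))

if __name__ == "__main__":
    audit_titles(["http://www.greenlaneseo.com/"], "philadelphia seo")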

Test: To test this, I updated my title tag for my blog homepage on 11/29/2013.  Target keyword is in the middle of the title tag.  I intentionally caused the tag to truncate.  This is a pretty terrible title, but suits the experiment:

bad title

On a side note, after creating this terrible title tag, I submitted it to Fetch as Google.  Within 60 seconds this title tag showed in an incognito search, despite an outdated “Nov 18, 2013” cache date. That’s remarkable.

On 11/30 through 12/02, I’m ranking position 271 for my keyword.  It seems pretty settled there. On 12/03, I have updated the title tag to this:

updated-title

Expectation:  I don’t think the ranking will move.  I don’t think keyword position matters.

Result: On 12/7 my current rank for the keyword was still 271.  On 1/4/2014, it flopped down to 284.  No positive change.

Test 1.1

I wholeheartedly believe the volatility of a change is different when a rank is in the hundreds vs. in the tens.  Let’s revise the same test on a keyword that is already ranking well.  For the keyword Philadelphia SEO, my homepage ranks #6.  The title tag is Greenlane SEO – Search, Analytics, and Strategy Services Since 2005. A Philadelphia SEO Company.

Test:  On 1/8 let’s see what happens if I change it to Philadelphia SEO Company – Greenlane SEO – Search, Analytics, and Strategy Services Since 2005.

Expectation:  I don’t think the ranking will move.  I don’t think keyword position matters in this case either.

Result: On 1/12 my current rank for the keyword was still 6.  No positive change (but I’m reverting immediately – that’s a terrible title tag just for a supposed SEO value).

philadelphia seo SERP

 

 Test 2

I don’t believe that a title needs to be under 70 characters for SEO value to take hold.  As mentioned earlier in this post, a truncated title is not great from a marketing perspective.  Surely there are better things a user can see than an ellipsis in the SERP link, but when trimming to 70 characters is recommended in order to rank better, I call “shenanigans”.

Test:  I’m not going to work too hard on this test because I’ve tested this before.  On 1/8/2014, on a blog post called Review of Repost.us, I rank #1 for “review of repost.”  The title tag is simply Review of Repost.us.  I am changing the title to past 70 characters:  Review of Repost.us – A Review By Bill Sebald – Is Repost.Us SEO Friendly?  Let’s Find Out!  Greenlane Search Marketing

Expectation: I’m expecting no drop in rank whatsoever.

Result:  On 1/13/2014, no drop with new ugly truncated title tag.

Results Summary

As expected, tweaking the title tags with these old-school recommendations didn’t do anything.  It’s not 2007 anymore.

  • Changing the location of the keyword, and extending past the 70 characters, did not seem to matter.

Conclusion

I do hope you enjoyed the tests.  As stated at the beginning of the post, this is not scientific. Take this as directional and do what you may with the information, but for those who still rely solely on these kinds of recommendations to provide your clients with SEO services, please reconsider and recommend things that have a bigger impact. If you’re a business person yourself, and you get recommendations like this, please don’t drink the Kool-Aid.



Review of Repost.us

I stumbled upon an interesting service I don’t think many SEOs know about – at least, not the few I’ve asked.  It’s called repost.us.  Looks like it’s about 2 years old.

Simple premise: Add your site to the database, and others can republish your content.  They say, “It’s the wire service reinvented for the web.”


Repost.us Homepage

A user of repost.us can login, search for content, and simply copy and paste the blue embed code (with a couple checkbox options) right into their website.  See below – one of my articles, straight from this blog, has been added to their database.  This is how a user sees it:

Repost.us Sample

Notice above, circled in red, there is an AdSense block as part of the copied code.  This isn’t my AdSense code; it appears to be added by the repost.us team, and it does wind up in the posted article.  This gives repost.us a chance to monetize the service. It also gives a publisher who runs AdSense a chance to swing their own publisher ID over as well. Interesting way to earn more AdSense clicks.

What About Duplicate Content?

Right.  The dreaded D word.  Here’s a site that took my content and reposted it:

Sample republished SEO post

Did you notice the attribution links (in red) at the bottom?  These particular links don’t show in the source code either (but others do – read on).

Now here’s the cool part.  Search the source code for this specific body of copy and you won’t find it.  Neither will Google.  JavaScript injects this copy through the embed code, which looks like this:

<div class="rpuEmbedCode">
<div class="rpuArticle rpuRepost-7af546614f6b5e93c9c6053b466c1a0f-top" style="margin:0;padding:0;">
<script src="https://1.rp-api.com/rjs/repost-article.js?3" type="text/javascript" data-cfasync="false"></script><a href="http://s.tt/1JrGh" class="rpuThumb" rel="norewrite"><img src="//img.1.rp-api.com/thumb/6754722" style="float:left;margin-right:10px;" /></a><a href="http://s.tt/1JrGh" class="rpuTitle" rel="norewrite"><strong>I&#8217;m Not Afraid Of A Google Update Against Guest Posting</strong></a> (via <a href="http://s.tt/1JrGh" class="rpuHost" rel="norewrite">http://www.greenlaneseo.com/</a>)<p class="rpuSnip">
Let's face it – the SEO industry has a tendency to stomp a tactic into the ground.  Some of us even get lazy (pleny of this kind of junk around). Directory submissions were once wildly valuable, then SEOs started creating directories by the thousands&hellip;
</p>
</div>
</div><!-- put the "tease", "jump" or "more" break here --><hr id="system-readmore" style="display: none;" /><!--more--><!--break--><hr class="at-page-break" style="display: none;"/><div class="rpuEmbedCode">
<div class="rpuArticle rpuRepostMain rpuRepost-7af546614f6b5e93c9c6053b466c1a0f-bottom" style="display:none;">&nbsp;</div>
<div style="display: none;"><!-- How to customize this embed: http://www.repost.us/article-preview/hash/4917fea1ea6f6df42de6a8f3d7cb3d4d --></div>
</div>

Natively the code will only show a snippet, like something you might see in a <noscript>. This is all Google will see (which I simulate by turning off JavaScript in my browser):

Sample 2

See the links in red above?  The “The Kind Of SEO I Want To Be (via http://www.greenlaneseo.com/)” links?  Those are the only two links that appear to link back to my original, canonical blog post.  They live in the source code behind the full injected content. Sadly, they both point to the same shortened URL (in this case http://s.tt/1MWo1), but they are at least 301 redirects.  If you believe 301s dampen PageRank more than straight links, despite statements from Matt Cutts, then this is probably disappointing.
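If you want to check for yourself what kind of redirect a shortener like s.tt returns, a quick request that doesn’t follow the redirect will show you the status code.  This is only a sketch, assuming the third-party requests library is installed:

import requests  # third-party library; assumed installed (pip install requests)

# Ask the shortener for its response without following it, so we can see
# whether it answers with a 301 (permanent) or a 302 (temporary) redirect.
resp = requests.get("http://s.tt/1MWo1", allow_redirects=False, timeout=10)
print(resp.status_code)               # expect 301 if it's a permanent redirect
print(resp.headers.get("Location"))   # the URL it points to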

In my experience, this small amount of duplicate content, with one or two links back to the original document (even through 301s), doesn’t seem to cause any duplicate content issues.  I’ve had my content posted on Business 2 Community in full with an attribution link, and Google still seems to figure it out.  My posts still wind up ranking first – even if it takes a few weeks.
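If you want to verify what a non-JavaScript crawler actually sees on a site that republished you – whether your full copy lands in the raw HTML or only the short snippet – a minimal check along these lines works.  The URL and the phrase below are hypothetical placeholders, and it again assumes the requests library:

import requests  # third-party library; assumed installed (pip install requests)

def copy_in_raw_html(page_url, phrase):
    # True if the phrase appears in the page's raw HTML,
    # roughly what a crawler sees before any JavaScript runs.
    html = requests.get(page_url, timeout=10).text
    return phrase.lower() in html.lower()

# Hypothetical example: a republishing site's URL and a sentence pulled
# from deep inside your article body (not from the visible snippet).
print(copy_in_raw_html(
    "http://example.com/republished-post/",
    "a distinctive sentence from the middle of your article",
))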

Seems SEO Friendly Enough… But How Were The Results?

I emailed the team at repost.us and asked for a user count and activity.  CEO John Pettitt kindly responded:

“We don’t give exact numbers but you can assume between 10K and 100K sites embed content in any given month. There are over 5000 sites contributing content. We have not quite 4 million articles in the system and we republish between 50 and 200K articles a month.

The average reposted article gets ~150 views per post; that goes up a lot for new content, where it runs ~2000, and we regularly see content getting 20-50K views for an article if a bigger site picks it up. The usage is very quality sensitive – if it’s content farm quality “seo bait” it probably won’t do well. If it’s original, well written content it will do better.”

Pretty awesome numbers!  Unfortunately, I didn’t fare so well.

My Repost.us data

After running for at least 3 months, with only 6 domains republishing my articles (one apparently being repost.us itself), I received a total of 40 impressions (disregard the chart above, which suggests 21 – that’s just for the few posts shown in the summary).  Still, that’s 6 links I got without really doing anything but writing for my own blog.

Also, out of all the posts on my blog, there were only 6 different posts shared through the 6 different sites (I have blog posts dating back to 2007).  I did see one year-old post, but for the most part, all the content that got republished was newer content.  I don’t know if that’s because their system chose to suppress old posts, or just a coincidence.

Finally, after spot-checking the 6 sites that hosted at least one article, all but the repost.us domain were extremely poor – DA of less than 15 with virtually no external links, according to Moz.  Now I’m much less excited about the handful of links I received.

Conclusion

So it wasn’t a success for me, but in light of the numbers John (from repost.us) shared, I could very well be unlucky or simply not in line with what the user base is looking for.  I write for the SEO industry.  The users of this service may very well not have any interest in SEO.  Or, maybe I’m just not writing interesting stuff (but I refuse to believe that!).

But I do believe in the power of reposting content.  I’m not so afraid of duplicate content that I’d pass up getting more eyeballs on a piece of my content strategy.  At the end of the day, republishing for eyeballs – even in traditional print media – has always been a marketing goal.  Again, I believe Google eventually sorts out most light duplicate content, and repost.us has taken precautions to avoid adding noise that could mislead the algorithm about which URL is canonical.  We actually just started using repost.us for some of our clients as well, taking note of the different categories the service supports.

My only concern with the service is that, based on an unfair sample of 6, there may be a lot of spammers republishing and chasing an article-marketing type of model (i.e., post everything, monetize with ads).  Could the spam links hurt?  Probably not, but I would definitely keep my eyes open as an SEO.

My one-sentence bottom-line review: absolutely worth a try.  It could yield some great SEO and marketing results, especially when/if the service grows.



Analogies And Metaphors To Help Explain SEO To Your Boss

Happy Thursday everyone.  A quick SEO post to bring some levity, tips, and pop culture into your day.

Yesterday I had a client ask for some campaign items to present to the CEO. He is concerned about year over year natural search gains.

As an SEO I bet your chest just tightened up reading that.  We’ve all been there.  Our lot in life will put us there again.  When the spotlight is put on natural search performance, it’s almost always put on you as a performer (at least semi-consciously).  Maybe you feel threatened or defensive.  The counter-arguments start squirting through your neocortex.

The problem is, you can’t usually get away with telling the c-suite, “you’re damn lucky I (we) were able to stop any bleeding and keep you climbing the mountain Google’s model is destined to swat you from.”  As SEOs we know it’s the Pareto Principle.  We know consistent top positions are vital for query revision.  We know Google wants to keep the index fresh and, relatively speaking, very few brands seem to be sacred cows.  I bet you’d love to say, “imagine where you’d be if I wasn’t here!”  Unless you have no fear of losing your job, you probably can’t get away with that.

We need to help the c-suite.  They’re never the enemy – they’re your best allies, and you are their partners.  The smart ones listen and appropriately challenge you, while others may be a bit slower.  They’re all regular folks with their own strengths and weaknesses.  I’ve had my share of executives who just couldn’t get it – whether because they weren’t capable of seeing the big picture, or didn’t care because of the demands that were on them.  Before my agency life, I had one boss who set unrealistic goals and put no resources behind his team. He convinced himself that since he was once on top, he should naturally stay on top.  I quit.  He lost his natural search lead, and last I heard, his company is toast (your first analogy).

If we don’t step in, the folks we consult for could take uncorrected preconceived concepts to their next jobs and cause more complications.  We can be heroes, but it takes work.

What’s the ROI of SEO?  Staying competitive!

Words that work

I’ve found analogies and metaphors work well in explaining the obtuse nature of SEO.  Here are two that have worked for me.  And since I’m currently celebrating ’80s Thursday (my own personal holiday), I’ve got incredible movie posters to boot.

affiche-the-ice-pirates-1984-1

SEO and Pirates

“As an SEO, I can help identify the opportunity and draw a loose perimeter on the map.  But I can’t necessarily tell you exactly where in the perimeter the booty is; nor can I guarantee how deep it is.  But, as luck would have it, I can also help you dig.  The time is all dependent on how many shovels we have in the dirt, and how hard we dig.  The timing may be up to luck, but we will find the gold to offset all the effort.”

I told this to a group of interactive marketers while running the SEO department in a 200+ person agency.  Clients were asking account managers for SEO help, and they were hesitant to bring us in if we couldn’t guarantee an ROI. We found ourselves pitching to our own peers. After the meeting, one person in particular scheduled a meeting with me.  “It clicked,” he said.  This person was an analytics and data wiz, and became a huge ally – and friend – during my years in the agency.  He went on to run some major accounts (think national sport leagues), and did SEO a huge service by not only explaining it correctly, but in selling the real value through to the clients.  He pre-qualified a couple opportunities for our group as well.

SEO and Racing

“To win a race, not only does the car need to consistently be upgraded (aka optimized), but many factors need to be analyzed routinely like track builds, track conditions, talent of driver and pit crew, talent of competitors.

cannonball-run-movie-poster-01

So let’s imagine you are a team owner. You implement an expensive, cutting-edge exhaust system on your best car. You notice in your trials that the car clocked better, but you still didn’t win that week’s race.  Next week you install a new suspension, but again lose the race.  Worse, your competition still beat you soundly without the two optimizations you have. Some of your team starts to get frustrated and confused. Theories and opinions are flying.  Chaos level rising!

But you do the right thing. You keep buying, trying, testing, and removing optimizations. You watch your competitors and study their moves for inspiration, but you don’t worry.  You stay on target.  Suddenly, towards the middle of the season something happens. You start placing in the top 5. The points and rewards (money) you’re receiving are slowly starting to add up.  Chaos level lowering!

Eventually you start winning. Your wins offset all your losses with a healthy margin of revenue leftover to enjoy.  But it’s important you think about next season, and your next level of racing.  New technology will arise.  New track conditions, new team members for both you and your competitors, and a hundred other factors will need your monitoring.  Don’t sit still just because you’re winning – if you don’t stick with it, you’re going to fall behind again.  You can’t afford to do that after all your investments.”

I’ve used the racecar parallel a zillion times.  I’ve used it mainly in pitch decks so I can make sure from the outset I’m explaining what the client will be in for with an engagement.  Want to compete?  Come with me.  Not into the risk?  Try paid search.

Conclusion

I know this post is dangerously close to “What Dom DeLuise taught me about SEO” type posts, so you’ll just have to forgive me this one time.

Also notice I didn’t use Field of Dreams “If You Build It They Will Come.”  That’s a myth and a terrible movie.

What Analogies Do You Use?  Share In The Comments!

 

 



Sometimes SEO Is Only As Good As The Clients You Choose

We direct all our SEO prospects to our online material, which we candidly post on the website (go to our homepage and click the tour button for an example).  We don’t have fancy leave-behind decks, or spend hours sweating over pitches like some agencies I’ve worked with.  I’ve seen far less time (and cost) succeed with the right kind of tailored communication, especially in the SEO industry.  Our services are specific – SEO consulting with a lower emphasis on labor.  In our online tour, we share our history, our beliefs, our differentiators, and our price.  We have found that this helps qualify the next conversation.  Some prospects read it and never return, presumably looking for another type of SEO service, while others only feel more confident about partnering with us.  From that second conversation, our conversion rate is very high.

We keep it simple, and respectful of everyone’s time.  We’re all busy in business.

But our system isn’t flawless.  We had a client last for only two months.  We both agreed to part ways.  It sounds funny in hindsight – how could two months determine a relationship that couldn’t be saved?  From the start everything was cordial.  We asked them to review our tour, and assumed they had.  We had a 20-minute conversation following their internal review. We won the business without asking the right questions.

In our postmortem we realized we assumed too much.  We assumed they read – and understood – our services as well as we did.  20 minutes isn’t anywhere close to the amount of time we should have spent qualifying them.  We were a little too foolhardy with our gut. From the first deliverables, where we had some great ideas to really break the website out of its template, everything was rejected.  We dove into their competitors to see what they were doing, and suggested rivaling big ideas.  We were shot down again with concerns of time and little faith. We believed in our ideas, and fought for them.  “They’re working very well for other clients, and here are examples of them in the wild,” I shared.  We were feeling pushed into old-school SEO services, something we could do, but just don’t believe in.

As the dust started to settle, it turned out when they said they wanted quick results, they meant very, very quick – what we considered unrealistic.  But for a hot minute, I bent.  I instructed our team to pivot and try to deliver – a poor decision, and something very out of character.  Not poor because I don’t put clients first, but poor because we weren’t in any position to meet that goal with this particular website.  They had a long road ahead.  Luckily, a candid discussion with the company’s CEO soon followed, and it was clear we were not on the same page.  My initial emotion was, “what did you guys hire us for?”  But later a clearer head asked, “why didn’t we qualify them better?”  We wouldn’t have believed what they wanted was realistic.

This was a valuable wake up call to help us (re)focus on the path we spent so many months creating with the launch of our business.

They were a great company with good people and cool products – we were just on completely different sides of the fence.  They knew enough SEO to have their spot, and we were trying to pull them to our side of the yard; all along not seeing the giant brick wall that divided us.  Could it have been saved?  Yes.  But I don’t think it was worth it for either party. They’re better off with a company more in sync, as are we.  Both our businesses got a pretty good education outside of SEO in my opinion.

Make Sure Both Parties See The Brick Wall From The Same Vantage Point

You should be standing next to your client.  Not across from them.  You should be able to have open conversations.  You certainly should have the grounds to disagree.  If you only want to make money, being a yes-man will only get you so far.

 Client:  Can you get me to rank #1 for grilled cheese?
 SEO:  Yes
 Client:  Can you guarantee me an 800% ROI?
 SEO:  Yes
 Client: Where do I sign?

Six months later, when you’re making no money off the term grilled cheese, “yes” doesn’t have any power. Now you have contention and burnout, and you’re praying your client services team has another client waiting in the wings for when this current client goes supernova.

Sure, you made your money, but unless you own the company, don’t care about your reputation, and don’t have to face the clients after you sign them, you’re setting yourself up for a world of hurt.

Here’s how I might answer those questions:

 Client:  Can you get me to rank #1 for grilled cheese?
 Bill:  Probably not without a major commitment from your team, a larger budget than you have, and the ability to make changes quickly.
 Client:  Can you guarantee me an 800% ROI?
 Bill:  No, but I can make it my goal to influence Google to see you as the authority on Grilled Cheese and related cheesy sandwiches.
 Client: Why should I sign you?
 Bill:  Because I’m not afraid to tell you how it really works in SEO, and I can teach you a lot about the additional opportunities you have in natural search based on our experiences.

I recently had a conversation with a prospect who said (paraphrasing), “I spoke with [big name SEO] who said we’d use [semi-popular blog network] for my link building if I went with them. They said it was white-hat, but it sounds like a blog network to me.”

That really depressed me.  I was more than happy to inform this nice guy that the network in question was anything but white hat.  Is the sale of service so important that you would intentionally mislead your prospects?  Won’t that set you up for failure when you get hit with a penalty?  Is the hit-and-run model the best you can scale?  Is your own reputation in the SEO space not valuable?  If this SEO had said, “so, yeah, we’re totally black hat… you down?” that would be respectable.

Succeeding As A Partnership

Whaddaya Say, Bats?

I quit consulting and agency life for a few years because I didn’t like (what I thought was) #thegame.  But starting a company, and creating our own rules, built a new version of the game which I’m enjoying to death. Settling in with clients (which we call partners, as an homage to a lesson learned years ago) and really respecting the value each side brings to the table has been great.  Sometimes completely different business models and philosophies can work great together.  Just like in love (and comic books), opposites can attract.  It’s a great feeling waking up knowing you’re doing good work.  When a partner decides to leave us, I truly hope they can say, “you taught us some great stuff – I’m going to recommend you anywhere I can. Thanks for sharing your experience.  We had a great adventure.”

That’s what a consultant does.

What do you want your clients’ parting statement to be?

Summary Tips

My TL;DR tips for creating the best SEO-client relationship, and setting yourself up to do the best work of your life:

  • To have good clients, you need to have your story straight.
  • To make your story known, you need to have an identity.  You need to have a deeper purpose than “get rich.”  It’s easy to forget this purpose when running “the business end” of the business each day.
  • You need to spend the time making sure the opportunity is for you.  If it smells funny, try to figure out why before you take it or throw it back.  If it seems great, look for a possible brick wall that might be eluding you.
  • Be specific in what you offer.  Don’t choke yourself on things you don’t know or don’t think you can accurately deliver.

I’d love your comments below!

 

