Sometimes desperate times call for desperate measures. This post is about a desperate measure.
We had a client with a manual link penalty. We did some work (using my outline from this post). Rankings started going up, and traffic and conversions started climbing. Then, a few days later, the next Google notification came in. It’s like playing digital Russian roulette with those things – you’ll either be thrilled or in a lot of pain.
This time Google said they “changed” our penalty, as there were still some spammy links out there.
Remember, not all penalties have the same impact. Clearly ours was lessened (which was continually proven in the weeks to follow), but our client – rightfully so – wanted the whole penalty removed. The problem was we couldn’t find any more bad links. Everything from Ahrefs, OSE, Google Webmaster Tools, Bing Webmaster Tools, Majestic, etc., was classified and handled appropriately.
Google’s notifications sometimes show some additional samples of poisonous links. This time we were shown only two links of forum spam, something we had found zero instances of previously. Old-school, dirty forum spam is usually belched out in huge, automated waves. We asked the client, who asked their previous vendors, whether they had any knowledge of the link spamming. Nobody knew anything about it, so any chance of getting a list of these URLs (which was probably very low anyway) was now nil. But how did we miss all of it?
The problem was, this forum spam was so deep in the index that the major tools couldn’t find it. Even Google’s Webmaster Tools report didn’t reveal it. That’s right – Google’s notification confirmed these links existed, yet Webmaster Tools gave us no insight into them. They never got any clicks, so we weren’t finding them in Google Analytics. Google’s vague link reporting and boilerplate notifications weren’t helping us help our client.
The only way to find these deep links was through the use of Google’s search engine. Unless you have a staff of hundreds and nothing but time to manually pull results and analyze them one by one, this didn’t seem possible. But we came up with a reasonably easy process using Cognitive SEO, Scrapebox, Screaming Frog, and good old Excel to try to emulate this activity with at least some success.
Note: I feel obligated to tell you that this is not going to be an exhaustive solution. I don’t think there is one. There are limitations to what Google will actually serve and what the tools listed in this post can actually do. To give you some good news, Google will likely release you from a penalty even if you didn’t clean up every single spammy link. All the clients I’ve gotten out of the doghouse still had some spam out there we weren’t able to find. To Google’s credit, they at least seem to understand that. Hopefully this process will help you get the job done when your repeated reinclusion requests are denied (even after really, really trying).
We’re going to have to beat Google into giving us opportunity. The problem is, we’re going to get a serious amount of noise in the process.
We know the inanchor: operator can be helpful. It’s not as powerful as we’d like, but it’s the best we have. A search in google like inanchor:”bill sebald” will ask Google to return sites that link using “bill sebald” as anchor text. This will be very valuable… as long as we know the anchor text.
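If you end up with a long list of anchors, it’s easy to script the query building rather than type each one. A minimal sketch – the anchor texts here are made up, so substitute the ones you collect in Step 1:

```python
# Build inanchor: queries from a list of anchor texts.
# These anchors are hypothetical; use the ones you collected.
anchors = ["bill sebald", "cheap blue widgets", "best widget store"]

queries = ['inanchor:"{}"'.format(a) for a in anchors]
for q in queries:
    print(q)
```

Paste the printed queries straight into Scrapebox as your keyword list.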
Step 1. Get the anchor text
This can be done in a few ways. Sometimes your client can reveal the commercial anchors they were targeting, sometimes they can’t. All the major backlink data providers give you anchor text information. My favorite source is Cognitive SEO, because they give you a nice Word Cloud in their interface right below their Unnatural Link Detection module (see my previous post for more information on Cognitive).
Collect the anchor text, paying special attention to any spammy keywords you may have. I would recommend you review as many keywords as possible. Jot them down in a spreadsheet and put them aside. Don’t be conservative here.
You also want to be collecting the non-commercial keywords. Like, your brand name, variations of your brand name, your website URL variations, etc. Anything that would be used in a link to your website referencing your actual company or website.
Together you’ll get a mix of natural backlinks and possibly over-optimized backlinks for SEO purposes. We need to check them all, even though the heavily targeted anchors are probably the main culprit here.
This is where Scrapebox comes in. I’m not going to give you a lesson (that’s been done quite well by Matthew Woodward and Jacob King). But if you’re not familiar, this powerful little tool will scrape the results right out of Google, and put them in a tabular format. You will want proxies or Google will throw captchas at you and screw up your progress. Set the depth to Scrapebox’s (and Google’s) max of 1,000, and start scraping.
Step 1: Enter your queries
In the screenshot example below, I entered one. Depending on results, and how many commercial anchor text keywords you’re looking for, you’ll want to add more. This might require a bunch of back and forth, and exporting of URLs, since you’re limited in how much you can pull at once. I like small chunks. Grab a beer and put on some music. It helps ease the pain.
But don’t just do inanchor: queries. Get creative. Look for your brand names, mentions, anything that might be associated with a link.
Step 2: Choose all the search engines as your target
In most cases you’ll get a lot of dupes, but Scrapebox will de-dupe for you. In the errant case where Bing might have some links Google isn’t showing, it may come in handy. Remember – Google doesn’t show everything it knows about.
Step 3: Paste in your proxies
It seems Google is on high alert for advanced operators en masse. I recommend getting a ton of proxies to mask your activities a bit (I bought 100 from squidproxies.com, a company I’ve been happy with so far. H/T to Ian Howells)
Step 4: Export and aggregate your results
After a few reps, you’re going to get a ton of results. I average about 15,000. Scrapebox does some de-duping for you, but I always like to spend five minutes cleaning this list, filtering out major platforms like Youtube, Yahoo, Facebook, etc, and removing duplicates. Get the junk out here and have a cleaner list later.
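That five minutes of cleanup is easy to script if you prefer. A rough sketch, assuming your Scrapebox export is loaded into a list (the URLs and skip-list below are invented):

```python
from urllib.parse import urlparse

# Hypothetical scraped rows; in practice, read these from your Scrapebox export.
scraped = [
    "http://www.youtube.com/watch?v=abc123",
    "http://spam-forum.example/thread?p=2",
    "http://spam-forum.example/thread?p=2",
    "https://www.facebook.com/somepage",
    "http://another-forum.example/post/99",
]

# Major platforms that aren't worth reviewing for link cleanup.
SKIP_DOMAINS = {"youtube.com", "yahoo.com", "facebook.com"}

def keep(url):
    host = urlparse(url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return host not in SKIP_DOMAINS

# Filter the junk, then de-dupe.
cleaned = sorted({u for u in scraped if keep(u)})
print(len(cleaned))  # 2 unique, reviewable URLs
```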
Got a huge list of webpages that may or may not have a link to you? Wouldn’t it be great if there were a way to find those links without checking each page one by one? There is. Screaming Frog to the rescue.
Copy and paste your long list out of Excel and into a notepad file. Save as a .txt file. Then, head over to Screaming Frog.
Choose: Mode > List
Upload your recently created .txt file.
Then choose: Configuration > Custom
Enter just the SLD and TLD of your website. See below:
Now when you click start, Screaming Frog will fetch only the exact URLs in your text file and check the source code for any mention of yoursite.com (for example). In the “custom” tab, you can see all the pages where Screaming Frog found a match. Be careful: sometimes it will find mentions that aren’t actually hyperlinked, your own email addresses, or hotlinked images.
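If you’d like to sanity-check that output (or script the same idea), the core check is simple: fetch each URL and look for your domain in the raw HTML. A sketch with a placeholder domain and URLs:

```python
# Emulate Screaming Frog's custom source search: does the page's raw HTML
# mention our domain at all? (Domain and URLs below are placeholders.)
import urllib.request

def has_mention(html, target="yoursite.com"):
    """True if the raw source mentions the target domain anywhere."""
    return target.lower() in html.lower()

def page_mentions(url, target="yoursite.com", timeout=10):
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return has_mention(resp.read().decode("utf-8", errors="replace"), target)
    except OSError:
        return False  # dead or unreachable page; note it and recheck later

sample = '<a href="http://www.yoursite.com/page">anchor text</a>'
print(has_mention(sample))  # True
```

Like Screaming Frog, this will also match unlinked mentions, email addresses, and image paths, so eyeball the matches before acting on them.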
Boom. I bet you’ll have more links than you originally did, many of which are pulled from the supplemental hell of Google’s index. Many of these are in fact so deep that OSE, Ahrefs, Majestic, etc., don’t ever discover them (or they choose to suppress them). But, odds are, Google is counting them.
Remember earlier when I said this wasn’t a perfect solution? Here’s the reason. Some of these pages that Google shows for a query are quite outdated, especially the deeper you go in the index. In many cases you could grab any one of the URLs that you found that did not have a link to your site (according to Screaming Frog), and look at the Google cache, then find the link. Did Screaming Frog fail? No. The link has vanished since Google last crawled the URL. Sometimes these deeply indexed pages don’t get crawled again for months. In a month the link could have been removed or been paginated to another URL (common in forum spam). Maybe the link was part of an RSS or Twitter feed that once showed in the source code but has since been bumped off.
The only way I know to overcome this takes a lot of processing – more than my 16GB laptop had. Remember the part where you upload the full list of URLs into Screaming Frog in list mode? Well, if you want to pull off the governors, you can actually crawl these URLs and their connected pages as well by going to Configuration > Spider > Limits and unticking the limit search depth option, which applies a crawl depth of 0 automatically when switching to list mode. I was able to find a few more links this way, but it is indeed resource intensive.
This is an extreme measure for rare cases.
Yesterday we had a prospect call our company who was looking for a second opinion. Their site had a penalty from some SEO work done previously. The current SEO agency’s professional opinion was to burn the site. Kill it. Start over. My gut-second opinion was that it should (and could) probably be saved. After all, there’s branding on that site. The URL is on their business cards. It’s their online identity and worth a serious attempt at rescue. In this case I think extra steps like the above might be in order (if it should come to that). But if it’s a churn-and-burn affiliate site, maybe it’s not worth the effort.
Post-Penguin, we find that removing the flagged links (alongside the parallel trend of links simply becoming less valuable as the algorithm refines itself) does keep rankings from bouncing completely back to where they were before – in most, but not all, cases. That’s a hard pill for some smaller business owners to swallow, but I have never seen a penalty removal – where all the rank-affecting penalties were lifted – leave a site unable to succeed in time. Time being the keyword.
So yeah, maybe it really has “come to this,” if your site is worth saving. At the very least you’ll be learning your way around some incredibly powerful tools like Scrapebox, Cognitive SEO, and Screaming Frog.
I’m excited to see if anyone has a more refined or advanced way to achieve the same effects!
There must be thousands of SEO tools. While many tools are junk, a few great tools rise up each year and grab our attention. They’re often built for some very specialized needs. Of all the industries these brilliant developers could build in, they chose SEO. I’m always thankful and curious. As a fan of SEO tools, both free and paid, I’m excited to learn about new ones.
A few months ago I got an email from François of Linkody asking for some feedback. It did a nice job of link management and monthly ‘new link’ reporting. Pricing was very low, it’s completely web-based, and is very simple and clean. It pulls from the big backlink data providers, and even has a free client-facing option (exclusively using Ahrefs) at http://www.linkody.com/en/seo-tools/free-backlink-checker. Great for quick audits. I’ve used it quite a bit myself, and was happy to give a testimonial.
The link management function isn’t new to the SEO space. Many tools do it already, like Buzzstream and Raven – and they do it quite well. Additionally, link discovery is an existing feature of tools like Open Site Explorer, yet this is an area where I see opportunity for growth. I love the idea of these ‘new link’ reports, but honestly, haven’t found anything faster than monthly updates. I know it’s a tough request, but I mentioned this to François. By tracking “as-it-happens” links, you can jump into conversations in a timely manner, start making relationships, and maybe shape linking-page context. You might even be able to catch some garbage links you want to disassociate yourself from quicker.
The other day I received a very welcomed response: “I wanted to inform you of that new feature I’ve just launched. Do you remember when you asked me if I had any plan to increase the (monthly) frequency of new links discovery? Well, I increased it to a daily frequency. Users can now link their Linkody account with their Google Analytics account and get daily email reports of their new links – if they get any, of course.”
Sold. That’s a clever way to report more links, and fill in gaps that OSE and Ahrefs miss.
Upon discovering the new URL, you can choose to monitor it, tag it, or export.
The pros: Linkody picks up a bunch of links on a daily basis that some of the big link crawlers miss. You can opt for daily digest emails (think, Google Alerts). Plus it’s pretty cheap!
The cons: It needs Google Analytics. Plus, for the Google Analytics integration to track the link, the link has to actually be clicked by a user. However, for those who have moved to a “link building for SEO and referral traffic generation” model (like me), this might not be much of a con at all.
As François told me, “next is displaying more data (anchor text, mozrank…) for the discovered link to help value them and see if they’re worth monitoring. And integrating social metrics.” Good stuff. I’d like to see more analytics packages rolled in, and more data sources? Maybe its own spider?
If you’re a link builder, in PR, or a brand manager, I definitely recommend giving Linkody a spin. It’s a great value. Keep your eye on this tool.
I remember a few years ago blowing the mind of a boss with a theory that Google would eventually rank (in part) based on their own internal understanding of your object. If Wikipedia could know so much about an object, why couldn’t Google? In the end, I was basically describing semantic search and entities, something that has already lived as a concept in the fringe of the mainstream.
In the last year Google has shown us that they believe in the value of a semantic web and semantic search. With their 2010 purchase of Metaweb (the company behind Freebase), the introduction of the knowledge graph, the creation of schema.org, and the sudden delivery of a new algorithm called Hummingbird, Google is having one hell of a growth spurt. It’s not just rich snippets we’re talking about, or results that better answer Google Now questions.
We used to say Google had an elementary school education. They understood keywords and popularity. Now it can be argued Google has graduated, and is now enrolled in Silicon Valley Jr. High School. Comprehension has clearly improved. Concepts are being understood and logical associations are being made. A person/place/thing, and some details about them (as Google understands it), are starting to peek through in search results.
Yesterday was my birthday. Yesterday was also the day I became Google famous – which to an SEO geek is kind of awesome. I asked Google a couple questions (and some non-questions), and it showed me I’m an entity (incognito and logged in):
This produced a knowledge result (like we’ve seen a couple times before). Details on how I got this are illustrated deeper in this post:
The comprehension level has its limits. Ask Google “when was bill sebald born” or “what age is bill sebald” or “when is bill sebald’s birthday,” and no such result appears. For some reason an apostrophe throws off Google – query “bill sebald’s age” vs. the version bulleted above, and there’s no knowledge result. Also, reverse the word order of “bill sebald age” to “age of bill sebald” and there’s no result.
Then, ask “bill sebald birthday” and you’ll get a different knowledge result apparently pulled from a Wikipedia page. This doppelganger sounds a lot more important than me.
We know Google has just begun here, but think about where this will be in a few years. At Greenlane, we’re starting entity work now. We’re teaching our clients about semantic search, and explaining why we think it’s got a great shot at being the future. Meh, maybe social signals and author rank didn’t go the way we expected (yet?), but here’s something that’s already proving out a small glimpse of “correlation equals causation.” It doesn’t cost much, it makes a lot of sense for Google’s future, and seems like a reasonable way to get around all the spam that has manipulated Google for a decade.
I’m not into creating a label. Semantic SEO isn’t a necessary term. You might have seen it in some recent presentations or blog post titles, but to me this is still old-fashioned SEO simply updating to Google’s growth. This is the polar opposite to the “SEO is dead” posts we laugh at. Someone’s probably trying to trademark the “semantic SEO” label right now, or at least differentiate themselves with it. To me, as an SEO and marketer, we always cared about the intent of a searcher – semantic search brings us closer to that. We always cared about educating Google about our values, services, and products. We always wanted to teach Google about meaning (at least for those who were doing LSI work and hoping it would pay off). If this architecture becomes commonplace, it becomes part of any regular old SEO’s job duties. Forget a label – it’s just SEO.
The SEO job description doesn’t change. Only our strategies, skills, and education. We do what we always do – mature right along with the algorithms. We will optimize entities and relationships.
Semantic search isn’t a new concept.
I think the knowledge graph was one of the first clear indications of semantic search. Google is tipping its hand and showing some relationships it understands. Look at the cool information Google knows about Urban Outfitters. This suggests they both know and can validate this information – CEO info, NASDAQ info, etc. Google’s not quick to post up anything they can’t verify.
Click through some of the links (like CEO Richard Hayne) and you’ll get more validated info.
These are relationships Google believes to be true. For semantic search to work, systems need to operate seamlessly across different information sources and media. More than just links and keywords, Google will have to care about citations, mentions, and general well-known information in all forms of display.
Freebase, as expected, uses a triple store. This is a great user-managed gathering of information and relationships. But like any human-powered database or index, bad information can get in – even with a passionate community policing the data. Thus, Google usually wants other sources. Wikipedia helps validate information. Google+ helps validate information.
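If triple stores are new to you, the idea is just subject–predicate–object facts. A toy sketch – the entities and assertions below are invented for illustration, not pulled from Freebase:

```python
# Each fact is a (subject, predicate, object) triple.
triples = [
    ("bill_sebald", "profession", "seo_consultant"),
    ("bill_sebald", "founder_of", "greenlane"),
    ("greenlane", "type", "marketing_agency"),
]

def objects(subject, predicate):
    """All objects asserted for a given subject/predicate pair."""
    return [o for s, p, o in triples if s == subject and p == predicate]

print(objects("bill_sebald", "founder_of"))  # ['greenlane']
```

An engine that trusts these triples can answer questions about an entity without a keyword match anywhere – which is the whole point of semantic search.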
The results I got for my age (from Google above) probably came from an entry I created for myself in Freebase. The age is likely validated by my Google+ profile where I listed my birthdate. Who knows – maybe Google also made note of a citation on Krystian Szastok’s post about Twitter SEO’s Birthdays where I’m listed there too. I’m sure my birthday is elsewhere.
But what about my height? Google knows that too, and oddly enough, I’m fairly sure the only place on the web I posted that was in Freebase:
But I also added information about my band, my fiance, my brother and sister – none of which I can seem to get a knowledge listing for. However, Google seems to have arbitrarily given one for my parents, who as far as I know are “off the grid.”
Another knowledge result came in the form of what I do for a living. This one is easy to validate (in this case only helped with several relevant links I submitted through Freebase):
This is really the exciting part for me. When I first saw the knowledge graph in early 2013, it wasn’t just a, “that’s cool – Google’s got a new display interface,” type of thing. This was my hope that my original theory may be coming true.
In fact, in a popular Moz Whiteboard Friday from November 2012 called Prediction: Anchor Text is Weakening…And May Be Replaced by Co-Occurrence, I was hopeful again. There was a slight bit of controversy on how a certain page was able to rank for a keyword without the traditional signs of SEO (in this case the original title mentioned co-citation, where Bill Slawski and Joshua Giardino brought some patents to light – see the post for those links). My first thought – and I can’t bring myself to rule it out – was that it might be none of the above; instead, this is Google ranking based on what it knows about the relationships of the topic. Maybe this was a pre-Hummingbird rollout sample? Maybe this is the future of semantic search? Certainly companies buy patents to hold them hostage from competitors. Maybe Google was really ranking based on internal AI and known relationships?
Am I a fanboy? You bet! I think the idea of semantic search is amazing. SEO is nothing if not fuzzy, but imagine what Google could do with this knowledge. Imagine what open graph and schema can do for feeding Google information on creating deeper relationships. Couldn’t an expert (ala authorship) feed trust in a certain product? Couldn’t structured data improve Google’s trust of a page? Couldn’t Google start to figure out easier the intent of certain searches, and provide more relevant results based on your personalization and those relationships?
What if it could get to the point where I could simply Google the term “jaguar.” Google could know I’m a guitarist, I like Fender guitars, and I’m a fan of Nirvana (hell – it’s a lot less invasive than the data Target already has on me). Google could serve me pages on the Fender Jaguar guitar, the same guitar Kurt Cobain played. Now think about how you could get your clients in front of search results based on their relationships to your prospective searchers’ needs. Yup – exciting stuff.
An entity is an entity. Do this for your clients as well. The entries in Freebase ask for a lot of information that could very well influence your content production for the next year. Make your content and relationships on the web match your entries. At Matt Cutts’ keynote at Pubcon, he mentioned how they’re just scratching the surface on authorship. But I think authorship is just scratching the surface on semantic search. I think the big picture won’t manifest for another few years – but, no time like the present to start optimizing for relationships. At Greenlane we’re pushing all our chips in on some huge changes this year, and trying to get our clients positioned ASAP.
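One concrete way to start feeding Google entity data today is schema.org markup. A hypothetical JSON-LD snippet for a person entity might look like this – the names and values are illustrative, and markup alone is no guaranteed recipe for a knowledge result:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Bill Sebald",
  "jobTitle": "SEO Consultant",
  "worksFor": {
    "@type": "Organization",
    "name": "Greenlane"
  }
}
</script>
```

Drop a block like this in the page head, and you’ve stated a machine-readable relationship (person, job, employer) rather than hoping Google infers it from prose.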
On a side note, I have a pretty interesting test brewing with entities, so watch this spot.
For one reason or another, plenty of sites are in the doghouse. The dust has settled a bit. Google has gotten more specific about the penalties and warnings through their notifications, and much of the confusion is no longer… as confusing. We’re now in the aftermath – the grass is slowly growing again and the sky is starting to clear. A lot of companies that sold black hat link building work have vanished (and seem to have their phone off the hook). Some companies who sold black hat work are now even charging to remove the links they built for you (we know who you are!). But at the end of the day, if you were snared by Google for willingly – or maybe unknowingly – creating “unnatural links,” the only thing to do is get yourself out of the doghouse.
Occasionally we have clients that need help. While it’s not our bread and butter, I have figured out a pretty solid, quick, and accurate method when I do need to pry a website out of the penalty box. It requires some paid tools, diligence, a bit of excel, and patience, but can be done in a few hours.
The tools I use (in order of execution):
To get the most out of these tools, you do need to pay the subscription costs. They are all powerful tools. They are all worth the money. For those who are not SEOs, reading this post for some clarity, let me explain:
To truly be accurate about your “bad links,” you need as complete a picture as possible of all the links coming to your site. Google Webmaster Tools will give you a bunch for free. But, in typical Google fashion, they never give you everything they know about in a report. Hell – even their Google Analytics is interpolated. So, to fill in the gaps, there are three big vendors: Open Site Explorer by Moz, Majestic SEO, and Ahrefs.
Wait – so why aren’t Ahrefs and Majestic SEO on my numbered list above? Because Cognitive SEO uses them in their tool. Keep reading…
Note: Click any of the screenshots below to get a larger, more detailed image.
1. Download the links from Google Webmaster Tools.
Click Search Traffic > Links To Your Site > More > Download More Sample Links. Choose a CSV format.
Don’t mess with this template. Leave it as is. You’re going to want to upload this format later, so don’t add headers or columns.
2. Download all individual links from Open Site Explorer to a spreadsheet.
3. Copy only the links out of OSE, and paste under your Webmaster Tools export.
At this point you should have a tidy list of each URL from Google Webmaster Tools and Open Site Explorer. Only one column of links. Next, we head over to Cognitive SEO.
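Combining the two exports is trivial to script if you’d rather not fuss in Excel. A sketch, with made-up rows standing in for the two CSV exports:

```python
def merge_link_lists(*lists):
    """Stack link exports into one de-duplicated column, preserving order."""
    return list(dict.fromkeys(url for lst in lists for url in lst))

# Stand-ins for the Webmaster Tools and Open Site Explorer exports.
gwt_links = ["http://a.example/page1", "http://b.example/page2"]
ose_links = ["http://b.example/page2", "http://c.example/page3"]

merged = merge_link_lists(gwt_links, ose_links)
print(merged)
```

The `dict.fromkeys` trick keeps the first occurrence of each URL, so your Webmaster Tools rows stay on top.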
There are a number of SaaS tools out there to help you find and classify URLs and create disavow lists. I’ve heard great things about Sha Menz’s rmoov tool. There’s also SEO Gadget’s link categorization tool (everything they build is solid in my book). I once tried Remove’em with OK results. Recently Cognitive SEO entered the space with their Unnatural Link Detection tool. With a little bit of input from you, it runs its own secret-sauce algorithm. I found the system to be quite accurate in most cases, classifying links into three buckets: OK, suspect, and unnatural. More info on the AI here. Also, if you read my blog regularly, you might remember my positive review of their Visual Link Explorer.
First you tell Cognitive what your brand keywords are. Second, you tell it what the commercial keywords are. Typically, when doing disavow work for a client, they know what keywords they targeted. They know they were doing link building against Google guidelines, and know exactly what keywords they were trying to rank for. If the client is shy and doesn’t want to own up to the keywords – or honestly has no idea – there’s a tag cloud behind the form to help you locate the targeted keywords. The bigger the word, the more it was used in anchor text, and thus the more likely it’s a word Google spanked them over.
A note about the links Cognitive provides: Ravzan from Cognitive tells me the back link data is aggregated from MajesticSEO, Ahrefs, Blekko and SEOkicks mainly. That’s a lot of data alone!
Below I’ve used Greenlane as an example. Other than some directory submissions I did years ago, unnatural link building wasn’t an approach I took. But, looking at my keyword cloud, there are some commercial terms that I want to enter just to see what Cognitive thinks. Note, the more you fill in here, the better the results. The system can best classify when at least 70% of anchor text is classified as brand or commercial.
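Cognitive’s actual algorithm is their secret sauce, but the brand/commercial input step can be pictured with a toy classifier like this (the keyword sets are invented for illustration, and this is not Cognitive SEO’s real logic):

```python
# Toy illustration of bucketing anchors by the brand/commercial keywords
# you supply to Cognitive -- not their real algorithm.
BRAND = {"greenlane", "bill sebald", "greenlanemarketing.com"}
COMMERCIAL = {"philadelphia seo", "seo consulting"}

def bucket(anchor):
    a = anchor.lower().strip()
    if a in BRAND:
        return "brand"
    if a in COMMERCIAL:
        return "commercial"
    return "other"  # unclassified; you want this bucket under ~30%

anchors = ["Greenlane", "Philadelphia SEO", "click here"]
print([bucket(a) for a in anchors])  # ['brand', 'commercial', 'other']
```

The more anchors you can pull out of the “other” bucket by filling in the form, the better the classification gets.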
Click submit, and Cognitive quickly produces what it thinks are natural and unnatural links.
Cognitive produces nice snapshot metrics. I can quickly see what links I need to review (if any). In my case, Cognitive marked the directory work I did as suspect. Since I don’t have a manual or algorithmic penalty, I’m not going to worry about this work I did when I was a younger, dumber SEO.
But, for a client who has a high percentage of bad links, this is super helpful. Here’s an example of results from a current client:
“This site has a highly unnatural link profile and it’s likely to be already penalized by Google.” This happens to be an all-too-true statement.
Next, Cognitive added a layer of usability by extending with the Unnatural Links Navigator.
This tool basically creates a viewer to quickly toggle through all your links, and quickly (with some defined hotkeys) tap a site as “disavow domain” or “disavow link”. You get to look at each site quickly and make a judgement call on whether you want to agree with Cognitive’s default classification, or disagree. 9 times out of 10 I agree with what Cognitive thinks. Once in a while I would see a URL labeled “OK” where it really wasn’t. I would simply mark it to disavow.
What should you remove? Here’s a page with great examples from Google. Ultimately, though, this is your call. I recommend to clients we do the more conservative disavow first, then move to a more liberal one if the first fails. Typically I remove links that don’t look like they belong on the page they appear on. I also remove pages with spun content, forum board spam, Xrumer and DFB stuff, obvious comment spam, resource page spam, and completely irrelevant links (like a viagra link on a page about law). PR spam, directories, and those sites that scrape and repost your content and server info have been around forever – currently I see no penalty from these kinds of links, but if my conservative disavow doesn’t do the job, my second run will be more liberal and contain these. Nine times out of ten my conservative disavow is accepted.
This part of the process might take a couple hours depending on how many links you need to go through, but it’s obviously much faster than loading each link manually, and a lot more thorough than not loading any links at all. I believe if you’re not checking each link out yourself, you’re doing it wrong. So turn on some music or a great TV show, grab a beer, tilt your chair back, and start disavowing.
Once complete, you’ll have the option to export a disavow spreadsheet and a ready-made disavow .txt file for Google.
Here are the full steps to make the most out of Cognitive SEO.
Google wants you to make an effort and reach out to the sites to try and get the link removed. Painful? You bet. But some SEOs swear it doesn’t need to be done (exclaiming that a simple disavow is enough).
To disavow, take the .txt file you exported from Cognitive, and add any notes you’d like for Google. Submit through your Google Webmaster Tools at https://www.google.com/webmasters/tools/disavow-links-main
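If you’re building or editing the file by hand instead, the format Google accepts is plain text: comment lines start with #, and each entry is either a full URL or a whole domain prefixed with domain:. A hypothetical example:

```
# Contacted these webmasters twice; no response
domain:spammy-directory.example
http://spam-forum.example/profile/12345
http://spam-forum.example/profile/67890
```

Disavowing at the domain level is usually the safer bet for junk sites, since it catches pagination and mirror URLs you never found.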
But, if you want to attempt to get the links removed, Buzzstream can help you! Buzzstream is like a CRM and link manager tool for inbound marketers. Easily in my top 3 SEO tools. For prospecting, one (of several) things Buzzstream can do is scan a site and pull contact information. From an email that appears deep in the site, to a contact form, Buzzstream can often locate it.
By creating an account with Buzzstream, you can upload your spreadsheet of links into it, forcing Buzzstream to try to pull contact information. Choose “match my CSV” in the upload, and tell Buzzstream your column of links should be categorized as “linking from.”
Here’s a sample. Notice the email, phone, and social icons? This is a huge help in contacting these webmasters and asking for the link to be removed.
That’s all there is to it. For anyone who has done disavows in the past, and found it excruciating (as I used to), this will hopefully give you some tips to speed up the process. Of course, if you’re not in the mood to do any of this yourself, there are certainly SEO companies happy to do this work for you.
Any questions with these steps? Email me at email@example.com or leave a comment below.
Ever wonder how powerful some of the oldest SEO recommendations still are? With the birds and the bears (and a little Caffeine) changing so much in SEO since 2011, I wanted to see firsthand what results we can get from moves like internal linking and title tag optimization. Using my own site as the proving ground, and moving quickly between tweaks and first results to try to exclude any other ancillary update or change, I decided to test some optimizations I still see recommended or used in the field. The set of competing pages I chose below doesn’t move very often, so I thought this might be a good group to experiment with.
Note: It’s important to understand that this is not a controlled test at all. Any single domain I’m competing against could be making some changes at the same time which would naturally skew my results. Let’s take this with a grain of salt and consider all of this directional. This is not advice, this is merely my experience and thoughts. If I get hammered on this in the comments, so help me…
Truthfully I think the tl;dr can be summed up pretty well in a single statement:
[rant] See, the results of these tests turned out as I (and probably most of you) expected. Virtually no gains on the thinnest of tests. There were very few surprises below. Yet these tactics still show up as recommendations all the time from lesser-quality blogs – or worse, sometimes from agencies and consultants.
Last week I walked into a pitch where the prospect showed me some of the projects his current neighborhood SEO company is working on. He candidly told me he didn't know what the SEO company was doing for him (which is why he was entertaining new vendors). With the draft of this post in my head, he started sharing some of the recommendations he was given – some of which coincidentally are listed below. Other recommendations included press release links and quickly churned video production.
Now I'm not one to "negative sell" over a competitor (i.e., downplay someone else's service to promote my own), and I was extremely respectful to this vendor, but I left the meeting really frustrated for this business owner. It took everything I had to keep from blasting this vendor. The business owner is clearly the victim of lazy SEO. He was a great guy trying to run a business and relied on the company to be his SEO hero. I respectfully gave him my different opinion on tactics and strategies without truly speaking my mind. I'm still not sure I shouldn't have been more truthful.
In case you’re wondering, none of the local services in my screenshots below is the vendor I’m reluctantly protecting. [/rant]
Updated 2-20-2014: Lia Barrad made a great point in the comments that I feel should be added here. Unfortunately I couldn't persuade a client to allow me to display bigger data, so I was limited to running the tests on our own site. The amount of traffic and testing options I had on this relatively small Greenlane site didn't give me much opportunity to also show a lift/loss in traffic. I really wanted to share that as well, because I truly think qualified, converting traffic is far higher on the list of valuable SEO KPIs. Instead I was relegated to using garbage keywords like "Philadelphia SEO" that don't bring much good traffic (I used to rank extremely well for the term and eventually abandoned it because it wasn't worth the effort in my case).
Enjoy the test!
Situation: On November 17, 2013, using Chrome Incognito, my site ranks #11 for a geo-targeted keyword (see graphic below – in this case I don't want to muddy this test by adding the keyword anywhere on this website except on the testing page).
The strongest page on my site is my homepage (which is currently ranking for the keyword above). It has a PA of 52.58, with 420 external links passing link equity from 51 external domains.
My second strongest page is my Outdated Content Finder tool. It got mentions in Moz, Search Engine Land, Search Engine Journal, and picked up from mentions at Mozcon. It has a page authority (PA) of 49.07, with 89 linking root domains, for a total of 100 external, equity passing links. There are already 40 outbound links from this page, with two being to external domains.
On my Outdated Content Finder page, there isn’t a reference to the homepage using any anchor text but “home” in the navigation.
Test: To see if I could pass better PageRank to my homepage, using an exact match anchor text, I implemented the following:
Expectation: In many cases, the Fetch As Google URL submission works really fast (I've seen it add a new URL in less than 10 minutes), but I'm not really expecting a jump in rank. I suspect that because the sitewide navigation already contains a home link, this second link may not carry much power.
Result: It took a few days, but there was a single-position gain on 11/18 (same as the new cache date). The bump went from position 11 to position 10. Nothing to hang my hat on normally, but for a page jump, I’m somewhat satisfied in this case.
To push the rankings a little higher, let’s try a partial-sitewide, exact anchor link to the same homepage.
Test: 11/19 – My blog has a different sidebar than my non-blog pages. With a widget in WordPress I can add a simple piece of copy with an exact anchor text link:
This isn't a true fully-sitewide link, and it sits one level deeper in the site (http://www.greenlaneseo.com/blog/), but for this experiment I think it's good enough.
Expectation: I have a number of blogs with a wide variety of backlinks. I still believe sitewide links have power (though limited), and expect to possibly see another position bump.
Result: On 11/24 (6 days after the change), the keyword actually dropped two spots to position 12 (page 2). From what I can observe, no new sites have entered the set.
Since that sitewide link didn't work too well, I reversed it. Actually, I updated it to push all the links into the Outdated Content Finder page. Maybe consolidating into my second most powerful page will have a positive effect on the same target keyword.
Test: 11/24 – Updated the site-wide copy as follows:
Expectation: Truth is, I expected more from Test 2. With Test 2.1, I’m even less optimistic there will be a positive change. At the least, I’m expecting my target keyword to fall back to position #10.
Result: Apparently better than expected. Now appearing in position 9 for my target keyword since 11/27.
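For the record, the movement across these three tests is easier to read back from a dated log. A throwaway Python sketch (the positions and dates come from the tests above; the helper name is my own):

```python
from datetime import date

# Rank observations for the target keyword during the internal-linking tests:
observations = [
    (date(2013, 11, 17), 11),  # baseline
    (date(2013, 11, 18), 10),  # after exact-anchor link from my second strongest page
    (date(2013, 11, 24), 12),  # after the partial-sitewide exact-anchor link
    (date(2013, 11, 27), 9),   # after pointing the sitewide links at the subpage
]

def rank_deltas(obs):
    """Position change between consecutive observations (negative = improvement)."""
    return [(later[0], later[1] - earlier[1])
            for earlier, later in zip(obs, obs[1:])]
```

Run over the log above, the deltas come out -1, +2, -3 – directional noise more than anything, which is the point of the post.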
The domains in this set stayed relatively constant throughout this 10 day experiment. Again, I make no claim to this being the results everyone should expect, since we must consider competition, possible backend algorithm changes, and (especially since these are all SEO companies) possible changes by the websites themselves. But, my theories are as follows:
Situation: Thousands of SEOs, websites, and audit tools suggest these two best practices for title tags:
Personally, I've rejected this for the last 8 years. Here's why. First, I believe Google is more sophisticated, and realizes the target keyword being first in the title isn't always natural. In Google's younger days, sure – it's a signal they could code to capture, but I think it's too limiting to be a signal today. It's a common SEO recommendation that surely Google knows about. Second, if a title tag is too long, it gets truncated. That's not a great user experience, but I've never seen evidence that the truncated text stops helping rank. I've only seen the opposite.
Test: To test this, I updated my title tag for my blog homepage on 11/29/2013. Target keyword is in the middle of the title tag. I intentionally caused the tag to truncate. This is a pretty terrible title, but suits the experiment:
On a side note, after creating this terrible title tag, I submitted to Fetch as Google. Within 60 seconds this title tag showed in an incognito search, despite an outdated "Nov 18, 2013" date. That's remarkable.
On 11/30 through 12/02, I’m ranking position 271 for my keyword. It seems pretty settled there. On 12/03, I have updated the title tag to this:
Expectation: I don’t think the ranking will move. I don’t think keyword position matters.
Result: On 12/7 my current rank for the keyword was still 271. On 1/4/2014, it flopped down to 284. No positive change.
I wholeheartedly believe the volatility of a change is different when a rank is in the hundreds vs. in the tens. Let's revise the same test on a keyword that is already ranking well. For the keyword Philadelphia SEO, my homepage ranks #6. The title tag is Greenlane SEO – Search, Analytics, and Strategy Services Since 2005. A Philadelphia SEO Company.
Test: On 1/8 let’s see what happens if I change it to Philadelphia SEO Company – Greenlane SEO – Search, Analytics, and Strategy Services Since 2005.
Expectation: I don’t think the ranking will move. I don’t think keyword position matters in this case either.
Result: On 1/12 my current rank for the keyword was still 6. No positive change (but I’m reverting immediately – that’s a terrible title tag just for a supposed SEO value).
I don't believe a title needs to be under 70 characters for SEO value to take hold. As mentioned earlier in this post, a truncated title is not great from a marketing perspective. Surely there are better things a user can see than an ellipsis in the SERP link, but when trimming to 70 characters is recommended in order to rank better, I call "shenanigans".
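If you still want to flag titles likely to truncate, a character count gets you a rough list. Keep in mind Google's cutoff is actually pixel-based and has shifted over the years, so the 70-character figure debated above is only an approximation (the helper name is my own):

```python
TITLE_CHAR_LIMIT = 70  # the rough rule of thumb discussed above, not a hard rule

def flag_long_titles(titles, limit=TITLE_CHAR_LIMIT):
    """Return (title, length) pairs for titles likely to truncate in the SERP."""
    return [(t, len(t)) for t in titles if len(t) > limit]
```

Useful for triage, not as a rule to rank by – which is exactly what the test below argues.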
Test: I'm not going to work too hard on this test because I've tested this before. On 1/8/2014, on a blog post called Review of Repost.us, I rank #1 for "review of repost." The title tag is simply Review of Repost.us. I am changing the title to one well past 70 characters: Review of Repost.us – A Review By Bill Sebald – Is Repost.Us SEO Friendly? Let's Find Out! Greenlane Search Marketing
Expectation: I’m expecting no drop in rank whatsoever.
Result: On 1/13/2014, no drop with new ugly truncated title tag.
As expected, tweaking the title tags with these old-school recommendations didn’t do anything. It’s not 2007 anymore.
I do hope you enjoyed the tests. As stated in the beginning of the post, this is not scientific. Take this as directional and do what you may with the information. But if you still rely solely on these kinds of recommendations to provide your clients with SEO services, please reconsider and recommend things that have a bigger impact. If you're a business person yourself, and you get recommendations like this, please don't drink the Kool-Aid.
I stumbled upon an interesting service I don’t think many SEOs know about – at least, not the few I’ve asked. It’s called repost.us. Looks like it’s about 2 years old.
Simple premise: Add your site to the database, and others can republish your content. They say, “It’s the wire service reinvented for the web.”
A user of repost.us can login, search for content, and simply copy and paste the blue embed code (with a couple checkbox options) right into their website. See below – one of my articles, straight from this blog, has been added to their database. This is how a user sees it:
Notice above, circled in red, there is an Adsense block as part of the copied code. This isn’t my Adsense code; instead it appears to be added there by the repost.us team, and does appear to wind up in your posted article. This gives repost.us a chance to monetize for the service. This also gives a publisher, who embeds Adsense, a chance to swing their publisher ID over as well. Interesting way to earn more Adsense clicks.
Right. The dreaded D word. Here’s a site that took my content and reposted it:
Did you notice the attribution links (in red) at the bottom? These particular links don’t show in the source code either (but others do – read on).
<div class="rpuArticle rpuRepost-7af546614f6b5e93c9c6053b466c1a0f-top" style="margin:0;padding:0;">
Let's face it – the SEO industry has a tendency to stomp a tactic into the ground. Some of us even get lazy (plenty of this kind of junk around). Directory submissions were once wildly valuable, then SEOs started creating directories by the thousands…
</div><!-- put the "tease", "jump" or "more" break here --><hr id="system-readmore" style="display: none;" /><!--more--><!--break--><hr class="at-page-break" style="display: none;"/><div class="rpuEmbedCode">
<div class="rpuArticle rpuRepostMain rpuRepost-7af546614f6b5e93c9c6053b466c1a0f-bottom" style="display:none;"> </div>
<div style="display: none;"><!-- How to customize this embed: http://www.repost.us/article-preview/hash/4917fea1ea6f6df42de6a8f3d7cb3d4d --></div>
See the links in red above? The "The Kind Of SEO I Want To Be (via http://www.greenlaneseo.com/)" links? Those are the only two links that appear to link back to my original, canonical blog post. They live in the source code behind the full injected content. Sadly they both use the same shortened URL (in this case http://s.tt/1MWo1), but they are at least 301 redirects. If you believe 301s dampen PageRank more than straight links, despite statements from Matt Cutts, then this is probably disappointing.
In my experience, this small amount of duplicate content, with one or two links back to the original document (including 301s), doesn't seem to cause any duplicate content issues. I've had my content posted on Business 2 Community in full with an attribution link, and Google still seems to figure it out. My posts still wind up ranking first – even if it takes a few weeks.
I emailed the team at repost.us and asked for a user count and activity. CEO John Pettitt kindly responded:
“We don’t give exact numbers but you can assume between 10K and 100K sites embed content in any given month. There are over 5000 sites contributing content. We have not quite 4 million articles in the system and we republish between 50 and 200K articles a month.
The average reposted article gets ~150 views per post, that goes up a lot for new content where it runs ~2000 and we regularly see content getting 20-50K views for an article if a bigger sites picks it up. The usage is very quality sensitive, if it’s content farm quality “seo bait” it probably won’t do well. it’s it’s original well written content it will do better.”
Pretty awesome numbers! Unfortunately, I didn't fare so well.
After running at least 3 months, with only 6 domains republishing my articles (one apparently being repost.us itself), I received a total of 40 impressions (disregard the chart above suggesting 21 – that's just for the few they show in the summary). Still, that's 6 links I got without really doing anything but writing for my own blog.
Also, out of all the posts on my blog, there were only 6 different posts shared through the 6 different sites (I have blog posts dating back to 2007). I did see a year-old post, but for the most part, all the content that got republished was newer content. I don't know if that's because their system chose to suppress old posts, or just a coincidence.
Finally, after spot checking the 6 sites that hosted at least one article, all but the repost.us domain were extremely poor. DA of less than 15 with virtually no external links according to Moz. Now I’m much less excited about the handful of links I received.
So it wasn’t a success for me, but in light of the numbers John (from repost.us) shared, I could very well be unlucky or simply not in line with what the user base is looking for. I write for the SEO industry. The users of this service may very well not have any interest in SEO. Or, maybe I’m just not writing interesting stuff (but I refuse to believe that!).
But I do believe in the power of reposting content. I'm not so afraid of duplicate content that I'd pass up getting more eyeballs onto a piece of my content strategy. At the end of the day, republishing for eyeballs – even in traditional paper media – was a marketing goal. Again, I believe Google eventually sorts out most light duplicate content, and repost.us has also taken precautions to avoid adding noise to the signal or misleading the algorithm about the canonical URL. We actually just started to use repost.us for some of our clients as well, taking note of the different categories the service supports.
My only concern with the service is, based on an unfair sample of 6, there may be a lot of spammers republishing and looking to achieve an article marketing type of model (i.e., post everything, monetize with ads). Could the spam links hurt? Probably not, but I would definitely keep my eyes open as an SEO.
My one sentence bottom line review: Absolutely worth a try. It could yield some great SEO and marketing results, especially when / if the service grows.
Happy Thursday everyone. A quick SEO post to bring some levity, tips, and pop culture into your day.
Yesterday I had a client ask for some campaign items to present to the CEO. He is concerned about year over year natural search gains.
As an SEO I bet your chest just tightened up reading that. We’ve all been there. Our lot in life will put us there again. When the spotlight is put on natural search performance, it’s almost always put on you as a performer (at least semi-consciously). Maybe you feel threatened or defensive. The counter-arguments start squirting through your neocortex.
The problem is, you can't usually get away with telling the c-suite, "you're damn lucky I (we) were able to stop any bleeding and keep you climbing the mountain Google's model is destined to swat you from." As SEOs we know it's the Pareto Principle. We know consistent top positions are vital for query revision. We know Google wants to keep the index fresh and, relatively speaking, very few brands seem to be sacred cows. I bet you'd love to say, "imagine where you'd be if I wasn't here!" Unless you have no fear of losing your job, you probably can't get away with that.
We need to help the c-suite. They’re never the enemy – they’re your best allies, and you are their partners. The smart ones listen and appropriately challenge you, while others may be a bit slower. They’re all regular folks with their own strengths and weaknesses. I’ve had my share of executives who just couldn’t get it – whether because they weren’t capable of seeing the big picture, or didn’t care because of the demands that were on them. Before my agency life, I had one boss who set unrealistic goals and put no resources behind his team. He convinced himself that since he was once on top, he should be natively staying on top. I quit. He lost his natural search lead, and last I heard, his company is toast (your first analogy).
If we don’t step in, the folks we consult for could take uncorrected preconceived concepts to their next jobs and cause more complications. We can be heroes, but it takes work.
What’s the ROI of SEO? Staying competitive!
I've found analogies and metaphors work well in explaining the opacity of SEO. Here are two that have worked for me. And since I'm currently celebrating 80s Thursday (my own personal holiday), I've got incredible movie posters to boot.
SEO and Pirates
“As an SEO, I can help identify the opportunity and draw a loose perimeter on the map. But I can’t necessarily tell you exactly where in the perimeter the booty is; nor can I guarantee how deep it is. But, as luck would have it, I can also help you dig. The time is all dependent on how many shovels we have in the dirt, and how hard we dig. The timing may be up to luck, but we will find the gold to offset all the effort.”
I told this to a group of interactive marketers while running the SEO department in a 200+ person agency. Clients were asking account managers for SEO help, and they were hesitant to bring us in if we couldn't guarantee an ROI. We found ourselves pitching to our own peers. After the meeting, one person in particular scheduled a meeting with me. "It clicked," he said. This person was an analytics and data wiz, and became a huge ally – and friend – during my years in the agency. He went on to run some major accounts (think national sport leagues), and did SEO a huge service by not only explaining it correctly, but in selling the real value through to the clients. He pre-qualified a couple opportunities for our group as well.
SEO and Racing
“To win a race, not only does the car need to consistently be upgraded (aka optimized), but many factors need to be analyzed routinely like track builds, track conditions, talent of driver and pit crew, talent of competitors.
So let’s imagine you are a team owner. You implement an expensive, cutting edge exhaust system on your best car. You notice in your trials that the car clocked better, but you still didn’t win that week’s race. Next week you install a new suspension, but again lost the race. Worse, your competition still beat you soundly without the two optimizations you have. Some of your team starts to get frustrated and confused. Theories and opinions are flying. Chaos level rising!
But you do the right thing. You keep buying, trying, testing, and removing optimizations. You watch your competitors and study their moves for inspiration, but you don't worry. You stay on target. Suddenly, towards the middle of the season something happens. You start placing in the top 5. The points and rewards (money) you're receiving slowly start to add up. Chaos level lowering!
Eventually you start winning. Your wins offset all your losses with a healthy margin of revenue leftover to enjoy. But it’s important you think about next season, and your next level of racing. New technology will arise. New track conditions, new team members for both you and your competitors, and a hundred other factors will need your monitoring. Don’t sit still just because you’re winning – if you don’t stick with it, you’re going to fall behind again. You can’t afford to do that after all your investments.”
I’ve used the racecar parallel a zillion times. I’ve used it mainly in pitch decks so I can make sure from the outset I’m explaining what the client will be in for with an engagement. Want to compete? Come with me. Not into the risk? Try paid search.
I know this post is dangerously close to “What Dom DeLuise taught me about SEO” type posts, so you’ll just have to forgive me this one time.
Also notice I didn’t use Field of Dreams “If You Build It They Will Come.” That’s a myth and a terrible movie.
We direct all our SEO prospects to our online material, which we candidly post on the website (go to our homepage and click the tour button for an example). We don't have fancy leave-behind decks, or spend hours sweating over pitches like some agencies I've worked with. I've seen far less time (and cost) succeed with the right kind of tailored communication, especially in the SEO industry. Our services are specific – SEO consulting with a lower emphasis on labor. In our online tour, we share our history, our beliefs, our differentiators, and our price. We have found that this helps qualify the next conversation. Some prospects read this and never return, presumably looking for another type of SEO service. Others only feel more confident about partnering with us. Through this second conversation, our conversion rate is very high.
We keep it simple, and respectful of everyone’s time. We’re all busy in business.
But our system isn't flawless. We had a client last for only two months. We both agreed to part ways. It sounds funny in hindsight – how could two months determine that a relationship couldn't be saved? From the start everything was cordial. We asked them to review our tour, and assumed they had. We had a 20 minute conversation following their internal review. We won the business without asking the right questions.
In our postmortem we realized we assumed too much. We assumed they read – and understood – our services as well as we did. 20 minutes isn't anywhere close to the amount of time we should have spent qualifying them. We were a little too foolhardy with our gut. From the first deliverables, where we had some great ideas to really break the website out of its template, everything was rejected. We dove into their competitors to see what they were doing, and suggested rivaling big ideas. We were shot down again with concerns about time and little faith. We believed in our ideas, and fought for them. "They're working very well for other clients, and here are examples of them in the wild," I shared. We were being pushed into old-school SEO services – something we could do, but just don't believe in.
As the dust started to settle, it turned out when they said they wanted quick results, they meant very, very quick – what we considered unrealistic. But for a hot minute, I bent. I instructed our team to pivot and try to deliver – a poor decision, and something very out of character. Not poor because I don’t put clients first, but poor because we weren’t in any position to meet that goal with this particular website. They had a long road ahead. Luckily, a candid discussion with the company’s CEO soon followed, and it was clear we were not on the same page. My initial emotion was, “what did you guys hire us for?” But later a clearer head asked, “why didn’t we qualify them better?” We wouldn’t have believed what they wanted was realistic.
This was a valuable wake up call to help us (re)focus on the path we spent so many months creating with the launch of our business.
They were a great company with good people and cool products – we were just on completely different sides of the fence. They knew enough SEO to have their spot, and we were trying to pull them to our side of the yard; all along not seeing the giant brick wall that divided us. Could it have been saved? Yes. But I don’t think it was worth it for either party. They’re better off with a company more in sync, as are we. Both our businesses got a pretty good education outside of SEO in my opinion.
You should be standing next to your client. Not across from them. You should be able to have open conversations. You certainly should have the grounds to disagree. If you only want to make money, being a yes-man will only get you so far.
Client: Can you get me to rank #1 for grilled cheese?
Yes-man: Yes!
Client: Can you guarantee me an 800% ROI?
Yes-man: Yes!
Client: Where do I sign?
Six months later, when you're making no money off the term grilled cheese, "yes" doesn't have any power. Now you have contention and burnout, and you're praying your client services team has another client waiting in the wings for when this current client goes supernova.
Sure, you made your money, but unless you own the company, don’t care about your reputation, and don’t have to face the clients after you sign them, you’re setting yourself up for a world of hurt.
Here’s how I might answer those questions:
Client: Can you get me to rank #1 for grilled cheese?
Bill: Probably not without a major commitment from your team, a larger budget than you have, and the ability to make changes quickly.
Client: Can you guarantee me an 800% ROI?
Bill: No, but I can make it my goal to influence Google to see you as the authority on Grilled Cheese and related cheesy sandwiches.
Client: Why should I sign you?
Bill: Because I’m not afraid to tell you how it really works in SEO, and I can teach you a lot about the additional opportunities you have in natural search based on our experiences.
I recently had a conversation with a prospect who said (paraphrasing), “I spoke with [big name SEO] who said we’d use [semi-popular blog network] for my link building if I went with them. They said it was white-hat, but it sounds like a blog network to me.”
That really depressed me. I was more than happy to inform this nice guy that the network in question was anything but white hat. Is the sale of service so important that you would intentionally mislead your prospects? Won’t that set you up for failure when you get hit with a penalty? Is the hit-and-run model the best you can scale? Is your own reputation in the SEO space not valuable? If this SEO had said, “so, yeah, we’re totally black hat… you down?” that would be respectable.
I quit consulting and agency life for a few years because I didn't like (what I thought) was #thegame. But starting a company, and creating our own rules, built a new version of the game which I'm enjoying to death. Settling in with clients (which we call partners as an homage to a lesson learned years ago) and really respecting the value each side brings to the table has been great. Sometimes completely different business models and philosophies can work great together. Just like in love (and comic books), opposites can attract. It's a great feeling waking up knowing you're doing good work. When a partner decides to leave us, I truly hope they can say, "you taught us some great stuff – I'm going to recommend you anywhere I can. Thanks for sharing your experience. We had a great adventure."
That’s what a consultant does.
What do you want your clients’ parting statement to be?
My TL;DR tips for creating the best SEO:client relationship, and setting yourself up to do the best work of your life:
I’d love your comments below!
My fiancée loves shoes. Like the stereotype, she has a closet full. I don't get it. I own three pairs, and one of them is flip-flops. One afternoon she came into my office and said, "you're the Google geek, find me these shoes." She had a pair of Candie's that broke right in the middle of the sole. It turns out she had searched for about 45 minutes looking for these shoes, to no avail – I suspect much longer than the average person would.
Challenge accepted – I'm supposed to be good at this, but within 30 minutes I couldn't find anything either. I used everything from reverse image search to the most descriptive keywords I could imagine. I looked in CSEs, Pinterest, Amazon, and a few major shoe retailers. I went deep into the supplemental pages of smaller retail sites (which is a very scary place).
The entire internet could not find a pair of these shoes. It’s like they never existed.
If you're a retailer in this day and age, and you're making your loyal customers work this hard, you're doing something seriously wrong. Let's review some of the areas Candie's could improve, and maybe relate this to your own business. All along, think about the post-sale customer journey as well (that's the category we fit in, where my fiancée started out with brand loyalty, and slowly lost it by the end of the event).
Candie's is a brand started in the 1980s. Kohl's has exclusive rights to all their different product lines except shoes. Thank you, Wikipedia.
A quick browse of results in Google's blog search shows a brand that's interested in being associated with celebrity. They pursue (and promote) famous spokeswomen. So far this doesn't bring much topical diversity. Every headline I found features a blurb about Hayden Panettiere or Britney Spears, etc. Candie's got a link from USA Today (sort of), but the only thing the piece spoke about is how Vanessa Hudgens is the new face of Candie's. Too bad the link only went to Kohls.com and didn't mention anything related to the value of the products; otherwise this could have been a much more valuable link.
Why didn’t the post say more? If what’s existing on the web is a clue, it might be because there’s no content strategy helping out the cause. For all I know, USA Today got a press release about the Vanessa Hudgens news, and with a quick Google search, couldn’t find anything else to say either. Yes – news outlets do work like that. They use Google too. Poor Candies.com got the shaft.
I checked forums and Twitter chatter (using Candie’s related keywords) – but didn’t find much. With all these endorsements I assumed there’d be more fan chatter. Do the customers really have nothing to pine about?
So let's try Facebook. Candie's has 669,000 likes. This looks promising. Lots of likes, and, well… few comments. One post asked the question "what do you love most about fall?" There were only 14 comments. 952 likes, but 14 comments. A low engagement rate even for a blasé, phoned-in question.
There are more celebrities on the Facebook page, and a few links, but most go to Kohl's shopping pages or other Facebook pages. This looks like an intentional closed loop, and it doesn't seem to be generating any critical mass whatsoever. I have to think those celebrities weren't cheap – and they're underused.
How about Twitter? They have the obligatory Twitter account. However, the tweets are mostly promotional, like the Facebook page (not a best practice these days), with some cutesy Instagram pictures mixed in.
During our scavenger hunt, we wanted to see if their social team would handle a request:
Amy (my fiancée) even jumped in. Neither of us got a reply. It doesn't look like they use Twitter for customer service, which is really pretty disappointing. The web – and your customers – expect this of you. It's 2013.
If I were working with Candie's, I'd be all over engagement and content creation. I'd want more out there than just what celebrity we have in our campaigns. I'd want people to start associating my products with quality, design, or trendsetting. I'd want to start showing off how well my shoes work with day and night outfits. I'd want a resource center for my shoppers to see how shoes look with certain styles. I'd want my shoes to rank for something more than just the product description of an ecommerce site. Hell, I'd even be pleased to see an outdated infographic at this point. I'm sure the c-suite at Candie's has plenty of adjectives and analogies for their shoes that they toss around in the board room, as well as qualities they boast about and want to represent. I certainly couldn't figure it out from their online output.
To do more than rank for the brand name, Candie’s needs a voice. Candie’s needs topics and expertise. There’s certainly not much now to pull in any long-tail search traffic. They have the brand authority, and the potential to rank for shoe-related queries, but they just don’t have the context. Checking out a tool like BuzzSumo, it’s pretty clear people are sharing articles Candie’s could have been writing – but hasn’t tried.
Content marketing would only grow their share. It would make the consumer persona(s) much more valuable.
Ok, so Candie’s doesn’t succeed on the content or social front, in my opinion. How about good old traditional SEO? How are the rankings? Not good, according to SEMrush: a total of 7 keywords in their database, all brand terms (save one).
Take a look at the homepage.
Right off the bat, they hide the shoes section (finding it took me past my 3-second “catch me” time frame), but let’s just look at the basic SEO.
Now take a look at the homepage naked (CSS off):
This could very well be the least SEO friendly homepage of a major brand I have ever seen.
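If you want to approximate that “CSS off” view programmatically, here’s a minimal sketch, using only the Python standard library (the sample HTML is made up for illustration), that pulls out the text a crawler can actually index – skipping scripts and styles the way a text-only view does:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the text a search spider can 'see', skipping script/style."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.skipping = 0   # depth counter for script/style nesting
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.skipping += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.skipping:
            self.skipping -= 1

    def handle_data(self, data):
        # Only keep visible, non-empty text
        if not self.skipping and data.strip():
            self.chunks.append(data.strip())

def spiderable_text(html):
    """Return the indexable text of a page as one string."""
    p = TextExtractor()
    p.feed(html)
    return " ".join(p.chunks)
```

Run it against a hero-graphic-only homepage and you get back almost nothing – which is exactly the problem.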
In fact, the whole website is really just one hero graphic after another, ultimately funneling traffic back into Kohl’s. Is this mandated by the Kohl’s ecommerce agreement? Maybe, but if Wikipedia is right, that doesn’t apply to shoes – the hidden category. Once I got to http://www.candies.com/spring2012/weHeart.asp, I couldn’t even find my way back to shoes.
I found 16 title tags. All but three said nothing more than “Candie’s”. Virtually no spiderable text anywhere. This site just isn’t trying. The competitors must be loving that!
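You can run the same title-tag check on any site you crawl. Here’s a minimal sketch, standard library only (the page URLs and HTML are invented examples), that extracts each page’s title and flags the duplicates:

```python
from html.parser import HTMLParser
from collections import Counter

class TitleParser(HTMLParser):
    """Grab the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        # Only capture the first <title> on the page
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html):
    p = TitleParser()
    p.feed(html)
    return p.title.strip()

def duplicate_titles(pages):
    """pages: dict of url -> html. Returns any title used on more than one page."""
    counts = Counter(extract_title(h) for h in pages.values())
    return {t: n for t, n in counts.items() if n > 1}
```

Feed it a crawl export (Screaming Frog will hand you this data too) and a site where 13 of 16 pages share the title “Candie’s” lights up immediately.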
What about backlinks? 263 linking root domains, and no real footprints of SEO work here.
I’m starting to wonder if a heads-down presence is intentional. Since they’re not an ecommerce store (and divert most of their retail traffic to http://www.kohls.com/feature/candies.jsp), maybe they’re afraid to rank too well, as it may take away from a shopping page. That would be “tinfoil hat” behavior the likes of which I’ve never seen.
The landing page on the Kohl’s site isn’t much better (cached here for posterity). Look at the adorable SEO copy under the 70% graphic spots. At least the anchor text matches the destination pages’ title tags.
But I can’t wrap my head around the hero spot video. At this stage in the game, does Kohl’s really want to be doing the branding for Candie’s? Hit play, and prepare to be distracted from your purchase for 60 seconds watching behind the scenes footage of a photo shoot. Kohl’s makes no other attempt to build the brand or tie this video to products. I stick by my ecommerce rule of thumb – if the graphics/video take up 75% of your real estate, they better be prepared to drive 75% of your page sales.
With precious time to encourage the interest of a searcher, where the back button is ingrained in every shopper’s brain, I would be promoting more shopping funnels. Not distractions.
If we were hired to do Candie’s SEO, while allowing for the probable bureaucracy and politics around the brand’s ecommerce agreement, I would certainly work to fix the items I cried about above. That’s a no-brainer. But that would only get them to the stadium. If they wanted to compete, they’d have to keep swinging.
They need some big ideas and quick.
This post assumes the Candie’s brand actually wants to be bigger online than they currently are. That may be erroneous. Also take into consideration that this is not an exhaustive audit by any means (I run an SEO company, but Candie’s is not a client). If I’m wrong on this audit, then enjoy the lessons anyway. The truth is, from only a few hours of scanning, the opportunities for SEO leap out at you.
Amy (my fiancée) never did hear from Candie’s. She never did find a single instance of the shoe, but her diligent searching led her to discover another brand with similar shoes. They got the sale. If this is happening on a large scale, it’s scary to think how many lost sales Candie’s has had in the last decade.
On the contextual front, SEO is more than just keyword-level stuff (as Hummingbird reminded some of us with a baseball bat). While brands may have a certain upper hand, they still need to do search marketing. Candie’s is an example of a brand that’s getting buried for anything but their own brand terms. With great power comes great responsibility – aid your customers post-sale, and they’ll reward you. There must be other customers (in addition to my fiancée) who love Candie’s shoes. I struggle to believe all the followers they have are there for Carly Rae Jepsen pictures. In terms of topics, there’s not much to read, so they don’t have much to release virally. If I were a brand advocate, I’d be pretty hungry for something more from the website. I’d be flatly disappointed.
The truth is, Candie’s is not alone. I’ve done a lot of big brand work, and while they usually have a big backlink portfolio due to being top of mind for a lot of bloggers, there are many instances where the content output is very low. Attempts were never made to be more than a couple marketing campaigns or a logo. While Candie’s certainly works with ad agencies, a company or specialist who knows search would be an amazing addition to their arsenal.
By the way, Amy also recently found the box. For any women wondering what shoes these are, or if they were actually real (as I began to question), here you go. Dazz black, baby. I’m probably going to get in trouble for showing her shoe size.
I’ve had my share of SEO predictions fall flat on their face. But I remember distinctly sitting in the office of a VP in my former ‘big agency’ life (guessing around 2009), talking about how Google will have to move into identifying, comprehending, and processing intent, while finding new ways to judge popularity. PageRank was a great start, but it can’t scale. Our culture is completely online now – the Google algorithms, relatively speaking, can’t keep up. It’s easy to forget Google isn’t magical. They’re still a powerful but limited machine.
I would postulate on Google eventually looking at more abstract factors, where good old fashioned online marketing campaigns could get recognized. Where pieces and results of campaigns become crumbs that make up influence in aggregate. Truth was, I was seeking internal support for expanding the SEO group’s output, instead of mild data crunching and producing thin, quick-and-dirty recommendations. In 2009 it seemed obvious that Google would eventually shut down “gaming the system” schemes – which they recently did a reasonably good job of (with some casualties). It seemed to me that if anyone could understand programs built to scale and distort, it’s Google. It also felt like the routine tactics of SEO couldn’t last forever. It felt like time to start getting creative.
I wanted to believe in the power of marketing affecting SEO. Not just because that was my college background and interest, but because it seemed logical. Marketing has shaped our culture. Our culture is online. Thus, Google needs to continue understanding the culture’s role and response in marketing. Therein lies the understanding of the queries.
I didn’t (and still don’t) think all SEOs need to be marketers. Digital PR? Not every SEO uses the same side of their brain, yet they all remain pertinent. It’s sensational to say, “the SEO industry must adapt to *THIS* or die!” Like anything in any marketing channel, that’s awfully limiting. Defining rules and standards? Not for me; I shake that kind of stuff off. No person (or concept) is going to be able to drive the SEO bus alone. The Magical Mystery Bus drives itself.
Let’s think about the clues we have at hand, which to me suggest a path towards SEO marketing.
Here’s the definition of marketing from the AMA: “Marketing is the activity, set of institutions, and processes for creating, communicating, delivering, and exchanging offerings that have value for customers, clients, partners, and society at large.”
I really wish PPC didn’t get the label of search engine marketing (SEM). It doesn’t seem to fit today. It’s like when alternative music became mainstream – it became the alternative to what? I would like to use the term search engine marketing for the concept of big ideas that Google notices, appreciates, rewards, and shares. I want to impress Google by impressing their users first. I’m not going to try to make up a new term (I have shame), but we refer to it at Greenlane as SEO marketing; a non-creative name for creative campaigns. It’s what I couldn’t convince a big agency of doing.
Here’s a couple very recent things we know:
They took away all our keyword-level data
This is a raw nerve. [not provided] is a jerk, but not that significant a change in my book. Lazy SEOs can now fully hide behind this when they tell their clients, “sorry – I can’t prove how awesome we are without keyword level data.” Or, they can promote themselves to the client when total organic data is on the rise (even if it’s branded terms from some other online marketing channel on the other side of the house, where the SEO had no influence). There are lots of posts floating around basically telling you not to mourn this total loss, since the “representational sample” we’ve been playing with was already soiled back in October 2011. On that I totally agree. When Annie Cushing called these keyword data remnants “junk data,” it’s not just because she’s a proven authority, but because it’s common sense. I do, however, disagree with the posts that scold you for ever caring about keyword data in the first place. That’s got to be tweetbait!
For me, I did like the remaining organic keyword data in at least one of the ways I liked all the organic keyword data. I liked it as a unique source of inspiration and guidance. Those weird keywords you found that you wanted to immediately discount. I got into the habit of analyzing them hoping to find a wormhole to another universe. I loved the, “why the hell did Google think I was relevant to that, and why did people come to my site for it,” moments. This keyword data led to topic creations that flourished for not only my own site, but my clients as well. However, this was quite limited – it was only important for the handful of possible topics you were already somehow relevant for in Google’s eyes, not the myriad of topics you could be relevant for in the demands of searchers. You need to think the other possible topic universes are even richer in opportunity.
The keyword data was great to have, but it was a small sample of your actual opportunity. We have to adapt.
Google wants to be better at answering questions. We assume it’s more than turning Google into what Ask.com was supposed to be. Every query is a question, so Hummingbird presumably is a good old fashioned Google engine update. If Hummingbird’s job is to understand the meaning of words, then the AMA’s “communicating, delivering, and exchanging offerings that have value for customers, clients, partners, and society at large” seems more important than ever in my book. This suggests to me SEO is more about communication than ever before. Content, as a general artifact, isn’t the king it used to be. The topic that answers presumed query intent may be more valuable, and that takes some iteration to get right. That’s certainly a content marketing principle.
Why does Google care about site speed? Why do they care where ads are located? DOM, bounce, hierarchy – whether Google infers these or uses GA data is debatable (either Google is lying, or they’re not). The bottom line is these are things I believe they should be looking at, but won’t make too prominent because they’re all game-able direct signals. Until they can weed out bots artificially crawling a site and leaving footprints to emulate a visitor’s “happy, successful site session,” we might as well (at the minimum) treat these items as usability features that improve the visitor’s experience, Google aside. As an SEO, we did a good job getting the traffic, but why should we stop there? Why not make sure the material the searcher receives is indeed in line with their query?
Not all direct signals are cut and dry. So maybe Google plusses don’t help you rank. They sure help you figure out what your community likes; that could help you rank.
We’ve seen Google overcome a lot of garbage the last few years. Sure, they blew up a few innocent communities bombing the bad guys, but they’re not afraid to make changes. They’re wise to pull back on things that can backfire. So with some technical site characteristics being a factor, it’s safe to think there will be more, no? Help the conversation continue by helping the site improve. In the meantime, take advantage of everything else and produce good communication that will maybe have its day in the sun when the algorithm catches up.
The (re)launch of our agency came with many changes from my original launch as a sole proprietor in 2005. From a partner, to employees, to 15 clients – it all brings different responsibilities, some of which Keith and I still need to learn. Case in point – this week we lost our first client. It was mutual. We weren’t on the same page, and as part of our postmortem, I see why. Where we are promoting the big picture ideas above, they were looking for the type of work I was doing in 2009 at the big agency. Strictly keyword-focused stuff. I don’t want to say we evolved, because I don’t want to downplay the significance of other SEO approaches, but we have organically morphed into something shaped by our 13 years of personal SEO experience. We are looking for clients that have morphed the same way we have.
We do creative things. We consult with companies – hand in hand – to create and drag the right campaigns to the ground. It’s all very much based in SEO, but in thinking of all the strategies and projects we have going on across our portfolio, I’m pretty excited to see where SEO goes. I feel like we’re seated well. I’m banking on it, so to speak. I think this is one prediction that shows no sign of falling on its face, and something I hope all SEOs are taking a good hard look at from time to time.