WP Engine is our current hosting solution (since we run WordPress), a step up from 1&1 (a “you get what you pay for” type of hosting service). I wasn’t thrilled with 1&1, especially after trying to run the Outdated Content Finder with them. Their support was very poor, but not unlike any other web host I’d used in the past; I figured they were all this way. Reliability and availability were fine for the most part until a Moz Top 10 link came through, then timber… down went the web host. I’m also not a huge fan of having to dig through confusing resource pages when a problem happens.
The folks at Mountain View made the conscious decision that keywords alone couldn’t deliver the results they wanted to see (ahem, “their users wanted to see”). Google tried some different modeling, but ultimately came around to semantic search (that is, using semantic technology to refine query results). Now, I said much of the industry has picked up on it. Not all. I still see a lot of pretending that Panda, Penguin, and Hummingbird never happened. That’s unfortunate for innocent clients around the world. But most of us reading this are students of a new lexicon, with words like “triples” and “entities” and “semiotics” and “topic modeling.”
Ah, the SEO report. Sometimes the bane of our existence. Some agencies spend the majority of their time creating detailed monthly monstrosities, while others might send quick, white-labeled exports. Meanwhile, smart companies (like Seer) look for ways to use APIs and programming to speed up data pulling. At Greenlane, we took this approach as well: Keith, my partner and incurable data nerd, created our out-of-the-box reports to pull API data on traditional SEO metrics like rankings (yes, we still believe in the value), natural traffic (at the month-over-month and year-over-year level), natural conversions (same ranges), and every necessary target landing page metric we could think of. Then, after discussing each client’s own KPIs, we add any further reports to our default set.
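To make the month-over-month and year-over-year idea concrete, here is a minimal sketch (this is illustrative, not Greenlane’s actual reporting code): given monthly organic sessions keyed by “YYYY-MM”, it computes the two percentage deltas a report would surface.

```python
def pct_change(current, previous):
    """Percentage change from previous to current, rounded to one decimal."""
    if previous == 0:
        return None  # no prior data; avoid dividing by zero
    return round((current - previous) / previous * 100, 1)

def traffic_deltas(sessions, month):
    """Return (MoM %, YoY %) for a given 'YYYY-MM' month."""
    year, mon = map(int, month.split("-"))
    prev_month = f"{year - 1}-12" if mon == 1 else f"{year}-{mon - 1:02d}"
    prev_year = f"{year - 1}-{mon:02d}"
    current = sessions[month]
    mom = pct_change(current, sessions.get(prev_month, 0))
    yoy = pct_change(current, sessions.get(prev_year, 0))
    return mom, yoy

# Hypothetical numbers, just to show the shape of the output.
sessions = {
    "2013-05": 10000,
    "2014-04": 12000,
    "2014-05": 13200,
}
print(traffic_deltas(sessions, "2014-05"))  # (10.0, 32.0)
```

In practice the `sessions` dictionary would be filled by an analytics API pull rather than typed by hand; the delta math stays the same either way.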
Entity optimization as a big SEO play isn’t quite upon us yet. It’s a slow-growing Google addition. I know, it frustrates me too. So much potential, which I believe will greatly improve search results in the future. Google isn’t yet showing the fruits of everything it knows through entities, whether through cards or search results, at least not relative to the way it ranks on keywords alone. But can knowledge cards help bring qualified traffic while considering searcher intent? SEOs always talk about searcher intent, and anyone who’s been doing SEO for a while knows that building for intent can be a challenge.
I read – and commented on – a great post called Panda 4.0 & Google’s Giant Red Pen by Trevin Shirley. Panda 4.0 just hit, and the SEO space is hiding under its desk, with some reacting out of panic and others for show. It’s definitely news, but at this point I don’t see any reason to scream from the rooftops at Google. It’s what we should be expecting by now. In 2011, the first Panda showed us Google is not afraid to drop atom bombs. Panda opened the door for Penguin, and many updates have come since. Matt Cutts said he wished Google had acted sooner, and in his shoes, I’d probably agree.
Sometimes desperate times call for desperate measures. This post is about a desperate measure. We had a client with a manual link penalty. We did some work (using my outline from this post). Rankings started going up, and traffic and conversions started climbing. Then, a few days later, the next Google notification came in. It’s like playing digital Russian roulette with those things: you’ll either be thrilled or be in a lot of pain. This time Google said they “changed” our penalty, as there were still some spammy links out there. Remember, not all penalties have the same impact. Clearly ours was lessened (which was continually proven in the weeks to follow), but our client, rightfully so, wanted the whole penalty removed. The problem was we couldn’t find any more bad links. Everything from Ahrefs, OSE, Google Webmaster Tools, Bing Webmaster Tools, Majestic, and the rest had been classified and handled appropriately.
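The triage step described above, pooling link exports from several tools and classifying each linking domain once, can be sketched roughly like this (a hypothetical sketch, not our actual process; the tool names and URLs are placeholders):

```python
from urllib.parse import urlparse

def root_domain(url):
    """Crude root-domain extraction (ignores multi-part TLDs like .co.uk)."""
    host = urlparse(url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    parts = host.split(".")
    return ".".join(parts[-2:])

def merge_exports(*exports):
    """Union several tools' link lists into unique root domains,
    keeping the first example URL seen for each domain."""
    seen = {}
    for export in exports:
        for url in export:
            seen.setdefault(root_domain(url), url)
    return seen

# Placeholder exports standing in for Ahrefs, OSE, etc. dumps.
ahrefs = ["http://www.spamsite.com/page1", "http://blog.example.org/post"]
ose = ["http://spamsite.com/page2", "http://another.net/link"]
merged = merge_exports(ahrefs, ose)
print(sorted(merged))  # ['another.net', 'example.org', 'spamsite.com']
```

Deduplicating by root domain matters because disavow files and removal outreach both operate at the domain level more often than the URL level.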
The link management function isn’t new to the SEO space. Many tools, like BuzzStream and Raven, do it already, and they do it quite well. Additionally, link discovery is an existing feature of tools like Open Site Explorer, yet this is an area where I see opportunity for growth. I love the idea of these ‘new link’ reports, but honestly, I haven’t found anything faster than monthly updates. I know it’s a tough request, but I mentioned this to François. By tracking links as they happen, you can jump into conversations in a timely manner, start building relationships, and maybe shape the context of the linking page. You might even catch some garbage links you want to disassociate yourself from more quickly.
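At its core, an “as-it-happens” link report is just a diff between backlink snapshots, the faster the snapshots refresh, the fresher the report. A minimal sketch of that idea (the snapshot data here is made up, and no real tool’s API is assumed):

```python
def new_links(previous_snapshot, current_snapshot):
    """Return linking URLs present now that weren't in the last snapshot."""
    return sorted(set(current_snapshot) - set(previous_snapshot))

# Placeholder snapshots, e.g. two successive pulls from a backlink index.
yesterday = {"http://a.com/post", "http://b.org/review"}
today = {"http://a.com/post", "http://b.org/review", "http://c.net/mention"}
print(new_links(yesterday, today))  # ['http://c.net/mention']
```

Each fresh URL in that diff is a chance to join the conversation while it is still live, which is exactly the timing advantage monthly reports give up.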
I remember, a few years ago, blowing the mind of a boss with a theory that Google would eventually rank (in part) based on its own internal understanding of your object. If Wikipedia could know so much about an object, why couldn’t Google? In the end, I was basically describing semantic search and entities, something that had already lived as a concept on the fringe of the mainstream.
For one reason or another, plenty of sites are in the doghouse. The dust has settled a bit. Google has gotten more specific about penalties and warnings through its notifications, and much of the confusion is no longer, well, as confusing. We’re now in the aftermath: the grass is slowly growing again and the sky is starting to clear. A lot of companies that sold black hat link building have vanished (and seem to have taken their phones off the hook). Some who sold black hat work are now even charging to remove the links they built for you (we know who you are!). But at the end of the day, if you were snared by Google for willingly, or maybe unknowingly, creating “unnatural links,” the only thing to do is get yourself out of the doghouse.