Occasionally I like to review some tools that I have personal experience with, especially if they’re somewhat unknown. Some survive, some don’t. Some kick ass, and some, well… save your ass.
WPengine is our current hosting solution (since we run WordPress); we moved over from 1&1, a “you get what you pay for” type of hosting service. I wasn’t thrilled with 1&1, especially after trying to run the Outdated Content Finder with them. Their support was very poor – but not unlike any other webhost I’ve used in the past. Reliability and availability were fine for the most part, until a Moz Top 10 link came through – then timber… down went the webhost. I’m also not a huge fan of having to dig through confusing resource pages when a problem happens. I figured they were all this way.
So the switch to WPengine came from a local Philly SEO named Zach Russell who specialized in WordPress speed and security. He told me to check it out and really promoted the switch.
The site is way more stable for really not much more money. The backend interface isn’t some hacky cPanel type interface – it’s a custom designed, and super useful, backend that easily helps you:
- Back up (and restore) your entire WordPress site
- Quick access to error and server logs
- Quick access to a staging environment
- Quick access to visits, bandwidth, and storage stats
- Reach one of the best support teams I’ve ever used
And, it connects right into your WordPress install.
Solving the WordPress Frustrations
I’m a huge WordPress fan. And so are hackers. Since it’s so popular, hackers scan for vulnerabilities (often through one of a kazillion WordPress plugins). Once they get in, it’s hard to purge them and close their secret door. My site got hacked while on 1&1. It was a disaster and took a while to restore, which I ultimately did myself. WPengine claims to offer security, and so far, no problems there.
WPengine will update WordPress for you, and they’re very particular about the plugins they let you install. If it’s on their blacklist, it’s not going in. At first I was disappointed to see some of my past-favorite plugins disallowed, but I certainly came to understand and respect it. One plugin I previously favored was on the blacklist; when I complained, WPengine reached out to its developer to push for improvements. Soon thereafter it was allowed.
Editing PHP is dangerous when you’re not an expert (like me). It’s super easy to take down a site with a misplaced period. The backup feature is fantastic – for a small site like ours, a routine backup can usually be restored within an hour, with the click of a button. I’ve even had the support team on live chat helping me figure out where my code was bad. That’s going beyond the call of duty. Another situation: a WordPress update rolled out automatically and broke my theme (a super rare occurrence, but leave it to my bad luck…). I reverted right away and went to the theme provider for an update. Total downtime was 15 minutes.
If you’re busy, these are “got your back” features I really appreciate.
If you’re using WordPress, this is a platform you really need to consider. Check them out at http://wpengine.com/. Transparency: there are affiliate links in this post, but the review – as they always are with me – is 100% from the heart.
There must be thousands of SEO tools. While many are junk, a few great ones rise up each year and grab our attention. They’re often built for very specialized needs. Of all the industries these brilliant developers could build in, they chose SEO. I’m always thankful and curious. As a fan of SEO tools, both free and paid, I’m excited to learn about new ones.
A few months ago I got an email from François of Linkody asking for some feedback. The tool did a nice job of link management and monthly ‘new link’ reporting. Pricing was very low, it’s completely web-based, and it’s very simple and clean. It pulls from the big backlink data providers, and even has a free client-facing option (exclusively using Ahrefs) at http://www.linkody.com/en/seo-tools/free-backlink-checker. Great for quick audits. I’ve used it quite a bit myself, and was happy to give a testimonial.
The link management function isn’t new to the SEO space. Many tools do it already, like Buzzstream and Raven – and they do it quite well. Additionally, link discovery is an existing feature of tools like Open Site Explorer, yet this is an area where I see opportunity for growth. I love the idea of these ‘new link’ reports, but honestly, I hadn’t found anything faster than monthly updates. I knew it was a tough request, but I mentioned this to François. By tracking “as-it-happens” links, you can jump into conversations in a timely manner, start making relationships, and maybe shape linking-page context. You might even be able to catch garbage links you want to disassociate yourself from more quickly.
The other day I received a very welcome response: “I wanted to inform you of that new feature I’ve just launched. Do you remember when you asked me if I had any plan to increase the (monthly) frequency of new links discovery? Well, I increased to a daily frequency. Users can now link their Linkody account with their Google Analytics account and get daily email reports of their new links, if they get any of course.”
Sold. That’s a clever way to report more links, and fill in gaps that OSE and Ahrefs miss.
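The mechanic behind this is simple enough to sketch: any referring domain showing up in your analytics that isn’t already in your monitored link set is a candidate “new link.” Linkody’s actual implementation isn’t public, so the domains and structure below are entirely made up for illustration.

```python
# Hypothetical sketch of analytics-driven link discovery:
# diff today's referring domains against the links you already monitor.

# Links we already know about (e.g., from a backlink index)
known_linking_domains = {"moz.com", "business2community.com"}

# Referring domains as a daily analytics export might list them (made up)
todays_referrers = [
    "moz.com",
    "someblog.example",        # hypothetical new referrer
    "someblog.example",
    "business2community.com",
]

# Anything not already monitored is a candidate new link for the digest email
new_links = sorted(set(todays_referrers) - known_linking_domains)
print(new_links)  # → ['someblog.example']
```

The obvious tradeoff, covered below in the cons, is that a link only surfaces this way once a real visitor actually clicks it.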
Upon discovering the new URL, you can choose to monitor it, tag it, or export it.
The pros: Linkody picks up a bunch of links on a daily basis that some of the big link crawlers miss. You can opt for daily digest emails (think, Google Alerts). Plus it’s pretty cheap!
The cons: It needs Google Analytics. Plus, for the Google Analytics integration to track the link, the link has to actually be clicked by a user. However, for those who have moved to a “link building for SEO and referral traffic generation” model (like me), this might not be much of a con at all.
What’s on the roadmap?
As François told me, “next is displaying more data (anchor text, mozrank…) for the discovered link to help value them and see if they’re worth monitoring. And integrating social metrics.” Good stuff. I’d like to see more analytics packages rolled in, more data sources, and maybe even its own spider.
If you’re a link builder, in PR, or a brand manager, I definitely recommend giving Linkody a spin. It’s a great value. Keep your eye on this tool.
[UPDATE - it looks like Repost.us has shut down]
I stumbled upon an interesting service I don’t think many SEOs know about – at least, not the few I’ve asked. It’s called repost.us. Looks like it’s about 2 years old.
Simple premise: Add your site to the database, and others can republish your content. They say, “It’s the wire service reinvented for the web.”
A user of repost.us can login, search for content, and simply copy and paste the blue embed code (with a couple checkbox options) right into their website. See below – one of my articles, straight from this blog, has been added to their database. This is how a user sees it:
Notice above, circled in red, there is an Adsense block as part of the copied code. This isn’t my Adsense code; it appears to be added by the repost.us team, and it does wind up in your posted article. This gives repost.us a chance to monetize the service. It also gives a publisher who embeds Adsense a chance to swing their own publisher ID over as well. Interesting way to earn more Adsense clicks.
What About Duplicate Content?
Right. The dreaded D word. Here’s a site that took my content and reposted it:
Did you notice the attribution links (in red) at the bottom? These particular links don’t show in the source code either (but others do – read on).
<div class="rpuArticle rpuRepost-7af546614f6b5e93c9c6053b466c1a0f-top" style="margin:0;padding:0;">
Let’s face it – the SEO industry has a tendency to stomp a tactic into the ground. Some of us even get lazy (plenty of this kind of junk around). Directory submissions were once wildly valuable, then SEOs started creating directories by the thousands…
</div><!-- put the "tease", "jump" or "more" break here --><hr id="system-readmore" style="display: none;" /><!--more--><!--break--><hr class="at-page-break" style="display: none;"/><div class="rpuEmbedCode">
<div class="rpuArticle rpuRepostMain rpuRepost-7af546614f6b5e93c9c6053b466c1a0f-bottom" style="display:none;"> </div>
<div style="display: none;"><!-- How to customize this embed: http://www.repost.us/article-preview/hash/4917fea1ea6f6df42de6a8f3d7cb3d4d --></div>
See the links in red above? The “The Kind Of SEO I Want To Be (via http://www.greenlaneseo.com/)” links? Those are the only two links that point back to my original, canonical blog post. They live in the source code behind the full injected content. Sadly, both are the same shortened URL (in this case http://s.tt/1MWo1), but they are at least 301 redirects. If you believe 301s dampen PageRank more than straight links, despite statements from Matt Cutts, then this is probably disappointing.
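If you want to verify for yourself that the attribution anchors really are in the injected markup, you can pull every href out of the embed with a few lines of Python. The snippet below is a trimmed, hypothetical stand-in for the real embed (which wraps the full article), but the attribution anchors look roughly like this in the page source.

```python
from html.parser import HTMLParser

# Trimmed, made-up stand-in for a repost.us embed; the real one wraps
# the full article, but the attribution links sit in the source like this.
EMBED_SNIPPET = """
<div class="rpuArticle rpuRepostMain">
  <p>Article body...</p>
  <a href="http://s.tt/1MWo1">The Kind Of SEO I Want To Be</a>
  (via <a href="http://s.tt/1MWo1">greenlaneseo.com</a>)
</div>
"""

class LinkExtractor(HTMLParser):
    """Collect every href found on anchor tags."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.hrefs.append(value)

parser = LinkExtractor()
parser.feed(EMBED_SNIPPET)
print(parser.hrefs)  # both attribution links resolve to the same shortened URL
```

Running this against a real reposted page would show the same pattern described above: two anchors, one shortened URL, 301-redirecting back to the canonical post.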
In my experience, this small amount of duplicate content, with one or two links back to the original document (even through 301s), doesn’t seem to cause any duplicate content issues. I’ve had my content posted on Business 2 Community in full with an attribution link, and Google still seems to figure it out. My posts still wind up ranking first – even if it takes a few weeks.
Seems SEO Friendly Enough… But How Were The Results?
I emailed the team at repost.us and asked for a user count and activity. CEO John Pettitt kindly responded:
“We don’t give exact numbers but you can assume between 10K and 100K sites embed content in any given month. There are over 5000 sites contributing content. We have not quite 4 million articles in the system and we republish between 50 and 200K articles a month.
The average reposted article gets ~150 views per post; that goes up a lot for new content, where it runs ~2000, and we regularly see content getting 20-50K views for an article if a bigger site picks it up. The usage is very quality sensitive; if it’s content-farm-quality “seo bait” it probably won’t do well. If it’s original, well-written content it will do better.”
Pretty awesome numbers! Unfortunately, I didn’t fare so well.
After running for at least 3 months, with only 6 domains republishing my articles (one apparently being repost.us itself), I received a total of 40 impressions (disregard the chart above, which suggests 21 for just the few posts shown in the summary). Still, that’s 6 links I got without really doing anything but writing for my own blog.
Also, out of all the posts on my blog (I have posts dating back to 2007), only 6 different posts were shared across the 6 different sites. I did see one year-old post, but for the most part, all the republished content was newer. I don’t know if that’s because their system chose to suppress old posts, or just a coincidence.
Finally, after spot checking the 6 sites that hosted at least one article, all but the repost.us domain were extremely poor. DA of less than 15 with virtually no external links according to Moz. Now I’m much less excited about the handful of links I received.
So it wasn’t a success for me, but in light of the numbers John (from repost.us) shared, I could very well be unlucky or simply not in line with what the user base is looking for. I write for the SEO industry. The users of this service may very well not have any interest in SEO. Or, maybe I’m just not writing interesting stuff (but I refuse to believe that!).
But I do believe in the power of reposting content. I’m not afraid of a little duplicate content if it gets more eyeballs onto a piece of my content strategy. At the end of the day, republishing for eyeballs – even in traditional print media – was a marketing goal. Again, I believe Google eventually sorts out most light duplicate content, and repost.us has taken precautions to avoid adding noise that could misguide the algorithm about the canonical URL. We actually just started using repost.us for some of our clients as well, taking note of the different categories the service supports.
My only concern with the service is that, based on an unfair sample of 6, there may be a lot of spammers republishing and looking to achieve an article-marketing type of model (i.e., post everything, monetize with ads). Could the spam links hurt? Probably not, but I would definitely keep my eyes open as an SEO.
My one sentence bottom line review: Absolutely worth a try. It could yield some great SEO and marketing results, especially when / if the service grows.
Before starting this review, I want to highlight some good prospecting by Razvan Gavrilas. He read a comment I left on a post from Seer about data visualization and Google Fusion Tables, and reached out to me (for those who disagree with me about the power of comments, here’s more proof of value). Razvan emailed me through one of my e-mail accounts, which I unwisely mistook for a vendor looking to pitch. He then hit me on Twitter, which I again unwisely ignored, thinking it was another vendor pitch. He then added me on Linkedin, and finally got my attention. His persistence was impressive, and my ignorance was shameful. I wish I had taken notice sooner, as he was offering me a demo of a really incredible tool. Semi-serendipitously, I offered to do a review, and recommended the company to a few of my friends, one of whom was Mike King, who also shared it – he has much more amplification than I do. This is more proof that persistent, smart, personal outreach may not be scalable, but it’s still incredibly powerful. Now, on to the review…
I’m a very right-brained, visual person. I really like data visualization. The critique I left on the Seer blog about Google Fusion Tables was that the functionality wasn’t there to click through and look at specific data points. As an answer to that, the Visual Link Explorer by Cognitive SEO was born. In addition to the Visual Link Explorer, my demo gave me a huge array of link slicing tools, with a lot of filters and features. Unlike many of its link tool predecessors, this toolset was clearly created to serve the masses, who may each be looking to gather different link metrics. On many reports you can filter on link strength, citation flow, count, etc. Unlike some simpler link reporting and analysis tools, there’s a learning curve here – like any robust analysis platform (Omniture, for instance), it may take some time to learn. I see this being valued most by enterprise agencies or in-house SEOs who are held to higher reporting and analysis standards.
I tinkered. I created a campaign and ran an audit on my company’s services domain and another Philadelphia SEO company’s domain. I already had a fair sense of their linking tactics – they have a lot of exact match anchor footer links embedded in clients’ websites. I wanted to see how the two link profiles compared. The campaign wizard prompted me through the initial steps (where I deepened the data pull), and returned massive digital reports within 7 minutes (which the system then saves for immediate review later). That was impressive considering how much sliced-and-diced data I had at my fingertips, right in my browser.
Jumping into the new Visual Link Explorer feature specifically – this was really the most impressive of all. A fully navigable, functional, clickable visualization of my link graph:
Now here’s the comparison of my SEO competitor, which was just as easy to pull up:
Right off the bat it’s pretty clear that we have two completely different link building, content marketing, and site architecture strategies. By examining the cluster above, I confirmed what I suspected about my competitor. They have hundreds of links pointing directly to their homepage, with very little variation of exact match anchor text – terms like Philadelphia SEO Company, and Philadelphia SEO. Surprisingly, while Google spanked a lot of this with the Penguin updates, this company still remains strong for these keywords. They rank very well, and this visualization helps me recognize (in seconds) their exemption, and possibly put together a plan to match them at their own game. In my opinion, that’s the biggest value of data visualization – the ability to “snapshot” the landscape quickly, and start driving actionable strategies. With a lot of clients or busy days, this is incredibly important.
Zooming into the interactive interface, I’m able to see links much closer (the scroll wheel on the mouse is heavenly for this). I’m also able to toggle Link Trust Flow, Domain Trust Flow, Link Citation Flow, Domain Citation Flow, and Link Rating.
I’m able to click through each of the data points to get more information (in the form of a knowledge box), a fix for one of my biggest criticisms of other data visualization tools:
It’s really pretty amazing, and I’m just tapping into it. My only criticism (and I shared this with Razvan) is that it’s missing some definitions – clearly descriptive labels of what all the amazing data means. Novice link builders will get lost in this data, so I’d like to see it cater to them more. This is a powerful tool, and it should be clearer so all SEO clients can benefit from an empowered (and fully comprehending) SEO service provider.
I would be shocked if this doesn’t quickly become part of an SEO’s regular arsenal.
More coming soon – I’m hoping to create a video tour. In the meantime, to see some of the other reports from Cognitive SEO’s great tool, here are a few more resources:
I love when clients ask, “What keywords am I ranking for?”
Well, unless you’re ranking for a keyword, and people are clicking through (where I can see it in your analytics), I have no idea. I suppose I could do a few hours of keyword research and run that through a rank checker to see if you’re in contention. Or I could use SEMrush.
SEMrush is a great tool, with a lot of keyword data for both PPC and SEO. Basically, they create their own ranking index of the web. And they’re pretty accurate. It’s great as a competitive tool – enter a competitor and check out what they’re ranking for. This is eye-opening for several reasons. With estimated volume shown for each keyword, it’s great for illustrating keywords you’re not optimizing for (and maybe should be).
It does a few other “nice to have” things, like providing quick snapshots of competitors in Google. Granted, you could do this yourself pretty quickly with your own Google search, but SEMrush gives you a decent .csv output. I also like the competitive ad review feature (helps you figure out how your ad copy stacks up with your competitors).
This tool has been out for a while, though I’m always surprised how underutilized it is. Give it a spin.
Behold. Bing is alive. Bing is Microsoft’s newest search engine (codename Kumo), replacing Live Search (Live.com redirects, and the search box at msn.com is now a Bing box). Microsoft is putting a big $80 million into branding this, probably a reaction to some of their previous branding/rebranding failures. But if this is more of the same, it’s not going to beat Google, despite all the branding. Sure, it can raise market share and improve ad revenue, but this needs to be a special search engine targeting a big “type” of searcher.
So is it? Not really.
First off, the results seem about the same as before. I don’t think they did much with the algorithm – if anything. My rankings all stayed the same. I still feel like I’m getting the same mainstream-to-junk site results ratio (my big complaint about Live was that the results were either really safe or really useless – very little in the way of fringe, valuable hidden sites). I assume that if this takes off, more time and money will be poured into the algorithm.
I do like the Web Groups. For certain queries, a left navigation is generated with different related categories. These categories also appear in the body of the results. Useful when the engine can’t determine a searcher’s intent. This is their attempt at giving you wider results (and actually giving you more listings per page). Google does this too on occasion, but not this well in my opinion.
I also like other components of the interface. I’m still surprised that Google is so plain and dull. Bing gives you more color, and uses the search engine result page real estate more efficiently. To the right of each result is a dynamic button (when you hover over a listing). This gives a summary of content by pulling HTML text from the site. I think this is useful once you get used to it. It’s also easy to ignore if you’re not interested.
A lot of the other stuff is very Google-like. The same old related searches, the same vertical results, and pretty much the same Live image results. Dig in and try it.