Chances are the disavow.txt file you sent to Google is outdated. As Google continues to crawl and roll out Penguin updates, it's discovering links that were missed on the first go-round. It's also important to realize that the link data tools you used to build your disavow file are typically much slower at finding links than Google is.
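One way to keep a disavow file current is to periodically diff it against a fresh link export. Here's a minimal sketch of that comparison; the filenames, domains, and rule lines are hypothetical examples, not from any particular link tool.

```python
# Sketch: find linking domains in a fresh link export that are not yet
# covered by the "domain:" rules in an existing disavow.txt.

def disavowed_domains(disavow_lines):
    """Extract domains from 'domain:' rules, ignoring comments and URL rules."""
    domains = set()
    for line in disavow_lines:
        line = line.strip()
        if line.startswith("domain:"):
            domains.add(line[len("domain:"):].strip().lower())
    return domains

def missing_domains(disavow_lines, fresh_domains):
    """Domains seen in the new link export but absent from the disavow file."""
    covered = disavowed_domains(disavow_lines)
    return sorted(d.lower() for d in fresh_domains if d.lower() not in covered)

if __name__ == "__main__":
    old_disavow = [
        "# links disavowed after the first Penguin hit",
        "domain:spammy-directory.example",
        "http://bad-blog.example/widget-page",
    ]
    new_links = {"spammy-directory.example", "fresh-spam.example"}
    print(missing_domains(old_disavow, new_links))  # ['fresh-spam.example']
```

Anything the script flags is a candidate for review before you upload an updated file, since not every new link deserves disavowing.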
In the middle of last month, Apple released its latest mobile operating system update. While iOS 9 boasts a number of features designed to improve usability, some takeaways are shaking things up for site owners and Internet marketing folks. Let's dive into each area a little bit more.
Before Google Analytics blew up, I worked mostly in Omniture. It had some pretty advanced dashboard and bookmark features, and I find myself missing that easy functionality in Google Analytics. But behold, it's not unavailable… it's just not necessarily an out-of-the-box item in Google Analytics.
When you’re prospecting for hours and hours, you can only use site modifiers for so long. Let social media help you with your work – rather than distract you. Friend or Follow is a site that tells you information about your personal Twitter account, specifically by “giving you the ability to quickly and easily sort, filter, follow, and unfollow your contacts.” Basically, it’s a way to see who has unfollowed you on Twitter, Instagram or Tumblr. That person you went to college with and are only following out of guilt? The coworker from your last job who you despised? Friend or Follow gives you a guilt-free out to unfollow those people who are clogging up your feed.
Over the last few years, SEOs have been talking about relationship building as a concept. We’re often reminded how link building is becoming digital PR. And with that, I’ve been pitched more and more by big brands looking for a collaboration of sorts, using a different spin. Note: This is not new by any stretch, but I’m writing about it because it seems to be a current trend.
These days, most clients and prospects I talk to believe they've run out of things to write about. One inexhaustible option is to update (and improve upon) their existing assets and stale “evergreen” pieces. Yes—evergreen pieces do go stale. But this post is about new ideas in a modern age. The days of “one landing page per keyword” are SEO history. The emerging best practice for SEO content—in text form—is (again) long-form, holistic copy. Searchmetrics has told us this for years. It's not a 1:1—it's not that Google inherently thinks long copy is better to serve. The correlation is likely related to Hummingbird, Google's improved comprehension, and good old-fashioned keywords and synonyms. Or maybe it has to do with the content's improved power to convert (and lower the bounce rate). At the end of the day in 2014, the bigger the net you cast, the more likely you are to capture Google's attention and trust. At SMX East, Searchmetrics' Marcus Tober gave a great presentation providing more context. I don't know about you, but I didn't really need a study. I've been seeing it myself.
Necessity is the mother of invention. Many years ago, one of our clients bought a popular, content-rich website and redirected it to their current domain. SEO (and retaining the backlinks) was not on their radar at the time. Upon learning about the migration, we asked whether they had redirected the site at the page level or simply pointed everything to their own homepage. The client had no idea how the redirection was done, and they didn't have a redirect list (a list of the old, legacy URLs) to work from. We needed to invent a plan to gather up the data.
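Once you've gathered a list of legacy URLs, the page-level-vs-homepage question can be answered by crawling each one and looking at where it lands. Here's a minimal sketch of that classification step; the domains and URL map are hypothetical, and in practice you'd first fetch each old URL (e.g., with `requests`, following redirects) to get its final resolved location.

```python
from urllib.parse import urlsplit

def redirect_type(final_url):
    """'homepage' if the redirect collapses to the site root, else 'page-level'."""
    path = urlsplit(final_url).path
    return "homepage" if path in ("", "/") else "page-level"

def audit(redirect_map):
    """Summarize a {old_url: final_resolved_url} map from a crawl of legacy URLs."""
    return {old: redirect_type(final) for old, final in redirect_map.items()}

if __name__ == "__main__":
    # Hypothetical results of following each legacy URL's redirect chain.
    crawled = {
        "http://old-site.example/widgets/blue": "https://client.example/widgets/blue",
        "http://old-site.example/about": "https://client.example/",
    }
    for old, kind in audit(crawled).items():
        print(f"{old} -> {kind}")
```

If most legacy URLs classify as "homepage", the migration likely threw away the page-level link equity the client paid for.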
In the past, I enjoyed republishing content to sites like Business 2 Community and Yahoo Small Advisor. I also experimented with reposting on platforms like LinkedIn. Typically (as expected), my content would rank quickly upon publishing on my own site. Then the much higher-DA sites would republish shortly thereafter. As you might expect, Google would start ranking those domains above mine. But within a few weeks, Google would suppress those big domains and my post would remain the victor for years to come (without the use of cross-domain canonical tags). Why? Because it's the canonical article. It makes sense. Google can figure that out. I wrote about this experience in 2013. I was fine with this situation. I enjoyed the traffic and visibility I would get from these other sites. It's simply smart marketing. But lately I've been noticing an inconsistent change. Unfortunately, I'm finding Google isn't performing as expected in most cases these days. For me, it's been working less than half the time.
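Since Google no longer reliably picks the original on its own, one safeguard is to ask republishers to include a cross-domain `<link rel="canonical">` pointing back at your post, and then spot-check that they actually did. Here's a minimal sketch of that check using only the standard library; the URLs are hypothetical, and in practice you'd fetch each republished page's HTML first.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Capture the href of the first <link rel="canonical"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            a = dict(attrs)
            if (a.get("rel") or "").lower() == "canonical":
                self.canonical = a.get("href")

def find_canonical(html):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

if __name__ == "__main__":
    page = '<html><head><link rel="canonical" href="https://myblog.example/post"></head></html>'
    print(find_canonical(page))  # https://myblog.example/post
```

If `find_canonical` returns `None` or a URL on the republisher's own domain, the syndicated copy is competing with your original rather than crediting it.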
I’ve heard the best time to write is very early in the morning, when you’re still in sleep mode. It may help with creativity or in developing concepts. It might even help you spend less mental energy (who couldn’t use more battery life?). Not to mention, the only likely distraction is roosters, though that’s only a problem for marketers working on farms. For our SEO clients, I often write my titles after the piece is written, but I never go into a content piece without a purpose. And it’s not just a fluffy idea, but one I can qualify as valuable.
We recently had a client launch a new site in WordPress. It was appropriately in a staging area before launch. Instinctively, upon hearing the news of the launch, we decided to look for a robots tag. Sure enough, every page was marked as “noindex, nofollow”. The client was able to make the change before Google crawled the new site. Above I said “instinctively” because, well, this isn’t the first time I’ve seen a site launch set to block search engines. It’s probably not even the 50th. I worked on an eCommerce platform where many sites launched with this issue. WordPress – as fine a platform as it is – makes it super easy to launch set to noindex. Since developers often build sites in staging areas, they’re wise to block bots from inadvertently discovering their playground. But in the hustle to push live an update or new design, they can forget a tiny (yet crucial) check box. I’ve gathered up three different ways you can monitor your clients’ sites, or even your own, without the use of server logs or an education in server administration. There are different kinds of website monitoring (e.g., active, passive), but I’m keeping it simple and applicable for anyone. I wanted to pick a few that were diverse, free, or affordable.
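If you'd rather roll your own check than use a monitoring service, the robots meta tag is easy to inspect with the standard library alone. Here's a minimal sketch; in practice you'd fetch the homepage HTML on a schedule (e.g., a cron job using `requests`) and alert when the result flips to noindexed. Note this only covers the meta tag, not the `X-Robots-Tag` HTTP header, which deserves its own check.

```python
from html.parser import HTMLParser

class RobotsMetaCheck(HTMLParser):
    """Collect the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []
    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = {k.lower(): (v or "") for k, v in attrs}
            if a.get("name", "").lower() == "robots":
                self.directives.append(a.get("content", "").lower())

def is_noindexed(html):
    """True if any robots meta tag on the page contains a noindex directive."""
    parser = RobotsMetaCheck()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

if __name__ == "__main__":
    staging = '<head><meta name="robots" content="noindex, nofollow"></head>'
    live = '<head><meta name="robots" content="index, follow"></head>'
    print(is_noindexed(staging))  # True
    print(is_noindexed(live))     # False
```

Run it against the production URL right after any deploy, and the forgotten checkbox gets caught before Google crawls the site.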