I want to see more proof.
There’s a time and place for theoretical marketing posts (including SEO); I’ve written my share. I still do. I’d say about half of my posts are philosophical. John-Henry Scherck called me “the prove it” guy, but I still welcome and value the philosophical posts. However, I dislike when posts suggest facts that haven’t been proven, or when they raise more questions than they answer. As content producers we need to be conscious of this. If we make a claim, or recommend a strategy or tactic, we’d better have some proof that it worked. Otherwise we could be misleading our readers. Do you have the cure to manual penalties? Do directories still have value? Is comment marketing worth doing? Prove it.
SEO has more unknowns than it’s had in a while. With dozens of new, major algorithm changes, we’re back in the dark in a lot of ways. In the days of old, we would argue things out on forums, backed by test results. Now I believe we’ve become accustomed to accepting things more easily.
Are We Still Testing?
We have more Googlers sharing information with us. That’s new. Matt Cutts, John Mueller, and a few Google forum boards are very helpful. But the nuggets we get are usually as ambiguous as anything written in the Webmaster Guidelines. Is this fluffy information answering most SEOs’ questions? Personally, I tend to find myself more confused, walking away with more questions I know Google will never answer.
So I test. A lot. I have a few website playgrounds. Many have gotten torched. I built them as a reaction to getting burned as a passive believer.
Remember PageRank sculpting with nofollows? For a while there, I remember every website talking about the right ways to do PageRank sculpting. They were treating the positive impact of the tactic as fact. SEOmoz had a few posts that served as the playbook for me. I loved it. I understood it perfectly and used it on many, many ecommerce websites, believing it was law. I spent my clients’ money on it. My mastery of it was something I was proud of, until Matt Cutts dropped a bombshell that PageRank sculpting with nofollows had stopped having impact about a year prior.
I’d been living a lie.
A lot of websites and SEOs had egg on their face. If we were really testing, as an industry we probably would have figured this out for ourselves. Regardless, this was a poignant moment in my career.
I don’t blame the curators – I’m glad they’re passing this stuff along so I can have it on my radar. I use Twitter more than I use my RSS reader. But I do hold the “producers of content” accountable.
Last week I watched a Whiteboard Friday about doing SEO on someone else’s website. Good concept, but I found myself asking questions:
“If you have positive press out there or if you’re going to start generating some and get it to rank well for your brand name, that’s even better than reputation management.” How? Why? Can you show me some examples?
“Remember Twitter, in particular, Google just loves to rank Twitter pages for brand names.” Can you show me? I haven’t seen this.
“I’ve seen SlideShare URLs ranking for all sorts of highly competitive phrases.” I haven’t – can you show me an example?
“If you’ve got a great link from a source, and especially if Google’s not crawling it or they haven’t crawled it yet or that link doesn’t appear to have had much impact, you might want to point some links at it to help that page gain some extra authority, particularly if it’s on a powerful domain, but you’re feeling like, man, it’s just not getting the credit, what I would normally expect it to provide to me, you can pump that page up.” Getting links is tough – can you convince me that this is worth my time? This could be an expensive and time consuming wild goose chase.
Granted – this was a video, and maybe isn’t the best vehicle for all of my questions, but this is the kind of thing that personally leaves me frustrated. I hate when movies do it (it destroyed the Star Wars prequels), and I really hate when our industry does it. Takes me right out of the moment.
It seems to me, as a whole, we’re apparently mostly on board with authorship being “huge”, and that “social signals are important”, but compared to the old days, there really isn’t any persuasive evidence out there that I’ve seen to make me stop the presses. Just a lot of fluffy blog posts and convention presentations. We have some guys, like AJ Kohn, who properly positioned authorship as a concept to be aware of, and guys like Bill Slawski who point us to patents that suggest it may come into play. But there are others who praise it as being a game changer without showing us why. We have to be careful with that. Remember how +1 clicks were going to improve rankings? How many posts and presentations said it had already started? Yeah, well, it never did.
This post isn’t a knock on any website or anyone in particular. As I said, I’m guilty of it too, but I now try to answer the questions I’m raising when I can. In this case for example, I couldn’t show the client’s pages, but I did show as much as I could to prove the case.
Articles With Proof Live Forever
I’m training an employee in link building. I immediately went to this post by James Agate, published in February. Thanks to Evernote, I have a list of posts that I want to remember because they’re rich in proof. That post by James has built the core of our outreach program, not because he made claims, but because he showed data. I don’t walk away with questions after a post like that.
If you come across a post that is leaving you unsatisfied, use the comments like we used to use forum boards. Do it for your industry.
I’ve commented on Twitter about how some old SEO tactics have become relevant again after the march of Penguins and Pandas. In some regards, the SEO we’ve been resorting to feels retro. That’s not necessarily a bad thing.
One old-school tactic that I’m having a lot of luck with again is dynamic local landing pages. For most, I suspect this is an SEO 101 type tip, but for others it might inspire some new campaigns.
Before you continue with this post, you should have a quick read of Google’s (intentionally) vague definition of Doorway Pages. This tactic is specifically mentioned. We’ll come back to this later…
Take a look at these screenshots. This isn’t my doing, but a good example of the local landing page tactic from my neighborhood. These custom local pages are getting pretty good placement for competitive terms. Same website, different targeted local landing pages.
What Are We Talking About?
Remember the days when it seemed like local queries pulled up loads of specific location-based local pages in the natural results? They were often thin pages with tons of duplicate content (compared to the site’s other location pages). There were also tons of footer links connecting to other dupe pages, in hopes of encouraging more crawling and spreading PageRank. There were several companies who sold a service of building these pages out and allowing you to host them in a directory or subdomain.
It got spammy.
But one day these pages started to fade in the SERPs; partially due to more Google Places listings pushing them down, but seemingly due to an algorithm change as well. At least, that was my impression. I abandoned the tactic of building these local pages.
A few months ago I was looking at some competitor results and started to see a lot of these pages again (my client is in a medium-aggressive niche, though one ripe with spam). I started taking notes. At about the same time I saw a note from John Mueller (from Google) answering a forum question about how much boilerplate text needs to be different to stand out and avoid duplicate content filters. His response (paraphrasing): “a few sentences should do it.”
Duplicate content has always been necessary on some sites, especially ecommerce, news sites, and dynamically generated location pages. Google has always recognized that sometimes duplicate content is a good user experience, but struggled with tuning their algorithm to adjust for it. They gave us tools, like the canonical tag, to help Google rank content properly (one of the few times they truly empowered SEOs). But it seems to me the algorithm is now in a state where it’s doing a reasonably good job of parsing duplicate content on its own.
With that hope, I created a couple old-school local landing pages by hand, and linked them off a folder called /local/ on my website. Sorry, I’d love to show you some specific examples, but it’s client work. Instead I’ll continue with the site I featured above.
I used Google Analytics’ keyword report to surface any locally based natural search keywords to inspire my first three local pages. In this folder were healthy Philadelphia, Houston, and Phoenix landing pages, beautifully optimized for all the terms I wanted to rank for, including useful content catered to the uniqueness of each region. This was content I knew my visitors would love. Yet 75% of the text was identical, including the title tags.
Below the fold, I linked these pages together like the screenshot above, but much less spammy. On the homepage of my website, I shot a local link to one of these pages. The DA of the website is decent, but I was immediately impressed by how well they ranked.
The Experiment Continues
With these three pages now pulling traffic, but still feeling a little spammy, I was able to optimize and “keyword wash” them a little better, until I had a go-forward template. From Salesforce I was able to pull a good list of cities that convert well for this business, and prioritize my remaining hundreds of local pages. With the help of my team, we had a few hundred built in relatively few hours. This time, instead of the homepage link pointing to one page, we created a hub local HTML sitemap. Every page I checked was indexed within a day.
It’s interesting to see this working again (it’s been years), but today I was working on a dynamic template that now pulls from a database of zip codes. In my database I have enough unique content to push the dupe content from 75% down to 25%, just to make it more penalty-proof and user-focused. I’ll have hundreds of these pages by the end of the week. This next step of care is going to make a bigger difference.
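To make the idea concrete, here’s a minimal sketch of how a dynamic local-page template like the one described above could work. Everything in it is an assumption for illustration (the boilerplate copy, the city fields, the `/local/` URL pattern, and the duplicate-content ratio check); it’s not the actual production template.

```python
# Hypothetical local landing page generator. The template text, city
# fields, and dupe-ratio threshold are illustrative assumptions only.

BOILERPLATE = (
    "We offer fast, friendly service with free estimates "
    "and a satisfaction guarantee on every job. "
)

def render_local_page(city):
    """Merge one city's unique details into the shared template."""
    unique = (
        f"Serving {city['name']}, {city['state']}: average wait time is "
        f"{city['wait_time']}. Call our local line at {city['phone']}. "
        f"{city['local_note']} "
    )
    body = unique + BOILERPLATE
    # Rough check on how much of the page is shared boilerplate, so
    # thin near-duplicates don't slip through into the /local/ folder.
    dupe_ratio = len(BOILERPLATE) / len(body)
    return {
        "url": f"/local/{city['slug']}/",
        "title": f"Widget Repair in {city['name']}, {city['state']}",
        "body": body,
        "dupe_ratio": round(dupe_ratio, 2),
    }

cities = [
    {"name": "Philadelphia", "state": "PA", "slug": "philadelphia-pa",
     "wait_time": "2 days", "phone": "215-555-0100",
     "local_note": "We sponsor the Manayunk arts festival."},
]

for page in map(render_local_page, cities):
    assert page["dupe_ratio"] <= 0.75, page["url"]
    print(page["url"], page["title"])
```

In practice the `cities` list would come from a database of zip codes and region-specific copy, and the ratio check is just one crude way to keep the unique content above the threshold you’re targeting.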
Results So Far
Now with almost 200 pages since May, it’s great watching the traffic come in. The local pages represent 22% of my total natural traffic in October. My natural search conversion rate is 23% higher for these pages than all my other organic keywords. I’m excited to grow this with more pages.
This Will All Die If…
Hopefully for a few of you this will be actionable, and might drive a new strategy. But I beg you: don’t spam this like we did before. I’m clearly admitting my first rollout above was a little spammy because it was really just about the keyword ranking. If a human reviewer or algorithm flagged this, they might knock it a bit for over-optimization. Based on the last 10 months, we have every reason to believe Google will come after it without prejudice (if it’s not already on the docket). Do this right, and make it valuable for the searchers. Because this tactic draws pretty specific queries, your conversion rate will likely be higher.
I’m confused. Isn’t this against Google guidelines?
Maybe. If your intent is to “manipulate search engines and deceive users by directing them to sites other than the one they selected, and that provide content solely for the benefit of search engines.” But what if your local pages are actually unique to the location? What if, while hoping to win in SEO, you’re also providing unique value for the targeted region? If you’re a service provider in Philadelphia, you could write something on your Philly page about the average wait time for Philadelphia service, add a unique phone number for Philly residents, or list other local resources that align with your offering. Suddenly a doorway page seems more valuable.
I don’t know of any page like this being Panda’d out; the popular definition of a doorway page is a page that deceives users (usually living on a microsite) and funnels traffic to a destination they didn’t originally want. I don’t condone spam, but I do urge you to draw your own conclusion and take care when implementing this tactic.
Here’s a quick link building (or link reclamation) tip for you. Google Webmaster Tools has really grown. Yeah, there’s still some squirrely reporting (like my impression count being exactly the same every day), but the Crawl Errors function is vital for anyone who adds and removes a lot of pages, or has switched sites and URLs.
A client of mine recently got a new website. More than a reskin, 98% of the URLs had changed (for the better). With Screaming Frog and some insight on what the URLs were going to be, I was able to whip together a good .htaccess file to use.
The new site has been live for a few months now, and despite thinking I had the 404 issue pretty covered, I logged into the Crawl Errors tool in Google Webmaster Tools.
I thought I had it under control. Clearly not. But Google makes it easier than ever to fix. Click the Not Found button, and take a look at the list of 404s it gives you.
Ideally you can clean these up with a couple sweeping server redirects. In my case I simply forgot to remove an old XML sitemap. But the beautiful thing is that each resulting page can be clicked for more information:
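For migrations like the one described above, a small script can turn an old-URL to new-URL mapping (for example, a Screaming Frog crawl export joined to the new site’s URL plan) into .htaccess rules. This is a hedged sketch: the CSV column names and file layout are assumptions, not the actual file I used.

```python
# Sketch: generate .htaccess 301 rules from an old/new URL mapping.
# CSV columns ("old_url", "new_url") are illustrative assumptions.
import csv
import io

def build_htaccess(mapping_csv):
    """Emit one 'Redirect 301' rule per old/new URL pair."""
    rules = []
    for row in csv.DictReader(io.StringIO(mapping_csv)):
        old, new = row["old_url"].strip(), row["new_url"].strip()
        if old and new and old != new:  # skip blanks and no-op rows
            rules.append(f"Redirect 301 {old} {new}")
    return "\n".join(rules)

sample = """old_url,new_url
/products.php?id=42,/products/blue-widget/
/about-us.html,/about/
"""
print(build_htaccess(sample))
```

Where whole sections of the site moved in a predictable pattern, a single RewriteRule with a regex is cleaner than hundreds of one-off lines; the per-URL approach above is for the leftovers that don’t fit a pattern.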
Are you of the video persuasion? Here’s a screencast of the tactic:
Doing big ecommerce for years meant I didn’t get much experience with local search. You may not know it, but (for example) the Toys R Us website and the brick and mortar stores aren’t really connected, which is (fortunately or unfortunately?) pretty common in enterprise ecommerce. Many big retailers who have an online presence put only a small amount of their funds and attention into the .com, typically resulting in silos.
Now in my latest role as a B2B marketer for a regional business, I was excited to dive into some local work. The problem is, I didn’t do a great job keeping up with this specialization. I needed to catch up. I didn’t totally understand the Venice update, and there were changes with the packs that I didn’t totally follow. I was more experienced with optimizing local pages in Google’s general search than for the more intuitive local packs.
At Mozcon 2012, Darren Shaw had one of the most useful presentations for me. I asked him to go to dinner (yes, I wasn’t afraid to ask for a date apparently) to pick his brain. He was meeting his family that night, but was kind enough to help me out following my Seattle visit via email. I’m a regular user of Whitespark now. There’s a great citation finder tool (with a positive SEObook review here), and they have services that dig way deeper than, say, Yext. Whitespark also teamed up with Citation Labs to create the darling Link Prospector. He’s humble about it, but Darren and Whitespark should be on your radar.
I asked him some questions and decided to share the answers – hopefully if you’re at the same level as I am with local search, this will be very useful to you too.
What are some ways local search can help drive qualified traffic that sites without a brick or mortar counterpart haven’t considered?
There are plenty of local service based businesses without physical offices. Appearing in the local pack listings can often drive more clicks than an organic listing, especially if you’ve taken the time to set up Google Authorship to make your listing stand out with a profile photo.
One major benefit to having a local listing is the reviews that potential customers can read to evaluate and select your business. A prominent local listing combined with plenty of positive reviews is a guaranteed business booster far beyond what you’d see with only a high organic ranking. People trust user reviews more than what you say about your services on your website.
Can you define the Venice update? Does Venice only affect Google’s local vertical (ie, the local packs), or does it also contribute to rankings in the regular results?
In a nutshell, Venice localized the organic results. Since Venice, if Google detects local intent in the search query, they’ll try to return locally relevant organic results in addition to the local pack. For an excellent, in-depth guide to the implications of Venice, check out this post from Mike Ramsey on SEOmoz’s blog.
What are some of the vital local search tactics, maybe compared to life before the Venice update?
The blended algo was already in place prior to Venice, but the organic factors (onsite & links) gained more weight.
The local search tactics we employ didn’t change post Venice. The core tactics remain:
- Local Google+ Page optimization (categories being the most important)
- Website optimization
- NAP (Name, Address, Phone Number) consistency (audit and clean up of existing citations)
- Citation building
- Review acquisition and reputation management (responding to reviews)
- Content development and link building
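The NAP-consistency item in the list above is fundamentally a normalization problem: two citations can refer to the same business while formatting the name, address, and phone differently. Here’s an illustrative sketch of how an audit might flag mismatches; the normalization rules are simplified assumptions, not Whitespark’s actual method.

```python
# Toy NAP (Name, Address, Phone) consistency check. The normalization
# rules below are deliberately simplistic, for illustration only.
import re

def normalize_nap(name, address, phone):
    """Reduce a citation to a comparable (name, address, digits) tuple."""
    norm = lambda s: re.sub(r"[^a-z0-9 ]", "", s.lower()).strip()
    digits = re.sub(r"\D", "", phone)[-10:]  # keep last 10 digits
    return (norm(name), norm(address), digits)

# The business's canonical record (hypothetical example data).
canonical = normalize_nap("Joe's Plumbing, LLC",
                          "123 Main St., Suite 4",
                          "(215) 555-0100")

# Citations as found on directories around the web.
citations = [
    ("Joes Plumbing LLC", "123 Main St Suite 4", "215-555-0100"),
    ("Joe's Plumbing",    "123 Main Street #4",  "215.555.0100"),
]

for cite in citations:
    status = "OK" if normalize_nap(*cite) == canonical else "MISMATCH"
    print(status, cite[0])
```

A real audit would also need to handle street-suffix synonyms ("St" vs "Street"), suite markers ("#4" vs "Suite 4"), and similar variants; the second citation above mismatches on exactly those, which is the kind of inconsistency the cleanup work targets.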
Does Google+ integration change tactics and strategies much?
Not much. There are two things that changed:
1) We now encourage our clients to be more active in Google+. I’m not convinced that social signals are providing any permanent ranking benefits at the moment (although we are seeing temporary boosts), but I figure it’s going to be valuable in the long term to have some social authority built up.
2) Businesses can review other businesses AS the business rather than an individual, so this opens up new ways of acquiring reviews by asking your business partners to review you. You review them, they review you, win-win.
What are some of your favorite ways to optimize for local search inside and outside of the packs?
– Tracking down and cleaning up inconsistent NAP data in your citations is a time consuming and frustrating task, but it can have a very positive impact once all the issues have been sorted out. We’re going to be launching a service for this soon.
– Getting a few very high quality, locally relevant, links can give a great boost to your rankings. Sponsorship opportunities at the local colleges are good for this. (this tip courtesy of David Mihm)
– We love citations from locally relevant and industry specific sites. You can use the Local Citation Finder to find them (see the how-to at the bottom of this post), or you can just hire our citation building service to do the hunting and submitting for you.
– Using the Link Prospector to find local guest post opportunities, and getting a citation as well as a link in the post has been working well for us. We also use it to find those high value, local, sponsorship opportunities that I mentioned above.
My Twitter is @billsebald, and I hope every one of you follow and communicate with me. Read on and find out why.
Sometimes you have to make your own luck. You don’t win the lottery if you don’t buy a ticket. You won’t win a marathon if you don’t get out of your chair. And you don’t make friends if you don’t communicate with people. We’re wired to grab at opportunities that seem obvious, but we don’t typically pause for serendipitous moments.
As SEO evolves, we find ourselves stretched thinner and thinner. There’s a lot of noise – it grows faster than the tools we create to carve through it. Our focus is rarely pinpoint, while our attention span needs to be wider. It can get scary and overwhelming. It’s the fright that drives a bigger swarm of rabid land-grabbers to the same obvious relationships. Whether you’re a link builder or in PR, you know you’re fighting in a mosh pit of like-minded peers after the same prize.
I love networking. Not necessarily through the traditional kind of awkward meet and greet, name-tagged, stuffy networking events. I’ve always liked digital networking. Since I can remember, semi-anonymous communications with people on Myspace, mIRC, chat rooms, Listservs, BBS, etc., were always more comfortable for me. Like most of us, I used to hide behind usernames before truly branding myself. I’m a social butterfly, though only on the web. You can imagine why I’m a Twitterholic.
I get many calls for consulting work. If I were consulting full time I wouldn’t be hurting for clients. Many are from old co-workers, old client referrals, current client referrals, and friends I’ve made on Twitter or LinkedIn. By being helpful, being generally kind, and not being afraid to give something away for free, I’ve seen returns. I’ve created great friendships just by chance communications on Twitter.
A relationship that sits above the business deal is huge. I know for a fact some major agency deals are made because of past relationships and current friendships (I’ve been in the room!!!). I’ve seen companies go through the whole RFP dog and pony show as part of procedure, when in actuality the vendor was already chosen based on prior relationships. Keep an eye out for luck and you have this: Serendipity > New Contacts > Nurture > Friendship > Opportunity. You define friendship.
Twitter is amazing for this. I respond to everyone who ever sends a note to me. It’s not that hard because I don’t have a Rand or Danny following and schedule (now that would be difficult!). I’ve blogged about relationship nurturing on Twitter, and how the SEO industry should maintain the practice of supporting each other without labels/levels/titles or any other ego. But I also think the same friendly quality should go to everyone you communicate with on Twitter and LinkedIn (or any other digital network), including those outside your industry. You’re creating more luck.
Here’s a recent case where the serendipity could have worked for someone in our industry. I was working lightly with a client who needed a specific function of SEO, something I just didn’t have the bandwidth to handle. Concurrently, I followed an SEO who occasionally tweets about this niche. I sent him a few tweets to feel him out. They weren’t, “hey – are you free to take this client?” It was more of me trying to jump into the conversation where I thought I could add value, and just see what kind of warmth I would get. I got no response, and was left out of the continuing conversation.
Another topic came up a few weeks later and I tried to add some color again with the same SEO. Still no response. Eventually, since I was still thinking about him for this opportunity, I sent a public tweet directly to him asking him a question related to his niche. Still no reply.
Takeaway: Perception Is Reality
There was a chance for this SEO to strike up a conversation with me, to where I probably would have DM’d him with the opportunity. For whatever reason, he didn’t take the chance of communicating, and I lost interest in him. Later when I was pruning my “following” list, I apparently made a semi-conscious decision to cut him. Now he’s completely off my radar.
I don’t know if he’s looking for work or not, but it’s still a missed opportunity. And I have the perception of him as a “not so warm and fuzzy” guy because he didn’t get back to me. True or not, perception is reality. This is where some people say, “it was never meant to be.” That statement drives me crazy. Of course it’s not meant to be if you don’t nurture serendipity.
I was looking for information on creating a firepit in my yard. I thought a homemade firepit might be fun to build, so I hit Google. I found an article on a website that I wouldn’t normally visit, but coincidentally it was in a niche my client serves. While reading the article (and enjoying the warm tone of the blogger), I decided to write her a note telling her I liked the article, asking a follow up question, and then giving a subtle link pitch. We had about 3 emails back and forth before the link pitch was reintroduced. Not only did I get a link, but I got a glowing review, completely unprompted. I also found out she has some other sites I was interested in, and that she and I grew up in the same town. I added her on LinkedIn, and sure enough, got a surprise SEO referral from her 2 weeks later. All because I squeezed everything I could out of a firepit post.
Takeaway: Take Time To Learn What A Person Has To Offer
When you come up to someone’s front door with a vacuum cleaner in hand, you look like a vacuum cleaner salesman. The door won’t open. Understandably serendipity isn’t scalable, but you’ll get things out of it that the other land-grabbers are probably not getting. Once luck hits, I like to romance the connection.
In 1998 I started hanging out in a local record shop. The owner wanted to take his music shop online (which back then mainly meant selling through eBay), so I offered to help him out for a couple bucks while I was in college. One of our customers wanted a direct connection to get first dibs when new CDs came in. I didn’t mind sending him emails when something I knew he liked came in (I could have blown him off). I did this for years, and we started having great musical discussions through his Prodigy email address. It turned out he worked for Atlantic Records, and started getting me backstage passes to shows when they came through Philadelphia. With all the access to rock stars, I got inspired to interview them and post it online. Two years later I had an online music magazine, amazing experiences, and was introduced to search engine optimization. I wasn’t seeking any of this initially.
Takeaway: Good Will For All
My SEO career started by chance because I was a music fan, and was willing to look into an opportunity instead of sitting on my ass. I took chances, tried things without worry that I wouldn’t like it, didn’t sit around thinking too hard about everything, and just positioned myself for opportunities. By putting myself out there and doing favors, it paid off and led me down a path I’m incredibly thankful for.
Hopefully this gives you something to think about while we all do this SEO thing together.
This is me. Daniel E. “Rudy” Ruettiger. I look a lot like Mikey from the Goonies.
This is Google:
The other day the clouds opened, and the mighty hand of Google left a note in my Google Webmaster account. It was the rumored “Manual spam action revoked” email. As @armondhammer put it on Twitter, “That’s like getting a presidential pardon, Google style.”
For those who like recovery stories, here’s how I figured mine out. Like Rudy, I didn’t give up. I had a huge mountain of uncharted trails ahead of me. And I, well, I also got lucky as hell.
I have a lot of sites, but only one got spanked back in March. I always want to be trying everything in SEO; most of my sites were clean, some were a touch dirtier. The niche I was battling in had (and still has) an abundance of spammers. Somewhat familiar brands were using forum spamming, paid linking, link wheels – you name it. They were pounding the big box retailers on head terms. Although I didn’t get too sucked into the vortex, I did ultimately give in to the urge to fight fire with fire. I participated in some blog link networks to level the playing field. I went gray.
This was the post that woke me up: Unnatural Link Warnings and Blog Networks from SEOmoz. I heard rumblings of the blog link networks getting sacked (including Authority Link Network). I knew a lot of posts were being deindexed and the junk links were being severed, but that’s the risk you take when you break Google’s commandments. Historically, the worst thing that could happen is Google would devalue those links from perceived bad neighborhoods. They wouldn’t actually penalize the website. But thanks to that SEOmoz post, my confidence was rattled. I remember getting home from work and reading this post 30 times in a sweat. I can still picture Carson Ward’s smiling profile picture.
Thanks to Carson’s post, I learned about the “unnatural links” warning that Google started sending out in Webmaster Central. Up until then, I rarely went into GWT. But sure enough, I logged in, and there it was. It might as well have been written with a neon font and Myspace-style glitter .gifs – it couldn’t have been more sickening. It felt like a busted high school party – the cops were outside, and everyone was dashing to make sure they weren’t the unlucky schmuck who got nabbed. I instantly went to Build My Rank and chose the remove live posts option that BMR was kind enough to offer, and hoped my error would fade into obscurity.
What Was I Thinking?
A colleague serendipitously turned me onto Build My Rank. It was cheap (when cheap actually worked), and was an efficacious defense against my spamming competitors. I had already been writing original content for guest postings; in my mind this was merely a more automated extension of that. I sensed the risk but never really thought Google was going to use them as a rally point, let alone make them into a Panda poster child. Of all the things Google had to clean up (and ultimately got with Penguin), low PR blog link networks should have been prioritized later in my opinion. But it was like crack – the rankings went up for nearly every keyword I targeted using BMR. I kept pushing my secret drug. The more the service started to feel dirtier, the more blind I made myself.
[box title=”Build My Rank” color=”#000000″]Build My Rank allowed the user to pay a “per article” fee on top of the monthly subscription. The writers (who I believe were in-house – not sure if that’s true) weren’t very good, but BMR also let you write your own unique content. They’d prohibit your article if it didn’t meet their uniqueness and quality standards (though the rules seemed to be lax for their own authors). This was their way of justifying to their audience that they were Google-proof. Clearly that didn’t work out so well for them.[/box]
So, while this network was getting caned with bamboo, my targeted rankings plummeted. I didn’t know if it was because I cut all these links out of my link profile, or because I was being penalized. There was a lot of confusion at this point, and very few details from Google. They kind of let us, well, sweat.
I sent in my first (of many) reinclusion requests. I was honest. I told them about the crack I’d been smoking. I also told them I’d removed the posts and I wouldn’t disappoint them again (I’ve kept my word). My thought was this request would really go to nobody, but while months went by (as did several Panda updates, and a Penguin) I slowly started to see my rankings return. I was also now doing nothing other than clean, G-approved SEO. I had a reputable news company helping with legitimate content marketing. I worked with them to make sure the pieces were informative, unique, question-answering content. They did internal linking, and studied the analytics to look for other content marketing opportunities.
It was about this time I saw virtually all my rankings return, except about 6 of my major converting keywords (all synonyms and plurals of each other). Those were my big terms. In this website’s niche there isn’t a lot of long-tail, so I was still a wounded SEO. Meanwhile I was now getting new, fuzzy WMT messages: “Site violates Google’s quality guidelines,” with notes like “look for possibly artificial or unnatural links pointing to your site.” Wonderful. Is this sort of the same issue spoken a different way? Was it something else? It appeared this didn’t have anything to do with Build My Rank anymore, but how could I be sure? This looked like problems with my external links (ie, backlinks from other sites). The blogosphere generally seemed to think so, so I went with it.
I pulled an OSE link report and saw a lot of spam – much of which was there before I started with this client, though some was new. A link wheel was pointing to me, started in August 2011 (according to the posting dates in the posts’ meta data). Now, I admitted I wasn’t squeaky clean, but this wasn’t my doing. This was a huge sloppy footprint that I found in minutes. I assumed the Penguin algorithm could find it just as easily. It targeted only one keyword – my industry’s biggest head term. That can’t be good, but Google wouldn’t let negative SEO work, right? I promptly sent this discovery to Google in yet another reinclusion request.
This is where Google ultimately let me down. They seem more interested in tackling the webspam they helped promote with PageRank. There would be casualties, including many more innocent than I was. There wasn’t anything in OSE or the links reported in GWT that looked too bad except this link wheel. Does that mean the other spam links were ignored? Never found? I think it was June/July when I finally jumped into the “negative SEO works” camp, and ate my decade-long Google fanboy hat for breakfast. For a company that wants to be transparent, this brick wall causes real problems for generally helpful SEOs.
I Started To Feel Like Dr. Richard Kimble
I made a mistake, was in the wrong place at the wrong time, thinking that I was still “kinda” doing what wasn’t explicitly called out as bad by Google. I fell into a bad crowd. Now I was in a shitstorm that I couldn’t explain, fix, or understand. I had to buy a Remove’em package to basically send Google a spreadsheet saying I’d tried to contact every shit website that was linking to me. 5% of the results from that tool had a contact associated with them, and I heard back from 1% of the recipients I emailed. Still, I sent this in yet another reinclusion request with the note, “I tried.” This was – and still is – absolutely absurd.
It was at this point we learned that these were manual penalties, and I was at the mercy of a Googler who just didn’t like me. Yes – I did take it personally. Who the hell was this manual hand editor? Why couldn’t I win his heart? This reinclusion request was rejected as well. I was still a fugitive.
My Last Reinclusion Request
At this point I had given up. I was sick of hearing tips from people who never claimed to have come back from a manual penalty themselves (many of whom seemed to be confusing it with Penguin). It was chaos in the streets. A month had passed since my last failure. I had no more changes to make. So I drafted one last reinclusion request, even though I hadn’t done any more cleanup. I had nothing left to do.
Dear Google,
I am truly sorry our relationship had to end like this. I should not have cheated on your Webmaster Guidelines. Call it a momentary lapse of judgment, but it’s all gone too far. You tell me my backlinks are poisonous, but I did not create the ones you are now showing me in my Webmaster account. I truly don’t know how to remove them. I wasn’t trying to hurt you and your users. I do not want to torch my site, because it really is a valuable resource for searchers. I hope one day we can be friends. Call me.
But luckily I had another idea before I hit send. I started to think about “over-optimization”. Though I didn’t believe I was caught in a Penguin filter, I was manually flagged nonetheless – it still could have been a Penguin-type, on-site, over-optimization crime. Since the webmaster message they send is obviously canned, and there are quite a number of things a webmaster can do that are “wrong”, maybe I could try not taking the message so literally. Maybe it wasn’t about “links to my site” as in external links; maybe it was over-optimization within my own site. Maybe I wasn’t reading between the blurry lines Google has always been known for.
I started looking through the content marketing articles the news company (mentioned earlier) had put on the site. They used internal links between the articles and the top-level pages as an SEO best practice. I started to realize that at some point the anchor text had gotten very similar – in fact, it began centering on my 6 core keywords. The more of their articles I read, the more the penalty trigger seemed obvious. Look for possibly artificial or unnatural links pointing to your site. Well, these looked artificial and unnatural, and they were pointing to my site (even though they lived within my own domain). The intent of the links was to pass PageRank, deepen crawls, and yes, help certain keyword rankings. Maybe Google only recognized the third intention? I had nothing to lose – I removed these links from 80 posts and sent the reinclusion request.
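For what it’s worth, the pattern I eventually eyeballed across 80 posts is easy to spot programmatically. Here’s a minimal sketch (the anchor texts and URLs are made up for illustration, not from the real site) that tallies internal link anchors and flags any that dominate the distribution:

```python
from collections import Counter

# Hypothetical sample of (anchor text, target URL) pairs, as you might
# collect them from a crawl of your own article section.
internal_links = [
    ("blue widgets", "/widgets/blue"),
    ("blue widgets", "/widgets/blue"),
    ("buy blue widgets", "/widgets/blue"),
    ("blue widgets", "/widgets/blue"),
    ("our methodology", "/about"),
]

anchors = Counter(text for text, _ in internal_links)
total = sum(anchors.values())
for text, count in anchors.most_common():
    share = count / total
    # The 50% threshold is an arbitrary rule of thumb, not anything Google publishes.
    flag = "  <-- suspiciously concentrated" if share > 0.5 else ""
    print(f"{text}: {count} ({share:.0%}){flag}")
```

On this toy data, “blue widgets” carries 60% of all internal anchors and gets flagged – exactly the kind of keyword-centered clustering I found in those articles.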
Admitting My Mistake
All of this was pretty humbling. I made a mistake that set off a chain of events I didn’t expect but should have foreseen. I know Google. I know how vague they are in their guidelines. I know how the search product is always full of surprises, both good and silly. Every SEO makes mistakes – we’re in a field where very little is textbook. I secretly know a few big-name SEOs who (in confidence) have shared similar stories. I’m ashamed that I didn’t see it earlier, but I took my eye off my tactics. I’m saddened that Google took such a hard line with me while blatant spammers still exist and dominate. But there’s something to be said about “doing your time.” I truly think I gained some good experience in a new world order. I also believe that Panda and Penguin – which now appear long overdue, and not the “reckless moves” I used to consider them – are some of the smartest filters Google could have put in. They’re taking a risk with the casualties to bank on better results by the end of the year. I mean, as a business built around algorithmically serving the best webpages, how could they not get more aggressive (and include humans, Mahalo style)? It really was just a matter of time.
If you like recovery stories, a good one was just posted on YOUmoz.
Bloggers and content marketers get writer’s block.
Luckily, if you mine Google Analytics, inspiration is right around the corner.
We know a few things –
- We want to write about things people are searching for and interested in
- We want to write about things people like to share (create some advocacy)
- We want to write something fresh
Market research, you say? We already have that, at a cursory level.
Keyword Reports
This is the obvious one. Pull up your search keyword reports (grit your teeth at [not provided] and move on), and look for keywords that may have brought some long-tail traffic.
According to this, one of the engines thinks I already have some relevance for “the difference between Google and Bing”. Now I’m inspired. I don’t really have an article like this, so maybe I can spend some time thinking about what my take on this would be.
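If you’d rather not scroll through the report by hand, the same idea works on a CSV export. Here’s a quick sketch – the column names, sample queries, and thresholds are all my own assumptions, not anything Google Analytics guarantees – that surfaces long-tail queries you rank for only incidentally:

```python
import csv
import io

# Stand-in for a GA search keyword export. In practice you'd open the
# downloaded CSV file instead of this inline string.
export = io.StringIO(
    "Keyword,Visits\n"
    "(not provided),5000\n"
    "seo tools,120\n"
    "the difference between google and bing,3\n"
    "how to recover from a manual penalty,2\n"
)

rows = csv.DictReader(export)
# Long-tail candidates: 4+ words and only a trickle of visits -- topics
# you already have a whiff of relevance for but no dedicated post about.
ideas = [
    row["Keyword"]
    for row in rows
    if row["Keyword"] != "(not provided)"
    and len(row["Keyword"].split()) >= 4
    and int(row["Visits"]) < 10
]
print(ideas)
```

Each keyword this spits out is a prompt for the editorial calendar: you already rank for it a little, so a real article on the topic has a head start.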
Social Reports
In May I blogged about Google Analytics’ new social reporting features. If you haven’t gotten into these reports, check them out (or read my post). I find myself in here a lot. How do you know what people are interested in? They’ll tell you by sharing and clicking.
Below is a snapshot of Twitter visits:
I did a blog post about lessons learned through unfollowing people on Twitter. SEOmoz picked it up in their Top 10 and drove a ton of traffic, which is a sign right there that people are interested in Twitter topics. On days when the SEOmoz influence wasn’t directly present, I was able to click around in this report and see that it had been tweeted 40 times since its posting. More inspiration that people liked the topic, right? Well, maybe – though Twitter sent it 187 visits, it had a low Average Visit Duration. I don’t know about you, but I can’t read an article in 36 seconds. Something about this article didn’t appeal to most of the people who reached it through a Twitter link.
However, a more recent article called Search Marketing Content vs Digital PR didn’t get the share-heat that the Twitter article did, but its average visit time was over 3 minutes. I’m inspired – I have some more perspectives on search content writing.
Time On Site
As mentioned above, I use time on site as an indicator that someone is actually reading my stuff. As a writer, that’s my goal (along with funneling readers toward conversions). By clicking Content > Site Content > All Pages, you can sort by visits and duration.
This is based on all traffic. With this view there’s a little more redemption for my Twitter article. The Average Time On Page is up. I don’t segment my different digital channels, but if I did and wrote for one channel only, this would be useful. Audiences of different channels have different habits based on the medium they used to find you – it’s always fascinating to me, especially how different it can be in eCommerce.
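The sorting itself is trivial if you export the report. Here’s a sketch – the paths and figures are illustrative stand-ins shaped like the numbers above (187 visits, 36-second reads), not real analytics data – showing the ranking logic:

```python
# Hypothetical All Pages export: (page, visits, avg time on page in seconds).
pages = [
    ("/blog/unfollowing-on-twitter", 187, 36),
    ("/blog/search-content-vs-digital-pr", 60, 195),
    ("/blog/editorial-calendars", 45, 140),
]

# Sort by average time on page, descending: surfaces what people actually
# read, rather than what merely got clicked.
by_engagement = sorted(pages, key=lambda p: p[2], reverse=True)
for page, visits, seconds in by_engagement:
    print(f"{page}: {visits} visits, {seconds // 60}m{seconds % 60:02d}s avg")
```

Notice how the high-traffic Twitter post drops to the bottom once you rank by engagement instead of visits – the same redemption-and-reckoning story the report above tells.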
That’s All Folks
Nice and easy, and tends to give me enough inspiration to kick off a brainstorming session and fill my editorial calendar (which I do hope you’re using). If you’re interested, here’s the editorial calendar template I use. I leave mine in Google Docs so I can quickly pull it up, jot a couple of ideas down, and save for when I’m ready to write an actual post.
Make sure you check out Anthony Nelson’s comment below – that’s a great tip as well.
I’ve been on Twitter since 2007. I’m certifiably addicted, but I’ve never kept my main feed organized. It was too much work after I let it all pile on. My Twitter was getting fat.
Years ago Twitter was asked, “What are you?” Twitter’s answer: “Whatever you want us to be!” Some turned it into a prospecting tool, an RSS feed, a toy, a chat room, a customer service tool, a spamming tool, a stalking/trolling tool, or a brand management channel. I realized I never really turned it into anything. It’s like a tornado of people, and I just spiraled around in it without any real habitual use. But one thing I never did was look through my raw Twitter feed. I use TweetDeck for Chrome, and had completely removed the main Twitter feed.
I was sweeping my mess under the rug. I’m usually very organized, probably due to a little OCD. My Twitter usage did not reflect that. Sure, I relied on lists, but I didn’t build them out nearly enough. I was missing other good things in my main feed that didn’t get automatically filed.
I decided to break off my “relationships” with 3,000 people. I did it by hand using Tweepi. It didn’t give me the sense of power I’d hoped for. Most of the mutual followers didn’t realize I existed (just like high school), but for some reason I was still in a relationship with them. I certainly expected to lose a ton of followers (assuming many of them were only following me as long as I was following them, though with TweetAttacks vanishing, maybe that was less likely). In a week I’ve lost only a few hundred.
For some tweeters, it was hard to say goodbye to icons I’d gotten familiar with. I’m not kidding. Removing everyone manually, I tried to remember the good times. Some were big brands that followed me back, or big Twitter celebrities. Yes – I said goodbye to the Zappos CEO. I was impressed 5 years ago when he followed me, but we’ve never spoken (plus he’s apparently seeing 369,000 others). I dropped virtually all the brands I was following. I dropped SEMs and social specialists if we never communicated, or if they never responded – with the exception of a few who were real thought leaders or good friends.
Here were some of my criteria:
1. If we haven’t had a conversation in 2 years, and your content doesn’t really excite me, I broke up with you.
2. If you don’t respond to me, and you’re not a top provider/curator of content, I dumped you.
3. If your icon was a hot woman, but your name was George, I let you go.
4. If you were shirtless in your icon (and you’re a guy), you were severed.
5. If you have a Z in your name where there should be an S, I dumped you on principle.
6. If your icon was an egg, dumped.
7. If you haven’t tweeted in over 3 months and I didn’t know you personally, I cut you loose.
8. If your icon was an animated .gif, gone.
9. If you were an obvious bot, I asked myself how I ever followed you, then gave you the boot.
10. If you retweet really dumb things, I buried you.
11. If you appear to follow everyone who follows you (like I used to, which is how I got into this mess), you’re toast.
12. Abusive use of the underscore.
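Half-joking, but the purge really was just a filter. If I’d scripted it instead of clicking through Tweepi, it might have looked like this sketch – the user fields, thresholds, and sample data are all invented; a real version would pull profiles from the Twitter API:

```python
from datetime import datetime, timedelta

NOW = datetime(2012, 7, 1)

def should_unfollow(user):
    """Encodes a few of the rules above as predicates (simplified)."""
    stale_convo = user["last_conversation"] < NOW - timedelta(days=730)  # ~rule 1
    egg_icon = user["icon"] == "egg"                                     # rule 6
    gone_quiet = (user["last_tweet"] < NOW - timedelta(days=90)
                  and not user["known_personally"])                      # rule 7
    return stale_convo or egg_icon or gone_quiet or user["is_bot"]       # rule 9

friend = {"last_conversation": NOW, "icon": "photo", "last_tweet": NOW,
          "known_personally": True, "is_bot": False}
egg = dict(friend, icon="egg")
print(should_unfollow(friend), should_unfollow(egg))  # False True
```

The point of writing it this way: every cut was a yes/no rule, not a mood – which made 3,000 decisions survivable.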
What Did I Learn?
For me, I realized that I was doing Twitter wrong. I want SEO industry content and some laughs with my friends. I want to be on the pulse of what’s important through the lens of the people I enjoy and respect. I meant no disrespect to the people I cut – I’m sure there are lots of great people among them, but the connection was never made. I want all my mutual connections to be real connections, more like my LinkedIn. Now I’m following far fewer users, and I’ve put my raw stream back into my grid.
It’s been a pleasure. And it’s controllable.
Why Should You Follow Me If I Won’t Follow You Back?
Maybe you shouldn’t, especially if you haven’t stopped to figure out what Twitter should be for you. Granted, my tweets/retweets are 50% relevant to SEOs, with the other 50% being hilarious, but if you’re not into that type of thing, why follow me? I’m also very responsive on Twitter – I respond to everyone, so if you like a good conversation, strike one up with me. That’s another good reason to follow me. If I agree that we’re “hitting it off” I’ll probably follow you back.
But why does Twitter need to be a mutual connection?
My Admission – I Was A Twitter Hoarder
How did I let it get this way? In the beginning I had some bad habits. I followed everyone who followed me using a tool (whose name I forget). I also did a lot of following people from lists (instead of just following the lists themselves). I followed a lot of people that others I admired were following. I did this blindly, assuming I’d be able to find a few favorites after a few weeks of watching tweets. #badplan
I also used to do consulting, and thought of Twitter as a real business prospecting tool. I semi-consciously thought a high follower count could be seen as clout. The problem was, although I had an auto-DM, I didn’t nurture any of the contacts. I was a complete Twitter hack for 3 years. I only got bitten by the bug and really started to understand its value in the last couple of years.
Twitter has introduced me to great people. I’m excited for Mozcon in a couple weeks to meet people I speak with on Twitter. I’ll learn something there, but suspect much of it will be through conversations and networking due to the relationships I’ve made on Twitter. That’s really pretty huge.
A bit of a rant here. At the risk of putting my credibility on the line, I can honestly say I don’t know for certain how to get better rankings post-Penguin and post-Panda. I know others secretly say this as well. It was easy there for a while – I was almost willing to slap a guarantee sticker on my services. But the flux right now is completely ridiculous. The Google Dance is bad. I have no doubt we’ll figure it out soon, but in the meantime…
I think Google needs an intervention. I just want to shake them and scream, “help me help you!”
How many posts do you see now titled “How to recover from / beat / game Google post-Penguin or post-Panda”? Did you ever read one? The popular ones are rehashes of the traditional ways of clean SEO. Last night I met some great people from Microsite Masters (at the Philadelphia SEO Grail) who claimed they’ve seen sites recover from Penguin, and after they explained how, it sure wasn’t by the fluff these other posts have been feeding us. They went into detail. I believed them. I sure don’t believe the generic, recycled drivel my Twitter feed is being bombarded with. I’ve actually recovered from some Penguin stuff myself – again, not in the way people would expect if all you read is SEO “content for content’s sake”.
As an industry, a lot of us are guilty of patting ourselves on the back, playing the ego game, writing for content’s sake, and not being transparent. This is an industry born in the trenches, with new students appearing every day. The vets know how to look past the “characters,” but it’s far from evident to the newbies. I saw a great character last night at my SEO event; many of us in the room enjoyed watching him jump around, but we weren’t going to let his vocal misinformation take hold with the newbies in the room. It was a good feeling of unity, and I think that’s what’s really required from SEOs to keep growing. Now’s your chance to put your stake in the ground and have a persona; if it doesn’t have value, it won’t last.
I urge you to start writing content that actually is either 1) actionable, 2) a strong opinion, or 3) proven to some degree. Teach your readers (kids) the things you’ve done to recover, not what you think the answer is. At least be transparent with them and say you don’t know, or you haven’t recovered yet. Do your part not to litter the SEO stream.
This goes for presenters as well.
Thank you for listening. I feel better now.