Monthly Archives: September 2009
Posted by randfish
If you’ve been around the SEO world a while, you’ve undoubtedly heard the old adage:
Do what’s right for users and engines will reward you with higher rankings
Along with its peer:
SEO tactics that focus on engines, rather than users, are manipulative (black/gray hat) and will eventually be discounted or penalized
In my opinion, both of these statements are utterly false and tragically misleading. SEO starts with the user (of course), but it cannot ignore the incredible importance of search-engine-targeted (and engine-specific) tactics. When I first considered the issue, I thought that perhaps, years ago, these opinions were more accurate than they are today. However, after visualizing the issue, I discovered even that isn’t true:
(please note: graphs like this are, as always, just my personal opinion)
The value of tactics from each set has risen and fallen over time, leading me to the conclusion that this was never good advice. And yet, thinking back, I’m almost sure that at some point, at conferences and in interactions with clients, I personally repeated this misconception. I want to issue an apology for that now and set the record straight – SEO is a task that requires paying close attention to the needs of both users and engines. You can’t be an effective SEO without it.
Just think of all the specific tasks we perform that we’d never do if it weren’t for search engines:
- Title tags: We might still write them, but would we agonize over keyword usage and positioning, uniqueness, and flow? I doubt it.
- Meta tags: Nope. No reason to even bother.
- XML Sitemaps: I’m pretty sure no human has ever visited this file in an attempt to sort out the pages on your site.
- Webmaster Tools Registration: Without engines, there wouldn’t be any.
- Keyword Research: I think this practice would be more like advertising copy – think Mad Men.
- Keyword Targeting: Why worry about keyword placement for anything other than conversion rate optimization?
- URL Canonicalization: No need – visitors are getting the content either way.
- Accessible Link Structures: So long as you’re not worried about the >2% of visitors who can’t see Flash, go ahead and build rich applications to your heart’s content.
- Robots.txt & Meta Robots: No engines, no reason to direct engines.
- Link Building: Unless it’s specifically to draw in relevant traffic, why bother?
- Creating Vertical Search Feeds: That’s going to be time wasted.
- Information Architecture: While there are good reasons to do some of this for users, a significant portion of the accessibility and link hierarchy arguments is made moot.
- Rel="Nofollow": Internally or externally, it becomes a pointless attribute.
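To make the point concrete, here’s a minimal (hypothetical) sketch of what several of the engine-only tactics above look like in markup – the URLs, keywords, and values are invented for illustration, and without crawlers every one of these elements would be pointless:

```html
<head>
  <!-- Title tag: keyword placement and uniqueness matter chiefly to engines -->
  <title>Blue Widgets | Example Store</title>
  <!-- Meta description: written for the search result snippet, not page visitors -->
  <meta name="description" content="Shop durable blue widgets with free shipping.">
  <!-- Meta robots: exists solely to direct crawler indexing behavior -->
  <meta name="robots" content="noindex, follow">
  <!-- Canonicalization: tells engines which URL variant should receive credit -->
  <link rel="canonical" href="https://www.example.com/blue-widgets">
</head>
<body>
  <!-- rel="nofollow": an attribute consumed only by engines -->
  <a href="https://example.com/untrusted" rel="nofollow">a link</a>
</body>
```

No human visitor ever sees or benefits from the canonical tag, the robots directive, or the nofollow attribute – they exist purely to communicate with search engines.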
I think the problem with the classic "build for users" advice is that it sounds so compelling and, on a surface level, makes a lot of sense. Maybe this is a good warning not to adhere to any advice just because it seems logical on its face – knowledge and expertise may not make for simple messaging, but, outside of politics, accuracy is far more valuable than fitting into a sound bite.
UPDATE: A lot of folks in the comments are under the impression that I’m recommending against building for users – nothing could be further from the truth. Websites are made for people, and users should absolutely be the focus of your efforts. My argument in this post centers specifically on the practice of search engine optimization and the idea that tactics which are engine-focused (like XML sitemaps, anchor text, link architecture, webmaster tools usage, etc.) can be ignored because they’re not "for the user." The charts and points above are intended to illustrate that if you only focus on "user-targeted SEO," you’re missing a huge chunk of the potential SEO opportunity pie.