I just watched a video that Michael Gray put up today on his blog about how to optimize your WordPress blog. The video was mostly informative and insightful, and since I use WordPress, it definitely gave me some things to consider. Mostly he talked about making sure that Googlebot won’t read duplicates of your posts by disallowing the robots from your archive files. This way, he says, the engines will only see your content in the one main place you put it. Which makes sense. Except that I have three contentions with this idea.
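For reference, the kind of robots.txt rules he’s describing would look something like this. This is just a sketch: the exact paths depend on your permalink settings, and the archive paths below (category, author, date) are my assumptions about a default WordPress setup, not anything from the video itself.

```
# Hypothetical robots.txt sketch for blocking WordPress archive pages
# (paths assume default WordPress permalinks -- adjust for your own setup)
User-agent: Googlebot
Disallow: /category/
Disallow: /author/
Disallow: /2006/
```

Each `Disallow` line keeps the named path (and everything under it) out of Googlebot’s crawl, so only the single-post pages would get indexed.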
1. Everyone else, everywhere, is saying not to worry about duplicate content. Situations like these are going to make duplicate content inevitable no matter what. Duplicate content is something that happens naturally, and Googlebot knows that.
2. The whole idea behind archives and categories is to increase usability on a website. It is extremely useful to be able to sort or search posts by date, category, author, etc. It doesn’t make any sense to worry that our sites are too accessible. Aren’t most of us battling to make sites more user friendly and navigable by humans and spiders alike?
3. We should not have Google dictating this kind of stuff to us. We should be dictating how we want THEM to function. IMHO it’s one thing to optimize a site so that it is readable by robots by using text and rich content, building links, and so on. But nit-picky stuff like this should not be mandated just because Googlebot might get confused! Google has one of the biggest, best, and most expensive development teams in the world; they can surely figure this out.
So, mostly I like what Graywolf has to say. I read his blog regularly, and I’m for what he is doing in the “greyhat” area of SEO. But stuff like this brings up the feelings noted in #3 above. Let’s all use good technique as we develop. But let’s also develop from the bottom up, and not worry about who’s sitting on top. -kenny