25 Feb 11 10:34 am
There's been some discussion covering this in this post:
post63491.html?hilit=duplicate%20content#p63491

In that post James reports that he has gotten sites ranking with just a keyword change like you're suggesting. The only caveat he mentions is that it needs to be on multiple sites, so I guess your theory would hold. However, if you've got a good spinner, I would just give the article a spin since it takes like a minute or less per article. Here's why I think that's a good idea:
From what I gather, basically your theory is based on this statement you made:
the reason that Google does not use a content filter is that the content is still unique for the particular keyword that is calling up the content in response to a search engine search summons for a particular keyword
First of all, this is an assumption as far as I know. It's an assumption based on results like those James has reported, but it's an assumption nonetheless, since no one aside from Google's engineers really knows the algorithms they use for filtering. Still, for the sake of argument, we'll consider it true until other reported results prove otherwise.
Even so, I would still advise spinning the content, because by ONLY changing the keyword you are assuming that no one else is gunning for that keyword using the same article. If I were another AJP member, for example, and I chose the same niche as you, armed with the same articles, what are the chances of me wanting to rank for the same keywords? I would think they're reasonably high. After all, most of us pick keywords based on more or less the same criteria, right? So with two or more identical articles it once again becomes a battle of links, which is kinda back to square one. That would be risk number one. (Note: I guess some would not consider this a risk; after all, it just reinforces that the content doesn't matter much and what's important is the backlinks.)
Secondly, Google is constantly trying to improve its algorithm to rank the original source content higher in the search rankings. In other words, they are trying to clamp down on plagiarized content and auto blogs. With this in mind, if your keyword density is the recommended 5-7%, that means your article is potentially 93-95% duplicate content. Here's a test: run your article through a free plagiarism checker like Plagium and see if other instances of the article turn up. If it shows up there, I'd venture to say Google would probably get at least that result too, if not better. Now, I have no idea how Google determines which is the source article, but if yours is one of the many that pop up in this scenario, who knows what would happen? Doesn't seem like a worthwhile risk to me.
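To make that arithmetic concrete, here's a toy sketch of the idea. To be clear, nobody outside Google knows how they actually detect duplicates; this is just a crude word-for-word comparison, and the sample sentences are made up:

[code]
# Crude illustration of the keyword-density arithmetic: if only the keyword
# changes, the fraction of unchanged words stays very high. This is a toy
# word-for-word measure, NOT how Google actually detects duplicates.

def unchanged_fraction(original, swapped):
    a, b = original.lower().split(), swapped.lower().split()
    same = sum(x == y for x, y in zip(a, b))
    return same / max(len(a), len(b))

original = "simple tips for training your new puppy at home every day"
swapped  = "simple tips for training your new kitten at home every day"
print(f"{unchanged_fraction(original, swapped):.0%} of words unchanged")  # ~91%
[/code]

Scale that up to a full article and a keyword-only swap leaves the overwhelming majority of the text identical, which is exactly the kind of thing a duplicate filter would be looking for.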
Anyway, what I've said is conjecture at this point, so it is very much my personal view on this topic. In terms of REAL results, I've had articles of about 30-50% spin quality rank well (like page 2 or the bottom of page 1) for a while. Normally, though, I would aim for a minimum of 50% for the reasons stated above.
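For reference, by "spin quality" I mean roughly the percentage of the article that ends up different from the source. Spinner tools each calculate uniqueness their own way, so treat the following as a back-of-envelope approximation only:

[code]
# Back-of-envelope "spin quality": the fraction of words that differ from
# the source article. Spinner tools each define uniqueness differently;
# this difflib-based measure is just one rough approximation.
from difflib import SequenceMatcher

def spin_quality(original, spun):
    words_a, words_b = original.lower().split(), spun.lower().split()
    matcher = SequenceMatcher(None, words_a, words_b)
    return 1.0 - matcher.ratio()  # fraction of the text that changed

original = "the quick brown fox jumps over the lazy dog"
spun = "the speedy brown fox leaps over the sleepy dog"
print(f"{spin_quality(original, spun):.0%} spun")  # 3 of 9 words changed -> ~33%
[/code]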
Just for the record, Mark also recommends rewriting the articles, although I can't remember his exact reasons. I believe he said the only exception is if you're using PPC to get traffic.
Oh, I also want to note that I do know of people who don't care whatsoever about duplicate content and still rank well and make money with their sites, so I guess it's really up to you to decide what to believe. :) I do it just for my own peace of mind (plus it doesn't take much effort on my part) and so that readers get at least partially, if not fully, original content.
Joshua