Forum Moderators: Robert Charlton & goodroi
I'm not arguing the point that duplicate content filters would improve search results. But the question is: is one actually in place? And should we be worried?
I have an affiliate website and sell about 200 widgets. I've built a brand new site, got links and PR, and imported the data feed. Now, I know this exact content is on another 50-500 sites. Do I...
1. Take a week off and rewrite new fresh content
2. Build a tool that replaces some words in my data feed. For example rather than "cool new widget" my site will say "funky widget" or "fresh looking widget". Hey it could work, couldn't it?
3. Do nothing and hope I slip through Google's net. Surely they have bigger fish to catch? How many Amazon affiliates are there?
I know no one knows for sure, but is DC detected at the page level, the paragraph level, or the sentence level? If you pick 5-7 words from your website, do a search, and find several other sites with the same sentence, is that DC? If so, is every directory built from the DMOZ dump a duplicate site?
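No one outside the search engines knows where the threshold sits, but duplicate detection is commonly described in terms of w-shingling: overlapping runs of w words compared as sets. A minimal sketch — the 5-word window and the Jaccard measure are illustrative assumptions, not Google's actual algorithm:

```python
def shingles(text, w=5):
    """Overlapping runs of w words, as a set (a 'shingle' per position)."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(max(1, len(words) - w + 1))}

def resemblance(a, b, w=5):
    """Jaccard similarity of the two shingle sets: 1.0 means identical text."""
    sa, sb = shingles(a, w), shingles(b, w)
    return len(sa & sb) / len(sa | sb) if sa and sb else 0.0
```

Note what this implies for the questions above: swapping one word only knocks out the handful of shingles that contain it, so a lightly edited data-feed page still scores as highly similar to the original.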
I'd like to hear any comments, thoughts or opinions.
D.
1. Take a week off and rewrite new fresh content
Sounds like hard work to me... not the way of a true affiliate :)
2. Build a tool that replaces some words in my data feed. For example rather than "cool new widget" my site will say "funky widget" or "fresh looking widget". Hey it could work, couldn't it?
Been there, tried it. The trouble is you often cannot change the advertiser's copy, or you have to change it so heavily to have any effect that you end up with too much work.
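For what it's worth, the word-swapper in option 2 is only a few lines — the synonym table here is made up for illustration, and you'd need one per niche:

```python
import random
import re

# Hypothetical synonym table -- purely illustrative, not a real word list
SYNONYMS = {
    "cool": ["funky", "fresh-looking", "slick"],
    "new": ["brand-new", "latest"],
}

def respin(text, synonyms=SYNONYMS, seed=None):
    """Replace listed words with a (seeded) random synonym; leave the rest alone."""
    rng = random.Random(seed)
    def swap(match):
        choices = synonyms.get(match.group(0).lower())
        return rng.choice(choices) if choices else match.group(0)
    return re.sub(r"[A-Za-z-]+", swap, text)
```

But as the reply says, this barely moves the needle: every phrase that doesn't contain a swapped word is still duplicated verbatim, and the heavier rewriting is where the work piles up.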
3. Do nothing and hope I slip through Google's net. Surely they have bigger fish to catch? How many Amazon affiliates are there?
I think it's a combination of flags that triggers the filter. There is no doubt, looking at the SERPs, that many get through. However, Google seems to pick a broad range of different occurrences rather than showing everyone; it's as if one site from each cluster of similar sites gets chosen. If you get the surrounding content just right, your 'affiliate text' does not get ignored. In other words, if there is enough supporting unique text to generate a unique mix of words overall, you get through. This is easier said than done.
Create a page, link to it normally, then link to it again with "?var=none". Even though var=none doesn't change the page, it will certainly lower your PageRank.
I use it to change style sheets, and it affects my position in the index.
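One way around that is to keep the style switch out of the URL entirely (a cookie, for instance), or to normalize outgoing links so only one URL per page ever gets crawled. A rough sketch — the parameter name "var" is just this example's, and you'd list your own cosmetic parameters:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical(url, drop_params=("var",)):
    """Strip cosmetic query parameters so crawlers see one URL per page."""
    parts = urlsplit(url)
    if not parts.query:
        return url
    kept = [p for p in parts.query.split("&")
            if p.split("=", 1)[0] not in drop_params]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       "&".join(kept), parts.fragment))
```

Run every internal link through something like this before it hits the page, and the ?var=none variant never gets a chance to be indexed.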
On one of my sites, I had been updating a series of pages to include a keyword in the URL (i.e. www.domain.com/widgets-item.html instead of just www.domain.com/item.html).
Well, in the midst of the transition, I ended up with one copy of a page named widget-item.html and another named widgets-item.html (the plural form). I forgot to delete the singular one for a long time, and the plural one, being linked from a high-ranking page, got indexed by Google.
Recently, I noticed that all versions of this page have been nixed from Google completely. It was the only one in its group that I had done that with, and the only one dropped.
My theory: duplicate content is a no-no with the goog-goog... :-)
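Forgotten copies like that are easy to catch with a quick hash scan of the site root — anything byte-identical under two names is a candidate for deletion or a redirect. A simple sketch:

```python
import hashlib
from pathlib import Path

def duplicate_pages(root):
    """Group .html files under root by content hash; any group of 2+ is a duplicate set."""
    seen = {}
    for path in Path(root).rglob("*.html"):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        seen.setdefault(digest, []).append(path)
    return {h: paths for h, paths in seen.items() if len(paths) > 1}
```

This only catches byte-identical copies; near-duplicates (same template, slightly different text) need a fuzzier comparison.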
I don't see the problem with affiliate sites in my own niches. What seems to happen is that some pages from each affiliate rank well and the rest don't. Some affiliates barely know how to install and run a script, and lack the knowledge to make their sites a little different or to apply SEO, so they are competing with others with an identical look and feel. I expect them to fare worse than those who can modify scripts to get a different result; hence the latter group does care about duplicate penalties.
I try to look more at improving the relevance of a page to a search term than at duplicate-penalty concerns. I don't know for sure, but I think Google aims to deliver relevant results more than it aims to suppress duplicate results. Looking at a typical product SERP, you will usually see a couple of results using one script, a couple using another, and lots of unique sites.
So I use ways to bring in other content, e.g. combining feeds from two non-competing merchants: camera books alongside actual cameras, or printers alongside ink refills. This makes the page more useful to humans.
I often wonder how much email spam gets sent to the email address no@no.no because of people filling in web forms with false information!