| 4:14 pm on Dec 9, 2004 (gmt 0)|
Well, I did the same thing with a PR5 site. Because much of the content was the same (although targeting two different market segments), both sites got dropped for it.
Just my experience...
| 4:28 pm on Dec 9, 2004 (gmt 0)|
"Well, I did the same thing with a PR5 site. Because much of the content was the same (although targeting two different market segments), both sites got dropped for it."
When did that happen, and did you recover? I assume you removed one of the sites...
| 12:51 am on Dec 10, 2004 (gmt 0)|
My advice .. don't do it!
If you really must 'data feed' out to your other sites, run a thesaurus replace against every description in your database, or be prepared to reap the whirlwind of your own complacency...
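Purely as an illustration, a crude thesaurus replace could be scripted like this. The synonym map and the capitalization handling here are my own made-up example, not anything Google-specific, and a real run should keep a backup of the original descriptions:

```python
import re

# Hypothetical hand-built synonym map -- a real thesaurus would be far larger.
SYNONYMS = {
    "cheap": "inexpensive",
    "big": "large",
    "buy": "purchase",
    "fast": "quick",
}

def thesaurus_replace(text):
    """Swap each known word for its synonym, preserving simple capitalization."""
    def swap(match):
        word = match.group(0)
        repl = SYNONYMS.get(word.lower())
        if repl is None:
            return word
        return repl.capitalize() if word[0].isupper() else repl
    return re.sub(r"[A-Za-z]+", swap, text)

print(thesaurus_replace("Buy this big widget cheap"))
# -> "Purchase this large widget inexpensive"
```

Whether a word-for-word swap is actually enough to look "different" to a duplicate filter is another question entirely.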
| 5:07 am on Dec 10, 2004 (gmt 0)|
I too have experienced "site dumpage" which *may or may not* be related to my habit of taking the easy road and recycling data from site to site.
Whether there is a direct cause/effect at work, I can't say for sure .. (other factors may be involved in site dumpage) .. anyway, I'm more cautious as a result.
Possibly there's an "affiliated network" sniffer at work at the Plex .. as I see so many sites that are wikipedia clones (just one example) that seem to do pretty darn well with the same data.
| 7:49 am on Dec 10, 2004 (gmt 0)|
I am operating several sites that are fed from the same DB, and currently I do not have any problems.
Just be careful to make the sites different enough.
For example structure your content in different ways.
| 9:27 am on Dec 10, 2004 (gmt 0)|
What about all the affiliate sites that occupy the top 10 results selling the same stuff using the same provided feeds?
That's often the same db, different sites - maybe it's because there is no association between these sites.
| 12:30 pm on Dec 10, 2004 (gmt 0)|
thanks for the replies
greg, are all sites on the same subnet?
surfgatinho, exactly what I was thinking with all these affiliates using datafeeds (like CJ).
They do not seem to have any problems..
I am thinking the subnet might be checked if there are any signs of duplicate content.
For those replying with positive or negative effects from using the same datafeed for multiple sites, can you please post whether they were on the same subnet?
| 3:31 pm on Dec 10, 2004 (gmt 0)|
djgreg (& others who may know) - just how different does the content need to be?
So if you swap the first and second paragraphs around - will that do?
If you change a word in each paragraph - will that be enough?
How about if you add an extra line at the top and/or bottom of the page?
| 2:59 am on Dec 11, 2004 (gmt 0)|
"So if you swap the first and second paragraphs around "
How is that helping? The robot still sees the same text on the page. I wouldn't rely on it - only someone who works for Google could tell you how they filter it.
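That matches what any simple similarity measure would show: reordering paragraphs leaves the page's vocabulary untouched, so a bag-of-words comparison still scores the two versions as identical. A quick sketch (the sample text is made up, and nobody outside Google knows what measure they actually use):

```python
def word_set(text):
    """All distinct lowercase words on a page."""
    return set(text.lower().split())

def jaccard(a, b):
    """Set overlap: 1.0 means identical vocabulary, 0.0 means nothing shared."""
    return len(a & b) / len(a | b)

para1 = "widgets are sold here at great prices"
para2 = "our shipping is quick and reliable"

original = para1 + " " + para2
swapped = para2 + " " + para1   # paragraphs reordered

print(jaccard(word_set(original), word_set(swapped)))  # -> 1.0
```

A shingle-based (word n-gram) comparison would drop only slightly, since just the shingles crossing the paragraph boundary change.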
| 7:52 pm on Dec 11, 2004 (gmt 0)|
In fact I don't know how different websites have to be. You can easily figure out how different they should be by looking at wikipedia and its copies on the web. It seems that it is enough to program some navigation and design around the wiki content to be regarded as an individual website. Same thing with all the dmoz clones.
| 11:55 pm on Dec 11, 2004 (gmt 0)|
.. try for at least 30% different
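For what it's worth, if you want to eyeball a rough "30% different" figure between two pages, Python's standard `difflib` gives a word-level similarity ratio. This is just one possible yardstick (the sample page text is invented), not Google's actual filter:

```python
from difflib import SequenceMatcher

def percent_different(a, b):
    """Rough word-level difference: 0 = identical, 100 = nothing shared."""
    ratio = SequenceMatcher(None, a.split(), b.split()).ratio()
    return round((1 - ratio) * 100)

page_a = "blue widgets for sale cheap fast shipping to anywhere"
page_b = "quality blue widgets at low prices with quick delivery worldwide"

print(percent_different(page_a, page_a))  # identical pages -> 0
print(percent_different(page_a, page_b))  # rewritten page -> well over 30
```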