
Sharing a database for multiple sites

     
3:27 pm on Dec 9, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:May 22, 2002
posts:58
votes: 0


I have a large website (5 years old) that is fully indexed by Google.

I want to use this site's database to create new websites.

The new sites will have a different look, navigation structure, page titles, etc.

Are there any hazards with Google in doing this?

Is it OK for them to be on the same server / subnet?

Any tips or articles you can point me to that will assist in doing this?

Thanks in advance!

4:14 pm on Dec 9, 2004 (gmt 0)

New User

10+ Year Member

joined:Mar 15, 2004
posts:11
votes: 0


Well, I did the same thing with a PR5 site - because much of the content was the same (although targeting two different market segments), both sites got dumped because of it.

Just my experience...

4:28 pm on Dec 9, 2004 (gmt 0)

Senior Member

joined:Dec 29, 2003
posts:5428
votes: 0


"Well, I did the same thing with a PR5 site - because of much of the content being the same (although targeting two different market segments) both sites got dumped down because of it"

When did that happen, and did you recover? I assume you removed one site...

12:51 am on Dec 10, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 22, 2001
posts:781
votes: 0


My advice .. Don't do it!

If you really must 'data feed' out to your other sites - run a thesaurus replace against every description in your database, or be prepared to reap the whirlwind of your own complacency...
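Purely as an illustration of that thesaurus-replace idea, here is a minimal Python sketch; the SYNONYMS table and the sample description are made up for the example, and a real run would need a much larger word list:

import re

# Hypothetical thesaurus: each word maps to a preferred synonym.
SYNONYMS = {
    "large": "spacious",
    "cheap": "affordable",
    "beautiful": "stunning",
}

def thesaurus_replace(text):
    # Swap whole words only, preserving the casing of the original match.
    def swap(match):
        word = match.group(0)
        replacement = SYNONYMS.get(word.lower())
        if replacement is None:
            return word
        return replacement.capitalize() if word[0].isupper() else replacement
    return re.sub(r"[A-Za-z]+", swap, text)

print(thesaurus_replace("Large house with a beautiful garden, cheap to rent."))
# -> Spacious house with a stunning garden, affordable to rent.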

5:07 am on Dec 10, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 12, 2003
posts:69
votes: 0


I too have experienced "site dumpage" which *may or may not* be related to my habit of taking the easy road and recycling data from site to site.

Whether there is a direct cause/effect at work, I can't say for sure .. (other factors may be involved in site dumpage) .. anyway, I'm more cautious as a result.

Possibly there's an "affiliated network" sniffer at work at the Plex .. as I see so many sites that are Wikipedia clones (just one example) that seem to do pretty darn well with the same data.

7:49 am on Dec 10, 2004 (gmt 0)

Preferred Member

10+ Year Member

joined:July 5, 2002
posts:400
votes: 0


I am operating several sites which are fed from the same DB, and currently I do not have any problems.
Just be careful to make the sites different enough.
For example, structure your content in different ways.

greg
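One way to read greg's advice in code: render the same database rows through a different template per site, so the wording, field order, and surrounding structure differ even though the data is shared. A minimal hypothetical sketch (the record fields and both templates are invented for the example):

# One shared record, two site-specific renderings.
record = {"name": "Blue Widget", "price": "9.99", "blurb": "A handy widget."}

def render_site_a(r):
    # Site A: price-led and terse.
    return f"{r['name']} - only ${r['price']}. {r['blurb']}"

def render_site_b(r):
    # Site B: description-led, different wording and field order.
    return f"{r['blurb']} The {r['name']} is available here for ${r['price']}."

print(render_site_a(record))
print(render_site_b(record))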

9:27 am on Dec 10, 2004 (gmt 0)

Preferred Member

10+ Year Member

joined:July 17, 2003
posts:560
votes: 0


What about all the affiliate sites that occupy the top 10 results selling the same stuff using the same provided feeds?
That's often the same db, different sites - maybe it's because there is no association between these sites.

12:30 pm on Dec 10, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:May 22, 2002
posts:58
votes: 0


Thanks for the replies.

greg, are all your sites on the same subnet?

surfgatinho, exactly what I was thinking with all these affiliates using datafeeds (like CJ).
They do not seem to have any problems..

I am thinking the subnet might be checked if there are any signs of duplicate content.

For those replying with positive or negative effects from using the same datafeed for multiple sites, can you please post whether they were on the same subnet?

Thanks
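On the subnet question: two hosts share a class-C (/24) subnet when the first three octets of their IP addresses match. A quick way to test that with Python's standard ipaddress module, if anyone wants to check their own hosting (the addresses below are placeholders):

import ipaddress

def same_class_c(ip_a, ip_b):
    # Two addresses are on the same /24 if they normalize to the same network.
    net_a = ipaddress.ip_network(f"{ip_a}/24", strict=False)
    net_b = ipaddress.ip_network(f"{ip_b}/24", strict=False)
    return net_a == net_b

print(same_class_c("192.0.2.10", "192.0.2.200"))   # True - same /24
print(same_class_c("192.0.2.10", "198.51.100.7"))  # False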

3:31 pm on Dec 10, 2004 (gmt 0)

Full Member

10+ Year Member

joined:May 16, 2004
posts:218
votes: 0


djgreg (& others who may know) - just how different does the content need to be?

So if you swap the first and second paragraphs around - will that do?
If you change a word in each paragraph - will that be enough?
How about if you add an extra line at the top and/or bottom of the page?

2:59 am on Dec 11, 2004 (gmt 0)

Senior Member

joined:Dec 29, 2003
posts:5428
votes: 0


"So if you swap the first and second paragraphs around "
How is that helping? The robot still sees the same text on the same page. I wouldn't do it, and only someone who works for Google can tell you how they filter it.
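heini's point can be made concrete. Duplicate-content detection is often described as comparing overlapping word n-grams ("shingles"), and swapping two paragraphs only changes the few shingles that cross the paragraph boundary. A rough sketch assuming that kind of comparison (the shingle size and sample text are arbitrary; nobody outside Google knows the real filter):

def shingles(text, n=4):
    # Set of overlapping n-word sequences.
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    # Jaccard overlap between the two shingle sets.
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

para1 = "widgets come in many sizes and colours to suit every buyer"
para2 = "our store ships widgets worldwide with a full money back guarantee"

print(similarity(para1 + " " + para2, para2 + " " + para1))
# ~0.73 - most shingles survive the swap, and longer pages score even higher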
7:52 pm on Dec 11, 2004 (gmt 0)

Preferred Member

10+ Year Member

joined:July 5, 2002
posts:400
votes: 0


In fact I don't know how different websites have to be. You can easily figure out how different they should be by looking at Wikipedia and its copies on the web. It seems that it is enough to program some navigation and design around the wiki-content to be regarded as an individual website. Same thing with all the dmoz clones.

11:55 pm on Dec 11, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 22, 2001
posts:781
votes: 0


.. try for at least 30% different
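Taking that 30% as the rough rule of thumb it is, here is one hypothetical way to put a number on "percent different" before publishing. difflib's ratio is just one possible measure, and Google's actual similarity test is unknown:

import difflib

def percent_different(text_a, text_b):
    # 0 = identical, 100 = nothing in common, per difflib's ratio.
    ratio = difflib.SequenceMatcher(None, text_a, text_b).ratio()
    return (1 - ratio) * 100

a = "A large widget in blue, ideal for home use."
b = "This spacious blue widget is perfect around the house."
print(f"{percent_different(a, b):.0f}% different")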
 
