Google News Archive Forum

Sharing database for multiple sites
snook · msg:81675 · 3:27 pm on Dec 9, 2004 (gmt 0)

I have a large website (5 years old) that is fully indexed by Google.

I want to use this site's database to create new websites.

The new sites will have a different look, navigation structure, page titles, etc.

Are there any hazards with Google in doing this?

Is it OK for the sites to be on the same server / subnet?

Any tips or articles you can point me to that would assist in doing this?

Thanks in advance!

 

huulbaek · msg:81676 · 4:14 pm on Dec 9, 2004 (gmt 0)

Well, I did the same thing with a PR5 site. Because much of the content was the same (although targeting two different market segments), both sites got dumped because of it.

Just my experience...

walkman · msg:81677 · 4:28 pm on Dec 9, 2004 (gmt 0)

"Well, I did the same thing with a PR5 site - because of much of the content being the same (although targeting two different market segments) both sites got dumped down because of it"

when did that happene and did you recover? I assume you removed one site...

conor · msg:81678 · 12:51 am on Dec 10, 2004 (gmt 0)

My advice: don't do it!

If you really must "data feed" out to your other sites, run a thesaurus replace against every description in your database, or be prepared to reap the whirlwind of your own complacency...
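A minimal Python sketch of the thesaurus-replace idea; the synonym table here is hypothetical (a real one would need to be far larger), but it shows each description being rewritten before it goes out to the second site.

    import re

    # Hypothetical synonym table; grow it to cover your actual vocabulary.
    SYNONYMS = {
        "cheap": "affordable",
        "fast": "quick",
        "large": "spacious",
        "buy": "purchase",
    }

    def thesaurus_replace(text):
        """Swap known words for synonyms, preserving a leading capital."""
        def swap(match):
            word = match.group(0)
            repl = SYNONYMS.get(word.lower())
            if repl is None:
                return word
            return repl.capitalize() if word[0].isupper() else repl
        return re.sub(r"[A-Za-z]+", swap, text)

    print(thesaurus_replace("Buy this large, fast widget cheap."))
    # Purchase this spacious, quick widget affordable.

Run it over every description as you export to the new site's tables; the more entries in the table, the fewer identical phrases survive.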

dmedia · msg:81679 · 5:07 am on Dec 10, 2004 (gmt 0)

I too have experienced "site dumpage", which *may or may not* be related to my habit of taking the easy road and recycling data from site to site.

Whether there is a direct cause/effect at work, I can't say for sure (other factors may be involved in site dumpage)... anyway, I'm more cautious as a result.

Possibly there's an "affiliated network" sniffer at work at the Plex, as I see so many sites that are Wikipedia clones (just one example) that seem to do pretty darn well with the same data.

djgreg · msg:81680 · 7:49 am on Dec 10, 2004 (gmt 0)

I am operating several sites which are fed from the same DB, and currently I do not have any problems.
Just be careful to make the sites different enough.
For example, structure your content in different ways.

greg
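A minimal sketch of djgreg's suggestion, with a hypothetical record and two throwaway templates: the same row rendered through two deliberately different page structures.

    # One hypothetical database record shared by both sites.
    record = {
        "name": "Blue Widget",
        "price": "9.99",
        "summary": "A widget for everyday use.",
        "specs": "Weight: 200g. Colour: blue.",
    }

    # Each site orders and frames the same fields differently.
    SITE_A = "<h1>{name}</h1><p>{summary}</p><p>{specs}</p><p>Only {price}!</p>"
    SITE_B = "<h1>{name} from {price}</h1><ul><li>{specs}</li></ul><p>{summary}</p>"

    for template in (SITE_A, SITE_B):
        print(template.format(**record))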

surfgatinho · msg:81681 · 9:27 am on Dec 10, 2004 (gmt 0)

What about all the affiliate sites that occupy the top 10 results selling the same stuff using the same provided feeds?
That's often the same DB behind different sites. Maybe it's because there is no association between these sites.

snook · msg:81682 · 12:30 pm on Dec 10, 2004 (gmt 0)

Thanks for the replies.

greg, are all your sites on the same subnet?

surfgatinho, exactly what I was thinking with all these affiliates using datafeeds (like CJ).
They do not seem to have any problems...

I am thinking the subnet might be checked if there are any signs of duplicate content.

For those replying with positive or negative effects from using the same datafeed for multiple sites, can you please post whether they were on the same subnet?

Thanks

coburn · msg:81683 · 3:31 pm on Dec 10, 2004 (gmt 0)

djgreg (& others who may know) - just how different does the content need to be?

So if you swap the first and second paragraphs around - will that do?
If you change a word in each paragraph - will that be enough?
How about if you add an extra line at the top and or bottom of the page?
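Nobody outside Google knows what the filter actually measures, but one common way to put a number on overlap is word n-gram shingling with Jaccard similarity; a rough Python sketch, with made-up page text:

    def shingles(text, n=4):
        """All word n-grams in the text, lowercased."""
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    def jaccard(a, b):
        """1.0 means identical shingle sets, 0.0 means nothing shared."""
        sa, sb = shingles(a), shingles(b)
        if not sa or not sb:
            return 0.0
        return len(sa & sb) / len(sa | sb)

    page1 = "the quick brown fox jumps over the lazy dog near the river bank"
    page2 = "the quick brown fox jumps over the sleepy dog near the river bank"
    # One changed word breaks every shingle that covers it: 0.43 here.
    print(round(jaccard(page1, page2), 2))

On a full page, a single changed word per paragraph moves the score much less, so by a measure like this, one-word tweaks alone probably won't get you anywhere near "different enough".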

walkman · msg:81684 · 2:59 am on Dec 11, 2004 (gmt 0)

"So if you swap the first and second paragraphs around "
how is that helping? The robot still sees the text on the same page. I wouldn't do it, and only if someone works for Google can tell you how they filter it.
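A quick check of that point, assuming a filter that looks at word frequencies: swapping two paragraphs leaves the page's word counts identical.

    from collections import Counter

    para1 = "widgets come in many sizes and colours"
    para2 = "our widgets ship worldwide within two days"

    original = para1 + " " + para2
    swapped = para2 + " " + para1

    # The multiset of words is unchanged, so a frequency-based
    # comparison cannot tell the two orderings apart.
    print(Counter(original.split()) == Counter(swapped.split()))  # True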

djgreg · msg:81685 · 7:52 pm on Dec 11, 2004 (gmt 0)

In fact, I don't know how different websites have to be. You can get a feel for how different they should be by looking at Wikipedia and its copies on the web. It seems that it is enough to program some navigation and design around the wiki content to be regarded as an individual website. Same thing with all the DMOZ clones.

conor · msg:81686 · 11:55 pm on Dec 11, 2004 (gmt 0)

Try for at least 30% different.
