

Multiple Websites

Multiple sites with similar content

         

willardnesss

4:09 pm on Oct 30, 2003 (gmt 0)

10+ Year Member



Howdy, I'm considering creating multiple websites with very similar content for my company.

The theory is to create about 4 websites that promote my company's products, but each will have a different look and feel. The main reason for doing this is to cross-pollinate links (so all 4 websites have outgoing links to all 4 of my other websites). The graphics and titles will be different, but I will probably recycle a lot of the content.

My question is: How different does the content need to be on each website? What if all 4 websites have the same text content on the majority of the pages? Will Google penalize for this? How different do I need to make each site?

Your advice is much appreciated!

John_Caius

5:10 pm on Oct 30, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



These kinds of techniques to artificially build PR are pretty old-hat and commonly penalised - why not build one domain with quality content?

willardnesss

5:23 pm on Oct 30, 2003 (gmt 0)

10+ Year Member



I have a lot of great content, and I am in the top 10-15 for most of my search terms (with 131,000 Google results).

I have just noticed that 2 of my competitors have created 5-6 duplicate websites, and they are both consistently in the top 5 in Google... They even have duplicate content on all of the websites.

I plan to use fresh content on a number of my sites (each site will have a theme that it concentrates on), but I will be recycling some content as well.

Anybody else have experience with this? Thanks!

John_Caius

5:38 pm on Oct 30, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The main reason for doing this is to cross-pollinate links

This is the key reason why it's not a good idea.

Google guidelines:

Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web as your own ranking may be affected adversely by those links.

Don't create multiple pages, subdomains, or domains with substantially duplicate content.

kaled

6:02 pm on Oct 30, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I recently reported to Google three sites with identical content (but different appearance). No action yet but, as I mentioned in my email, ATW shows only one of the three sites. The site contents are good and I would not wish to see all three banned but only one is necessary.

Kaled.

Small Website Guy

9:55 pm on Oct 30, 2003 (gmt 0)

10+ Year Member



If you have multiple websites and domains, each with DIFFERENT content, is it OK to interlink them?

kaled

12:48 am on Oct 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Within a domain, cross-linking is generally considered OK without reservations.

Across domains, the rule of thumb that I would use is that if the cross-links are less than 10% of the total inward links, there should not be a problem. That's a guess on my part, but I'm a programmer, so my mind probably works in a similar way to the programmers at Google.
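That 10% rule of thumb is just arithmetic, but a quick sketch makes it concrete. Note that the function name, the example link counts, and the 10% threshold itself are all illustrative assumptions from the post above, not anything Google has documented:

```python
def cross_link_ratio_ok(cross_links: int, total_inbound: int,
                        threshold: float = 0.10) -> bool:
    """Rough heuristic: treat a cluster of your own domains as risky
    if cross-links between them make up too large a share of all
    inbound links. The 10% threshold is a guess, not a known rule."""
    if total_inbound == 0:
        return False  # no inbound links at all: nothing to evaluate
    return cross_links / total_inbound < threshold

# 30 cross-links out of 400 total inbound links is 7.5% -- under the bar
print(cross_link_ratio_ok(30, 400))   # True
# 80 out of 400 is 20% -- over the bar
print(cross_link_ratio_ok(80, 400))   # False
```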

Another way to look at it is this. Links useful to your users should not be penalised by Google, but links designed solely to raise PR may be.

Anchor text is considered by many to be critical. I am sceptical about this, but use of anchor text should do no harm. I've recently added keywords to the anchor text of my internal cross-links but have seen no changes in SERPS yet (18 days into the test).

It's a theory on my part, but I believe that Google may be starting to look beyond anchor text at the linking pages. So links within relevant paragraph text may be useful. However, this theory seems to lack support amongst WW members.

Kaled.

Josecito

2:18 am on Oct 31, 2003 (gmt 0)

10+ Year Member



Where can I report spam? A guy has something like 500 domains with the same content (using PHP, so the page changes on every visit).

This technique looks ugly.
He has a lot of backlinks from his own domains.

I don't know if the bot recognizes that, since every time the bot visits a page the content changes.

Where do I report that?

By the way, is that allowed?

synergy

2:31 am on Oct 31, 2003 (gmt 0)

10+ Year Member



I think that in the future, one large quality site will be more of a dominating force than 2 or 3 small sites would. Especially once Google starts catching up with these crosslinking sites.

ogletree

2:35 am on Oct 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There is no reason to do this for PR. One site is better for Google. Sometimes it is good to make different sites for visitors, but not for Google: you will spread out the value of what you have, and you will have to work harder to get 4 sites to rank well than you would to get 1 site to rank well.

plasma

2:50 am on Oct 31, 2003 (gmt 0)

10+ Year Member



Across domains, the rule of thumb that I would use is that if the cross-links are less than 10% of the total inward links there should not be a problem

You can't be penalized by inbound links, only the linker could be penalized.
However, my experience is that NOBODY will be penalized.

where i can report spam?

It's a waste of time, but you asked for it:

[google.com...]

Josecito

3:08 am on Oct 31, 2003 (gmt 0)

10+ Year Member



Why do you say that it's a waste of time?

The guy has the top 10 results for something like 500 searches that I do :S

plasma

4:07 am on Oct 31, 2003 (gmt 0)

10+ Year Member



Why do you say that it's a waste of time?

My experience is that it won't be dealt with.

Josecito

6:55 am on Oct 31, 2003 (gmt 0)

10+ Year Member



So I need to spam Google to get my sites ranked high?

Eh?

If GoogleGuy doesn't take care of it, I'll start spamming too. My competition has 3000+ inbound links, all fake.
Damn.

kaled

11:40 am on Oct 31, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You can't be penalized by inbound links, only the linker could be penalized.
However my experience is that NOBODY will be penalized.

There was a discussion on this some months ago. I seem to recall one or two old-hands admitting to having had sites banned by Google and believed it was because of huge amounts of cross-linking. Of course, such sites may well have broken other rules like duplicate content, hidden text, etc. so they may have jumped to the wrong conclusion.

Nevertheless, I imagine Google has filters to detect link clusters, and that if the ratio of cross-links (within a cluster of domains) to links from outside the cluster is too high, an alarm will be triggered requiring human inspection.

Kaled.