Forum Moderators: open
The theory is to create about 4 websites that promote my company's products, but each will have a different look and feel. The main reason for doing this is to cross-pollinate links (so each of the 4 websites has outgoing links to the other 3). The graphics and titles will be different, but I will probably recycle a lot of the content.
My question is: How different does the content need to be on each website? What if all 4 websites have the same text content on the majority of the pages? Will Google penalize for this? How different do I need to make each site?
Your advice is much appreciated!
I have just noticed that 2 of my competitors have created 5-6 duplicate websites, and they are both consistently in the top 5 in Google... They even have duplicate content on all of the websites.
I plan to use fresh content on a number of my sites (each site will have a theme that it concentrates on), but I will be recycling some content as well.
Anybody else have experience with this? Thanks!
The main reason for doing this is to cross-pollinate links
This is the key reason why it's not a good idea.
Google guidelines:
Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web as your own ranking may be affected adversely by those links.
Don't create multiple pages, subdomains, or domains with substantially duplicate content.
Across domains, the rule of thumb that I would use is that if the cross-links are less than 10% of the total inward links, there should not be a problem. That's a guess on my part, but I'm a programmer, so my mind probably works in a similar way to the programmers at Google.
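That rule of thumb could be sketched as a simple check (purely a hypothetical illustration of the ratio described above; the 10% threshold and the link counts are assumptions, not anything Google has published):

```python
# Hypothetical check for the 10% cross-link rule of thumb.
# The threshold and the example link counts are illustrative only.

def cross_link_ratio_ok(cross_links, total_inward_links, threshold=0.10):
    """Return True if cross-links between your own domains stay under
    the assumed safe fraction of all inbound links."""
    if total_inward_links == 0:
        return False  # no inbound links at all: nothing dilutes the cross-links
    return cross_links / total_inward_links < threshold

# 12 cross-links out of 200 inbound links is 6% -> under the threshold
print(cross_link_ratio_ok(12, 200))   # True
# 50 out of 200 is 25% -> over the threshold
print(cross_link_ratio_ok(50, 200))   # False
```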
Another way to look at it is this: links useful to your users should not be penalised by Google, but links designed solely to raise PR may be.
Anchor text is considered by many to be critical. I am sceptical about this, but use of anchor text should do no harm. I've recently added keywords to the anchor text of my internal cross-links but have seen no changes in SERPs yet (18 days into the test).
It's a theory on my part, but I believe that Google may be starting to look beyond anchor text at the linking pages. So links within relevant paragraph text may be useful. However, this theory seems to lack support amongst WW members.
Kaled.
This technique looks ugly.
He has a lot of backlinks from his own domains.
I don't know if the bot recognizes that, because the content changes every time the bot visits the page.
Where do I need to report that?
By the way, is it allowed?
Across domains, the rule of thumb that I would use is that if the cross-links are less than 10% of the total inward links, there should not be a problem
You can't be penalized by inbound links, only the linker could be penalized.
However my experience is that NOBODY will be penalized.
Where can I report spam?
It's a waste of time, but you asked for it:
[google.com...]
You can't be penalized by inbound links, only the linker could be penalized.
However my experience is that NOBODY will be penalized.
There was a discussion on this some months ago. I seem to recall one or two old hands admitting to having had sites banned by Google and believing it was because of huge amounts of cross-linking. Of course, such sites may well have broken other rules like duplicate content, hidden text, etc., so they may have jumped to the wrong conclusion.
Nevertheless, I imagine Google has filters to detect link clusters, and that if the ratio of cross-links (within a cluster of domains) to links from outside the cluster is too high, an alarm will be triggered requiring human inspection.
Kaled.