
Cloaking? with multiple domains

11:34 am on Apr 29, 2004 (gmt 0)

New User

10+ Year Member

joined:Apr 29, 2004
votes: 0


My company has experienced some problems getting "recognised" by Google, i.e. we're not up where we'd like to be because of our technical situation.

We run two web servers that host four individual sites sharing the same technology. To do this we redirect visitors to the correct site according to which domain name (host) they asked for, so that users who typed in [domainx.no...] go to domainx.no and users who typed in [domainy.se...] go to domainy.se. These domains all point to the same IP in DNS, of course. The redirection is performed by our controller servlet. We also use frames, which makes the situation worse...
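
The host-to-site dispatch described above can be sketched roughly like this; the class, paths, and lookup logic are illustrative assumptions, not the actual controller servlet, which would also have to handle the servlet request/response objects.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of host-based routing: map the Host header of an
// incoming request to the site that should be served. Domain names come
// from the thread; the target paths are made up for illustration.
public class HostRouter {
    private static final Map<String, String> SITE_BY_HOST = new HashMap<>();
    static {
        SITE_BY_HOST.put("domainx.no", "/sites/domainx/index.html");
        SITE_BY_HOST.put("domainy.se", "/sites/domainy/index.html");
    }

    /** Returns the entry page for a Host header, or null if the host is unknown. */
    public static String resolve(String host) {
        if (host == null) {
            return null;
        }
        // Strip an optional port, e.g. "domainx.no:8080".
        int colon = host.indexOf(':');
        if (colon >= 0) {
            host = host.substring(0, colon);
        }
        return SITE_BY_HOST.get(host.toLowerCase());
    }

    public static void main(String[] args) {
        System.out.println(HostRouter.resolve("domainx.no"));      // /sites/domainx/index.html
        System.out.println(HostRouter.resolve("DOMAINY.SE:8080")); // /sites/domainy/index.html
    }
}
```

Since both domains resolve to the same IP, the Host header is the only thing that distinguishes a request for domainx.no from one for domainy.se, which is why the dispatch has to happen in the application layer.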

Now, a scenario we have come up with is to develop specific static pages for search engines, and use robots.txt to lead robots onto these pages. These pages will be viewable by normal users and will come up in searches, i.e. these are the pages users will land on if they click the search results in a search engine, so we're not *really* cloaking.
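
One way to read "use robots.txt to lead robots onto these pages": robots.txt can only disallow paths, not point spiders anywhere, so the steering would work by blocking the framed/dynamic areas and leaving only the static pages crawlable. A minimal sketch, with purely illustrative paths:

```
# Hypothetical robots.txt: block the servlet-generated and framed areas
# so spiders only reach the static, crawlable pages. Actual paths would
# depend on the site layout.
User-agent: *
Disallow: /servlet/
Disallow: /frames/
```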

Trouble is that we have to do some redirection in the static pages (they're not really static; they're generated using Java (SilverStream)) so that spiders and users alike see the correct pages according to which site they're on. Can we be penalised by Google, or any other search engine for that matter, for doing this? What about the static pages? Is this frowned upon?

Just to make it absolutely clear: we're not cheating or doing anything underhanded with these static pages, and they are being developed to appeal to the normal user.

6:23 pm on Apr 29, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 22, 2001
votes: 0

My impression is that you are not cloaking, and that you would not be penalized by Google for what you describe.

However, if the domains serve duplicate content, you may be running into that problem instead. Duplicate content causes some of the offending pages to be left out of the index, as Google will only include one copy of any page it considers a duplicate.

6:03 am on Apr 30, 2004 (gmt 0)

New User

10+ Year Member

joined:Apr 29, 2004
votes: 0

Thank you!
