My company has experienced some problems getting "recognised" by Google, i.e. we're not ranking where we'd like to be, and we suspect our technical setup is the cause.
We run two webservers that host four individual sites sharing the same technology. To do this we redirect visitors to the correct site according to the domain name (host) they requested, so users who have typed in [domainx.no...] go to domainx.no and users who have typed in [domainy.se...] go to domainy.se. All of these domains point to the same IP in DNS, of course. The redirection is performed by our controller servlet. We also use frames, which makes the situation worse...
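The host-based dispatch described above can be sketched roughly like this. This is a hypothetical illustration, not the poster's actual servlet: the `SITE_ROOTS` mapping, the `/sites/...` paths, and the `resolveSite` helper are all made up, and the domain names are just the placeholders from the post.

```java
import java.util.Map;

public class HostDispatch {
    // Each hosted site, keyed by the domain the visitor typed in.
    // These paths are illustrative only.
    private static final Map<String, String> SITE_ROOTS = Map.of(
        "domainx.no", "/sites/domainx/index.html",
        "domainy.se", "/sites/domainy/index.html"
    );

    /** Returns the entry page for the requested host, or a default. */
    public static String resolveSite(String hostHeader) {
        // Strip an optional :port suffix and a leading "www." before lookup.
        String host = hostHeader.split(":")[0].toLowerCase();
        if (host.startsWith("www.")) {
            host = host.substring(4);
        }
        return SITE_ROOTS.getOrDefault(host, "/sites/default/index.html");
    }
}
```

One SEO-relevant design choice here: whether the controller issues an external redirect (which spiders follow and record) or an internal forward (which leaves the original URL in the spider's view) changes what Google sees, so it is worth knowing which of the two the servlet actually does.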
Now, one scenario we have come up with is to develop specific static pages for the search engines and use robots.txt to lead robots onto these pages. These pages will be viewable by normal users and will come up in searches, i.e. they are the pages users will land on if they click the search results, so we're not *really* cloaking.
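Worth noting: robots.txt can only *exclude* paths, not point robots at pages, so "leading robots onto" the static pages in practice means disallowing everything else. A minimal sketch, assuming (hypothetically) that the dynamic controller lives under /servlet/ and the frameset pages under /frames/:

```
User-agent: *
Disallow: /servlet/
Disallow: /frames/
# Everything else, including the static pages, remains crawlable.
```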
The trouble is that we have to do some redirection in the static pages (they're not really static; they're generated using Java (Silverstream)) so that spiders and users alike see the correct pages for whichever site they're on. Can we be penalised by Google, or any other search engine for that matter, for doing this? And what about the static pages themselves: is this approach frowned upon?
Just to make it absolutely clear: we're not cheating or doing anything underhand with these static pages, and they are being developed so that they will appeal to the normal user.
However, if the domains serve duplicate content, you may be running into the duplicate-content filter. Duplicate content will cause some of the offending pages to be left out of the index, as Google will only include one copy of any page it considers to be a duplicate.
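If duplicate content across the domains turns out to be the issue, the usual remedy is to pick one domain as the primary and permanently redirect the others to it, so only one copy is indexed. A minimal sketch of that decision logic; the choice of domainx.no as the primary host and the `redirectTarget` helper are assumptions for illustration:

```java
public class CanonicalRedirect {
    // Assumed primary domain; in practice pick whichever domain should rank.
    private static final String PRIMARY_HOST = "domainx.no";

    /**
     * If the request arrived on a secondary domain, returns the URL to send
     * a permanent (301) redirect to; returns null if the request is already
     * on the canonical host and the page should be served normally.
     */
    public static String redirectTarget(String host, String path) {
        String bare = host.toLowerCase();
        if (bare.startsWith("www.")) {
            bare = bare.substring(4);
        }
        if (bare.equals(PRIMARY_HOST)) {
            return null; // already canonical, serve the page
        }
        return "http://" + PRIMARY_HOST + path;
    }
}
```

This only applies to pages that are genuinely the same under several domains; the four distinct sites described above would each keep their own domain.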