I have a problem. Our site is hosted on a couple of servers, and Google is somehow also indexing the direct URLs of those servers.
So, for example, in Google we have:
What's more, the IP addresses of these servers are also indexed as separate URLs.
We have only one robots.txt, and the same meta tags apply across the whole site. The directory and file structure is identical on each server, which makes it impossible to block access to selected directories and files.
Do you have any idea how to:
1. block Googlebot from crawling these URLs?
2. remove these URLs from the Google index?
I'd appreciate your help greatly,
it sounds like whoever set up your hosting created mirrors of your site at multiple locations. robots.txt will not help in this situation, since the file will simply be copied to all of the duplicate locations. you should talk to your hosting person and explain that only one URL should be indexed; they can make some changes to the hosting setup to deal with it.
as for duplicate content, you should not have too big an issue as long as all of your link popularity points to one version of the site. the engines will filter out the other duplicates.
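for example, one way the host might do it (a rough sketch only, assuming Apache with name-based virtual hosts enabled and using www.example.com as a stand-in for your canonical domain) is a catch-all default virtual host that 301s every other hostname, including the bare server IPs, over to the canonical one:

  # hypothetical sketch: the first vhost listed is the default, so it
  # catches requests for the bare IPs and any non-canonical hostname
  <VirtualHost *:80>
      ServerName default.invalid
      # 301 everything to the canonical domain, keeping the path
      Redirect permanent / http://www.example.com/
  </VirtualHost>

  # the real site only answers on the canonical hostname
  <VirtualHost *:80>
      ServerName www.example.com
      DocumentRoot /var/www/example
  </VirtualHost>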
You need a site-wide 301 redirect to fix this.
The redirect will preserve the traffic that comes in through the wrong URLs.
The 301 redirect will ensure that the wrong URLs are eventually de-indexed.
Ensure that all your internal linking points to the canonical domain.
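If you can't touch the server config, a minimal .htaccess sketch does the same thing, assuming Apache with mod_rewrite and again using www.example.com as a stand-in for your canonical domain. Because the rule keys off the Host header, the identical file can be copied to every mirror, which sidesteps the problem of all the servers sharing the same directory and file structure:

  # hypothetical .htaccess sketch, assuming mod_rewrite is enabled
  RewriteEngine On
  # requests already on the canonical host pass through untouched;
  # everything else (other hostnames, bare IPs) gets a site-wide 301
  # with the requested path preserved
  RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
  RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

You can sanity-check it by requesting any page through one of the wrong hostnames and confirming you get a 301 response whose Location header points at the matching canonical URL.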
welcome to webmasterworld. having the content accessible at both URLs can cause issues with the search engines. it is ideal to use a 301 redirect and point one of them at the other (which you have done). by redirecting you make sure the link popularity is focused and not divided. this also minimizes the risk of being flagged as duplicate content.