
Forum Moderators: goodroi


robots.txt and IP addresses

     

Maciej Ziemczonek

10:17 am on Apr 8, 2008 (gmt 0)

5+ Year Member



Hi

I have a problem. Our site is hosted on a couple of servers, and Google is somehow also indexing the direct URLs of those servers.

Therefore, for example, we have in google:

www.ourdomain.com

and

www.ourdomain.hostnameserver1.com
www.ourdomain.hostnameserver2.com

What's more, the IP addresses of these servers are also indexed as separate URLs.

We have only one robots.txt, and the same meta tags apply across the whole site. The directory and file structure is identical on each server, which makes it impossible to block access to specific directories and files.

Do you have any idea how to:

1. block Googlebot from crawling these urls?
2. remove these urls from Google index?

I'd appreciate your help greatly,

best regards
Maciej

goodroi

1:03 pm on Apr 9, 2008 (gmt 0)

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



hello Maciej,

it sounds like whoever set up your hosting created mirrors of your site at multiple locations. robots.txt will not help this situation, since the robots.txt file is copied to all of the duplicate locations. you should talk to your hosting person and explain that only one url should be indexed. they can make some changes to the hosting setup to deal with this situation.

as for duplicate content you should not have too big of an issue as long as you have all of your link popularity pointing to one version of the site. the engines will filter out the other duplicates.

good luck

Maciej Ziemczonek

1:39 pm on Apr 9, 2008 (gmt 0)

5+ Year Member



Goodroi - thanks for your help!

g1smd

12:01 am on Apr 10, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



If you simply stop the alternative URLs working, you will lose that traffic.

You need a site-wide 301 redirect to fix this.

The redirect will preserve the traffic that comes in through the wrong URLs.

The 301 redirect will ensure that the wrong URLs are eventually de-indexed.

Ensure that all your internal linking points to the canonical domain.
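A site-wide host canonicalization of the kind g1smd describes can be sketched in Apache .htaccess with mod_rewrite. This is only an illustration: www.ourdomain.com stands in for your canonical domain, and the rule assumes mod_rewrite is enabled on the server.

```apache
RewriteEngine On
# Any request whose Host header is not the canonical domain
# (server hostnames, raw IP addresses, etc.) gets a 301 to the
# same path on www.ourdomain.com, preserving the request URI.
RewriteCond %{HTTP_HOST} !^www\.ourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.ourdomain.com/$1 [R=301,L]
```

Because the pattern `^(.*)$` matches every path and passes it through as `$1`, the redirect covers the whole site, not just the home page.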

Maciej Ziemczonek

6:39 am on Apr 10, 2008 (gmt 0)

5+ Year Member



Thanks!

vietbds

9:54 am on Apr 11, 2008 (gmt 0)

5+ Year Member



good information.
Thanks

garryb

8:52 am on Apr 16, 2008 (gmt 0)

5+ Year Member



Yesterday I put a redirect from http://example.ie to http://www.example.ie, as both were appearing in the search engines. A friend told me that these two sites, although identical, will be competing with each other and search engines might see them as duplicates.
I put a rewrite rule in my .htaccess file. Does anyone know if this will cause problems?

[edited by: goodroi at 1:02 pm (utc) on April 16, 2008]
[edit reason] Examplified [/edit]

goodroi

1:15 pm on Apr 16, 2008 (gmt 0)




hi garryb,

welcome to webmasterworld. having the content accessible at both URLs can cause issues in the search engines. it is ideal to use a 301 redirect and point one of them into the other (which you have done). by redirecting you make sure the link popularity is focused and not divided. this also minimizes issues with being flagged as duplicate content.

g1smd

6:32 pm on Apr 16, 2008 (gmt 0)




The 301 redirect needs to be site-wide, not just for the root.
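A site-wide non-www to www rule for the case above might look like the following sketch, using example.ie as the placeholder domain (a root-only `Redirect` would miss deeper paths):

```apache
RewriteEngine On
# Match requests for the bare domain and 301 them to the www
# host, carrying the full request path along as $1.
RewriteCond %{HTTP_HOST} ^example\.ie$ [NC]
RewriteRule ^(.*)$ http://www.example.ie/$1 [R=301,L]
```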

bilalseo

10:10 pm on May 2, 2008 (gmt 0)

5+ Year Member



agree with g1smd ;)

 
