
Spider Mystery

I have thousands of pages under my domain, but the spiders don't see them.

Smithy

3:58 pm on Oct 24, 2002 (gmt 0)



When I use the spider tester on searchengineworld my website is only spidered one level down. Every page on my site is linked to the root with an absolute URL.

The links from my homepage to these pages are also full URLs, but they run through a CGI script (which I use for referral tracking).
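As an illustration (the script name and parameters here are hypothetical), a link routed through a tracking CGI script looks quite different to a spider than a plain absolute link, because the spider only sees the script URL, not the destination page:

```html
<!-- Hypothetical tracked link: the spider sees only the CGI URL,
     not the page it eventually redirects to -->
<a href="http://www.example.com/cgi-bin/track.cgi?id=123">Widgets</a>

<!-- Direct absolute link: the spider can follow it straight to the page -->
<a href="http://www.example.com/widgets.html">Widgets</a>
```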

I have a robots.txt in place which allows any engine to spider, but still no joy!

To add insult to injury, I also lost many Google listings last month.

I point hundreds of different domains to various sections within my site, meaning I effectively have two versions of each section. The results for the individual domains are often lower in comparison.

Can anybody help?

Thanks.

CuriousWeb

8:30 pm on Oct 24, 2002 (gmt 0)

10+ Year Member



<I point hundreds of different domains to various sections within my site, meaning I effectively have two versions of each section. The results for the individual domains are often lower in comparison.>

Welcome to WebmasterWorld, Smithy.

Not sure if what you have described constitutes mirroring, but Google is against this practice. If you look through some of the old threads in Google News you should be able to find discussion about this...

Hope this helps

Chris_R

8:34 pm on Oct 24, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I am not sure I understand you completely, but here are my two cents under the keep-it-simple principle:

1) Don't use robots.txt to allow sites.

2) Don't expect search engines to crawl thousands of pages off your site.

3) Don't expect search engines to follow dynamic links.

They can do all of these, but it doesn't mean they will.
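On point 1, the conventional way to permit all spiders is an empty Disallow rule rather than any explicit "Allow" directive (Allow was not part of the original robots exclusion standard, and support for it varies by engine). A minimal permissive robots.txt might look like this:

```
# Applies to every user agent; disallows nothing,
# so all compliant spiders may crawl the whole site.
User-agent: *
Disallow:
```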