Googlebot Won't Crawl These Pages

Any advice? Googlebot won't crawl 2 pages on my site.

mbennie

2:55 am on Nov 22, 2002 (gmt 0)

10+ Year Member



I have an ASP site (the URL is in my profile). The site is 5 pages of content that swells to 800+ with the various parameters. The last 2 pages (cities.asp and directory.asp) are never crawled by Google. I've run the site through the spider simulator on this site and, for some reason, the links to these 2 pages don't appear on the pages that should refer to them. Other search engines don't seem to have a problem crawling them, just Googlebot and the spider simulator here.

When drilling down to the last 2 pages, 3 or 4 separate parameters are necessary depending on the page. Is this too many? Can anyone take a look and see if I'm missing something obvious?

These 2 pages represent 75% of my site content and I would very much like to have them indexed.

Woz

3:37 am on Nov 22, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>missing something obvious?
>there are 3 and 4 separate parameters

I am betting this is your problem. Try to keep the URL string as short as you can. You can do this in a number of ways, such as:

1) using ID numbers instead of words, e.g., instead of Country=America, try ID=4 (assuming 4 equates to America)

2) concatenating your query string and then splitting it apart again at the server, e.g., instead of Country=America&Transport=Road&Fuel=Wind, try ID=4t3f5 (assuming Road = t3 and Wind = f5), then split the string at the server using the letters as split points and fetch the info from the database that way.
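A minimal sketch of option 2, in Python for readability (the actual site is classic ASP). The letter prefixes ("t" for Transport, "f" for Fuel) and the lookup tables are illustrative assumptions, not taken from the poster's site:

```python
import re

# Hypothetical ID-to-value tables; in practice these lookups
# would hit the database instead.
COUNTRIES = {4: "America"}
TRANSPORT = {3: "Road"}
FUEL = {5: "Wind"}

def unpack_id(token):
    """Split a packed token like '4t3f5' back into its parameters.

    The leading digits are the country ID; the letters 't' and 'f'
    act as split points marking the transport and fuel IDs.
    """
    m = re.fullmatch(r"(\d+)t(\d+)f(\d+)", token)
    if m is None:
        raise ValueError("unexpected token format: %r" % token)
    country_id, transport_id, fuel_id = (int(g) for g in m.groups())
    return {
        "Country": COUNTRIES[country_id],
        "Transport": TRANSPORT[transport_id],
        "Fuel": FUEL[fuel_id],
    }
```

So a request for ID=4t3f5 unpacks to Country=America, Transport=Road, Fuel=Wind, but the spider only ever sees a single short parameter.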

There are lots of things you can do; put on your thinking cap and try the search feature here for "dynamic URLs" to get more ideas.

Onya
Woz

PS: we generally don't do site reviews here, so let's keep the discussion general, OK?