Forum Moderators: open
I have a set of catalogue pages organised by category and sub-category. The links are all simple hrefs and are listed like this:
example.com/catalogue/spoons_general.asp?startAt=1
example.com/catalogue/knifes_general.asp?startAt=1
example.com/catalogue/forks_general.asp?startAt=1
with more sub-categories like this:
example.com/catalogue/spoons_silver.asp?startAt=1
example.com/catalogue/spoons_wooden.asp?startAt=1
etc. etc.
Googlebot visits me very regularly (nearly every day) and indexes the *_general.asp pages very nicely, but refuses to touch ANY of the sub-category pages.
I've browsed the site with Lynx with no problems at all, and checked the HTML at htmlvalidator with no errors or warnings. The pages are not excluded in robots.txt, and I'm specifying "index,follow" in the robots meta tag even though I know I don't need to.
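By "index,follow" I mean the usual robots meta tag in each page's head, i.e. something like this (the exact markup on my pages may vary slightly):

```html
<head>
  <title>Catalogue - Spoons</title>
  <!-- explicit crawl directive; "index,follow" is also the default -->
  <meta name="robots" content="index,follow">
</head>
```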
Does anyone have any ideas, or are there any programs that "test" a crawler's ability to crawl pages?
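To illustrate the sort of check I mean: a rough Python sketch (standard library only, nothing site-specific assumed) that fetches a page with a Googlebot-style User-Agent and lists the hrefs a plain HTML parser can find. If the sub-category links don't show up in this output, a crawler can't see them either.

```python
# Sketch of a "can a crawler see my links?" check.
# Uses only Python's standard library; the example URL in the comment
# below is just the sample sub-category from this thread.
from html.parser import HTMLParser
import urllib.request


class LinkCollector(HTMLParser):
    """Collects the href attribute of every <a> tag it encounters."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def links_in(html):
    """Return every href a plain HTML parser can find in the page source."""
    parser = LinkCollector()
    parser.feed(html)
    return parser.links


def crawl_check(url):
    """Fetch a page with a Googlebot-style User-Agent and return its links."""
    req = urllib.request.Request(
        url,
        headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
    )
    with urllib.request.urlopen(req) as resp:
        return links_in(resp.read().decode("utf-8", errors="replace"))


# Example usage (not run here):
#   for link in crawl_check("http://example.com/catalogue/spoons_general.asp?startAt=1"):
#       print(link)
```

If the links are generated by JavaScript or hidden behind form posts, they won't appear in this list, which would explain a crawler skipping them.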
[edited by: vitaplease at 5:23 am (utc) on Aug. 20, 2004]
[edit reason] exemplified [/edit]