I am using a custom 404 error page, and within it I use
Server.Transfer("/Items.asp")
My robots.txt has the following line:
Disallow: /Items
The URLs for my pages do not have Items in them,
but the 404 handler transfers the request to Items.asp, which creates the page dynamically.
The question now: does Googlebot know the request is being transferred to Items.asp? And since I am disallowing that page in my robots.txt, do you think that is why my pages are not being visited or crawled by Googlebot?
The question now: does Googlebot know the request is being transferred to Items.asp?
A standard 404 response contains a 404 Not Found status line and the contents of the default or custom 404 error page. The URL of the error page is not included in the response, as it would be in a 301 or 302 redirect response.
So no, Googlebot does not know that you are using a custom error handler, unless some other redirect takes place after the 404 response.
Check your complete 404 response using Wannabrowser or a similar tool to be sure.
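The distinction above can be sketched in a few lines of Python. The two raw responses below are made-up samples (not captured from the poster's server), and parse_response is a hypothetical helper, but they show what a tool like Wannabrowser would reveal: a 404 carries no header naming the page that built the error body, while a 302 exposes the target URL in its Location header.

```python
# Illustrative sketch: compare what a client (or Googlebot) sees in a
# 404 response versus a 302 redirect. Both raw responses are sample
# data, not real captures.

RAW_404 = (
    "HTTP/1.1 404 Not Found\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
    "<html>...error page body built server-side by Items.asp...</html>"
)

RAW_302 = (
    "HTTP/1.1 302 Found\r\n"
    "Location: /Items.asp\r\n"
    "Content-Type: text/html\r\n"
    "\r\n"
    "<html>Object moved</html>"
)

def parse_response(raw):
    """Split a raw HTTP response into (status_code, headers, body)."""
    head, _, body = raw.partition("\r\n\r\n")
    lines = head.split("\r\n")
    status_code = int(lines[0].split()[1])
    headers = {}
    for line in lines[1:]:
        name, _, value = line.partition(":")
        headers[name.strip().lower()] = value.strip()
    return status_code, headers, body

code, headers, _ = parse_response(RAW_404)
print(code, "location" in headers)    # the 404 never names Items.asp in any header

code, headers, _ = parse_response(RAW_302)
print(code, headers.get("location"))  # a redirect exposes the target URL
```

Because Server.Transfer happens entirely inside the server, only the first case applies: nothing in the 404 response reveals Items.asp.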
Jim
That is an excellent tool.
My concern is that Google is only indexing my main page and whatever pages are linked from it.
I have category links on the main page, but Google is not following them to index the pages under those categories.
I am using the "index, follow" meta tag on all the pages, but I thought I was telling Google not to crawl the pages, since I was using Server.Transfer("/Items.asp")
and blocking Google from /Items in my robots.txt.
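What the robots.txt rule actually blocks can be checked with Python's standard-library robots.txt parser. Only "Disallow: /Items" comes from this thread; the example paths are made up. The sketch shows that the rule is a prefix match, so it blocks /Items.asp itself but not pages whose URLs never contain "Items" - which is why the disallow line alone should not stop Googlebot from crawling the public URLs.

```python
# Sketch: test which paths "Disallow: /Items" actually blocks,
# using the standard-library robots.txt parser. Example paths
# are hypothetical.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /Items",
])

# "Disallow: /Items" is a prefix rule, so it also covers /Items.asp...
print(rp.can_fetch("Googlebot", "/Items.asp"))    # False
# ...but not pages whose URLs never contain "Items":
print(rp.can_fetch("Googlebot", "/widgets.asp"))  # True
```

And since Server.Transfer never exposes the /Items.asp URL to the client, Googlebot never even requests a blocked path in the first place.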
Make sure your category links are plain-HTML text links and not JavaScripted, and also be aware that Google will not usually crawl and index all pages on a site at once unless the site has high PageRank. If your site is less than two months old, you may just have to wait through a couple more updates for Google to find all the pages. If waiting doesn't work, you could try creating a site index page - a flat directory of links to all the important pages on your site - and submitting that (once) to Google.
Jim