Forum Moderators: open
As for the path and method, the spider will usually walk back to the root and then work its way down into the site from there.
I have wondered how this can happen. Is it related to this question? I.e., if the whole site does not get re-spidered, some of the pages will get left out, even though they were in prior indexes.
If this is the case....
how can one make sure the spider crawls the maximum number of pages? Would smaller file sizes on the pages help in this respect?
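To make the worry concrete, here is a toy sketch of a breadth-first crawl that starts at the root and stops after a fixed page budget. The link graph, URLs, and budget are all made up for illustration; real spiders are far more complex, but the effect is the same: with a limited budget, deeper pages can get skipped on a given pass even if they were fetched on an earlier, fuller crawl.

```python
from collections import deque

# Hypothetical link graph standing in for a small site.
SITE = {
    "/": ["/products", "/about"],
    "/products": ["/products/a", "/products/b"],
    "/about": [],
    "/products/a": ["/products/a/spec"],
    "/products/b": [],
    "/products/a/spec": [],
}

def crawl(start="/", budget=4):
    """Breadth-first crawl from the root, stopping once `budget` pages are fetched."""
    seen, queue, visited = {start}, deque([start]), []
    while queue and len(visited) < budget:
        page = queue.popleft()
        visited.append(page)           # "fetch" the page
        for link in SITE.get(page, []):
            if link not in seen:       # only queue pages not yet discovered
                seen.add(link)
                queue.append(link)
    return visited

# With a budget of 4, the deepest page is discovered but never fetched this pass.
print(crawl())  # → ['/', '/products', '/about', '/products/a']
```

Note that `/products/a/spec` sits in the queue when the budget runs out, which is exactly the "left out of this index, though it was in a prior one" situation described above.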
Just call me the "questionator":)