They can't see the file's last-modified date; that information is not sent out in the headers.
The problem with putting random text on a page is that Google can soon twig to what you're doing unless it's done properly.
"that information is not sent out in the headers."
It can be [webmasterworld.com].
Use Navigator and view Page info and you'll see it, if available.
If you see a 304 response to a Googlebot request in your logs, it means the page has not been modified since the last visit, so Googlebot won't spider it any further.
What you want to see is a 200 response.
Here is an example:
No Change:
"GET /yourpage.htm HTTP/1.0" 304 - "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
Changed:
"GET /yourchangedpage.htm HTTP/1.0" 200 14293 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
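The decision behind those two log lines can be sketched as a conditional GET: the bot sends an If-Modified-Since header, and the server answers 304 if the page hasn't changed since then, 200 otherwise. A minimal sketch of that server-side check (function name and dates are illustrative, not Google's actual code):

```python
# Sketch of the server-side logic behind a conditional GET.
# Dates use the HTTP-date format; respond() is a hypothetical name.
from email.utils import parsedate_to_datetime
from typing import Optional

def respond(last_modified: str, if_modified_since: Optional[str]) -> int:
    """Return the status code a server would send for a conditional GET."""
    if if_modified_since is None:
        return 200  # unconditional request: always send the full page
    page_time = parsedate_to_datetime(last_modified)
    client_time = parsedate_to_datetime(if_modified_since)
    # Page unchanged since the bot's last visit -> 304, no body re-sent
    return 304 if page_time <= client_time else 200

# Unchanged since the last crawl -> 304
print(respond("Sat, 01 Jan 2005 10:00:00 GMT", "Sun, 02 Jan 2005 10:00:00 GMT"))
# Modified since the last crawl -> 200, full page sent
print(respond("Mon, 03 Jan 2005 10:00:00 GMT", "Sun, 02 Jan 2005 10:00:00 GMT"))
```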
They use the server response codes to save bandwidth and crawl time. If the data has not been modified, there's no sense in crawling it again.
You need access to your raw log files to see this.
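If you do have the raw logs, a quick tally of Googlebot's 200s vs 304s shows how much of its crawl is hitting unchanged pages. A rough sketch (the regex assumes the common combined log format; feed it your own log lines):

```python
# Sketch: count Googlebot's 200 vs 304 responses in a combined-format
# access log. The sample lines below mirror the examples above.
import re
from collections import Counter

# Status code is the 3-digit number right after the quoted request
STATUS = re.compile(r'" (\d{3}) ')

def googlebot_status_counts(lines):
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # only interested in Google's crawler
        m = STATUS.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

sample = [
    '"GET /yourpage.htm HTTP/1.0" 304 - "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '"GET /yourchangedpage.htm HTTP/1.0" 200 14293 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
]
print(googlebot_status_counts(sample))
```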
Your server pastes any SSIs (Server Side Includes) into the page at the time of the request. The browser / user agent receives the stitched-together result. If Googlebot / your browser had to do the work, they would be called CLIENT side includes.
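For anyone who hasn't used them, a minimal sketch of what an SSI page looks like on disk, before the server processes it (filenames here are illustrative):

```html
<!-- page.shtml: what lives on the server's disk -->
<html>
<body>
<h1>My page</h1>
<!--#include virtual="/includes/fresh-content.html" -->
</body>
</html>
```

By the time the response leaves the server, the `<!--#include -->` directive has been replaced by the contents of the included file, so Googlebot (or any visitor) sees only ordinary HTML.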
The pages are, therefore, indistinguishable from a normal HTML page, apart from the possibility of delayed server response. Lots of traffic + lots of SSIs = lots of processing work for the server to do for each request. This could slow your site down markedly.
This is not to say changing content on a page is bad. Of course it isn't - especially for the home page - where fresh content is always a good thing.
However, trying to "trick" Googlebot into visiting your far-interior pages more frequently by feeding up "garbage" content that has no reason to be there is only playing with fire if you ask me. You may get that page spidered more often - but that doesn't necessarily mean your rankings for those interior pages will improve. Quite the opposite may happen, in fact.