Forum Moderators: open
This normally works fine, but when Googlebot crawls the site the number of active users appears to climb significantly, which slows the site down.
In the medium term we will be moving from Access to SQL, which should solve the problem.
However, in the short term, is there something we can add to the robots.txt file to prevent global.asa being called when spiders crawl the site?
Thanks for any advice
Jgar
Robots.txt cannot control anything ASP-related - all it does is tell robots which URLs they may request and which they may not (and only well-behaved robots obey it). global.asa still fires for every ASP page a spider is allowed to fetch, so robots.txt cannot selectively stop it.
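For completeness, a minimal robots.txt looks like this - the directory name is just an example, and note again that it only affects which URLs compliant spiders request:

User-agent: *
Disallow: /admin/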
Instead, I'd try it this way -
In your global.asa file, check the HTTP_USER_AGENT server variable (the request's User-Agent header) to see if it contains the string "Googlebot". If it doesn't, assume it's a normal (human) visitor and run the global.asa code as normal.
This check could also be extended to cover the different spiders run by the various search engines...
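As a sketch, something along these lines would do it - the agent strings beyond "googlebot" are only illustrative, so check each engine's documented user agent before relying on them:

'----------------------------------------------
Dim agent, spiders, i, isSpider
agent = Request.ServerVariables("HTTP_USER_AGENT")
'Illustrative list - verify the real agent strings for each engine
spiders = Array("googlebot", "slurp", "msnbot", "teoma")
isSpider = False
For i = 0 To UBound(spiders)
    If InStr(1, agent, spiders(i), vbTextCompare) > 0 Then isSpider = True
Next
If Not isSpider Then
    'normal visitor - run the usual global.asa code here
End If
'----------------------------------------------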
HTH,
JP
This code should do it (although I'm a little bit pissed, so you may need to tweak it). Add it to your global.asa, wrapped around the code that writes to Access. Note the comparison uses vbTextCompare so it's case-insensitive, and the Access code only runs when the agent is NOT Googlebot:

'----------------------------------------------
agent = Request.ServerVariables("HTTP_USER_AGENT")
If InStr(1, agent, "googlebot", vbTextCompare) = 0 Then
    'code which adds required data to Access
End If
'----------------------------------------------
Search Engine World Spider List [searchengineworld.com].
JP