dusky - 10:15 pm on Apr 21, 2010 (gmt 0) [edited by: Brett_Tabke at 4:45 am (utc) on Apr 22, 2010]
Brett_Tabke, my understanding is no, not directly. I've seen it on one of our large sites: gbot tests bogus query strings to see whether they return a proper 404 error. In our case this led me to fix a major issue [snip potential security issue]
How did gbot discover it? As said above, it only needs one on-page link, or a bookmark to a user's search-result page that the gtoolbar can pick up, and that URL is passed along for the next crawl. Because it's a search string, gbot will test a bogus search request while it's at it.
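To illustrate the 404 point, here is a minimal sketch (parameter name "terms" and the known-parameter set are invented for the example) of how a search endpoint can return a real 404 for bogus query strings instead of a "soft 404" 200 page, which is what gbot appears to be probing for:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical illustration: the parameter names here are made up; a
# real site would validate against its own search backend.
KNOWN_PARAMS = {"terms", "page"}

def status_for(url, has_results):
    """Return the HTTP status a search endpoint should send."""
    qs = parse_qs(urlparse(url).query)
    # Unknown/bogus parameters (the kind gbot probes with) -> real 404
    if any(key not in KNOWN_PARAMS for key in qs):
        return 404
    # A well-formed search with zero results should also be 404, not a
    # 200 "no results found" page, or the bot keeps recrawling the URL.
    return 200 if has_results else 404

print(status_for("/search?terms=ajax+help", True))   # 200
print(status_for("/search?terms=ajax+help", False))  # 404
print(status_for("/search?bogus123=x", True))        # 404
```

The key design choice is that both branches (unknown params and empty result sets) send a genuine 404 status, so the crawler drops the URL instead of indexing it.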
Here are the steps:
A user searches for, say, "ajax help" and lands on a URL that is useless to gbot, [webmasterworld.com...], since the form uses POST rather than GET and the rest of the search params are missing, so it's not crawlable. BUT that page returns search results, one of which is:
Now that URL is bookmarkable, and many people will find it interesting enough to post on their sites. Gbot collects it and spiders it, AND inside that page it finds the link for the other option, the one that says "search for this or that on WebmasterWorld" (it was there a few minutes ago, have you removed it, Brett_Tabke?). Anyway, that is the link gbot tried as [snip] and also as [snip].... or a similar request that does not exist, though I believe the ".../?terms=" structure existed but was buggy, which is probably why it was removed.
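The POST-vs-GET distinction in the steps above can be sketched like this (the form markup and URL are invented for the example): a crawler can only queue URLs it can actually see. A plain `<a href>` link exposes a concrete, bookmarkable URL; a POST form's parameters never appear in any URL, so there is nothing for the bot to follow.

```python
from html.parser import HTMLParser

# Invented sample page: one POST search form and one ordinary link
# to a search-result URL of the bookmarkable kind discussed above.
PAGE = """
<form action="/search" method="post"><input name="terms"></form>
<a href="/search?terms=ajax+help">search for this on WebmasterWorld</a>
"""

class LinkCollector(HTMLParser):
    """Collect only URLs a crawler could follow."""
    def __init__(self):
        super().__init__()
        self.crawlable = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a" and "href" in a:
            self.crawlable.append(a["href"])  # plain link: crawlable
        elif tag == "form" and a.get("method", "get").lower() == "post":
            pass  # POST form: params live in the request body, no URL to queue

c = LinkCollector()
c.feed(PAGE)
print(c.crawlable)  # ['/search?terms=ajax+help']
```

Only the GET-style link ends up in the crawl queue, which is exactly why the bookmarkable result URL is what gbot picks up and re-probes.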