In the past few months we have been exploring some HTML forms to try to discover new web pages and URLs that we otherwise couldn't find and index for users who search on Google. Specifically, when we encounter a <FORM> element on a high-quality site, we might choose to do a small number of queries using the form.
For text boxes, our computers automatically choose words from the site that has the form; for select menus, check boxes, and radio buttons on the form, we choose from among the values...
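In practice, submitting a GET form just means composing query strings from the chosen values and fetching the resulting URLs. A minimal sketch of that idea (the field names `q` and `cat` and the helper below are illustrative, not Google's actual implementation):

```python
from urllib.parse import urlencode, urljoin

def candidate_form_urls(action, text_values, select_options):
    """Build the GET URLs a crawler might try for a simple form.

    action: the form's action URL.
    text_values: words picked from the site's own content for a text box.
    select_options: the declared option values of a select menu.
    """
    urls = []
    for word in text_values:
        for option in select_options:
            # Each combination of inputs yields one crawlable URL.
            query = urlencode({"q": word, "cat": option})
            urls.append(urljoin(action, "?" + query))
    return urls

# A hypothetical site-search form at /search with a category menu:
urls = candidate_form_urls("http://example.com/search",
                           ["widgets", "gadgets"],
                           ["all", "news"])
```

Each resulting URL (e.g. http://example.com/search?q=widgets&cat=all) can then be fetched and indexed like any other page, which is why only a "small number of queries" per form still discovers real content.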
You can take consolation in the fact that they only do this on what they deem a "high-quality site", I suppose.
At the first PubCon in Austin, I had a chat with Matt Cutts about this behavior. He confirmed that content found only through form crawling gets established as a "virtual URL" on Google's back end, and that link equity, such as PR, can pass through.
He also reinforced the idea that Google does not want to index site search results, so those search forms should be disallowed for Googlebot in robots.txt.
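For example, if your site search lives under a path like /search (a hypothetical path; use whatever your form's action actually is), the standard way to keep Googlebot out is a robots.txt rule such as:

```
User-agent: Googlebot
Disallow: /search
```

This blocks crawling of the search results pages themselves while leaving the rest of the site open.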