Forum Moderators: goodroi
User-agent: *
Disallow: /viewtopic
Disallow: /posting
Disallow: /viewforum
Disallow: /privmsg
Need your help on this. Thanks a lot.
exp...
1) Basically, if Google finds a link to one of your pages, either on your own site or anywhere on the Web, they may include a URL-only listing for it. Yahoo! will do the same, except that they often use the link-text of the 'best' link pointing to your page as the title for the search listing (Yahoo! defines 'best' link here, not me.)
I disagree strongly with this approach, as it makes it difficult to prevent people from landing on a page out of context. This might mean landing in the middle of an article whose logic depends heavily on the previous pages having been read, or it might mean landing in the middle of the checkout process for a simple shopping cart, with all of the purchased items 'undefined'. Neither of these could be said to enhance the user's experience, but this is the search engines' decision, and I just live with it.
My consolation is that it's easier to make other pages rank higher for the link-text Yahoo! uses, and that very few people use the site:example.com-type searches in Google.
You can cloak these pages, and rewrite the robots' requests to a password-required page. But I've never bothered.
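For what it's worth, the "rewrite the robots' requests" idea could be sketched with Apache's mod_rewrite, assuming the forum runs on Apache with mod_rewrite enabled. The bot names and the /login.php target here are placeholders, not anything from the original post:

```apache
# Hypothetical sketch: send known crawlers asking for forum pages
# to a password-required page instead. Bot list and target path
# are examples only.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Slurp|msnbot) [NC]
RewriteRule ^(viewtopic|posting|viewforum|privmsg) /login.php [R=302,L]
```

Whether this counts as cloaking worth the risk is a separate question; as I said, I've never bothered.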
2) Your robots.txt is not quite valid. Only one "User-agent: *" record should appear in the file. I suggest combining all of your Disallow lines into one record under a single "User-agent: *" line.
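You can sanity-check the combined file yourself. Here's a small sketch using Python's standard urllib.robotparser; the example.com URLs are placeholders, and the rules string is just your Disallow lines gathered under one "User-agent: *" record:

```python
from urllib import robotparser

# Hypothetical combined robots.txt: a single "User-agent: *" record
# holding all of the Disallow lines from the original file.
rules = """\
User-agent: *
Disallow: /viewtopic
Disallow: /posting
Disallow: /viewforum
Disallow: /privmsg
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Disallowed prefixes are blocked for all user-agents...
print(rp.can_fetch("*", "http://www.example.com/viewtopic.php?t=1"))  # False
# ...while everything else stays fetchable.
print(rp.can_fetch("*", "http://www.example.com/index.php"))  # True
```

Note that Disallow lines match by URL-path prefix, so "Disallow: /viewtopic" covers /viewtopic.php and anything else starting with /viewtopic.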
Jim