It began when I was working on a site for someone and noticed that there were WAY too many pages indexed in Google. I figured out the culprits were the "print" and "send to a friend" links, which appear on most of the site's pages.
To eliminate the problem I moved the scripts to their own folder, added that folder to robots.txt, added the meta robots tag to those pages, and used JavaScript to link to them (roughly as sketched below).
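To give an idea of the JavaScript part, the print links now look roughly like this (the script path and id parameter are just made-up examples, not the site's real URLs):

The old, plain link that the spider could follow:

<a href="/print/property_detail_print.php?id=123">Print this page</a>

The new JavaScript-only link, so there is no ordinary href for the spider to pick up:

<a href="#" onclick="window.open('/print/property_detail_print.php?id=123', 'printwin', 'width=640,height=480'); return false;">Print this page</a>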
Now G has relisted every single one of those pages under its new URL! They show up with no title or description, just the URL. I figure this is because Google actually obeyed the robots.txt file, but I don't want these links indexed at all.
Can anyone explain this? Thanks!
The meta tag on each of the print/send-friend pages:

<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
And the relevant part of robots.txt:

User-agent: *
Disallow: /print/
Disallow: /dev/send_friend
Disallow: /dev/property_detail_print
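For completeness, the meta tag sits in the <head> of each print/send-friend page, something along these lines (a stripped-down example, not the actual markup):

<html>
<head>
<title>Print version</title>
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
</head>
<body>
...page content...
</body>
</html>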