Forum Moderators: mademetop
Currently our robots.txt is telling search engines not to look at any of the affiliate sites, for fear that it would be considered spamming. Are we overreacting, or could this actually be treated as spam?
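For reference, each affiliate subdomain currently serves a robots.txt along these lines (a minimal sketch of our setup, not the exact file):

```
# robots.txt served on each affiliate subdomain, e.g. bob.widget.com
User-agent: *
Disallow: /
```

That blocks all well-behaved crawlers from everything on the affiliate hosts.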
The affiliates are showing up in Google now, but with just the URL as the page title and no description. My boss wants them to show up correctly, but I'd rather not be banned from the search engines, for obvious reasons, heh. We have found several other search engines that are apparently ignoring our robots.txt, but that's another issue.
I do that for my production server: widgets.productionserver.com redirects to www.widgets.com for Googlebot only, to prevent my production sites from getting listed.
But this is at your own risk, as it might be seen as cloaking.
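For anyone curious, the rule I mean looks roughly like this under Apache mod_rewrite (hostnames are the examples from this thread; and again, serving bots a different response than users is exactly what can get flagged as cloaking):

```apache
# In the vhost for widgets.productionserver.com
RewriteEngine On
# Redirect only Googlebot to the live site; everyone else sees this server.
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule ^(.*)$ http://www.widgets.com/$1 [R=301,L]
```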
Even better, redirect ALL visitors from bob.widget.com to www.widget.com, and use bob.widget.com to set the cookie/tag.
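A sketch of that approach in Apache, using mod_rewrite's CO (cookie) flag; the cookie name and value here are made-up placeholders:

```apache
# Hypothetical vhost config for bob.widget.com
RewriteEngine On
# Set the affiliate-tracking cookie, then send every visitor,
# bots included, on to the main site with a permanent redirect.
RewriteRule ^(.*)$ http://www.widget.com/$1 [CO=affiliate:bob:.widget.com,R=301,L]
```

Since every user agent gets the same redirect, there is no cloaking concern, and only www.widget.com ends up in the index.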
In addition, each of the affiliate pages may differ slightly in content and layout, enough to make a single set of pages (your second suggestion) unworkable.
As you can imagine, this is not the way I would have chosen to do this, but I'm restricted to using the current affiliate method, which really hasn't changed since it was set up in 1996. We even have to keep legacy code around to handle affiliates who have not updated their links. It's a mess, but thanks for the suggestions!