|Multiple Subdirectories Considered Spamming?|
Would an affiliate site setup like this be penalized?
| 4:57 pm on Nov 24, 2003 (gmt 0)|
My company basically has an affiliate system (ex. bob.widgets.com, joe.widgets.com, jane.widgets.com). Each of these subdomains looks almost identical in text and content, with the exception of a different logo and color scheme.
Currently our robots.txt tells search engines not to look at any of the affiliate sites, out of a fear of it being called spamming. Are we overreacting, or could this be considered spamming?
The affiliates are showing up in Google now, but with just the URL as the page title and no description. My boss wants them to show up correctly, but I'd rather not be banned from the search engines, for obvious reasons, heh. We have found several other search engines that are apparently ignoring our robots.txt, but that's another issue.
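(For anyone setting up the same thing: robots.txt is read per hostname, so blocking the affiliates means each subdomain has to serve its own copy. Something like this on bob.widgets.com, joe.widgets.com, etc.:

```
# Served as bob.widgets.com/robots.txt — each affiliate
# subdomain needs its own copy, since robots.txt is per-host
User-agent: *
Disallow: /
```

That also explains the URL-only listings: Google sees the links but is forbidden from fetching the pages, so it indexes the bare URL with no title or description.)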
| 5:12 pm on Nov 24, 2003 (gmt 0)|
You could cloak and redirect googlebot to www.widgets.com.
I do that for my production server, so widgets.productionserver.com redirects to www.widgets.com only for googlebot, to prevent my production sites from getting listed.
But this is at your own risk, as it might be seen as cloaking.
Even better, redirect ALL visitors from bob.widget.com to www.widget.com, and use bob.widget.com to set the cookie/tag.
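A rough sketch of that redirect-and-tag idea, assuming Apache with mod_rewrite (the hostnames and the "affiliate" cookie name are just illustrative, not your actual setup):

```apache
# Hypothetical: send every visitor on an affiliate subdomain to the
# main site, recording which affiliate they came through in a cookie.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(bob|joe|jane)\.widget\.com$ [NC]
# %1 is the subdomain captured above; CO= sets the tracking cookie
RewriteRule ^(.*)$ http://www.widget.com/$1 [R=301,CO=affiliate:%1:.widget.com,L]
```

Since every visitor (including googlebot) gets the same redirect, there's no cloaking involved, and all the link weight ends up on www.widget.com.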
| 6:09 pm on Nov 24, 2003 (gmt 0)|
Unfortunately, www.widgets.com is our corporate site, which just tells about the company and has links to the affiliate pages buried three levels deep.
In addition, each of the affiliate pages might be slightly different in terms of content and layouts, enough to make using a single set of pages (your second suggestion) not workable.
As you can imagine, this is not the optimal way I would have chosen to do this, but I'm restricted to using the current affiliate method, which really hasn't changed since it was set up in 1996. We even have to keep legacy code around to handle affiliates who have not changed their links. It's a mess, but thanks for the suggestions!