Forum Moderators: Robert Charlton & goodroi


Fixing multiple domain problems that resulted in penalty

         

covercopy

4:23 pm on Sep 23, 2008 (gmt 0)

10+ Year Member



Hi there. I have 4 URLs that point to one site. We didn't do this to game the system, but rather to economically address the requirements of four semi-independent companies. Anyway, despite trying to play fair by assigning product pages to specific URLs and nofollowing other pages (to avoid duplicate content), Google has applied a penalty. We plan to 301 redirect three of the domains to our one primary domain, but it will take us some time to sort out internal branding issues before we can do that. In the meantime, would it be advisable to add "noindex, nofollow" meta tags to every page of the domains we intend to redirect? Should I also remove the sitemaps for those domains and then request reconsideration for our primary domain? Thanks in advance.
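For reference, the blanket tag I have in mind would go in the head of every page on the secondary domains — something like this (a sketch, not our actual markup):

```html
<head>
  <title>Example page on a secondary domain</title>
  <!-- keep this page out of the index and tell bots not to follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```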

ecmedia

2:29 pm on Sep 24, 2008 (gmt 0)

10+ Year Member



I am not convinced that merely pointing 4 URLs at one site could be the cause of this penalty, if there is one. Getting links from four websites is very easy, even for a very ordinary website.

I think you should spend some more time finding out what else you might have done that looks fishy, and confirming that you actually have a penalty. Is your website completely deindexed, or has your traffic gone to zero? How much traffic were you getting before? How much now? Remember that traffic fluctuations are very common. A site receiving just 100 visitors daily might drop to just 10 visitors daily without any penalty at all, simply because the SERPs change for one or two keywords. Yes, that is a 90% drop in traffic, but it is more likely to be ordinary fluctuation. On the other hand, if a site with 10,000 visitors daily loses 90% of its traffic, then it is probably a penalty.

So don't change anything for the time being, and spend some time finding out what really happened.

netmeg

2:34 pm on Sep 24, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have five domains pointing to one of my sites. I only want Google to know about one of them, so I use PHP to deliver a different robots.txt depending on which url the bots come in on. It's not perfect (I still have a couple snippetless urls from the other domains in the index), but I don't have any penalties as far as I can tell, and thousands of urls aren't being duplicated on every domain. YMMV.

So my robots.txt file looks like this:

<?php
// Extract the base domain (last two host parts, e.g. "domain1.com")
$h = explode('.', $_SERVER['HTTP_HOST']);
$x = count($h);
$robot_host = $h[$x-2] . '.' . $h[$x-1];

$file = $_SERVER['DOCUMENT_ROOT'] . '/robots.' . $robot_host . '.txt';
if (file_exists($file)) {
    // Serve the matching per-domain robots file as plain text
    header("Content-Type: text/plain");
    echo '# robots.' . $robot_host . '.txt' . "\n";
    echo '# host ' . $_SERVER['SERVER_NAME'] . "\n";
    readfile($file);
} else {
    header("HTTP/1.1 404 Not Found");
    echo '<html><title>File Not Found</title><body><h4>File Not Found</h4></body></html>';
    exit;
}
?>

and I have files like this:

robots.domain1.com.txt (contains my actual robots.txt instructions for the domain I want indexed)

robots.domain2.com.txt (contains a disallow * for every user agent)

And so on.
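So the blocking file for a secondary domain, e.g. robots.domain2.com.txt, is just:

```
User-agent: *
Disallow: /
```

while robots.domain1.com.txt holds the normal rules for the one domain I want indexed.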

If someone knows a better way, I'm definitely up for hearing it; I have quite a few clients with multiple domains that we're always trying to get sorted out.

covercopy

5:53 pm on Sep 24, 2008 (gmt 0)

10+ Year Member



I apologize--I don't think I explained the problem correctly. I'm not redirecting each URL; I'm using URL-awareness to serve up custom content depending on the URL through which you entered. But the sites contain similar content and share resources, so Google views this as a domain farm.

The sites haven't been de-indexed, but by searching for specific phrases and page titles I can tell that Google is imposing -150, -250, and -950 penalties, depending on the specific pages. It has been this way for nearly a year.

I fully intend to 301 redirect three of the domains to our primary domain, but this will take some time. I'm wondering if using "noindex, nofollow" on every page of the three lesser domains (effectively removing them from Google's index and eliminating cross-linking) will be enough to get Google to lift our penalty.
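When we do get to the 301s, since all four domains share one docroot, my understanding is it comes down to a rewrite in the shared .htaccess. Assuming Apache with mod_rewrite, and with hypothetical domain names, something like:

```apache
RewriteEngine On
# Send any request arriving on a secondary hostname to the primary domain,
# preserving the path, with a permanent (301) redirect
RewriteCond %{HTTP_HOST} !^(www\.)?primarydomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.primarydomain.com/$1 [R=301,L]
```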

tedster

6:13 pm on Sep 24, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sounds like a very complex system, with many potential points of failure for googlebot. Yes, noindex might provide a temporary band-aid to some degree - but clearly, the best answer is getting it all sorted.

andrewsclothing

9:52 pm on Sep 24, 2008 (gmt 0)

10+ Year Member



Perhaps you need three different pages, one for each of the redirects, considering you are displaying different content depending on which domain visitors come from?
Or is this spreading the juice too thinly?

covercopy

1:22 pm on Sep 25, 2008 (gmt 0)

10+ Year Member



That's a good suggestion, and we do plan on doing that, similar to what IBM does with www.lotus.com. If anyone out there has specific experience with this, I'd love to hear whatever advice you might have. My problem is that I can't 301 redirect right away.