Forum Moderators: Robert Charlton & goodroi


301 by Domain or pages?

Can I place a 301 for an entire domain and not suffer any Google issues?

marty98

9:18 pm on Oct 27, 2005 (gmt 0)

10+ Year Member



I have several small sites that I want to get out of Yahoo and Google and point them towards our main site. (Yahoo definitely penalizes our site, so we are hoping this ends whatever penalty we are suffering.) My webserver, WebSTAR, has a GUI for setting up the individual pages, which will take some time. Since the PageRank on the small sites is not important, can I just place a domain-wide 301 and point them to the corresponding section of our main site? Or should I place the 301 page by page, to a similar corresponding page? I am not interested in transferring any PR; I just want to avoid any 301 sandboxing or other issues. The main site is several years old and highly ranked.
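marty98's server is WebSTAR, but for anyone doing the same thing on Apache, a domain-wide 301 is a couple of lines of mod_rewrite. This is only a sketch; the hostnames and the /section/ path are placeholders, not taken from the thread:

```apache
# In the old site's VirtualHost (or .htaccess):
# send every request on the old domain, whatever the path,
# to the matching section of the main site with a permanent redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?oldsite\.example$ [NC]
RewriteRule ^(.*)$ http://www.mainsite.example/section/$1 [R=301,L]
```

Mapping page-by-page only matters if you want old deep links to land on equivalent content; for retiring a site entirely, one blanket rule per domain does the job.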

Vadim

1:42 am on Oct 29, 2005 (gmt 0)

10+ Year Member



Disclaimer: only Google knows how it works.

I believe the type of 301 redirection in your case is not important. Redirect in whatever way is convenient for your customers.

However, as a precaution against the duplicate content penalty, I would explicitly delete the redirected site from the Google index (Google describes how to do this on their site). I would also disallow robots from crawling the redirected site with robots.txt.
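For reference, a blanket robots.txt that blocks all compliant crawlers looks like this (note, as macdave points out below, that blocking crawling also stops Google from seeing the 301s):

```
User-agent: *
Disallow: /
```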

Vadim.

macdave

12:15 pm on Oct 29, 2005 (gmt 0)

10+ Year Member



Redirect in whatever way is convenient for your customers.

Agreed.

However, as a precaution against the duplicate content penalty, I would explicitly delete the redirected site from the Google index (Google describes how to do this on their site). I would also disallow robots from crawling the redirected site with robots.txt.

Using Google's Remove URL tool doesn't actually remove URLs from their index; it merely hides them for six months. After six months those URLs will come back into the index. If they don't have any backlinks, or if Google can't crawl them (because you've disallowed it in robots.txt, for example), they will become supplemental and they will never go away. If you've moved the content of the old site onto the new site, this is a good recipe for being hit hard by a duplicate content filter/penalty.

Vadim

1:01 am on Oct 30, 2005 (gmt 0)

10+ Year Member



Using Google's Remove URL tool doesn't actually remove URLs from their index, it merely hides them for six months. In six months those URLs will come back into the index.

Is that also true when you use "Disallow" in robots.txt? I doubt it. That is, Google may keep the URL in its internal index, but how can they punish you for content that the author has explicitly asked not to appear in search engines?

In any case, I assumed that after six months the old server would respond with either a 404 or, better, a 410. I believe Google is clever enough not to punish for duplicate content in such cases.
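On Apache, serving a 410 for a retired host is a single mod_alias directive. This is a sketch only; Vadim's WebSTAR setup would do this differently:

```apache
# Answer "410 Gone" for every path on the retired domain
# once the redirect period is over.
RedirectMatch gone ^/.*
```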

Vadim.

Ankhenaton

3:31 am on Oct 30, 2005 (gmt 0)



I've now set up, for each of my servers, a webserver on example.com that just 301s everything to www.example.com.

Although site:example.com showed the same number of pages as site:www.example.com, the two versions had different PageRanks. Quite annoying, since I never promoted example.com at all.

<VirtualHost 127.0.0.1:80>
    ServerName example.com
    # ServerAlias example.com www.example.com

    RewriteEngine On
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
</VirtualHost>

Hope that's that stuff sorted now ..