
301 redirect from lots of pages

Will this transfer PR effectively

         

xlcus

3:30 pm on Feb 26, 2003 (gmt 0)

10+ Year Member



Let's say I have a site... http://fortune.example.com/ that lets people enter their name and gives them a fortune cookie based on their name. When they enter their name, the site redirects them to... http://fortune.example.com/TheirName/ which they can then bookmark for later.

Now let's say this site gets quite popular and linked to from various Weblogs, but most of them don't link to the root page, they link to /TheirName/

So now I have a site with hundreds of incoming links, but to pages like http://fortune.example.com/JohnSmith/ and http://fortune.example.com/MaryBrown/ and not to the root page where I'd like them to point.

Now, I don't want Google to index these pages, but I don't want to lose the PR from these links either, so I was thinking of serving a 301 redirect from these pages to just GoogleBot.

Will this transfer the PR ok? Even though there are lots of 301 redirects to the same page? Is there a penalty for more than a certain number of 301s to the same page? Would it be considered cloaking because only GoogleBot gets a redirect?
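The conditional redirect described above could be sketched as follows. This is purely illustrative (a minimal WSGI-style Python app with hypothetical names and URLs), and, as the post itself asks, serving GoogleBot a different response than users is technically cloaking:

```python
# Illustrative sketch only: branch on the User-Agent and 301 the crawler
# to the root page instead of generating the expensive per-name page.
# fortune_app and the URLs are hypothetical, not from the thread.

def fortune_app(environ, start_response):
    """Serve a per-name fortune page, but 301 GoogleBot to the root page."""
    user_agent = environ.get("HTTP_USER_AGENT", "")
    if "Googlebot" in user_agent:
        # Crawler: permanent redirect to the root page.
        start_response("301 Moved Permanently",
                       [("Location", "http://fortune.example.com/")])
        return [b""]
    # Normal visitor: generate the per-name page as usual.
    name = environ.get("PATH_INFO", "/").strip("/") or "stranger"
    body = ("<p>Your fortune, %s: good luck awaits.</p>" % name).encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/html")])
    return [body]
```

A real deployment would more likely do this check in the web server configuration, but the logic is the same: inspect the User-Agent header, return `301 Moved Permanently` with a `Location` header for the crawler, and the normal page for everyone else.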

daisho

3:40 pm on Feb 26, 2003 (gmt 0)




Why don't you want them linked to your sub pages? That is where your content is. Google likes that! Put a link on every one of your pages back to the home page. Also put a "Not Mary Smith? Enter your name here for your fortune:" box on the page for people who land on a sub page. Is it really that important that they see your home page?

xlcus

5:25 pm on Feb 26, 2003 (gmt 0)




Hmmm. Perhaps the example scenario I made up isn't as close to my actual problem as I thought. I'll try and stick with it though... :-)

Why don't you want them linked to your sub pages?

I want all the PR to transfer to the root page. The reason I don't want Google to index the sub pages is that they're all interlinked and as each one takes a lot of processing power to generate, when Google crawls the site it cripples the server.

A 301 redirect for GoogleBot would solve this problem. I'm just unsure what Google will do when it finds hundreds of pages all 301-redirected to the same page, and it's also technically cloaking.

daisho

5:42 pm on Feb 26, 2003 (gmt 0)




I can understand that. Something you should keep in mind, though, is that PR on your main page is not everything. As a matter of fact, it is becoming less important.

A proper solution that makes search engines happy and benefits you the most would be some type of caching. I went this way after a few crawls from Scooter (AltaVista), which seemed to have NO control. It basically DoS'd my site for weeks on end. I even got to the point of calling their tech department and talking with them to rein in their runaway Scooter.

Anyway, that prompted me to develop a cache, and many of my database-generated pages never touch the database if a simple check passes. It saved me TONS of CPU power and will let me get lots more out of my server.
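The check-the-cache-before-the-database approach described above can be sketched in a few lines of Python. This is a minimal illustration with assumed names and paths, not daisho's actual implementation:

```python
# Minimal disk cache sketch: consult the cache before doing the expensive
# page generation. CACHE_DIR, cached_page, and the md5-keyed filenames are
# all hypothetical, not taken from the thread.
import hashlib
import os

CACHE_DIR = "/tmp/page_cache"  # assumed cache location

def cached_page(url, generate):
    """Return the page for `url`, regenerating only on a cache miss."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    key = hashlib.md5(url.encode("utf-8")).hexdigest()
    path = os.path.join(CACHE_DIR, key)
    if os.path.exists(path):      # the "simple check": a cached copy exists
        with open(path) as f:
            return f.read()
    page = generate(url)          # expensive database work happens only here
    with open(path, "w") as f:
        f.write(page)
    return page
```

Every repeat request, whether from a visitor or a crawler, is then served from disk instead of hitting the database.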

You really do want Google to index the specific pages, since that is _REAL_ content and Google loves content. You will start to get search results you never thought of. It's much more beneficial than putting all your eggs in one basket (the home page).

Personally, I would rather have many pages with a PR of 5 than one page with a PR of 7. I would still like a PR of 7, but I would not trade it if all my other pages dropped to, say, 3.

Also, you may hurt yourself if every URL 301s to your home page for Google, since Google will then see that your site has only one page. How good could a one-page site be?

Not sure if this helps at all, but in the long run you are much better off figuring out how to get Google to index ALL the text you have in the database.

mbennie

6:01 pm on Feb 26, 2003 (gmt 0)




Why not just place a link on each of the users' pages back to your home page? This will transfer some of the PageRank and won't cause any problems with cloaking.

xlcus

6:14 pm on Feb 26, 2003 (gmt 0)




A proper solution that makes search engines happy and benefits you the most would be some type of caching.

Yeah that would be a good solution, but the problem is that there are effectively an infinite number of these sub pages, so I can't really cache them. :(

xlcus

6:18 pm on Feb 26, 2003 (gmt 0)




Why not just place a link on each of the users' pages back to your home page? This will transfer some of the PageRank and won't cause any problems with cloaking.

Yeah, I would like to do this, but I've really got to stop Google indexing the sub pages as it puts such a strain on the server.

I just wondered if anybody knew the effects of lots of 301 redirects to a single page.

daisho

6:22 pm on Feb 26, 2003 (gmt 0)




Have the cache expire after a while. I think my cache right now is about 3 gigs and growing. Not sure how much space you have.
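The expiry suggested above can be as simple as comparing a cache file's modification time against a time-to-live. A sketch with assumed names, continuing the disk-cache idea from earlier in the thread:

```python
# Hypothetical freshness check for a disk cache entry: an entry counts as
# stale once it is older than max_age seconds. Stale or missing entries
# would trigger regeneration; old files still need a separate cleanup job
# to actually reclaim disk space.
import os
import time

MAX_AGE = 24 * 3600  # assumed TTL of one day

def is_fresh(path, max_age=MAX_AGE, now=None):
    """True if the cache file at `path` exists and is younger than max_age."""
    if not os.path.exists(path):
        return False
    current = time.time() if now is None else now
    return (current - os.path.getmtime(path)) < max_age
```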

Cloaking is a bad idea and could get you banned. Are you finding that Google is killing your site right now? How many backlinks do you have?

Are you sure it's GoogleBot and not maybe Scooter? As I said, no matter what I did, Scooter never slowed down. Even with my caching, Scooter just maxed out my bandwidth. I used iptables to limit them to 10k on my site and I've been happy ever since.