PR Dilution

Pages excluded with robots.txt


rover

9:01 pm on Jul 8, 2004 (gmt 0)

10+ Year Member



If I have a starting page with several internal links to html files in a specific directory, and I exclude pages in that specific directory with robots.txt, the PR for that starting page wouldn't get diluted would it?

I know that Google won't index the pages excluded by the robots.txt, but does that mean it also doesn't pass PR to pages that are excluded?

(I want to do this to make printer-friendly pages, which I don't want indexed, but I want to also make sure that it doesn't dilute the PR for the starting page if there are several links like that on one page).

troels nybo nielsen

10:40 am on Jul 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If page A links to page B, that link will, from a PageRank point of view, be regarded as a vote for the credibility of page B. This means that the PageRank of page B will be boosted without the PageRank of page A going down.

If page A links to both page B and page C then those two pages will have to share the boost in PageRank.

Pages that are not crawled by googlebot are not given a PageRank. They do not take a share of the PageRank boost given by pages that link to them. AFAIK.
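The sharing described above can be sketched with the published PageRank formula. This is a toy power-iteration over a hypothetical 3-page graph (page names are made up for illustration), showing that a page's vote is split evenly among its outlinks:

```python
# Toy PageRank by power iteration: PR(p) = (1-d)/N + d * sum(PR(q)/outlinks(q))
# over all pages q linking to p. Graph: A links to B and C; B and C link to A.
def pagerank(links, d=0.85, iters=100):
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        pr = {p: (1 - d) / n + d * sum(pr[q] / len(links[q])
                                       for q in pages if p in links[q])
              for p in pages}
    return pr

pr = pagerank({"A": ["B", "C"], "B": ["A"], "C": ["A"]})
# B and C each receive half of A's vote, so PR(B) == PR(C) < PR(A).
```

Because A's vote is split two ways, B and C end up with identical (and smaller) PageRank than A in this toy graph.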

doc_z

10:42 am on Jul 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I know that Google won't index the pages excluded by the robots.txt, but does that mean it also doesn't pass PR to pages that are excluded?

No, PR is passed to these pages. Therefore, PR is wasted, because these pages receive PR but don't pass it on.

troels nybo nielsen

10:49 am on Jul 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> PR is passed to these pages

Interesting. As my first post shows, I did not know that. A solution might be to let those pages link back to the front page.

doc_z

11:08 am on Jul 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A solution might be to let those pages link back to the front page.

Yes, I would do this. It is much better than excluding the pages with robots.txt. The PR decrease for the front page would be non-significant (in the first case). Of course, using JS is also a solution, but I would put a link on the printer-friendly pages.

doc_z

2:42 pm on Jul 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have to add this: As long as there is just one link (to the front page) on the printer-friendly pages, there is no decrease for the front page. More likely - for realistic link structures - there is even an increase of PR for that page. There is only a (normally non-significant) decrease of PR for the inner pages.
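This can be checked with the same kind of toy power-iteration sketch (a hypothetical 3-page site: front page F, inner page I, printer-friendly page P; in this simplified model a page with no outlinks simply absorbs PR rather than having it redistributed, which is one common way to model dangling pages):

```python
# Compare front-page PR when the printer page P is a dead end (its PR is
# wasted) versus when P links back to the front page F.
def pagerank(links, d=0.85, iters=100):
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        pr = {p: (1 - d) / n + d * sum(pr[q] / len(links[q])
                                       for q in pages
                                       if links[q] and p in links[q])
              for p in pages}
    return pr

dead_end = pagerank({"F": ["I", "P"], "I": ["F", "P"], "P": []})
link_back = pagerank({"F": ["I", "P"], "I": ["F", "P"], "P": ["F"]})
# The front page F ends up with more PR when P links back to it.
```

In this toy model the front page does come out ahead when the printer-friendly page links back, consistent with the point above; real-world behavior depends on how Google actually handles uncrawled and dangling pages, which isn't publicly specified.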

rover

8:09 pm on Jul 9, 2004 (gmt 0)

10+ Year Member



Thanks very much, I'll keep this in mind when I build these pages. I think I'll have just one link on the printer-friendly page back to the front page.