Not passing page rank

How do I stop a page from getting PageRank?

         

Svengalie

2:09 pm on Oct 6, 2003 (gmt 0)

10+ Year Member



I would like to stop my links page from getting PageRank. Currently I have this page under a "Disallow" rule in my robots.txt file, yet the page has still received PageRank (I assume because Google can still see links pointing TO this page even though it can't crawl the page itself).

Any ideas?
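For reference, a Disallow rule like the one described looks like this in robots.txt (the exact path is a made-up example). Note that Disallow only blocks crawling of the page; as observed above, it does not stop the URL from accumulating PR from external links pointing at it:

```
User-agent: *
Disallow: /links.html
```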

kaled

1:22 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Just one more thought - it is probable (but not certain) that using the NOFOLLOW meta robots directive would have the same effect, preventing PR leaks, etc.

Does anyone have experience of this?

Kaled.

dillonstars

3:31 pm on Oct 7, 2003 (gmt 0)

10+ Year Member



I'll add my apologies for wrong assumptions made.

But this thread will still serve to help show new webmasters that hoarding links for PR reasons is a bad idea.

Macro

4:21 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It was fun to read, with everyone (except HughMungus) assuming bad intentions.

dirkz, no I didn't assume anything. I reserved judgement till I could get more info from Svengalie.

But I agree that it is annoying to have to hide "honest" links.

claus

4:57 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> <META NAME="robots" CONTENT="noindex">

That should not be noindex, rather it should be nofollow. Of course you could also use "noindex,nofollow" or "index,nofollow".
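For reference, the page-level alternative claus describes would sit in the page's head, keeping the page indexable while asking robots not to follow its links:

```html
<meta name="robots" content="index,nofollow">
```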

Here's the documentation for the tag:

[robotstxt.org...]

/claus

twilight47

5:58 pm on Oct 7, 2003 (gmt 0)

10+ Year Member



Everyone keeps talking about hoarding PR.
I believe whether a page has outbound links or not has little to do with how highly that page ranks.
Inbound links matter most in determining a page's PR, while the number of outbound links affects the amount of PR that is passed on (i.e. a page with 1 outbound link will pass more PR to that one target than it would pass to each of 100 targets).

I see no point in disguising a link page. IMHO
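The outbound-link arithmetic described above can be sketched numerically. This is a simplified model, not Google's actual algorithm: it assumes each link receives an equal share of d * PR / outlinks, where d is the damping factor commonly cited as 0.85:

```python
# Sketch of how outbound-link count affects the PR passed per link.
# Simplified model: a page passes d * PR / outlinks to each target,
# where d is the damping factor (commonly cited as 0.85).

def pr_passed_per_link(page_pr: float, outlinks: int, d: float = 0.85) -> float:
    """PR contributed to each page this page links to."""
    return d * page_pr / outlinks

# A PR-5 page with 1 outbound link vs. 100 outbound links:
one_link = pr_passed_per_link(5.0, 1)      # ~4.25 to the single target
many_links = pr_passed_per_link(5.0, 100)  # ~0.0425 to each of 100 targets
print(one_link, many_links)
```

The page's own rank is unchanged either way; only the share handed to each target shrinks as links are added.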

dirkz

6:04 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I see no point in disguising a link page. IMHO

Actually, by linking you lose a fraction of PR because of the distribution to other pages, but the amount is really negligible :)

On the other hand, you even gain if the page you link to links back to you, or a page linked from that page ... links back to you.

I wonder whether my link partners think the same ...

dirkz

6:07 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Does anyone have experience of this?

No experience, but NOFOLLOW means just that: don't follow the links. Logically, though, the links Googlebot sees on the page itself could still be counted.

If Googlebot sees a link on a page it spiders, it's too late :)

HughMungus

6:08 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I see no point in disguising a link page. IMHO

I think that the OP was trying to increase the value of some outbound links rather than others -- like in my example, wanting more PR to go to a friend's website instead of to a site that already has PLENTY of PR (e.g., Google, Yahoo, etc.).

hutcheson

6:13 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A bit of calculation, based on typical page probabilities (say, page rank 6=.000001) will indicate just how futile this kind of calculation is. You're sitting in the middle of the Somme battlefield, polishing your bayonet ring -- and you're not going to live long enough to use it anyway.

If all the pages on your site aren't thoroughly interlinked, you are inconveniencing visitors and spiders alike: page rank and site stickiness for protoplasmic visitors will both suffer.

If all the pages on your site ARE thoroughly interlinked, then adding more pages will do more for your page rank than any amount of twiddling with spider blocks.

HughMungus

6:16 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If all the pages on your site ARE thoroughly interlinked, then adding more pages will do more for your page rank than any amount of twiddling with spider blocks.

<newb question>So the size of a website determines how valuable its content is?</newb question>

dirkz

6:39 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



So the size of a website determines how valuable its content is?

Serious question?

HughMungus

6:54 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes. Someone said that you can increase your PR by having more pages on your site. I don't understand how that raw fact, alone, makes one website more relevant than another.

dirkz

7:04 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The average PR of every page on the web is 1.0 (see the "PageRank explained" paper).

So for every page that has more than 1.0, there must exist a page that has less than 1.0. The more pages you have, the higher your (potential) total share of PR. Every added page adds to the PR total, and if it's your page, YOU get that PR if you let it flow to your other pages.

In addition, if you have navigation on every page, every additional page means a complete set of anchor text for your other pages. And every page that gets PR can distribute PR (again via navigation to your pages).
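The "average PR is 1.0" observation can be illustrated with a toy power-iteration, using the common normalization where total PR equals the number of pages. The three-page graph and the 0.85 damping factor are illustrative assumptions, not anyone's real link data:

```python
# Toy power-iteration PageRank over a tiny link graph, normalized so the
# average PR per page is 1.0 (total PR equals the number of pages).

def pagerank(links: dict, d: float = 0.85, iters: int = 50) -> dict:
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 for p in pages}           # start every page at the average
    for _ in range(iters):
        new = {p: (1 - d) for p in pages}  # base share for every page
        for p, outs in links.items():
            if outs:                       # distribute this page's PR evenly
                share = d * pr[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:                          # dangling page: spread evenly
                for q in pages:
                    new[q] += d * pr[p] / n
        pr = new
    return pr

# Three pages: A and B link to C, C links back to A.
pr = pagerank({"A": ["C"], "B": ["C"], "C": ["A"]})
print(sum(pr.values()))  # total stays ~3.0, i.e. average ~1.0 per page
```

Each iteration preserves the total, so adding a page to the graph adds roughly one unit of PR to be shared out; pages nobody links to (like B here) sink toward the (1 - d) floor.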

Eltiti

8:09 pm on Oct 7, 2003 (gmt 0)

10+ Year Member



So HughMungus wonders if size matters... (Sorry, could not resist!)

Seriously, the *size* of a website (in pages) is not a "pure" measure of its *value*: I could take all the info on my site and put it on a single page, or I could put every sentence on a page of its own --and the "information value" of my site would not have changed. (Obviously, certain arrangements will do more to endear me to my visitors than others.)

However, if I keep my current site *and start adding new info pages*, that will certainly increase the site's value! (Well, let's pretend it does.)

Conceivably, webmaster X may come across my new pages and link to them, which will increase the total PR of my site.

Yes, some people believe there will be a PR increase even *without* the additional external links, but that effect is often believed to be smallish.

I wonder --if the web as a whole grows by N% during a certain period, would your site have to grow by N% pages as well to keep the same PR? (Ceteris paribus, of course.)

[edited by: Eltiti at 8:10 pm (utc) on Oct. 7, 2003]

kaled

8:10 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My thinking with respect to the NOFOLLOW meta robots directive is that a robot may simply switch off its link-scanning for the page. This would be the most efficient way of observing the directive, and in that case no PR leaks, etc. would occur through a page carrying it.

However, whilst I always strive to write efficient code, not everyone else does, so the NOFOLLOW directive may work with some robots but not all. It is also possible that a reasoned decision might be taken to behave differently.

Kaled.
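The "switch off link-scanning" idea above can be sketched as a small parser. This is purely illustrative of how a hypothetical bot might honor the directive; it says nothing about what Googlebot actually does:

```python
from html.parser import HTMLParser

class NofollowAwareParser(HTMLParser):
    """Collects hrefs, but stops once a robots meta tag says 'nofollow'."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.nofollow = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            directives = a.get("content", "").lower().replace(" ", "").split(",")
            if "nofollow" in directives:
                self.nofollow = True   # switch off link-scanning from here on
        elif tag == "a" and not self.nofollow and "href" in a:
            self.links.append(a["href"])

page = ('<html><head><meta name="robots" content="index,nofollow"></head>'
        '<body><a href="/partner.html">partner</a></body></html>')
p = NofollowAwareParser()
p.feed(page)
print(p.links)  # [] -- links after the directive are ignored
```

A bot built this way would spend no time on the page's links at all, which is the efficiency argument kaled makes; a bot that records links first and filters later might behave differently.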

HughMungus

9:11 pm on Oct 7, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, some people believe there will be a PR increase even *without* the additional external links, but that effect is often believed to be smallish.

Yeah, this is what I thought someone else was saying. Thanks for clarifying everybody!

This 46-message thread spans 2 pages.