Optimising internal links for higher PR

How to channel PR to the pages that really need it


Janet

6:20 am on May 13, 2003 (gmt 0)

10+ Year Member



I recently read an article from a Page Rank expert about "closing off" certain links from Google by using JavaScript links (e.g. onclick="window.location='page.htm'"). This allows the distribution of PR votes to be focused on the pages that have been set aside for optimising (which are reached via standard HTML links).
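To illustrate, this is roughly the pattern the article described (the page names here are just placeholders, not our real ones):

<!-- standard HTML link: crawlable, passes a PR vote -->
<a href="products.htm">Products</a>

<!-- JavaScript link: works for users, but the crawler won't follow it -->
<a href="#" onclick="window.location='disclaimer.htm'; return false;">Disclaimer</a>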

I have run into a problem trying to implement this technique on a Zope site I am optimising, which uses the domain name dynamically (for a good reason) and must have the absolute path in the link URL. This of course makes it difficult to use the JavaScript technique.
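The only workaround I can think of (untested, just a sketch) is to build the absolute URL in the browser at click time, so the dynamic domain comes from the page itself, though I am not sure it fits our Zope setup:

<a href="#" onclick="window.location = location.protocol + '//' + location.host + '/some-page.htm'; return false;">Some page</a>

(/some-page.htm is a made-up path.)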

I was wondering whether shutting off access to the pages I don't want indexed via robots.txt would be effective at channelling a higher Page Rank to the optimised pages. Or does Google just count 10 links on the page and apportion the PR vote among all 10, irrespective of whether robots.txt forbids it from following those links?
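For reference, this is the kind of robots.txt exclusion I mean (the paths are made up):

User-agent: Googlebot
Disallow: /sitemap.htm
Disallow: /printable/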

Are there other methods I can use to better channel the PR vote?

chris_f

8:25 am on May 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi Janet,

I'm afraid I don't subscribe to the 'using JavaScript to preserve PR' theory. Why would linking from a page to another page reduce its value to the user? It won't; if anything it will do the opposite. I like pages that link :).

IMHO, I wouldn't worry about using JavaScript links. In regards to robots.txt, I would only ever exclude pages like a site disclaimer.

Chris.

mil2k

8:43 am on May 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Or does Google just count 10 links on the page and apportion the PR vote among all 10, irrespective of whether robots.txt forbids it from following those links?"

Yes. I remember Vitaplease saying that in another thread. Also remember the recent PubCon thread where Matt is supposed to have said that they (Google) do read JavaScript.

Janet

11:30 pm on May 13, 2003 (gmt 0)

10+ Year Member



Thanks for the lead on the Vitaplease thread, mil2k. The discussion in that thread dealt with the robots META tag. I would guess that Google would have to follow the link to the page first in order to read the tag, so it would still have to count it as a link. Does anybody know whether excluding Google with robots.txt makes any difference in how it sees the link, i.e. whether it counts it in its PR vote for subpages?

Also, as a side issue, what is the effect of using noindex/follow on the PR of the pages that are linked to from that page? I have found in the past that my sitemap pages show up in the SERPs and would like to stop them being indexed; however, I do want to retain the PR passed to the linked pages and also the value of the anchor text on the sitemap page. Any thoughts?
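i.e. something like this in the head of the sitemap page, if I understand the tag correctly:

<meta name="robots" content="noindex,follow">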

TIA

Janet

caine

11:40 pm on May 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Janet,

JavaScript is good for adding links that you do not want the search engines to follow, but that you do want users to have access to.

The idea behind it is that you create an optimised, well thought out site, with themed content and no cross-theme linking, which strengthens the themes and the keyword structures inherent in them as they work down to the ROI pages. However, strictly separated themes can also cut off connections that are quite obvious to humans and would otherwise sit on the page as a crawlable link. Hence the case for .js links: the SEs can't crawl them, but users can still cross between themes that seem connected to them, even if not in the purely logical sense in which the SEs address the internet.

If you're using client-side .js, you should have no problem implementing .js links on whatever themed pages you would not wish the SEs to bridge across, which would otherwise lower, or more to the point generalise, the specialisation of that page to its theme.
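A rough sketch of what I mean (the function name and page are made up); putting the helper in an external .js file keeps the target URL out of the crawlable HTML altogether:

// crosslinks.js -- simple redirect helper
function xlink(url) { window.location = url; }

<!-- in the themed page -->
<script type="text/javascript" src="crosslinks.js"></script>
<a href="#" onclick="xlink('other-theme.htm'); return false;">Related widgets</a>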