Forum Moderators: martinibuster
I have run into a problem trying to implement this technique on a Zope site I am optimising, which uses the domain name dynamically (for a good reason) and must have the absolute path in the link URL. This, of course, makes it difficult to use the JavaScript technique.
I was wondering whether shutting down access to the pages I don't want indexed using robots.txt would be effective at achieving the aim of channeling a higher page rank to the optimised pages. Or does Google just count 10 links on the page and apportion the PR vote by 10 irrespective of whether robots.txt forbids it from following those links?
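For anyone following along, blocking crawler access per page or per directory in robots.txt looks like this (the paths here are made up for illustration, not from the site in question):

```
User-agent: *
Disallow: /disclaimer.html
Disallow: /print-versions/
```

Note this only asks compliant crawlers not to fetch those URLs; whether the links to them still dilute the PR vote is exactly the open question above.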
Are there other methods I can use to better channel the PR vote?
I'm afraid I don't subscribe to the 'using JavaScript to preserve PR' theory. Why would linking from one page to another reduce its value to the user? It won't; if anything it will do the opposite. I like pages that link :).
IMHO, I wouldn't worry about using JavaScript links. In regard to robots.txt, I would only ever exclude pages like a site disclaimer.
Chris.
Or does Google just count 10 links on the page and apportion the PR vote by 10 irrespective of whether robots.txt forbids it from following those links?
Yes. I remember Vitaplease saying that in another thread. I also remember the recent PubCon thread where Matt is supposed to have said that they (Google) do read JavaScript.
Also, as a side issue, what is the effect of using noindex,follow on the PR of the pages that are linked to from that page? I have found in the past that my sitemap pages show up in the SERPs and would like to stop them being indexed; however, I do want to retain the PR to the linked pages and also the value of the anchor text on the sitemap page. Any thoughts?
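For reference, the tag being discussed goes in the head of the sitemap page. This is just what the markup looks like, not a claim about how Google actually weights it:

```html
<meta name="robots" content="noindex,follow">
```

In principle this asks the engines to keep the page out of the index while still following (and passing credit through) the links on it, which is exactly Janet's question.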
TIA
Janet
JavaScript is good for adding links that you do not want the search engines to follow, but that you do want users to have access to.
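A sketch of the kind of link meant here: the visible anchor carries no crawlable destination, and navigation happens only in the onclick handler. The `jsLink` helper name and the path are made up for illustration; and as noted above, Google may read JavaScript, so this is no guarantee.

```javascript
// Build an anchor whose real destination lives only in the onclick
// handler. A crawler that ignores JavaScript sees only a dead "#" href,
// so (in theory) no PR flows through this link.
function jsLink(path, text) {
  return '<a href="#" onclick="window.location=\'' + path +
         '\'; return false;">' + text + '</a>';
}

// Example: a cross-theme link users can click but crawlers get no URL from.
console.log(jsLink('/reviews/widget-x.html', 'Widget X reviews'));
```

On a server-generated site you would emit this markup in place of a normal anchor wherever you want to break the crawlable connection between themes.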
The idea behind it is that you create an optimised, well thought out site with themed content and no cross-theme linking, strengthening the themes and the keyword structures inherent in them as they work down to the ROI pages. However, keeping the themes strictly separate can also remove connections that are quite obvious to humans, if they are only available via a crawlable link. Hence the case for using .js links: the SEs can't crawl them, but users can still cross between themes to things that seem connected to them, even if not in the purely logical sense in which the SEs address the internet.
If you're using client-side .js, you should have no problem adding .js links to whatever themed pages you would not wish the SEs to bridge across, which would otherwise lower, or more to the point generalise, the specialisation of each page within its theme.