I just realized that all the pages of my website have the same number of internal links (which is VERY wrong, I think), because for 90% of them I don't want them to rank, as their keywords are not searched for.
I want to give juice to only a few pages, and I guess I need to sculpt my PR.
Is robots.txt or nofollow the way to do it?
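For anyone unfamiliar with the two mechanisms being asked about, they look like this (the paths and filenames below are just made-up examples):

```
# robots.txt — placed at the site root; blocks crawling of whole paths.
# /low-priority/ is an example directory, not a recommendation.
User-agent: *
Disallow: /low-priority/
```

```html
<!-- rel="nofollow" on an individual link. Note that since Google's 2009
     change, the PR such a link would have carried is simply lost rather
     than redistributed to the followed links on the page. -->
<a href="/low-priority-page.html" rel="nofollow">Low-priority page</a>
```

As the replies below explain, neither is a good fit for "sculpting" PR.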
4:50 pm on Dec 20, 2011 (gmt 0)
I don't think it is "very wrong" - I think it's pretty common for a smallish site with good internal navigation. It's natural and not a problem for Google.
I would not even start trying to sculpt PR. First, Google has changed the way nofollow works, so that approach is futile. Second, if you don't allow crawling for a lot of pages (robots.txt), you're likely to break all kinds of important circulation for your link juice. Remember that PR calculations iterate over and over, they do not stop after just one "hop".
Most people who have tried to "sculpt PR" have ended up hurting their rankings. The very few who've had small success (and the potential gain is small) find that it isn't worth the intense investment of resources. This is partially because Google has changed the way PR is calculated many times compared to the initial patent - and we don't even know the details any more.
for 90% of them I don't want them to rank, as their keywords are not searched for.
If the keywords are not requested, then it doesn't matter if they "rank" or not. Google watches search impressions for the page. If there are no searches, then there are no impressions.
I'd say you should relax about all this. Make sure your pages are useful for your visitors and the internal navigation is logical and easy to use.
11:43 pm on Dec 21, 2011 (gmt 0)
By the way, by sculpting your PR you're only risking hurting your rankings and telling Google that you're trying to manipulate them. That can be dangerous, so focus on the new Panda updates more than on PR.
12:57 am on Dec 22, 2011 (gmt 0)
There are much better, more sophisticated ways to help "guide" the flow of page rank through your site than trying to sculpt it with noindex or with robots.txt.
This is established by how the internal links are set up, where they appear in the page, what anchor text is used, how relevant the material is, and so on.
However, done incorrectly, one can incur an Over Optimization Penalty (commonly referred to on these boards as an OOP).
It usually takes a good deal of messing up a site to get such a penalty, but some sites might be "on the edge" of getting a penalty without realizing it, so a simple mistake might push it over the edge.
(This is based on the assumption that Over Optimization Penalties are cut and dried - you are either penalized or you aren't. As far as I understand, in terms of over optimization, there isn't any "kind of penalized" state. Further, this comes from me seeking out and asking DOZENS of people who have incurred an over optimization penalty - I haven't incurred one myself, touch wood.)
5:01 am on Dec 22, 2011 (gmt 0)
It usually takes a good deal of messing up a site to get such a penalty
From what I've seen, the most common way to get into OOP territory is to over-think what "SEO" is and then try to use every approach and tool you've ever heard about.
Most SEO advice on the web is several years out of date and it often leaves out a good bit of background context. Even worse, a lot of advice is being passed on second- and third-hand, with distortions added at every step.
8:55 am on Dec 22, 2011 (gmt 0)
From the pages you consider less important, link out to the more important pages from the first paragraph, or as early as possible. This will pass juice to the more important pages and might make them more valuable.
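As a sketch of that suggestion (the URL and anchor text below are invented for illustration): put a plain, followed contextual link high in the body copy of the less important page, pointing at the page you do want to rank.

```html
<!-- On a less important page: an early in-content link to a priority page.
     No nofollow, no tricks - just a normal, relevant editorial link. -->
<p>Before reading on, you may want our full
  <a href="/widget-buying-guide.html">widget buying guide</a>,
  which covers this topic in much more depth.</p>
```

The point is placement and relevance, not markup: links early in the main content, with natural anchor text, are the "sophisticated" way to guide flow that the earlier reply describes.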