Webmasters really have no way of knowing for SURE whether PR is being passed, even with a supposedly controlled experiment like this. The PR number shown in the Google toolbar is just an approximation on a 1 - 10 scale, not the actual PR value. Even if the sub-page doesn't show up as a backlink to the home page, that doesn't mean it isn't passing PR (or won't once it is indexed and PR is recalculated).
How long ago did you add this page, and what are the PR values of your home page and sub page?
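As a loose illustration of why the toolbar number hides so much: it is widely assumed among webmasters (never confirmed by Google) that the toolbar squeezes the real PR value onto its 0 - 10 scale roughly logarithmically. This sketch is pure speculation; the base used is a guess for illustration only.

```python
import math

# Purely speculative sketch: the toolbar is widely believed to map raw PR
# onto its 0-10 scale roughly logarithmically. The base (8 here) is a
# guess for illustration, not a known constant.
def toolbar_pr(raw_pr, base=8.0):
    if raw_pr <= 0:
        return 0
    return min(10, max(0, int(math.log(raw_pr, base))))

# Two pages with very different "real" PR can land in one toolbar bucket:
# with this base, toolbar_pr(10) and toolbar_pr(60) both come out as 1.
```

If anything like this is right, it would explain why small changes in real PR never show up in the toolbar.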
I found a way to look up my page rank(s) without the Google Toolbar. Very grateful for that. What I would _really_ like to see is the same results, but to 3 digits of accuracy, as in 6.19 instead of just "6".
Then at least, I could see if small changes are helpful or detrimental. I suppose that's why we don't get to see the decimals.
As an experiment, I did a keyword count on my main page (the only one I'm really concerned with for Google SERPs / placement). I found my single 3-letter keyword in 12 places, something like 6% of all the words. (The page is dominated by a .gif image.) I reduced that count to 10, and 2 days later my page went up from #29 to #25 for that particular KW.
This is no scientific test, just an observation.
Maybe I should try 8 after a few days and see if that is any better. Is keyword count a significant factor in the ranking algorithm? Best - Larry
Yes, but that is speculation on my part.
My keyword is "UFO". My main entry page (/index.html) had that word about 12 times, maybe 6% keyword density. These are words visible to the visitor, including page title, <H1> header etc., but not META tags.
After reading these forums, I began to suspect that this might incur a slight "penalty", meaning lower placement in the rankings for that simple KW. I think the word 'penalty' is misleading in cases like mine. In effect it's a penalty, but no red lights flash. You just sink lower in the listings due to some automatic algorithm we'd all like to understand.
As an experiment, I cut back the KW from 12 to 10 occurrences. Soon after that, I went from #29 or so, bottom of the 3rd page of results when searching for "UFO", up to #25, i.e. halfway up the same 3rd page.
If this has any merit, a 5% keyword density is better than 6% in my particular case.
Hope this helps - Larry
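The density figure Larry describes can be sketched as a simple count: occurrences of one keyword divided by the total visible word count. The word-splitting rule below is an assumption for illustration; how Google actually tokenizes a page is unknown.

```python
import re

# Sketch of keyword density as described above: hits / total visible words.
# The regex tokenizer is an assumption, not Google's actual behavior.
def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0, 0, 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits, len(words), 100.0 * hits / len(words)

# Illustrative numbers only (not Larry's real page): 12 hits in a
# 200-word page is 6% density; cutting two occurrences makes it 5%.
```

Running it on a made-up 200-word page with 12 keyword hits returns `(12, 200, 6.0)`.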
G now indexes this new page but I don't think G is passing the PR back to the home page. Does G not like this kind of site structure?
No, it should be fine. You probably just need to wait a while.
I then switched to the test where the home page links to a page and that page just links back to the home page. This seems to be the ideal scenario for focusing most of the PR to the home page.
Yes, for PR this is one of the best linking structures [webmasterworld.com], but that doesn't mean it is also the best for the SERPs.
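A toy power-iteration PageRank makes it easy to see why the loop-back structure keeps PR at home. The three-page graph, node numbering, and damping factor 0.85 below are assumptions for illustration, not a model of any real site or of Google's actual implementation.

```python
# Toy PageRank by power iteration (assumed damping factor d = 0.85).
# links maps each node to the list of nodes it links to.
def pagerank(links, n, d=0.85, iters=200):
    pr = [1.0 / n] * n
    for _ in range(iters):
        new = [(1.0 - d) / n] * n
        for src, outs in links.items():
            share = d * pr[src] / len(outs)
            for dst in outs:
                new[dst] += share
        pr = new
    return pr

# Node 0 = home, 1 = sub page, 2 = an external page linking to home.
loop_back = pagerank({0: [1], 1: [0], 2: [0]}, 3)  # sub links back home
link_away = pagerank({0: [1], 1: [2], 2: [0]}, 3)  # sub links elsewhere
```

With these made-up numbers, the home page's share comes out around 0.49 when the sub page links back, versus about 0.33 when it links away instead, so the loop does concentrate PR on the home page.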
My rank in the SERPs decreased after the test, which tells me that PR is not making it back to my home page, when before at least some of it was.
You cannot draw this conclusion from your experiment for the following reasons:
- You changed not only the PR of your home page but also other off-page factors.
- You cannot compare different situations across time. The parameters that influence your PR may change, such as the PR of the pages linking to you or the number of links on those pages. The SERPs also shift as other pages change and as the ranking algorithm itself changes. A valid experiment would compare different situations (linking structures) at the same time.
Yes, patience is a good thing. Sometimes I forget how amazing it is that G can process and update the amount of data it does and dish it up in less than a second, when it takes my computer a few minutes just to search one drive for a word...