Seeking a single critical factor is counterproductive. In the foundational paper, The Anatomy of a Large-Scale Hypertextual Web Search Engine [www7.scu.edu.au], Page and Brin write:
Combining all of this information into a rank is difficult. We designed our ranking function so that no particular factor can have too much influence.
In the Google Ranking FAQ [google.com] they write:
Google's order of results is automatically determined by more than 100 factors, including our PageRank algorithm.
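For anyone curious how the PageRank piece of that ranking works on its own, here is a minimal Python sketch of the power-iteration formulation from the Brin/Page paper. The graph, damping factor, and iteration count are made-up example values, and this says nothing about how Google actually weights PageRank against the other 100+ factors.

def pagerank(links, damping=0.85, iterations=50):
    # links: dict mapping each page to the list of pages it links to
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start from a uniform distribution
    for _ in range(iterations):
        # every page gets the "random jump" share, then link shares are added
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if not targets:
                continue                        # dangling pages ignored for brevity
            share = rank[page] / len(targets)
            for target in targets:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# tiny three-page example
print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]}))

The point of the sketch is only that PR is one signal among many; the paper is explicit that the final rank combines it with link text, proximity, and other factors.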
Link text and on-page factors matter most; PR is the deciding factor when all other things are equal.
This is what I used to believe. With sj, fi, etc., it appears that the root-level index page is given a great deal of weight. This has bumped a good many commercial sites up over the informational pages within a site.
We shall see if this is true when the update is complete.
It is a lot like reading coffee grounds, just watching what is going on.
My impression is that freshness helps a lot: the freshness of the links pointing to my site, and how recently the page was last updated.
The rules seem to have changed now, but before, you could do well with freshbot alone, at least in the fairly sleepy, uncompetitive niches where I operate.
I would not concentrate too much on one or two keywords, though; people find my pages with keywords and phrases I would never dream of. I do find it worthwhile, however, to study the different spelling options in my language and to watch what Google is doing with them, and what other websites are doing with them.