Forum Moderators: open
To me it would make sense and save space in the database. I'd probably want to raise the limit along with PageRank, but I'd still have a limit. It would probably help stop very high PageRank sites that have only incidental text on them from appearing at the top of the listings.
A good example of this is the Google index page. It has to be one of the most-linked-to pages going and has a lovely PageRank of 10, so surely anything the spider could pick out of the page must be very relevant. Yet it doesn't show up for 'preferences'. Is this deliberate, or just a by-product of linking keywords to link text? Even then, the Preferences page links back to the index, which has a pretty high PageRank and lots of relevance to the term.
When a search for "Preferences" is submitted, the engine examines the text of every indexed page looking for matches, then ranks the candidates by relevance using a number of factors. Google.com should be listed somewhere in those search results, just not very high.
PageRank has a lot to do with the SERPs, but it is far from everything. A good, well-optimized site with lots of content about preferences will be ranked MUCH higher than Google.com, even with only a PR5.
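To make that concrete, here's a toy sketch of the idea being described: on-page relevance and a PageRank-style boost combined into one score. This is NOT Google's actual algorithm — the formula, the weights, and the example pages are all invented for illustration. The point is just that a link-based boost multiplying relevance can't rescue a page that barely mentions the term.

```python
def score(page_text: str, pagerank: int, query: str) -> float:
    """Toy ranking score: on-page relevance times a PageRank-style boost.

    Purely illustrative -- the real ranking factors are proprietary
    and far more numerous.
    """
    words = page_text.lower().split()
    if not words:
        return 0.0
    # On-page relevance: fraction of words matching the query term.
    relevance = words.count(query.lower()) / len(words)
    # PageRank acts as a multiplier, but zero relevance stays zero.
    return relevance * (1 + pagerank / 10)

# A sparse PR10 homepage vs. a content-rich PR5 page about preferences.
homepage = "images groups news preferences advanced search"
content_page = ("preferences guide how to set your preferences "
                "and manage preferences settings")

print(score(homepage, 10, "preferences"))      # high PR, thin relevance
print(score(content_page, 5, "preferences"))   # lower PR, strong relevance
```

Under this made-up formula the PR5 page outscores the PR10 homepage, which matches the behaviour described above: the homepage's handful of navigation words just isn't enough relevance for "preferences" to rank.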