that makes the value of a link at the top of the page the same as one at the bottom of the page.
Does Google account for the geography of a link when computing PR?
101k page limit - This is only the amount that Google will cache; they process and index the whole page.
Content vs. Navigation page - There is absolutely no reason to think that Google is unable to tell the difference between many different types of common pages. Directory or links type pages are fairly easy to spot. The same goes for internal navigation pages, content pages, forums, archives, guestbooks, blogs, etc.
I have not seen anything conclusive either way on whether Google considers the position of a link when valuing it. I do know that if you want to make sure you get traffic to a page, you should put that link above the fold, but I don't think PR is the place for Google to consider that.
I bet that Google has considered link placement, and tested it to see how it would influence the results. And even if it improved the results slightly, was it worth the processing required? If it increases link processing time by 50% yet only yields a 3% improvement, you can bet they will keep it in mind, but look elsewhere for now.
I'm simply saying, I doubt Google gives more weight to an external link 'higher' on a page than a link 'lower' on a page.
But I think it's fair to say (from usability testing reports I've seen) that larger links and links 'above the fold' will be clicked on more often, as people tend not to scroll if they find the information they are looking for immediately (exactly how you should write web pages, IMHO).
Plus, a search engine could detect the size of the link by parsing css information. This would give another indicator of how 'important' the link was to the author.
I certainly wouldn't dismiss the idea.
It is entirely likely that Google or other engines would add in a weighting based on this kind of information to improve their search results.
and as for the alphabetical link list, if all links were formatted the same and in an obvious list, sure they would all be treated the same
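As a toy illustration of the CSS-parsing idea above (the function names, regexes, and 12px baseline are all my own assumptions, not anything any engine is known to do), a crawler could map CSS classes to their declared font sizes and weight links accordingly:

```python
# Hypothetical sketch: gauge how 'important' a link is to the page
# author by the font size its CSS class declares. Purely illustrative.
import re

CSS_RULE = re.compile(r'([^{}]+)\{([^}]*)\}')
FONT_SIZE = re.compile(r'font-size\s*:\s*(\d+)px')

def class_sizes(css: str) -> dict:
    """Map class selectors like '.featured' to their declared px size."""
    sizes = {}
    for selector, body in CSS_RULE.findall(css):
        m = FONT_SIZE.search(body)
        if m:
            for sel in selector.split(','):
                sel = sel.strip()
                if sel.startswith('.'):
                    sizes[sel[1:]] = int(m.group(1))
    return sizes

def link_weight(link_class: str, sizes: dict, base: int = 12) -> float:
    """Weight a link by its declared size relative to a 12px baseline."""
    return sizes.get(link_class, base) / base

css = ".featured { font-size: 24px } .footer { font-size: 9px }"
sizes = class_sizes(css)
print(link_weight("featured", sizes))  # 2.0 -- twice the baseline
print(link_weight("footer", sizes))    # 0.75
```

A real stylesheet (inheritance, relative units, inline styles) would need far more than two regexes, which is part of the cost/benefit point made in this thread.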
But I think it's fair to say (from usability testing reports I've seen) that larger links and links 'above the fold' will be clicked on more often, as people tend not to scroll if they find the information they are looking for immediately (exactly how you should write web pages, IMHO).
Yes, of course it's fair to say that. It is also fair to say the links at the bottom probably don't get as many clicks. Not sure what either statement has to do with the topic though?
and as for the alphabetical link list, if all links were formatted the same and in an obvious list, sure they would all be treated the same
That would be about 90%+ of link pages.
Plus, a search engine could detect the size of the link by parsing css information. This would give another indicator of how 'important' the link was to the author.
Yes, perhaps, but I doubt that Google would bother. What is "important" to the owner may not be "important" to a searcher.
Dave
[edited by: Marcia at 1:42 pm (utc) on Oct. 6, 2003]
But I think it's fair to say (from usability testing reports I've seen) that larger links and links 'above the fold' will be clicked on more often, as people tend not to scroll if they find the information they are looking for immediately (exactly how you should write web pages, IMHO).
But where is the fold? The fold is defined on the client end: window size, choice of browser, monitor size and settings all make a big difference. When my mom is on Google she doesn't get to see any results without scrolling.
Plus, a search engine could detect the size of the link by parsing css information.
A perfect example for my previous post about something that would fail a cost/benefit analysis. It could give you some useful info, but would require huge amounts of processing. Not worth it.
That would be about 90%+ of link pages.
No, probably more like 60% of links pages or less are alphabetical. Mine are, because they are large and database generated. Most hand-generated ones are not.
And I think everyone else's point has been that links pages are a special-case minority that is easily identifiable as such. My site has 6000 pages. Approximately 4000 of those have outbound links. 4 of those pages are links pages. Google has to take into account the 1500 other pages for each of my link pages.
Not sure what either statement has to do with the topic though
Title of thread = "importance of location of link on page"
I've presented two examples of where using common usability observations can help search engines weigh link importance and improve their search results:
1) location of link on page (above the fold)
2) formatted size of the link
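For factor 1, a deliberately naive sketch (my own invention, and subject to the source-order-vs-visual-position objection raised elsewhere in the thread) would be to score each link by how early it appears in the HTML:

```python
# Hypothetical sketch: score links by source order as a crude proxy
# for 'above the fold'. Table/CSS layouts can put visually prominent
# links late in the markup, so this is a rough indicator at best.
from html.parser import HTMLParser

class LinkOrder(HTMLParser):
    """Collect href values in the order they appear in the source."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            href = dict(attrs).get('href')
            if href:
                self.hrefs.append(href)

def position_scores(html: str) -> dict:
    """First link scores 1.0, later links progressively less."""
    parser = LinkOrder()
    parser.feed(html)
    n = len(parser.hrefs)
    return {href: (n - i) / n for i, href in enumerate(parser.hrefs)}

page = '<a href="/top">Top</a><p>text</p><a href="/mid">Mid</a><a href="/foot">Foot</a>'
print(position_scores(page))  # /top -> 1.0, /mid -> ~0.67, /foot -> ~0.33
```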
That would be about 90%+ of link pages.
My point exactly. If it's a link list, don't weigh the A's higher than the rest of the list just because they're above the fold... an exception just for link lists.
Yes, perhaps, but I doubt that Google would bother.
I know why they would bother: to keep state-of-the-art SERPs and stay ahead of their competition.
There are several threads on WW that have reported Googlebot spidering both css and js files.
I'm sure Google or any other search engine would make the decision to actually include this kind of link analysis during beta testing, the same as when they introduce any new algo element.
Calculating the location of the link based on its position in the HTML is seriously flawed (and I don't need a PhD to figure that one out) given the complex layouts of sites.
Title of thread = "importance of location of link on page"
That is taking the thread out of context, as it goes on to read "it would make sense, but does Google actually take it into account?" with the part you *omitted* being the question. What you stated previously was simply the obvious, i.e. high links get more human clicks.
Sorry guys, I see nothing that would give good reason to believe that Google adds more weight to links based on their placement. Sure there is no doubt an exception to the rule, but that's about all.
Even Google sorts its directory (by default) by PageRank. If they were then to use 'link placement' as an importance factor on other directories and link pages, it would be a contradiction.
Dave
Calculating the location of the link based on its position in the HTML is seriously flawed (and I don't need a PhD to figure that one out) given the complex layouts of sites.
Calculating the visual location of a generic object (a word, a picture, etc.) is just what any web browser does.
Some layout engines are even open source (Gecko).
I don't think that Google uses this technology for now, but I think they are able to implement it.
Calculating the visual location of a generic object (a word, a picture, etc.) is just what any web browser does.
Yes it is, and how long does your browser take to render an average page? Now multiply that by 5,000,000,000. Then consider that they would also have to crawl every image, CSS and JS file that they currently ignore. Oh yeah, they also need to do a bunch of additional computations on every element on the page and add that information to the index.
The way I figure it, you are probably looking at around 20 million+ additional CPU hours each month not counting the additional crawling involved. And you can forget about getting your fresh content in there as fast as it is now.
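A quick back-of-envelope check of that figure (the 14 seconds per page is my assumed number, chosen to cover fetching CSS/JS plus layout and analysis; the post itself gives no per-page time):

```python
# Rough sanity check of the '20 million+ CPU hours' estimate above.
pages = 5_000_000_000      # index size quoted in the thread
seconds_per_page = 14      # ASSUMED: render + extra crawling + analysis
cpu_hours = pages * seconds_per_page / 3600
print(f"{cpu_hours / 1e6:.1f} million CPU hours per crawl cycle")  # 19.4
```

So even a modest per-page cost lands in the tens of millions of CPU hours, which is roughly the order of magnitude the post is pointing at.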
They already have a hacked up browser to look for hidden text, and they only run it on spam reports. I'm sure they would love to be able to run it on all the pages, but they simply aren't going to.
Coincidentally, I became suspicious about ten days ago that Google might be doing this. I've even considered running some tests.
Kaled.
Now multiply that by 5,000,000,000.
BigDave, I agree with you. As I said in a previous message, "Google does not have the technology to render billions of pages".
My point was that it could make sense if in the future they are able to do it.
Also, I think that they don't really need to render a page, but they could just calculate the coordinates of links. Visual rendering is probably necessary for hidden text detection, not for calculating coordinates.
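A minimal sketch of that coordinates-without-rendering idea (entirely hypothetical; the fixed 20px per block element is exactly the kind of assumption that real layouts would break):

```python
# Hypothetical: estimate each link's vertical position by counting the
# block-level elements before it, with a fixed assumed height per
# block, instead of doing a full visual render.
from html.parser import HTMLParser

BLOCK_TAGS = {'p', 'div', 'table', 'tr', 'h1', 'h2', 'h3', 'ul', 'li', 'br'}
BLOCK_HEIGHT_PX = 20  # assumed average height of one block element

class LinkY(HTMLParser):
    def __init__(self):
        super().__init__()
        self.y = 0          # running vertical-offset estimate
        self.links = []     # (href, estimated_y) pairs

    def handle_starttag(self, tag, attrs):
        if tag in BLOCK_TAGS:
            self.y += BLOCK_HEIGHT_PX
        elif tag == 'a':
            href = dict(attrs).get('href')
            if href:
                self.links.append((href, self.y))

parser = LinkY()
parser.feed('<h1>Title</h1><p><a href="/a">A</a></p><p>text</p>'
            '<p><a href="/b">B</a></p>')
print(parser.links)  # [('/a', 40), ('/b', 80)]
```

This is far cheaper than rendering, but CSS positioning, floats and nested tables would throw the estimates off badly, which is the trade-off the thread is debating.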
Finally, they could start applying this technology just to a subset of the pages in the database and not to the entire archive.
I'm not saying that it's a problem with a simple solution, I understand very well how complex this technology would be and how much resources it would need.
But I'm confident that we won't have to wait long to see this technology (partially) implemented, IMHO.
Other than the 101 Kb limitation, I do not think so at the moment.
Sooner or later, if not already: words surrounding the link and headings preceding the link could make sense "location-wise" - but mainly for ranking, not for PageRank purposes.
One could speculate whether, in the future, with enough advanced toolbars installed, the click-through of one link over another (and therefore, to a certain extent, its location) could play a role. But should a popular link mean a lead to a more important (PageRank'ed) page?
Do you suggest that it is better to spread links out throughout a site within the general content pages? Is this better than on a links-only page, which many people typically have?
If so, is the rationale for this better distribution of PR, or is it for some other reason?