| 9:58 pm on Aug 14, 2002 (gmt 0)|
Funny - his page comes up as a gray bar when I click through to it ...
| 9:58 pm on Aug 14, 2002 (gmt 0)|
I'm not entirely sure what you're saying - html structure hasn't anything to do with the PR.
I'm looking at stallman.org on that page - it's a PR8 with 1960 back links.
Getting nearly 2,000 links to a single page? I wouldn't call that so easy.
| 10:06 pm on Aug 14, 2002 (gmt 0)|
In the Google directory you see a different green bar than you do on the regular, browser-installed toolbar. This site is coming in at 8/10 in the browser bar.
| 10:16 pm on Aug 14, 2002 (gmt 0)|
Still comes up gray for me. :)
| 10:23 pm on Aug 14, 2002 (gmt 0)|
I noticed that the bar doesn't light up sometimes when you click a link to a site. Try refreshing the page or else cutting the url and pasting it back into the address bar (I know, sounds dumb but it works).
| 10:26 pm on Aug 14, 2002 (gmt 0)|
You OWN an entire category (Computers > History > Pioneers > Stallman, Richard).
You better complain to Google because you should have a PR of 10!
| 10:38 pm on Aug 14, 2002 (gmt 0)|
Aah. I had to completely shut down IE and open it up again, THEN paste the link and the PR showed up.
Thanks for the tip.
| 2:41 pm on Aug 15, 2002 (gmt 0)|
Guess I've seen it too many times where the really simple pages with tons of content and links are at the top. Even just simple pages get ranked higher. Graphics, java and anything else that mucks it up will hurt you. If you want high PR, make your website circa 1996.
| 2:51 pm on Aug 15, 2002 (gmt 0)|
Claris Home Page version 1.0 (7-18-96) is my friend :)
| 3:04 pm on Aug 15, 2002 (gmt 0)|
"If you want high PR, make your website circa 1996."
Hate to beat an old drum again, but the idea is to get the message across and provide a navigable web site as efficiently as possible, to as many people as possible, right? If circa-1996 pages without Java, JS, or graphics cut the mustard, then do it. Most likely they do.
In 1990 I achieved 99% of what I do now with 1 MB of memory, two floppies, and WordPerfect, versus 150 MB of MS Word on 32 MB of memory.
In many cases simple text pages, with maybe one pretty piccie to spice them up, are far more efficient than "hi-tech" pages, and in some cases far more effective - for both the user/reader and the search engine.
Simple is beautiful.
| 3:46 pm on Aug 15, 2002 (gmt 0)|
I build sites for mass tort law firms (PPA, toxic mold, etc.). These are generally very competitive. I can get a PR6 with a one- to two-page website, with the home page having a load of content and resource links to information about the mass tort. I keep the site incredibly simple, with very few graphics and no graphics for the nav bar, just hyperlinked text.
With PR6, I can knock the other sites out since they're usually PR4 or less. The trick is to try and balance marketing with positioning. A site with too much clutter may not generate calls, so I have to be careful how to organize it.
| 4:28 pm on Aug 15, 2002 (gmt 0)|
I've seen plenty of posts where someone noted a PR4 listed above a PR6, so maybe your position in the SERPs isn't really PR-driven! Maybe you're just doing everything else right! But I'm wondering how you get a tort law firm up to a PR6. I mean, the top site in the SERPs for "tort law" is a PR6, and that's on a university site! I've now got a PR7 site, but it took a lot of inbound links to get there. When I was a 6, I had a lot of links, including several DMOZ listings for my level-2 pages, a Yahoo listing, and a bunch of links from PR7s and even a few 8s. So my question is: how do you build a site for someone and get it to a PR6 without either charging them through the nose or spending more time than you should?
| 4:29 pm on Aug 15, 2002 (gmt 0)|
I like light web sites, but his page is really ugly! No navigation menu, too much text on first page, ugly graphics with blue outline, too many links all over the page. It looks like a mess and makes me want to leave right away.
| 7:29 pm on Aug 15, 2002 (gmt 0)|
I think it's the fault of the SEs. It's their algorithms that decide an old, boring, cluttered, visually unappealing site is a good one. Sure, if your number-one aim is to rank high and nothing more, then a page like this is fine, but I for one would not want my company image to be portrayed by a site like that. Rather like in the real world if you owned a shop - a department store, for example. You wouldn't want everything cluttered right up in the lobby, all spread out on the floor, having to crawl over piles of useless stuff designed simply to get you into the shop just to get to what you want. I blame the SEs totally - if they were better at their job, then people wouldn't be forced to make bad pages to be ranked well...
People will move across to directories if SEs only bring up the ugliest, least navigable pages...
| 9:49 pm on Aug 15, 2002 (gmt 0)|
The inbound links to the mass tort sites - and all my other sites, for that matter - must all be on the same topic (legal in nature) and have a decent PageRank. I'd rather have a few PR6 sites linking in than 50 PR2 sites that are off topic.
| 10:40 pm on Aug 15, 2002 (gmt 0)|
Hint: It's possible to do simple, visually appealing sites using straight HTML and good content that will rank well.
| 11:28 pm on Aug 15, 2002 (gmt 0)|
I think we get a lot of crossed wires when discussing Google’s ranking systems, mostly because of the terminology. “Page Rank” is such a generic sort of term that it causes a lot of confusion, so we need to refer back to Google and find out what they actually intend it to mean:
“The heart of our software is PageRank™, a system for ranking web pages...”
“PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves "important" weigh more heavily and help to make other pages "important."”
It’s the heart of their web page ranking system, not the whole body - a vital part, but just one part of the whole.
It is only about how many other pages link to yours, and how many pages link to those pages. You get an extra tick if the pages that link to you use words relevant to your page, but that’s all.
A more accurate term than page rank would be ‘link popularity rank’. Not that I think we’ll get Google to change their terminology :) , but keep that phrase in mind when you talk about Page Rank and things will become a lot clearer.
It’s not about any other optimisation you have on your page - the structure of your html, keyword density, use of heading tags, keyword proximity to top of page, title tag, page size or any of those things we spend so much time tweaking.
It is all of those things PLUS “Page Rank” that go to make up your position in the SERPs (search engine result pages). If you really want to know why you ended up no 7 in a search for the keyword ‘widget’, then you need to think in terms of all these factors, not just page rank.
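For anyone who hasn't seen the mechanics spelled out, the "votes weighted by importance" idea above can be sketched in a few lines of code. This is a toy power-iteration version only - the 0.85 damping factor and the uniform starting ranks are assumptions from the published PageRank description, not anything confirmed about Google's live system:

```python
# Toy PageRank: a link is a vote, weighted by the voter's own rank.
# Assumes the classic formulation with damping factor d = 0.85.

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # uniform starting ranks
    for _ in range(iterations):
        new = {p: (1 - d) / n for p in pages}   # baseline "teleport" share
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # each page splits a damped share of its rank over its outlinks
            share = d * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

# Three-page toy graph: A and B both vote for C, C votes for A.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

Run it and C ends up with the highest rank even though nothing about C's own markup was examined - which is exactly the point being made here: PageRank looks at links, not page structure.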
Recently a website owner asked me why his company’s PR7 page was not appearing for a term, when his PR6 competitor had the No. 1 and 1a positions. The answer was simple.
The competitor had the particular word (widget) appearing 9 times on the page - in the title, in an H1 heading, in two links heading off the page and five times in the body of the page.
The website owner had the term once. He then referred to their proprietary name (widgy-didget) throughout the rest of the page.
Well duh. No amount of page rank was going to save him.
I think it is time we put page rank in its proper perspective. It is one factor in the equation - a relatively heavily weighted one maybe, but still just one factor.
“Guess I've seen it too many times where the really simple pages with tons of content and links are at the top. Even just simple pages get ranked higher. Graphics, java and anything else that mucks it up will hurt you. If you want high PR, make your website circa 1996.”
“TONS OF CONTENT AND LINKS” - well yea, because tons of (good) content BEGETS tons of links.
“If you want high PR, make your website circa 1996.” I disagree. If you want high PR, then have great content on an attractive subject and tons of it so people want to link to you.
I could have the most fabulous page ever conceived: fast, gorgeous, content up to the gills, so cross-platform-compatible your ten year old sandwich maker could toast the content onto the face of your sandwich - the kind of page that makes you want to lean back in the chair and light a cig after you read it - but if the content is on a topic that no-one is interested in linking to.... my PR is gonna be zip.
On the other hand, I could have one page with the word “widgets” in completely unformatted text top left of page, with no title, no nothing.... default text color, default type. Now if everyone here linked to that page from their top PR page, and better yet they used the word ‘widget’ in their link text.... yippee, PR up the wazoo.
Even if we got to PR10 on the one-word widget page though, I’ll bet if I were to query ‘widget’ in Google, that nothing-of-a-page would not come up tops in the SERP.
Because Page Rank (aka link popularity rank) is only one factor.
I just think we’d get a lot further in our discussions if we first recognised PR for what it is... and maybe more importantly what it isn’t.
| 3:13 am on Aug 16, 2002 (gmt 0)|
"tell it on a mountain deejay!"
great soapbox chatter :)
| 4:48 am on Aug 16, 2002 (gmt 0)|
I see PR 8 on the toolbar; it's mostly due to the top two inlinks, from PR 8 pages on a GNU project site, each of which contains very few links. A large fraction of the PR from those pages is transferred directly to the gnu guru's personal page, and those large fractions of PR 8 just add up to another PR 8. I'm sure the several PR 7 inlinks don't hurt either.
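The "large fraction transferred" arithmetic can be made concrete. In the classic model, a page divides a damped share of its own PR evenly over its outlinks, so a strong page with few links passes far more per link than one with hundreds. The 0.85 damping factor and the even split are assumptions from the published formulation, not observed Google behavior:

```python
# Sketch of the "few outlinks = large fraction passed" point above.
# Assumes the classic model: a page passes d * PR / outlinks per link.

def passed_share(source_pr, num_outlinks, d=0.85):
    """PR value each linked page receives from this source."""
    return d * source_pr / num_outlinks

# A strong page with only 3 outlinks passes a far bigger share
# per link than an equally strong page with 100 outlinks.
few = passed_share(1.0, 3)     # roughly 0.28 per link
many = passed_share(1.0, 100)  # under 0.01 per link
```

That is why two sparse PR 8 pages linking out can hand most of their weight straight to one target.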
| 1:55 pm on Aug 16, 2002 (gmt 0)|
Sheesh man, this is Richard Stallman we're talking about!
We're talking about someone who created a pile of useful software (gcc, emacs, etc.) and gave it to the world - and inspired tens of thousands, if not millions, of other people to create more such software. If you had to try to put a monetary value on it, you'd be talking tens of billions of dollars.
There are places where PageRank really is just totally irrelevant, and this is one of them.
| 1:57 pm on Aug 16, 2002 (gmt 0)|
Well, let's be clear about something: Google uses a software program; it's not omniscient, nor is it capable of any kind of thought, for that matter. It has no way of evaluating whether content is good or bad. It only works with the parameters given, which I think are simplistic and too easy to spoof. I can give you example after example of the top-ranked site looking pretty much the same.
Put a huge amount of content on your home page with very simple html and tons of links. Google's "evaluation" of good content is nothing more than giving higher marks to these kinds of sites.
This has nothing to do with Google's software rubbing its chin and saying "hmmmm, this is a quality site," or any silly stuff like that. I can take any topic, write a 10,000-word home page, link to 200 sites of similar subject matter, and Google would consider it the best site. For many subject matters, there aren't going to be many inbound links, because the sites are commercial and not informational. When that happens, all you really need are a few inbound links (create your own network); then go crazy on the outbound, particularly to any high-ranking sites.
While I like this search engine, like I said, their approach is simplistic. I get aggravated looking at empty lawyer directories that hold the top spot for no other reason than that they link all 150 or so of their regional directories together using hidden links, and repeat their keywords a few dozen times at the bottom of the page in text just slightly darker than the background color.
Need examples? I'd be happy to provide them. These are not the high quality sites you're referencing that the mighty google bot has blessed.
| 2:37 pm on Aug 16, 2002 (gmt 0)|
One other thing, I am supremely confident googlebot has no clue who Richard Stallman is.
| 3:55 pm on Aug 17, 2002 (gmt 0)|
"I am supremely confident googlebot has no clue who Richard Stallman is"
Of course not, but an awful lot of people who have web pages do, and Google sees their links! My point was that Richard Stallman's page is well-known because Richard Stallman is famous (for reasons that have nothing to do with his web presence), so it's not a useful example.
| 4:11 pm on Aug 17, 2002 (gmt 0)|
There is nothing unusual about this page. This is not a problem with PR, but shows WHY PR works.
This page is not of interest to most people, however THE PERSON that wrote it is highly regarded by many other sites.
THIS PERSON'S Opinion is given more weight than others.
Same with his page - but what exactly is he going to get hits for? His name, which is clearly what the page is about.
If you look at the links in his page - most use accurate anchor text in describing other pages.
This page is a great predictor of the content of other pages. There is nothing wrong with the use PR here.
PR doesn't always equal great ranking; PR x IR does. This page has high PR, but low IR for most queries other than the one that is clearly relevant.
The anchor text from this page is very helpful in assigning other pages a rank for both PR and IR.
And yes - it is true - if you had pages on the web back in 1996 - they are probably ranked higher than other pages out there.
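A hypothetical sketch of the PR x IR point: the final ranking combines link popularity (PR) with a query-relevance score (IR). The multiplication, and the numbers below, are illustrative shorthand only - the actual combination Google uses is not public:

```python
# Illustrative only: combine link popularity (PR) with query
# relevance (IR). The multiplication is the poster's shorthand,
# not a published formula; all numbers here are made up.

def combined_score(pr, ir):
    """Rank score as PR weighted by per-query relevance."""
    return pr * ir

# A high-PR page that barely mentions "widgets" versus a
# lower-PR page that is squarely about them.
famous_page = combined_score(pr=8, ir=0.1)  # high PR, low relevance
widget_page = combined_score(pr=5, ir=0.9)  # lower PR, high relevance
```

Under this toy scoring, the lower-PR but on-topic page wins the query - which is why a PR 8 personal page ranks for its owner's name and little else.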
| 1:05 am on Aug 18, 2002 (gmt 0)|
Just a thought.
If I was out to give the world a search engine like Google, I would make sure that I am able to put top quality sites on the top of SERPs.
See it this way: In a category named "Richard Stallman", the private homepage of Richard Stallman (the "official homepage") is most certainly the most important one.
If I am Google, I want anybody searching for "Stallman" to see his homepage as the number one search result (same for companies like Microsoft, Real, Cisco, or Stanford University).
And I want this page at #1 no matter how many people link to it! So PageRank would not apply to certain web sites.
And I want this page at #1 no matter how well or badly it is coded! So Googlebot's software would not apply to certain web sites.
Therefore, Google has editors who have the power to "override" PR and Googlebot if they want to (actually, we know they can - otherwise they would not be able to ban sites).
Is this good or bad?
I don't know, but this is the way I would do it (making sure surfers get what they want - if a user keys in "Chrysler" he wants the Chrysler web site and not a "Chrysler hate page" that has a lot of links to it).
However, there is potential for ethically questionable use of this "editorial" option.
Relax and trust Google!
| 3:05 am on Aug 18, 2002 (gmt 0)|
You're making my point.