Hate to beat an old drum again, but the idea is to get the message across and provide a navigable web site as efficiently as possible, to as many people as possible, right? If circa 1996 pages without Java, JS, or graphics cut the mustard, then do it. Most likely they do.
In 1990 I achieved 99% of what I do now with 1 MB of memory, 2 floppies and WordPerfect, compared to 150 MB of MS Word on 32 MB of memory.
In many cases simple text pages, with maybe one pretty piccie to spice them up, are far more efficient than "hi tech" pages, and in some cases far more effective - for both the user/reader and the search engine.
Simple is beautiful.
With PR6, I can knock the other sites out since they're usually PR4 or less. The trick is to try and balance marketing with positioning. A site with too much clutter may not generate calls, so I have to be careful how to organize it.
People will move across to directories if SEs only bring up all the ugliest, least navigable pages...
“The heart of our software is PageRank™, a system for ranking web pages...”
“PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves "important" weigh more heavily and help to make other pages "important."”
It’s the heart of their web page ranking system, not the whole body - a vital part, but just one part of the whole.
It is only about how many other pages link to yours, and how many pages link to those pages. You get an extra tick if those pages that link to you include words relevant to your page, but that’s all.
A more accurate term than page rank would be ‘link popularity rank’. Not that I think we’ll get Google to change their terminology :) , but keep that phrase in mind when you talk about Page Rank and things will become a lot clearer.
It’s not about any other optimisation you have on your page - the structure of your html, keyword density, use of heading tags, keyword proximity to top of page, title tag, page size or any of those things we spend so much time tweaking.
It is all of those things PLUS “Page Rank” that go to make up your position in the SERPs (search engine result pages). If you really want to know why you ended up No. 7 in a search for the keyword ‘widget’, then you need to think in terms of all these factors, not just page rank.
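To make the ‘link popularity rank’ idea concrete, here’s a toy Python sketch of the kind of iterative vote-counting described in the quoted Google text: each page splits its rank evenly among the pages it links to, and votes from high-rank pages weigh more. The graph, damping factor and function names are my own illustration, not Google’s actual code:

```python
# Toy link graph (hypothetical): each page maps to the pages it links OUT to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Simple power-iteration sketch of link-popularity ranking."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}  # start everyone equal
    for _ in range(iterations):
        new = {}
        for p in pages:
            # A page's rank is the damped sum of what each inbound linker
            # passes on - its own rank split evenly over its outbound links.
            inbound = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * inbound
        pr = new
    return pr

ranks = pagerank(links)
```

Note how page C, which everyone links to, ends up top, while page D, which nobody links to, ends up bottom - no matter what is actually *on* those pages. That is the whole point of the thread: this number says nothing about your title tag, keyword density or anything else on the page itself.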
Recently a website owner asked me why his company’s PR7 page was not appearing for a term, when his PR6 competitor had the No. 1 and 1a positions. The answer was simple.
The competitor had the particular word (widget) appearing 9 times on the page - in the title, in an H1 heading, in two links heading off the page and five times in the body of the page.
The website owner had the term once. He then referred to their proprietary name (widgy-didget) throughout the rest of the page.
Well duh. No amount of page rank was going to save him.
I think it is time we put page rank in its proper perspective. It is one factor in the equation - a relatively heavily weighted one maybe, but still just one factor.
“Guess I've seen it too many times where the really simple pages with tons of content and links are at the top. Even just simple pages get ranked higher. Graphics, java and anything else that mucks it up will hurt you. If you want high PR, make your website circa 1996.”
“TONS OF CONTENT AND LINKS” - well yea, because tons of (good) content BEGETS tons of links.
“If you want high PR, make your website circa 1996.” I disagree. If you want high PR, then have great content on an attractive subject and tons of it so people want to link to you.
I could have the most fabulous page ever conceived: fast, gorgeous, content up to the gills, so cross-platform-compatible your ten year old sandwich maker could toast the content onto the face of your sandwich - the kind of page that makes you want to lean back in the chair and light a cig after you read it - but if the content is on a topic that no-one is interested in linking to.... my PR is gonna be zip.
On the other hand, I could have one page with the word “widgets” in completely unformatted text top left of page, with no title, no nothing.... default text color, default type. Now if everyone here linked to that page from their top PR page, and better yet they used the word ‘widget’ in their link text.... yippee, PR up the wazoo.
Even if we got to PR10 on the one-word widget page though, I’ll bet if I were to query ‘widget’ in Google, that nothing-of-a-page would not come up tops in the SERP.
Because Page Rank (aka link popularity rank) is only one factor.
I just think we’d get a lot further in our discussions if we first recognised PR for what it is... and maybe more importantly what it isn’t.
We're talking about someone who created a pile of useful software (gcc, emacs, etc.) and gave it to the world - and inspired tens of thousands, if not millions, of other people to create more such software. If you had to try to put a monetary value on it, you'd be talking tens of billions of dollars.
There are places where PageRank really is just totally irrelevant, and this is one of them.
Put a huge amount of content on your home page with very simple html and tons of links. Google's "evaluation" of good content is nothing more than giving higher marks to these kinds of sites.
This has nothing to do with the google's software rubbing its chin, saying "hmmmm, this is a quality site" or any silly stuff like that. I can take any topic, write a 10,000 word home page, link to 200 sites of similar subject matter and Google would consider it to be the best site. For many subject matters, there aren't going to be many inbound links because they're commercial sites and not informational. When that happens, all you really need are a few inbound (create your own network) and go crazy on the outbound, particularly to any high ranking sites.
While I like this search engine, like I said, their approach is simplistic. I get aggravated looking at empty lawyer directories that have the top spot for no other reason than they link all of their 150 or so regional directories together using hidden links and use their keywords at the bottom of the page in just slightly darker than the background color a few dozen times.
Need examples? I'd be happy to provide them. These are not the high quality sites you're referencing that the mighty google bot has blessed.
I am supremely confident googlebot has no clue who Richard Stallman is.
Of course not, but an awful lot of people who have web pages have, and Google sees their links! My point was that Richard Stallman's page is well-known because Richard Stallman is famous (for reasons that have nothing to do with his web presence), so it's not a useful example.
This page is not of interest to most people, however THE PERSON that wrote it is highly regarded by many other sites.
THIS PERSON'S Opinion is given more weight than others.
Same with his page, but what exactly is he going to get hits for:
his name - which is clearly what the page is about.
If you look at the links in his page - most use accurate anchor text in describing other pages.
This page is a great predictor of the content of other pages. There is nothing wrong with the use of PR here.
PR doesn't always equal great ranking - PR × IR does. This page has high PR, but low IR for most queries other than the only one that is clearly relevant.
The anchor text from this page is very helpful in assigning other pages a rank for both PR and IR.
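The PR × IR point above can be sketched in a few lines of Python. The page names and scores here are made-up numbers purely for illustration, and the multiplication is just one plausible combination rule, not Google’s actual formula:

```python
# Hypothetical pages with a link-popularity score (pr) and a
# query-relevance score (ir) for some query like 'widget'.
pages = {
    "famous-homepage": {"pr": 0.9, "ir": 0.05},  # high PR, barely relevant
    "widget-shop":     {"pr": 0.4, "ir": 0.80},  # modest PR, very relevant
}

def score(page):
    # Combine the two factors: high PR with near-zero relevance still loses.
    return page["pr"] * page["ir"]

ranked = sorted(pages, key=lambda name: score(pages[name]), reverse=True)
```

Here the modest-PR but relevant page outranks the famous one for this query, which is exactly why a PR10 page of nothing-but-the-word-‘widget’ would still not sweep every SERP.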
And yes - it is true - if you had pages on the web back in 1996 - they are probably ranked higher than other pages out there.
If I was out to give the world a search engine like Google, I would make sure that I am able to put top quality sites on the top of SERPs.
See it this way: In a category named "Richard Stallman", the private homepage of Richard Stallman (the "official homepage") is most certainly the most important one.
If I am Google, I want anybody searching for "Stallman" to see his homepage as the number one search result (same for companies like Microsoft, Real, Cisco, or Stanford University).
And, I want this page on #1 no matter how many people link to it! So Page Ranking does not apply to certain web sites.
And, I want this page at #1 no matter how well or badly it is coded! So Googlebot's software does not apply to certain web sites.
Therefore, Google has editors who have the power to "override" PR and Googlebot if they want to (actually we know they can - otherwise they would not be able to ban sites).
Is this good or bad?
I don't know, but this is the way I would do it (making sure surfers get what they want - if a user keys in "Chrysler" he wants the Chrysler web site and not a "Chrysler hate page" that has a lot of links to it).
However, there is potential for ethically questionable use of this "editorial" option.
Relax and trust Google!