Forum Moderators: open
something like this:
incoming links 10
metatags 0
keyword density 7
directory levels 1
page type 4
valid html 2
Then each section would have to be broken down further, i.e.:
Incoming links:
link text to page 6
link relevance 8
No. of links 10
referrer pr 8
This is a bit simplistic but you get the idea.
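The proposed scoring model could be sketched like this. The weights are the illustrative figures from the post above; the per-page factor values (0.0-1.0) are made up for the example:

```python
# Sketch of the weighted scoring model proposed above.
# Weights come from the example figures in the post;
# the per-page factor values (0.0-1.0) are made up.

def score(page_factors, weights):
    """Combine normalized factor values (0.0-1.0) using the agreed weights."""
    return sum(w * page_factors.get(name, 0.0) for name, w in weights.items())

weights = {
    "incoming_links": 10,
    "metatags": 0,
    "keyword_density": 7,
    "directory_levels": 1,
    "page_type": 4,
    "valid_html": 2,
}

# A hypothetical page, each factor rated 0.0-1.0
page = {"incoming_links": 0.8, "keyword_density": 0.5, "valid_html": 1.0}

print(score(page, weights))  # 13.5
```

The "incoming links" sub-factors would then feed into the `incoming_links` value the same way, one level down.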
Does anyone know of anywhere that has this?
If not who wants to help?
[webmasterworld.com...]
[webmasterworld.com...]
(read them)
How is Page Rank calculated?
(this thread was renamed by moderators)
Speeding Up Permanent Entry - Adwords? Links?
>Most Reliable way to speed up getting PageRank of 1 or 2
The above threads I really see as "How Google Works for Non-Geeks" - a complete set of rules for how Google works
and how to get the best out of it.
For some reason no one is listening, but it makes total sense to me. You're right. I've suggested setting up an SEO language/variables and a set of rules for Google, maybe 1-3 pages.
It's all too ad hoc right now, and no scientific methods are applied to the current state of google.
In short: too much wishy-washy talk and no peer-reviewed
hard facts on how things work. There should be a summary of the wisdom so far. A non-geek is not prepared to wade through thousands of posts and read loads of outdated papers; MDs want a hard-and-fast set of rules - no "noise", just the nitty gritty.
Think of it as a Cheat Sheet for Google, this should be public/open source/free/non-commercial and peer reviewed by everyone who's interested in SEO.
I wish people would stop the personal stuff, and start getting something sorted.
>Think of it as a Cheat Sheet for Google, this should be public/open source/free/non-commercial and peer reviewed by everyone who's interested in SEO.
you heard 'em Brett, skip the whole open forum concept and just condense wwworld down to one post
;)
Seriously though, how many times have you referred people to the google knowledge base, have you hit the billion mark?
behold, thyn mighty golden thread:
[webmasterworld.com...]
Essentially, it's like a public, open-source, free, non-commercial cheat sheet for Google, peer reviewed by everyone who's interested in SEO.
The title of this thread is "How is page rank calculated"
You're not likely to get many people taking time to answer your question, because to many of us, the question is rather pointless.
The title has been edited by the moderators here - see below for what the title should be.
The forum is perfect for discussion, but we have
no discussion reference point, or any method
to distill the results. Just more
and more information, judgements, and ad hoc speculation.
What I suggest is simple: a list of rules/axioms that describe
Google's behaviour, with real examples - and these get peer reviewed.
In other words, go back through all the forum posts, filter out
what there generally seems to be agreement on, and create
a simple list of conclusions.
Did anyone even read the BBC news story about the power of a web think tank?
I suggest setting up a wiki - this means free information
exchange, and anyone can edit the conclusions.
That way we can move forward, instead of endlessly speculating. No one should have to read tonnes of posts just to get an inkling of how Google works.
We are talking about creating a public book that could be described as "How Google Works for Non-Geeks/Dummies, and How to Get the Best out of Google to Improve Your PageRank - in 24 Hours". Surely you get the idea by now.
>behold, thyn mighty golden thread: [webmasterworld.com...]

What's golden about this? This is just common sense: you build a "proper" site, it falls into place, gets found, and gets a PageRank, in 12 months. Basically a no-brainer. You can build a site with one sentence and it gets a PageRank in 12 months. How on earth does that actually help someone in a commercial business environment, where getting found quickly in the right place is critical?

As for building successful sites - well, sorry, life is too short. I'm talking about knowing for a *fact* that you'll be in with a new site within 60 days - with a PageRank - and you'll know why you're in, and why you rank where you do. Anything else is fine for non-commercial sites that have the time and money to burn. I'm sure most people don't have time to endlessly research, or can't put aside 12 months to get into Google. So it just doesn't cut it, I'm afraid.

This is possible, as I've seen sites go in within 60 days. One was submitted last Nov, the site has not changed, and it has a PageRank of 3 now - perhaps it was just luck, I don't know - I need to know. Plus the site is just bollocks, a personal home page with very few links. NOTHING here tells me the mechanics of this. Is this a social chit-chat forum to make friends, or are we trying to figure out what is going on in a scientific, open way? I'm talking real examples here - not theory.
I'm talking rules, assumptions, examples. If it is just sodding links, we all have sites, so let's set up a test: we know where the sites are, what their current rank is, and how the rank of the new site changes.
Why is this concept so hard for you SEO lot to understand?
You'll never get a rule book for that situation. You're better off spending the time developing more great content. Google is designed to reward content, that we know for sure.
[webmasterworld.com...]
Think from Google's perspective: the information/language/medium they use is no different from what we or anyone else uses. Google say they use "100 variables" to weight their results but, all things being equal, just how many variables are worth considering? (Barring the fact that Google's resources are not unlimited.)
>I'm talking rule, assumptions, example.
Many webmasters come here, many more have sites online. When someone makes an example, it applies to some of us, most usually not all of us. The "golden thread" you referenced is one of those threads that can apply to almost all of us.
And as tedster says, the algo moves with the landscape, so when you want to apply any scientific method (inside or outside SEO), its hard to measure goalposts when they are moving IMO :)
>What's golden about this? this is just common sense -- how does it help someone in a commercial business environment?
dude. it tells you everything you need to know, right down to generalized specifics:
"Use the keyword once in title, once in description tag, once in a heading, once in the url, once in bold, once in italic, once high on the page, and hit the density between 5 and 20% (don't fret about it). "
you want more specific than that?
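For reference, "keyword density" in rules like the one quoted above is usually computed as keyword occurrences divided by total word count. A rough sketch (the sample text is made up, and this naive whitespace split ignores punctuation):

```python
# Rough sketch of the usual keyword-density calculation:
# occurrences of the keyword divided by the total word count,
# expressed as a percentage. Naive whitespace tokenization.

def keyword_density(text, keyword):
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words) * 100

sample = "widgets are great widgets buy widgets today"
print(round(keyword_density(sample, "widgets"), 1))  # 42.9
```

The toy sample is far above the 5-20% band quoted above; real page copy would sit much lower.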
It isn't about getting a page rank in 12 months, it's about getting a consistently well trafficked site in 12 months. It's about building a quality site from the ground up.
I have a site that was put live last Friday, and it's in the top ten for one of my phrases already. Speed is irrelevant, 12 months / 12 days is irrelevant. You end up doing the same junk either way, how fast you do it is up to you. It only takes one link to get into Google, and every site gets a PageRank within an update or two... Google is capable of getting your site indexed very quickly. Generally stated (egad, I know), it will take a couple of months before that site is as ingrained in the web as its competition, and therefore it will take that long to rank among them.
There is no specific consistency with google, this is what people are trying to tell you (the two responses above this are from extremely well respected webmasters/SEOs). The weighting of factors change all the time. It is infinitely more valuable to catalogue general information about what google likes/dislikes, over trying to nail down specifics such as:
"three links in via perfect anchor text from on-topic pr 5 sites to a 4 week old pr 4 site will increase the page rank from 4 to a 5 and jump ranking 10 places if you were at 20 before... on a phrase that returns 1.2 million results"
You want to 'know' this stuff, but it's not knowable. It's workable, but not knowable. It could only ever be conjecture based on observation, and it could never be exactly right at predicting Google's behaviour. But the concepts of PR transference, incoming links, anchor text, and competition are all discussed at length in the forums.
The simple fact is, you can do two identical sites, and mirror every single aspect you have control over, but the sites will perform differently. All we're trying to say is, don't let that fact bug you, or you're getting caught in a mental trap that's not likely to be productive.
Most of us are in a commercial business environment. If you don't have time to build a site with content, it kinda means you don't have time to build a proper site. You can always buy traffic... but that's not a topic for the google-news forum.
>I'm talking real examples here - not theory.
tedster and brotherhood of LAN have expressed why theory is more valuable than examples at webmasterworld. Outdated examples would be much more detrimental for the novice to wade through than outdated theory IMO, and every example is outdated just days after it's given.
We've found that all necessary information can be conveyed without the use of specific examples at webmaster world... this is very rarely not the case.
Fiver's post is much the same - basically, let's not try to create
a proper collaborative project, things change too much, blah blah,
60 PhDs, one brain surgeon, etc., why bother, we have the 12-month plan...
Well, I've got news for you: combined, we are all better than 60 PhDs, and we can come up
with a good fuzzy-logic description of Google. The will is not
there as far as I can see; it's all "hands up" type of talk.
I've had sites get in within months, with no effort and virtually NO links;
they were submitted in Nov, and now the site has a PR3 - with very little
content.
Another site, >WITH NO CONTENT - nothing!< and virtually no links to it, gets a PR3
without trying. It's effectively a doorway page. How do you explain
that? The link: test in Google produces 3 sites, which
I made no conscious effort to get links from! I thought a link test only worked if you had PR4?
You can get a site into Google with PR3 in 1-3 months, and you can
get a PR4 quickly too - there are ways I've found.
This is probably just luck, BUT I NEED TO UNDERSTAND
WHAT'S GOING ON. If you can do that with some knowledge,
what can we do if we all get some better knowledge?
Again, not good enough; we should be able to come up with a very precise
set of rules, even if they are "fuzzy" logic. The Google algorithm is not the Holy Grail; it's a bit of software written by humans who've been lured into it by money. Google's attempt to turn the word "google" into a trademarked verb clearly shows they've passed the mark and are too powerful. Only a combined collaborative project (like Linux) can sort this mess out.
>I thought a link test only worked if you had PR4?
You've got that backwards. The pages linking to your page must be PR4 for them to show in the list of backlinks. (Your page itself can have any PageRank value and still show backlinks.)
If you have 100 links from PR3 pages and 1 link from a PR4 page, a link check on Google will show you as having one backlink - the PR4 one.
That's why it is recommended that you also check links on another search engine (I forget for the moment which one is preferred), as others don't use PageRank and therefore show all the backlinks they know about.
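The behaviour described above is, in effect, a simple threshold filter: Google's link: results (circa 2003) only listed backlinks from pages at roughly PR4 and up. A sketch with made-up URLs, PR values, and an assumed cut-off:

```python
# Illustration of the backlink-display behaviour described above:
# Google's link: search only listed backlinks from pages of roughly
# PR4 and higher. The URLs, PR values, and threshold are assumptions
# made up for this example.

PR_DISPLAY_THRESHOLD = 4  # assumed cut-off

backlinks = [
    ("http://example-a.com/page", 3),
    ("http://example-b.com/page", 4),
    ("http://example-c.com/page", 2),
    ("http://example-d.com/page", 6),
]

# What a link: query would show: only the PR4+ referrers.
shown = [url for url, pr in backlinks if pr >= PR_DISPLAY_THRESHOLD]
print(shown)
```

So a page with 100 PR3 backlinks and one PR4 backlink would appear to have exactly one backlink, as the post above describes.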
[edited by: deejay at 8:23 pm (utc) on Feb. 26, 2003]
>You can get a site into Google with PR3 in 1-3 months, and you can
>get a PR4 quickly too - there are ways I've found.
>This is probably just luck, BUT I NEED TO UNDERSTAND
>WHAT'S GOING ON. If you can do that with some knowledge,
>what can we do if we all get some better knowledge?
I'll give it a bash based on what I've read; not much fact involved (for facts, see the Google papers on BackRub, Hilltop, etc.).
You started with a PageRank of 0. This is because your page was new to the web and no one had referenced it as a resource. In layman's terms, this means your site is unimportant to the world at large.
The PageRank has risen because people with more important websites chose to link to you, offering a % of their "authority" in the form of a link to your site. This raises your PR. If the sum value of all pages was 1, you would still be closest to 0, but not as close as you were before. So on a scale of 1-100%, with a PR3-4, perhaps you are in the top 60% of high-PR pages.
Once again, if the sum of all PageRank across all pages was 1, and you got a link from the Google home page (which will have one of the largest shares, due to its high number of high-PR links and sheer volume of links), then most likely you'd be in the top 10% bracket of highest-PR sites.
PageRank is just measuring how the web is reacting to your website. They are numbers. You can't break the numbers down into anything simple, though you can read up on why they chose to represent their algorithm in this way.
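For reference, the mechanism described here matches the published PageRank formula, PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)), where d is the damping factor and C(T) is the number of outbound links on page T. A minimal iterative sketch on a made-up three-page graph (the graph and iteration count are illustrative only):

```python
# Minimal iterative PageRank sketch, per the published formula:
#   PR(A) = (1 - d) + d * sum(PR(T) / C(T) for each page T linking to A)
# The toy link graph below is made up; every page in it has at least
# one outbound link, so no dangling-node handling is needed.

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    pr = {p: 1.0 for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            inbound = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) + d * inbound
        pr = new
    return pr

# Toy graph: A links to B and C; B links to C; C links back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print({p: round(v, 3) for p, v in ranks.items()})
```

In this toy graph C ends up highest: it gets links from both A and B, exactly the "more important sites choosing to link to you" effect described above.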
IMO Brett's theme pyramid (and the other threads referenced) set out to do this. And they don't just apply to the "mom and pops" or the larger sites of this world; they apply as a general rule of thumb that SLICES through the algorithm so we can have it in human-digestible form ;)
Anything beyond that, I imagine, would blind most of us with maths, coefficients, charts, and too many pretty graphs to say anything constructive to a larger audience.
It's hard enough knowing there are 100 variables, without knowing there are many ways to use the variables... perhaps if someone were to construct a chart of all possibilities (Brett, I know you're busy nowadays), we could all share some good information between us ;)
>Think of it as a Cheat Sheet for Google, this should be public/open source/free/non-commercial and peer reviewed by everyone who's interested in SEO.
It certainly takes a person with clarity of vision to initiate such a project, but others don't always comprehend it or see it the same way until it's operational and they can see it for themselves.
In your place I'd analyze the variables and set the naming conventions myself in a glossary type feature on a website and get something going starting right here:
[hotscripts.com...]
> That's why it is recommended that you also check links on another search engine (I forget for the moment which one is preferred) ...
I use alltheweb with the syntax:
url.all:my.do.main
It differs from the Google link: search in two ways:
1. It shows links to all pages on your site (Google shows links only to the specific page you query).
2. It shows all the links it knows about, including many PR0 ODP clones as well as legitimate pages with low (but not zero) PR.
I have enough links from pages with PR of 4 or more that I do not really care about links from low PR pages, but I do like the ability to learn about the few deep links I get without having to do a Google link: search for each of my pages.