Forum Moderators: Robert Charlton & goodroi
URL: [example.com...]
PR: 8
URL: [example.com...]
PR: 7
URL: [example.com...]
PR: 6
URL: [example.com...]
PR: 5
URL: [example.com...]
PR: 4
URL: [example.com...]
PR: 3
URL: [example.com...]
PR: 2
URL: [example.com...]
PR: 1
URL: [example.com...]
PR: 0
What do you guys think?
I have tended to notice that subpages on a high-PR site seem to rank much higher in the SERPs than a separate page that you link to from that high-PR site.
Very interesting. Does this mean that if you have a PR 8 site, it's better to make a page at www.site.com/widgets than it is to link to a site made at www.newsite.com?
What do you guys think? I have tended to notice that subpages on a high-PR site seem to rank much higher in the SERPs than a separate page that you link to from that high-PR site.
It's definitely better to create the page on the old site for ranking purposes. There is no sandbox for pages, so it will get you indexed and into the SERPs quicker.
This is one of Google's biggest flaws. I am seeing more and more of www.authoritysite.com/choose-high-paying-keyword-that-has-nothing-to-do-with-site/...
I have been able to get indexed and listed in the top two pages within days using ISP web pages that give you a subdirectory off their home page, while my seven-month-old site is sitting in the sandbox.
What I did was use these pages to "brand" my real site so that I get the branding I need. So to answer your question, it's a good idea to do both.
[edited by: JaySmith at 3:55 pm (utc) on April 19, 2005]
For people who think the sandbox does not exist, try this. You will see there is definitely some type of aging filter in place.
The original tests were performed on W3C Link Checker using Firefox v1.0.2 (with Google Pagerank Status v0.9.3) on Ubuntu Linux. I just tested this on Windows using both IE and FF. It produced the same results.
PR 8: http:// validator.w3.org/checklink?uri=www.example.com
PR 1: http:// validator.w3.org////////checklink?uri=www.example.com
Click on your inbox to go to your stickymail, ^^^above^^^, then look at the URL and add slashes between the .com and the stickymail.cgi, e.g.
.com////stickymail.cgi
PR decreases with every slash.
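One thing worth spelling out (my own sketch, not anything from Google): the two checklink URLs above differ only in the run of slashes, yet as strings they are different URLs, so any lookup table keyed on the exact URL would treat them as separate pages. A quick Python check:

```python
from urllib.parse import urlsplit

# The two checklink URLs from earlier in the thread differ only in the
# run of slashes, yet as strings (and as lookup keys) they are distinct.
plain = "http://validator.w3.org/checklink?uri=www.example.com"
slashed = "http://validator.w3.org////////checklink?uri=www.example.com"

print(urlsplit(plain).path)    # /checklink
print(urlsplit(slashed).path)  # ////////checklink
print(plain == slashed)        # False: a PR table keyed on the exact URL
                               # would see these as two separate pages
```

If the slashed variant has never been crawled, the toolbar has no stored PR for it and would have to fall back on a guess.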
I have always wondered how Google can allocate PR to these pages. My guess is that it doesn't, but that the toolbar gives an approximation of the PR by looking at the page's directory depth and the root domain's PR.
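That guess can be written down as a toy model. Everything here is hypothetical: the "one PR point per extra path segment" decay is simply chosen to fit the two checklink readings posted earlier in the thread, and nothing about the real toolbar is documented to work this way.

```python
from urllib.parse import urlparse

def estimated_toolbar_pr(url: str, root_pr: int) -> int:
    """Hypothetical depth-based estimate: start from the root domain's PR
    and subtract one point for every path segment beyond the first.
    Doubled slashes create extra (empty) segments, so they cost PR too.
    This is a guess fitted to two data points, not Google's algorithm."""
    slashes = urlparse(url).path.count("/")
    return max(root_pr - max(slashes - 1, 0), 0)

# The two toolbar readings reported above are consistent with this model:
print(estimated_toolbar_pr(
    "http://validator.w3.org/checklink?uri=www.example.com", 8))         # 8
print(estimated_toolbar_pr(
    "http://validator.w3.org////////checklink?uri=www.example.com", 8))  # 1
```

Two data points are obviously not proof; the model just shows the depth hypothesis is at least self-consistent.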
[slate.msn.com...]
PR 7; must be PRonemillion without the slashes.
You might want to use the Google toolbar instead, especially before posting something so obviously wrong.
[slate.msn.com...] PR 7; must be PRonemillion without the slashes.
You're comparing a site with mod rewrite and dynamic content to sites without.
On the other hand, I did try that validator site that moltar suggested and was able to repeat what he showed.
So unless my brain hasn't kicked into gear, it seems to affect only certain sites. Can anybody explain this?
Oh, and steveb, I guess as usual everyone else is wrong and you are right.
Try this
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]
You might want to use the Google toolbar instead, especially before posting something so obviously wrong.
I said later on:
I just tested this on Windows using both IE and FF.
And IE obviously had a real toolbar, not an extension ;)
I think it happens for pages that either cannot be crawled or have been crawled with a different query string. Thus my stickymail example.
I would say that it is not PR at all, but just the toolbar trying to estimate PR.
In the case of the W3C link checker, yes, it cannot be crawled, but many people link to it. PR is calculated from incoming links, so that shouldn't affect anything.
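For reference, the "calculated from incoming links" point is the classic PageRank recurrence, PR(p) = (1-d)/N + d · Σ PR(q)/out(q) over pages q linking to p. A minimal power-iteration sketch (the real system is far more involved, and the graph below is made up purely for illustration):

```python
def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to.
    Classic recurrence: PR(p) = (1-d)/N + d * sum over q->p of PR(q)/out(q).
    Dangling pages simply leak rank in this simplified sketch."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        nxt = {p: (1 - d) / n for p in pages}
        for src, targets in links.items():
            for t in targets:
                nxt[t] += d * pr[src] / len(targets)
        pr = nxt
    return pr

# Made-up graph: "checklink" stands in for a page that gets links from
# many sites even though it never links out in a crawlable way.
graph = {"siteA": ["checklink"], "siteB": ["checklink"], "siteC": ["siteA"]}
ranks = pagerank(graph)
print(ranks["checklink"] > ranks["siteB"])  # incoming links raise PR
```

The point being: a page can accumulate real PR from inbound links even if its own outbound content is never crawled, which is why the toolbar showing something else looks like an estimate rather than real PR.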
What are you talking about? What was posted originally is total nonsense, as anyone can check with any URL. It's not news either that the Google toolbar has never read complex query strings right, ever, but what does that have to do with the original post? And what's the point of posting those stickymail URLs when none of them have PageRank?
Is it still April Fool's day in some places?
(Or are the people seeing this mysterious behavior with static URLs all using Firefox?)
[edited by: steveb at 5:52 am (utc) on April 21, 2005]