A page goes up and inherits its estimated PR (usually one point below the linking page). Then there is an update and the page goes PR0. I am wondering if it is because the page was added after the last deep crawl, and it will take another month for it to get its own PR rating. The pages are listed in Google, so it can't be a penalty, could it?
Anne
If the single link going to this new page is from a PR4 page that shares its transferable PR among many other links, the amount passed can be just below PR1, thus showing PR0 on the toolbar.
You will probably see an increase, but only after Googlebot crawls other inbound links to it.
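To put rough numbers on that, here is a minimal Python sketch of the classic PageRank transfer formula. The damping factor and link count are assumptions, and toolbar PR is only a coarse, roughly logarithmic view of the real score, so treat this purely as an illustration:

    # classic PageRank contribution from a single known inbound link:
    # PR(new) = (1 - d) + d * PR(linker) / outlinks(linker)
    d = 0.85           # damping factor (the usual assumed value)
    pr_linker = 4.0    # pretend the toolbar PR4 were the raw score
    outlinks = 40      # number of links on the linking page (assumed)

    pr_new = (1 - d) + d * pr_linker / outlinks
    print(pr_new)      # 0.235, well under 1, i.e. greybar/PR0 territory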
This has been a problem since November or December.
I hope it sorts itself out in the next update. If it's just a lag in picking up the inbound links, it should.
With my pages that do have a good PR (ones that have been up for quite a while), I am finding that a good, user-friendly internal linking system really helps. Usually they will make it to just one PR below the homepage. If I have just one or two outside links coming into a page, it seems to move up to the same PR as the homepage.
Anne
It seems to me that new inbound links can take more than one cycle to get picked up and factored into the PageRank. If the page doesn't move up within another cycle or so, you should take a close look at the way you are linking to it.
When you upload a new page, a "guesstimate" is offered, but if Googlebot does not crawl even one link before the next update, the page becomes unranked (greybar).
If Googlebot finds a link to the page in question but doesn't get back to crawl all links to the page, the corresponding PageRank for that update will be low.
Give this time, noting that new pages should have as many links to them as possible, offering a better chance of Googlebot finding and following many of them (see the sketch below).
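As a rough illustration of why undiscovered links matter, here is a minimal Python power-iteration sketch over a made-up three-page link graph. The graph, damping factor, and iteration count are all assumptions for demonstration, not anything Google has published:

    d = 0.85  # damping factor (assumed)

    def pagerank(links, iterations=50):
        # links maps each page to the list of pages it links to
        pages = list(links)
        pr = {p: 1.0 for p in pages}
        for _ in range(iterations):
            nxt = {p: 1 - d for p in pages}
            for p, outs in links.items():
                for q in outs:
                    nxt[q] += d * pr[p] / len(outs)
            pr = nxt
        return pr

    # Googlebot has so far found only one of the two links to "new":
    partial = {"a": ["new"], "b": ["a"], "new": ["a"]}
    # after a fuller crawl it knows about both:
    full = {"a": ["new"], "b": ["new", "a"], "new": ["a"]}

    print(pagerank(partial)["new"])  # about 1.39
    print(pagerank(full)["new"])     # about 1.43, higher once both links are seen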
Could this be a server issue that is preventing Google from deep crawling properly?
Unlikely a server issue, particularly if there are no crosslinks; even a few crosslinks wouldn't hurt. Googlebot needs to identify a pattern (normally associated with lots of crosslinking) to reduce or penalize a page or site.
Some possibilities might be (example directives follow the list):
a nofollow attribute on the links from the linking pages
a noindex tag on the specific page
robots.txt instructs bots not to crawl the page
an .htaccess file redirects the bot elsewhere or blocks access
links to this specific page are deep within your link hierarchy, making them difficult to find
the linking pages have lower PageRank, so Googlebot is less likely to crawl them; it crawls more important pages first
links to the page are obscure, and with no external path to the page, Googlebot doesn't have a clue it exists.
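For reference, here is roughly what those blocking directives look like. The page name /new-page.html and the target URL are made up for illustration:

    <!-- on the linking page: tells Google not to follow this link -->
    <a href="/new-page.html" rel="nofollow">new page</a>

    <!-- in the head of the new page: tells Google not to index it -->
    <meta name="robots" content="noindex">

    # in robots.txt: blocks crawling of the page
    User-agent: *
    Disallow: /new-page.html

    # in .htaccess (Apache): redirects requests for the page elsewhere
    Redirect 301 /new-page.html http://www.example.com/

Any one of these can keep the page out of the index or keep PR from flowing to it, no matter how long you wait.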
The best way to resolve this is to have a few links to the page, as close to the upper left corner as possible, on more important pages; once Googlebot finds it, these links can be removed.
This is a little like navigating a city at rush hour. Most people (Googlebot) tend to stay on major routes. Obscure side streets and byways carry less traffic because the bulk of navigators don't know they exist.
Making access highly apparent usually works.