| 2:09 am on Jan 4, 2002 (gmt 0)|
I haven't been getting anything on the toolbar for the past hour. Maybe that's a sign that they are hard at work trying to fix this month's PR glitch. :)
| 2:26 am on Jan 4, 2002 (gmt 0)|
Think it was a glitch WG?
| 2:27 am on Jan 4, 2002 (gmt 0)|
Maybe ... :) It ain't helping my midnight search for good link candidates any ;)
| 2:29 am on Jan 4, 2002 (gmt 0)|
I'm just getting a blank (white) page rank for every site too.
Definitely something going on :)
Hopefully something good :)
| 3:35 am on Jan 4, 2002 (gmt 0)|
>>Think it was a glitch WG?
It looks to me like a glitch. It is just too widespread, and it has been applied to far too many sites for me to think it's some kind of intentional spam filter.
If it is a spam filter, what is the point of only removing the PR of the pages, but keeping the content in the database? All of the sites that I've been watching that have experienced the zero pagerank hit always end up back in the database the following update.
I just can't see the logic in a spam filter that only kicks out offending pages every other month. (Unless of course, the real purpose of the filter is to simply rotate results).
I think that server errors may be playing a role. I spent some time digging through the error logs on a site that has been having this problem. I found that just about every other month, Googlebot has been showing up and making numerous requests for a file that hasn't existed on this particular site in almost three years. The file was an old duplicate index page that had been built by a previous SEO firm.
On each request for the page, Googlebot was given a standard 404. Like clockwork, the update following the crawl containing all the 404's for the old page resulted in zero pagerank for the entire site.
Apparently, somewhere on the web, there are a couple of pages that still link to the old index page. Because those pages are so old, Googlebot doesn't come across the links in every crawl, but when it does, it adds the URL to the queue and makes several attempts throughout the month to retrieve it.
When it finally gives up, the PR for all the pages is dropped, but the content from the previous crawl remains in the database, and Googlebot shows up again the following month.
Now temporarily removing a site that returns a high number of 404's in a given crawl may make sense, but I think that some of the new things they've been toying with may have caused Googlebot to collect a much higher than normal level of server errors.
Since most hosting companies don't provide access to server error logs, it becomes quite difficult for a lot of webmasters to pinpoint what the problem is, which makes the whole thing that much harder to diagnose.
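For those who do have access to their raw access logs, the pattern WG describes can be checked with a short script. This is only an illustrative sketch: it assumes the common Apache "combined" log format, and the sample lines, paths, and user-agent strings below are made up for the example.

```python
import re

# Hypothetical sample lines in Apache "combined" log format; the IPs,
# paths, and user-agent strings here are invented for illustration.
SAMPLE_LOG = """\
66.249.64.1 - - [03/Jan/2002:02:10:00 +0000] "GET /old-index.html HTTP/1.0" 404 209 "-" "Googlebot/2.1"
66.249.64.1 - - [03/Jan/2002:02:11:00 +0000] "GET /index.html HTTP/1.0" 200 5120 "-" "Googlebot/2.1"
192.0.2.7 - - [03/Jan/2002:02:12:00 +0000] "GET /old-index.html HTTP/1.0" 404 209 "-" "Mozilla/4.0"
66.249.64.1 - - [03/Jan/2002:02:13:00 +0000] "GET /old-index.html HTTP/1.0" 404 209 "-" "Googlebot/2.1"
"""

# Matches the request, status code, and user-agent fields of a combined-format line.
LINE_RE = re.compile(
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_404s(log_text):
    """Count 404 responses served to Googlebot, grouped by requested path."""
    counts = {}
    for line in log_text.splitlines():
        m = LINE_RE.search(line)
        if not m:
            continue
        if m.group("status") == "404" and "Googlebot" in m.group("agent"):
            path = m.group("path")
            counts[path] = counts.get(path, 0) + 1
    return counts

print(googlebot_404s(SAMPLE_LOG))  # {'/old-index.html': 2}
```

A path that keeps turning up month after month in output like this is a good candidate for the kind of long-dead URL WG found.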
| 3:55 am on Jan 4, 2002 (gmt 0)|
WG - I am sure you are onto something here!
I haven't had any sites hit by the recent 'PR Glitch', but I DO use custom 404 pages on all of my sites.
I took a look at the logs of one of my older sites, one that's been around for five years or so and has had three or four complete rewrites, and sure enough there are hundreds of Googlebot requests for pages that haven't existed for years, but in each case the spider was fed a custom 404 page.
So the workaround looks like it could be custom 404 pages!
| 4:59 am on Jan 4, 2002 (gmt 0)|
>Google Glitch ?
I just looked and the PR is now gone from several sites I watch - not grey, just gone. On one that has some quality links including Google directory the toolbar says nothing showing but 27 are showing at the Google site.
One night early in the update, the PR kept flickering in and out on the toolbar all night. It was very peculiar; I've never quite seen that before, even during updates.
>making numerous requests for a file that hasn't existed on this particular site in almost three years
Still looking for a directory that hasn't been on a site since last April, and there never were any external links to it, except from another of my sites.
| 5:05 am on Jan 4, 2002 (gmt 0)|
I noticed this too. Started around 4-5PM CST.
| 10:57 am on Jan 4, 2002 (gmt 0)|
Yes, I hope it is a glitch. It must be a glitch. All five of my sites have 0 PageRank. The bad results (my pages are nowhere to be found) have lasted around 4-5 days now.
BUT I checked some sites that aren't mine, ones at the top of the Google results, and they all have their PR intact. So it affects only some pages, not all.
But it must affect a lot of them; otherwise it would be really strange that all five sites I own were hit. They are crosslinked, yes, so maybe the entire "thread" of linked sites is affected.
| 11:55 am on Jan 4, 2002 (gmt 0)|
Not pretending to be an expert - in fact, I only found this place a month or so ago and have just been lurking around trying to find out how to rank well (any tips very gratefully received!) - but my restaurant review site had a PR of 0 on the toolbar last month, and that didn't affect the results at all; this month it jumped up to a respectable 5. Are your positions affected, or is it just a visual thing?
| 4:37 pm on Jan 4, 2002 (gmt 0)|
PR heavily influences SERPs
| 5:44 pm on Jan 4, 2002 (gmt 0)|
Just to clarify, we're really talking about two different things in the same thread. Conor's original post was about the Google PR toolbar not functioning for any sites. For a couple of hours yesterday it just wasn't working.
The toolbar ceasing to function won't have any impact on search results. The PR is still there, it's just a matter of the toolbar not being able to retrieve it.
The other issue is the zero PR penalty everyone has been talking about. If your site was hit with it in the latest update, then it will affect search results.
| 6:08 pm on Jan 4, 2002 (gmt 0)|
I have a custom 404 page on my site, and with this last Google update 4-5 days ago, I went from a PR5 to PR0. I'm still confused as to why. I don't use spam techniques. My traffic has decreased a lot. Maybe it really is a glitch. It seems so widespread. This has never happened to me before.
| 9:04 pm on Jan 4, 2002 (gmt 0)|
I don't think it has anything to do with whether or not you have a custom 404 page. If you use a custom 404 page you will present a page with links that can be crawled, but your server will still return 404 rather than 200. I think it is the total volume of 404's GoogleBot receives in a given month that is contributing to the PR being dropped.
I'm going to collect some info and then post a new thread in a moment.
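The distinction above - a custom error page served with a real 404 status versus one served with a 200 - is easy to verify. A minimal sketch, assuming Python's standard library and a hypothetical URL (a properly configured server returns status 404 for a missing page even when it serves a friendly custom error page; a "soft 404" setup returns 200):

```python
import urllib.request
import urllib.error

def fetch_status(url):
    """Return the HTTP status code for a URL, including error statuses."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def soft_404(status):
    """True if a request for a missing page came back 200: the custom error
    page looks like a real page to a crawler instead of a proper 404."""
    return status == 200

# Hypothetical usage (requires network access; URL is made up):
# status = fetch_status("http://www.example.com/no-such-page-xyz")
# print(status, "soft 404!" if soft_404(status) else "proper 404")
```

Requesting a deliberately nonexistent path on your own site this way tells you which case your custom 404 page falls into.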