over 1000 sites and they are showing huge changes for today.
I also saw the chart showing that this is the largest number of changes since they started tracking the top 1000. However, in checking our sites there do not appear to be any changes, and the Google forum is very quiet.
I wonder if they access one data center that has made changes while the other DCs have remained the same?
As of 2005-04-30 at about 23:30 UTC Fresh Dates reappeared across all SERPs pretty much simultaneously, after having been missing for 36 hours.
I originally noticed them missing around midday of 2005-04-29.
Anyway, looks like an update was due based on the date and a pattern of spidering I see on my site. On Wednesday my site was deep spidered by Google and that plus the date told me an update might be coming soon.
I also looked at that pulsing rank thingy but all is so quiet here. Anyway, the pattern I usually see for my site is this:
Heavy spider about 3 - 6 days before an update
Heavy spider about 7 - 10 days after an update
Repeat (in a very predictable way)
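Spotting those "heavy spider" bursts is just a matter of counting crawler hits per day in your access logs. Here's a minimal sketch (my own, not from this thread), assuming an Apache-style combined log and a userAgent containing the string "Googlebot":

```python
import re
from collections import Counter

# Matches the date portion of an Apache combined-log timestamp,
# e.g. "[27/Apr/2005:10:00:00 +0000]" -> "27/Apr/2005".
LOG_DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(log_lines):
    """Return a Counter mapping date string -> Googlebot request count."""
    counts = Counter()
    for line in log_lines:
        # Crude but effective filter: real log analysis might also
        # verify the IP, but this is enough to see the crawl pattern.
        if "Googlebot" not in line:
            continue
        m = LOG_DATE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts
```

A sudden day with several times the usual count is the "deep spider" signal described above.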
Anyway, I was deep spidered on Wednesday, so I was not surprised to see the rank thingy report, figuring that was the update. But what I am surprised to see is that Google is deep spidering my site again today. The last time that happened was in September.
Anyone else seeing Gbot spidering hard today?
One site that lets you store your keywords and track your +/- in the rankings showed me at -9999 at one point earlier in the day; I just checked now and I am back to 3rd...
perhaps there is some kind of update on the way?
I also saw the chart that this is the largest number of changes since they started tracking the top 1000
I too am seeing the ranking chart that everyone is talking about.
Can someone help me out here please? What chart?
g1smd - what are you using as baseline to test with? I used webmasterworld the other day as a test. Valid?
At least 20 different searches . . . . BBC . . .
BBC works for me. Also perhaps my favorite, content rich site, and I'm across the pond.
Webmasterworld is not a good choice for checking fresh dates as Brett does not allow pages to be cached.
Yes, see that now, missed it earlier in the thread.
Also, thanks to the folks who forwarded me the chart url.
So I would not know which DC is leading the way.
Seeing good Googlebot activity last night and this morning though (mostly Mozilla - did someone mention that the Mozilla bot does not add pages to the index?).
I'd be interested in hearing how you are doing over the next month or so in Google. That Mozilla bot is a strange one, sometimes it works from the same IP address as the Adsense bot.
The theory I heard was that a visit from this bot is not a positive sign, only Google knows for sure what this critter is looking for.
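Whatever the critter is looking for, you can at least confirm it really comes from Google. One technique (a sketch under my own assumptions, not something from this thread) is a reverse-then-forward DNS check: resolve the IP to a hostname, check the hostname belongs to a Google domain, then resolve it forward to confirm it maps back to the same IP. The resolver functions are injectable here so the logic can be tested offline:

```python
import socket

def is_google_crawler(ip,
                      reverse=lambda ip: socket.gethostbyaddr(ip)[0],
                      forward=socket.gethostbyname):
    """Double DNS check: IP -> hostname -> back to the same IP."""
    try:
        host = reverse(ip)
    except OSError:
        return False
    # Hostname must sit under a Google-owned domain.
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward-confirm, so a spoofed PTR record does not fool us.
        return forward(host) == ip
    except OSError:
        return False
```

A plain userAgent check is trivially forged; this double lookup is not.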
I do believe that it takes compressed pages when offered.
Not sure what the Mozilla Googlebot does - there are suggestions that it does not add pages to the index, that it lays a path for the regular Googlebot, or that it is a duplicate content checker.
For those who don't know, the "Mozilla/5.0 (compatible ...)" section of the userAgent is derived from the days of the browser wars.
Netscape included a whole bunch of non-standard extensions to HTML in their browser, such as the blink and script tags.
Web designers started using these extensions in their pages, but in order to satisfy users of the other browsers, they detected the userAgent and served a standard HTML document to any browser that didn't include the term "mozilla" in its userAgent.
Since then, every browser that supports these Netscape extensions has started its userAgent string with "Mozilla/x.x (compatible; ...)" to make sure that these detection scripts showed it the main version of the page.
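The detection described above was usually just a substring check. A toy reconstruction (my own illustration; the filenames are made up, not anyone's real script):

```python
def pick_page(user_agent: str) -> str:
    """Serve the Netscape-extension page only to 'mozilla' agents."""
    if "mozilla" in user_agent.lower():
        return "enhanced.html"   # version using blink, script, etc.
    return "standard.html"       # plain HTML for everything else
```

Which is exactly why every later browser, and Googlebot's Mozilla variant, claims to be "Mozilla (compatible ...)": it is the only way past checks like this one.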
Now, just a thought...
Google is rumored to be creating a browser of their own. I guess it might be possible that this bot is actually an automated version of their in-development browser. In which case they may be testing it out on a large sample of real-world web sites.
It may be detecting cloaking. However, the type of cloaking it would be detecting is "good cloaking", i.e. serving the best version of HTML that the userAgent supports. I can't see Google downranking sites for that.