Sorry that I haven't been around as much to answer questions lately--I've been really busy with a project.
Makes me suspect that something is cooking at Google. I just saw some *hugely* different SERPs for some very uncompetitive keywords where freshbot activity is minimal. I can only assume I hit one of those "hidden" datacenters briefly. I had thought a dance had started. However, it now occurs to me that Google may be using some of these datacenters for test indexes. Test indexes they want to hide from the world, or at least from the SEOs of the world.
Added. Dang, I wish I had screen snapshot software. MASSIVE difference on a SERP on www2 and www3, with www being the old index. Somebody please confirm I'm not crazy. ;)
Unfortunately I did not get a chance to try out any real searches before it switched back. Did you get a chance to try any searches other than in your own category?
Nope. However, I did manage to save one SERP from www and www2 to a text file as proof, in case anyone thinks I didn't really see this. This is a very uncompetitive SERP, and the pages on it rarely, if ever, change. Something is afoot at Google.
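For anyone wanting to capture evidence before the index flips back, here's a rough Python sketch of the save-and-compare idea. The www2 alias, the query, and the HTML scraping are all assumptions for illustration; Google's markup and those hostnames may not behave this way at all:

```python
import re
import urllib.request  # only used in the demo at the bottom

def extract_result_urls(html):
    """Pull absolute href targets out of raw SERP HTML (a crude heuristic)."""
    return re.findall(r'href="(https?://[^"]+)"', html)

def serp_diff(urls_a, urls_b):
    """Return URLs present in one SERP but not the other, order preserved."""
    only_a = [u for u in urls_a if u not in urls_b]
    only_b = [u for u in urls_b if u not in urls_a]
    return only_a, only_b

if __name__ == "__main__":
    # Hypothetical: compare the same query on www and www2.
    query = "some+uncompetitive+phrase"
    pages = {}
    for host in ("www.google.com", "www2.google.com"):
        with urllib.request.urlopen(f"http://{host}/search?q={query}") as resp:
            pages[host] = resp.read().decode("latin-1", "replace")
    only_www, only_www2 = serp_diff(
        extract_result_urls(pages["www.google.com"]),
        extract_result_urls(pages["www2.google.com"]))
    print("only on www:", only_www)
    print("only on www2:", only_www2)
```

Saving the extracted URL lists to text files, as the poster did, gives you something diffable later even without screenshot software.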
This began around the 17th-18th. The Google cache reverted back a few weeks, along with the SERPs for keywords on those pages.
After two days, the cache slowly started to return. Right now the cache for all my sites is back to normal (their last cached state), along with my ranks & keywords.
But again, I still haven't had any bot visits since the 18th. I was getting at least two hits a day before.
Another weird thing I'm noticing is that links from links.htm or similar pages are not showing up in a 'link:' search, even if the page has a PR > 4. But if the word 'links' in the URL is prefixed or suffixed with some other word (abc-links.htm, linksabc34.htm, etc.), then the links are very much being credited. Maybe this is also affecting the SERPs.
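To test that exact-filename hypothesis against your own backlink list, here's a small Python sketch. The filter itself is pure speculation from the post; the regex just encodes the pattern described (a path ending in a bare links.htm or links.html is flagged, prefixed/suffixed names are not):

```python
import re

# Speculative: pages named exactly "links.htm" / "links.html" seem ignored
# by 'link:', while names like abc-links.htm or linksabc34.htm still count.
EXACT_LINKS_PAGE = re.compile(r"(^|/)links\.html?$", re.IGNORECASE)

def looks_filtered(url):
    """True if the URL's path ends in a bare links.htm / links.html."""
    path = url.split("?", 1)[0]  # drop any query string first
    return bool(EXACT_LINKS_PAGE.search(path))
```

Running your inbound-link URLs through this would show which ones fall under the suspected filter.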
GoogleGuy said loud and clear: the datacenters are there, but the outside direct addressing is gone.
If I were in their situation, I would install a round-robin DNS entry and watch the load on the datacenters closely while I kick out 1, 2 or 3 of them... that is almost a live check of the minimum number of datacenters I need, without risking taking the whole world offline.
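That round-robin-with-health-check idea could be sketched roughly like this in Python. The host pool and the "alive" set are made up for illustration; a real check would be an HTTP or TCP probe, not a set lookup:

```python
from itertools import cycle

def rotator(hosts, is_up):
    """Round-robin over hosts, skipping any the health check says are down."""
    ring = cycle(hosts)
    while True:
        for _ in range(len(hosts)):  # at most one full lap per pick
            host = next(ring)
            if is_up(host):
                yield host
                break
        else:
            raise RuntimeError("no datacenter is answering")

# Hypothetical pool and liveness data, just to show the rotation behaviour:
pool = ["www2.google.com", "www3.google.com", "www-sj.google.com"]
alive = {"www2.google.com", "www-sj.google.com"}
pick = rotator(pool, alive.__contains__)
```

Each call to `next(pick)` hands out the next live host, which is exactly the "pull a datacenter, watch the load" experiment described above.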
Only the aliases & their usual IPs have been removed/filtered, BUT!
will do a trick :)
Maybe some datacenter will not accept .99, but you can use .98 or .101 and it will still work fine :)
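If you want to try neighbouring addresses systematically, a tiny sketch for varying the last octet. The base IP here is a documentation placeholder (TEST-NET), not a real Google address:

```python
def neighbours(base_ip, deltas=(-1, 1, 2)):
    """Vary the last octet of an IPv4 address to guess sibling datacenters."""
    prefix, last = base_ip.rsplit(".", 1)
    out = []
    for d in deltas:
        octet = int(last) + d
        if 0 <= octet <= 255:  # stay inside a valid octet range
            out.append(f"{prefix}.{octet}")
    return out

# Placeholder base address ending in .99, as in the post:
candidates = neighbours("203.0.113.99")
```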
I wouldn't be surprised to see a lot of the www-xx.google.com aliases (like www-sj and www-fi) go away, at least for a while
Hmmm, why would that be?
How exactly does Google's search work? I thought perhaps it directed people to the nearest datacenter for every single search. After seeing the totals change so much today, I now think you get routed to different datacenters based on load balancing. I guess that's okay, but boy, when the result counts are really low it gets kinda confusing. On the first screen it would say something like 200 total results; then, after selecting "display 100", the total changed to about 150. Then when I selected "show hidden results" the total dropped again, to around 80. I couldn't tell whether pages on the same site were being dropped, or whole sites.
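For anyone wanting to reproduce those shifting totals, the relevant knobs are the `num` and `filter` query parameters (`filter=0` being the "show omitted results" view). A small sketch for building the URLs; the parameter behaviour is as described in this thread, not guaranteed:

```python
from urllib.parse import urlencode

def serp_url(query, host="www.google.com", num=10, filter_dupes=True):
    """Build a Google search URL; filter=0 shows the 'omitted' results."""
    params = {"q": query, "num": num, "filter": 1 if filter_dupes else 0}
    return f"http://{host}/search?{urlencode(params)}"
```

Fetching the same query with `num=10`, `num=100`, and `filter_dupes=False`, and comparing the reported totals, would show whether the counts are a per-datacenter estimate or a duplicate-filtering effect.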
If it is, it certainly is an argument for cranking the anchor text value in the algorithm down a few dozen notches.
I do hope that anchor text is devalued. Last week I got one more link from an academic site, and the professor put my name as the anchor text! I am not trying to come up #1 for my name or 'click here.'
Another thing I've noticed: when you can't get to a datacenter, Google's error page asks to run an ActiveX control, which I found a bit strange. Anyone know what it is? (I'm not game to allow it to run.)
edit - just tried doing a search across the datacenters and only got 2 responding. Also, the ActiveX control was gone (it's been about a week since I last searched across the datacenters).
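A quick way to see which datacenter hostnames still answer is a plain TCP probe on port 80. A sketch; the wwwN.google.com names are from this era and may no longer resolve:

```python
import socket

def responding(hosts, port=80, timeout=3.0):
    """Return the subset of hosts that accept a TCP connection on `port`."""
    up = []
    for host in hosts:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                up.append(host)
        except OSError:  # DNS failure, refusal, or timeout all land here
            pass
    return up

if __name__ == "__main__":
    datacenters = [f"www{i}.google.com" for i in range(2, 8)]
    print(responding(datacenters))
```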
I just wanted some help on the problem below.
Sorry if my terminology is lacking.
I am totally stumped by what is going on with Google.
A few days ago (I'm in the UK), both the .com and .co.uk sites were slow at first; now they're totally inaccessible. I have resorted to using different IP addresses to get the English version.
I've got no viruses, no trojans & no spyware.
So, is this really Google's fault?
[edited by: sjp1 at 10:22 am (utc) on Jan. 23, 2004]