Forum Moderators: open
teeceo.
Yes.
It makes a lot of sense.
On the other side, I can understand why the -sj behaviour creates so many postings. The results are quite different from the other datacenters.
And it appears to be a new phenomenon, so people can't help but speculate about what's going on.
Any info that is real helps. I'd rather see brief yet accurate messages than more info-noise based on speculation, which is often fueled by pure SERP-drop paranoia over individual sites/keywords.
And that these brief messages have a natural limit in their scope is very understandable.
[edited by: eraldemukian at 2:37 am (utc) on May 5, 2003]
I think that your last couple of posts have told us heaps.
Thank you again for taking the time out to help and clarify.
rfgdxm1 - all datacentre results are available for public view - as are www2, www3, etc. And this isn't the first time we've seen 'things' slip in and out of the live index - remember when we first saw freshbot appear?
I think it's just that we now pay more attention to 'glitches' than we used to, we read more into them - and more people share their views here!
Best
Chris_D
Happy to try to help. Some people get into quick knee-jerk reactions, but even in the 500 post thread, some solid info came out in the first 16-17 posts or so.
my knee-jerk reaction to that was to read the first 16-17 posts again.
Which basically said that -sj is a datacenter, and the links are down on it. [doh].
Tongue-in-cheek question:
how literal is '16-17'?
Message 11:
"This looks like an old database to me. Probably something they are toying with."
Message 12:
"I thought it was an old database too until I noticed some recently acquired incoming links to my site, and relatively new cache."
Message 13:
"The database is old but definitely a different algo."
BZZZZZZZZZT. Wrong.
or even (gulp) dmozguy?
I've been wondering if -sj isn't a bit screwed up on purpose, with a mixture of an old database and new, to keep us from finding out too much about this special sauce.
Then a few weeks ago I saw my Googlebot 2.1 hits go up by another 30 for a couple of days (30 new static pages), and when I looked at -sj, there they were. Not on www, but -sj has them, so it relaxes me a bit to know they got indexed and it's just a matter of time.
It would be nice if Google could index my other 1000 pages; maybe I could rank a little better :)
Ahh...I see that since that thread was closed for arguing with the umpire, you had to respond here. ;)
>I've been wondering if -sj is not a bit screwed up on purpose, with a mixture of an old database and new to keep us from finding out too much about this special sauce
Very unlikely. Why not just take sj offline so it can't be publicly seen? Although, even though this looks like madness, I wonder if there is a method in it. Consider the possibility that some Google intelligence agents (we do have a Google programmer posting here) decided to "accidentally" let this leak, and Google is using us as beta evaluators. If things are seriously hosed, someone here (or on another forum) might spot things that the Google crew would miss. Google may have decided that the risk of letting some low-life SEO types get some clues about the algo was less than the risk of letting loose a buggy update on the whole Internet. Lots of dissatisfied surfers would be worse than a small handful of SEOs who could use this to their advantage.
> it's easy to re-sync something like backlinks or spam snapshots once you're convinced that an algorithm or method is an improvement.
Sorry, I am not very good at reading comprehension, but... what if we just ignored the majority of spam results in SJ...
just a thought at 6:50 a.m.
Regards.
The waiting bit is new to me. I'm impatient, but you know what, I'm becoming more patient :) I'm learning other things here that reflect in my daily life, heheh.
The fun part is watching my inattentive competitors get bounced out of their cushy #1 slots.
I tell you what, I really get a kick out of that! Most have enjoyed the good life for too long; time to get back into their sites and do some optimizing.
Meanwhile it's fun to come out of nowhere and snag their position. I get a kick because I can picture these people's faces: "Who's this so-and-so?" And by the time they try and take back what was theirs, I will already have optimized even more. This, to me, is a fun game!
"Secret sauce" needs to be cooked some more before it can be served.
GoogleGuy said:
Competition and diversity are good things--they make search better for users.
GoogleGuy's comment here got me thinking.
Nothing ever goes into a production environment unless it is tested, and the SJ datacenter is no fluke. Here is what I think is going on.
1. A few datacenters are going to have near-real-time calculation of on-the-fly PR, and pages that are freshbotted will go into the index right away and stay there. Right now this would only be the SJ datacenter.
2. The other 7? data centers are going to continue doing the once a month updates coming from the deepcrawl.
3. These results are going to appear on www.google.com 1/8th of the time so that Google can monitor the performance of this datacenter and see how the new algo is performing.
I think the move to on-the-fly PR and constant freshness means that Google cannot test this index/algo in a testing environment, as the resources needed to mirror a production environment are just too great. Don't be surprised if in the months to come other datacenters start acting like SJ...
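For readers who haven't seen how PR is classically computed: the published PageRank method is an iterative calculation over the whole link graph, which is why recomputing it continuously ("on the fly") is such a heavyweight proposition compared to a once-a-month batch run. Google's actual internals are unknown; this is just the textbook power-iteration version on a toy three-page link graph, with all names made up for illustration.

```python
# Textbook power-iteration PageRank on a toy link graph.
# Purely illustrative -- Google's real "on-the-fly PR" is not public.

def pagerank(links, damping=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:
                share = damping * rank[page] / len(outs)
                for target in outs:             # split rank among outlinks
                    new[target] += share
            else:                               # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

# Toy graph: a links to b and c, b links to c, c links back to a.
ranks = pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]})
```

Every full pass touches every link in the graph, so over billions of pages even one iteration is a massive job; that is the scale argument for why a live, continuously re-ranked datacenter could only realistically be tested in production.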
www-sj is strange.
It shows a ton more inbound links to my site - definitely nice. BUT. My site itself isn't in the index! Not a single page. It's in all the other www's, and has been for the last two months (since it first went online).
Strange stuff, but like others, I won't get too worried until there's a real update. My site doesn't use any spammy tactics, so it should only be improving in the serps.
Google Guy said concerning www-sj: "bear in mind is that it's easy to re-sync something like backlinks or spam snapshots once you're convinced that an algorithm or method is an improvement."
I think we are going to see an update very close to what we all expect... If you are playing by the rules and have good back links.
It only makes sense that they put in old spammy (hand-edited) results for testing purposes, to see if their new algo will manage to filter them out. If it works on the old ones, then the new results should be cleaner too, catching a lot of similar spam before it hits www3... and saving them a lot of time hand-editing the results.
Of course, just a wild guess ;-)
could I be close, GG? :-)
First time I've actually seen this on www. Anyone else?