Forum Moderators: open
teeceo.
So far it seems to me that they have purposely included old spam results in the -sj results as a way of testing some new filters (something we have been told about).
I can't see this being too far-fetched, Daarsie. I see the logic in it and the reasons for it too.
As for the suggestion that a selection of datacentres would hold freshie results, plus the imminent release of an "on the fly" PR change: there would obviously have to be changes to the current infrastructure for that to work, or the increased load will be phenomenal.
My tip is to try and spot when and if the Google toolbar changes version.
Now I just have to figure out what this new algo is all about :)
I've been wondering if -sj is not a bit screwed up on purpose, with a mixture of an old database and a new one, to keep us from finding out too much about this special sauce.
Powdork had it in that post, I bet, except it might just be freshbot additions to the old index with nothing from the most recent deepbot. I doubt if it's what we'll see in the next update.
Of course, I might be totally wrong. :)
As for the results on -sj, I have two comments:
1) Google is trying to eliminate bought PR. In my industry there are about 5 sites that do this (or something similar), and on -sj, 4 of them are just nowhere ...
2) Or this server shows just part of the backlinks, so they are testing it on a part of the data, not the whole.
Number 2 can't be 100% true, as the cache for these sites is good - it contains those links ... so it is just strange.
If number 1 is true, then the algo needs to be much more accurate and up to date. I also see one of our biggest competitors (in fact it was #2 for widgets for about 5 months) disappear (now it is at ~#500). And he just has a lot of links from other sites, and they are related, not bought. They are not hidden, they are seen very well on all pages, and the sites are related by theme ... This situation is OK for me personally ;) but I am afraid of changes like these, which push away good companies without any reason, just because the algo is not very good. :(
The SERPs appeared on Google.ch, and they match -sj exactly. I really hope this is not it; the results are not relevant. Some excellent results for my client, but overall just not "Google standard".
I previously had more than one domain pointing at the same site. Following the great advice I got here, I organised 301 redirects on the outdated domain names. As a result, I got my main site unbanned, went up a PR point as backlinks were combined, etc. And the old domains disappeared from the SERPs (they had had pages indexed).
So, I cleaned up my act - good for me, good for Google and good for users.
The -sj results are showing thousands of pages from those outdated domains - obviously a backward step. There's no way they would be reincluded in the real update.
The question is - why would they include results that they have been encouraging us to remove?
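For anyone wanting to consolidate old domains the same way, here is a minimal sketch of a sitewide 301 redirect, assuming the old domain is served by Apache with mod_rewrite enabled (the domain names are hypothetical placeholders, not from the post above):

```apache
# Hypothetical example: permanently redirect every URL on an outdated
# domain to the same path on the main site, so indexed pages and
# backlinks are consolidated under one domain.
# Assumes Apache with mod_rewrite enabled; replace the placeholder domains.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.example$ [NC]
RewriteRule ^(.*)$ http://www.main-domain.example/$1 [R=301,L]
```

The R=301 flag is what signals a permanent move; a plain redirect (302) would not tell search engines to drop the old URLs.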
I say, let them fiddle around, it's clearly some testing. To those who say they shouldn't test in public, well, clearly they are testing something that they can't test in any other way.
Seems obvious enough to me.
Why not just take sj offline so it can't be publicly seen?
Because the whole point is to see if it generates more complaints or not! It's all very well for SEOs or Google employees to speculate about what is or is not spam, or the relative value to end users of different pages, but there's only one way to test different algorithms/SERPs on real end users...
I think Google will be watching the "Dissatisfied with your search results" feedback very closely, and analysing how it differs on results generated by the -sj data centre.
The same rule applies as every other time panic starts to set in: wait, and it will be fine. It always, always is, and no number of threads and posts makes the damnedest difference...
end of rant :)
Harley
I dread to think what this forum will be like when Google does start really testing out continuously updating indexes, constant spidering etc. Can you imagine all the posts we will get...
Yeah: continuous testing datacenter(s), the normal updates and now the (up)coming hidden text/links one-month penalty.
More interesting and more difficult.
For goodness' sake, it's not nice to reveal something (make it publicly accessible) and then skirt the issue of what it is, letting everyone stew in their own juices.
I'm with the "OMG" crowd if this is the new index, because my site is *nowhere* to be found in -sj, after having 10K pages crawled in April.
GG, can you confirm, please? We don't need to know specifically what -sj is; a gentle "yes" or "no" on whether -sj is the update will suffice.
Peter
It would be nice if they kept doing this, though, as it's nice to see what pages on my site get indexed before the big update, and it's refreshing to see that a search for my new "sitename" has gone from 150 instances to 700 :) But yet again, I have taken it with a pinch of salt.
Google owns your #1 ranking ... not you.
Earning your living based on your #1 ranking is a risky business.
Your site can disappear without warning, and the only person you have to cry to is yourself.
I'm glad this -sj thing happened ... it's a wake-up call to all of us.