Forum Moderators: open
I think this mix of old/new data could easily be explained. :) Bear with me a second...
If Google wants to test some continuous update process, they won't be doing that on the real database (the one generated from the latest deepcrawl) but on a previous version.
They most probably don't want to f** up the latest database in case there is any bug in the process. Don't you think so?
A continuous update would mean permanent changes in the live database, i.e. for testing purposes, they HAVE TO use an old version. This could explain, IMHO, the mix of old data with freshbot data.
I believe that some really interesting change is under way...
Dan
I have a new site that went up in February. It was deep-crawled in February and has about 1,700 pages in Google right now (except on sj-).
In March I got crawled to the tune of almost 10,000 pages.
In sj- I have *no* pages at all. Since my site is *completely* clean this leads me to believe that it's an old index.
Peter
All in all, I don't see any major changes in the -sj index as far as my site is concerned, although I think it may give slightly more weight to PageRank than the current index does. (In a couple of my topic categories, hotel-booking sites and other sites that use multiple domains with crosslinking have moved to the top of the rankings even for non-lodging search terms.)
[edited by: europeforvisitors at 3:06 pm (utc) on May 3, 2003]
Yes, it's really strange.
One of my sites dropped from #2 (of 220.00) to #290 for its main keyword. However, for allinanchor:keyword it only dropped from #2 to #3, the "link:" command still shows all the important backlinks, and the cache shows the current version.
Thus, I couldn't find any reason for this behaviour and I think that no kind of 'on the fly' calculation / update could explain this.
All of the sj testing and tweaking is fine, but why would they let the sj results become the www results, which is what I have right now. Currently sj=www SERPs.
If truly sj=www, then perhaps they've just replaced it with the current database (maybe getting ready for a new test? or maybe they're just finished toying with us? ;))
I doubt that!
I think they're merely testing us ;)
What I would really like to see (are you reading GoogleGuy?) is that they test merging deepcrawl data into the index on the fly.
But I would be happy if it was.
On the most important phrase I've been working on for a client they were at #34 2 mos ago, then Freshbot punched them in and out of #17. The update had them bouncing in and out of #17 too.
When it settled, though, they landed at #32. :(
Been doing a good link campaign and sj has them at #11 for the phrase.
On the other hand, a phrase we got them to #3 on is at #11 & #12 (on sj), and the results point to different pages.
Come on GoogleGuy, #11 is SUCH a tease!
:)
Of course, this means nothing...unless GG states it DOES mean something. ;)
AW
Google are tweaking the algo. As was said earlier, they will use an older database so as not to damage the fresh index that is being built.
Google have been known to use click tracking in their SERPs from time to time. I believe that click tracking is a way of determining how well any given database is performing: how many people click on result 1, 2, or 3, and how many people do another search instead. This could be a method of determining how well their new algo is performing by showing sj results to random users.
This is something MSN used to do quite a lot when they had altered their web search. Most users got the original search, but now and then a user would see something new. It is possible that Google have found the best way to test changes is to publish them and gauge feedback.
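The click-tracking idea described above can be sketched as a toy comparison between two index versions. This is only an illustration of the metric, not Google's actual method: the log format, field names, and sample sessions below are all invented for the example.

```python
# Toy sketch of click tracking as an index-quality signal: given a log of
# search sessions (which index the user saw, which result position they
# clicked, and whether they had to search again), summarize clicks per
# position and the "searched again" rate for each index version.
from collections import defaultdict

def summarize(sessions):
    """Return per-index click and reformulation statistics.

    Each session is a dict:
      {"index": "live" or "sj",
       "clicked_position": int or None,
       "searched_again": bool}
    """
    stats = defaultdict(lambda: {"clicks": defaultdict(int),
                                 "sessions": 0,
                                 "reformulated": 0})
    for s in sessions:
        bucket = stats[s["index"]]
        bucket["sessions"] += 1
        if s["clicked_position"] is not None:
            bucket["clicks"][s["clicked_position"]] += 1
        if s["searched_again"]:
            bucket["reformulated"] += 1
    return {
        idx: {
            "clicks_by_position": dict(b["clicks"]),
            "reformulation_rate": b["reformulated"] / b["sessions"],
        }
        for idx, b in stats.items()
    }

# A tiny simulated log: sj users click higher and search again less often.
log = [
    {"index": "live", "clicked_position": 3, "searched_again": True},
    {"index": "live", "clicked_position": None, "searched_again": True},
    {"index": "sj", "clicked_position": 1, "searched_again": False},
    {"index": "sj", "clicked_position": 1, "searched_again": False},
]
report = summarize(log)
print(report["sj"]["reformulation_rate"])    # 0.0
print(report["live"]["reformulation_rate"])  # 1.0
```

On this made-up log, the sj version wins on both measures; at real scale, random assignment of users to index versions is what would make such a comparison meaningful.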
Just my opinion.
I have a site in the main google.com index that has not been updated and is still showing a cached snapshot of my old page. Despite optimising my site during the last deepcrawl, it still shows my old page title etc.
I know a lot of people have made the point that the SERPs on www-sj.google.com can't be an update. But how is it that when I search for my website using my keywords on www-sj.google.com, it shows an updated version of my website?
True, Yidaki, thanks. That's why I moved the previous messages over here.
[edited by: heini at 4:48 pm (utc) on May 3, 2003]
We also noticed in the April update (which was supposed to be the March update) that some of our monitored keywords now showed sites that didn't even have the keywords on the page, but only in incoming links to that site. Example: get "red hot widgets" at this site. So the site did not contain the keyword "red" and yet still ranked very highly for "red hot widgets". The net effect is that massive amounts of links with targeted anchor text may win the day over content.
What a way for Google to get some amazing advice on the fly to help their new technology; they must love us. I know most of us are no dummies. In fact, if the regular posters here all combined, they would create the best search ever known to man, there is no doubt in my mind. I say we just halt the conversation and see if anyone stresses out, then we'll know who the Google lurkers are? lol
ciao ciao
Hollywierd - out
I am pretty sure it's an update.
I would suggest calling it 'Google reloaded'.
I lost all the backlinks that I had from my own server. By pre-May-2003 standards, this could mean that my PR dropped so that the linking pages went below the PR4 threshold. It could otherwise mean that Google no longer counts backlinks from your own servers. Which I would like, even though I would have lost most of my links: it's cleaner that way. I never understood why Google allowed these inflated backlink counts to be relevant in the first place. C'mon, this was way too easy: I generated 80,000 HTML pages in a matter of minutes, all with unique content. If Google had continued doing things the old way, my link counts would have gone from update to update something like:
20
80
200
5000 (this never happened ;-)
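The kind of bulk page generation described above really is a matter of minutes to script, which is the poster's point about why self-hosted backlinks were so easy to inflate. A minimal sketch, with purely illustrative paths, page count, and URL:

```python
# Minimal sketch of bulk page generation: thousands of near-unique pages,
# each linking back to the home page. This is the self-hosted backlink
# inflation the post argues Google should (and perhaps now does) discount.
# Output directory, page count, and home_url are all hypothetical.
import os

def generate_pages(out_dir, count, home_url="http://example.com/"):
    """Write `count` trivially-unique HTML pages, each linking to home_url."""
    os.makedirs(out_dir, exist_ok=True)
    for i in range(count):
        html = (
            f"<html><head><title>Widget page {i}</title></head>"
            f"<body><p>Unique filler text for page {i}.</p>"
            f'<a href="{home_url}">home</a></body></html>'
        )
        with open(os.path.join(out_dir, f"page{i}.html"), "w") as f:
            f.write(html)

generate_pages("generated", 100)  # 80,000 would be just as easy
print(len(os.listdir("generated")))  # 100
```

Counting every such page as an independent backlink is exactly the loophole that discounting same-server links would close.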
On sj- I am back to 40.
It makes me smile to read here about 'spammy SERPs'.
I think it's just a reflex: oh, I lost #1 in specific SERPs, so they must be spammers. Funny that people say that. Makes me believe that what people mean to say is: "they are bigger spammers than me".
[no specific people meant, just the general attitude]
Google traffic is like a warm summer day in Germany: you'd better enjoy it while it lasts. It can (and will) be gone any minute. And there is nothing wrong with that. You have no right to be high up in the SERPs. Google was made for users, to produce meaningful search results, not for you and me to harvest traffic. Traffic is just a positive side effect of people using Google to search, and of you having a decent site.
If this should really be the update (did I say 'if'? darn), then I simply like it, because it appears to be a big change. The web is good when it changes; once it stops doing that, it loses its main attraction. Since Google regulates a big portion of the traffic, if sj- is the next thing, then this is really good, since it means change for everybody.