Forum Moderators: open
dvduval:
Indeed I am missing a PR7 backlink!
Worth noting: It is highly unlikely Google will reduce the size of its index. Otherwise they'll get hit with a barrage of those "enlargement" emails (if you know what I mean), and that would be embarrassing!
;-)
Good post, Napoleon. I must have missed that one.
I'm with kevin c though; they should have done this behind closed doors. Look at the furore.
Hmm... my main site on -fi is up to #13 from #16 on -sj (down from #4 on this SERP last month). Perhaps there is some hope. If anyone knows one of those Google Dance machine sites that includes -fi, sticky me. Beats having to keep an open browser window for -fi. ;)
onionrep: In a way they did. sj is not www. You just happen to be able to look at their development server. The average Joe would have no idea about this. In fact, many people on the boards here don't seem to know about them, which is why there are always questions regarding what they are.
troi21: I feel your pain. Page 1, #2 to somewhere on page 8. As I (and many others) have mentioned, this seems to be a testing index and not the index that will roll out. As much as it hurts to see your site so low, have faith that it should come back up. Like you, I don't spam, and I have only added content over the month. I have faith Google will sort things out.
The best explanation I heard here (I don't know if it's true, and I forget who said it) is that it looks like Google is using known data to test their new algo. When they are confident, hopefully they will apply it to the new data :)
daisho.
How do you know that?
What we are seeing is a PRE-UPDATE.
How do you know that?
Google is getting their servers ready to execute the update.
How do you know that?
I read in another post that the update was scheduled for Mother's Day. It makes sense. However, if GG says I'm wrong, then I'm wrong...
Perhaps part of this update is to look at sites that have too many backlinks and examine those links more closely. Guestbooks, FFA links, and other things like that might be getting nuked this time around... maybe, if we are lucky!
Alex
Including www? If so, what is the point? Why not wait until it's ready?
<added> Or was that just a red rag to a field of Bullocks :) </added>
[edited by: onionrep at 9:37 pm (utc) on May 6, 2003]
1. We know that various filters and backlinks will be added OVER TIME. By OVER TIME, are we talking:
a). days
b). weeks
c). months
2. When you say that the SJ results/algo will be ported to other datacentres, does this include WWW? (I presume not.)
Thanks.
Just a thought, ya know!
Also, those 41k links are spread over how many domains? Are you talking hundreds of pages per domain, or 41k unique domains?
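That distinction (pages vs. unique domains) is easy to check if you can export the backlink URLs. A minimal sketch, assuming a hypothetical list of backlink URLs (the URLs below are placeholders, not real data from the thread):

```python
from urllib.parse import urlparse

# Hypothetical backlink export; in practice this would be the full 41k-link list.
backlinks = [
    "http://example.com/guestbook/page1.html",
    "http://example.com/guestbook/page2.html",
    "http://another-site.org/resources.html",
]

# Tally how many links come from each unique domain.
domains = {}
for url in backlinks:
    host = urlparse(url).netloc.lower()
    domains[host] = domains.get(host, 0) + 1

print(len(domains))  # unique linking domains
print(domains)       # links per domain
```

If most of the 41k links collapse into a handful of domains, that looks very different to a filter than 41k independent sites.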
Alex
1. In my case, two of the sites I work on have freshbot results from early April included, however the deepbot crawl from later in the month is nowhere in sight.
2. The Google directory is out of date -- it doesn't include sites that were added in the April update.
3. PR seems to be about 2 months out of date.
And how is Google supposed to figure out whether a link is from a related page or not? With billions of pages, adding that to the algo would be non-trivial. Also, such a filter would lead to a lot of false positives. For example, I link from my page about vodka to your page about Scotch whisky. Both are obviously related. However, since your page is laden with the keywords "Scotch whisky" and mine with "vodka", to an algo they wouldn't look related.
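The false-positive problem above is easy to demonstrate with a naive keyword-overlap test. This is a sketch of bag-of-words cosine similarity (my own illustration, not anything Google is confirmed to use), with made-up page text:

```python
import math
import re
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Naive bag-of-words cosine similarity between two documents."""
    tokens_a = Counter(re.findall(r"[a-z]+", text_a.lower()))
    tokens_b = Counter(re.findall(r"[a-z]+", text_b.lower()))
    shared = set(tokens_a) & set(tokens_b)
    dot = sum(tokens_a[t] * tokens_b[t] for t in shared)
    norm_a = math.sqrt(sum(c * c for c in tokens_a.values()))
    norm_b = math.sqrt(sum(c * c for c in tokens_b.values()))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)

# Hypothetical page snippets: topically related (both spirits),
# but with almost no shared vocabulary.
vodka_page = "vodka distilled spirit vodka cocktails vodka brands"
whisky_page = "scotch whisky single malt scotch whisky distilleries"

print(cosine_similarity(vodka_page, whisky_page))  # 0.0: looks unrelated
```

The two pages share no keywords at all, so a purely lexical relatedness filter scores them as completely unrelated, exactly the false positive described above.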