Hopefully one day I will say "I see it now." But for now - I'll sit on the sidelines here until I see some movement.
Earth is round!
Nope. It's ellipsoid ;)
In fact it is an oblate spheroid. There's nothing worse than a pedantic statement which is, in itself, incorrect.
I simply mean we have come back to the point where we started: the position, again, is to wait and watch.
Start reading useful posts instead of rubbish like mine ;-)
[added]claus, your sticky box is full[/added]
There is always algo tweaking going on, but this is not the root of the issue.
Both sides of this filter/no filter argument could be right.
The one thing the guys at Google don't want us to do is figure out what they have done; otherwise they will have wasted their time.
Steveb is right that there has been a change in the algo, but I think he's wrong to rule out a filter as well as a shift in the algo on the basis of the evidence he has stated in this thread. I've suggested elsewhere that pages flagged for filtering may have some filter factor applied if there is an exact match with a term in a "filter list". I'm suggesting that this filter factor may be a BadRank, calculated offline in the same way as PageRank is calculated, but in this case not published. I guess that this extra factor does not have to be stored as an on/off switch but could be on a 1-10 scale like PageRank.
This would explain all of the anomalies that have been reported here, and would explain the extra, hidden factor that appears to be applied to some pages for certain terms and not to others. Whether you call that a filter is a matter of semantics. No pun intended ;-)
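Just to make the idea concrete, here is a rough sketch of how such a scoring step could look. This is pure speculation in code form: the filter list, the 1-10 values, and the way the scores are combined are all invented for illustration, not anything Google has confirmed.

[code]
# Hypothetical sketch of the "filter factor" idea described above. Nothing here
# reflects Google's actual algorithm; the names, numbers, and filter list are invented.

FILTERED_TERMS = {"cheap widgets", "buy blue things"}  # imagined "filter list"

def final_score(page, query):
    """Combine a PageRank-style score with a hypothetical BadRank-style penalty.

    Both values are assumed to be precomputed offline on a 1-10 scale,
    mirroring the suggestion that BadRank could be calculated the same
    way PageRank is, just never published.
    """
    score = page["pagerank"] + page["on_page_relevance"]
    if query in FILTERED_TERMS:
        # The penalty only applies on an exact match against the filter list,
        # which would explain why "search term -nonsensestring" escapes it.
        score -= page["badrank"]
    return score

example_page = {"pagerank": 7, "on_page_relevance": 5, "badrank": 6}
print(final_score(example_page, "cheap widgets"))   # penalised: 6
print(final_score(example_page, "obscure phrase"))  # untouched: 12
[/code]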
Google was and is in danger of having the quality of its results drowned in a sea of spam. The big idea of its founders, PageRank, based on the "unique democracy of the Web", was being destroyed by rigged elections.
In my delusional state I think that they may have found what they think is the perfect solution to this problem: a kind of scales of justice where the good (PageRank plus positive in-document factors, etc.) is weighed against the bad (dodgy backlinks and over-optimisation), applied only to search terms that are themselves weighted by some mechanism that decides whether they are, or are likely to be, the subject of abuse. By applying this principle Google can be returned to its utopian vision.
I've been badly affected by Florida, and I still think that if this is what they are doing it is a good idea for the long term. In the short term I'm busily trying to pile stuff on the positive side of the scales whilst reducing the weight on the negative side. I reckon that even if I'm wrong it can't do me any harm long term.
PS My understanding of the mechanism of "search term -nonsensestring" is that we are searching for, in this case, the words "search" and "term" while excluding any document that includes a string which would never occur in any document. Therefore it forces Google to search for the search term but fools it into thinking that it is not searching for one of the terms in its filter list. This proves that some extra factor is applied to certain search terms. That sounds like a filter to me, but perhaps it's just part of one big algo. One thing's for sure: if they don't switch it off we are going to have to learn how to live with it.
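For anyone who wants to try the probe themselves, here is a minimal sketch of building the two queries to compare. The search URL is just Google's ordinary public one; the random nonsense token and the side-by-side comparison are, again, only illustrative of the trick described above.

[code]
# A small sketch of the probe described above: build the query twice, once plain
# and once with a "-nonsensestring" exclusion, then compare the top results by
# hand. The random token should appear in no real document, so any difference in
# the result sets points to term-specific handling of the original query.
import random
import string
import urllib.parse

def probe_queries(term):
    nonsense = "".join(random.choices(string.ascii_lowercase, k=12))
    plain_url = "http://www.google.com/search?q=" + urllib.parse.quote_plus(term)
    probed_url = "http://www.google.com/search?q=" + urllib.parse.quote_plus(term + " -" + nonsense)
    return plain_url, probed_url

plain_url, probed_url = probe_queries("search term")
print(plain_url)   # the normal query
print(probed_url)  # the same query with a nonsense exclusion appended
[/code]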
Uhm... is this minor update or data inclusion/whatever over now, or?