Today, everything seems to be back as it should be; the index page is returned when I search on the site name, and it has been boosted a couple of places from its pre-Dom ranking.
Is anyone else seeing this problem getting fixed for their own sites?
I hate to come off as a masochist, but hey, it's something new to learn. Google was getting a bit too easy, don't ya think?
If I bookmarked every site I visited, my bookmarks would be unmanageable. I am sure many people are the same way.
I think people will tire of the inconsistency and go elsewhere, just like they did when they found Google.
Will it happen overnight? Of course not. But you can bet they will lose a good portion of people once there is an alternative that is mass advertised as such.
Right now, there is none, so what are people's choices?
People come to know and love something for a reason. If that thing suddenly changes and is not better than what it changed from, the love affair is over.
I will admit I am seeing less spam in the results. But shuffling results so drastically using two sets of data is not a consistent way to run a search engine.
My guess is they are doing this for the on-the-fly update. Freshdeepbot goes out and crawls for a few days, then they shift to the second set of data while they compile the crawl data from the previous days. Once it is compiled, they switch back. This would explain the here-now, gone-tomorrow, back-in-a-few-days syndrome.
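A toy sketch of what that two-data-set rotation would look like (purely my speculation about the mechanism, not a description of how Google actually works; the site, ranks, and schedule are invented for illustration):

```python
# Hypothetical illustration of the two-index-swap idea: if queries alternate
# between two snapshots that rank the same page very differently, the page
# appears to vanish and return even though nothing about the page changed.

snapshot_a = {"example.com/widgets": 1}    # rank in the freshly compiled index
snapshot_b = {"example.com/widgets": 500}  # rank in the older fallback index

for day in range(1, 7):
    active = snapshot_a if day % 2 else snapshot_b
    print(f"Day {day}: example.com/widgets at #{active['example.com/widgets']}")
```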
It mostly affects sites that were SEO'd in the last 3 or 4 months; established sites with a lot of prior links are not affected as much.
I love Google but the last few months have been a roller coaster ride. In their attempt to battle spam, they actually added to it and just don't know it yet.
Didn't SEs change the behaviour of users in the first place? Why not again?
Didn't Google change the behaviour of users of search engine services?
I'd say it's less about changing users to suit a SE and more about changing a SE to suit users, but that's just IMHO.
I've read the posts here closely, but I still cannot see any evidence that consistent SERPs are in the interests of users at all.
On the downside, it could mean they get tired of the same old results and go to competing SEs.
Relevance, yes. Utility, yes. Diversity, yes. But consistency? Why? I'd rather see new resources all the time, and it wouldn't make me think that the SE was "broken" because it wasn't returning the same results every time, or that not delivering a "perfect" identical set for the same queries each time is suboptimal. I really don't think average users expect SEs to turn up perfectly, algorithmically determined "perfect" listings; they expect them to take a good guess at what they want from the fairly vague queries SE users enter. I would think it meant they are delivering different sites that seem to fit my keyword enquiry, and I keep on clicking through SERPs to see more.
A Kalman filter, he explained, is a mathematical device that smooths out fluctuating time-series data so that you are not too affected by the "noise". It seems to be related to the notion of exponential smoothing of time series that I became aware of many years ago.
Given that the current behaviour of the Google datacenters/algorithm/filters complex seems to have become very erratic and noisy, GoogleGuy might like to suggest the use of this same Kalman filter to his colleagues who are running the ship.
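Just to illustrate the flavour of the idea, here is a minimal sketch of exponential smoothing applied to a noisy rank series (the data and the alpha value are made up; this is the simpler cousin of the Kalman filter mentioned above, not the filter itself):

```python
def exponential_smoothing(series, alpha=0.3):
    """Return a smoothed copy of `series`; higher alpha tracks the noise more closely."""
    smoothed = [series[0]]
    for value in series[1:]:
        smoothed.append(alpha * value + (1 - alpha) * smoothed[-1])
    return smoothed

# A site's observed rank for one query over ten days (lower is better).
daily_ranks = [3, 5, 480, 4, 2, 510, 3, 6, 495, 4]
print([round(r) for r in exponential_smoothing(daily_ranks)])
```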
Barry Welford
Do you think it makes sense for you to go to Google, type in your search word, and then hit refresh a couple of times to see all the possible SERPs?
I often think of pages and only remember the words I used to find them. I use Google to go directly to certain manual pages of certain sites. I'd NEVER know the URLs by heart or bother to bookmark them (my bookmarks resemble a subsection of the web, worthy of its own Google).
I believe consistency is, in SERPs as in any other aspect, vital.
SN
A search engine develops an algo to SCORE pages; it does not simply return results in a random order. Google's strength has always been returning not only relevant pages, but scoring pages based on popularity/importance as well.
All the major SEs have a ton of pages in their index. It is the way in which the pages are scored that separates a good engine from a bad one.
When you search for books you expect Amazon and BN, not Joe's fishing book store, even if it pertains to books. Google still returns these SERPs, but in some categories they are slipping.
Since links and content aren't changing dramatically enough to make a page drop from #1 to #500 every other day, it seems GG's results have become a bit random in terms of scoring.
If Google were to not "rank" sites anymore, it would lose much of its value to users.
This just as background to where I'm coming from with this.
I find the debate over whether constantly fluctuating SERPs, as well as possible filters to de-emphasize index pages/sites in favour of other pages, are a deliberate step on Google's side very interesting.
My problem is however I don't see any evidence whatsoever for that. I furthermore don't see more than anecdotal evidence for a pattern under which Google is "broken" in this regard.
I don't mean to disregard the members experiencing problems with index pages jumping around; I just cannot see from my end whether this is really a widespread pattern that would justify far-reaching conclusions.
If a site is the most relevant for widgets today, I'll tell you what it ISN'T... it ISN'T the 500th most relevant site tomorrow. That just brings Google into disrepute. It'll end in tears if they continue down that road for too long.
eg: >> people surely want the most relevant sites. They don't want them shifting on some random basis just for the sake of it. That's just a farce.<<
I wasn't talking about relevance. Whether results are relevant is a different debate. My point is that they can return different SERPs and still be relevant and satisfying to users. It's possible to return different results, especially with the millions of sites out there, many very similar, and still be just as relevant from SERP to SERP. And to users, this may well be an attractive feature.
>>If a site is the most relevant for widgets today, I'll tell you what it ISN'T... it ISN'T the 500th most relevant site tomorrow" <<
Depends on the widget, and what exactly a user meant by just typing in "widget"
Again let me stress, I'm not debating the relevance of the current Google index, just that consistency may not be as desirable to users, all other things being equal (relevance, interest, usefulness, credibility, etc.), as many here appear to assume. Nor does it necessarily mean that something is wrong.
And in the same respect it may also not be an attractive feature.
Yes, they can be relevant, but if a surfer has come to rely on being able to pull up a specific research article by using the search query, I bet they won't find that a useful feature, because they won't find that same article.
So we can debate whether it is or isn't; for some it is, for some it isn't.
Only time will tell how many current Google users feel it is.
Real example here, because that's part of my job. I bookmark pages if they look sensible, and as always use several search engines. I can't waste time going back to queries again once I've FOUND a site, especially if they are the same all the time.
>>students looking for information <<
They should BOOKMARK or, better still, make an annotated bibliography. Why go back to the same place twice if you have already found the resources you need from there?
>>" I know I checked the first 5 sites earlier today/ Hey wait they're all different!<<
Great! I've found some new resources. Maybe I should look at those too.
But seriously, I don't see any of the problems you guys talk about in the areas I research in, which are specialised information/research areas like economics, management tips, etc., and news resources. There has been almost nil movement for the past 3 months in these 50-plus areas.
I can only assume the index page problem and spam problems are appearing in keyword areas I never look at - highly competitive or commercial searches?
In that case then the type of people you quote - "researchers" and "students" will probably not be affected.
No... I don't think we do.
Sure, there may be 100 relevant results for a particular term. BUT... I don't want them returning at random. I want them ranked in sensible order - a rational stab at relevancy. I'm certain that's what 99% of people want.
>> consistency may not be as desirable to users <<
The consistency is that the sites that are most relevant today, will broadly be most relevant tomorrow. The ranking algorithm should focus on determination of that relevancy factor. With some of the results you see they certainly haven't got the luxury of saying that "ahhh... these sites are all pretty similar... I think I'll just randomize them". No - they are rather a long way from that position.
The situation in which topic leading sites are here today and gone tomorrow is not healthy. It doesn't look good at all. Sorry, but more and more people will notice that if it continues.
Frankly, I still don't think for a minute that this is intentional.
I went looking for Amazon Canada (from Venezuela) this morning and they came up #8 after what I guess are affiliates. Is this how a good SE should work?
Actually, you have missed the point entirely.
Sure, any search engine can return hundreds of different relevant pages on any search query.
The POINT is that search engines are thought to be scoring/ranking the pages somehow.
Would you be happy to see Google scramble the first 250 SERPs on every search, completely dropping their ranking of pages?
>>I don't mean to disregard the members experiencing problems with index pages jumping around; I just cannot see from my end whether this is really a widespread pattern that would justify far-reaching conclusions.<<
Man, in 50+ different categories I see my pages and others disappearing and coming back at the top all the time. I can search 20 times on one term and see 5 different sets of results in any given day.
The huge day-to-day flux really isn't a point of debate. It exists - you just aren't seeing it.
To me, much of the "science" of keyword research that pervades much of WebmasterWorld is based on some assumptions that are out of date or erroneous, one being that searchers only enter one query and analyse it to kingdom come as we do! What I see is that they refine searches, add words, delete words, etc., or change their tenses. Some really smart ones even use quotes!
And with that I'm agreeing to disagree and turning in for the night. Happy Independence Day to you American guys!
Yah... have a good one. I'm off for the night as well now.
I suppose if you have no understanding of math that you might think voodoo would be better, but ranking is all math. The highest score wins. The score is the result of many factors that change, but it is always the highest score wins. And that is as it should be.
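To make that concrete, here is a minimal sketch of "highest score wins" ranking: each page gets a weighted sum of factor scores and results come back in descending score order. The factor names, weights, and URLs are invented for illustration only; they are not Google's actual factors.

```python
# Toy ranking: score = weighted sum of per-page factors, sorted high to low.
WEIGHTS = {"relevance": 0.5, "popularity": 0.3, "freshness": 0.2}

pages = [
    {"url": "example.com/a", "relevance": 0.9, "popularity": 0.4, "freshness": 0.7},
    {"url": "example.com/b", "relevance": 0.6, "popularity": 0.9, "freshness": 0.2},
    {"url": "example.com/c", "relevance": 0.8, "popularity": 0.7, "freshness": 0.9},
]

def score(page):
    return sum(weight * page[factor] for factor, weight in WEIGHTS.items())

for page in sorted(pages, key=score, reverse=True):
    print(f"{page['url']}: {score(page):.2f}")
```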
What person would use a (non-pfi) search engine that deliberately didn't rank sites to the best of its ability? "Hi, here at Google we know that when you do a search that you don't want what we consider the best results but rather want us to give you a little variety so SEOs have something to obsess over."
My 2 cents...
Dave.
Some of us have been doing that all along, and there is a lot of talk of Google irrelevancy out there, from radio shows to blogs.
Are some of the results in the top ten accurate? Of course. Are there a LOT more trivial/crap sites rocketing to the top ten for some days? Absolutely.