Forum Moderators: open


Index page ranking highly again

         

jetboy_70

9:30 am on Jul 3, 2003 (gmt 0)

10+ Year Member



Like many others here, since the first stirrings of Dominic I've had an internal page ranking higher than my index page, and a corresponding slide down the SERPs for searches on my site name. In my case the internal page in question was listed on DMOZ in addition to the site as a whole. As overall Google traffic didn't seem to be affected negatively, only my pride was hurt ...

Today, everything seems to be back as it should be; the index page is returned when I search on the site name, and it has been boosted a couple a places from its pre-Dom ranking.

Is anyone else seeing this problem getting fixed for their own sites?

pmac

5:19 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, it's a mystery. Pretty hard to do analysis; that could be the whole point. I had been ranking in Esmeralda for a hyper-competitive one-word term and just got bounced. Gone. Toast.

I hate to come off as a masochist, but hey, it's something new to learn. Google was getting a bit too easy, don't ya think?

mrguy

5:27 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Changing the behavior of surfers to suit an engine will fail.

If I bookmarked every site I visited, my bookmarks would be unmanageable. I am sure many people are the same way.

I think people will tire of the inconsistency and go elsewhere, just like they did when they found Google.

Will it happen overnight? Of course not. But you can bet they will lose a good portion of people once there is an alternative that is mass-advertised as such.

Right now, there is none, so what are people's choices?

People come to know and love something for a reason. If that thing suddenly changes and is not better than what it changed from, the love affair is over.

I will admit I am seeing less spam in the results. But shuffling results so drastically using two sets of data is not a consistent way to run a search engine.

My guess is they are doing this for the on-the-fly update. Freshdeepbot goes out and crawls for a few days, then they shift to the second set of data while they compile the crawl data from the previous days. Once it is compiled, they switch back. This would explain the here-now, gone-tomorrow, back-in-a-few-days syndrome.

It mostly affects sites that were SEO'd in the last 3 or 4 months; established sites with a lot of prior links are not affected as much.

I love Google but the last few months have been a roller coaster ride. In their attempt to battle spam, they actually added to it and just don't know it yet.

chiyo

5:42 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Changing the behavior of surfers to suit an engine will fail.<<

Didn't SEs change the behaviour of users in the first place? Why not again?

Didn't Google change the behaviour of users of search engine services?

I'd say it's less changing users to "suit a SE" and more changing a SE to suit users, but that's just IMHO.

I've read the posts here closely, but I still cannot see any evidence that consistent SERPs are in the interests of users at all.

On the downside, it could mean they get tired of the same old results and go to competing SEs.

Relevance, yes. Utility, yes. Diversity, yes. But consistency? Why? I'd rather see new resources all the time, and it wouldn't make me think the SE was "broken" because it wasn't returning the same thing every time, or that by not delivering the same "perfect" set for the same query each time it was suboptimal. I really don't think average users expect SEs to turn up perfectly, algorithmically determined "perfect" listings; they expect them to take a good guess at what they want from the fairly vague queries SE users type. I would think it meant they are delivering different sites that seem to fit my keyword enquiry, and I keep on clicking through SERPs to see more.

[edited by: chiyo at 5:47 pm (utc) on July 4, 2003]

bwelford

5:45 pm on Jul 4, 2003 (gmt 0)

10+ Year Member



At one point during the heightened Google Frenzy a few weeks ago, GoogleGuy suggested we should all apply a Kalman filter to the results we are seeing and not become too reactive. In other words we should all ease up and not react to every last happening.

A Kalman filter, he explained, is a mathematical device that smooths out fluctuating time series data so that you are not too affected by the "noise". It seems to be related to the notion of exponential smoothing of time series, that I became aware of many years ago.

Given that the current behaviour of the Google datacenters/algorithm/filters complex seems to have become very erratic and noisy, GoogleGuy might like to suggest the use of this same Kalman filter to his colleagues who are running the ship.
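Purely as an illustration of the idea behind that advice (GoogleGuy gave no code or formula, and the rank numbers below are invented): in the simple one-dimensional case, a Kalman-style filter with fixed gain reduces to exponential smoothing, where each new observation only nudges your running estimate rather than replacing it.

```python
# Illustrative sketch of exponential smoothing applied to noisy daily
# rank observations. The daily_rank data is made up for the example;
# this is the spirit of "don't react to every last happening", not
# anything GoogleGuy actually posted.

def exponential_smooth(observations, alpha=0.3):
    """Return the smoothed series: s[t] = alpha*x[t] + (1-alpha)*s[t-1].

    A smaller alpha trusts the running estimate more and the latest
    noisy observation less.
    """
    smoothed = []
    estimate = observations[0]  # seed with the first observation
    for x in observations:
        estimate = alpha * x + (1 - alpha) * estimate
        smoothed.append(round(estimate, 2))
    return smoothed

daily_rank = [3, 41, 2, 38, 4, 45, 3]  # wildly fluctuating SERP positions
print(exponential_smooth(daily_rank))
```

The raw series swings by over 40 positions day to day; the smoothed series moves far less, which is exactly the "ease up and ignore the noise" point.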

Barry Welford

killroy

5:50 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'd like to make a case for consistency.

Do you think it makes sense for you to go to Google, type in your search word, and then hit refresh a couple of times to see all the possible SERPs?

I often think of pages and only remember the words I used to find them. I use Google to go directly to certain manual pages of certain sites. I'd NEVER know the URLs by heart or bother to bookmark them (my bookmarks resemble a subsection of the web, worthy of its own Google).

I believe consistency is, in SERPs as in any other aspect, vital.

SN

mfishy

5:57 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



<<I'd rather see new resources all the time, and it wouldn't make me think the SE was "broken" because it wasn't returning the same thing every time, or that by not delivering the same "perfect" set for the same query each time it was suboptimal.>>

A search engine develops an algo to SCORE pages, it does not simply return results in a random order. Google's strength has always been returning not only relevant pages, but scoring pages based on popularity/importance as well.

All the major SEs have a ton of pages in their index. It is the way the pages are scored that separates a good engine from a bad one.

When you search for books you expect Amazon and BN, not Joe's fishing book store, even if it pertains to books. Google still returns these SERPs, but in some categories they are slipping.

Since links and content aren't changing so dramatically to make a page drop from #1 to #500 every other day, it seems GG's results have become a bit random, in terms of scoring.

If Google were to stop "ranking" sites, it would lose much of its value to users.

heini

5:59 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Luckily, I haven't encountered this problem with any of the sites I monitor. On the contrary, all SERPs look pretty stable. But like everyone here, that's just a very, very small subset of the whole picture.
I can say, though, that some of those SERPs are very competitive, with lots of domain farming/link farming and lots of trashy sites, with new sites showing up constantly.
Most of the top 10 are index pages, or subdomain index pages. Most of them are pretty stable.

This is just background to where I'm coming from on this.
I find the debate over whether constantly fluctuating SERPs, as well as possible filters to deemphasize index pages/sites in favour of inner pages, are a deliberate step on Google's side very interesting.
My problem, however, is that I don't see any evidence whatsoever for that. Nor do I see more than anecdotal evidence of a pattern under which Google is "broken" in this regard.
I don't mean to disregard the members experiencing problems with index pages jumping around; I just can't see from my end whether this is really a widespread pattern, which would justify far-reaching conclusions.

Napoleon

6:01 pm on Jul 4, 2003 (gmt 0)



IMHO mfishy's on the button again.... people surely want the most relevant sites. They don't want them shifting on some random basis just for the sake of it. That's just a farce.

If a site is the most relevant for widgets today, I'll tell you what it ISN'T... it ISN'T the 500th most relevant site tomorrow. That just brings Google into disrepute. It'll end in tears if they continue down that road for too long.

1milehgh80210

6:08 pm on Jul 4, 2003 (gmt 0)

10+ Year Member



"If a site is the most relevant for widgets today, I'll tell you what it ISN'T... it ISN'T the 500th most relevant site tomorrow"

Agree--
Google could put a -RANDOMIZE RESULTS- button on their search page.
I'll bet few would push it..

chiyo

6:12 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You miss my point, mfishy and Napoleon...

eg: >> people surely want the most relevant sites. They don't want them shifting on some random basis just for the sake of it. That's just a farce.<<

I wasn't talking about relevance. Whether results are relevant is a different debate. My point is that they can return different SERPs and still be relevant and satisfying to users. It's possible to return different results, especially with the millions of sites out there, many very similar, and still be just as relevant from SERP to SERP. And that to users, this may well be an attractive feature.

>>If a site is the most relevant for widgets today, I'll tell you what it ISN'T... it ISN'T the 500th most relevant site tomorrow" <<

Depends on the widget, and what exactly a user meant by just typing in "widget".

Again, let me stress: I'm not debating the relevance of the current Google index, just that consistency may not be as desirable to users, all other things being equal (relevance, interest, usefulness, credibility, etc.), as many here appear to assume. Nor does it necessarily mean that something is wrong.

[edited by: chiyo at 6:18 pm (utc) on July 4, 2003]

mrguy

6:16 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



--And that to users, this may well be an attractive feature.--

And in the same respect it may also not be an attractive feature.

Yes, they can be relevant, but if a surfer has come to rely on being able to pull up a specific research article with the same search query, I bet they won't find that a useful feature, because they won't find that same article.

So we can debate whether it is or isn't; for some it is, for some it isn't.

Only time will tell how many current Google users feel it is.

1milehgh80210

6:25 pm on Jul 4, 2003 (gmt 0)

10+ Year Member



How does this affect people who use Google as a research tool? Businesses researching the competition, students looking for information, etc.

As in-
" I know I checked the first 5 sites earlier today/ Hey wait they're all different!

needhelp

6:29 pm on Jul 4, 2003 (gmt 0)

10+ Year Member



Yes, I'm one of the in/out sites. I agree with the conclusions posted here that something is plain "broke", since in at #1, then gone, then #1, then gone, etc. clearly says to me that I have to wait and see (I did tweak keyword density & anchor text though, hee hee).

I wanted to say, as I sit and laugh out loud in hysteria while my dogs and husband watch me quickly losing my mind, that a year ago I designed my site to be great for the user: his/her sense of design, product needs, navigation needs, etc. Until I found that unless I optimized, and I mean really OPTIMIZED, everything I could, I wouldn't rank well. So I did. Now the optimization may be killing me.

I made sure I knew what was considered spam, turned down doorways even though they were easier, and basically tried to follow the rules while still meeting the "optimization" requirements necessary to get ahead. Now, although I am totally on-topic, have pages upon pages of UNSOLICITED customer testimonials about my products, and really provide the best and most honest customer service compared to my competitors according to the people who call us (these other sites haven't been affected by the weird SERPs), I am in/out on a daily basis.

I stick to my hope that it's something that's simply broken, not an optimization penalty or whatever. Cuz I don't know where to turn if it is!

chiyo

6:32 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>Businesses researching the competition<<

Real example here, because that's part of my job. I bookmark pages if they look sensible, and as always use several search engines. I can't waste time going back to queries again once I've FOUND a site, especially if the results are the same all the time.

>>students looking for information <<

They should BOOKMARK, or better still make an annotated bibliography. Why go back to the same place twice if you have already found the resources you need there?

>>" I know I checked the first 5 sites earlier today/ Hey wait they're all different!<<

Great! I've found some new resources. Maybe I should look at those too.

But seriously, I don't see any of the problems you guys talk about in the areas I research, which are specialised information/research areas like economics, management tips, and news resources. There has been almost nil movement for the past 3 months in these 50-plus areas.

I can only assume the index page problem and spam problems are appearing in keyword areas I never look at: highly competitive or commercial searches?

In that case, the types of people you mention, "researchers" and "students", will probably not be affected.

[edited by: chiyo at 6:41 pm (utc) on July 4, 2003]

Napoleon

6:36 pm on Jul 4, 2003 (gmt 0)



>> You miss my point, mfishy and Napoleon... <<

No... I don't think we do.

Sure, there may be 100 relevant results for a particular term. BUT... I don't want them returning at random. I want them ranked in sensible order - a rational stab at relevancy. I'm certain that's what 99% of people want.

>> consistency may not be as desirable to users <<

The consistency is that the sites that are most relevant today will broadly be the most relevant tomorrow. The ranking algorithm should focus on determining that relevancy factor. With some of the results you see, they certainly don't have the luxury of saying "ahhh... these sites are all pretty similar... I think I'll just randomize them". No, they are rather a long way from that position.

The situation in which topic leading sites are here today and gone tomorrow is not healthy. It doesn't look good at all. Sorry, but more and more people will notice that if it continues.

Frankly, I still don't think for a minute that this is intentional.

Tropical Island

6:41 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"When you search for books you expect amazon and bn, not joes fishing book store, even if it pertains to books. Google still returns these serps but in some categories they are slipping." mfishy

I went looking for Amazon Canada (from Venezuela) this morning and they came up #8 after what I guess are affiliates. Is this how a good SE should work?

1milehgh80210

6:46 pm on Jul 4, 2003 (gmt 0)

10+ Year Member



IMO a search engine is about indexing and ranking pages (free and PPC).
Like I said before, let 'em put a scramble button on their search page if they want.

mfishy

6:48 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



<<My point is that they can return different SERPS and still be relevant and satisfying to users. >>

Actually, you have missed the point entirely.

Sure, any search engine can return hundreds of different relevant pages on any search query.

The POINT is that search engines are thought to be scoring/ranking the pages somehow.

Would you be happy to see Google scramble the first 250 SERPs on every search, completely dropping their ranking of pages?

<<I don't mean to disregard the members experiencing problems with index pages jumping around, I just can from my end not see if this is really a widespread pattern, which would justify far-reaching conclusions.>>

Man, in 50+ different categories I see my pages and others' disappearing and coming back at the top all the time. I can search 20 times on one term and see 5 different sets of results in a given day.

The huge day-to-day flux really isn't a point of debate. It exists; you just aren't seeing it.

chiyo

6:49 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Napoleon, I guess, as we sometimes do, we'll agree to disagree. I feel users may (without any evidence, of course) have less need for precision than we do. Just a few good-looking sites on the first SERP may well be perfectly acceptable to them, and from what I've seen of user behaviour in the few tests we have done, they are more likely to refine their search than go to further SERP pages.

To me, much of the "science" of keyword research that pervades WebmasterWorld is based on assumptions that are out of date or erroneous, one being that searchers enter only one query, and analyse it to kingdom come as we do! What I see is that they refine searches, add words, delete words, or change their tenses. Some really smart ones even use quotes!

And with that I'm agreeing to disagree and turning in for the night. Happy Independence Day, you American guys!

Napoleon

6:57 pm on Jul 4, 2003 (gmt 0)



>> Happy Independence Day, you American guys! <<

Yah... have a good one. I'm off for the night as well now.

steveb

7:22 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's amazing when people can't see beyond their own horizon. Search engines rank results. They want to put up the best, most relevant results every time, because the vast majority of users will use them one time. Consistent results are the bedrock of what a search engine is for.

I suppose if you have no understanding of math you might think voodoo would be better, but ranking is all math. The highest score wins. The score is the result of many factors that change, but it is always the highest score that wins. And that is as it should be.
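That "many factors, highest score wins" view can be sketched as a toy scorer. To be clear, the factor names, weights, and pages below are invented for illustration; real engines use far more signals, and none of this is Google's actual algorithm. The point is only that a deterministic scoring function yields the same order for the same inputs, which is the consistency being argued for.

```python
# Toy illustration of "highest score wins" ranking. Factor names and
# weights are invented; each factor value is assumed to be in [0, 1].

WEIGHTS = {"relevance": 0.5, "popularity": 0.3, "freshness": 0.2}

def score(page):
    """Deterministic weighted sum of a page's factor values."""
    return sum(WEIGHTS[f] * page[f] for f in WEIGHTS)

def rank(pages):
    """Sort pages by score, highest first; same inputs, same order."""
    return sorted(pages, key=score, reverse=True)

pages = [
    {"url": "a.example", "relevance": 0.9, "popularity": 0.2, "freshness": 0.5},
    {"url": "b.example", "relevance": 0.6, "popularity": 0.9, "freshness": 0.9},
]
print([p["url"] for p in rank(pages)])
```

Run it twice and the order never changes; only changing the inputs (links, content, weights) changes the order, which is why wild day-to-day swings without corresponding input changes look like noise rather than scoring.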

What person would use a (non-PFI) search engine that deliberately didn't rank sites to the best of its ability? "Hi, here at Google we know that when you do a search you don't want what we consider the best results; rather, you want us to give you a little variety so SEOs have something to obsess over."

davewray

8:42 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's time we looked at this through the user's experience. Are the results consistent? NO. Are the results relevant? Generally, YES. All of my friends and family use Google. I sometimes try to get a feel for how they feel about the results from the searches they perform at Google. As of yet I have not heard ONE complaint about Google's results being bad or inconsistent, or about them not finding what they want. So it begs the question: is it just us webmasters ****ing and complaining about the SERPs, or is there truly a problem? My feeling is that there is no problem, and YES, I am one of the unfortunate ones whose site bobs from the 1st page to oblivion day to day.

My 2 cents...

Dave.

steveb

9:00 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"It's time we looked at this through the user's experience."

Some of us have been doing that all along, and there is a lot of talk of Google irrelevancy out there, from radio shows to blogs.

Are some of the results in the top ten accurate? Of course. Are there a LOT more trivial/crap sites rocketing to the top ten on some days? Absolutely.

SlowMove

9:06 pm on Jul 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



After all this time, my index page finally made it into the Google top ten. The only problem is that I'm not getting as much traffic as I thought. That Overture Search Suggestion Tool isn't what it's supposed to be. That's the last time I use it exclusively to find keywords. I think automated programs really skew the numbers.

davewray

2:30 am on Jul 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



SlowMove... welcome to the club. I too found the same thing when I got #1 for a pretty popular keyword. Turns out it only equated to about 6 to 10 referrals per day! Overture definitely overinflated expectations...

UK_Web_Guy

7:55 am on Jul 8, 2003 (gmt 0)

10+ Year Member



Anyone seeing their index pages back in?

I know things have been coming and going for weeks, but I have never seen such a shift in SERPs, with so many sites all reappearing at the same time, as right now.

Not holding my breath, though.

my3cents

8:01 am on Jul 8, 2003 (gmt 0)

10+ Year Member



Mine are back, but a couple of places still list the index.shtml page and domain.com in the same SERPs, even though they are the same page.

It looks like they started coming back for each keyword separately over the course of about 16 hours.

Napoleon, are you seeing any improvements?

MOOSBerlin

8:39 am on Jul 8, 2003 (gmt 0)

10+ Year Member



Two of my index pages are back today (in Germany) too!

MOOSBerlin

8:41 am on Jul 8, 2003 (gmt 0)

10+ Year Member



I forgot to say: I've made no changes to these pages!

nervous_seo

9:11 am on Jul 8, 2003 (gmt 0)

10+ Year Member



Things are definitely coming back.

Lots of Index pages back again.

This 158 message thread spans 6 pages.