
Post Austin SERPS Starting to Improve

Keep turning the crank back google


customdy

1:54 am on Feb 5, 2004 (gmt 0)

10+ Year Member



In the last few hours I have noticed a moderate improvement in keywords that were heavily filtered in Florida and Austin. One of my competitors, whose domain name is keyword1_keyword2, is now back in the top 10; he was gone in both Florida and Austin, and it doesn't look like he made any changes.

We are now back to page #2 or #3 on most 2-word searches. We have reduced keyword density, but I think it is more of a tweak that Google is doing.

Keep it coming, Google.

Chelsea

8:32 pm on Feb 5, 2004 (gmt 0)



It is really hard to understand this batch of algo changes by 'reverse engineering' (which is effectively what we collectively try to do on WW).

But it seems the new algo(s) have removed the boundaries around quality sites.

That is: what links to a page, the content of the page itself, and what that page links to, seem to have been merged into one continuous idea for evaluation by Google, which it then attempts to rank.

It doesn't appear to be working very well, because simple lists of sites now often rank higher than the sites themselves. The algo has turned what was an index into an index of indexes.

(In my particular case, which is illustrative only & admittedly statistically insignificant, I have seen my own CV, and a site that simply describes my site, ranking higher than the site itself.)

Perhaps it is time for Google to roll it back, take stock, and re-think.

People are now, understandably, even openly discussing creating directories themselves, in order to rank well (in fact I had the same idea the other day) - but surely this isn't the solution. There's little point in a laudable Google mission statement unless all its employees know and understand it, and it is followed. This state of affairs can't be healthy for anyone - especially the WWW :)

Edit: Grammatical / clarity.

[edited by: Chelsea at 8:49 pm (utc) on Feb. 5, 2004]
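
For what it's worth, the "one continuous idea" Chelsea describes can be sketched as a single blended score per page. The feature names, weights, and numbers below are invented purely to illustrate the speculation; this is not anything Google has confirmed.

```python
# A speculative sketch of "inbound links + page content + outbound links
# merged into one continuous idea". All features and weights are invented
# for illustration; nothing here is a confirmed ranking formula.

def combined_score(page, weights=(0.4, 0.3, 0.3)):
    """page: dict of hypothetical features, each normalised to 0..1."""
    w_in, w_content, w_out = weights
    return (w_in * page["inlink_score"]
            + w_content * page["content_score"]
            + w_out * page["outlink_score"])

# Under such a blend, a thin page with strong outbound links can outscore
# a content-rich page -- the "index of indexes" effect described above.
directory_page = {"inlink_score": 0.5, "content_score": 0.2, "outlink_score": 0.9}
content_page = {"inlink_score": 0.4, "content_score": 0.9, "outlink_score": 0.1}
print(combined_score(directory_page) > combined_score(content_page))  # True
```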

soapystar

8:48 pm on Feb 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Perhaps it is time for Google to roll it back, take stock, and re-think

They already did that once... you think they would do that again?

Chelsea

8:51 pm on Feb 5, 2004 (gmt 0)



Soapy,

I'm not arguing :) but when did they roll back? I've seen some dodgy past algos repaired in a week or so, but this unfathomable situation has been going on for nearly 3 months now.

[edited by: Chelsea at 9:00 pm (utc) on Feb. 5, 2004]

pavlin

8:57 pm on Feb 5, 2004 (gmt 0)

10+ Year Member



I can't think of any other way around it.
It's clear that the problem lies in the core of the new algo. There is no way to fix it with some minor fixes.
The idea of giving so much power to some authoritative (read: preferred) sites is in fact the main reason we are here now. All this flooding of DMOZ and other directories is the most logical consequence of this idea.

webdude

9:05 pm on Feb 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This is a gas... The last few days, as I have stated, my site for my money phrase has ping-ponged between #1 and #30. Now it seems everyone else is seeing sites popping in and out, yet for the past 36 hours I have been rock solid at #16.

Very Strange. It seems I am either behind or ahead of everyone else.

[edited by: webdude at 9:27 pm (utc) on Feb. 5, 2004]

Chelsea

9:11 pm on Feb 5, 2004 (gmt 0)



Very Strange. It seems I am either behind or ahead of everyone else.

But don't forget that it was possible to partially understand this yo-yo effect when Google displayed its various datacentres - but they've been pulled.

It is a very strange situation, and if a member were to say 'I think Google is bust', IMO this would be very hard to argue with now.

Although I recall that after Florida such ideas about 'Google being broken' were rejected outright and regarded as mere 'conspiracy theories' :)

(This was always a little unfair, since conspiracy theories are invariably too complicated to ring true. But something being 'bust' is extremely commonplace :) esp. in the UK ;)

glengara

9:24 pm on Feb 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hey Brett, you better close this thread, it doesn't seem we're ready for it yet.

Chelsea

9:31 pm on Feb 5, 2004 (gmt 0)



it doesn't seem we're ready for it yet.

What do you mean?

[edited by: Chelsea at 10:31 pm (utc) on Feb. 5, 2004]

glengara

9:55 pm on Feb 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Ready to accept the premise of the thread ;-)

Chelsea

9:58 pm on Feb 5, 2004 (gmt 0)



Hmmm,

it's only speculation / observation :)

soapystar

10:10 pm on Feb 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm not arguing :) but when did they roll back? I've seen some dodgy past algos repaired in a week or so, but this unfathomable situation has been going on for nearly 3 months now.

Just after Dominic... or was it Florida? I forget now! But the first time, when it was just looking like a keyword filter... that first time it was rolled back for a few weeks.


steveb

10:55 pm on Feb 5, 2004 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Google doesn't rank sites, it ranks pages. It's odd that a lot of people still don't get that.

Chelsea

11:01 pm on Feb 5, 2004 (gmt 0)



If you still believe this, Steveb, then you haven't understood the recent drastic changes in the algo.

Of course Google must consider pages; they're the basic unit of the WWW. But the way it relates these pages to each other now seems entirely different. In the past, the way that Google ranked pages gave some weight to the value of a site; I guess there was some weight to internal linking. Not so now. Now it ranks collections of pages, whether they are engineered by huge and very clever linking campaigns or not.

Let's enter the real world, not the salad days of Google's past :)

[edited by: Chelsea at 11:10 pm (utc) on Feb. 5, 2004]

customdy

11:04 pm on Feb 5, 2004 (gmt 0)

10+ Year Member



For those that are seeing some improvement, have you made any changes?

valeyard

11:14 pm on Feb 5, 2004 (gmt 0)

10+ Year Member



It doesn't appear to be working very well, because simple lists of sites now often rank higher than the sites themselves. The algo has turned what was an index into an index of indexes.

Agreed. It looks like a problem with the use of "hubs". A hub is a great way of gathering pages for SERPs; however, it should never appear in the SERPs itself unless the user includes a search term such as "links" or "directory".

Unfortunately, the problem with that last sentence is the phrase "such as", which is impossible to implement algorithmically. My (latest!) suspicion is that Google have therefore not bothered to do so. Hub pages are considered as relevant as content pages.
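
The hub/authority vocabulary here comes from Kleinberg's HITS algorithm, and a toy version shows exactly why a pure link list can score so highly. The graph below is made up; this illustrates the published algorithm in miniature, not a claim about Google's implementation.

```python
# A minimal HITS sketch: hub and authority scores reinforce each other.
# The toy graph is invented for illustration.

def hits(links, iterations=20):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {q for targets in links.values() for q in targets}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # A page's authority score sums the hub scores of pages linking to it.
        auth = {p: sum(hub[q] for q in links if p in links[q]) for p in pages}
        # A page's hub score sums the authority scores of the pages it links to.
        hub = {p: sum(auth[q] for q in links.get(p, [])) for p in pages}
        # Normalise both vectors so the scores stay bounded.
        for scores in (auth, hub):
            norm = sum(v * v for v in scores.values()) ** 0.5 or 1.0
            for p in scores:
                scores[p] /= norm
    return hub, auth

# Hypothetical graph: one directory page linking out to three content pages.
hub_scores, auth_scores = hits({
    "directory": ["widgets-a", "widgets-b", "widgets-c"],
    "widgets-a": ["widgets-b"],
})
print(max(hub_scores, key=hub_scores.get))  # -> 'directory'
```

If a SERP ranks by something close to the hub score, without valeyard's "unless the user asked for a directory" condition, the link list beats the very content pages it points at.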

yankee

11:22 pm on Feb 5, 2004 (gmt 0)

10+ Year Member



Good search engines decrease the number of clicks to relevant content. Florida and Austin have increased the number of clicks.

Chelsea

11:24 pm on Feb 5, 2004 (gmt 0)



Agreed. It looks like a problem with the use of "hubs".

Let's all 'hub out' then!

Deep Purple famously said in a live concert "let's have everything louder than everything else"

(which is of course absurd)

So let's have an Internet with "everything linking out to everything else"

(Which is equally absurd)

I hope Google know what they're doing - it looks like a total disaster in the making to me :(

And it will be really easy for a competing search engine to improve upon these results: they just need to dump the pages with huge numbers of outbound links (that Google seems to admire) and replace them with pages focused on a specific topic.

After all, these pages that Google is now serving up aren't *search results*, they are *search pages* - who wants to search twice?

It increasingly looks like an abdication of responsibility, as well as being an irritation :)

[edited by: Chelsea at 11:58 pm (utc) on Feb. 5, 2004]
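
The fix Chelsea sketches for a competing engine is simple enough to express in code. A minimal sketch, assuming an invented candidate format and an arbitrary threshold for "huge numbers of outbound links":

```python
# A minimal sketch of the competing-engine fix suggested above: drop
# candidate results whose outbound-link count marks them as hubs rather
# than destinations. The threshold and data format are invented.

MAX_OUTBOUND_LINKS = 50  # arbitrary cutoff for a "hub" page

def drop_hub_pages(candidates):
    """candidates: list of (url, relevance, outbound_link_count) tuples."""
    kept = [(relevance, url) for url, relevance, outbound in candidates
            if outbound <= MAX_OUTBOUND_LINKS]
    return [url for relevance, url in sorted(kept, reverse=True)]

print(drop_hub_pages([
    ("topic-directory.example", 0.95, 400),  # hub page: filtered out
    ("focused-site.example", 0.90, 12),      # destination page: kept
]))
# -> ['focused-site.example']
```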

valeyard

11:36 pm on Feb 5, 2004 (gmt 0)

10+ Year Member



Good search engines decrease the number of clicks to relevant content. Florida and Austin have increased the number of clicks.

Absolutely.

I used to (1) click on Google (2) enter my search and click (3) click on a target website.

Now I (1) click on Google (2) enter my search and click (3)(4)(5)(6) click through irrelevant results (7) click on vivisimo (8) enter my search term and click (9) click on a target website.

I really, really wish I could add ":-)" but I'm afraid it's true.

agapes

12:32 am on Feb 6, 2004 (gmt 0)

10+ Year Member



Hissingsid, maybe you're right about there being a bug in the SERPs, because I recently questioned Google about some discrepancies showing in AdWords, and they emailed me back saying that there is a bug in AdWords... so maybe it's in the SERPs too?

steveb

12:52 am on Feb 6, 2004 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Chelsea, you are not understanding the relationship of pages and sites. You have to get yourself into the post-Austin world.

It is all about PAGES. Google is ranking PAGES from Slate, or CNN, or other authority domains. Those PAGES are beating full-fledged domains' worth of content, because domains of niche content matter less (very little).

The effect you are mistaken about, IMO, is that those CNN pages are given high authority ranks because they reside on authoritative domains. The domain content means nothing; it is a serious mistake to think that. The page content -- or, more accurately, the APPARENT-to-a-bot page content -- is what matters.

People keep saying "my site this" or "my site that" while missing a fundamental of the post-Florida world: PAGES, even with long URLs deep on large domains, are what is being algorithmically judged. Don't confuse the value of having links from CNN to a CNN news article on widgets with Google thinking CNN is all about widgets. Google is saying that it trusts CNN's judgement and that this widgets article is worth ranking well.

And, to long-time readers of WebmasterWorld, none of this should be a surprise.
GoogleGuy told us when Googlebot got better at indexing long URLs, and WebmasterWorld members noticed.
GoogleGuy encouraged people to focus on multiple keywords rather than putting all their eggs in one basket.

Multiple pages focused on multiple things/keywords on a large, stable, authoritative domain is the direction to go in.

One side note on this partly explains why directory pages are doing well... they are PAGES with a high concentration of keyword content, likely titled well, linking to authoritative sites, and linked from their authoritative parent. These PAGES have no depth, and some folks think it is bad search engineering that they outrank domains full of content. Maybe, but the point is that it is a page being ranked, not the domain. In other words, one directory page with words and linking on it is going to kick the butt of an index page of a large on-topic domain that is just a Flash graphic.
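
To make the page-versus-site distinction concrete: in a link-based score like the original PageRank formula, the unit of computation is the URL, and no domain-level quantity exists anywhere. The toy graph below is hypothetical, and this is the textbook recurrence, not Google's production code.

```python
# Textbook PageRank in miniature. Scores attach to individual URLs only;
# "domain" never appears in the computation. (Dangling-node rank mass is
# simply dropped to keep the sketch short.)

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping URL -> list of URLs it links to."""
    pages = set(links) | {q for targets in links.values() for q in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, targets in links.items():
            if targets:
                share = damping * rank[p] / len(targets)
                for q in targets:
                    new[q] += share
        rank = new
    return rank

# Hypothetical graph: a homepage passing rank to one deep article.
print(pagerank({
    "cnn.com/": ["cnn.com/2004/widgets-article"],
    "example-widgets.com/": [],
}))
# Each URL carries its own score; the deep article earns its rank from its
# inbound link, not from any site-level aggregate.
```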

steveb

12:54 am on Feb 6, 2004 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Don't want to give the wrong impression about the above though. I fully believe Google will be valuing niche authority much higher as its algorithm switch process unfolds.

Deep pages of large, niche sites will dominate those deep CNN-type pages as this process matures.

BrewCrue

2:06 am on Feb 6, 2004 (gmt 0)

10+ Year Member



(Warning: speculation ahead.) It's possible that if your listing gets clicked more than the expected average for a given search term, it may get "some kind of mojo points." Google may measure this extra clicking as votes for your site and perhaps use this information in their algorithm.

"If nobody's clicking on the search results Google knows it's not delivering what they're looking for. What are people who are drilling down into page two or three clicking on? Google may float those up higher."

They can very easily identify the search results that fail to generate the click-through rates that they should, and begin adjusting their results that way.

Google used to have a feedback tab in its toolbar, and a link at the bottom of the results that asked "how are these results?" These methods of measuring the value of results required interaction from the searcher. Tracking clicks lets Google do the work itself, and clicks tell a more complete story than direct user feedback.
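
Purely to illustrate that speculation: one way click feedback could work is to compare each result's observed click-through rate with the rate expected at its position, then float over-performers upward. The position-bias curve and adjustment rule below are invented for the sketch; this is not a confirmed Google mechanism.

```python
# A speculative sketch of click-based re-ranking: results that get clicked
# more than their position predicts float upward. The expected-CTR curve
# and the scoring rule are invented for illustration.

# Hypothetical position-bias curve: position 1 draws far more clicks.
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def rerank_by_clicks(results):
    """results: list of (url, position, impressions, clicks) tuples."""
    scored = []
    for url, position, impressions, clicks in results:
        observed = clicks / impressions if impressions else 0.0
        expected = EXPECTED_CTR.get(position, 0.03)
        # Ratio > 1 means users click this result more than its position
        # alone predicts -- the "mojo points" idea above.
        scored.append((observed / expected, url))
    return [url for score, url in sorted(scored, reverse=True)]

print(rerank_by_clicks([
    ("site-a.example", 1, 1000, 250),  # under-performs position 1
    ("site-b.example", 3, 1000, 180),  # over-performs position 3
]))
# -> ['site-b.example', 'site-a.example']
```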

yankee

2:39 am on Feb 6, 2004 (gmt 0)

10+ Year Member



"Google is ranking PAGES from slate, or cnn, or other authority domains."

That is part of the stupidity of Google's new algo. I've seen many pages from these domains rank high just because they mention the words in the search phrase on the page. The page has nothing to do with the search phrase, yet that's what google thinks is relevant.

steveb

2:47 am on Feb 6, 2004 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I don't think it is stupidity as much as immaturity of the algo. Obviously Google can more easily recognize generic authority as opposed to niche authority. It needs to get better at the more difficult task.

frances

3:26 am on Feb 6, 2004 (gmt 0)

10+ Year Member



Immaturity...

Doesn't that bring us back to a point that has been made several times? If the algo isn't ready, maybe Google should have waited until it was ready. Or at least more ready than it is now.

glengara

8:54 am on Feb 6, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



IMO Steveb is right on the money. We're in a process; present results have little bearing on eventual ones, and trying to decipher anything from them now will make you mad, drive you to drink, depress you, or make you quit.

steveb

9:32 am on Feb 6, 2004 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I don't know what to say about this, except I'm stunned.

Do these searches:

keyword +a
keyword +www
keyword +keyword (use the same word twice)

These put up completely different results from each other, and from anything ever seen on this planet before.

Kennyh

9:45 am on Feb 6, 2004 (gmt 0)

10+ Year Member



Steve - that's been the case since Austin went live. Very bizarre, and it lends great weight to the view that the algo is very much a work in progress. If anyone can come up with an explanation as to why these three should be different, I'd love to hear it...

edit: spelling

zgb999

9:48 am on Feb 6, 2004 (gmt 0)

10+ Year Member



"Multiple pages focused on multiple things/keywords on a large, stable, authoritative domain is the direction to go in."

That is where the site comes in. Unless you are DMOZ or some kind of large directory, your site has to focus on one (or a few) themes.

If you want to sell everything to everybody you won't sell anything to anybody...

This 105-message thread spans 4 pages.
