The Google Specialist...

...an endangered species?

glengara

8:50 pm on Sep 12, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's certainly not the good old dependable G any more, but I can't quite figure out if it's deliberate or accidental.
My head says it's not working properly yet and things will settle back down; my gut says it's added chaos theory to the algo to keep things in a continual state of flux.
The latter makes more sense: given the number of potentially relevant pages, searchers see little if any difference in results, while we are left in the dark as to what steps are worth taking.
Google 1 SEOs 0?

WebGuerrilla

9:11 pm on Sep 12, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think Google has a good understanding of reality. The truth is there is a huge disconnect between how we evaluate the quality of SERPS and how a real searcher evaluates SERPS.

In our world, if the SERPS don't contain the sites owned by our clients, then they suck. But to the average Joe, the results look fine.

The reality is that for any given search term, you could re-sort the top 50 results 50 different ways without the average searcher ever getting upset over relevance. Google seems to understand this concept fairly well. And they also seem to be exploring how much random chaos can be introduced before users actually complain about the quality of results.

Finding that line means millions in increased ad revenues. Crossing that line means potentially losing a huge chunk of their marketshare.

In the end, I don't think you will see the demise of the Google specialist. You will just see an adjustment of what that specialist offers to clients. You may not be able to offer 24/7 domination anymore, but you will still be able to help clients achieve maximum stability in an unstable environment.

Chndru

9:20 pm on Sep 12, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



wonderful post, WebGuerrilla...

Shak

9:21 pm on Sep 12, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



somebody buy that man a drink :)

class post WG...

Shak

Skier

9:40 pm on Sep 12, 2003 (gmt 0)

10+ Year Member



But we are all "real searchers" too. I look up things every day that are not related to the sites I work on. We can see the SERPs from a real world perspective as well as anyone.

Generally speaking, the quality of the results I see today is far superior to that of a few years ago. However, I still feel as if I am mostly wading through piles of junk to find the odd gem.

1. The Google SERPs are still far from perfect; there is room for years of improvement. If Google doesn't improve, someone else will. The game is not over, and Google can't afford to compromise its quality by adding chaos.

2. You imply that the average user can't tell if he is getting good results. That simply makes no sense. How then, did Google grow so fast?

3. If all were perfect, optimising for Google would be optimising for the user. No Google specialist, just website specialist.

willybfriendly

9:46 pm on Sep 12, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The reality is that for any given search term, you could re-sort the top 50 results 50 different ways without the average searcher ever getting upset over relevance.

True enough, until one wants to find that site that was at the top of the page just yesterday and now cannot be found. Been there, done that...

WBF

Shak

9:55 pm on Sep 12, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



3. If all were perfect, optimising for Google would be optimising for the user. No Google specialist, just website specialist.

Googlebot ain't the one that makes it to your check-out page with credit card in hand, that's the USER.

Shak

ronin

10:50 pm on Sep 12, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



willyb makes a good point. Using the in-browser bookmark feature was a lot more commonplace before Google pushed AltaVista out of the way with (comparatively) near-ideal results on the first page or so for a given search.

User behaviour tends to show that many users are happier to return to a search engine and enter the search term they used yesterday / last week to find a URL they've forgotten than to use bookmarks... perhaps because this keeps the in-browser bookmark list relatively short.

If Google starts mixing it up big time, users will no longer have this pseudo-bookmark feature available to them.

Everybody agrees that there's a problem if the SERP indices update too slowly, but if results are updating constantly, that's not really ideal either.

DaveN

12:00 am on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



bing bing bing WG on the nail again

Skier

12:22 am on Sep 13, 2003 (gmt 0)

10+ Year Member



Ok WG, I have thought some more about what you say, and I have to agree with you too.

Finding that line means millions in increased ad revenues.

Not much else fits the facts.

steveb

1:08 am on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"The reality is that for any given search term, you could re-sort the top 50 results 50 different ways without the average searcher ever getting upset over relevance."

If you are selling widgets, sure. For information sites this idea has no merit at all.

In the vast majority of areas, the information in the top ten results is drastically better than in results 40-50... with Google usually managing to get more than half of the objectively most useful sites into the top ten.

People gotta get out of their niches more, especially those selling, and especially those selling via affiliate means. It's nonsensical, tail-wagging-the-dog thinking to believe Google is even set up primarily to rank sites in the niches most people here are concerned about.

WebGuerrilla

1:24 am on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



User behaviour tends to show that many users are happier to return to a search engine and enter the search term they used yesterday / last week to find a URL they've forgotten than to use bookmarks

I would like to see a legitimate study that supports that. I'm not saying it doesn't happen (I actually do it all the time), but I doubt that the total percentage of regular users who get upset because they type in the same search they typed two days ago and see a different set of results is statistically significant.

And if that does happen often enough to have a true impact on Google's user satisfaction, then that is something Google created all on its own. Since the very beginning of search engines, instability and fluctuation have been the norm. That flux was the foundation upon which the first revenue models were built. If you wanted consistent, stable exposure, you had to pay.

Of course, the search engines didn't have the foresight to see that paying them wouldn't be the only option. We came along in a banner-dominated era and were able to convince a good chunk of the people pushing the wheelbarrows full of cash that their money would be better spent with us.

And so the adversarial relationship began. AV, Infoseek and Excite all used to run randomly rotating algos that served no purpose other than making it more difficult for Webmasters and SEOs to get a consistent free ride.

Excite was the best. There was a period in '98 when the SERPS would change every 72 hours. Many companies who used to enjoy 24/7 SERPS coughed up the money for ads in order to keep the traffic going during the portion of the week when their listings would vanish. Other companies paid a lot of money to SEOs who built them 3 or 4 sites that were individually tweaked to nail the algo of the day.

Either way, everyone paid. And the rank and file searcher continued to type their queries into the little white box, completely oblivious to the battle being waged between search engines and Webmasters.

But Google's situation is a bit different. I don't think that the SERP stability everyone has become accustomed to was done intentionally. It was just a by-product of a system that is so link-dependent. In an on-page-only world, it is much easier to introduce a bit of random chaos. However, it is much more difficult in a system that requires a huge computational process to take place in order for SERPS to change significantly.

Maybe Google's existence has changed the public's idea of what good SERPS are. And maybe the bookmark factor has, or will soon become the deciding factor.

If that's the case, I'm not sure how I personally feel about it. From a pure economic point of view, it would be a great thing if users started abandoning Google because of SERP fluctuation. That would cause Google (and its competitors) to move back to a pre-Dominic type of system. Doing that would bring back the days of shooting fish in a barrel. Easy work, great pay, but very boring.

On the other hand, if users tolerate random fluctuation, then working as an SEO will once again become a fun and exciting (from a completely geek perspective) occupation. And that isn't such a bad thing for all the Google specialists out there. In the big picture, it will probably help thin out the crowd a bit. If SEO gets back to being a lot of hard work, a good chunk of the competition will hang it up.

Mohamed_E

1:42 am on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"The reality is that for any given search term, you could re-sort the top 50 results 50 different ways without the average searcher ever getting upset over relevance."

If you are selling widgets, sure. For information sites this idea has no merit at all.

The fact of the matter is that in areas where there is real information, the SERPs have been remarkably stable. Google has real glitches; at any time a few of my pages seem to drop out of the index for a few days. But, apart from that, I see very few day-to-day or month-to-month changes in the SERPs.

claus

2:28 am on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Reading the above posts, i feel like i'm the only one who totally drowns in irrelevant mailing lists, forum posts and the like whenever i search for something remotely technical - surely this can't be Google targeting me individually with plain lousy serps, can it?

The added emphasis on "freshness" (for lack of a better or more precise word) has surely made the serps less predictable and even - in an increasing number of cases - totally unusable for the queries i perform when i really need to know something now.

Google once could solve my problem of never being able to locate a specific page quickly in my enormous collection of bookmarks - it was simply faster to search. It is not anymore, and recently i have been collecting more and more bookmarks and spending more and more time searching through them instead of searching the web. Google's role of "internet organizer" is diminishing rapidly in my case.

I'm not on the "Google is broken" track - i've been very sceptical about that statement; seen from an SEO point of view, stability and a fair amount of predictability are still there imho. But as a searcher, requesting information, i see quite another picture.

Perhaps that quote about "we're fixing algo distribution now, spam will have to wait for a while" (as i recall it) is still valid, whatever its precise meaning originally was.

/claus

willybfriendly

2:41 am on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google once could solve my problem of never being able to locate a specific page quickly in my enormous collection of bookmarks - it was simply faster to search. It is not anymore, and recently i have been collecting more and more bookmarks and spending more and more time searching through them instead of searching the web. Google's role of "internet organizer" is diminishing rapidly in my case.

I hadn't really identified this until I read these words, but yes, I too am doing this again. In fact, I was creating category folders in my favorites just last night...

WBF

richardb

3:41 am on Sep 13, 2003 (gmt 0)

10+ Year Member



Well said WebGuerrilla

It’s an interesting thought that

Crossing that line means potentially losing a huge chunk of their marketshare.

I think that they are getting near that line. Trends are changing and, in the industry, people are starting to become less reliant upon G to produce the goods. Must start looking at Yahoo: so much money spent, so little known.

…but I can't quite figure out if it's deliberate or accidental.
Deliberate. No one with so much financial muscle would openly walk down such an exposed path for so long without good reason.

I spend a lot of time “educating” clients, and Kartoo (the Flash version - who says that Flash is crap? ;)) is becoming a big hit! Once the interface is explained, it only takes a couple of hours for them to get used to it. Most of them have a number of toolbars installed: G, AV, come on, FAST! It does not take long for them to see things in a different way!

…i feel like i'm the only one who totally drowns in irrelevant mailing lists
Ummm, I feel like I'm a pre-pill Neo too (blue/red, blue/red, blue/red, which one to take?)

…but yes, I too am doing this again. In fact, I was creating category folders in my favorites just last night
Quite a trend starting to appear ;)

…if the SERPS don't contain the sites owned by our clients, then they suck. But to the average Joe, the results look fine.
Seems kinda difficult to miss these days, apart from when you come up against link farms, hidden text…! OK, so we go other routes.

It has been said before, but search engines seem not to understand the power of the pro web community: we are only loyal when rewarded!

DaveAtIFG

3:55 am on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Amen, claus. Technical searches a year ago were an order of magnitude better than they are today. "When you don't understand something, follow the money." It isn't clear to me how the financial aspects fit into this situation, but I'm confident they do. AdSense was part of the most recent Google upgrade, after all.

ciml

3:21 pm on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There is a detectable fluctuation that does not correlate to on page factors or off page factors, but it's very small and I don't think it's the main instability that people are seeing.

IMO, the main factor here is that Google is much, much fresher than in the stable, pre-Dominic days. To enable this, Google fetches pages seemingly at random from its dataset, rather than only following links as in the old days. Google's servers don't get to coincide during constant flux, so we shouldn't even expect second-to-second stability.

From my point of view, this has made Google far harder to analyse. For some tests I now have less than one thirtieth the resolution I had before April. This is a huge pain, but it doesn't make it harder to find out about widgets. It just makes it harder to find out how the algo's working.

> In the big picture, it will probably help thin out the crowd a bit.

I couldn't agree more, WebG. The Google specialist needs to evolve. Again.

claus

7:48 pm on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> it doesn't make it harder to find out about widgets

I disagree. Really, i do. The widget info that i can't seem to find is the good old-fashioned info that just stays the same and always has been exactly as it is now - like technical specs, manuals for this and that, tables on static pages and the like. I get lots of dynamic content, but that's filled with questions from people looking for the same info as i am, instead of the answers to those questions. The answers, in turn, are mostly found on good old static html pages that haven't been updated, changed or moved one bit since the last time i saw them. For this reason (?) they are buried deep down in the serps, if they're even there.

As to the workings of the algo, the freshness introduced should supply enough movement in the serps to obscure any modelling - pages have always been produced, updated, and moved around the clock; now they're also spidered and indexed around the clock, and some extra weight seems to have been put on "freshness" (?) ..so it's close to impossible to follow these changes.

I don't suppose the fundamentals of the algo have changed much; it's just being distributed over more machines, the update frequency has increased significantly, and then there's this added "freshness" effect/weight (and some spamming techniques seem to be ignored for the moment as well - i don't suppose they will continue to be, however).

I still find that the sites i monitor do all right in the serps. It's just all the other sites (and those really, really good information sites are often far from optimized) that i can't find.

/claus


Added a couple of hours later:

Ciml, the next time i disagree with you (if ever), please remind me to think a little deeper before i post anything - i've been thinking about your post ever since and only partially disagree now.

I still disagree that it's not harder to find info about widgets. Imho, it is harder to find those good old, static, information-rich (dare i say it..) *whispers* high PR pages *shhh* with tons of different inbounds and little or (more likely) no on-page / on-site optimization done.

I no longer disagree that the algo might have changed a bit or that its application might be different (the reason it's "harder to find out how it works"). Apart from the noise (freshness) there is a clear signal - these differences are systematic. The only reason i didn't see it sooner is that my pages are not optimized for just one distinct measure, so they do well although rated on other parameters. That's also why Brett's recipe still holds.

Toolbar PR "broken", "link:" defunct, the army of spiders, monthly dance gone - daily drinks now, stronger regionalization, even Alltheweb's far better (consistent) serps... it all makes sense now. It's just not the Google (engine/algo) anymore, it's the Gooooooooooooogles.

I'm not sure if i should start that other thread or leave it up to somebody else (it's been attempted a few times already i've noticed), but this is all off-topic here. Sorry about that.

[edited by: claus at 10:21 pm (utc) on Sep. 13, 2003]

twilight47

8:58 pm on Sep 13, 2003 (gmt 0)

10+ Year Member



It just makes it harder to find out how the algo's working.

I agree that there will still be a lot of Google SEO evolving but, just as in sports, the fundamentals will still apply - just like Brett Tabke's "Successful Site in 12 Months with Google Alone".

DaveAtIFG

10:23 pm on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't suppose the fundamentals of the algo have changed much
Once again, me too.

When GG announced an upgrade circa Dominic, I pretty much left my sites alone and "hung on for the ride." These are all straight-up SEO'd sites: no database-generated spam, artificial linking or cloaking. Today they rank essentially where they did before it all began. In a few cases I made a few on-page optimizations or acquired a few links, and those sites are now a bit more prominent, just as expected.

The weighting factors in the algo have been shifted, perhaps additional filters have been added, but this upgrade was primarily about AdSense. Secondarily, it was preparing for future growth. Peripherally, it was about improving SERPs.

I suspect database-driven spam sites suffered, PR has been de-emphasized, and on-page factors carry more weight. This is all consistent with claus's observations:

It's just all the other sites (and those really, really good information sites are often far from optimized) that i can't find.

For white hat SEO, not much has changed.

2_much

11:44 pm on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"random fluctuation" - freshness?

From the user's point of view - no problem. There are ways to find that site you found some time ago. And if you can't - oh well, you have thousands to choose from.

For the SEO - great. It's the same game as in 98, only that instead of tweaking on-page criteria, you tweak the off-page criteria. No big difference :-)

The "evolution" of the SEO that I see, is mainly learning to balance between organic and paid, and mastering both.

ciml

11:45 pm on Sep 13, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Please don't hesitate to disagree, Claus. The world would be duller!

I'm pretty sure that Google cares a lot about those far-from-optimized, really good information sites. I agree that a fresher, less stable Google can reduce the prominence of established and highly reputable URLs; but a lot of searches are for current events in news, sports, products and entertainment. It's a difficult balance, especially as Google want the search interface to be simple for non-expert users.

> Toolbar PR "broken", "link:" defunct, the army of spiders, monthly dance gone - daily drinks now, stronger regionalization...

I think it's mostly webmasters who care about the accuracy of the Toolbar PR graph, the link: search or the predictability of whose widget site will come top - or when.

> That's also why Brett's recipe still holds.

Absolutely! Some of my friends are very aware of how upset I get when a change at Google makes the algo harder to analyse, but I can hardly blame Google for that. On the other hand, I can't remember an update that upset me from a ranking point of view, other than the Internet backbone problems about 18 months ago when a lot of sites were unreachable. Like Dave, I can't say that the basics have changed much.

Brett's recipe will continue to hold, and Google specialists are not necessary for those who are patient and don't want to maximise their optimisation, push the envelope a little, protect their investment or launch a heavy duty Google campaign.

glengara

8:24 pm on Sep 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



An SEO is going to try an experiment: on a high-ranking PR7 site, he's going to exclude Gbot.
There are loads of links with different relevant link-text, and the idea is to see what effect excluding all on-page factors may have on rankings.
Any opinions?
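
For anyone wanting to replicate the test, excluding Googlebot takes a two-line robots.txt at the site root. A minimal sketch using the standard Robots Exclusion rules (blocking the whole site is an assumption here; the experiment could just as well block only a section):

    # robots.txt - stop Googlebot fetching any page, so only
    # off-page factors (links, anchor text) remain visible to Google
    User-agent: Googlebot
    Disallow: /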

DaveN

10:10 pm on Sep 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



glengara, if the PR7 site has enough backlinks then nothing should change.

I can't remember Yahoo or Disney ever optimising for the keyword "EXIT", but they do pretty well in the serps just from anchor text.

IMHO, if enough brute force is put into any one of the main Google algo areas, you can rank well.

Google specialists will adapt, as we always have done, or we wouldn't be specialists, would we?

DaveN

ciml

10:28 pm on Sep 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> exclude Gbot

Or link to a page with the word widget, when the word widget is not on the page. Experiments are fun.

Dave, I can't agree that any one of the main areas on its own can get a site to #1. Try to come top for 'exit' using only body text. :-)
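
The "widget" link test mentioned above needs nothing more than an ordinary anchor on some other page. A minimal sketch - the URL and filename are hypothetical, and "widget" is just this thread's running example:

    <!-- the target page never contains the word "widget", so any ranking
         it gains for that term must come from this anchor text alone -->
    <a href="http://www.example.com/blue-thing.html">widget</a>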

DaveN

10:35 pm on Sep 14, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



ciml, you still think body text is one of the main areas... Interesting ;)

Dave

Blue Gravity

10:38 pm on Sep 14, 2003 (gmt 0)

10+ Year Member



Google 1 SEOs 0?

In the beginning, I think we all looked at Google's changes from our own point of view, but when you get down to the dirty details, our job is to make our own sites and/or clients' sites as productive as possible according to Google's SERPs. Google will be forever changing, and tracking the changes in its algos will be tough, but just because a site of yours doesn't show up as high as you'd like in the SERPs doesn't mean you've failed - it just means it's time to recalibrate your optimization methods for Google.

In many ways I'm actually thankful that Google is making drastic changes, because I see it as Google's way of shaking up the odds a bit. When people get used to optimizing one way, Google goes and modifies it a bit, and sets everyone back to zero.

SlyOldDog

11:17 am on Sep 15, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That's right. I'm glad too. I hope to always be a step ahead of the competition, so a moving target makes it harder for everyone, and easier for the most focused to stay ahead.

ciml

11:21 am on Sep 15, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> body text is one of the main areas

Certainly. It won't make you number one for 'exit', but body text can pull a lot of traffic across a site.
