Forum Moderators: Robert Charlton & goodroi


Supplemental results - questions

Google could make this easier


rmiranda

6:15 pm on Apr 1, 2007 (gmt 0)

10+ Year Member



I’d like to make a few notes about these issues and Google:

1. They say the supplemental result is “completely automated”. This means something like: “We have no responsibility for these results. The machine is responsible.”

Who programs the machine? Who decides which variables carry more or less weight in the algorithm? Do they not trust their own algorithm?

2. Why do they leave us speculating about what happened? Why don’t they say clearly why a given URL is in the supplemental results and what to do to change it?

If they did, we would have no headaches and their index would be clean.

3. They say the supplemental result is neither a punishment nor a secondary index.

Of course it is a punishment, and in most cases it comes from errors or mistakes by the sites’ authors. But in most cases the authors also do not know what they did wrong. On the other hand, no one knows whether the pages will ever return to the main index.

Crazy!

tedster

8:32 pm on Apr 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Have you read the main thread: Supplemental Results: What exactly are they [webmasterworld.com]? Seems to me we've had decent input from Google about this. No, there's no recipe for getting your pages moved out of the Supplemental Index, but they could hardly give us one, right?

Mostly, I think the Supplemental Index is Google's response to the computing challenges created by explosive growth on the web -- especially in this era when multi-million URL domains can be created overnight. It's a much different situation than it was in the earlier days when they could crunch data for a few hundred million URLs and have a decent engine.

Marcia

9:11 pm on Apr 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Who programs the machine? Who decides which variables carry more or less weight in the algorithm? Do they not trust their own algorithm?

People create the programs, and no doubt the variables chosen are based on statistical data to a very large degree.

2. Why do they leave us speculating about what happened? Why don’t they say clearly why a given URL is in the supplemental results and what to do to change it?

Each site has its own set of factors and conditions that cause pages to go into the Supplemental index. For them to say clearly why any given URL is supplemental, a person would have to examine the site in question and evaluate it.

Any idea how much manpower that would take, and how much it would cost, to provide such a *service* to webmasters?

But in most cases the authors also do not know what they did wrong.

What makes evaluating individual sites for webmasters (apart from what the webspam team does for spam reports) their responsibility?

Added:
It might not be that the webmaster did anything wrong. It may be that the site doesn't have the qualities needed to qualify for the main index - or doesn't have them *yet*.

[edited by: Marcia at 9:16 pm (utc) on April 1, 2007]