Forum Moderators: Robert Charlton & goodroi


Update Allegra - Google Update 2-2-2005

         

illusionist

1:34 pm on Feb 2, 2005 (gmt 0)

10+ Year Member



My site, which came back in the December 26 update, seems to have disappeared again on this data center [216.239.53.99...]. It's nowhere to be found, even in allinanchor, allintitle, etc. I see a major change on that data center - is this a new update?

george123

1:22 pm on Feb 5, 2005 (gmt 0)

10+ Year Member



continue ... The problem for Google, Yahoo, or MSN is that they have to use all those IT engineers or PhD maths guys who unfortunately are young "college boys" (most of them) and have no contact with the real world, like the street boys, the gamblers (even the stock exchange gamblers), the clever greengrocer businessman from next door, the old journalists and sociologists, the pushers, the prostitutes, and all the underdogs of our well-respected society (LOL). All they know is just numbers and forms AND "get a bonus from our company, have a BBQ, and sleep lonely" (as usually happens with all those guys).

skippy

3:41 pm on Feb 5, 2005 (gmt 0)

10+ Year Member



Definitely no sign of that in my area. The results are very literal. I'm seeing sites with low PR and 4 backlinks, but with the keyword phrase stuffed into their page fifteen times.

I am seeing stuff where the only mention of the keyword is in the title or just image tags. It looks like pure LSI without any filters applied.

europeforvisitors

4:29 pm on Feb 5, 2005 (gmt 0)



Some - including me - have the feeling that outbound links may in some manner have a positive impact on the linking site itself. Maybe in such a way that, with valuable outbound links, a site looks more like an authority on a given topic.

It would also make sense for Google to assign greater weight to one-way links than to reciprocal links (whether outbound or inbound).

Other factors could come into play, too, such as the nature of the pages that are sending or receiving the links.

[/end "what if" scenario]

Brett_Tabke

4:44 pm on Feb 5, 2005 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Let's try to stay on topic.
(Update names come from the system used to name hurricanes: first update, female, starting with A. Something to do with Google Dance... music... Allegra. No, it was not named after a drug or a person, living or dead - it got that name because no one came up with anything better at the time.)

Anyway, I am not sure what the keywords did this time (I never watch them any more) - but referrals from Google are up about 5%.

erthlng

4:51 pm on Feb 5, 2005 (gmt 0)

10+ Year Member



We run several fairly large sites.

One of them is a small specialized directory that has been listed on page 1 of the serps for a keyword that usually generates about 22 million results on G. The site has been around since 1996. It can now be found on page 3 of the serps for this keyword.

Another e-commerce site we have run since 1995 usually has around 590,000 pages listed on G. It has lost 130,000 pages in the serps since Feb 2. The last time we saw something like this was after the Florida update, when we were almost completely dropped by G.

Here is the interesting part: although the overall number of pages listed on G has decreased for the e-commerce site, we are seeing a significant increase in sales since Feb 2. What I suspect has happened is that although a large number of pages have been dropped, other pages have moved up in the serps.

claus

5:46 pm on Feb 5, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



So, this is the first live round, eh?

Now, I can't determine the timing with anything near accuracy, because a few real-life events (tsunami + domestic storm + local election stuff) tend to skew my metrics.

I am absolutely sure, though, that this update includes both an algo enhancement and a PR recalculation, and that it started no later than mid-January. First, we saw the first signs of the algo changes in selected portions of the web as tests were carried out - now the winning tests are migrating to the whole data set and a data update is being performed.

What we are seeing now supports what I stated in some odd post early last year: that in order to do partial/continuous PR calculations, you need to make a full-scale update once in a while to make sure your data and reference points don't decay. So, this is the first round in the new system.

It also helps my understanding of the so-called "sandbox" somewhat (although not totally): in the absence of a total count/calculation, some data points will miss valid references, and as the inflow of new data doesn't stop, this process will tend to reinforce itself (data decay). One year is a long time - I think perhaps we should expect the next round within six months or so.

As to whether the SERPs are good or bad - I'll leave opinions and judgement to the Google Search Quality people. Whatever it is, we webmasters just have to live with it the way it is :)

---
Nice thing with the images, btw - it could make those text-only SERPs a little more exciting. AFAIK, it was tried some time ago too, but I guess it was dropped again. Let's see if it makes it into production this time - so far I've just heard about it, not even seen a screenshot. Added: I can't help it, the word "Picasa" just won't get out of my mind when I hear this. I think there might be a few difficulties regarding potentially offensive pictures, though.
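[Editor's note: the full-scale PR recalculation claus describes above can be illustrated with a minimal sketch of the published PageRank power iteration (damping factor 0.85). The tiny link graph and every name below are hypothetical, not anything Google has disclosed about its internals.]

```python
# Minimal full-scale PageRank recalculation via standard power iteration.
# The link graph is a four-page toy example; real crawls have billions of nodes.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # uniform starting point
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:
                share = damping * rank[page] / len(outs)
                for target in outs:
                    new[target] += share
            else:                               # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # "c" - the most linked-to page
```

A cold run like this is the "reference point" in claus's terms: every page's score is recomputed from scratch, so no stale values linger from earlier crawls.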

Jon_King

6:05 pm on Feb 5, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>keywords (never watch them any more)

From the mouth of a true content builder...

Jalinder

6:36 pm on Feb 5, 2005 (gmt 0)

10+ Year Member



So what are the parameters Google has given more importance to?
For us, things that were working before are not working now. Listings that were in the top 10 are now below 500, some not even in the top 1000. Is this due to some penalty?

theBear

6:45 pm on Feb 5, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



claus,

In order to do continuous PR calculations, etc., you need a ton of horsepower and correct data.

The chances of having both at any point in time are close to nil; all of the calculations done by G are approximations.

By the time you finally get the horsepower you ordered, the data set has outstripped that horsepower.

So the results decay, and keeping new sites out just makes matters worse.

Some result sets will be better than others. More so if the area is static. However few things on the net are static.

I think there is no easy way to figure out when things start changing because of a major update vs. minor updates. But based on one site I keep track of, I'd tag the start of this one at no later than Jan 28.

I am also wondering if G has a problem with parked domains, but right now that is just a wondering type of thing.

claus

7:02 pm on Feb 5, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Give it a second thought, theBear :) There are a number of ways you could do it. One would be to establish a baseline and then just monitor changes to it and factor those changes in - depending on how you do it, this could well lead to sandbox-like effects at some point. Another would be to rotate the calculations among subsets of the data, or use aggregates (e.g. domains, hosts) - in both cases using these to establish some "distance metrics" or whatever you would call them... there's a lot of room for creativity here.
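[Editor's note: one loose reading of claus's "baseline plus monitored changes" idea is a warm start: after small graph changes, seed the calculation from the previous ranks so fewer iterations are needed, and run an occasional full cold calculation to stop drift from accumulating. The toy sketch below is purely illustrative - the graph, names, and iteration counts are invented, and nothing here describes Google's actual system.]

```python
# Hypothetical "baseline + incremental update" sketch: warm-start PageRank
# from the previous result instead of from a uniform distribution.

def pagerank(links, damping=0.85, iterations=50, seed=None):
    """links: dict mapping page -> list of pages it links to.
    seed: optional previous ranks used as the starting point (warm start)."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    # New pages absent from the seed fall back to the uniform default.
    rank = {p: (seed or {}).get(p, 1.0 / n) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outs in links.items():
            for target in outs:
                new[target] += damping * rank[page] / len(outs)
        rank = new
    return rank

graph = {"a": ["b"], "b": ["a"]}
baseline = pagerank(graph)                               # full "cold" run
graph["c"] = ["a"]                                       # a new page appears
updated = pagerank(graph, iterations=10, seed=baseline)  # cheap warm update
```

The catch, which matches the point about periodic full updates: a warm start inherits whatever error the old baseline carried, and repeated incremental rounds let that error compound - so a full cold recalculation is still needed once in a while to reset the reference points.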
This 823 message thread spans 83 pages.