Google News Archive Forum

    
Google and Continuous Updates...
Will dodgy results become more prolific?
crobb305 · msg:202694 · 5:47 pm on May 7, 2003 (gmt 0)

There has been some talk that Google is moving toward continuous updating (continual updating of backlinks, PR calculations, etc.). Although monthly updates can be frustrating (the waiting and suspense), one advantage of monthly updates has always been the relative stability of the SERPs. It seems to me that continuous updating may make it easier for spam to creep in, since some SEOs will continually tweak their sites in an attempt to figure out the algorithm. Does anyone else share this concern? I guess it might not be any different from what the freshbot already provides (except for more frequent PR changes), but quicker, more continuous updating may take away from the care and precision that Google has always given to the preparation of its monthly index, producing spammy results similar to those of other search engines that update continuously.
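For context, the "PR calculations" mentioned above refer to PageRank, which is classically computed offline by repeated passes over the entire link graph. Here is a rough Python sketch of the textbook power iteration - illustrative only, with a made-up toy graph; Google's real pipeline is not public - that shows why recomputing it continuously would be costly:

# Illustrative sketch of the classic PageRank power iteration.
# The graph, damping factor, and iteration count are toy examples;
# this is not Google's actual code.

DAMPING = 0.85  # damping factor from the original PageRank paper

def pagerank(links, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - DAMPING) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dangling pages are ignored in this toy version
            share = DAMPING * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank  # every pass touches the entire link graph
    return rank

# Toy graph with three pages; the 2003 index held roughly 3 billion,
# which is why a full recomputation on every small change would be costly.
print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]}))

Every iteration is a full sweep over the link graph, which is part of why a batch (monthly) computation made sense at the time.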


AthlonInside · msg:202695 · 6:09 pm on May 7, 2003 (gmt 0)

Yes, that's why I prefer an update once a month instead of continuous updates. I hate spam, and the spammers never stop.

Chris_R · msg:202696 · 6:22 pm on May 7, 2003 (gmt 0)

Google is not going to update their index based on what webmasters want. All the whining about spam comes from webmasters. Users DO NOT care about spam - they don't check whether backlinks come from guestbooks or any of the other 100 things webmasters whine about. All they care about is whether the results are relevant.

I am not sure what path Google is going to take, but BEFORE the freshbot, Google had mentioned in at least one article (actually I think it was a radio interview) that they wanted to do more frequent updates.

I believe the freshbot might be their compromise, but I am sure that they want to give users the freshest listings they can. They have to balance between:

1) Using their computer power to do an overall update more often than every four weeks.

2) Using their computer power to provide better results using algos for fresh listings.

I think (and this is just an educated guess) that they will lean towards #2 for the time being, as I think they still have some work to do before they "flatten out the curve" (I don't know the word for what I am trying to say).

crobb305 · msg:202697 · 6:27 pm on May 7, 2003 (gmt 0)

"All they care about is whether the results are relevant."

True... but with a quicker update cycle (say every 48 hours), webmasters can continually tweak to try to figure out the algorithm (and see the result of their tweaking within 48 hours), and different forms of spam will develop more quickly. This *may* cause irrelevant results to show, leaving surfers dissatisfied. I am not whining; I am just suggesting that Google be prepared for this and for the manpower it may take to quickly identify and remedy these forms of spam. At least with a monthly update, there is a period of deep crawling (limiting the time frame spammers have to act) and then the update (causing spammers to wait another month to see the result of their actions). Meanwhile, the SERPs are fairly stable except for the efforts of the freshbot (which doesn't seem to cause drastic changes in the SERPs from day to day).

I am just thinking out loud here, so please do throw rocks at me! LOL

Alphawolf · msg:202698 · 7:02 pm on May 7, 2003 (gmt 0)

Great post, Chris_R.

In some industries/segments, even frequent updates using freshbot info won't change the SERPs.

For example, the client I am doing SEO work for: _all_ the other sites in their industry have been the same for over a year. All static sites.

That's why, every update, their site is the only one to move up. :)

Last update, for the 'industry name' company, they went from around #23 to #3. Now, even with this update, they are still at #3 for that phrase, and they have gone from #34 to (as of now) #13 for 'industry name', which is their big phrase.

This industry is only known by a couple of phrases, and all the sites are static - so my point is that even frequent updates wouldn't change the SERPs for this industry.

The deep crawl and links with anchor text are the key in the above case.

So, if freshbot could apply new links, and hence PR, THEN things would change really quickly.

But I doubt that will happen. Too much room for abuse and too much computer power needed... not worth the effort...

Just thoughts...

AW

poet22 · msg:202699 · 7:06 pm on May 7, 2003 (gmt 0)

" (causing spammers to wait another month to see the result of their action). "

On the other hand the surfers have to put up with that spam for the whole month. Which is worse? The user is the only concern of google if the results are relevant who cares how they got there..

markusf · msg:202700 · 7:57 pm on May 7, 2003 (gmt 0)

There is a simple solution to all this which I think Google could implement...

Continuous updates etc. would only be applied to sites with PR 6+ or so...
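A toy Python sketch of that gating idea follows - the threshold constant, function name, and policy strings are invented for illustration, not confirmed Google behavior:

# Hypothetical sketch of markusf's suggestion: only high-PR sites
# get continuous updates; everything else waits for the monthly cycle.
# The threshold and names are made up for illustration.

CONTINUOUS_PR_THRESHOLD = 6

def update_policy(pr):
    """Return how often a site would be re-evaluated under this idea."""
    if pr >= CONTINUOUS_PR_THRESHOLD:
        return "continuous"  # recrawl/rerank as changes are spotted
    return "monthly"         # wait for the regular monthly update

print(update_policy(7))  # -> continuous
print(update_policy(4))  # -> monthly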
