
The next Google update is coming - prepare to survive

Good preparations for the next update - setting benchmarks

   
12:08 am on Oct 21, 2006 (gmt 0)

WebmasterWorld Senior Member whitey (10+ Year Member)



Setting sustainable benchmarks and techniques to survive for the long haul is probably sound business strategy.

With Florida long since gone and Big Daddy the most recent update to whip through the webmaster community, one can see a pattern of those that were well prepared and those that weren't.

There were those that made adjustments and were unscathed, those that lived to tell the tale after a close shave, and those that fell into the journals of the forgotten.

There were those that blamed Google, and those that accepted the environment and did everything within their power to strengthen their site's ability to survive.

Web history shows that survival of the fittest often goes to those who face up to reality, or who adapt quickly to failures [ some call it research :) ], and passes over those who sit on the fringes with over-optimisation techniques and low-quality content, ready to be hammered when the next cleansing comes through in Google's determination to hold its dominant position as a provider of good content.

So what are the "over optimisation" techniques that make a site most vulnerable to this next change, and how will content need to improve to get you through?

And what are your benchmarks for the future?

2:59 am on Oct 21, 2006 (gmt 0)

10+ Year Member



I believe it started 3 days ago - although most are calling them refreshes these days.
3:49 am on Oct 21, 2006 (gmt 0)

5+ Year Member



If you look at Jagger and Big Daddy as a continuum, I fall into the "licking minor cuts and major bruises" category -- so far.

I.e., Big Daddy took back most, but not all, of the huge gains Jagger brought me.

I would hope that if the pendulum is about to swing, it heads back towards Jagger.

To this day, however, I refuse to submit a sitemap to Google, mainly because their spokespeople categorically denied that the Sitemaps program could do damage when it already had.
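
For anyone still weighing that decision: the sitemap format itself is just plain XML per the sitemaps.org 0.9 protocol, and trivial to generate yourself without enrolling in the program. A minimal sketch in Python - the URLs and dates are placeholders, not real pages:

    # Minimal sitemap.xml generator following the sitemaps.org 0.9 protocol.
    # The URLs and dates below are placeholders -- substitute your own pages.
    from xml.sax.saxutils import escape

    pages = [
        ("http://www.example.com/", "2006-10-21"),
        ("http://www.example.com/widgets.html", "2006-10-15"),
    ]

    entries = "\n".join(
        "  <url>\n"
        "    <loc>%s</loc>\n"
        "    <lastmod>%s</lastmod>\n"
        "  </url>" % (escape(loc), lastmod)
        for loc, lastmod in pages
    )

    with open("sitemap.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                + entries + "\n</urlset>\n")

Whether you then tell Google about the file is a separate choice; the file itself can sit on your server either way.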

9:49 am on Oct 21, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



One of my most successful pages is a self-referential form, where visitors have to click through various options, sequentially requesting this same URL up to five times. Quite badly designed, actually, but I have the suspicion that the Google Toolbar "thinks" this page must be very, very important ;)
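
In case it helps anyone picture the pattern: it's just a form whose action points back at its own URL, carrying a step counter forward each time. A minimal sketch in Python's standard library - the field names ("step", "choice") are invented for illustration, not taken from the actual page:

    # Minimal sketch of a self-referential form: every step POSTs back to the
    # same URL, so a single visitor requests it up to five times in a row.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import parse_qs

    PAGE = """<html><body>
    <p>Step %d of 5</p>
    <form method="post" action="/">
      <input type="hidden" name="step" value="%d">
      <select name="choice"><option>a</option><option>b</option></select>
      <input type="submit" value="Next">
    </form>
    </body></html>"""

    class SelfForm(BaseHTTPRequestHandler):
        def render(self, step):
            # Display the current step; the hidden field carries the next one.
            body = (PAGE % (step, step + 1)).encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(body)

        def do_GET(self):
            self.render(1)

        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            fields = parse_qs(self.rfile.read(length).decode())
            self.render(min(int(fields.get("step", ["1"])[0]), 5))

    HTTPServer(("", 8000), SelfForm).serve_forever()

Every click is another request for the same URL, which is presumably what the toolbar data ends up recording.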

I believe that the long-promised rollout of Big Daddy was stopped midway at the "end of the summer". Maybe what you just observed was a slight shift back, or a repair. The big storm of the full use of the new infrastructure is still to come.

To me the key issue is subdomains, which is closely related to the canonical www-subdomain issue. Make sure you did not SEO in that area, and if you did, pray. I believe there is still something "wrong" with PageRank calculation concerning the exact definition of subdomained URLs, and that Google's technicians are intensively investigating data collected through Webmaster Central. Once this issue is resolved, we will see the first truly new PageRank calculation since 2003/4, combined with a new and much more flexible infrastructure for adding tinkering knobs like visitor data collected by the toolbar, or recent research in semantic analysis.
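
If you want to check where you stand on that canonical question, one rough way is to request the root page from both hostnames and see whether one 301s to the other. A quick sketch in Python - example.com stands in for your own domain:

    # Probe of the canonical www question: do the bare domain and the www.
    # subdomain answer independently, or does one 301 to the other?
    # "example.com" is a placeholder -- substitute your own domain.
    import http.client

    def probe(host):
        conn = http.client.HTTPConnection(host, timeout=10)
        conn.request("GET", "/")
        resp = conn.getresponse()
        print("%s -> %s %s" % (host, resp.status, resp.getheader("Location", "")))
        conn.close()

    for host in ("example.com", "www.example.com"):
        probe(host)

Two independent 200s suggest both hosts can be indexed as separate sites, splitting PageRank between them; a 301 from one to the other consolidates everything onto a single canonical host.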

Make sure your site is about what it pretends to be about.
Don't do evil.

 
