Setting sustainable benchmarks and techniques to survive for the long haul is sound business strategy.
With Florida long since gone and Big Daddy the most recent update to whip through the webmaster community, a pattern emerges between those that were well prepared and those that weren't.
There were those that made adjustments and emerged unscathed, those that lived to tell the tale after a close shave, and those that fell into the annals of the forgotten.
There were those that blamed Google, and those that accepted the environment and did everything within their power to strengthen their site's ability to survive.
Web history shows that survival of the fittest often comes down to recognising reality and facing up to it, or quickly adapting to failures [some call it research :)]. That approach leaves behind those that sit on the fringes with over-optimisation techniques and low-quality content, ready to be hammered when the next cleansing comes through in Google's determination to hold its dominant position as a provider of good content.
So what are the "over-optimisation" techniques that make a site most vulnerable to the next change, and how will content need to improve to get you through?
And what are your benchmarks for the future?