ColourOfSpring - 9:37 am on Jun 5, 2013 (gmt 0)
The problem with disconnected observations is that they are just that - disconnected. They carry random error in both observation and implementation, and you don't really know how to interpret all these mixed signals... e.g. if you edit your title and you gain two positions in 2 days, you assume your edit caused that... but that is not necessarily true.
In order to have a chance of understanding PENGUIN as it relates to a specific domain, you need a deep historical understanding of that domain... thus anyone coming here to figure out how to recover from a PENGUIN issue isn't likely to have any luck, because the experience shared "here" is in abstract form.
My background is programming. Penguin is like trying to work out how a "program" works purely by experimenting with inputs. This is made far worse by the fact that the "program" runs only once every 6 months or so, so you get few opportunities to experiment. THAT IS THE PROBLEM: the sheer amount of time you have to wait to see whether your inputs have affected you positively or negatively. If Penguin were run once a week, we'd have understood it by some point in May (perhaps June) 2012.

Google don't want us to understand it, because it's a punitive update by nature. It can ONLY be punitive if it's run irregularly and infrequently. Moreover, I believe its effects are long-lasting by design: most sites never recover. PENGUIN IS DESIGNED THAT WAY. The fact that most sites never recover means Penguin works as intended (from Google's perspective). If Penguin were a helpful, instructive update meant to make better webmasters of us, Google would be more active in helping us recover.
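The black-box analogy above can be sketched in a few lines of Python. Everything here is hypothetical - the signal names and weights are invented purely to illustrate the confounding problem, not anything Google actually uses. The point is that when many changes land between two runs of the hidden "program", you observe only one aggregate score and cannot attribute the effect to any single change:

```python
# Hypothetical black-box "program" (standing in for Penguin).
# We never see this function, only its output each time it runs.
def hidden_score(signals):
    weights = {"title": 3, "anchors": -5, "content": 4, "links": -2}
    return sum(weights[k] for k, on in signals.items() if on)

def observe_after_changes(signals, changes):
    """Flip some signals between runs, then observe one aggregate score.
    Many changes per run means their individual effects are confounded."""
    for key in changes:
        signals[key] = not signals[key]
    return hidden_score(signals)

site = {"title": True, "anchors": True, "content": False, "links": True}

# Semi-annual run: four tweaks, one observation.
# The score moved, but which tweak was responsible? No way to tell.
batched = observe_after_changes(dict(site),
                                ["title", "anchors", "content", "links"])

# Weekly runs: one tweak per observation isolates each signal's effect.
per_signal = {}
for key in site:
    before = hidden_score(dict(site))
    after = observe_after_changes(dict(site), [key])
    per_signal[key] = after - before
```

With frequent runs, `per_signal` recovers each signal's contribution one experiment at a time; with the batched run you get a single number and a guess. That is the experimental-design gap a 6-month cycle creates.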