tedster - 4:48 pm on Aug 21, 2012 (gmt 0)
Yes, I'll bet the algorithm does have its own version of spaghetti code going on. Seems unavoidable at this point. For instance, one of the conflicts I see is between quality measures like Penguin or Panda and freshness concerns.
I'd say another problem with Penguin is that it assumes quality content and aggressive technical SEO can't coexist on the same page. If Penguin just wiped out whatever advantage the "artificial" steps were giving the URL, that would be one thing. But it seems to me that Penguin hands out stronger demotions than that - and good content can become almost unfindable.
I do know this for sure - as a user, not as a webmaster or SEO. For a while, Google Search was so good I barely bothered with bookmarks. But today, when I find a gem, I know I need to record it somehow before it becomes unfindable. I give Penguin the credit for that.