[edited by: tedster at 5:33 am (utc) on Sep 1, 2012]
Google sends most of the traffic, but I sure ain't playing their games in the hope things improve.
Here's a thought. What if G were "training" their filters instead of assigning them a fixed value?
Take geo-location. Instead of using server IP, ccTLD, and potentially WHOIS and WMT settings, what if they just let "the world" tell them which locations responded well to a site and which ones didn't?
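To make that idea concrete, here's a toy sketch of learning geo relevance from engagement rather than from server IP or ccTLD. Everything here is invented for illustration: the log format, the "engaged" flag (e.g. no quick bounce back to the results page), and the impression floor. Nobody outside Google knows what their filters actually consume.

```python
from collections import defaultdict

def learn_geo_relevance(engagement_log, min_impressions=100):
    """Toy sketch: score (site, country) pairs by observed user
    engagement instead of declared location signals.

    engagement_log: iterable of (site, country, engaged) tuples,
    where engaged is True when the visit looked satisfied.
    All names and thresholds are hypothetical.
    """
    stats = defaultdict(lambda: [0, 0])  # (site, country) -> [engaged, total]
    for site, country, engaged in engagement_log:
        pair = stats[(site, country)]
        pair[0] += int(engaged)
        pair[1] += 1

    scores = {}
    for (site, country), (good, total) in stats.items():
        # only trust pairs with enough traffic to be meaningful
        if total >= min_impressions:
            scores[(site, country)] = good / total
    return scores
```

The point of the sketch is just the shift in where the signal comes from: the site's declared location never appears; the score is entirely "how did visitors from this country respond."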
...the machine learning does seem to be going pretty slowly on mid- to long-tail queries.
I have not seen anyone on the web who can say with 100% certainty what Panda is about.
If Google understood Panda in perfect detail, then they would not have had such a monster thread on their own forum asking for input from webmasters who felt they were a false positive.
Thus, if you are able to "fix" your backlink profile to make it less prone to Penguin, you are just confirming that you are a dirty spammer in the first place. Spammers don't get rehabilitated.
I believe that the biggest hurdle to getting out of Panda (if you assume user metrics tell you everything you need to know about your site's quality) is knowing where the threshold is.
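If Panda really does reduce to user metrics with an unknown cutoff, the problem looks like guessing a hidden threshold on a hidden score. A toy illustration, with the metric weights and the cutoff entirely made up (that unknowability is exactly the hurdle being described):

```python
def panda_risk(pages, threshold=0.4):
    """Toy model: fraction of pages falling below a hypothetical
    quality score built from user metrics. The weighting and the
    threshold are invented; the real signals and cutoff are what
    webmasters cannot see.

    pages: list of dicts with bounce_rate and return_visit_rate,
    each in [0, 1].
    """
    def quality(page):
        # made-up weighting: return visits help, bounces hurt
        return (0.6 * page["return_visit_rate"]
                + 0.4 * (1 - page["bounce_rate"]))

    low = sum(1 for page in pages if quality(page) < threshold)
    return low / len(pages)
```

Even in this toy version, two sites with identical metrics can land on opposite sides of the line depending on a `threshold` value they have no way to observe, which is why "fixing" for Panda is largely trial and error.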
I mean the same, although I think it is not all about user metrics; site structure (siloing) and unique content matter too. Apparently all of this reduces in some way back to user metrics.
Sometimes we have a great start to a day, hit our normal level of sales by 1pm and think we're in for a good day, then the door shuts and the total at the end of the day is exactly what we would have predicted. Could be natural, but it defies all real world explanation.