potentialgeek - 6:28 pm on Jul 15, 2011 (gmt 0)
In June, a top Google search engineer, Matt Cutts, wrote to Edmondson that he might want to try subdomains, among other things.
I don't think we can read much into this, because Cutts was basically answering a question politely. The webmaster was already about to try his own idea, not Cutts's idea, and Matt said he could try it along with some other things. That's a low-value statement, a no-comment comment. It's consistent with his previous public statements on Panda: he has been very helpful on other topics but very cagey on Panda. It could simply be that the Panda code was written by engineers in another department, not the spam team, so he isn't authorized to say much about it.
Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don't get as much attention or care?
This quote is one of the Google Guidelines after Panda [googlewebmastercentral.blogspot.com]. I don't see how putting old content on a new subdomain will override the issue Google expressed in that guideline.
Vanessa Fox, speaking for Google about Panda [webpronews.com], recently said:
Google wants the searcher to be happy and easily find their answer. Let's say the content and the user experience are good for that page. Then you run into the issue of quality ratio of the whole site. The question then becomes if someone lands on your site and they like that page, but they want to engage with your site further and click around your site, does the experience become degraded or does it continue to be a good experience?
Google has already decided to assume responsibility for the user experience beyond the landing page reached via its search results. That was the motivation behind allowing bad pages to bring down an entire site. A new subdomain doesn't protect users from bad pages, so there's no way this subdomain "fix" is going to last. It must be a loophole Google engineers are already working to close.