Forum Moderators: Robert Charlton & goodroi
In May, Edmondson wrote an email to Google engineers...and asked whether he should break up his site into “subdomains,”...In June, a top Google search engineer, Matt Cutts, wrote to Edmondson that he might want to try subdomains, among other things.
The HubPages subdomain testing began in late June and already has shown positive results. Edmondson’s own articles on HubPages, which saw a 50% drop in page views after Google’s Panda updates, have returned to pre-Panda levels in the first three weeks since he activated subdomains for himself and several other authors.
I had a look at some of these subdomains (on that forum thread, just click on the name of someone who says they are participating) and at a glance they have all the structural problems that the original Pandalised hubpages had.
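For what it's worth, the mechanics of the move itself are simple. Here's a minimal Python sketch of the 301-redirect mapping such a split implies, assuming a hypothetical example.com URL layout (this is illustrative, not HubPages' actual scheme):

```python
from urllib.parse import urlsplit, urlunsplit

def subdomain_redirect(url, author):
    """Return the 301 target for an article URL, moving it onto the
    author's subdomain. The URL layout here is hypothetical, not
    HubPages' real structure."""
    parts = urlsplit(url)
    # e.g. example.com/hub/my-article -> author.example.com/hub/my-article
    new_host = f"{author}.{parts.netloc}"
    return urlunsplit((parts.scheme, new_host, parts.path,
                       parts.query, parts.fragment))
```

The old URL would then answer with a 301 pointing at the new one, so existing links and rankings have a chance to carry over.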
Umm... why not just get rid of the material that is NOT "high quality" content?
Then you won't have to worry about "getting relief" for the high quality content by moving it onto individual subdomains or other shenanigans.
no one has recovered simply from removing "bad" content.
Umm... why not just get rid of the material that is NOT "high quality" content?
Think of the nightmare for G if a single bad subdomain tanked the entire wordpress.com site. So they must have something built into the algo determining the relationship between subdomains and domains, and they probably use interlinking to determine that relationship. And look how they treat their own blogspot structure: the subdomains link back to the domain once, inside an iframe in the nav bar at the top, and the domain does not pass any link juice to the subdomains. That's no relationship at all!
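If Google really does infer the subdomain/domain relationship from interlinking (pure speculation on my part), the raw measurement itself would be trivial. A toy sketch, with made-up hostnames:

```python
from urllib.parse import urlsplit
from collections import Counter

def cross_host_links(links):
    """Count links between pages on different hosts.
    Illustration only -- guesswork about what Google might measure,
    not a documented signal."""
    counts = Counter()
    for src, dst in links:
        a, b = urlsplit(src).netloc, urlsplit(dst).netloc
        if a != b:
            counts[(a, b)] += 1
    return counts

# Toy crawl: two subdomain->root links, one internal root link.
links = [
    ("http://blog.example.com/p1", "http://example.com/"),
    ("http://blog.example.com/p2", "http://example.com/"),
    ("http://example.com/", "http://example.com/about"),
]
```

A near-zero count in both directions, like the blogspot setup described above, would suggest the hosts are treated as unrelated.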
In June, a top Google search engineer, Matt Cutts, wrote to Edmondson that he might want to try subdomains, among other things.
Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
Google wants the searcher to be happy and easily find their answer. Let's say the content and the user experience are good for that page. Then you run into the issue of the quality ratio of the whole site. The question becomes: if someone lands on your site, likes that page, and wants to engage further by clicking around, does the experience degrade or does it continue to be good?
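That "quality ratio" idea can be made concrete with a toy calculation. A sketch, assuming you have some per-page quality score of your own (the scores and threshold here are invented for illustration; nobody outside Google knows the real signal):

```python
def quality_ratio(page_scores, threshold=0.5):
    """Fraction of pages at or above a quality threshold.
    Scores and threshold are illustrative, not Google's actual metric."""
    if not page_scores:
        return 0.0
    good = sum(1 for s in page_scores.values() if s >= threshold)
    return good / len(page_scores)

# Toy site: 3 of 4 pages clear the bar, ratio 0.75.
scores = {"/hub/a": 0.9, "/hub/b": 0.7, "/hub/c": 0.2, "/hub/d": 0.6}
```

The subdomain strategy, in this framing, is just an attempt to raise the ratio for the good host by walling off the pages that drag it down.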
even if it *does* work, I suspect it won't work very long
all the subdomaining strategy does is immunize the good stuff from the bad stuff.
I can't imagine placing low-quality articles on a separate subdomain. That just seems like more of a mess to me
Splitting low-quality content off to a subdomain just doesn't seem like a logical solution to me
He has been very helpful on other topics but very cagey on Panda.
When you find something that works DON'T TALK ABOUT IT. #fightclub
I'm leery of anything Google says or does right now, but if Google wants us to use this technique, then maybe they won't smack it down.
moderator's note: thegypsy (a.k.a. Dave Harry) has written
more about his involvement here: [searchnewscentral.com...]
I get the sense that Panda runs separately from the 'at the time of discovery' algo set.
But the HubPages CEO definitely seems to say that they see the recovery only for the good content and not for the bad. That one is a mystery! How can the normal algo make that distinction if Panda is a separate run?