---- A subdomains fix for Panda - Matt Cutts suggestion to HubPages
dataguy - 12:11 pm on Jul 23, 2011 (gmt 0)
MarvinH, apparently it didn't work the first time around for him.
The first time around was me trying to remove what I was guessing was weak content from a different site, in the hopes of escaping the effects of Panda on the rest of the site. This had no effect at all, as far as I could tell. If you recall, that's what conventional wisdom held, and what MC said to do, from the beginning of Panda until the article on HubPages turned up.
The difference between that and what I'm doing now is:
1. It's a different, stronger and larger site.
2. I'm separating the content by author account instead of just guessing at which pages are weak.
3. I started with author accounts I knew were strong, changing the URLs on strong content instead of hoping that the old URLs would gain strength after removing weak content.
I think these things make a big difference. When I first heard of the HubPages experiment I was very skeptical because I thought I had already experimented with what they were doing. I'm glad I got over my skepticism.
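For anyone trying to replicate this, here's a minimal sketch of the URL change being described: moving each author account's content from a shared path onto a per-author subdomain, the way HubPages did. The domain and path layout below are hypothetical examples, not dataguy's actual site structure.

```python
# Sketch: rewrite a shared-domain content URL onto the author's subdomain,
# e.g. http://example.com/hub/strong-author/great-article
#   -> http://strong-author.example.com/great-article
# "example.com" and the /hub/<author>/<slug> layout are assumptions.

from urllib.parse import urlsplit, urlunsplit

def subdomain_url(old_url: str, author: str) -> str:
    """Return the per-author-subdomain version of a content URL."""
    scheme, netloc, path, query, frag = urlsplit(old_url)
    # Keep only the article slug; the author moves into the hostname.
    slug = path.rsplit("/", 1)[-1]
    return urlunsplit((scheme, f"{author}.{netloc}", "/" + slug, query, frag))

old = "http://example.com/hub/strong-author/great-article"
print(subdomain_url(old, "strong-author"))
# -> http://strong-author.example.com/great-article
```

In practice you'd also 301 the old URLs to the new ones so existing links follow the content to its new home.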
Do you think it is possible that Google is sending more traffic to your new URLs only because they are new? New-page effect?
It will be interesting to see how these new URLs perform in a few weeks / months.
You're exactly correct, @MarvinH. That's the big question and it's why I've only committed a small percentage of my pages to this experiment. I was hoping to get some feedback from others experimenting with subdomains so we could compare notes.