|Does moving content to a subdomain to escape Panda still work?|
| 6:59 pm on May 21, 2012 (gmt 0)|
Does moving content to a subdomain to escape Panda still work? Does anyone have any recent experience?
We were finally hit by Panda 3.5, down about 30%, and I'm considering moving a folder of low performing content to a subdomain, and possibly also noindexing it, to let the main domain and content shine.
After checking that it wasn't because of some changes we had made with fantastic timing on April 18, I suspect the culprit is a folder of PDFs. These are scanned press articles from major newspapers about us and our sector, all licensed, but over the years the collection has grown to about 700 out of a total of 850 URLs on our site. It's a great resource, high quality, and G has OCRed most of them, but it attracts under 5% of our visitors, the PDFs have limited engagement, and they sometimes stray a bit far from our niche.
Noindex to recover from Panda seems to have mixed press, but what about subdomains? Any help or feedback is definitely much appreciated!
| 8:05 pm on May 21, 2012 (gmt 0)|
There is no escape from Panda (none that I've found yet, at least).
| 9:04 pm on May 21, 2012 (gmt 0)|
I have had very positive results from removing thin/shallow pages, so yes, I would recommend doing this if you feel they add little value.
| 9:38 pm on May 21, 2012 (gmt 0)|
Jinxed, did you delete the content, noindex it, or move it to a subdomain?
I'd like to keep this content available to visitors, but some people are reporting that noindex alone doesn't always seem to have much effect, so I thought moving it off-domain as well as noindexing it might be a better bet.
| 2:09 am on May 22, 2012 (gmt 0)|
I think there are a few articles out there that "analyze" how ehow and some of the other large sites that were hit by Panda managed to escape. I believe that one of the steps they took was to use subdomains to structure their content.
| 5:43 am on May 22, 2012 (gmt 0)|
I remember some people saying that it works in the short term, but not in the long term. Not sure if anyone has had long term success with it.
| 6:04 am on May 22, 2012 (gmt 0)|
I removed them - returning status code 410 'Gone' - or added a 301 redirect if the content had decent links.
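For anyone wanting to do the same, both status codes can be sent with a couple of lines of Apache config (mod_alias) - a minimal sketch, the paths here are hypothetical:

```apache
# Serve 410 Gone for a removed thin page (hypothetical path)
Redirect gone /articles/thin-page.html

# 301 a removed page that had decent links to its nearest replacement
Redirect permanent /articles/old-page.html /articles/better-page.html
```

`Redirect gone` takes no target URL, since there is nowhere to send the visitor; `Redirect permanent` passes the old page's link equity on to the replacement.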
As others have said, moving the content to a subdomain would no doubt see a short-term lift, but in the long term (i.e. when Panda is rerun once or twice) I imagine your Panda quality score would be lowered back again.
I would say find the reason G doesn't like the content and fix it, or remove it from the index. I'm personally not a fan of just 'no-indexing' a chunk of content, as that still doesn't get to the root of why this content isn't deemed good enough.
Just my opinion though.
| 3:58 pm on May 22, 2012 (gmt 0)|
Thanks guys for the comments, appreciated.
Most people's experiences from what I've read seem to be a short-term lift and then long-term back to square one - but it was mostly the content they were moving to the subdomain that they were interested in. The subdomain content got smacked down again after a Panda cycle or two.
For me it's the content left on the main domain that's important, and I'll probably noindex the subdomain content just so it's at least accessible somewhere.
The main domain content has pretty good engagement and usability.
Has anyone seen any examples or have any direct experience about this way round?
| 10:54 pm on May 22, 2012 (gmt 0)|
I have not escaped Panda, but I'm also just now starting to work on it. I will block Google from spidering half of my sites - but only Google. I won't remove the pages, because that would be bad for my visitors, and I don't work to please Google, but my visitors.
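A minimal robots.txt for blocking only Googlebot from one section would look like this (the section name is hypothetical) - though note that robots.txt only stops crawling; URLs already in the index can linger there:

```
User-agent: Googlebot
Disallow: /blocked-section/

User-agent: *
Disallow:
```

The empty `Disallow:` in the wildcard group leaves all other crawlers unrestricted.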
| 1:35 pm on May 26, 2012 (gmt 0)|
Any other experiences with cutting content back on the main domain and using subdomains to 'purify' it?
| 3:39 pm on May 26, 2012 (gmt 0)|
Even if others have had positive/negative experiences with doing this, no two situations will ever be the same.
If you are pretty certain that this section of content is the cause of your problem, then your plan of moving the content to a sub-domain and no-indexing it from Google seems like a good plan.
Either way, what’s the harm in trying it? I suggest giving it a go and posting back your findings here in this thread.
| 4:11 pm on May 26, 2012 (gmt 0)|
I try to 'fiddle' with this domain as little as possible, which has served me well in the past.
The content that I suspect is the culprit is not fixable: they are scanned newspaper articles from an official press service, so it's a binary choice in how I deal with them. Ironically, it's really good stuff and very useful, but only to a very narrow group of people.
I'll report back of course; now off to the Apache forums to figure out how to noindex PDFs and their jump pages...
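For the record, PDFs can't carry a robots meta tag the way HTML pages can, so the usual route is an X-Robots-Tag HTTP response header - a minimal Apache sketch, assuming mod_headers is enabled:

```apache
# Send a noindex header with every PDF response
<FilesMatch "\.pdf$">
    Header set X-Robots-Tag "noindex"
</FilesMatch>
```

The jump pages, being ordinary HTML, can use either the same header or a standard robots meta tag.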
| 1:34 pm on Sep 5, 2012 (gmt 0)|
It's time to update everybody, and it's good news.
We had a small jump in referrals on Monday 20 August, and over the last 2 weeks traffic has been steadily increasing, with yesterday at 90% of where we were before Panda 3.5. Fingers crossed that it continues.
Overall we bottomed out in early June at 45% down (something also hit our site in mid June, around 19 June, that nobody has talked about). We made a total of 4 changes to the site, though the 4th was only rolled out on 18 August, so too soon, I think, to have had any influence on the recovery.
1. A couple of days after my last post here I did what I asked about in the OP: everything moved to a subdomain and noindexed, including the jump page. There was a bit of a hiccup with the 404 page (details here [webmasterworld.com...]), but within a couple of weeks G had got its teeth into them, and within 4 weeks a large number of the PDFs were dropping out - all done in 6 weeks more or less, even though a couple still show up in a site search.
2. Revamped the about us section, with completely new copy for the category page and 2 of the 10 subcat pages, including some industry membership logos and mentions, all written without any thought at all to SEO or KWs. Added lots of facts and figures, such as dates, employee numbers, etc. FYI, the worst-hit page was in this section, though it wasn't one of the rewritten pages.
3. Improved the calls to action on our money pages, which are in 2 categories, toning down a lot of keyword work. It wasn't excessive at all - maybe 2-4 mentions of the main 3-word KW in 500-1500 words, and another 3-4 related ngrams in the text. We cut this back to 2 KWs, and trimmed the ngrams and 'supporting' KWs from 3 or 4 words to 1 word, i.e. 'great big widgets' to 'widgets'. About 10% also had their titles trimmed back a couple of words, their description metas touched up, and the keywords meta (yes, I know) cut back from 6-7 to 3 core KWs. We did the same to all other pages, including those hit hardest by Panda (this was all we did to those pages). They now read much better; we also trimmed out perhaps a dozen superfluous sentences and broke up some long paragraphs.
4. Lastly, we added an extra navigation option: as well as the main menu, at the bottom of each page there's a 'next' option, and we also added a 'back' option to the previous page in order. Our thinking was that people felt pulled through the site with few choices, and we're effectively doubling their options once they've read the entire page (we're normally at around 3 mins on average, 3 pages and a 55% bounce rate). It seemed to be a point where we were losing visitors. Also, these next and back options were configured to lead visitors up the silo, so after reading all 5 sub-sub-category pages, the last 'next' option leads up to the sub-cat page; after reading all 5 of those, up to the cat page, etc., and it has noticeably boosted the number of people exploring our site more widely. It's also worth noting that we toned down the internal anchor text in the 'next' links, which was very KW-focused, to shorter and sweeter anchors, with the reasoning that after reading a page about widgets it's logical to say 'what about big fat ones' instead of 'big fat widgety widgets'.
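As a footnote to step 1, a common way to point old main-domain URLs at their new subdomain home is a mod_rewrite 301 - a sketch only, with made-up subdomain and folder names (the thread suggests the old URLs here were instead allowed to drop, with internal links updated):

```apache
RewriteEngine On
# 301 every URL under /press/ to the same path on the new subdomain
RewriteRule ^press/(.*)$ http://archive.example.com/press/$1 [R=301,L]
```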
I don't think point 4 actually helped with the recovery, though, as it was rolled out too late and so far only to half of our site (the English version; our site is bilingual), but it might explain the ramp-up in recovery over these past 2 weeks while G figures out how the internal links have changed.
Does anything in my description sound familiar? I hope it helps people out, and gives a bit of encouragement at least!
| 2:02 pm on Sep 5, 2012 (gmt 0)|
Thanks for the update and congratulations.
Points 2 and 3 sound like really good improvements for your users, improving readability and adding credibility with membership logos, etc.
Can you clarify the situation with the PDFs that you moved to a subdomain and noindexed? Does the main domain still link to them from the same places as before?
| 2:13 pm on Sep 5, 2012 (gmt 0)|
Yes, the main domain still links there from the same sub-cat page and 7 sub-sub-cat pages (one for each year); only the links have been updated to point at the subdomain.
They were all linked from the resources category, so not that close to the about us or the 2 money cat pages.
Also, I should mention that points 2 and 3 were rolled out between 20-23 July on the English half of the site - so not that long ago - then 3-5 August for the other half, though it was the English half that got hit the hardest.
| 3:12 pm on Sep 5, 2012 (gmt 0)|
How many clicks would it take to navigate from the pages hardest hit by Panda to the PDFs?
Did you change the position of the links to the PDFs at all (e.g. move them somewhere less prominent, such as lower down the page)?
| 3:26 pm on Sep 5, 2012 (gmt 0)|
The hardest hit page was a sub-cat page in about us, with a couple of child pages (so 2 sub-sub-cat pages). The PDFs were linked from a sub-cat page and 7 sub-sub-cat pages in resources, so 2 clicks or 3 clicks from the hardest hit page to the page with the linked PDFs.
Nothing else at all was changed on the PDF pages.
| 3:59 pm on Sep 5, 2012 (gmt 0)|
Were the sections hardest hit by Panda the ones that generate most of your organic traffic?
Was your content ever scraped? (If so, were the scrapers more prominent around the time you were hit by Panda than they are now?)
| 4:31 pm on Sep 5, 2012 (gmt 0)|
The section hit the hardest generated about 35-40% of our G organic traffic, though scrapers haven't been a major problem for us. There's always the usual SlideShare repost or blog that copies too much, but they don't tend to outrank us - we find Cloudflare seems to help with cutting down on scrapers.
The competition in our serps has mostly been pretty decent, we're not in the kind of market that is dominated by articles or 'how to's, but B2B high cost services.
| 5:29 pm on Sep 5, 2012 (gmt 0)|
My view on the Panda effect is (or was!) that Google hits the sections of the site that are low quality or are close to the low quality stuff. Your worst-hit sections are several clicks away from the PDFs by the sound of it (both before and after relocating them to a subdomain), so not very close really.
The changes you made to the worst hit sections sound too minor and too recent to be the reason for the recovery (but could be nevertheless).
My personal view on the subdomain solution is that if the low quality content is simply moved to a subdomain with no other changes and is just as accessible after moving it to a subdomain as before, it doesn't really change anything from the user's perspective so I wouldn't personally expect it to be a long term Panda solution.
There have been examples of sites temporarily recovering after moving content to subdomains - but only temporarily - so it will be interesting to see if the recovery continues/sticks. I hope it does for your sake... but it will blow much of my understanding of Panda out of the water!