
Forum Moderators: Robert Charlton & goodroi

Best practices for reviving no-indexed content?

8:29 pm on May 7, 2017 (gmt 0)

Full Member

10+ Year Member

joined:June 4, 2008
posts: 202
votes: 0


I have some content on my site that I noindexed for quality reasons (i.e., Panda). In most cases this content has been noindexed for quite some time, years in many cases. I'm finally at the point where I'd like to revive that content by improving it. I've had good success improving the lower-quality content that I didn't noindex,*** but since this other content has been noindexed for so long, I'm wondering what the best approach would be.

Has anyone revived long dead (noindexed) content with any luck?

I see a few possible approaches:
1) Improve it, remove the noindex tag, keep the same URL
2) Improve it, give it a fresh URL, 404 the old URL
3) Improve it, give it a fresh URL, 301 the old URL to the new one

Other ideas? I'd love to hear whether anyone else has had success (or failure) reintroducing content into the wild, rather than just testing this blindly. Since I've had luck improving our other content, I'm a bit nervous about somehow tipping the scale again and undoing all that good. As of right now, Google referrals are up 25% after the initial improvements (of course, after losing so much traffic to Panda over the years, 25% is a mere drop in the bucket).

*** They were higher-priority pages, so they got the treatment first, and noindexing the other content stopped the downward Panda spiral.
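
For what it's worth, options 2 and 3 are each about a line of server config. A minimal Apache .htaccess sketch, with hypothetical placeholder paths:

```apache
# Option 3: 301 the old URL to the improved page
# (both paths here are made-up examples)
Redirect 301 /old-thin-page.html /new-improved-page.html
```

Option 2 needs no rule at all: once the old file is gone, the server answers 404 by default. Option 1 is purely an on-page change: remove the noindex robots meta tag and leave the URL alone.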
9:21 pm on May 7, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:10466
votes: 1098


I'd treat it as new content (which it should be) and not worry about anything else. Do 410 the old stuff as it's replaced by the new content. No sense playing with old URLs (or playing with fire).
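
If anyone wants to go this route, a 410 is a one-liner in Apache's mod_alias (the path is a hypothetical example):

```apache
# Answer 410 Gone rather than 404, so crawlers know
# the old URL is dead for good, not just missing
Redirect gone /old-thin-page.html
```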
10:33 pm on May 7, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:12913
votes: 893


1) Improve it, remove the noindex tag, keep the same URL

URLs, even noindexed ones, never go away. There are links somewhere to these files, and you might as well keep the juice. There are also fewer obstacles if you continue to use the same URL.
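
If you take this approach, it's worth confirming the noindex directive is actually gone from each improved page before expecting re-indexing. A minimal standard-library Python sketch (the sample markup is a made-up example; note this only inspects the meta tag):

```python
# Minimal sketch: check whether a page's HTML still carries a robots
# "noindex" meta tag, so you can confirm it was removed before expecting
# the URL to be re-indexed. Standard library only.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())

def has_noindex(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

blocked = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
improved = '<html><head><title>Improved page</title></head></html>'
print(has_noindex(blocked))   # True: still blocked from indexing
print(has_noindex(improved))  # False: no noindex meta tag present
```

Keep in mind a noindex can also be sent as an X-Robots-Tag HTTP response header, which this doesn't check; you'd have to inspect the response headers separately.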
11:57 pm on May 7, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:10466
votes: 1098


New URLs will help avoid old Panda problems. Take your pick and see what happens. :)
12:08 am on May 8, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:12913
votes: 893


New URLs will help avoid old Panda problems

However, if the URLs were noindexed, they don't have "old Panda problems," and many of those Panda restrictions have since been reversed by the last several algorithm updates anyway (I removed my disallow file without any effect). Also, getcooking has said they are aware of the old Panda issues and that these pages will be "improved." Regardless, IMO it's always better to use the existing URLs unless there is a verified reason not to. That's what I would do.
12:14 am on May 8, 2017 (gmt 0)

Moderator

WebmasterWorld Administrator buckworks is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 9, 2001
posts:5836
votes: 161


My choice would be

>> 1) Improve it, remove the noindex tag, keep the same URL

Take things gradually and do a few pages at a time. If the quality is significantly improved, Google should respond just fine.
12:44 am on May 8, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:10466
votes: 1098


The reason for the noindex (according to the OP) was Panda, so yes, the problem remains. That said, it's six of one, half a dozen of the other as to reusing URLs that likely have few to no backlinks (thin content, who'd link to that?). Sometimes it's best to cut losses and start over.
1:48 am on May 9, 2017 (gmt 0)

Full Member

10+ Year Member

joined:June 4, 2008
posts: 202
votes: 0


Thanks, everyone, for the speculation and reasoning behind your suggestions. I guess I'll just run some tests and see what happens! I was hoping to follow in the footsteps of someone who'd tried one of these approaches (successfully, that is!), but I realize each case is different. If I come up with a working strategy, I'll definitely report back!
1:01 pm on May 9, 2017 (gmt 0)

Preferred Member

5+ Year Member

joined:Mar 22, 2011
posts:451
votes: 7


1) Improve it, remove the noindex tag, keep the same URL

My vote. That seems to work for me.