Pjman - 3:52 pm on Nov 26, 2012 (gmt 0)
I have a number of data-only PDFs that users find highly valuable on a few of my sites. All of those sites were hit by Panda 1. The only real low-quality content on those sites is these data-only (numbers) PDFs, thousands of them.
Would a robots.txt disallow of the directory of PDFs re-establish quality?
With HTML I usually just "noindex" and robots.txt disallow, but PDFs don't allow for a meta tag, so that option is out.
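For what it's worth, the directory-level disallow I'm asking about would look something like this (assuming the PDFs all live under a single directory; "/pdfs/" here is just a placeholder for whatever the actual path is):

```
# robots.txt at the site root
User-agent: *
# Block crawling of the entire data-PDF directory
Disallow: /pdfs/
```

Note this only blocks crawling; URLs already known to Google could still appear in the index without a snippet.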
Any ideas guys?