vphoner - 2:25 am on Apr 17, 2011 (gmt 0)
I have often swapped out affiliate links. When I deleted links from the redirect file, Google couldn't discover the 404s (the folder was blocked in robots.txt) and continued to index them. Over the years, dozens accumulated. At the time of Panda, 30 redirect links were indexed in Google, and 90% of them had been 404 for over a year without Google knowing it. I have since removed the robots.txt block, and the dead redirect links have been dropped from the index. Only 2 valid links remain, but I am trying to figure out the best way to handle this in the future. If I were to add the block back into robots.txt, those links would simply reappear in the index, because Googlebot keeps requesting them from memory: if it can't see the 404, it will reindex them.
It may be best to create a new folder, block it in robots.txt first, then change your code to look in this new folder for your redirect file. Then delete your old files and let the search engines crawl the old folder, where the requests will 404 and the URLs will be removed from the index. Would this work?
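The folder-move scheme above might look like this in robots.txt, assuming a hypothetical /go/ directory for the new redirect script (the old, unblocked /out/ folder is left out so its dead URLs can return 404 and drop from the index):

```
# Block crawling of the NEW redirect folder only.
# The old folder (e.g. /out/) is deliberately not listed,
# so spiders can see the 404s there and deindex those URLs.
User-agent: *
Disallow: /go/
```

One caveat worth noting: Disallow only stops crawling, not indexing, so URLs under /go/ can still appear in the index if other pages link to them.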
Can a noindex, nofollow be added to a PHP redirect file, above the script?
I tried it, but it did not work: the redirect stopped working after I added the noindex, nofollow at the top of the file. Does anyone know how to do this?
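One likely reason the attempt broke the redirect: a meta robots tag is HTML output, and PHP cannot send a Location header after any output has been sent ("headers already sent"). The HTTP equivalent, the X-Robots-Tag header, can be sent alongside the redirect instead. A minimal sketch, with a hypothetical target URL, not a confirmed fix for your setup:

```php
<?php
// redirect.php (sketch): deindex the redirect URL itself while still
// redirecting. X-Robots-Tag carries "noindex, nofollow" as an HTTP
// header, so nothing is printed before header('Location: ...') is sent.

// Return the headers a redirect script should emit, in order.
function redirect_headers(string $target): array {
    return [
        'X-Robots-Tag: noindex, nofollow',
        'Location: ' . $target,
    ];
}

if (PHP_SAPI !== 'cli') {  // only send headers when run under a web server
    // Hypothetical affiliate target.
    foreach (redirect_headers('http://www.example.com/offer') as $h) {
        header($h);
    }
    http_response_code(302);
    exit;
}
```

The key point is ordering: all header() calls must come before any echo, print, or stray whitespace outside the PHP tags, which is why putting a meta tag "at the top of the file" kills the redirect.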