member22 - 8:45 am on Aug 30, 2013 (gmt 0) [edited by: ergophobe at 2:07 pm (utc) on Aug 30, 2013]
If a page is already indexed, the last thing you ever want to do is block robots from re-crawling it. If they can't crawl it, they will never see a noindex directive.
Since I put the Disallow in place, Googlebot has been blocked from the pages, but that is not an issue because there is no noindex directive on my pages anyway (I cannot add one to each page because I don't have a list of the pages to block; that is my problem!)
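One possible way around not having a page list: if the duplicate URLs share any predictable pattern, the server can send a noindex header by pattern instead of editing each page. This is only a sketch for Apache .htaccess, assuming mod_setenvif and mod_headers are available, and the path pattern below is purely a placeholder you would have to replace with whatever your duplicates actually look like. Note the pages must stay crawlable (no Disallow) or Googlebot will never see the header:

```apache
# Sketch only - "/component/" is a hypothetical pattern, not your real one
<IfModule mod_setenvif.c>
  # Flag requests whose path matches the duplicate pattern
  SetEnvIf Request_URI "^/component/" NOINDEX_PAGE
</IfModule>
<IfModule mod_headers.c>
  # Send noindex via HTTP header for flagged requests;
  # Google must be able to CRAWL these URLs to see this header
  Header set X-Robots-Tag "noindex, follow" env=NOINDEX_PAGE
</IfModule>
```

The X-Robots-Tag header works like a meta noindex tag but doesn't require touching any page templates, so it can cover URLs you can't enumerate, as long as they match some pattern.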
#1 Duplicate pages that are being created by the CMS.
This is not happening anymore because I reinstalled 1.5, and I will never use 2.5 until Joomla figures out a way to create static web addresses (hopefully in 3.5).
#2 Same: it is impossible to redirect the pages, because how can I know that a page with the number 1 in it needs to be redirected to a certain page B? Maybe the next page with a number 1 in it needs to be redirected to page C, so unless I misunderstood, this doesn't seem to be a possibility.
#3 Same: I can't guarantee that pages B through G will need to redirect to URL A, as I don't have a list of the pages to redirect... Google has those pages in its index but is hiding them from me!
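For what it's worth, a pattern-based 301 can sometimes handle this without a list, but only if the duplicates differ from the canonical URLs in a predictable way. A rough .htaccess sketch assuming mod_rewrite, using a leftover Joomla "Itemid" query parameter purely as a hypothetical example of such a pattern:

```apache
# Sketch only - assumes your duplicates carry a stray "Itemid" parameter,
# which may not be true for your site
<IfModule mod_rewrite.c>
  RewriteEngine On
  # Match any URL whose query string contains Itemid=
  RewriteCond %{QUERY_STRING} (^|&)Itemid= [NC]
  # 301 to the same path with the query string stripped
  # (the trailing "?" discards the query string)
  RewriteRule ^(.*)$ /$1? [R=301,L]
</IfModule>
```

If no such pattern exists, this approach doesn't apply, and the crawl-then-noindex route is probably the only option.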