It took only a few days for those 600 pages to get indexed (due to a bug when I upgraded Joomla), but you're telling me it can take years to remove them? That's terrible...
In other words, which is best to use: a Disallow in robots.txt, the URL removal tool, or the URL Parameters tool? Or should I use all three at once to give myself the best chance of getting the penalty removed as quickly as possible?
I currently have pages indexed in Google with the following description: "A description for this result is not available because of this site's robots.txt – learn more." Is that because of the Disallow rule I have?
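That message is exactly what Google shows when a Disallow rule stops it from crawling a page it already knows about: the URL stays in the index, but Google cannot fetch the page to build a snippet. A minimal sketch of the kind of rule that produces it (the directory name here is hypothetical, not taken from your site):

```
User-agent: *
Disallow: /old-joomla-pages/
```

Keep in mind that Disallow only blocks crawling; on its own it does not remove a URL that is already indexed.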
There have been several threads on the problem you had with your Joomla upgrade. Have you considered becoming a supporter and putting your site up for review?
Thank you for your answer about the line of code to add, but the issue is that I don't know which directory the problem is coming from, because Googlebot has crawled our FTP in an odd way and created pages that seem random to me.
Googlebot has crawled our FTP in an odd way
Header set X-Robots-Tag "noindex"
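If you can't pin down which directory the stray pages live in, that header can be set from the site root's .htaccess so it applies to everything below it. A hedged sketch, assuming Apache with mod_headers enabled (the IfModule guard just avoids an error if the module is missing):

```
<IfModule mod_headers.c>
    # Send a noindex header on every response served from this directory down
    Header set X-Robots-Tag "noindex"
</IfModule>
```

One caveat worth noting: Googlebot only sees this header if it is allowed to fetch the URLs, so a robots.txt Disallow on the same paths would prevent the noindex from ever being read.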