Supriya21 - 2:47 pm on Aug 31, 2011 (gmt 0)
Our rankings are back. It turned out a different issue had caused the rankings drop.
Nevertheless, we still need to remove this duplicate content. It's user-generated content, so we don't have much control over it.
I configured "URL parameters" within Webmaster Tools to let Google know about the dynamic parameters that are causing duplicate page titles. It's been 3 weeks and Google seems to be ignoring this.
If we add entries in robots.txt to block these duplicate URLs, will they be removed from the Google index?
As per my understanding, robots.txt prevents crawling of the URLs, but if there are links pointing to those URLs, they cannot be removed from the Google index by robots.txt alone.
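For reference, a rule like the one I'm considering would look something like this (the `sort` parameter name here is just a made-up example, not our actual parameter):

```
User-agent: *
Disallow: /*?*sort=
```

Google does support the `*` wildcard in Disallow rules, but as noted above, this only stops crawling; a blocked URL that has inbound links can still appear in the index (typically as a URL-only listing with no snippet).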
Would love to hear your thoughts!