Forum Moderators: phranque
So search engines have indexed two identical pages. Is it possible that some of these pages have been hit by a duplicate content filter because of this?
And how should I prevent the old file from getting indexed? Is a Disallow rule in robots.txt enough?
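For reference, this is the kind of rule I mean (the path `/old-page.html` is just a placeholder for the old file's actual URL):

```
# Block all crawlers from fetching the old page
User-agent: *
Disallow: /old-page.html
```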