Forum Moderators: open
The PR has not changed; it is still 4.
Each day Googlebot visits the 30 pages on the first subdirectory level, but fewer than 10 pages from the deeper subdirectory levels. So I am afraid that Google will never list the 2000 pages in [domain.tld...] Only fewer than 20 pages are listed.
Should I wait?
Or should I use a robots.txt to tell Google to delete all of the [domain.tld...]
so that I get a second chance to have these pages spidered in [domain.tld...]?
I am afraid that, because of the 301 redirection, I cannot have Google selectively delete only the old URLs (without the www subdomain) [domain.tld;...]
I am only able to delete the content of the pages.
Should I wait or delete by robots.txt?
Thanks a lot, Maggy
You can use an online HTTP header sniffer to verify that the code is really served as a 301. If it is correct, then you can delete all of the old content immediately; it will never be served to any robot or user anyway, will it? They are all redirected.
If the status code is actually a 302, you need to correct it to a 301. If the status code is a 200, then no redirect is set up at all.
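If you would rather not rely on an online sniffer, the same status-code check can be scripted in a few lines of Python. This is only an illustrative sketch: it starts a throwaway local server that answers every request with a 301 (standing in for the old non-www host; the `www.example.com` hostname in the Location header is a made-up example, not Maggy's domain), then fetches a URL without following the redirect so the raw status code and Location header are visible.

```python
import http.server
import threading
import urllib.request
import urllib.error

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    # Stand-in for the old non-www host: answer every GET with a
    # permanent redirect to the www hostname (hypothetical example).
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", "http://www.example.com" + self.path)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Bind to port 0 so the OS picks a free port.
server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Act as a "header sniffer": fetch the URL but refuse to follow
# the redirect, so we see the original status code, not the target.
class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # returning None makes urllib raise the 3xx as HTTPError

opener = urllib.request.build_opener(NoRedirect)
try:
    opener.open(f"http://127.0.0.1:{port}/page.html")
    status, location = 200, None  # no redirect happened at all
except urllib.error.HTTPError as e:
    status, location = e.code, e.headers.get("Location")

print(status, location)
server.shutdown()
```

Against the real site you would point the opener at the old non-www URL instead of the local test server: a 301 with a Location header pointing at the www version is what you want to see; a 302 or a plain 200 means the redirect needs fixing, exactly as described above.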