I recently moved my site to another host. The reason was to be able to use custom error pages and to change the format of the URLs.
Everything is working fine with regard to the 200 status.
I was told in another thread to block Google from the old format by adding
Disallow: /items to my robots.txt, and I thought that was the right thing to do.
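For reference, the directive needs a colon and has to sit under a User-agent line to be valid. A minimal robots.txt would look like this (assuming the old URLs all live under the /items path):

```
User-agent: *
Disallow: /items
```

Note that a bare "Disallow /items" without the colon is not valid syntax and may be ignored by crawlers.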
Freshbot visited my site and grabbed the index page with all the links on it. All the links on the index page are category links (Ford, BMW, Nissan, etc.); those got indexed by Google, and that's what shows up if you do a search for my keyword.
In the meantime, Google dropped all 400+ pages it had in the old format, since I asked it not to crawl them. Now my traffic from Google is the lowest it has been in some time.
My concern is: did I do the right thing by blocking the crawler this early, or should I have waited until the new URL format got indexed?
Also, since I have been visited by the freshbot crawler, will my site get deep crawled this month?