Forum Moderators: Robert Charlton & goodroi
I want to know whether blocking undesirable URLs of a dynamic site with robots.txt can drop your traffic from thousands of visits to nothing.
Last week I blocked some of my site's URLs with a robots.txt rule, and in Webmaster Tools the count of URLs restricted by robots climbed to 6000+. Two days after doing this, Google stopped sending traffic to my site. I removed that robots rule 48 hours ago, and now my site is back in Google.
I want to know what to do, because I still want to block the undesirable URLs on my site and remove them all individually using WMT, but I'm very worried: I can't be sure, but I'm about 90% convinced that Google stopped sending me traffic because of the robots rule.
When I typed
site:www.example.com
none of those URLs appeared in Google,
and yes, after I added the robots rules I checked the most common keywords for my site: all my URLs had either disappeared or been pushed to the last pages of the results.
Now I'm wondering how to get rid of these undesirable URLs, as I'm afraid of adding robots rules again.
And are you sure robots.txt can't be the reason for losing rank in Google?
The old URLs were like this:
www.example.com/mypost/postid=1&string=1
and the optimized URLs are like this:
www.example.com/mypost/postid=1
I added only 200 of the optimized URLs.
And yes, I added a robots rule like this in robots.txt:
--------------
User-Agent: *
Disallow: /*string*
--------------
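For what it's worth, here is a rough, unofficial sketch of how Googlebot-style wildcard matching treats that rule (the `googlebot_style_match` helper is my own illustration, not anything from Google): `*` matches any run of characters, so `Disallow: /*string*` blocks every URL whose path contains `string` anywhere.

```python
import re

def googlebot_style_match(pattern: str, path: str) -> bool:
    """Rough sketch of Googlebot-style wildcard matching for a
    robots.txt Disallow pattern: '*' matches any run of characters,
    and a trailing '$' anchors the pattern at the end of the URL."""
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    # Translate the robots pattern into a regex, escaping everything
    # except the '*' wildcard.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    regex = "^" + regex + ("$" if anchored else "")
    return re.match(regex, path) is not None

rule = "/*string*"  # the rule from the post above

print(googlebot_style_match(rule, "/mypost/postid=1&string=1"))  # blocked
print(googlebot_style_match(rule, "/mypost/postid=1"))           # not blocked
```

So the rule itself did exactly what it says: it blocked every old URL at once, which lines up with the 6000+ restricted URLs you saw in Webmaster Tools.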
Is there anything I have done wrong, and should I try it again?
Then, although a browser that fetches the old page gets redirected to the preferred URL www.example.com/mypost/postid=1, Google will never find the page.
This is because Google is unable to pass through the intermediate step: it will not load, or even attempt to load, any URL that is disallowed by robots.txt.
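Here is a minimal sketch of that behaviour, assuming a polite crawler that consults robots.txt before every fetch (the `REDIRECTS` table and the helper names are made up for illustration):

```python
# Hypothetical site: the old dynamic URL 301-redirects to the clean one.
REDIRECTS = {
    "/mypost/postid=1&string=1": "/mypost/postid=1",
}

def disallowed(path: str) -> bool:
    # Stand-in for the rule "Disallow: /*string*": any path
    # containing "string" is off-limits to the crawler.
    return "string" in path

def crawl(path: str) -> str:
    """Polite crawler: checks robots.txt BEFORE fetching, so a
    disallowed URL's redirect is never even seen."""
    if disallowed(path):
        return f"{path}: skipped by robots.txt; 301 never seen"
    if path in REDIRECTS:
        return f"{path}: 301 -> {REDIRECTS[path]} (followed)"
    return f"{path}: fetched"

print(crawl("/mypost/postid=1&string=1"))
print(crawl("/mypost/postid=1"))
```

That is why a 301 to the clean URL only works if the old URL stays crawlable: once it is disallowed, the redirect is invisible to Google.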