Forum Moderators: Robert Charlton & goodroi

Can blocking undesirable URLs with robots.txt affect Google search rank?

         

fsmobilez

11:13 am on Jan 23, 2009 (gmt 0)

10+ Year Member



Greetings

I want to know: can blocking undesirable URLs of a dynamic site with robots.txt drop your traffic from thousands to nothing?

Last week I blocked some of my site's URLs with a robots.txt rule. In Webmaster Tools, the count of URLs restricted by robots.txt reached 6,000+, and after two days Google stopped sending traffic to my site. When I removed that robots.txt rule 48 hours ago, my site came back in Google.

I want to know what to do, as I want to block the undesirable URLs of my site and then remove them all individually via Webmaster Tools. But I'm very worried because, although I'm not certain, I strongly suspect Google stopped sending me traffic because of the robots.txt rule.

tedster

2:50 pm on Jan 23, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It sounds like your rule blocked URLs that were currently in the search results and bringing you traffic.

fsmobilez

2:59 pm on Jan 23, 2009 (gmt 0)

10+ Year Member



No, they are definitely not in Google's search results; they only appear after using a special operator in Google search.

When I typed
site:www.example.com

none of those URLs appeared in Google.

And yes, after I added the robots.txt rules, I checked my site's most common keywords: all my URLs had disappeared or been pushed to the last pages.

Now I'm worried about how to remove these undesirable URLs, as I'm afraid of adding robots.txt rules again.

And are you sure robots.txt can't be the reason for losing rank in Google?

Shaddows

3:46 pm on Jan 23, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Definitely not. Almost every corporate site in the world would be unable to rank if that were the case.

I would suggest you got your syntax wrong and inadvertently blocked pages (or paths to pages) that were in the index.

fsmobilez

4:24 pm on Jan 23, 2009 (gmt 0)

10+ Year Member



Well, those URLs were in Google in large numbers, and I'm not adding the re-optimized links because I'm afraid Google will penalize me for duplicate URLs.

The old URLs were like this:

www.example.com/mypost/postid=1&string=1

and the optimized URLs are like this:
www.example.com/mypost/postid=1

I added only 200 of the optimized URLs.

And yes, I added a rule like this in robots.txt:

--------------
User-Agent: *
Disallow: /*string*

--------------

Is there anything I have done wrong, and should I go for it again?
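As a side note for anyone checking a rule like this: Google treats `*` in a Disallow value as a wildcard and a trailing `$` as an end anchor, but Python's built-in `urllib.robotparser` does not understand those wildcards, so a quick hand-rolled matcher is an easy way to sanity-check what a rule blocks. This is a rough sketch of Google-style matching, using the URLs quoted in this thread as sample paths:

```python
import re

def rule_matches(disallow_path: str, url_path: str) -> bool:
    """Approximate Google's wildcard matching for a Disallow value:
    '*' matches any run of characters, a trailing '$' anchors the end."""
    pattern = re.escape(disallow_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"   # honour an explicit end anchor
    return re.match(pattern, url_path) is not None

rule = "/*string*"  # the Disallow value from the robots.txt above

print(rule_matches(rule, "/mypost/postid=1&string=1"))  # True  -> blocked
print(rule_matches(rule, "/mypost/postid=1"))           # False -> still crawlable
```

So the rule as written only blocks the old duplicate form of the URLs, not the optimized ones.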

Shaddows

5:00 pm on Jan 23, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Possibly.

If clicking a link tries to take you to
www.example.com/mypost/postid=1&string=1

where a rewrite takes you to
www.example.com/mypost/postid=1

Google will never get there.

fsmobilez

8:53 am on Jan 24, 2009 (gmt 0)

10+ Year Member



Shaddows, I really don't get what you mean.

Shaddows

8:47 am on Jan 26, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It depends on what your internal links look like. If a user action tries to take them to a URL structured like this:
www.example.com/mypost/postid=1&string=1

then, when the browser tries to fetch the page, it gets redirected to the preferred URL www.example.com/mypost/postid=1, and Google will not find the page.

This is because Google is unable to pass through the intermediate step: it will not load, or even attempt to load, any URL that is disallowed by robots.txt.
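The usual fix for this duplicate-URL situation is the opposite of blocking: leave the old URLs crawlable and 301-redirect them to the clean form, so Googlebot can follow the redirect and consolidate the old URL onto the new one. A minimal sketch (assuming, as in the URLs quoted above, that the extra `&string=` part lives in the path; this is illustrative, not a drop-in for any particular server):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def canonical_path(path: str) -> str:
    """Drop the duplicate '&string=...' segment:
    /mypost/postid=1&string=1 -> /mypost/postid=1"""
    return "&".join(p for p in path.split("&") if not p.startswith("string="))

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        clean = canonical_path(self.path)
        if clean != self.path:
            # Permanent redirect: Googlebot follows this and indexes
            # the clean URL instead -- but only if robots.txt does NOT
            # disallow the old URL.
            self.send_response(301)
            self.send_header("Location", clean)
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"page content")

# To run locally: HTTPServer(("", 8000), RedirectHandler).serve_forever()
```

The crucial point is the combination: a 301 without a robots.txt block. If `Disallow: /*string*` is in place at the same time, Googlebot never requests the old URL and never sees the redirect.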

fsmobilez

11:52 am on Jan 26, 2009 (gmt 0)

10+ Year Member



Oh, OK.

But I don't want Google to crawl these kinds of URLs in the future, and for the already-crawled URLs I will submit individual removal requests in Webmaster Tools (after first applying the robots.txt rule) to remove them from Google. Can I do this without losing traffic?