
URLs blocked by robots.txt in Google Webmaster Tools

   
11:06 am on Mar 10, 2014 (gmt 0)



Hello Friends

Today I checked my GWT account and saw that 42 URLs are blocked by robots.txt. How do I solve this problem?

Can anyone suggest a fix?

Thanks
3:49 pm on Mar 10, 2014 (gmt 0)

WebmasterWorld Senior Member lucy24



You haven't explained the problem. Not everything labeled an "error" in GWT is actually an error.

Are you trying to find out which 42 they are? Good luck: I don't think I've ever found where they hide the list.

Only you can say how many URLs are supposed to be blocked. Is it more than 42? Less? Have they been crawling pages they're not supposed to, and/or skipping pages that should be crawled?

Surely it would be more worrying if they listed no blocked pages. Everyone has some private areas, after all.
9:00 pm on Mar 10, 2014 (gmt 0)

WebmasterWorld Administrator



What have you blocked in robots.txt? GWT often complains about files that you have purposely blocked, so if you have a directory that is off limits, the blocked files may be in there. Or you may have blocked a file extension like .css or .gif, and those are files you intended to block. Not all of the notifications in GWT are very useful, and like lucy24 I have not found where they list the blocked URLs so that you can verify whether you meant to block them.
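
If it helps to sanity-check that the blocked URLs are ones you meant to block, here is a minimal sketch using Python's urllib.robotparser. The Disallow rules and the example URLs below are hypothetical; substitute your own rules and a few of the URLs GWT is reporting:

    from urllib import robotparser

    # Hypothetical robots.txt rules -- replace with the ones from your own site.
    rules = """
    User-agent: *
    Disallow: /private/
    Disallow: /css/
    """.splitlines()

    rp = robotparser.RobotFileParser()
    rp.parse(rules)

    # URLs under a disallowed directory should report as blocked -- that is expected.
    for url in ("http://www.example.com/private/report.html",
                "http://www.example.com/css/site.css",
                "http://www.example.com/index.html"):
        print(url, "-> allowed" if rp.can_fetch("Googlebot", url) else "-> blocked")

If every URL that comes back "blocked" is one you intended to keep crawlers out of, the GWT notice is just noise. Note that the standard library parser only matches path prefixes, so Googlebot-specific wildcard patterns would need to be checked separately.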

They complain to me about getting 403s, and the URL shown is one that is clearly blocked in robots.txt because they are not permitted to crawl that type of URL. Now, how did they get a 403 if they are reading and complying with robots.txt? Beats me.
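
For what it's worth, one way to double-check a URL GWT complains about is to test it against your live robots.txt the same way a well-behaved crawler would. A rough sketch, again with a made-up domain and URL:

    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")  # your real robots.txt location
    rp.read()

    # The URL GWT reported -- hypothetical here.
    url = "http://www.example.com/search?sort=price"
    if rp.can_fetch("Googlebot", url):
        print("robots.txt allows Googlebot to fetch this URL")
    else:
        print("robots.txt blocks this URL, so a compliant crawler should never request it")

If the second branch prints and GWT is still reporting a 403 for that URL, the request that triggered the error most likely did not come through a robots.txt-compliant fetch.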
 
