| 1:00 pm on Jan 16, 2014 (gmt 0)|
How many pages are you talking about and how is it impacting your business?
| 1:15 pm on Jan 16, 2014 (gmt 0)|
Impact: my rankings dropped considerably.
I see 85 pages when I type the site: command, and Webmaster Tools shows 85 too, when my website is only 39 pages.
By the way, the number of pages in Webmaster Tools has been increasing lately, from 67 to 85 over the last month and a half, yet I haven't added any new pages in the last year.
Why is that? Is there any way to figure out which pages Google has in its index that could be hurting me? I have the feeling there are some.
| 1:36 pm on Jan 16, 2014 (gmt 0)|
Google will display the URL of pages blocked by robots.txt. They may even show URLs that do not exist, since they cannot crawl a blocked page to discover that it does not exist. This will not impact rankings.
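For example, a minimal robots.txt along these lines (the paths are hypothetical) blocks crawling but does not by itself remove the URLs from Google's index:

```
User-agent: *
Disallow: /old-page.html
Disallow: /retired-section/
```

Google can still list such URLs in site: results, because the block only prevents crawling, not indexing.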
Are these pages blocked by robots.txt or has Google crawled them?
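One way to check locally which of your URLs a given robots.txt actually blocks is Python's standard urllib.robotparser; the rules and URLs below are hypothetical examples, not taken from this site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; parse() accepts the file's lines directly.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /old-page.html",
])

# A "*" rule applies to Googlebot as well.
print(rp.can_fetch("Googlebot", "http://example.com/old-page.html"))  # False: blocked
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))     # True: crawlable
```

Running each of your 85 indexed URLs through a check like this shows which ones Google is allowed to crawl and which it can only list by URL.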
| 2:01 pm on Jan 16, 2014 (gmt 0)|
That is true; the 2 URLs it is currently showing in omitted results are URLs that I blocked in robots.txt.
The reason I blocked them is that Google crawled those URLs during the bug I had a year ago.
Should I remove the robots.txt block, let Google crawl them, then remove those pages once they appear, and finally re-block them?
| 4:39 pm on Jan 16, 2014 (gmt 0)|
Blocking Google from crawling one page will not lower the rankings of your other pages, with a few rare exceptions. I would look elsewhere to explain the rankings drop instead of spending time on non-essential pages blocked with robots.txt. That is more of a non-vital cosmetic issue.
| 5:25 pm on Jan 16, 2014 (gmt 0)|
I see, but where would you look?
| 7:18 pm on Jan 16, 2014 (gmt 0)|
This will answer your question [webmasterworld.com...]
Another option is to go here [webmasterworld.com...]