Forum Moderators: Robert Charlton & goodroi


Pages removed on 30 Sept 2008 still showing in WMT

         

fsmobilez

12:15 pm on Jan 19, 2009 (gmt 0)

10+ Year Member



I have removed pages from my site using Google Webmaster Tools, following the method you described: first I blocked the directory in the robots file, then submitted a removal request in WMT, and the pages were removed successfully. I did this for two of my sites.
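For reference, the robots rule being discussed is just a Disallow line for the directory in question. A minimal sketch, assuming the blocked directory were called /old-pages/ (the actual path isn't given in the thread):

```
# robots.txt at the site root
User-agent: *
Disallow: /old-pages/
```

A WMT removal request at the time generally only stuck while the URLs stayed blocked by robots.txt, returned 404, or carried a noindex tag.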

Now the problem I'm facing is that on one site I removed the robots rule for the pages that were removed in September, but Google is still showing this in Webmaster Tools:

Not found (1,849)

while on the other site, where the rule is still in the robots file, it shows:

URLs restricted by robots.txt (1,428)

I have two questions.

First, should I add the robots rule on both sites, or remove it on both? Second, how can I get rid of this completely? The removed URLs have no backlinks but are still showing in Google Webmaster Tools.

johnnie

2:18 pm on Jan 19, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Are the pages still showing in the SERPs? Do a site:yourdomain.com query and find out. WMT data is laggy.

fsmobilez

4:27 pm on Jan 19, 2009 (gmt 0)

10+ Year Member



The pages are not showing in the SERPs.

But I want to know whether I should keep the rule in the robots file or remove it. Which would you recommend?

And when will WMT stop displaying these errors?

johnnie

10:12 pm on Jan 19, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



WMT is laggy; give it a few weeks. If the removal request shows as accepted, you're free to remove the robots rule. Remember, though, that other search engines may then index those pages.

fsmobilez

7:11 am on Jan 20, 2009 (gmt 0)

10+ Year Member



Well, I did the same and everything is fine in Google's search results, but WMT is still showing 404 errors for those removed pages.

And my second question: can blocking thousands of URLs with robots.txt cause a problem (i.e., lost traffic from Google)?

Receptional Andy

1:03 pm on Jan 20, 2009 (gmt 0)



can blocking thousands of URLs with robots.txt cause a problem (i.e., lost traffic from Google)?

The only way blocking content with robots exclusion is likely to cause a drop in traffic is if you exclude pages that appear in search results and attract visitors. If you're removing undesirable pages, there should be no negative effect.