
Forum Moderators: goodroi


Robots.txt, Google Webmasters, Blogger/Blogspot help

   
4:22 am on Feb 15, 2010 (gmt 0)

5+ Year Member



Hi everyone. I'm getting 404 Not Found errors. How can I prevent Google from crawling those URLs using robots.txt in Google Webmasters? My blogging software is Blogger. Thanks.
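For context, robots.txt rules match URLs by path prefix, one Disallow line per path. A minimal sketch — the paths below are placeholders, not from this thread; substitute the actual URLs listed in the Crawl Errors report in Google Webmaster Tools:

```txt
# robots.txt — must be served from the site root
User-agent: Googlebot
# Each Disallow blocks crawling of URLs beginning with this path.
# Example placeholder paths only:
Disallow: /2009/old-post.html
Disallow: /search/label/broken-label
```

Two caveats: Blogger generates the blog's robots.txt itself, so whether you can add custom rules depends on what Blogger exposes in its settings; and Disallow only stops crawling — it does not by itself remove URLs that are already indexed.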
12:17 pm on Feb 22, 2010 (gmt 0)

WebmasterWorld Administrator goodroi



You could add a meta robots tag to your custom 404 page that blocks the search engines, but I would not suggest that. It would be much better to fix the 404 pages than to keep the search engines from indexing them.
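The meta robots approach mentioned (and advised against) above would look like this in the 404 template — a sketch only:

```html
<!-- Placed in the <head> of the custom 404 page template -->
<!-- noindex: keep the page out of the index; follow: still let link value flow -->
<meta name="robots" content="noindex, follow">
```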

Generally, if a 404 page is being indexed by the search engines, it is because link popularity is pointing to it. I personally do not like to block the flow of link popularity. You could upload content to fix the page, or just redirect it.
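On a server you control, redirecting one of these URLs can be a single permanent-redirect rule; the paths below are hypothetical, and a Blogger-hosted blog does not expose server configuration like this, so there you would be limited to whatever redirect options Blogger itself provides:

```apache
# Apache .htaccess — a 301 (permanent) redirect preserves most link popularity
Redirect 301 /old-broken-page.html http://www.example.com/replacement-page.html
```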