Forum Moderators: goodroi

Robots.txt google webmasters blogger blogspot help

4:22 am on Feb 15, 2010 (gmt 0)

New User

5+ Year Member

joined:Feb 15, 2010
votes: 0

Hi everyone. I'm getting 404 Not Found errors. How can I prevent Google from crawling those URLs using robots.txt in Google Webmasters? My blogging software is Blogger. Thanks.
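Is it something like the following? (The paths here are just placeholders, not my actual URLs.)

```
# Hypothetical example: block Googlebot from two dead URLs
User-agent: Googlebot
Disallow: /2009/05/old-post.html
Disallow: /archive/broken-page.html
```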
12:17 pm on Feb 22, 2010 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 21, 2004
votes: 225

You could add a meta robots tag to your custom 404 page that blocks the search engines, but I would not suggest that. It would be much better to fix the 404 pages than to simply keep the search engines from indexing them.
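If you did go the meta tag route anyway, it would be a one-line addition inside the 404 template's head. This is a generic HTML sketch, not Blogger-specific template markup:

```html
<!-- Hypothetical 404 template: tell search engines not to index this page -->
<head>
  <meta name="robots" content="noindex">
</head>
```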

Generally, if a 404 page is being indexed by the search engines, it is because there is link popularity pointing to it. I personally do not like to block the flow of link popularity. You could upload content to fix the page, or just redirect it.
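To preserve that link popularity, a 301 (permanent) redirect is the usual tool. Blogger does not give you access to server configuration, but on a self-hosted Apache site the idea looks like this sketch (both URLs are hypothetical):

```
# Hypothetical .htaccess rule: permanently redirect a dead URL to a live page
Redirect 301 /old-post.html http://example.com/new-post.html
```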
