
General Search Engine Marketing Issues Forum

Should I "disallow" in robots.txt when site is down?

 12:10 pm on Dec 18, 2004 (gmt 0)

My site was recently down for over 24 hours due to database problems.

I have G, Yahoo and MSN bots coming back 300+ times a day each.

Is it better to allow the bots to see blank pages / 404 errors or to temporarily "disallow" them via robots.txt?
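For reference, a blanket temporary block like the one being asked about would look like this (a minimal sketch; the actual site's paths are unknown, so this just disallows everything for all crawlers):

```
# Hypothetical temporary robots.txt while the site is down
User-agent: *
Disallow: /
```

Note that crawlers cache robots.txt, so a temporary change may not be seen until after the outage is over.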


The Contractor

 12:26 pm on Dec 18, 2004 (gmt 0)

Make a custom 404 page that resembles your homepage, with a nofollow robots meta tag. There isn't any good solution if your site is down - period. If it's your host's fault - move your hosting. If it's the script's fault - change scripts. If it's user error running the script - stop doing that ;)
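The suggestion above could be sketched roughly like this (an illustrative fragment, not the poster's actual page; the layout and title are placeholders):

```html
<!-- Hypothetical custom 404 page styled like the homepage.
     The robots meta tag tells bots not to index this page
     or follow its links while the site is broken. -->
<html>
<head>
  <title>Example Site</title>
  <meta name="robots" content="noindex,nofollow">
</head>
<body>
  <!-- homepage-style header, navigation, and message here -->
</body>
</html>
```

On Apache this page could be wired up with something like `ErrorDocument 404 /custom404.html`, served with a genuine 404 status so bots don't index the error page itself.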


 8:33 am on Dec 19, 2004 (gmt 0)

Thanks for the response.

Unfortunately it was self-induced - a little programming hubris.

I've written myself a reprimand and put myself on probation for 6 weeks.


 3:37 pm on Dec 20, 2004 (gmt 0)

I'd leave it. Google is used to seeing sites suffer code errors on its visits. Robots instructions that change all the time are of less use to it: Google may only fetch robots.txt every week or so, and re-read meta instructions a little more often.
