Should I "disallow" in robots.txt when site is down?

     

classifieds

12:10 pm on Dec 18, 2004 (gmt 0)

10+ Year Member



My site was recently down for over 24 hours due to database problems.

I have G, Yahoo and MSN bots coming back 300+ times a day each.

Is it better to allow the bots to see blank pages / 404 errors or to temporarily "disallow" them via robots.txt?
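
For reference, a blanket "stay out" robots.txt is only two lines; the sketch below is the usual form, served from the site root as /robots.txt:

    User-agent: *
    Disallow: /

Deleting the file, or emptying the Disallow value, opens the site back up to crawling once the database is fixed.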

The Contractor

12:26 pm on Dec 18, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Make a custom 404 that resembles your homepage and carries a nofollow meta tag. There isn't any good solution if your site is down - period. If it's your host's fault - move your hosting. If it's the script's fault - change scripts. If it's user error running the script - stop doing that ;)
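
To make that concrete, a minimal sketch, assuming an Apache host; the /404.html filename and the noindex value are assumptions, the nofollow comes from the suggestion above. In .htaccess:

    ErrorDocument 404 /404.html

And the page itself, a stripped-down copy of the homepage layout:

    <html>
    <head>
      <title>Temporarily unavailable</title>
      <!-- keep the error copy out of the index and stop link-following -->
      <meta name="robots" content="noindex,nofollow">
    </head>
    <body>
      <!-- same look as the homepage, plus a short "back soon" notice -->
    </body>
    </html>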

classifieds

8:33 am on Dec 19, 2004 (gmt 0)

10+ Year Member



Thanks for the response.

Unfortunately it was self-induced - a little programming hubris.

I've written myself a reprimand and put myself on probation for 6 weeks.

Wail

3:37 pm on Dec 20, 2004 (gmt 0)

10+ Year Member



I'd leave it. Google is used to seeing sites suffer code errors when it visits. Robot instructions that change all the time are of less use to it: Google may only fetch robots.txt every week or so, and meta instructions only a little more often.
 
