

Should I "disallow" in robots.txt when site is down?

     
12:10 pm on Dec 18, 2004 (gmt 0)
Junior Member (10+ Year Member), joined Dec 17, 2004, posts: 110, votes: 0


My site was recently down for over 24 hours due to database problems.

I have G, Yahoo and MSN bots coming back 300+ times a day each.

Is it better to allow the bots to see blank pages / 404 errors or to temporarily "disallow" them via robots.txt?
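The temporary block I have in mind would just be a blanket disallow in robots.txt until the database is back up, something like:

User-agent: *
Disallow: /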

12:26 pm on Dec 18, 2004 (gmt 0)
Senior Member (WebmasterWorld Senior Member, 10+ Year Member), joined Mar 29, 2002, posts: 1954, votes: 0

Make a custom 404 page that resembles your homepage, with a nofollow meta tag. There isn't any good solution if your site is down, period. If it's your host's fault, move your hosting. If it's the script's fault, change scripts. If it's user error running the script, stop doing that ;)
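Something along those lines is easy enough to set up (assuming an Apache host; the file name here is just an example). In .htaccess:

ErrorDocument 404 /error404.html

and in error404.html itself, a stripped-down copy of the homepage with:

<meta name="robots" content="nofollow">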

8:33 am on Dec 19, 2004 (gmt 0)
Junior Member (10+ Year Member), joined Dec 17, 2004, posts: 110, votes: 0

Thanks for the response.

Unfortunately it was self-induced, a little programming hubris.

I've written myself a reprimand and put myself on probation for 6 weeks.

3:37 pm on Dec 20, 2004 (gmt 0)
Junior Member (10+ Year Member), joined Feb 18, 2004, posts: 81, votes: 0

I'd leave it. Google is used to seeing sites suffer code errors on its visits. Robot instructions that change all the time are less useful to it: Google may only look at robots.txt every week or so, and at meta instructions a little more often.
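If you want to see how often the bots actually refetch robots.txt, your access log will tell you. Assuming a standard Apache access log (the path is just a guess, adjust for your setup):

grep "robots.txt" /var/log/apache2/access.log | grep -i googlebot

The timestamps on the matching lines show when Googlebot fetched the file.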
 
