
Am I doomed if my site was offline during crawl

Hosting billing foul-up.


rfgdxm1

10:55 pm on Mar 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Due to my host having a billing screw-up, they took my sites offline in the worst possible way for about two and a half hours. Rather than the sites just going dead, which an SE bot would likely treat as a server temporarily down and retry later, they redirected everything to an error page telling visitors to contact billing. Google's freshbot came by and grabbed a few pages, getting that redirect to the error page, which means Google will think *that* is the new content of my sites. :( I know that if this had been the deep crawler I would be doomed. However, my site was deep crawled while it was up, and all went well. What will the effect of this be with freshbot? Please say I'm not doomed, GoogleGuy? I notice freshbot still hasn't written my sites off, as it has grabbed a page since the site came back online.

Powdork

11:07 pm on Mar 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I would think that at worst the pages that got the error message might suffer a drop in the rankings. There are several factors that might come into play.
1. FreshBot visits don't always amount to anything, even with new content. In other words, a visit doesn't always result in fresh tags or reflect changes in content. I believe this is pretty consistent per page, though: if a page usually shows fresh tags, then new content is always indexed after a visit.
2. Does the error page send proper 404 headers?

There are probably more factors that I can't think of or don't know about. I wouldn't worry, though. It will be interesting to see. If there is a problem, check how the pages rank for the error message's content.

Chile

11:08 pm on Mar 17, 2003 (gmt 0)

10+ Year Member



I had the same thing happen for about a day on one site where I was assigning permissions to certain folders. I made a small error, and the main page was down temporarily. Google had already come and gone, so I figured "no harm, no foul". A few hours later, I checked the site description and, sure enough, the bot had picked up my error message as the site description.

That happened about two days ago. Now the description is back, and everything is running smoothly.

I think you should be fine after the next visit.

rfgdxm1

11:17 pm on Mar 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>2. Does the error page send proper 404 headers?

No. The server was sending back a 302 and redirecting to the error page. I know this would be a disaster if it were the deep crawler. I actually had my site vanish from Google for a month once due to a server glitch. The host was reinstalling software, and for some reason a standard server page was showing for my home page. Google's deepbot came by at just that moment, thought that was all that was on my site, and left. :( What I assume will happen here is that just these few pages may not do well because of the changes, but when freshbot comes back it will just think I changed content yet again. Thus these few pages may not show up in searches for a few days, but the rest of the pages on the site, where Googlebot didn't get this error, will be fine.
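[Editor's note: a quick way to see exactly what status a server is sending is to read the raw response without following redirects, the way a browser would. This is a minimal sketch in Python; the hostname and path in the usage comment are placeholders, not anything from this thread.]

```python
# Minimal sketch: read the raw status line and Location header a server
# returns for a single GET, without following any redirect.
from http.client import HTTPConnection

def fetch_status(host, path="/", port=80, timeout=5):
    """Return (status_code, Location header or None) for one GET request."""
    conn = HTTPConnection(host, port, timeout=timeout)
    try:
        conn.request("GET", path)
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

# Example (placeholder hostname):
# status, location = fetch_status("www.example.com", "/somepage.html")
# A 302 plus a Location header pointing at the billing/error page would
# confirm the redirect behavior described in the post above.
```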

EliteWeb

11:18 pm on Mar 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How many links are coming into your site and how high its PR is both play a factor in this, I think. I had a site that I HAD to take offline due to a massive traffic buildup, so everything went to the error page. Google had this error page cached the next day, but when I put the site back online it was promptly put back in the Google index. This is a PR 6 (of 10) site with 510 backlinks counted by Google.

If it is a site with lower ranking it may take a month to gain your ranking back because the bots may not come back to it as often.

Powdork

11:25 pm on Mar 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>If it is a site with lower ranking it may take a month to gain your ranking back because the bots may not come back to it as often.

Even if the bots don't come back you'll be fine. After any fresh results that do show up expire (2 days), your cache will revert to the one resulting from the Feb deep crawl.

rfgdxm1

11:27 pm on Mar 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



From past experience, this can cause doom if it happens with the deep crawler. Makes sense: if the deep crawler starts at your home page and gets an error page, it thinks that is the whole site. It has no links to follow to the rest of the site's content. However, freshbot's only purpose is to add new pages it finds and to update pages it finds have changed. Thus at worst freshbot should just think these few pages were gone, not that the whole site is.

engine

11:37 pm on Mar 17, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



If the sites were down for a couple of hours then that's not very long. If you've still got all your incoming links in place things should settle again in due course. To suggest the site is doomed is taking things too far.

Fear not, just wait for the next visit which might be tomorrow.

rfgdxm1

11:46 pm on Mar 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The problem, engine, is that the site wasn't truly down; instead the server was sending its own content to any SE bot. If this had happened when the deep crawler came around to start crawling my site, it would have thought that this one error page was the whole site. It wouldn't have had the links to even try to get the inner content. My expectation is that nothing more will happen besides possibly a few pages falling out of Google for a few days. And fortunately, these weren't critically important pages like the sites' home pages.

BigDave

11:56 pm on Mar 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



At least it gave a 302 instead of a 301. I have no idea how Googlebot deals with 302s, but at least it should catch the temporary part.

Sorry you have to be the guinea pig on this one, but could you keep us informed as to what happens? At least that way we can get something useful out of it.

Powdork

11:56 pm on Mar 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>If this had happened at the time the deep crawler came around to start crawling my site, it'd have thought that this one error page was all the site was. It wouldn't have the links to the inner content to even try and get that.

rfgdxm1,
This sounds like an excellent reason to have external links to deep pages and a sound navigation structure on all pages.

GoogleGuy

12:52 am on Mar 18, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I don't think you're doomed. Freshbot has a fast turnaround, so if it finds good content it should show those pages quickly.

rfgdxm1

1:17 am on Mar 18, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My guess is I'll do OK, GoogleGuy. :) I just checked, and it so happens the few pages freshbot tried to grab while the site was hosed are rather unimportant ones that don't tend to get much search engine traffic. If it had been my sites' home pages, that would be more critical. Also, my sites definitely have good freshbot turnaround. In fact, I once got a bit paranoid that I seemed to be getting too many freshbot hits and wondered if Google somehow didn't trust me. However, from what I have read, freshbot coming around a lot isn't unusual for sites with decent PR (mine are PR 6 and 5) and ODP links. Thus my guess is just that these few unimportant pages will be unfindable in Google for a few days at most. Nothing to panic about.

growing

2:00 am on Mar 18, 2003 (gmt 0)

10+ Year Member



So what is the best way to handle server outages (planned or unplanned) from an SE perspective?

What low-bandwidth content or response should you display whilst your server is getting an overhaul so as to minimise the damage to your listings in Google or other major engines?

I suspect that it is best to just pull the plug and return absolutely no response. I presume that Googlebot will come back later and try again. But if you deliver a response (any response), then Googlebot will think that is your entire content.

So what is recommended?

Steve

rfgdxm1

3:15 am on Mar 18, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>I suspect that it is best to just pull the plug and return absolutely no response.

Yes, this is the way to do it. Given that servers go down all the time, my guess is that Google retries later if it gets no response.
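[Editor's note: HTTP also defines an explicit "temporarily down" signal that avoids the indexed-error-page problem discussed in this thread: 503 Service Unavailable, optionally with a Retry-After header (RFC 2616). Whether Google honored it at the time isn't stated here, but it is the status designed for planned maintenance. A minimal Python sketch follows; the port and message wording are illustrative.]

```python
# Minimal sketch of a maintenance responder: every request gets a
# 503 Service Unavailable plus a Retry-After hint, so a crawler sees an
# explicit "temporarily down" signal rather than indexable error content
# served with a 200 or a 302. Port and message text are illustrative.
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)
        self.send_header("Retry-After", "3600")  # suggest retrying in an hour
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Down for maintenance; please try again later.\n")

    def log_message(self, fmt, *args):
        pass  # keep the sketch quiet

# Usage (placeholder port):
# HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```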

Jakpot

11:10 am on Mar 18, 2003 (gmt 0)

10+ Year Member



My server has been up and down - mostly down - for the last eight days, and I've held my SERP position. Just lucky, I suppose, and I hold my breath each time I check Google.