
# 1 "Bandwidth Limit Exceeded"


vagelis_t

3:07 pm on May 25, 2003 (gmt 0)

10+ Year Member



# 1 of about 125,000
"509 Bandwidth Limit Exceeded
Bandwidth Limit Exceeded. The server is temporarily unable to service
your request due to the site owner reaching his/her bandwidth limit"

Any idea why google would be doing this?

Yidaki

3:13 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Any idea why google would be doing this?

1) Yes: they just crawled the page, indexed it, and return it for searches on its "real" content. The snippet says it all: the site had some bandwidth problems at the moment googlebot came visiting.

2) welcome to WebmasterWorld [webmasterworld.com].

verbum

3:19 pm on May 25, 2003 (gmt 0)

10+ Year Member



Hi. Content was there when googlebot came a-crawling (as a look at the cache shows). Any bandwidth problem happened after the bot's visit.

Critter

3:24 pm on May 25, 2003 (gmt 0)

10+ Year Member



I just looked at the site... and it was OK.

However, the cache had the error page.

Considering that the page returned a status of 509 (which is in error territory), I feel this is further proof that Google is, at the moment, somewhat broken.

Peter

Jane_Doe

3:25 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It may have kept its position because of inbound links with relevant anchor text. Try typing in:

best search engine

into Google and check the cache.

AthlonInside

4:03 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The content was crawled by the freshbot. However, the search results are based on the page as fetched during the deep crawl.

annej

4:03 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I once had a site down for a month. It was a mix-up with an automatic payment not going through to the host company. I didn't get their warning emails, as we were on an extended RV trip and were unable to get online. As a result they shut my site down. When we returned my site was gone, but it was still listed on Google and still in its usual good position in the serps. No info on the site, but it was there. The site has a great many links and I think that's what saved it.

[edited by: annej at 4:30 pm (utc) on May 25, 2003]

Yidaki

4:17 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Considering that the page returned a status of 509 (which is in error territory), I feel this is further proof that Google is, at the moment, somewhat broken.

The webmaster of the 509 site probably feels very thankful to google, and probably feels that google is doing a better job by keeping temporarily unavailable / error-returning pages instead of dropping them. I don't think he'd say that google is broken.

Critter

4:56 pm on May 25, 2003 (gmt 0)

10+ Year Member



Yidaki, you're looking at it from a webmaster point of view.

From the search engine user's point of view, having an error response page as the #1 returned result is *broken*.

Peter

kevinpate

5:02 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Another way to look at this is that returning a 509 error high in the serps is algo-speak for:

Hey, Joe Surfer: site domain.com, notwithstanding a recent 509 error (which is often merely a temporary bandwidth issue, so the site might even work by now), is actually a relevant site for your search terms based on our algo. It's up to you to decide whether to check it out.

Critter

5:10 pm on May 25, 2003 (gmt 0)

10+ Year Member



That would perhaps make sense if response code 509 always meant "bandwidth allocation exceeded"...

Unfortunately, it does not. 509 is a custom (non-standard) response code. Perhaps the better code to return would be 503, Service Unavailable.

Regardless, all 5xx codes denote a server error, and none of them carry a de jure temporary or permanent connotation.

If you want a good index, then pages returning 5xx errors should not be part of it.

Peter
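Peter's rule of thumb could be sketched in a few lines of Python. This is purely illustrative (the function name and the whole policy are my own, not anything Google has published): keep a page only on a plain success, and drop anything in 5xx territory, standard or not.

```python
# Sketch of the "don't index server errors" policy discussed above.
# Hypothetical helper, not a real crawler's API.
def should_index(status_code):
    """True if a page fetched with this HTTP status is safe to keep in the index."""
    if 500 <= status_code < 600:   # any server error, including custom codes like 509
        return False
    return status_code == 200      # only index plain successes
```

Under this policy a 509 page would be dropped at crawl time rather than served as a #1 result.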

Yidaki

5:18 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>If you want a good index then pages with 5xx errors should not be part of the index.

Agreed, if the server really returned http status 509.

But I didn't hear vagelis_t saying what http status code was returned to googlebot. Could it be that the server was confused and returned an error page along with a 200 status code?!
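The misconfiguration Yidaki describes (an error page delivered with a 200 status, sometimes called a "soft error") could only be caught by looking at the body as well as the status. A rough heuristic, assuming made-up signature phrases of my own choosing:

```python
# Hypothetical soft-error detector: flags responses that claim success
# but whose body reads like a server error page. The signature phrases
# below are illustrative guesses, not any real crawler's list.
ERROR_SIGNATURES = (
    "bandwidth limit exceeded",
    "service temporarily unavailable",
)

def looks_like_soft_error(status_code, body_text):
    """True if the status says 200 but the body looks like an error page."""
    if status_code != 200:
        return False
    text = body_text.lower()
    return any(sig in text for sig in ERROR_SIGNATURES)
```

A genuine 509 would not trip this check, since its status already tells the whole story.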

Critter

5:24 pm on May 25, 2003 (gmt 0)

10+ Year Member



Hmmm... not likely, looking at the cache.

That's an internally generated error document from apache, without modification.

The header would have been "HTTP/1.1 509 Bandwidth Exceeded" or some such.

Peter
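The status line Peter quotes travels in the response headers, not in the page body, which is why a cached copy of the body alone can't settle what code was sent. Just to illustrate the shape of such a line, a trivial parser (my own example, not part of any crawler):

```python
# Split a raw HTTP status line like "HTTP/1.1 509 Bandwidth Exceeded"
# into its three parts: protocol version, numeric code, reason phrase.
def parse_status_line(line):
    """Return (version, code, reason) from an HTTP/1.x status line."""
    version, code, reason = line.split(" ", 2)
    return version, int(code), reason
```

The numeric code is the only part user agents are required to interpret; the reason phrase ("Bandwidth Exceeded") is free text the server can set to anything.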

workshy

5:29 pm on May 25, 2003 (gmt 0)

10+ Year Member



>If you want a good index then pages with 5xx errors should not be part of the index.

Just to diversify a tad... how about the webmaster of a #4-ranking site out of 2,990,000 that is a blank white page with hidden text? What would he be thinking about Google?
This 'site' has barely (if at all) changed position since I first spotted it about two months ago.

OK, it used to be a humor site and the hidden text is obviously not done for profit, more of a joke I suppose, but then maybe it's Google that's the joke at the moment!

Sie.

Yidaki

5:35 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>That's an internally-generated error document from apache without modification.

Nothing that can't also be returned in response to a status 200.

>It would be the "HTTP/1.1 509 Bandwidth Exceeded" or some such header.

Yes. But you wouldn't see it in the cache.

badger_uk

5:36 pm on May 25, 2003 (gmt 0)

10+ Year Member



I just checked my site and noticed I've had this message showing for the last two days, and to make matters worse, all the major spiders have visited today. Looks like I've got a few nervous days and nights ahead of me.

badgeruk
P.S. No, it's not my site that started this thread.

Yidaki

5:37 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>but then maybe it's Google that's the joke at the moment!

Yah, could be. Who knows? I wonder why nobody laughs about it but complains instead ... ;)

kevinpate

5:38 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A fair point, Critter.

I've personally only seen the 509 code when it was bandwidth related (all too personally, on two occasions this year, when we hit our bandwidth ceiling within 36 hours of month's end).

I really do have to start paying more attention to where we're at on that issue, because both times managed to coincide with a late-month crawl-by from google.

Critter

5:46 pm on May 25, 2003 (gmt 0)

10+ Year Member



Yidaki:

Someone purposely creating an html page that looks exactly like apache's error page and placing it in the directory to be fetched with a 200 response is, ahem, highly unlikely.

:)

Peter

Yidaki

5:51 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>placing it in the directory to be fetched with a 200 response is, ahem, highly unlikely.

Erm, Peter... well, no. Since I run my own little niche search engine, I know that there's pretty much nothing that is too unlikely to happen with server setups, seriously. ;)

However, my guess is that the page did in fact return a 509 status. So there are two POVs:

- webmasters should be happy to stay in the google index until the error is fixed
- some joes and janes probably ignore it, some probably complain about it

Kackle

5:59 pm on May 25, 2003 (gmt 0)



It's not a feature, it's an algo that gives too much benefit of the doubt to anchor text in backlinks, as opposed to the page itself.

Try these searches (no quotes used):

<keyword keyword keyword> (This prank googlebomb page has been number 1 since March; it has survived two updates so far.)

<keyword keyword> (Look at the number 2 link out of 467,000; two weeks ago it was number 1. The directory has been empty since before November, according to the Wayback machine.)

[edited by: ciml at 4:30 pm (utc) on May 26, 2003]
[edit reason] Please avoid specifics. [/edit]

chiyo

6:06 pm on May 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This message has appeared regularly in Google SERPs for months, maybe years. It just reflects the page contents at the time it was crawled (freshbot or deep). I see these all the time, and usually within a few days freshbot recrawls and the site is displayed with its normal contents again.

To say google is broken because of this means you would have to say google has been broken for many months, maybe years. Google is not a magician. It cannot index sites that nobody can see!