
Google SEO News and Discussion Forum

    
I want Google to crawl my website only at night
Errioxa
Msg#: 4068686 posted 3:12 am on Jan 27, 2010 (gmt 0)

In Google's webmaster quiz, there is this question:

29. Your server has limited bandwidth resources and you would like Google only to crawl at night. What should you do?

* a) Send an email to info@google.com
* b) Add "Allow: 11pm-7am" to your robots.txt file
* c) Have your server respond with HTTP result code 503 to all users when it's bogged down
* d) Dynamically change your robots.txt to disallow crawling during the day

Has anyone used option d?
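For context, option d would mean serving a different robots.txt depending on the time of day. A minimal sketch of that idea, assuming a Python standard-library server and the quiz's 11pm-7am (UTC) window; the hours and port are illustrative assumptions, not anything Google documents:

from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer

# Night (23:00-07:00 UTC): allow crawling. Day: disallow everything.
ALLOW_ALL = "User-agent: *\nDisallow:\n"
DISALLOW_ALL = "User-agent: *\nDisallow: /\n"

class RobotsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/robots.txt":
            self.send_error(404)
            return
        hour = datetime.now(timezone.utc).hour
        body = (ALLOW_ALL if hour >= 23 or hour < 7 else DISALLOW_ALL).encode("ascii")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), RobotsHandler).serve_forever()

Note that Google caches robots.txt (reportedly for up to a day), so a time-of-day switch like this may not take effect on the schedule you intend.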

 

TheMadScientist
Msg#: 4068686 posted 5:42 am on Jan 27, 2010 (gmt 0)

Has anyone used option d?

Not me.

29. Your server has limited bandwidth resources and you would like Google only to crawl at night. What should you do?

* e) None of the above: Change hosts.

Errioxa
Msg#: 4068686 posted 9:03 am on Jan 27, 2010 (gmt 0)

The server is mine. There are 3,000-4,000 simultaneous users, and I use 3 front-end servers.

loudspeaker
Msg#: 4068686 posted 3:08 pm on Jan 27, 2010 (gmt 0)

Not directly on your question, but have you tried looking at the options within Webmaster Tools? I think you can control crawling to *some* extent (perhaps not down to specific hours, though).

pageoneresults
Msg#: 4068686 posted 3:24 pm on Jan 27, 2010 (gmt 0)

* b) Add "Allow: 11pm-7am" to your robots.txt file?

<crawl_rate from=08:00UTC to=17:00UTC>medium</crawl_rate>

Google Webmaster Tools Patent on Crawl Rates
[seobythesea.com...]

Has anyone used option d?

I do believe that would be the Kiss of Death. ;)

To me, none of the answers appear correct. If I had to choose, it would be b) based on the information available.

* c) Have your server respond with HTTP result code 503 to all users when it's bogged down.

Isn't that the default behavior? I think it's a trick question. :)

TheMadScientist
Msg#: 4068686 posted 4:30 pm on Jan 27, 2010 (gmt 0)

The server is mine. There are 3,000-4,000 simultaneous users, and I use 3 front-end servers.

* f) Find a bigger pipe.

tedster
Msg#: 4068686 posted 4:39 pm on Jan 27, 2010 (gmt 0)

Note - this question is part of a Webmaster Quiz that Google published. They did not publish the answers, as far as I can see.

This quiz provides guidelines and is not an exhaustive list of issues/causes webmasters may encounter. Discussion of specific cases is encouraged on the Webmaster Help Forum.

Keep in mind your username may be posted on the Google Webmaster Central Blog if you are a top scorer.

https://spreadsheets.google.com/viewform?hl=en&formkey=dFlIRlpTY3B5T2xWOExiSmlfVTl1dFE6MA


jdMorgan
Msg#: 4068686 posted 6:06 pm on Jan 27, 2010 (gmt 0)

"* c) Have your server respond with HTTP result code 503 to all users when it's bogged down"
is technically correct.

10.5.4 503 Service Unavailable

The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay. If known, the length of the delay MAY be indicated in a Retry-After header. If no Retry-After is given, the client SHOULD handle the response as it would for a 500 response.

RFC 2616: Hypertext Transfer Protocol -- HTTP/1.1 [w3.org] (section 10.5.4)

See that "Retry-After" header they mention? There's the direct answer to the question posed in the subject line of this thread. However, I can't comment on whether this approach is advisable, having never tried it.

Jim
