Google SEO News and Discussion Forum

Should I block Googlebot from my site?
Googlebot hits my site thousands of times a night and I get little traffic.
lost in space

Msg#: 29167 posted 9:23 pm on Apr 22, 2005 (gmt 0)

I have a 1½-year-old site that does very well in MSN, but I get very little traffic from Google. Each night around 9 pm Google stops by to spider thousands of pages, bringing my server to a crawl and competing with MSNbot plus my customers. The server overheats, the site loads very slowly, and my click-to-sales ratio bottoms out because the pages load way too slowly with all the different bot traffic.

I am honored that Google is so excited to index my site and quickly spider my pages, but Google doesn't rank the pages well and the PR on the site dropped to PR1. Could this be a sandbox issue? If Google likes the site enough to spider every new page, sentence, or word added daily, will time and good SEO eventually increase my rank?

The other option is to ban Googlebot so MSNbot and my customers have better access to my site. What should I do? Will time and good SEO bring traffic from Google?

Dawna

 

lost in space

Msg#: 29167 posted 3:47 pm on May 2, 2005 (gmt 0)

Anybody? Some advice would really be appreciated.

Dawna

twebdonny

Msg#: 29167 posted 3:48 pm on May 2, 2005 (gmt 0)

Send it on over to my site,
I could use a complete spidering.

hahaha
j/k

Nikke

Msg#: 29167 posted 3:51 pm on May 2, 2005 (gmt 0)

I have had an issue like this with Ask.com: their spider fetched 3,000 pages a day while bringing me 10 to 15 visitors. I banned them.

I would never do that to Google, however. Couldn't you just lock Google out of certain pages using your robots.txt file?

awebguy

Msg#: 29167 posted 4:03 pm on May 2, 2005 (gmt 0)

Contact the G team at [google.com...] with "Googlebot is overloading my servers".

SEOPTI

Msg#: 29167 posted 4:06 pm on May 2, 2005 (gmt 0)

Block it all the way.

jino

Msg#: 29167 posted 4:09 pm on May 2, 2005 (gmt 0)

lost_in_space

Put this in your robots.txt file:

User-agent: Googlebot
Crawl-delay: 10

This asks Googlebot to wait 10 seconds between requests. You can change the delay to a number more comfortable for your server load. I find that Googlebot is very obedient to this directive, unlike the Yahoo, Jeeves, or MSN bots.

ncgimaker

Msg#: 29167 posted 7:47 pm on May 2, 2005 (gmt 0)

Can I check - does Googlebot process nonstandard wildcards in the robots.txt file?

Such as:

Disallow: /images/*.jpg
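
For example, scoped to Googlebot only, since the * wildcard is a nonstandard extension that other crawlers may not understand:

User-agent: Googlebot
Disallow: /images/*.jpg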

victor

Msg#: 29167 posted 8:26 pm on May 2, 2005 (gmt 0)

I don't think Googlebot honors Crawl-delay. It's useful for MSNbot, though.

I had a similar problem with Googlebot rampaging through a site. One "behave or be banned" email later, I had a (seemingly) non-canned response and an apology. Googlebot has been well-behaved ever since.
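
If you do use Crawl-delay, I'd scope it to the bots that are known to respect it. A minimal sketch (the 10 is just an example value, in seconds):

User-agent: msnbot
Crawl-delay: 10

User-agent: Slurp
Crawl-delay: 10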

lost in space

Msg#: 29167 posted 12:21 am on May 3, 2005 (gmt 0)

Thank you for the advice; I will try e-mailing Google.

I would hate to block Google from any of my content. I still wonder why Google likes the site enough to spider it daily, but not enough to rank it well.

Is it a good sign that Google is spidering my site daily? Should I try to optimize the site for Google, and if so, could it cost me rank at MSN?

Dawna

BillyS

Msg#: 29167 posted 12:27 am on May 3, 2005 (gmt 0)

Get a bigger box...

trimmer80

Msg#: 29167 posted 4:42 am on May 3, 2005 (gmt 0)

Look at the pages the bot is hitting. Work out which pages are useful to the bot and which aren't, and only ban the bot from pages that are not going to bring in traffic.
If they are all relevant, then you need to get a better box, or look at your site to see why the bot is overloading it.

I don't think banning is the best solution. If Google starts sending lots of visitors, your site will start to run slow anyway.

PatrickDeese

Msg#: 29167 posted 5:17 am on May 3, 2005 (gmt 0)

>> Google doesn't rank the pages well and the PR on the site dropped to PR1. Could this be a sandbox issue?

If your home page is PR1 and you aren't ranking well in Google, the simple answer is that you need a lot more quality inbound links to your site.

It also sounds like you need to upgrade your server.

alexo

Msg#: 29167 posted 7:37 pm on May 19, 2005 (gmt 0)

Hello,

Over the last 3-4 months I added roughly 20-30 MB of content to my database (PHP/MySQL site), and after that my problems started. [The database is now around 60 MB.]
Every time Google visited my site, Apache was overloaded and as a result the site went down. At that time I was on VPS hosting (256 MB RAM).

At first I didn't understand what the problem was (I didn't see any connection between Google's visits and the server overloading).

This continued for three months, after which I decided to move to another host and upgrade my hosting account to 512 MB (that was 30-40 days ago).
By the way, the server move wasn't clean, and as a result the site was down for 2-3 days.

As a result of all this:
1. I lost my Google traffic: where Google used to have 15-25k pages from my site indexed, there are now 500-700 pages.
2. None of my keywords rank anymore.
3. Where 3-5 months ago Google transferred 2-3 GB during its visits, now it's 300-340 MB at most.

During this last month Google visited my site 5-7 times, and 4 of those visits overloaded the server.

I don't know what to do:

1. Write to Google and tell them their visits are overloading the site? Now, when they aren't even doing deep crawls?
The "overloading" form states that sending it would lead to Google coming to my site less.
That's not what I want at all! I think that after submitting it Google would visit even less, which is not in my interest.

2. Upgrade my server, or move to a dedicated server with at least 1 GB of RAM?
Hmm... during these last 40 days my business (the site's income) has been at its worst, so it's a little frightening to pay for a dedicated server without being sure we can earn enough.

The problem is that when my troubles started, I first thought it might be a Google ban or penalty (for duplicate content).
Even now I'm not sure that server overloading is the only problem.

Please help - what's the best solution in this case?

Thank you.

alexo

Msg#: 29167 posted 12:07 am on May 20, 2005 (gmt 0)

Hello once again.

Has anybody used this code in robots.txt:

User-agent: *
Disallow: /images/

User-agent: Googlebot
Crawl-delay: 10

and tried to validate it via

1. [searchengineworld.com...]
2. [validator.czweb.org...]

I get:

We're sorry, this robots.txt does NOT validate.
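
Maybe the validator is objecting to the Crawl-delay line? As far as I know, Crawl-delay is a nonstandard extension, so strict validators may reject it even though the bots that support it still read it. A standards-only version of the same file would be just:

User-agent: *
Disallow: /images/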

ALbino

Msg#: 29167 posted 1:45 am on May 20, 2005 (gmt 0)

>> Get a bigger box...

What he said. If your site can't handle a few thousand hits, then I don't know how you're making any money in the first place. You need either a new server, a new host, or both. Good luck :)

AL.

Reid

Msg#: 29167 posted 2:25 am on May 20, 2005 (gmt 0)

Did you look to see if it is making multiple requests for the same page?
Is caching on?
If you use session IDs, they could overload the server:
the bot gets a new ID, asks for a new page, gets a new ID, asks for a new page - an endless loop.

Check site:yoursite in Google - how many pages are cached, and is there an end in sight?
Lots of variables to consider here.
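
A quick way to check for the repeated-request pattern - this assumes Apache's combined log format, and you'll need to adjust the log path for your server - is to count Googlebot requests per URL:

grep Googlebot /var/log/apache/access.log | awk '{print $7}' | sort | uniq -c | sort -rn | head

If the same page, or the same page with an ever-changing session ID stuck on it, dominates the list, that's your loop.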

CodeJockey

Msg#: 29167 posted 4:12 am on May 20, 2005 (gmt 0)

>> [The database is now around 60 MB.]

The raw database size doesn't mean a lot to MySQL, but you may see large performance improvements by indexing key fields and frequently accessed fields.

alexo

Msg#: 29167 posted 5:37 am on May 20, 2005 (gmt 0)

>> but you may see large performance improvements by indexing key fields/frequently accessed fields

Sorry for the stupid question, but how do I do that?

Thank you.

CodeJockey

Msg#: 29167 posted 6:02 am on May 20, 2005 (gmt 0)

If you're using phpMyAdmin, it should be under the 'Properties' option for your table. The exact option you want is 'Index'. If you're using some other admin tool, it should be clearly marked.

You can also use MySQL syntax directly; look in your manual under 'CREATE INDEX'. If you don't have a manual, it's available online at their site.

Only index fields that are keys or important fields that you query on. Too many indexes seem to be worse than no indexes at all, so think about what you're doing before you do it.
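
A minimal sketch of the syntax (the table and column names here are made-up placeholders - substitute your own):

-- index a column your WHERE clauses filter on frequently
CREATE INDEX idx_category ON articles (category_id);

-- then confirm the query actually uses it
EXPLAIN SELECT title FROM articles WHERE category_id = 42;

EXPLAIN will tell you whether MySQL is using the new index or still scanning the whole table.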
