
Forum Moderators: mack


Are you taking the MSNbot Gamble?

How much bandwidth are you allowing?

5:22 pm on May 24, 2004 (gmt 0)

Junior Member

joined:Jan 20, 2003
votes: 0

The MSNbot has quite a voracious appetite for spidering websites. Some webmasters love it and try to feed it as much as possible. Other webmasters don't see any reason to use up bandwidth for a search engine that isn't live yet.

Personally, bandwidth is cheap for me and I'm willing to feed MSNbot as long as it doesn't impact performance for users.

So how much bandwidth are you allowing?

5:20 pm on June 11, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 29, 2004
votes: 0

Welcome msndude, make yourself at home and give Linux a try, baby!
6:32 pm on June 11, 2004 (gmt 0)

Preferred Member

joined:Apr 22, 2004
votes: 0

Seriously... bandwidth is cheap, so if you're on a host that's stingy with it, that's your fault.

The bot is a good thing.

Just as I HATE Microsoft's monopoly on the desktop, I hate Google's in the search world. I sincerely hope M$ makes inroads in this area and gets knocked on its butt with its OS.

8:18 pm on June 11, 2004 (gmt 0)

Preferred Member

10+ Year Member

joined:Jan 10, 2004
votes: 0

Welcome MSNDude
Well shoot, I thought someone at MS just liked me when I saw their new bot pounding my site — second only to G this month, with Yahoo nowhere to be seen. I just hope their new SE starts feeding results to MSN soon so all this bandwidth isn't wasted (hint, hint).
A timeline maybe?
10:09 pm on June 11, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:May 16, 2003
votes: 0

There's some case for a meta tag specific to the MSNBot, like the revisit-after tag, except the other way around: Do not visit before. I don't mind Google or Yahoo spidering daily, but I don't want to see MSNBot more than once every three or four days, at least until it's serving a live search engine. Every site varies as to what it can handle, and this would be an excellent way for webmasters to set limits.
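A coarse version of this throttle already exists: MSNbot honors the non-standard Crawl-delay directive in robots.txt, which limits how often the bot requests pages. The delay value below is an illustrative choice, not a recommendation:

```
User-agent: msnbot
Crawl-delay: 120
```

This is per-crawler, so Googlebot and Yahoo's Slurp can be left unthrottled while MSNbot is slowed down — close to the "do not visit before" behavior suggested above, though expressed as a request interval rather than a revisit date.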

Around 40% of my traffic is bots, and that hasn't varied a great deal with the growth of overall traffic. Most of that is Googlebot. Add another major spider to the equation, and we're talking an extra 30% or so in bandwidth with no extra visitors to support the cost. Now I'm doing okay, but if those kinds of percentages are at all common, there are going to be a lot of people taking issue with it. So how about those meta tags, MSNDude?
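To make the percentages above concrete, here is the back-of-the-envelope arithmetic; the monthly transfer figure is hypothetical, and only the 40% and 30% ratios come from the post:

```python
# Rough bandwidth estimate using the poster's percentages.
monthly_gb = 100                   # hypothetical monthly transfer in GB
bot_gb = monthly_gb * 40 // 100    # ~40% of traffic already goes to bots
extra_gb = monthly_gb * 30 // 100  # a second Googlebot-scale spider adds ~30%
print(bot_gb, bot_gb + extra_gb)   # 40 70
```

On these numbers, bot transfer nearly doubles with no extra human visitors to pay for it, which is the cost the poster is flagging.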

11:23 pm on June 11, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 12, 2003
votes: 0

You'll recall the claim was made that there was this upper limit:
4,294,967,296 (2^32) URLs.

MS will likely not have this problem, as they are probably built on top of the 64-bit version of Windows Server, and SQL Server, if that is the data engine, can handle 64-bit keys without problem.

At least MSNbot will follow a new internal link when it sees one, while Google studiously ignores new internal links and concentrates on respidering things like contacts.htm several times a day.
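The ceiling the earlier claim refers to is just the number of distinct values a 32-bit document ID can take; widening the key to 64 bits removes it for any practical crawl size. A quick check of the arithmetic:

```python
# Distinct URL IDs addressable with 32-bit vs 64-bit keys.
limit_32 = 2 ** 32
limit_64 = 2 ** 64
print(limit_32)   # 4294967296
print(limit_64)   # 18446744073709551616
```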

no room in the "new links" database?

1:16 am on June 12, 2004 (gmt 0)

Preferred Member

joined:Feb 8, 2002
votes: 0

Welcome MSNDude — a relief to see you here. And of course, we love the fact that you are crawling all of our sites deeply. Wishing you the best in providing healthy competition to Google.

This 56-message thread spans 6 pages (continued).