Forum Moderators: mack
joined:Jan 20, 2003
Personally, bandwidth is cheap for me and I'm willing to feed MSNbot as long as it doesn't impact performance for users.
So how much bandwidth are you allowing?
joined:Apr 22, 2004
The bot is a good thing.
Just like I HATE Microsoft's monopoly on the desktop, I hate Google's in the search world. I sincerely hope M$ makes inroads in this area and gets knocked on their butt with their OS.
Around 40% of my traffic is bots, and that hasn't varied a great deal with the growth of overall traffic. Most of that is Googlebot. Add another major spider to the equation, and we're talking an extra 30% or so in bandwidth with no extra visitors to support the cost. Now I'm doing okay, but if those kinds of percentages are at all common, there are going to be a lot of people taking issue with it. So how about those meta tags, msndude?
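The arithmetic behind that ~30% figure can be sketched out. The numbers below are illustrative assumptions, not measurements: they just show how a second major spider crawling nearly as hard as Googlebot would inflate total bandwidth.

```python
# Hedged sketch of the bandwidth math above (illustrative numbers only).
total = 100.0           # current total bandwidth, arbitrary units
bot_share = 0.40        # ~40% of traffic is bots, mostly Googlebot
new_bot_appetite = 0.75 # assume the new spider crawls ~75% as hard

extra = total * bot_share * new_bot_appetite
print(f"extra bandwidth: {extra / total:.0%}")  # roughly the ~30% quoted
```

That extra load serves no human visitors, which is the cost the poster is objecting to.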
You'll recall the claim was made that there was this upper limit:
4,294,967,296 (2^32) URLs.
MS will likely not have this problem, as they are probably built on top of the 64-bit version of Windows Server, and SQL Server, if that is the data engine, can handle 64 bits without problems.
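The claimed limit and the 64-bit headroom are easy to check directly. This is just arithmetic on ID key space, not a statement about how either index is actually built:

```python
# 32-bit vs 64-bit URL-ID address space.
limit_32 = 2 ** 32  # the claimed upper limit on URLs
limit_64 = 2 ** 64  # what a 64-bit key space would allow

print(limit_32)  # 4294967296
print(limit_64)  # 18446744073709551616
```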
At least msnbot will follow a new internal link when it sees one, while Google studiously ignores new internal links while concentrating on respidering things like contacts.htm several times a day.
No room in the "new links" database?
joined:Feb 8, 2002