Forum Library, Charter, Moderators: Ocean10000 & incrediBILL

Search Engine Spider and User Agent Identification Forum

This 38 message thread spans 2 pages; this is page 2.
Blocking Monitoring Services
Looking for a list of IPs.

 2:36 pm on Apr 9, 2003 (gmt 0)

Does anyone have a comprehensive list of website monitoring service IP addresses?

I've got a few collected from the forum here, but I thought maybe someone had a full list of these rogue, unwanted bots.

At a minimum, do you know these 4:

Aside from blocking them, what kind of "fun" can be had via cloaking with these bots?


Website monitoring services take users away. Why would a user visit the site if they can "monitor" it from elsewhere? That defeats mission-critical branding, it defeats promotion efforts, it defeats advertising, and that defeats your site's goals.

When a user visits your site and does not find updated content, they may find content or advertising they have not been exposed to. It's like channel surfing: it is how visitors are exposed to content they may not have seen before.

On the technical side, if people don't visit a site, it also means you are not counted by traffic counters such as the Google, Yahoo, and Alexa toolbars. That in turn may hurt your search engine rankings.

By not actively blocking these monitors, you are allowing and endorsing the poaching of your visitors. Website monitors are worse than Gator, to me. At least with Gator, users still have to visit your site.

Aside from the approved bots and partnerships with search engines, we do not allow unauthorized programmed querying of the site.


Mikkel Svendsen

 11:09 pm on Apr 9, 2003 (gmt 0)

Tapolyai, I understand what you are saying, but I do not agree that the majority of users of KaZaA (I'm not so sure about Gator ...) and services like the new Lycos stuff are unaware of what they do. Just take a look at Download.com and check the millions of weekly downloads KaZaA alone has. Do you think none of them know what they are doing when they download a P2P program and start swapping files? :)

You may be right that some people get tricked into using some of these services (like some of the Gator examples), but with KaZaA, Lycos personalization, etc., no; those are services and software people choose because they want to, and because it helps them make the web more useful to them.

I don't think we can stop this "move" - not even if we all agree to do so here at WebmasterWorld - just like the music business cannot stop P2P or MP3 files altogether.

I truly understand the concerns some might have about overuse of bandwidth, not being able to show ads, etc., but let's deal with it instead of doing what the music business has been doing so far... I don't think they have handled it so well ;)

If users really feel a strong need for "off-browser" monitoring of the web, and if "personalized agents" are here to stay, then why not try to get the best out of it? I am pretty sure we can find a way to adapt to this and make money ... be creative. I thought we were the ones to be so :)


 11:57 pm on Apr 9, 2003 (gmt 0)

That's fine, Mikkel. What you do with your own site is your business - that's partly why we love our own sites so much.
Clipping services are as bad as they come. Agreed, the whole trend towards "it's OK to use other people's content" can't be stopped, but you can stop some of the jacking screen scrapers out there.

Mikkel Svendsen

 12:10 am on Apr 10, 2003 (gmt 0)

Nobody said it had to be free ;)

I don't believe in "the free web". It takes money to make good websites - even community sites like WebmasterWorld. I am not saying I want everything to be free - not at all. Also, I do understand it will take a lot of changes to profit from the clipping services, but I am very sure it will come. There will be people who understand how to develop websites that "fit" this format and make money from it.

Wasn't that what we all did when we started SEO'ing back in the mid-90s? :)


 12:15 am on Apr 10, 2003 (gmt 0)

Just noticed another one this morning,

hyperspin.com/FREE_SERVER_MONITORING_SERVICE/ - it fetched robots.txt, and I certainly didn't subscribe. Seems to be the same sort of thing as InternetSeer.


 12:16 am on Apr 10, 2003 (gmt 0)

Some IPs that were not mentioned by fiestagirl





 7:17 am on Apr 12, 2003 (gmt 0)

>Interesting fears that do not really match up with my experiences.

Most people don't even know it happened. Imagine if the gas pump shorted you three spoonfuls on each tank for your lifetime. You don't "experience" the loss, but it does add up over time. It reminds me of the cashier who took one penny out of every transaction for 40 years. No one could understand how she did it when she retired a millionaire.

Web clippings and monitor services take a little bit of your site with them every time they connect. This is especially true if your site already has an update newsletter.


 2:02 pm on Apr 15, 2003 (gmt 0)

The problem with site monitoring services is that they can also be used by competitors to monitor your activities. I say if a competitor wants to monitor your site for changes (read: spying), then make them come to your site via an ISP, because hopefully you have already banned their IPs.

I think it was somewhere here I read that some infringement services gather data first and then try to sell it. I believe this to be true because my site mentions no fewer than 10 of the Fortune 25 companies, most in a not-so-flattering manner, and although I have banned several infringement services, I have yet to hear from one of those Fortune 25 companies. Of course, I'm a really small fish. My site has been up for 3 years.

You can add
deny from 12.40.85.
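
For context, a trailing-dot `deny from` line like the one above matches the whole class C range. A minimal .htaccess sketch of how such entries are usually grouped (the address shown is just the example from this post, not a vetted list - substitute ranges you have actually identified in your own logs):

```apache
# Sketch: block monitoring-service IP ranges in .htaccess.
# The address below is illustrative only.
<Limit GET HEAD POST>
    Order Allow,Deny
    Allow from all
    # Trailing dot matches the entire 12.40.85.0-255 range
    Deny from 12.40.85.
</Limit>
```

With `Order Allow,Deny`, everyone matching `Allow from all` gets in unless they also match a `Deny` line, so you can keep appending `Deny from` entries as new monitors show up.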


 4:54 pm on Apr 18, 2003 (gmt 0)

Here's one to add to the list :

sun10088.linkalarm.com - - [18/Apr/2003:09:50:06 -0700] "GET /robots.txt HTTP/1.0" 200 44 "-" "LinkAlarm/2.4"
sun10088.linkalarm.com - - [18/Apr/2003:09:50:06 -0700] "HEAD / HTTP/1.0" 200 0 "-" "LinkAlarm/2.4"
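
Bots like this announce themselves in the user-agent string, so they can be matched on that instead of (or in addition to) IP. A minimal sketch using mod_setenvif, assuming that module is available; the agent names are the ones mentioned in this thread:

```apache
# Sketch: deny known monitoring bots by User-Agent (assumes mod_setenvif).
SetEnvIfNoCase User-Agent "LinkAlarm"    bad_monitor
SetEnvIfNoCase User-Agent "InternetSeer" bad_monitor
<Limit GET HEAD POST>
    Order Allow,Deny
    Allow from all
    Deny from env=bad_monitor
</Limit>
```

User-agent strings are trivially forged, so this only catches bots that identify themselves honestly; pair it with IP-based bans for the persistent ones.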



All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved