Forum Library, Charter, Moderators: DixonJones & mademetop

Website Analytics - Tracking and Logging Forum

Referral Spam Windows Platform
IIS, Windows

10+ Year Member

Msg#: 3886 posted 12:41 am on Dec 30, 2005 (gmt 0)

Hi, as many others have said, I too have been hit hard with referral spam. My host suspended my account because, they said, my site was so "popular" that it drove the servers crazy and slowed everyone else down. I understand that .htaccess can help on Apache servers, but is there anything that can be done if my host runs on a Windows server? My host says no, and their only suggestion was to pay more money and go to a dedicated server. Any suggestions would be great!
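For context, the Apache .htaccess trick amounts to rejecting any request whose Referer header matches a blocklist, and that same check can be done in application code on any platform, IIS included. A minimal sketch of the logic in Python (the blocked domains here are made-up examples, not a real spam list):

```python
from urllib.parse import urlparse

# Hypothetical blocklist of known referral-spam domains.
BLOCKED_REFERRERS = {"casino-spam.example", "cheap-pills.example"}

def is_spam_referrer(referrer: str) -> bool:
    """Return True if the request's Referer host is on the blocklist."""
    if not referrer:
        return False  # empty referrer: let the request through
    host = urlparse(referrer).hostname or ""
    # match the exact host or any subdomain of a blocked domain
    return any(host == d or host.endswith("." + d) for d in BLOCKED_REFERRERS)

# A request carrying a blocked referrer would be answered with a 403
# instead of the page, so it never counts against your bandwidth stats.
```

The same condition can be expressed in whatever server-side language the Windows host supports; the point is that the filtering doesn't have to live in .htaccess.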




5+ Year Member

Msg#: 3886 posted 3:13 pm on Dec 30, 2005 (gmt 0)

First of all: it's worth running your own dedicated server if you use Windows. I run a few websites on my server; the most important of them gets about 6,000 unique users per day (not counting search engine robots, see below).

For about a year now I have classified every visit to the site as coming from a robot, a machine, or a human user.
To do this I wrote a 'log database' that records every visit with its server variables (IP, proxy, referrer, query string, country of origin, page visited, etc.).
If I see an unusual number of page views from one IP, I look at the pages viewed by that IP. If 1,000 pages were requested in 2 minutes without accepting sessions, it is clear that it was a bot; after running a 'whois' I add the IP to the robot table, and when it comes back to the site it is treated according to the classification I assigned: maybe it can never enter the page again because it's a spammer, maybe it is 'accredited' as a search engine spider, etc.
At the beginning it was a fair amount of work to classify all the spiders and spammers, but now only 2 or 3 new spiders or spammers turn up per week.
If one of them causes too much trouble I block it directly on the server.
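The rate test described above (too many requests from one IP inside a short window) can be sketched as a sliding-window counter; the threshold matches the post's 1,000-pages-in-2-minutes example, and all names here are illustrative, not from the poster's actual system:

```python
from collections import defaultdict, deque

# Illustrative thresholds: flag an IP that requests more than
# MAX_HITS pages inside WINDOW_SECONDS (the post's example is
# 1,000 pages in 2 minutes).
WINDOW_SECONDS = 120
MAX_HITS = 1000

hits = defaultdict(deque)  # ip -> timestamps of its recent requests

def record_hit(ip: str, now: float) -> bool:
    """Log one request; return True if the IP now looks like a bot."""
    q = hits[ip]
    q.append(now)
    # discard timestamps that have fallen out of the window
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    return len(q) > MAX_HITS
```

A flagged IP would then go through the manual whois step and be filed in the robot table as spammer or accredited spider.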
I did all this because my clients pay me to be presented on the page, and neither they nor I want them paying for traffic caused by spammers or search engine robots.
Two or three weeks ago I noticed an increase in requests for pages that do not exist and never existed - probes searching for files like owssvr.dll, or files in a _vti_bin directory, etc.
So I wrote another database to catch the IPs doing this, backed by a provider database containing the providers and their IP ranges.
The first week was hard: nearly every hour I had to classify some of these hostile IPs. But by now the 'most important' providers that do nothing against spam and attacks are already classified, new attacks enter the database automatically, and - as a side effect - these odd attacks have dropped to roughly 10% of the port-80 attacks I saw before.
Maybe they don't love me anymore ...
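The probe detection described above amounts to matching request paths against known attack signatures (owssvr.dll and _vti_bin are the two the post names) and then mapping the offending IP into a provider's address range. A rough sketch, with the provider table as a made-up example rather than a real database:

```python
import ipaddress

# Signatures the post mentions; a real list would be much longer.
ATTACK_PATTERNS = ("owssvr.dll", "/_vti_bin/")

# Hypothetical provider database: network range -> provider name.
PROVIDERS = {
    ipaddress.ip_network("203.0.113.0/24"): "ExampleNet",
}

def looks_like_probe(path: str) -> bool:
    """True if the requested path matches a known attack signature."""
    return any(sig in path for sig in ATTACK_PATTERNS)

def provider_for(ip: str):
    """Return the provider whose range contains the IP, or None."""
    addr = ipaddress.ip_address(ip)
    for net, name in PROVIDERS.items():
        if addr in net:
            return name
    return None
```

Once an IP is attributed to a provider range this way, every later hit from that range can be classified automatically, which is the effect the poster describes.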


Msg#: 3886 posted 8:47 pm on Jan 2, 2006 (gmt 0)

If you're getting anywhere near 250 thousand data requests a month, I would kindly advise you to look into a dedicated server now. If you do not have a request tracker: 250k hits is roughly 20 thousand visitors. In my experience that number is the practical limit for virtual hosts, whatever the hosts' own marketing claims about the traffic they can handle.

If you are nowhere near 250k hits a month, then your site might benefit from some optimizing... I just ran my site through this validator and must have fixed anywhere from 30 to 100 thousand errors in the past 2 days (don't laugh too hard until you're error-free yourself, hehe).
[htmlhelp.com ]
(you can check a box for 'Validate entire site')
Also when things get hairy, I use this as a backup:
[netmechanic.com ]
(scroll down a bit).

By optimizing your code (validating the entire site to an error-free specification), you can considerably reduce server overhead - this usually prolongs my stay with a host and noticeably extends my upgrade deadline.

Hope that helps.

p.s.: a dedicated server can tackle 10-15 million data requests a month or thereabouts, for a highly optimized site (98%+ error-free, W3C-compliant standardized code).

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved