Forum Moderators: phranque


I found my .HTACCESS file!

         

Megaclinium

1:23 am on Mar 31, 2011 (gmt 0)

10+ Year Member



I'm sure most of you will go "woopdeedoo"

I guess I just never noticed it because the . in front hides it from directory listings in telnet.

Anyway, the reason for my post is that I am changing hosting providers. One of the questions I asked is "do you provide hot link protection", which is on my current control panel. I type in the portion of the site to protect from scrapers and the extensions to prevent scraping of (I still allow .PDFs to be linked directly, like .html pages).

Many said that they don't have this but I can use the .htaccess file.
This is what I found it translates to in my .htaccess file:

RewriteCond %{HTTP_REFERER} !^http://mysite.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://mysite.com$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.mysite.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.mysite.com$ [NC]
RewriteRule .*\.(jpg|jpeg|gif|png|bmp)$ 404.shtml [R,NC]

this results in any site that tries to hotlink your files with the extensions you select (.jpg, .bmp, etc.) getting a 302 redirect to the 404 page instead of the file.

Anyone actually visiting your site doesn't get this, since the media requests come from your own pages.

The other thing I asked them is if they have the IP Deny thing.

This was simple: you typed the numbers on the screen, but it translates to something like this in the .htaccess file, so you could certainly do it manually:

(the below is of course a made up range from an internal net so we don't upset any hacker scum on any criminal servers):

deny from 192.168.72.0/24

I have literally hundreds of ranges banned, so it's good to know how to do this.
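If you are maintaining hundreds of bans by hand, adjacent blocks can often be collapsed into a few CIDR ranges. A minimal sketch in Apache 2.2-style syntax (the ranges below are made up, like the one above):

```apacheconf
# Apache 2.2-style access control; made-up internal ranges for illustration
order allow,deny
allow from all
deny from 192.168.72.0/24
deny from 10.20.0.0/16
# Apache 2.4 replaces this with Require directives, e.g.
#   Require not ip 192.168.72.0/24
```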

The other thing I asked for, and they have this, is access to my raw log files so I can run my own programs to separate the data.
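For splitting raw logs yourself, here is a rough sketch in Python. It assumes the common Apache "combined" log format; your host's layout may differ, so check a real line from your logs first:

```python
import re

# Apache "combined" log format:
# host ident user [time] "request" status bytes "referer" "user-agent"
LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of log fields, or None if the line doesn't match."""
    m = LINE.match(line)
    return m.groupdict() if m else None

sample = ('192.168.72.5 - - [31/Mar/2011:01:23:45 +0000] '
          '"GET /pics/photo.jpg HTTP/1.1" 200 5120 '
          '"http://example.com/page.html" "Mozilla/5.0"')
print(parse_line(sample)["status"])  # prints 200
```

From there you can group lines by IP, referer, or user agent however you like.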


Should I edit the .htaccess on my PC and re-FTP it over, or should I use one of Linux's editors right on the server?

tangor

3:59 am on Mar 31, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You can do it either way, but if you ftp, make sure your PC editor does not introduce line end characters your host does not support... ie: use Notepad and ASCII (not Binary).

Second, rather than entering "hundreds" of IP addresses, look at banning ranges of IP addresses or banning by host. See the Search Engine forum for more details:
[webmasterworld.com...]

tangor

4:04 am on Mar 31, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Whoops... just noted that Search Engine moved it here... Knowing that, just make sure you send your .htaccess in the right format, and use the info from the Search Engine forum for what to put in it.

You have to deal with .htaccess on Apache servers... but the content of that file comes from other experiences.

g1smd

7:27 am on Mar 31, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Serving a 302 redirect is NOT a good idea.

Additionally, your site does this if the referrer is missing - so anyone using an Internet Security product that strips out the sending of referrer data and anyone using an ISP that implements a caching proxy system will see your site as imageless and broken.

Finally, the code is inefficient in many ways and has several syntax errors. Try:

RewriteCond %{HTTP_REFERER} .
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(jpe?g|gif|png|bmp)$ - [F,NC]
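To see what those conditions actually test, here is a rough Python model of the referer check (example.com is the placeholder from the rules above; the real check happens inside mod_rewrite, not application code):

```python
import re

# Referers considered "ours", per the RewriteCond pattern
#   !^https?://(www\.)?example\.com/  [NC]
OWN_SITE = re.compile(r'^https?://(www\.)?example\.com/', re.IGNORECASE)
IMAGE = re.compile(r'\.(jpe?g|gif|png|bmp)$', re.IGNORECASE)

def blocked(path, referer):
    """True if the request would get a 403 under the rules above."""
    if not IMAGE.search(path):
        return False   # the RewriteRule only matches image extensions
    if not referer:
        return False   # blank referer is allowed through (first RewriteCond)
    return not OWN_SITE.match(referer)

print(blocked("/pics/photo.jpg", "http://scraper.example/page"))   # True
print(blocked("/pics/photo.jpg", ""))                              # False: direct request
print(blocked("/pics/photo.jpg", "http://www.example.com/a.html")) # False: own page
```

Note the blank-referer case: browsers behind referer-stripping proxies still get the images, which addresses the "imageless and broken" problem described above.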

wilderness

1:48 pm on Mar 31, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



"do you provide hot link protection" which is on my control panel.


Using CP to implement such procedures is a BAD practice. Many provide bad syntax, while some providers even implement the procedures in a layer above your domain that possibly prevents future removal or editing of the action.

this results in any site that tries to grab your files with the extensions you select


For clarification, your interpretation of this could be mistaken.
What these lines actually do is merely deny requests for objects (pages or files) in which a referring URL is actually provided in the request.
The lines will not prevent a direct request for the files, that's another and more complicated prevention.

The other thing I ask for, and they have this, is to get access to my raw log files so I can run my own progs to separate the data.


If you're with a hosting provider that does not offer raw logs, then you need to find another host immediately.