Forum Moderators: phranque

Questions about hotlinking in .htaccess

silverbytes

3:50 pm on Jun 27, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



99% of the hotlinking of my images comes from the "hotlinker.com" site.
I thought that blocking with "Deny from (hotlinker.com IP here)" would be enough, but I see tons of 404 errors in my logs.

1) Does that mean my protection is working?
2) If so, how do I get rid of those 404s?
3) Is there a better way to prevent my images from being hotlinked?
4) Will image search engines such as Google Images still be able to show my photos if I use some other kind of protection?

Any help is much appreciated.

[edited by: jdMorgan at 4:58 pm (utc) on June 28, 2007]
[edit reason] exemplified hotlinking domain [/edit]

jdMorgan

3:03 am on Jun 28, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If a host is denied, you should see a 403-Forbidden, not a 404-Not Found response, so you have a problem.

If the images are hotlinked by hotlinker.com, that is, they are displayed on the hotlinker.com site, then hotlinker.com will be the HTTP_REFERER, not the REMOTE_HOST. When your image is fetched, it is fetched by a visitor's browser, not fetched by hotlinker.com. So, the REMOTE_HOST and the REMOTE_ADDR, which is what "Deny from" refers to by default, are not the hotlinker.com hostname or IP address, but rather those of the visitor to that site.

You can achieve what you intended with this instead:


SetEnvIf Referer "^http://(www\.)?hotlinker\.com" hlink
...
Deny from env=hlink

Note that if you use a custom error document, you must Allow that custom error document to be served to clients requesting hotlinked URLs. If you do not, another 403-Forbidden error will be generated when the client attempts to fetch the custom 403 page as a result of the image denial. Then you'll get yet another 403 error because of that -- and another, and another... ad infinitum, until the client gives up.

I also suggest you allow universal access to robots.txt. The result is something like:


SetEnvIf Request_URI "custom403\.html$" allowit
SetEnvIf Request_URI "robots\.txt$" allowit
SetEnvIf Referer "^http://(www\.)?hotlinker\.com" hlink
Order Deny,Allow
Allow from env=allowit
Deny from env=hlink
Deny from 38.0.0.0/8
Deny from 63.146.13.64/27
Deny from 63.148.99.224/27

etc.
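A note on the custom error page itself: the directive that designates it is not shown above. Assuming the page lives at /custom403.html in the site root, the wiring would be:

ErrorDocument 403 /custom403.html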

Remember that you may use only one "Order" directive in each .htaccess file; you will have to integrate this code with your existing mod_access directives, if any.

Jim

silverbytes

2:31 pm on Jun 28, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks, but why should I assume those requests are coming from hotlinker.com?

This is what my log entries look like:

85.138.209.** - - [25/Jun/2007:09:44:14 -0300] "GET /wallpapers/homer-simpson-wallpaper.htm HTTP/1.1" 200 6401 "http://www.hotlinker.com/friend/profile/displayProfile.do?userid=25293***" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; SIMBAR Enabled; SIMBAR={AB31FD6B-060D-41a7-997C-EAD0A75608C3}; SIMBAR=0)"

Note: replaced some numbers with *** to fit guidelines

[edited by: jdMorgan at 5:01 pm (utc) on June 28, 2007]
[edit reason] Exemplified hotlinking domain name [/edit]

jdMorgan

4:57 pm on Jun 28, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



...Because I cited that domain as a generic "example.com" that is hotlinking your images. I have edited all posts in this thread to clarify by making the hotlinking domain name consistent.

Your log entry confirms that, as I described above, you should deny access based on the referer, not based on the remote host IP address or hostname.

Jim

[edited by: jdMorgan at 5:01 pm (utc) on June 28, 2007]

silverbytes

9:17 pm on Jun 29, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I appreciate the answer; unfortunately I really don't fully understand the mechanism (my fault).
Must I do that for each and every hotlinking site?

jdMorgan

12:34 am on Jun 30, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Most people opt to block all referrers except their own domain(s) and possibly allow a few other exceptions.
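For example (a sketch only, with example.com standing in for your own domain), the same SetEnvIf approach used above can be inverted into a whitelist that denies image requests from all referers except the ones you approve:

SetEnvIf Referer "^$" okref
SetEnvIf Referer "^http://(www\.)?example\.com" okref
<FilesMatch "\.(gif|jpe?g|png)$">
Order Deny,Allow
Deny from all
Allow from env=okref
</FilesMatch>

The blank-referer pattern "^$" allows direct requests and most search-engine image crawlers, which typically send no Referer header at all.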

Jim

silverbytes

1:29 pm on Jul 2, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Interesting. What syntax do you use to block all referrers except your own and a few others?

Must I manually allow Google Images and other search engines so they aren't blocked from showing my photos in image search?

wilderness

2:42 pm on Jul 2, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There's an example thread in the forum library.

[webmasterworld.com...]

silverbytes

3:17 pm on Jul 2, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If I understand correctly, I could just find my top 3 abusive entries and substitute them into the code (what I called "abusivehotlinker2" and "abusivehotlinker3" in this example).

Then use a custom 403 page, like custom403.html, to send them there.

But what are those "Deny from" IPs in the example?

Deny from 38.0.0.0/8
Deny from 63.146.13.64/27
Deny from 63.148.99.224/27

Original code:

SetEnvIf Request_URI "custom403\.html$" allowit
SetEnvIf Request_URI "robots\.txt$" allowit
SetEnvIf Referer "^http://(www\.)?hotlinker\.com" hlink
SetEnvIf Referer "^http://(www\.)?abusivehotlinker2\.com" hlink
SetEnvIf Referer "^http://(www\.)?abusivehotlinker3\.com" hlink
Order Deny,Allow
Allow from env=allowit
Deny from env=hlink
Deny from 38.0.0.0/8
Deny from 63.146.13.64/27
Deny from 63.148.99.224/27

wilderness

3:56 pm on Jul 2, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If I understand correctly, I could just find my top 3 abusive entries and substitute them into the code (what I called "abusivehotlinker2" and "abusivehotlinker3" in this example).

Then use a custom 403 page, like custom403.html, to send them there.

You're going to make far too much work for yourself by concentrating on individual abusers, when in fact you should implement a solution that blocks hot-linking or deep-linking by ALL abusers.

But what are those "Deny from" IPs in the example?

Deny from 38.0.0.0/8
Deny from 63.146.13.64/27
Deny from 63.148.99.224/27

Jim provided those IP ranges as examples of commonly denied ranges (for instance, 38.0.0.0/8 covers every address from 38.0.0.0 through 38.255.255.255). One of them belongs to Cyveillance (I did not check the others).

Original code:

SetEnvIf Request_URI "custom403\.html$" allowit
SetEnvIf Request_URI "robots\.txt$" allowit
SetEnvIf Referer "^http://(www\.)?hotlinker\.com" hlink
SetEnvIf Referer "^http://(www\.)?abusivehotlinker2\.com" hlink
SetEnvIf Referer "^http://(www\.)?abusivehotlinker3\.com" hlink

Adding quotes is redundant and bloats the code.
I can assure you that if the day ever comes when you have to pore over 20-25 pages of .htaccess looking for a syntax error, you'll regret every unnecessary character you added.
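For example, this unquoted form is equivalent, since the pattern contains no spaces:

SetEnvIf Referer ^http://(www\.)?hotlinker\.com hlink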

Order Deny,Allow
Allow from env=allowit
Deny from env=hlink
Deny from 38.0.0.0/8
Deny from 63.146.13.64/27
Deny from 63.148.99.224/27

Don

[edited by: jdMorgan at 5:02 pm (utc) on July 2, 2007]
[edit reason] Repaired formatting. [/edit]