Forum Moderators: coopster
I'm using print "Location:http://mysite.com/redirectionpage.html\n\n"
as my redirection command in the CGI program and call it with
<!--#exec cgi="program.cgi"-->
If the User Agent is not to be redirected, the requested page is served normally. But when the User Agent is to be redirected, the originally requested page is still shown, with a link to the redirection target page added instead of the redirect actually happening.
Anything I can do to get it to redirect?
For example:
<!--#exec cgi="/cgi-bin/example.cgi" -->
If the script returns a Location: header instead of output, then this will be translated into an HTML anchor.
The include virtual [httpd.apache.org] element should be used in preference to exec cgi. In particular, if you need to pass additional arguments to a CGI program, using the query string, this cannot be done with exec cgi, but can be done with include virtual, as shown here:
<!--#include virtual="/cgi-bin/example.cgi?argument=value" -->
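For reference, here is a minimal sketch of what the CGI side of that might look like. The script name and the `argument` parameter are just the placeholders from the example above; with include virtual, Apache passes the query string to the script in the QUERY_STRING environment variable:

```
#!/usr/bin/perl
# Hypothetical /cgi-bin/example.cgi for the include above.
# Apache sets QUERY_STRING to "argument=value" when the SSI is
# written as <!--#include virtual="/cgi-bin/example.cgi?argument=value" -->
my %args = map { split /=/, $_, 2 } split /&/, ($ENV{QUERY_STRING} || "");

print "Content-type: text/html\n\n";
print "Got argument: $args{argument}\n";
```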
I guess that might answer the question, try include virtual and see if that works.
Are you on a Microsoft server? If so, the syntax will be somewhat different. The following should work with Perl on Apache.
print "Pragma: no-cache\n";
print "Location: http*://www.somesite.com\n\n";
For Microsoft servers with Perl this **might** work. Never tried it before.
print "Status: 301 Moved Permanently\n","Location: http*://www.somesite.com\n\n";
Then, to make the redirect, use the include directive jatar_k referred to.
Tried the virtual include but the results were the same.
Key_Master,
I'm on a UNIX server. Added the "Pragma..." line, but no change in results.
netcommr,
I have no knowledge of mod_rewrite, but I'll look into it.
Thanks for the advice all. Maybe I need to attack this from a different angle, such as mod_rewrite.
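For what it's worth, the mod_rewrite approach can be done entirely in .htaccess, with no CGI at all. A minimal sketch, where the agent names and the badbotpage.html target are placeholders you would swap for your own:

```
RewriteEngine On
# Match known bad robots by User-Agent, case-insensitively.
RewriteCond %{HTTP_USER_AGENT} (EmailSiphon|WebCopier|BadBot) [NC]
# Don't rewrite the bad-bot page itself, or the redirect would loop.
RewriteCond %{REQUEST_URI} !^/badbotpage\.html$
# External redirect; [L] stops processing further rules.
RewriteRule .* /badbotpage.html [R=302,L]
```

Because the redirect happens at the server level, it works regardless of how the page is built, SSI or not.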
Website Technology Issues [webmasterworld.com]
Do you want to maybe let us in on why you want to redirect some agents?
Thanks for the link.
I created a new site about a month ago and it's attracted too many spambots and other unfriendly robots in my view. So I was looking for a way to keep them out. I'm already using .htaccess for some IPs and robots.txt for those that will obey it. The script was for the rest that showed a User Agent.
This is all probably more effort than it's worth, but it began to annoy me so I decided to do something about it.
if ($badbot == 1) {
    # SSI swallows the Location: header, so fall back to a meta refresh.
    my $url = "http://domain.com/badbotpage.html";
    print "Content-type: text/html\n\n";
    print "<HTML><HEAD>";
    print "<META HTTP-EQUIV=\"REFRESH\" CONTENT=\"0;URL=$url\">";
    print "</HEAD><BODY></BODY></HTML>";
    exit;
}