define('HOST_NAME', 'www.example.com');
$test_url = 'http://' . HOST_NAME;
$socket = @fsockopen(@gethostbyname(HOST_NAME), 80);
fwrite($socket, "HEAD $test_url HTTP/1.1\r\nHost: " . HOST_NAME . "\r\nConnection: Close\r\n\r\n");

$i = 0;
$header = '';
while ($i < 20)
{
    $s = fgets($socket, 4096);
    $header = $header . $s;
    if (strcmp($s, "\r\n") == 0 || strcmp($s, "\n") == 0)
    {
        break;
    }
    $i++;
}
fclose($socket);
echo $header;
If I change example.com to <some_other_url.com>, $header comes back as 400 Bad Request, but the Server Headers tool [webmasterworld.com] returns 200 for the same URL. What's wrong with my code?
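As a quick cross-check, you can let PHP's built-in get_headers() build the request for you and compare its status line with what the socket code gets back. This is only a diagnostic sketch, assuming allow_url_fopen is enabled; the user_agent string is a placeholder:

<?php
// Ask the default HTTP stream wrapper to send a HEAD request
// and identify itself with a (placeholder) User-Agent string.
stream_context_set_default(array(
    'http' => array('method' => 'HEAD', 'user_agent' => 'HeaderCheck/0.1'),
));

$headers = get_headers('http://www.example.com');
if ($headers === false) {
    echo "Request failed\n";
} else {
    echo $headers[0] . "\n";   // e.g. "HTTP/1.1 200 OK"
}
?>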
Send headers more like this:
HEAD / HTTP/1.1
Host: www.example.com
User-agent: iProgramBot http://www.iProgramBot.com
and provide us with a Web page to explain why you're accessing our sites. Otherwise, I regret that on my sites you'll always get a 403 response, unless I check your Web page and decide to allow your user-agent. I'd also recommend that you read and follow robots.txt if you intend to fetch multiple URLs from other sites; it's the polite thing to do, and it saves you from getting added to blacklists.
Jim
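For what it's worth, here is a minimal sketch of the request format Jim describes: request the path ("/") rather than the absolute URL, send a Host header, and identify the client with a User-agent. The agent name and its URL are placeholders, and error handling is limited to a bare check on fsockopen():

<?php
define('HOST_NAME', 'www.example.com');

// Connect by host name; fsockopen() does the DNS lookup itself.
$socket = fsockopen(HOST_NAME, 80, $errno, $errstr, 10);
if ($socket === false) {
    die("Connection failed: $errstr ($errno)\n");
}

// Request the path, not the full URL, and say who you are.
$request  = "HEAD / HTTP/1.1\r\n";
$request .= "Host: " . HOST_NAME . "\r\n";
$request .= "User-agent: ExampleBot http://www.example.com/bot.html\r\n";
$request .= "Connection: Close\r\n\r\n";
fwrite($socket, $request);

// Read until the blank line that ends the header block.
$header = '';
while (!feof($socket)) {
    $line = fgets($socket, 4096);
    if ($line === false || $line == "\r\n" || $line == "\n") {
        break;
    }
    $header .= $line;
}
fclose($socket);

echo $header;
?>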