
Forum Moderators: goodroi


Blocking a file in the main folder?

7:53 pm on Jan 22, 2005 (gmt 0)

10+ Year Member



The location to my file is [example.com...]

To block link.php for all robots, should I use this?

User-Agent: *
Disallow: /link.php

I'm not sure whether I need the "/" after Disallow: or not?

12:35 am on Jan 23, 2005 (gmt 0)

10+ Year Member




I'm not sure if I need the "/" or not after Disallow:?

Strictly speaking, the path after Disallow: should begin with "/". Some crawlers tolerate leaving it out, but correct syntax includes it.


User-Agent: *
Disallow: /link.php

Correct. :)
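If you want to sanity-check a rule like this before deploying it, Python's standard urllib.robotparser can parse the file in memory (example.com here is just a placeholder domain):

```python
from urllib.robotparser import RobotFileParser

# Parse the proposed robots.txt rules without fetching anything.
rp = RobotFileParser()
rp.parse([
    "User-Agent: *",
    "Disallow: /link.php",
])

# The blocked file is disallowed for any bot...
print(rp.can_fetch("*", "https://example.com/link.php"))    # False
# ...while other paths remain crawlable.
print(rp.can_fetch("*", "https://example.com/index.html"))  # True
```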

2:57 am on Jan 23, 2005 (gmt 0)

10+ Year Member



Ok ;-)

One more quick question:

link.php is what we use to count clicks. We have a function called url which, when used in this syntax, counts the click and then redirects to an outside site.

Example:

[example.com...]

Would the robots.txt in my message above prevent robots from surfing through any number of the above links? Or JUST the link.php file itself?

The reason I am doing this is that I don't want to be penalized if any of the links behind my redirect show errors or point to a "bad neighborhood". Am I correct in assuming this does what I want it to?

3:33 am on Jan 23, 2005 (gmt 0)

WebmasterWorld Administrator mack, Top Contributor of All Time, 10+ Year Member



If you block link.php, then that file can't be spidered, with or without additional variables within the URL (link.php?q=#*$!xxxx).

In order to spider any of those URLs, a robot needs access to link.php; if you have blocked access to that, it can't go any further.
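This can be verified with Python's standard urllib.robotparser: a Disallow rule is a path prefix, so every query-string variant of link.php falls under the same rule (the domain and query values below are placeholders):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-Agent: *",
    "Disallow: /link.php",
])

# Every variant shares the /link.php prefix, so one rule
# covers the bare file and all redirect links through it.
for query in ("", "?q=site-one", "?q=site-two&x=1"):
    url = "https://example.com/link.php" + query
    print(url, rp.can_fetch("*", url))  # False for each variant
```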

Mack.