pageoneresults - 1:48 pm on Jul 23, 2007 (gmt 0)
I'm wondering if it would work equally well to block access to a "Links.html" page by using a disallow in robots.txt?
<meta name="robots" content="none">
The above will work better.
With a file name like that, you may not need to disallow or block anything. ;)
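To make the contrast concrete, here is a minimal sketch of the robots.txt approach from the quoted question, assuming the page lives at the site root. Note the key difference: a Disallow rule only stops compliant crawlers from fetching the page, and a disallowed URL can still appear in the index if other sites link to it. The meta robots tag (equivalent to "noindex, nofollow") keeps the page out of the index, but only works if the crawler is allowed to fetch the page and read the tag, so the two should not be combined.

```
# robots.txt (hypothetical path) — blocks crawling, but the URL
# can still be indexed from external links pointing at it
User-agent: *
Disallow: /Links.html
```
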
Don't believe me? That's okay. I'm sure there are many out there who can confirm the above. In fact, I know there are. :)
P.S. Link exchange directories are one surefire way to get flagged by the algo.
Unfortunately the abuse has reached levels where those who are doing it with editorial discretion might find themselves caught up in collateral damage. That's why I think link exchange directories have outlived their usefulness. They are more of an anchor now than anything else.
Try this experiment. Remove your link exchange directory from your site, just 410 it! Take those links that you consider the cream of the crop and disperse them throughout your site where appropriate. Don't put them all in one area, that's a signal.
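For anyone wanting to "410 it" on an Apache server, a one-line sketch using mod_alias (the /links/ path is hypothetical, substitute your own directory):

```
# .htaccess — serve "410 Gone" for the retired link directory,
# telling crawlers the content was removed intentionally
Redirect gone /links/
```

Unlike a 404, a 410 explicitly tells search engines the removal is permanent, so the old URLs tend to drop out of the index faster.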