Forum Moderators: open
To prevent part of a page from being indexed, add the tag
<!-- FreeFind Begin No Index -->
before the section of the page to be ignored, and the tag
<!-- FreeFind End No Index -->
after the section of the page to be ignored. When the spider notices these tags, it will exclude the text between them from the index.
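So as I understand the quoted instructions, the comments would wrap the section like this (just a sketch; the paragraphs are placeholders for my own page content):

```html
<p>This paragraph will be indexed as normal.</p>

<!-- FreeFind Begin No Index -->
<p>This paragraph, and anything else between the two
   comments, is supposed to be skipped by the spider.</p>
<!-- FreeFind End No Index -->

<p>Indexing resumes here.</p>
```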
Will this actually work?
What I am trying to do is stop a redirect link from being indexed by the search engines. Otherwise it creates a duplicate of the page for the website that is listed on my site.
I thought about using robots.txt, but that would also block other pages which I want indexed. I tried using rel="nofollow" in the link, but this won't work because of the way the redirect link is built:
<a href={TRACKLINKURL} class="reciplink" {EXTERNALLINKS}>{LINKTITLE}</a>
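For reference, this is roughly what I tried; the {TRACKLINKURL}, {EXTERNALLINKS} and {LINKTITLE} tokens are placeholders filled in by my link-tracking template, so I can only add the attribute around them:

```html
<a href={TRACKLINKURL} class="reciplink" rel="nofollow" {EXTERNALLINKS}>{LINKTITLE}</a>
```

Even with that in place, the redirect URL itself still gets picked up.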
Any help would be appreciated.
This no-index section tag would be great if I knew it actually worked.
Or will this only work with the FreeFind search engine, whatever that is?
If so, has anyone any idea how I can achieve what I am trying to do?
User-agent: *
Disallow: /file.html
(If you have reciprocal links on someone else's site, you should always check that they have not used the nofollow attribute, because if they have, it will HARM your own site's ranking.)