rencke and rc,
Do you have any thoughts about keeping spiders away from the content pages and only allowing them to feed on the NOFRAMES content of the frameset?
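For anyone following along, the technique being discussed looks roughly like this: the frameset page carries a NOFRAMES block, which frame-unaware spiders read and index instead of the framed pages. File names here are just placeholders:

```html
<!-- index.html: the frameset page (file names are hypothetical) -->
<html>
<head><title>Example Site</title></head>
<frameset cols="200,*">
  <frame src="nav.html" name="nav">
  <frame src="content.html" name="main">
  <noframes>
    <body>
      <!-- This is the text a non-frame-aware spider would index -->
      <p>Welcome to Example Site. <a href="content.html">Browse the content</a>.</p>
    </body>
  </noframes>
</frameset>
</html>
```

Keeping spiders on this block while keeping them off the content pages themselves is the open question in the thread.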
Well, the site I'm referencing had no concerns with hiding the content, and my philosophy regarding spidering is generally that more is better. Since the content was dynamically generated (using a query string, "?"), I assumed it was not going to be spidered anyway.
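If you'd rather not rely on spiders skipping query-string URLs on their own, you can make it explicit in robots.txt. A minimal sketch, assuming you want every dynamic URL excluded (note the "*" wildcard is an extension honored by the major engines, not part of the original robots.txt standard):

```
# robots.txt -- keep spiders off any URL containing a query string
User-agent: *
Disallow: /*?
```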
No way that I know of. There was a thread yesterday that I believe discussed some tricks to make it less visible, but it was still there in the source.
If you have nothing to hide, why put all your eggs in one basket? The more pages (with different titles and descriptions), the better the chance that one of them ranks high in a search result and brings people into the site.
If, on the other hand, you do have something to hide - well, sorry, I have no idea other than the meta robots tag, and I'm told that not all engines respect it.
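For completeness, the meta robots tag in question is a one-liner placed in the head of each page you want kept out of the indexes:

```html
<!-- In the <head> of each content page to exclude -->
<meta name="robots" content="noindex,nofollow">
```

As noted above, compliance is voluntary: well-behaved spiders honor it, but nothing forces them to.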
BUT - other members of the community linked to the pages on message boards, which I wasn't expecting and hadn't even thought of, and sure enough the boards were spidered, and therefore those pages too. I just checked Google, and every page on my "community site" is indexed. The issue is having duplicate content up.
I'll just get them 404'd. How interesting - how to avoid listings instead of how to get them.
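One way to "get them 404'd" on Apache without deleting the files is a rewrite rule that answers those requests with 410 Gone, which tends to get pages dropped from indexes faster than a plain 404. The path here is hypothetical:

```
# .htaccess -- serve 410 Gone for the old community pages
RewriteEngine On
RewriteRule ^community/ - [G]
```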
Purposely loading the page with words spiders view negatively, such as "links," "resource," and "directory," might also add some defense without getting you marked as a spammer.
As for the boards linking to the help pages: how about a JS snippet that performs a redirect (to a 404 page, perhaps) unless the referrer matches an approved page?
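A minimal sketch of that referrer gate, with hypothetical URLs. Two caveats worth flagging: referrers can be empty or spoofed, so this is a deterrent rather than a lock, and spiders of that era generally didn't execute JavaScript at all, which is arguably the point.

```javascript
// Redirect unless the visitor arrived from an approved page.
// URLs below are hypothetical examples.
function isApprovedReferrer(referrer, approved) {
  // An empty referrer counts as unapproved; match by URL prefix.
  return approved.some(function (prefix) {
    return referrer.indexOf(prefix) === 0;
  });
}

var APPROVED = ["http://www.example.com/help-index.html"];

// In the browser, run the check on page load:
if (typeof document !== "undefined" &&
    !isApprovedReferrer(document.referrer, APPROVED)) {
  window.location.replace("/404.html"); // hypothetical error page
}
```

Since the redirect runs client-side, anyone with JS disabled still sees the page; a server-side referrer check would be needed to close that gap.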