
Preventing unnecessary pages from being indexed


Bradley

8:39 pm on Aug 2, 2004 (gmt 0)

10+ Year Member



This is a rather basic question, but the answer could provide a lot of value to those running their own forums. Search engines will try to index any and all pages they can reach. On some forums, there are toolbar links within a thread that look like:

[example.com...]

[example.com...]

[example.com...]

Do these forums prevent the search engines from indexing these pages? If so, how is this accomplished?

I looked at the robots.txt file but I didn't find anything there.

[edited by: rogerd at 9:06 pm (utc) on Aug. 2, 2004]
[edit reason] No specifics, please... [/edit]

rogerd

10:31 pm on Aug 2, 2004 (gmt 0)

WebmasterWorld Administrator rogerd is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I think it's a good idea to prevent "junk" pages from being indexed - forms, odd versions, etc. Normally these won't be well-ranked, but I'd prefer the SEs to focus on the good content instead of scarfing down low-content pages.

You can restrict indexing with your robots.txt file and/or a robots NOINDEX meta tag. The advantage of the robots.txt approach is that well-behaved bots will never even request the page. The downside is that Google, when it finds the link on a "good" page, may stick that URL in its index for future reference. It won't spider it, but it may keep the entry for it. If you put in the NOINDEX tag, that should prevent it from being listed in any way. Of course, the spider has to request the page to see that tag, so you'll still incur some of the bandwidth usage.
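
To make that concrete, here's a minimal sketch of both approaches. The /forum/print/ and /forum/email/ paths are made-up examples; substitute whatever URL patterns your forum software actually uses for its toolbar pages.

A robots.txt entry that keeps well-behaved bots from ever requesting those pages:

  User-agent: *
  Disallow: /forum/print/
  Disallow: /forum/email/

Or the meta tag, placed in the <head> of each page you want kept out:

  <meta name="robots" content="noindex,follow">

The "follow" part lets the spider keep following links on the page while dropping the page itself from the index; use "noindex,nofollow" if you don't want the links followed either.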

 
