
Better to Exclude part of Site?

spider individual pages or index pages?

8:24 am on May 25, 2002 (gmt 0)

dcheney, WebmasterWorld Senior Member, 10+ Year Member

I have a site that happens to be very large (70+ MB, 8,000+ pages).
Because of this, I have "index" pages that link to the individual pages.
Would it be wise to exclude either the "index" pages or the individual pages using robots.txt? I was thinking of excluding the individual pages, simply because I doubt many spiders will bother to crawl the entire site.
The site itself is also highly interlinked: an individual page often carries dozens of links to other pages within the site.
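For what it's worth, excluding the individual pages would amount to a robots.txt along these lines. This is only a sketch: the /pages/ prefix is a hypothetical stand-in, since the thread doesn't say how the site's directories are actually laid out.

```
# Hypothetical robots.txt: block all crawlers from the individual
# pages (assumed to live under /pages/), leaving index pages open.
User-agent: *
Disallow: /pages/
```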

site on profile

brotherhood of LAN

2:33 pm on May 25, 2002 (gmt 0)

WebmasterWorld Administrator, Top Contributor of All Time, 10+ Year Member, Top Contributors of the Month

Hello dcheney,

It's a good question. In my case, the only content I disallow is content that isn't relevant to the site (i.e. the contact us / feedback / "buy this" sort of thing).

You have to consider that if you exclude certain parts of your site, you will prevent the bot from navigating the whole site, and that will affect your overall navigation (as far as certain search engines and PageRank are concerned).
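The tradeoff described above can be seen directly with Python's standard-library robots.txt parser. This is a minimal sketch under an assumption not in the thread: that individual pages live under a hypothetical /pages/ path while index pages live under /index/.

```python
# Demonstrate how a crawler honoring robots.txt would treat a site
# that disallows its individual pages but leaves index pages open.
import urllib.robotparser

# Hypothetical robots.txt for the scenario discussed in the thread.
robots_txt = """\
User-agent: *
Disallow: /pages/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Index pages remain fetchable for any user agent...
print(rp.can_fetch("*", "http://example.com/index/a.html"))   # True
# ...but the individual pages (and any links only they carry)
# are off-limits, so the bot never sees most of the site.
print(rp.can_fetch("*", "http://example.com/pages/item1.html"))  # False
```

Because the individual pages hold most of the internal links, blocking them this way cuts off the bot's path through the site, which is exactly the navigation/PageRank concern raised here.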

I'm sure one of the more experienced members here will be able to chip in. My largest site is about a quarter of the size of yours, and it's pretty much left open to spidering.
