
Forum Moderators: goodroi


Spiders and redirects to disallowed directories

Are the disallowed pages indexed?

6:47 pm on Aug 19, 2005 (gmt 0)

New User

10+ Year Member

joined:May 27, 2005
votes: 0

If a spider follows a link to a page that redirects to a different page which is disallowed in robots.txt, what happens?

For example, let's say someone's site links to www.domain.com/hello.php, which redirects to a page in a directory blocked by this robots.txt:


User-agent: *
Disallow: /disallowed

Will the spider index the page from the disallowed directory?

My guess is that it will be indexed, because robots.txt only keeps the spider from requesting the page directly; in this case the spider didn't request the page by name, but arrived at it "through" a different (allowed) link.
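For reference, here is a minimal sketch (in Python, using the standard library's robots.txt parser; the URLs and paths are illustrative, taken from the example above) of how that robots.txt rule evaluates for the two URLs involved:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt from the example above.
robots_txt = """\
User-agent: *
Disallow: /disallowed
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# /hello.php is not covered by any Disallow rule, so a spider
# may request it directly.
print(parser.can_fetch("*", "http://www.domain.com/hello.php"))

# A hypothetical redirect target under /disallowed/ fails the check.
print(parser.can_fetch("*", "http://www.domain.com/disallowed/page"))
```

The open question is whether the spider re-runs that check on the URL it is redirected to, which the reply below addresses.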

Does anyone have any experience with this who can answer for sure?

7:18 pm on Aug 19, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 31, 2002
votes: 0

The PHP redirect sends an HTTP redirect response (a 301/302 with a Location header), which causes the client to make a brand-new request for the new URL, so yes, robots.txt will apply to that second request.
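A minimal sketch of that behavior (in Python; `polite_fetch` and `fake_fetch` are hypothetical names, and the stand-in fetcher simulates the redirect rather than making real HTTP requests): a compliant spider follows redirects manually and re-checks robots.txt before every request, so the redirect target gets blocked even though the original URL was allowed.

```python
from urllib.parse import urlsplit
from urllib.robotparser import RobotFileParser

def polite_fetch(url, fetch, robots):
    # Follow redirects manually, re-checking robots.txt before
    # each request -- each redirect is a brand-new client request.
    while True:
        if not robots.can_fetch("*", url):
            return None          # blocked: this URL is off-limits
        status, location, body = fetch(url)
        if status in (301, 302) and location:
            url = location       # redirect target becomes the next request
            continue
        return body

def fake_fetch(url):
    # Stand-in server: hello.php redirects into the disallowed directory.
    if urlsplit(url).path == "/hello.php":
        return 302, "http://www.domain.com/disallowed/page.html", ""
    return 200, None, "<html>secret</html>"

robots = RobotFileParser()
robots.parse(["User-agent: *", "Disallow: /disallowed"])

# The redirect target is checked and refused, so nothing is fetched.
print(polite_fetch("http://www.domain.com/hello.php", fake_fetch, robots))
```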

However, many spiders such as Google, Yahoo, and Ask Jeeves/Teoma, will list a URL-only result in their SERPs if they "know about" the URL, but are disallowed by robots.txt from actually fetching the page. In Yahoo's case, they will use the link text they found with the link (if any) to create a listing.

A partial solution is to allow the page to be spidered, but include a <meta name="robots" content="noindex"> tag on the page. However, I've seen Google ignore this occasionally as well, and include a URL-only listing anyway.
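To illustrate why the meta tag works where robots.txt doesn't: the spider is allowed to fetch the page, and only decides not to index it after reading the tag. A rough sketch of that check (in Python; `indexable` is a hypothetical helper, and a real crawler would use a proper HTML parser rather than a regex):

```python
import re

def indexable(html):
    # Look for <meta name="robots" content="...noindex...">.
    # Regex matching is a simplification for illustration only.
    m = re.search(
        r'<meta\s+name=["\']robots["\']\s+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    return not (m and "noindex" in m.group(1).lower())

print(indexable('<meta name="robots" content="noindex">'))  # noindex found
print(indexable('<html><head></head></html>'))              # no tag, indexable
```

Note this only works if the page is *not* disallowed in robots.txt -- a spider that is forbidden from fetching the page never sees the tag, which is why the two mechanisms shouldn't be combined for the same URL.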


