
How to make a page *inaccessible* to searches

without using password protection?

     
5:15 pm on Nov 5, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:Oct 8, 2002
posts:65
votes: 0


Hi,
I need to create a page... the URL will be sent to a select few, so we don't want people to come across it by searching. Is this even possible or does the page have to be password protected?

Thanks.
h+h

5:20 pm on Nov 5, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member nick_w is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 4, 2002
posts:5044
votes: 0


Sure, just add a robots.txt [webmasterworld.com] file to your HTTP document root (where you keep your pages).

The file should look like this:


User-agent: *
Dissallow /

That'll stop all bots.

Nick

5:23 pm on Nov 5, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 21, 2003
posts:2355
votes: 0


That'll stop all bots.

That'll stop all bots that obey/read the robots.txt file. Don't count on this method, and don't put data in a public place that you don't want to be "found".
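If you want a server-side backstop against the rogues, one approach on Apache (a sketch, assuming mod_setenvif is available; "BadBot" stands in for a user-agent string you actually see in your logs) is to refuse them outright:

# .htaccess - enforced by the server, not by the bot's good manners
SetEnvIfNoCase User-Agent "BadBot" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot

Bots can fake their User-Agent string, of course, so even this is filtering, not protection.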

5:25 pm on Nov 5, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:Oct 8, 2002
posts:65
votes: 0


Wow, thanks for that. Can I do that in a separate directory, so that only the page in that directory is inaccessible to the bots?
5:30 pm on Nov 5, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member nick_w is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 4, 2002
posts:5044
votes: 0



Disallow: /dir/page.html

I *think* that'd do it...

Nick
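Two notes on that rule: a Disallow line only takes effect underneath a User-agent line, and the value is matched as a URL prefix (anything starting with /dir/page.html is covered). The complete file for that one page would be:

User-agent: *
Disallow: /dir/page.html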

5:33 pm on Nov 5, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:Oct 8, 2002
posts:65
votes: 0


And does the robots.txt file go within that directory, or still at the root?
5:35 pm on Nov 5, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member heini is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Jan 31, 2001
posts:4404
votes: 0


I'd say password protection is by far the better method in this case. One of the reasons rogue bots are called rogue is that they don't obey robots.txt.
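On Apache that's only a few lines of .htaccess in the directory you want to protect (a sketch; the realm name and file paths are placeholders, and the password file is created first with the htpasswd utility):

# .htaccess in the protected directory (sketch; paths are placeholders)
# create the password file first: htpasswd -c /full/path/to/.htpasswd someuser
AuthType Basic
AuthName "Private area"
AuthUserFile /full/path/to/.htpasswd
Require valid-user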
6:19 pm on Nov 5, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member jomaxx is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 6, 2002
posts:4768
votes: 0


Don't use Nick_W's first suggestion unless you want to ban ALL good bots from your site, which I don't think is your intention. His second suggestion would also work, but has the downside of telling the whole world the name of this secret page.

I suggest protecting a subdirectory using the sample below, then putting the page there and giving it a more-or-less unguessable name (i.e. not index.html). Also make sure that if someone nosy tries to access that directory, your server doesn't cough up a list of the files in it (see the note after the sample). That should be sufficient unless the data is really sensitive, in which case you should definitely password-protect it.

User-agent: *
Dissallow /some_private_dir/
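Two side notes on that sample: the robots.txt file itself always stays at the document root, since crawlers only ever request /robots.txt from the top of the site (which answers the placement question above); and on Apache you can switch off directory listings with one line in the directory's own .htaccess (a sketch, assuming the server allows Options overrides):

# .htaccess in /some_private_dir/ - stop Apache auto-generating a file listing
Options -Indexes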

6:53 pm on Nov 5, 2003 (gmt 0)

Junior Member

10+ Year Member

joined:Oct 8, 2002
posts:65
votes: 0


thanks for your help!
10:41 pm on Nov 5, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member jomaxx is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Nov 6, 2002
posts:4768
votes: 0


**** TYPO ALERT: "Disallow" has just one "s" in it. I cut-and-pasted part of that code from an earlier post without proofreading it. Use this instead:

User-agent: *
Disallow: /some_private_dir/

11:22 pm on Nov 5, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:May 14, 2002
posts:1192
votes: 0


If all you want to do is make sure the page is not advertised to the whole world in the major search engines, then I suppose a suitable robots.txt will suffice.

If you want to be sure that nobody unauthorized can read it, there is simply no alternative to a password-protection scheme.
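One more option that, like robots.txt, relies on the bot's cooperation: a robots meta tag in the page's own <head>, which keeps the page out of the index without a robots.txt entry that names it:

<!-- in the <head> of the private page itself -->
<meta name="robots" content="noindex,nofollow">

Like robots.txt, though, it only works on bots that choose to honor it.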

8:04 pm on Nov 10, 2003 (gmt 0)

New User

10+ Year Member

joined:July 7, 2003
posts:3
votes: 0


One trick I've heard is to link to the new URL using a button's onclick handler, or a nested form with a submit button, to go to the page in question. Most robots (and please correct me if I'm wrong) don't follow these types of links.
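A sketch of that idea, using the example path from earlier in the thread (crawlers of this era don't submit forms or execute JavaScript):

<!-- nested form: robots don't submit forms -->
<form action="/dir/page.html" method="post">
<input type="submit" value="Continue">
</form>

<!-- or a scripted button: robots don't run the onclick -->
<input type="button" value="Continue" onclick="window.location='/dir/page.html'">

Anyone who views the page source can still read the URL, though, so this hides the link from crawlers, not from people.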
 
