| 6:53 pm on Feb 14, 2000 (gmt 0)|
I want to create a robots.txt file that will do the following.
Disallow certain pages to ALL spiders, and then disallow some other pages to only one of the spiders, e.g.:
moreinfo.htm not allowed for any spider
moreinfo-alt.htm not allowed for Excite only
Also, in the User-agent: line, what exactly am I putting there? Would it be scooter or Scooter/2.0 G.R.A.B. V1.1.0?
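A sketch of what that robots.txt could look like, assuming the two pages sit in the site root and that Excite's crawler identifies itself as ArchitextSpider (an assumption; check your server logs for the actual name). Note that a crawler obeys only the most specific User-agent group that matches it, so the Excite group has to repeat the global rule:

```
# Group for every crawler with no more specific group below
User-agent: *
Disallow: /moreinfo.htm

# Excite's crawler reads only this group, so the global
# rule must be repeated here alongside the extra one
User-agent: ArchitextSpider
Disallow: /moreinfo.htm
Disallow: /moreinfo-alt.htm
```

As for the User-agent value: the original robots.txt spec recommends a case-insensitive substring match on the robot's name without version information, so the short token (e.g. Scooter) is what belongs there, not the full Scooter/2.0 G.R.A.B. V1.1.0 string.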
| 8:25 pm on Feb 14, 2000 (gmt 0)|
A full robots.txt spec is at webcrawler. Half of it is Greek to me but there are examples in section 4 that are pretty simple.
There's also a syntax checker at robots syntax, which is very handy.
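If you have Python handy, its standard library ships a robots.txt parser you can use to sanity-check a rules file before uploading it. A minimal sketch, using the rules from the question above (ArchitextSpider as Excite's crawler name is an assumption):

```python
# Verify robots.txt rules with Python's standard-library parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /moreinfo.htm

User-agent: ArchitextSpider
Disallow: /moreinfo.htm
Disallow: /moreinfo-alt.htm
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Every crawler is blocked from /moreinfo.htm ...
print(parser.can_fetch("Scooter", "/moreinfo.htm"))              # False
# ... but only Excite's crawler is blocked from /moreinfo-alt.htm.
print(parser.can_fetch("ArchitextSpider", "/moreinfo-alt.htm"))  # False
print(parser.can_fetch("Scooter", "/moreinfo-alt.htm"))          # True
```

Because `can_fetch` applies the same "most specific matching group wins" logic real crawlers use, this catches the common mistake of forgetting to repeat the global Disallow lines inside a crawler-specific group.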
WebmasterWorld ® and PubCon ® are registered trademarks of Pubcon Inc.
© Pubcon Inc. 1996-2012 all rights reserved