Forum Moderated by: goodroi
Using a robots.txt file is part of being a good SEO.
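For readers new to the topic, a minimal robots.txt sketch is shown below. The directory names (`/cgi-bin/`, `/private/`) are placeholders for illustration, not taken from any thread in this forum; the file itself must live at the root of the site (e.g. `example.com/robots.txt`).

```
# Apply these rules to all crawlers
User-agent: *
# Keep crawlers out of these directories (paths are examples only)
Disallow: /cgi-bin/
Disallow: /private/

# A specific bot can be blocked from the whole site
User-agent: BadBot
Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control mechanism, as several threads below point out.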
| Thread Subject | Messages | Started by | Last Message |
|---|---|---|---|
| Robot.txt and Sitemap | 3 | sierra11b | 1:26 am Oct 5, 2005 |
| Newbie Robots.txt file Q | 5 | CoffeeMan | 12:34 pm Oct 3, 2005 |
| SygolBot, from sygol.com: anyone seen this before? Just visited my site, and did NOT respect robots.txt | 3 | stapel | 12:37 am Oct 3, 2005 |
| Can I get a page crawled daily? Disallow commands all I can find | 5 | innocbystr | 11:33 pm Oct 2, 2005 |
| Need to double check my entry Q, regarding ( / ) (forward slash) in entry | 2 | Tutti | 3:59 pm Oct 2, 2005 |
| Trying to add robots.txt to InvisionFree forum, can't get it to work | 5 | Damaris | 1:34 am Sept 29, 2005 |
| What does robots.txt help to do? | 3 | flashcination | 10:40 pm Sept 28, 2005 |
| How to block URLs of other sites linked on my site? | 3 | sweetu | 9:09 pm Sept 25, 2005 |
| How to make a robots.txt file? What are the steps? | 2 | steelcutman1975 | 8:37 pm Sept 23, 2005 |
| Google "URL exclusion" not following our robots.txt! robots.txt in subdirectory is not getting followed | 9 | joeduck | 9:00 pm Sept 21, 2005 |
| Newbie question about robots.txt | 5 | dicken288 | 7:37 am Sept 21, 2005 |
| Keeping engines away from some files: keeping some image files away from the prying search engines... | 3 | ScoobyFlew | 6:05 am Sept 20, 2005 |
| Robot.txt newbie question | 2 | playadonna | 2:51 pm Sept 18, 2005 |
| How long till search bot checks again? | 2 | editordude | 11:47 am Sept 17, 2005 |
| How to prevent hotlinking but allow one site using htaccess? | 2 | hersheyyu | 11:38 pm Sept 15, 2005 |
| Help with disallow | 4 | SlugKing | 10:21 pm Sept 14, 2005 |
| Can spiders see my dynamic "includes" pages? | 3 | zepper | 11:57 pm Sept 13, 2005 |
| Picking up | 5 | stevelibby | 7:04 am Sept 13, 2005 |
| Need help: Google spiders crawl 2 sites, same content | 1 | milindnaik | 6:54 am Sept 12, 2005 |
| Allow search spider - error 503 socket.gaierror: (7, 'getaddrinfo fail | 5 | shahindastur | 5:39 am Sept 12, 2005 |
| Quick question: when something is disallowed, do bots ignore it completely? | 5 | jlander | 2:43 am Sept 12, 2005 |
| Am I correctly blocking diamondbot? | 2 | JAB_Creations | 11:25 am Sept 11, 2005 |
| Does excluding a cgi page automatically disable same page with parameters? | 4 | Clark | 3:47 am Sept 11, 2005 |
| Is it harmful if Google indexes my /feed/ directory? How to disallow safely | 3 | brokenbricks | 2:42 pm Sept 9, 2005 |
| robots.txt wildcard: will this work? | 7 | gosman | 1:59 pm Sept 6, 2005 |