Welcome to WebmasterWorld
Forum Moderated by: goodroi
Using a robots.txt file is part of being a good SEO. Be sure to check yours with the robots.txt validator, available to subscribers.
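Beyond an online validator, robots.txt rules can also be sanity-checked locally. A minimal sketch using Python's standard-library `urllib.robotparser` — the rules and URLs below are hypothetical examples, not taken from any thread:

```python
# Sketch: check robots.txt rules locally with Python's stdlib parser.
# The Allow line is listed before Disallow because this parser applies
# the first matching rule, unlike crawlers that use longest-match.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A URL under /private/ is blocked for all user agents...
print(parser.can_fetch("*", "https://example.com/private/secret.html"))
# ...but the explicitly allowed page and unmatched paths are fetchable.
print(parser.can_fetch("*", "https://example.com/private/public-page.html"))
print(parser.can_fetch("*", "https://example.com/index.html"))
```

Note that real crawlers differ in how they resolve Allow/Disallow conflicts, so a local check is a sanity test, not a guarantee of how any particular bot will behave.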
| Thread Subject | Messages | Started by | Last Message |
|---|---|---|---|
| Preventing a page from being indexed but wanting links to be followed | 3 | realmaverick | 5:24 pm Oct 21, 2008 |
| Disallow .php in root folder but allow in sub. | 3 | mysticalsock | 5:19 pm Oct 21, 2008 |
| robot.txt restricted but still showing! | 9 | member22 | 2:05 pm Oct 21, 2008 |
| robot.txt blocked pages but still alive | 3 | member22 | 1:27 pm Oct 21, 2008 |
| Disallow: /*? - is it ok? | 6 | nex99 | 1:22 pm Oct 21, 2008 |
| How to block indexing of root domain, but still allow sub-domain | 6 | chazeo | 1:20 pm Oct 21, 2008 |
| Block directory but not lower level folder | 2 | Clicknowdomains | 1:04 pm Oct 21, 2008 |
| robots.txt | 2 | kiransarv | 1:00 pm Oct 21, 2008 |
| Sub-domain & crawl-delay | 2 | foxfox | 4:39 pm Oct 7, 2008 |
| Crawl Rate: How do I know when you request slow versus fast | 2 | 11364guy | 7:50 pm Oct 6, 2008 |
| Non-white list blocking | 4 | phred | 12:51 pm Sep 30, 2008 |
| How to attract Yahoo! Slurp Spider | 2 | ouarsenis | 1:28 pm Sep 25, 2008 |
| All pages are in cache but home page is not caching | 3 | pervezalam | 6:43 pm Sep 24, 2008 |
| Robots, duplicate content and Google | 9 | fsmobilez | 8:07 pm Sep 23, 2008 |
| Robots.txt | 4 | srobinson | 3:21 am Sep 23, 2008 |
| Understanding the structure of robots.txt subdirectories | 10 | alahamdan | 12:00 am Sep 5, 2008 |
| Baiduspider - how do I keep it out? | 3 | ChicagoFan67 | 1:44 pm Sep 4, 2008 |
| Restrict special duplicate pages | 4 | sixeleven | 11:58 pm Sep 3, 2008 |
| How to set Slow GoogleBot Crawl Speed | 7 | Senthil | 7:54 am Aug 31, 2008 |
| Errors of robots.txt unreachable in Google Webmaster sitemap | 1 | pervezalam | 9:35 am Aug 29, 2008 |
| What is Robot.txt? | 8 | deepk | 7:16 am Aug 21, 2008 |
| disallow & allow | 3 | doumiao | 6:14 am Aug 15, 2008 |
| Regular Expressions, robots.txt, and robotstxt.org | 4 | Funtick | 7:24 pm Aug 14, 2008 |
| How to disallow https urls | 6 | spiritualseo | 8:39 pm Aug 12, 2008 |
| URLs restricted by robots.txt | 2 | Raheel | 2:28 pm Aug 9, 2008 |
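Several of the threads above revolve around the same handful of directives: wildcard Disallow patterns, per-bot blocks (e.g. Baiduspider), and Crawl-delay. A hypothetical robots.txt combining them might look like this — note that the `*` wildcard and Crawl-delay are crawler-specific extensions honored by some bots (Google, Yahoo, Bing), not part of the original robots exclusion standard:

```txt
# Block all compliant crawlers from URLs containing a query string
# (the * wildcard is an extension; not all bots support it)
User-agent: *
Disallow: /*?

# Keep Baiduspider out entirely
User-agent: Baiduspider
Disallow: /

# Ask Yahoo! Slurp to wait 10 seconds between requests
# (Crawl-delay is a Yahoo/Bing extension; Googlebot ignores it)
User-agent: Slurp
Crawl-delay: 10
```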