I have three URL patterns.
I want to allow crawlers to access the first two patterns, but not to follow the links on those pages. The third pattern I need to disallow entirely.
Basically, I want to allow the first page but disallow the second page.
How can I do this in robots.txt?
You will also need to re-list, in the Googlebot section, everything else that you want Google to stay out of: once a Googlebot section exists, Googlebot no longer looks at the generic User-agent: * section at all.
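To illustrate, here is a minimal sketch of that layout. The paths are hypothetical placeholders, since the original post does not list the actual URL patterns:

```
# Googlebot reads ONLY this group once it exists,
# so every rule meant for Google must be repeated here.
User-agent: Googlebot
Disallow: /third-pattern/
Disallow: /other-blocked-area/

# Generic group for all other crawlers.
# Googlebot ignores this group entirely when its own group is present.
User-agent: *
Disallow: /third-pattern/
```

Note that robots.txt controls crawling, not link following on pages that are allowed; to stop a crawler from following links on a page it may fetch, that is usually done with a nofollow robots meta tag on the page itself rather than in robots.txt.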