Forum Moderators: goodroi
I have two dynamic URL patterns:
1. http://mydomain.com/index?id=(.*)&query=(.*)
2. http://mydomain.com/index?id=(.*)&query=(.*)&start=10&pager.offset=(.*)
I want to allow robots to crawl the first page, but I don't want them to crawl the pages containing "&start". How can I do this?
If I use "Disallow: /index?id", it will block both URL patterns. So how can I be more specific?
In my robots.txt I have added:
User-agent: *
Disallow: /index
User-agent: Googlebot
Disallow: /index*start*
Is this correct?
Please help me.
regards
kiran
I would not include "index" in the Googlebot line of robots.txt. I would just have Disallow: /*start*. That will exclude all URLs with "start" in them.
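To see how Googlebot-style wildcard rules behave, here is a minimal sketch (not Google's actual implementation) that converts a robots.txt pattern to a regex, where "*" matches any sequence of characters and "$" anchors the end of the URL, and tests it against the two URL types from the question:

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Rough approximation of Googlebot wildcard matching for a Disallow pattern."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"       # '*' matches any run of characters
        elif ch == "$":
            regex += "$"        # '$' anchors the end of the URL
        else:
            regex += re.escape(ch)
    # robots.txt rules match from the start of the URL path
    return re.match(regex, path) is not None

# With "Disallow: /*start*":
print(rule_matches("/*start*", "/index?id=1&query=foo"))                # False: crawlable
print(rule_matches("/*start*", "/index?id=1&query=foo&start=10"))       # True: blocked
```

One caveat with such a broad rule: it blocks any URL merely containing the substring "start" (for example a query value like "restart"), so if that is a risk on your site, a more specific pattern such as Disallow: /*&start= may be safer.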
[google.com...]