I was curious whether there is an entry you can put in your robots.txt to disallow the same page with multiple query-string variables. For example, I want to disallow page.html, but I also want to disallow page.html?item=1 and page.html?item=2. Is it possible to put something in the robots.txt file that will disallow all of those pages instead of specifying each page with its item number? I have over a thousand items, so listing each one would take up a lot of room in my robots.txt file.
Is it possible?
But don't take my word for it, since I'm completely new to this: check it in Google Webmaster Tools, which has a robots.txt analysis tool that lets you test robots.txt rules against specific URLs. Very handy indeed.
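If you'd rather test a rule locally, Python's standard urllib.robotparser applies the same prefix matching that robots.txt uses. A minimal sketch (example.com, page.html, and the item parameter are just the placeholder names from the question):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt with a single Disallow rule, as in the question.
rules = """
User-agent: *
Disallow: /page.html
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The one rule covers the bare page and every query-string variant,
# because matching is done on the URL prefix.
print(rp.can_fetch("*", "http://example.com/page.html"))         # False
print(rp.can_fetch("*", "http://example.com/page.html?item=1"))  # False
print(rp.can_fetch("*", "http://example.com/page.html?item=2"))  # False
print(rp.can_fetch("*", "http://example.com/other.html"))        # True
```

This only tells you how a well-behaved parser should read the file; it's still worth confirming against each search engine's own tester.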
P.S. Don't forget that Google, Yahoo, MSN and all the other robots act differently from each other. Just because something works for Google does not guarantee it will work for the others.
I guess Disallow: /viewcart.html in the robots.txt works for all variables and values passed along with the file.
Thanks for your help! :)
If the start of the URL completely matches the path in the Disallow line, the URL is disallowed, even if the real URL is longer than the rule.
So Disallow: /page will disallow any URL whose path starts with /page, for example /page.html, /page.html?item=1, or /pages/index.html.
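Putting that together, a single rule should be enough for the original question (page.html and the item parameter are the names from the question; substitute your own path):

```
User-agent: *
# Prefix match: this also blocks /page.html?item=1, /page.html?item=2, ...
Disallow: /page.html
```

One thing to watch: because it is a prefix match, Disallow: /page would also block /pages/ and anything else starting with /page, so use the most specific path you can.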