how to block a page at a particular location

experienced
I want to block a particular file that is located under various folders, all with the same file name:
domain.com/folder2/file.php
domain.com/folder3/file.php
domain.com/folder4/file.php
I have approximately 15,000 folders like this, and I want to block file.php in every one of them. What is the best way to block them? I don't want to make the robots.txt file lengthy by putting all the URLs inside it. Is there any other quick solution?
Help appreciated in advance :-)
[fixed confusing typo in title]
[edited by: goodroi at 1:54 pm (utc) on Aug. 13, 2007]
An easy way to handle this with the big search engines is to use wildcards, a.k.a. pattern matching. IMPORTANT: this is not supported by most other bots.
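For example, a wildcard rule for Googlebot might look like the sketch below (assuming the target really is named file.php, as in the question — the `*` matches any sequence of characters, so the rule covers the file at any folder depth):

```
User-agent: Googlebot
Disallow: /*file.php
```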
Google Robots.txt Pattern Matching Explained
[ ...] google.com
Yahoo Robots.txt Wildcards Explained
[ ...] ysearchblog.com

experienced
Will this expression block this file at any level? If yes, then this would be really helpful for me. Please reply if possible.

new_seo
If you want to block a particular file which is located under various folders with the same file name, then I think you can do it simply with:

User-agent: *
Disallow: /file.php

goodroi
This will block Google from all filenames containing "my-file-name.php".
Please remember that most smaller search engines do not support this in robots.txt. Also, if you want to test other combinations for Googlebot, you can go to Google's Webmaster Central and use their robots.txt analysis tool.
This will disallow any URL that begins with /file.php:

Disallow: /file.php

... and that only works for URLs in the ROOT, i.e. URLs that BEGIN with that.
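You can see this root-only prefix matching with Python's standard `urllib.robotparser`, which, like most non-Google bots, does plain prefix matching and no wildcards (the domain and bot name below are just placeholders):

```python
from urllib.robotparser import RobotFileParser

# A plain Disallow rule, interpreted with simple prefix matching
# (how most bots, and Python's stdlib parser, read robots.txt).
robots_txt = """\
User-agent: *
Disallow: /file.php
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Blocked: the URL path begins with /file.php
print(rp.can_fetch("anybot", "http://domain.com/file.php"))          # False
# NOT blocked: prefix matching never reaches file.php inside folders
print(rp.can_fetch("anybot", "http://domain.com/folder2/file.php"))  # True
```

This is why the rule suggested earlier in the thread only protects a root-level file.php, not the 15,000 copies in subfolders.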
You need the * to make it work for folders:

Disallow: /*file.php
This rule MUST go in the User-Agent: Googlebot section. Other bots do not understand the * wildcard.
If you have a User-agent: Googlebot section, then ALL of your rules for Googlebot must go in that section, as Googlebot will then completely IGNORE the User-agent: * section.
You do this even if it means duplicating a lot of rules in both sections.
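Putting the advice above together, a complete robots.txt might look like this sketch (the /private/ path is a hypothetical extra rule, added only to show the duplication):

```
# Googlebot reads this section and ignores the "*" section entirely.
User-agent: Googlebot
Disallow: /*file.php
Disallow: /private/

# Everyone else: duplicate any shared rules here, without the
# wildcard, since most other bots do not understand "*".
User-agent: *
Disallow: /file.php
Disallow: /private/
```

Note that the plain `/file.php` line in the `*` section still only covers a root-level file.php; the wildcard coverage is available only to bots that support pattern matching.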