| 11:12 am on Apr 28, 2011 (gmt 0)|
Yes, you can use the Allow directive. Example:
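A minimal sketch of the idea, checked with Python's standard-library robots.txt parser. The folder and page names are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: the whole /private/ folder is blocked,
# but one page inside it is explicitly allowed. Putting the Allow
# line first keeps the result the same under both first-match
# (original spec) and longest-match (Google) rule ordering.
rules = """
User-agent: *
Allow: /private/the-one-page.html
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "/private/the-one-page.html"))  # True
print(parser.can_fetch("*", "/private/other-page.html"))    # False
```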
| 1:44 pm on Apr 28, 2011 (gmt 0)|
Since it is only one page among hundreds, why not put this one page in a folder that is not blocked and link to it from there?
| 1:53 pm on Apr 28, 2011 (gmt 0)|
Not all user-agents understand "Allow", so moving the page to a different folder is likely the safer option.
| 11:23 am on Jul 12, 2011 (gmt 0)|
This works. Thanks
| 11:26 am on Jul 12, 2011 (gmt 0)|
It works for Google and some others.
It does not work for all.
| 12:01 am on Jul 21, 2011 (gmt 0)|
How can I block those endings/versions of pages from getting spidered?
- everything that has slideshow.php in the URL
- everything that ends with ?page=1
Here is another situation:
Here I would like to block TV-widget pages so that only TV-widgets are in the SERPs.
So I guess a 301 redirect or a robots.txt rule is needed, but what would it look like?
| 7:31 am on Jul 21, 2011 (gmt 0)|
Blocking specific endings is very easy: you specify a "*" wildcard for the beginning and then the unique string to block (Google's syntax also supports a "$" anchor so the rule only matches at the end of the URL).
For the other examples, remember that robots.txt matches "from the left", so you need to specify enough of the URL to match what you want to match and not match what you don't want to match.
will block both
will block only
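That left-anchored matching can be sketched in Python. This is a rough model of Google-style wildcard rules as documented by Google (the stdlib robots.txt parser ignores wildcards, so a small regex translation is used instead); all patterns and URLs here are made-up illustrations, not the posters' originals:

```python
import re

def pattern_matches(pattern: str, path: str) -> bool:
    """Google-style robots.txt matching: anchored at the left,
    "*" matches any run of characters, a trailing "$" anchors the end."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"       # turn a trailing "$" into a real anchor
    return re.match(regex, path) is not None

# Matching is from the left, so a short pattern catches more:
print(pattern_matches("/widgets", "/widgets/tv.html"))     # True
print(pattern_matches("/widgets", "/widgets-blue.html"))   # True
# A longer pattern matches only the narrower set:
print(pattern_matches("/widgets/", "/widgets-blue.html"))  # False

# A leading wildcard blocks a string appearing anywhere in the path:
print(pattern_matches("/*slideshow.php", "/gallery/slideshow.php?id=3"))  # True
# A trailing "$" blocks only URLs that *end* with the string:
print(pattern_matches("/*?page=1$", "/videos/list?page=1"))   # True
print(pattern_matches("/*?page=1$", "/videos/list?page=12"))  # False
```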
However, a 301 redirect is preferred, as it also preserves some PageRank and stops further proliferation of the incorrect URLs being copied and pasted into new links.
| 10:22 am on Jul 21, 2011 (gmt 0)|
Thanks, buddy. If I want to block anything that has to do with slideshow.php?bla bla,
is that right?
| 1:36 pm on Jul 21, 2011 (gmt 0)|
Yes. That blocks requests that begin with <somestuff>, followed by "slideshow.php", which may or may not then be followed by <otherstuff>.
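Spelled out, a rule like "Disallow: /*slideshow.php" (assuming Google-style wildcard support) is equivalent to a left-anchored regex; the example URLs are made up:

```python
import re

# Anything may come before "slideshow.php" and anything may come after;
# the pattern is still anchored at the start of the path.
blocked = re.compile(r"/.*slideshow\.php")

for path in ("/slideshow.php",
             "/gallery/slideshow.php?album=2",
             "/gallery/photos.php"):
    print(path, bool(blocked.match(path)))
```

Only the last path escapes the rule, because it contains no "slideshow.php" anywhere.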