| 4:58 am on Apr 25, 2011 (gmt 0)|
It's better IMO to use the meta ROBOTS NOINDEX tag for those pages, because only Google's non-standard robots.txt extensions are flexible enough to do what you want, which still leaves the other search engines to be dealt with.
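For reference, a per-page noindex directive is a single tag in the page's <head>, along these lines:

    <meta name="robots" content="noindex, follow">

The "follow" value is optional; it tells crawlers they may still follow the links on the page even though the page itself should stay out of the index.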
| 5:22 am on Apr 25, 2011 (gmt 0)|
I don't think I can use the robots NOINDEX tag for those pages, as they are automatically generated by my forum software, but I'm going to look into their FAQ to see if that is possible...
| 8:58 am on Apr 25, 2011 (gmt 0)|
You would have to conditionally include it per page, which is a programming change. It's very unlikely that's already supported, but it's something they should support given Google's stance on thin/duplicate content, so maybe they'll implement it for you. Can't hurt to ask.
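To illustrate the kind of change involved (a minimal sketch only, not the forum software's actual code; the app=core marker is borrowed from the URLs discussed below, and the marker list is hypothetical):

    # Sketch: emit a robots meta tag only for thin/duplicate page types.
    # THIN_PAGE_MARKERS is a hypothetical list; real forum software would
    # decide this from its own routing, not by scanning raw query strings.
    THIN_PAGE_MARKERS = ("app=core",)

    def robots_meta(query_string):
        """Return a noindex tag for thin pages, or an empty string."""
        if any(marker in query_string for marker in THIN_PAGE_MARKERS):
            return '<meta name="robots" content="noindex, follow">'
        return ""

    # e.g. robots_meta("app=core&module=global") returns the noindex tag

The template would call something like robots_meta() while building the page head, so only the thin page types carry the tag.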
| 9:16 am on Apr 25, 2011 (gmt 0)|
I will get in contact with the forum software people then.
One last question regarding this: could I at least block the following for now?
(for some reason I can't post that URL, it breaks into gibberish; here's a screenshot of the URL)
I wonder if I can at least block the ?app=core part.
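Googlebot honours the non-standard * wildcard in robots.txt, so a rule along these lines should catch those URLs (this assumes app=core is the first query parameter; /*app=core would be the broader pattern if it can appear later in the query string):

    User-agent: Googlebot
    Disallow: /*?app=core

Worth testing in Webmaster Tools before relying on it, since not all search engines support wildcards.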
| 7:59 pm on May 2, 2011 (gmt 0)|
Just to let you know, I tackled the lesser problem for now and settled on blocking the following;
within 4 days about 5,000 pages were blocked, and as a nice side effect the visitor level is almost back to pre-Panda level...
| 5:51 pm on May 10, 2011 (gmt 0)|
I have a somewhat similar issue. I want to block user profiles for a forum where the URL looks like this:
How can I block all of those profiles using robots.txt?
Thanks in advance.
| 6:42 pm on May 10, 2011 (gmt 0)|
...I would assume
should do the trick. If you have Google WMT you can test whether it blocks them correctly...
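Since the actual URL didn't survive posting here either, only a generic illustration is possible: if the profile URLs share a common path or query parameter, a wildcard rule covers them. Both patterns below are placeholders, substitute whatever your profile URLs actually share:

    User-agent: Googlebot
    # hypothetical: profiles live under a common path
    Disallow: /members/
    # hypothetical: profiles are addressed by a query parameter
    Disallow: /*user=

The robots.txt tester in WMT will confirm whether a given profile URL is matched.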