
PHP Server Side Scripting Forum

    
If statement for nofollow urls with numbers
Trying to block access to tag/386 for eg
cheaperholidays




msg:3561810
 10:07 am on Jan 30, 2008 (gmt 0)

Hello

I wonder if you can help?

We have:

<?php
if ((is_home() && ($paged < 2)) || is_single() || is_page() || is_category()) {
    echo '<meta name="robots" content="index,follow" />';
}
?>

What I would like to do is noindex,nofollow all URLs with numbers in that folder, e.g. /tag/386, so that any number following /tag/ is blocked.

I cannot for the life of me find any PHP code for 'greater than'.
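A minimal sketch of that idea, assuming the tag archives really are reached as /tag/<number> and that $_SERVER['REQUEST_URI'] reflects those paths (the pattern below is an illustration, not code from the site):

<?php
// Sketch: noindex,nofollow any URL whose path starts with /tag/<number>.
// The pattern and reliance on REQUEST_URI are assumptions about the site's URLs.
$uri = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '';
if (preg_match('#^/tag/\d+#', $uri)) {
    echo '<meta name="robots" content="noindex,nofollow" />';
} else {
    echo '<meta name="robots" content="index,follow" />';
}
?>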

Kind regards

Les

 

PHP_Chimp




msg:3561815
 10:23 am on Jan 30, 2008 (gmt 0)

Could you not use robots.txt [robotstxt.org]?

User-Agent: *
Disallow: /tag/

cheaperholidays




msg:3561828
 10:59 am on Jan 30, 2008 (gmt 0)

Thanks PHP_Chimp, I have already done that; I wanted to combine both approaches to make sure.

Les

robsoles




msg:3561855
 11:59 am on Jan 30, 2008 (gmt 0)

Hey Les,

I'm far from an expert in PHP, and I fear I may be missing what you really mean by your question about 'greater than':

Your snippet uses 'less than' on the variable '$paged'; to make that condition the opposite (i.e. 'greater than' in PHP), just exchange the '<' symbol for a '>'. Without writing a lot of script out (*Google: conditional php), you are probably better off with something like:

<?php
// Con_Index() and WriteMeta() are placeholders you would write yourself:
// Con_Index() decides whether the current page should be indexed, and
// WriteMeta($name, $content) returns the corresponding <meta> tag.
if (Con_Index() > 0) {
    echo WriteMeta("robots", "index,follow");
} else {
    echo WriteMeta("robots", "noindex,nofollow");
}
?>

There will of course be a million ways this can be tackled, and probably only ten percent of them are worth the time taken to write and test. I don't blame you for wanting to back your robots.txt file up, but if a bot is ignoring robots.txt it is at least a bit unlikely it will honour even 'rel="nofollow"'.

Good luck,
robsoles.

cheaperholidays




msg:3561890
 12:55 pm on Jan 30, 2008 (gmt 0)

Thanks Rob

I have decided to block it with robots.txt and leave it at that. You know what it's like: you think "I wonder if this will work" and off you go on the PHP tangent...

Les

omoutop




msg:3561908
 1:06 pm on Jan 30, 2008 (gmt 0)

Let's see... you have something like: www.example.com/tag/345/maybe_more/folders/somefile.htm

I am guessing you use .htaccess for this?
If so, your URLs are being constructed as http://www.example.com/tag/$somevar/$othervar/$pagename.htm

If I am guessing right this far:

// Guard with isset() so there is no notice when the parameter is missing.
if (isset($_GET['somevar']) && (int) $_GET['somevar'] > 0) {
    echo '<meta name="robots" content="noindex,nofollow" />';
}
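Folding that back into the snippet from the first post, a combined version might look roughly like this; the GET parameter name is only the guess above, and the WordPress conditionals are taken from that snippet:

<?php
// Sketch: noindex numbered tag pages, keep index,follow for the pages the
// original condition allowed. 'somevar' is only a guess at the parameter name.
$tag_number = isset($_GET['somevar']) ? (int) $_GET['somevar'] : 0;

if ($tag_number > 0) {
    echo '<meta name="robots" content="noindex,nofollow" />';
} elseif ((is_home() && ($paged < 2)) || is_single() || is_page() || is_category()) {
    echo '<meta name="robots" content="index,follow" />';
}
?>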

PHP_Chimp




msg:3561948
 1:59 pm on Jan 30, 2008 (gmt 0)

Although robots.txt will not stop bots viewing the page, neither will the robots meta tag; both are just recommendations to the bots to stay away. If you are really bothered about bots not obeying your robots.txt, then you could go down the route of blocking by the UA string. However, most of the spam bots use normal UA strings, and it's only bots like Google, MSN, etc. that follow robots.txt, so there is not much point in trying to block them by UA string.
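A bare-bones sketch of UA-string blocking in PHP, purely for illustration; the agent substrings are made up, and as noted above this only catches bots honest enough to identify themselves:

<?php
// Illustrative only: refuse requests whose User-Agent contains a blocked substring.
$blocked_agents = array('BadBot', 'EvilCrawler');   // placeholder names
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

foreach ($blocked_agents as $agent) {
    if (stripos($ua, $agent) !== false) {
        header('HTTP/1.1 403 Forbidden');
        exit;
    }
}
?>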

As far as I'm aware there are no robots that support robots.txt but don't support the meta tags, and vice versa. So either method should work; two lines in robots.txt is just a lot quicker than a PHP alternative.

omoutop




msg:3561978
 2:53 pm on Jan 30, 2008 (gmt 0)

Well, if you have to exclude some hundreds of pages, PHP is preferred.
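For example, one central array of patterns can cover hundreds of URLs in a few lines; the patterns below are placeholders, not paths from this site:

<?php
// Sketch: one list of URL patterns that should not be indexed.
$noindex_patterns = array(
    '#^/tag/\d+#',
    '#^/print/#',
);

$uri = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : '';
$robots = 'index,follow';

foreach ($noindex_patterns as $pattern) {
    if (preg_match($pattern, $uri)) {
        $robots = 'noindex,nofollow';
        break;
    }
}

echo '<meta name="robots" content="' . $robots . '" />';
?>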
