
Forum Moderators: incrediBILL


Theme-based site - how to make a <head> tag for part of the site

     
8:57 pm on Apr 9, 2012 (gmt 0)

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Hello - I have a site where the header is the same across the whole site, which means that when I add a noindex it applies to every page. But I only want the noindex on part of the site, for example on example.php. Is there a way to do that - create a separate head section just for that part of the site? I'm not interested in a robots.txt solution.
9:23 pm on Apr 9, 2012 (gmt 0)

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month



"Two versions of the header" is the first thing that comes to mind. Do your index and no-index files have something in common, like location, or are they randomly mixed all over the place? Are the headers constructed by a php script? (You said "example.php" but this is the HTML forum, so I can't tell :()

If the files are distinguished in some obvious way, it should be pretty trivial to write a couple of extra lines of php telling it to look at the name of the requesting file. If the name includes /openfolder/ use the OK-to-index version. If the name includes /closedfolder/ use the noindex version.
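
In PHP that could be as little as something like this (just a sketch of the idea; /openfolder/ and /closedfolder/ are the example folder names above, and the two header file names - header.php and header-noindex.php - are hypothetical):

if (strpos($_SERVER['SCRIPT_NAME'], '/closedfolder/') !== false) {
    // Name includes /closedfolder/: use the noindex version of the header
    include($_SERVER['DOCUMENT_ROOT'] . '/header-noindex.php');
} else {
    // Otherwise (e.g. /openfolder/): use the OK-to-index version
    include($_SERVER['DOCUMENT_ROOT'] . '/header.php');
}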

If the whole header is assembled on the fly, then of course you wouldn't need the different versions. You'd just have that one extra <meta blahblah noindex> tag, either included or omitted.
9:30 pm on Apr 9, 2012 (gmt 0)

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Lucy, you are everywhere :) OK, here is the setup:

The head tag is in index.php (maybe this really should be in the PHP section of WW). Then, some folders deep, a little like WordPress, we have the theme. One page there is picture.php, where I would like to have a meta noindex, but the header from index.php is used everywhere.
2:02 am on Apr 10, 2012 (gmt 0)

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



I'm not interested in a robots.txt solution


Then an if-then-else solution probably wouldn't excite you either.
3:01 am on Apr 10, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Maybe you could have someone cook you up a nifty .htaccess instruction to generate an X-Robots-Tag header for selected pages over in the Apache forum. The moderator, perhaps.
3:33 am on Apr 10, 2012 (gmt 0)

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month



The moderator, perhaps.

!
I was going to say "Don't hold your breath," but I got suspicious and took a look.

I'll be ###. Where did they hide the announcement? It's like a good news / bad news joke, isn't it? The bad news is that the Forum Founder has become irrevocably entangled in Real Life...
5:08 am on Apr 10, 2012 (gmt 0)

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



It's like a good news / bad news joke, isn't it?


Scuse me?

Tread lightly ... ;)

I'm no Apache guru; I can write a few lines like most of us, but the forum needed a lot of non-expert mod work, so I volunteered until a better option surfaces.
9:29 am on Apr 10, 2012 (gmt 0)

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member



incrediBILL - it doesn't have to be a header solution, but robots.txt would not work because blocking each page individually would take two months. If there is another solution to get a noindex on my "final pages", as I call them, that would also be great.
1:22 pm on Apr 10, 2012 (gmt 0)

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



blocking each page individually would take two months


I don't think NOINDEX works any faster, if that's your only criterion. I've had robots.txt blocking alter what's in Google's index pretty fast in some instances, not so fast in others. However, if you want to speed up the process, after altering robots.txt (or whatever you end up using), go into WMT and use Crawler Access -> Remove URLs to get rid of them quickly.
1:59 pm on Apr 10, 2012 (gmt 0)

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Ohh, that was not what I meant. I would have to add each of those pages to robots.txt - it's not a folder, it's individual pages.
3:55 pm on Apr 10, 2012 (gmt 0)

WebmasterWorld Senior Member rocknbil is a WebmasterWorld Top Contributor of All Time 10+ Year Member



10 posts into the thread I'll venture:

- Your pages are rendered via PHP.
- The header is part of index.php; see Option 2 below.
- The question is:

What is (or are) the condition(s) under which you'd like to issue this noindex?

Option 1 is easiest: in your PHP,

if ($condition) {
    echo '<meta name="robots" content="noindex,nofollow">';
}

If there are only a few such conditions, you can use ifs. If there are many, you may want to use an array or a preg_match.

$noindex = array('example.php', 'this.php', 'that.php');

// basename() strips the path, so we compare just the file name of the requested script
if (in_array(basename($_SERVER['SCRIPT_NAME']), $noindex)) {
    echo '<meta name="robots" content="noindex,nofollow">';
}
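
A preg_match variation would look something like this (a sketch only; the file names in the pattern are just examples from this thread):

if (preg_match('/^(example|picture)\.php$/', basename($_SERVER['SCRIPT_NAME']))) {
    echo '<meta name="robots" content="noindex,nofollow">';
}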

Option 2: your header really should be an external include. You can use the same logic in your index.php file to include the right one (or any number of "right" ones).

$headers = array(
    'example.php' => 'ex-header.php',
    'this.php'    => 'this-header.php',
    'that.php'    => 'that-header.php'
);

// Name of the requested script, without the path
$page = basename($_SERVER['SCRIPT_NAME']);

if (array_key_exists($page, $headers)) {
    include($_SERVER['DOCUMENT_ROOT'] . '/' . $headers[$page]);
} else {
    include($_SERVER['DOCUMENT_ROOT'] . '/standard-header.php');
}

These are a few of a bazillion **simple** ways you could approach the problem.
8:01 pm on Apr 10, 2012 (gmt 0)

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month



That's what I said

:: whine ::

minus the exact wording, on account of I don't speak php :(
10:51 pm on Apr 10, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I apologize. I was in a crazy mood the other day, but I didn't want to put anybody on the spot.

I did a little research and found some information about what I had in mind. I intended to suggest that a <Files> directive could be used to manage this issue in one fell swoop. You could use something like:

<Files "contact.html">
Header set X-Robots-Tag "noindex"
</Files>

This example only works for the named file, but you can expand it to use regular expressions by preceding the file name with a "~", or by using <FilesMatch>. If you wanted to apply this header to a whole directory, you could use the <Directory> directive instead. Both Google and Bing support the X-Robots-Tag header.
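
For instance, a <FilesMatch> version targeting a couple of specific pages might look like this (a sketch only; it assumes mod_headers is enabled, and the file names are just examples from this thread):

# Requires mod_headers
<FilesMatch "^(example|picture)\.php$">
    Header set X-Robots-Tag "noindex"
</FilesMatch>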
12:36 am on Apr 11, 2012 (gmt 0)

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month



Oh, and going back to the OP, plus his later explanation of why robots.txt won't cut it:

No matter what approach you take, if you want the pages to disappear right away, you will have to go into gwt and remove them manually. The "noindex" tag by itself won't make search engines delete the file from existing indexes; nothing will change until the next time they come looking for the file.
 
