I broke it up into 4 pages - header, left nav, right nav, and footer. The main part of the page is where the content that changes resides.
I was excited about how dynamic and flexible this is - I can change the site, update the template, and I'm set.
However, the problem I see with this is that the links are all on one page, therefore decreasing the interlinkage. Basically, instead of receiving 20 inbound links (from the nav), the inner pages only receive one link from the left nav page.
Is there a way to work around this? How effective is PHP for optimizing sites?
With regard to interlinking: I know that my own internal links and those of other sites have been pretty much discounted by Google after the earthquake that was the last update. I wouldn't stress. Just make sure that your pages are easy for Mrs. Googlebot to spider and find. A site map may be in order. :)
To rank you need keywords in anchor text. Keywords in anchor text ONCE (which is all you get with this PHP setup) isn't going to cut it - I think you need the keyword in the anchor text of at least 10% of inbound links to be able to rank for that word.
A site map won't cut it - that's just 1 more link - so far you have only 2 inbound links for the inner pages.
Apart from taking the left nav out of PHP (which is what I'm about to do, but I don't want to because it means updating that section on every page once I change the nav) - what else is there to do?
But: how about creating a sort of PHP iframe for your left nav? This should be what you're doing anyway. The menu is a PHP include on all the pages, so it will be present on each page - or Google will think it is, anyhow. Google doesn't care whether the menu is a PHP include file or not; it will index the menu separately on each page it features on. I.e. the menu may be calling menu.php, but Google will re-index it every time it spiders an individual page with the menu.php include on it. Does that make any sense?
I'm using the "require" function.
Each page is composed of the 4 "template" parts plus the body of the page, which is unique to every page.
Do you know if this is correct?
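For what it's worth, a require-based template typically looks something like this (the file and div names here are just placeholders, not your actual ones):

```php
<?php
// page.php - one content page assembled from the four template parts
require 'header.php';    // <head>, opening <body>
require 'leftnav.php';   // left navigation links
?>
<div id="content">
  <!-- unique body content for this particular page -->
  <p>Page-specific content goes here.</p>
</div>
<?php
require 'rightnav.php';  // right navigation
require 'footer.php';    // closing tags
?>
```

The browser (and the spider) only ever sees the final assembled HTML, so the include boundaries are invisible in the output.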
E.g.
URL mysite.com/page/45
Runs a PHP script called page (server side, with all the necessary includes).
The script parses out the parameter 45 and displays the data for item 45, and no-one needs to know it's really a dynamic page.
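A rough sketch of that parsing step, assuming the server routes /page/* requests to this script (the routing itself would be done with mod_rewrite or similar):

```php
<?php
// page.php - pull the numeric ID out of a URL like /page/45
$path = $_SERVER['REQUEST_URI'];           // e.g. "/page/45"
if (preg_match('#^/page/(\d+)$#', $path, $m)) {
    $id = (int) $m[1];                     // 45
    // ...look up and display the data for item $id...
} else {
    header('HTTP/1.0 404 Not Found');
}
?>
```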
I broke it up into 4 pages - header, left nav, right nav, and footer. The main part of the page is where the content that changes resides.
Header, nav and footer are not separate pages; they are 1 page. I'm not following your problem with internal links. The unique pages are your content pages that make up the body of each page.
As charlier mentioned, take a look at the source of the page and it should make sense. The spider sees what the browser sees.
That's why PHP is so good for optimizing: I can do a million and one things and the spider doesn't know the difference. I build the perfectly optimized template and have it included on all the pages, without query strings and using html extensions. Spiders are in heaven, and I only have to tweak one page and it echoes across the whole site.
<added> I would strongly suggest against using a custom 404 to serve pages. You end up with one of two problems: the site never actually sends a 404, or you are sending 404 headers for every page on the site. I can't imagine Google will rank a bunch of unfound pages, regardless of whether it spiders them or not.
But maybe I'm off track and you are actually using some sort of framing tech as well, or within PHP?
I use this rewrite command in htaccess
RewriteEngine on
RewriteBase /data/
RewriteRule ^filename-(.*)\.html$ filename.php?variable=$1
on a page with 5 sections (header, nav, body, footer, and special box) and it works perfectly.
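On the PHP side, the rewritten query string arrives in $_GET as usual. A minimal sketch (the sanitizing step is my addition, not part of the original setup):

```php
<?php
// filename.php - receives /data/filename-foo.html rewritten by the
// rule above to filename.php?variable=foo
$variable = isset($_GET['variable']) ? $_GET['variable'] : '';

// strip anything unexpected before using the value to select content
$variable = preg_replace('/[^a-zA-Z0-9_-]/', '', $variable);
```

Visitors and spiders only ever see the static-looking .html URL.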
For example, I send:
Header("HTTP/1.1 200 OK");
Header("Status: 200 OK");
Also you need to use the $_SERVER['REQUEST_URI'] variable to get the requested path, rather than the PHP_SELF variable, which will have been changed to the URL of your error document. One caveat with this approach is you lose POST data if you are submitting a form. For my email archives I use a structure like /listname/vol/docprefix_docID.html. The PHP code pulls the URL apart and selects the correct list, volume, page type and page from the database and displays it.
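Put together, a rough sketch of that error-document handler might look like this (the path structure follows the archive example above; the variable names are illustrative):

```php
<?php
// handler.php - Apache's ErrorDocument 404 points here; we serve the
// real content and replace the 404 status with a 200
header('HTTP/1.1 200 OK');
header('Status: 200 OK');

// PHP_SELF now points at this error document, so parse the
// originally requested path from REQUEST_URI instead
$uri   = $_SERVER['REQUEST_URI'];  // e.g. "/listname/vol/doc_123.html"
$path  = parse_url($uri, PHP_URL_PATH);
$parts = explode('/', trim($path, '/'));
// $parts[0] = list name, $parts[1] = volume, $parts[2] = document
// ...query the database with these pieces and display the page...
```

Note the caveat mentioned earlier still applies: a form POST to one of these URLs loses its data on the way through the error handler.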
As I said in my prior post, Google has 22,000 pages from this archive (5 lists) and the home page and current index pages have PR7s.
Cheers
CharlieR