Forum Moderators: coopster


Optimizing PHP sites

is it effective?


2_much

10:55 pm on Jul 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've recently been playing with PHP and am about to launch a site in PHP.

I broke it up into 4 pages - header, left nav, right nav, and footer. The main part of the page is where the content that changes resides.

I was excited about how dynamic and flexible this is - I can change the site, update the template, and I'm set.

However, the problem I see with this is that the links are all in one file, therefore decreasing the interlinkage. Basically, instead of receiving 20 inbound links (from the nav), the pages only receive one link from the left nav page.

Is there a way to work around this? How effective is PHP for optimizing sites?

edit_g

11:02 pm on Jul 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



PHP will work fine. Just make sure you don't use too many variables in your URLs, i.e. don't have the DB create pages like blah.com/blah?blah!blah=blah+blah&blah (you know what I mean).

With regard to interlinking: I know that my own internal links and those of other sites have been pretty much discounted by Google after the earthquake that was the last update. I wouldn't stress. Just make sure that your pages are easy for Mrs. Googlebot to spider and you'll be fine. A site map may be in order. :)

2_much

11:23 pm on Jul 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm sorry edit_g, I'm not sure I phrased my question correctly.

To rank you need keywords in anchor text. Keywords in anchor text ONCE (which is all you get with this setup) isn't going to cut it - I think you need the keyword in the anchor text of at least 10% of inbound links to be able to rank for that word.

A site map won't cut it - that's just one more link - so far the inner pages have only two inbound links.

Apart from taking the left nav out of PHP (which is what I'm about to do, though I don't want to, because it means updating that section on every page whenever I change the nav) - what else is there to do?

edit_g

11:29 pm on Jul 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Ok. That still doesn't change the fact that Google is giving less and less weight to internal links.

But: How about creating a sort of PHP iframe for your left nav? This should be what you're doing anyway. The menu is a php include for all the pages - so it will be present on each page. Or Google will think that it is, anyhow. Google doesn't care if the menu is a PHP include file or not, it will index the menu seperately on each page it features on. I.e - the menu may be calling menu.php, but Google will re-index it every time it spiders an individual page with the menu.php include on. Does that make any sense?

<edited> for nonsense...

jatar_k

11:36 pm on Jul 9, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



2_much, would I be correct in assuming that you are including that left nav in the page using the include or require function?

Also do all of your pages have an actual pagename or is all the content just piped through a template?

2_much

1:36 am on Jul 10, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A friend just said to me that PHP is server side (I'm ignorant about tech stuff), so Google sees the page once it's assembled.

I'm using the "require" function.

Each page is composed of the four "template" parts plus the body of the page, which is unique for every page.

Do you know if this is correct?
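As a sanity check of what the crawler ends up seeing, here is a minimal sketch of that assembly. The template parts are inlined as strings so it runs standalone; in the real site each would be a separate file pulled in with require, and all the file names and content here are invented for illustration:

```php
<?php
// Stand-ins for: require 'header.php'; require 'leftnav.php'; etc.
$header    = "<html><head><title>Example</title></head><body>\n";
$left_nav  = "<a href=\"/about.html\">About</a>\n";
$right_nav = "<a href=\"/contact.html\">Contact</a>\n";
$footer    = "</body></html>\n";

// The only per-page piece: the unique body content.
$body = "<p>Unique page content.</p>\n";

// What the server sends - and all Googlebot ever sees - is the
// flattened result; no trace of the include structure remains.
$page = $header . $left_nav . $body . $right_nav . $footer;
echo $page;
```

View source on the served page and you see exactly this concatenated HTML, which is why the includes are invisible to a spider.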

charlier

1:58 am on Jul 10, 2003 (gmt 0)

10+ Year Member



Yes, PHP is all server side - just do a view-source and you will see the page exactly as Google will see it. If you are requiring the nav file on different URLs, then Google will see them just the way it would see static pages. You can of course use the GET data to show different pages on the same URL, but that is a bad idea.

If you want to have all the pages generated from a database and no static pages at all, you are better off using a 404 error document and encoding the page you want to show in the URL itself, e.g. /mypage27.html, where the 27 tells PHP which record to pull from the database. I use this on an email archive which has 22,000 pages in Google, so I know it works. I believe there is also a way to use the rewrite engine on Apache to do this, but I have not tried it. A search here should find quite a bit about how to do this.
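For reference, the error-document half of this setup is a single directive in .htaccess (the handler script name here is my own placeholder, not from charlier's site):

RewriteEngine-style config is not needed for this variant; just:

ErrorDocument 404 /handler.php

Every request for a path that doesn't exist on disk is then handed to that one script, which decodes the URL and serves the right record.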

olwen

2:07 am on Jul 10, 2003 (gmt 0)

10+ Year Member



I'd be a bit wary of using a 404 page to create the dynamic pages. I use a PHP script with no extension and use .htaccess to force it to be executed as PHP.

E.g.
URL mysite.com/page/45

Runs a PHP script called page (server side, with all the necessary includes).

The script parses out the parameter 45 and displays the data for item 45, and no one needs to know it's really a dynamic page.
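Filling in the unstated details (the ForceType directive and the helper function are my guesses at this setup, not code from the post), the .htaccess side would be something like:

<Files page>
ForceType application/x-httpd-php
</Files>

and the script itself can pull the item number out of PATH_INFO:

```php
<?php
// page (no extension): for mysite.com/page/45, Apache passes the
// trailing "/45" to this script in $_SERVER['PATH_INFO'].
function item_id_from_path_info(string $pathInfo): ?int {
    // Accept only a single numeric segment; reject anything else.
    if (preg_match('#^/(\d+)$#', $pathInfo, $m)) {
        return (int) $m[1];
    }
    return null;  // malformed request
}

// In the live script (commented out so the sketch runs standalone):
// $id = item_id_from_path_info($_SERVER['PATH_INFO'] ?? '');
// ...pull item $id from the database and render it with the includes...
```

Because the visible URL is /page/45 with no query string, nothing about it looks dynamic to a spider.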

cayleyv

3:02 am on Jul 10, 2003 (gmt 0)

10+ Year Member



I broke it up into 4 pages - header, left nav, right nav, and footer. The main part of the page is where the content that changes resides.

Header, nav and footer are not separate pages; together they are one page. I'm not following your problem with internal links. The unique pages are your content pages that make up the body of each page.

jatar_k

6:41 am on Jul 10, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



Dead on the money, 2_much. I was leading you a bit because I figured that was exactly the disconnect you were having.

As charlier mentioned, take a look at the source of the page and it should make sense. The spider sees what the browser sees.

That's why PHP is so good for optimizing - I can do a million and one things and the spider doesn't know the difference. I build the perfectly optimized template and have it included on all the pages, without query strings and using .html extensions. Spiders are in heaven, and I only have to tweak one page and it echoes across the whole site.

<added>I would strongly suggest against using a custom 404 to serve pages. You end up with one of two problems: either the site never actually sends a real 404, or you are sending 404 headers for every page on the site. I can't imagine Google will rank a bunch of unfound pages, regardless of whether it spiders them or not.

chiyo

6:49 am on Jul 10, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My understanding, limited as it is, is that cayleyv is correct. On some pages we use PHP includes, and from what I know Google or any robot only sees what is on the page AFTER it is created server side. (A clue is to look at the source code, which is exactly what is on the page - no links to your includes.) Therefore they don't see it as frames or such.

But maybe I'm off track and you are actually using some sort of framing tech as well, or within PHP?

bilalak

6:50 am on Jul 10, 2003 (gmt 0)

10+ Year Member



Do not worry about how many sections there are in your PHP page; what matters most is how optimized the final page is for search engines, including Google.

I use this rewrite rule in .htaccess

RewriteEngine on
RewriteBase /data/
RewriteRule ^filename-(.*)\.html$ filename.php?variable=$1

on a page with 5 sections (header, nav, body, footer, and special box) and it works perfectly.
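With that rule in place, a request for filename-widgets.html becomes filename.php?variable=widgets behind the scenes, and the script reads the variable as usual. A minimal sketch of the receiving side (the sanitizing helper is my own addition, not from bilalak's post):

```php
<?php
// filename.php - target of the rewrite rule above.
// filename-widgets.html  ->  filename.php?variable=widgets
function page_key(array $get): string {
    // Strip anything outside a conservative whitelist before the
    // value goes anywhere near a database lookup.
    return preg_replace('/[^a-z0-9_-]/i', '', $get['variable'] ?? '');
}

// In the live script (commented out so the sketch runs standalone):
// $key = page_key($_GET);
// ...use $key to select the body content; header, nav, footer and
// the special box are assembled around it as usual...
```

The visitor and the spider only ever see the .html URL; the query string exists purely inside the server.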

jatar_k

6:53 am on Jul 10, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



Welcome to WebmasterWorld bilalak

bilalak

7:27 am on Jul 10, 2003 (gmt 0)

10+ Year Member



Thank you Jatar_K

charlier

12:48 pm on Jul 10, 2003 (gmt 0)

10+ Year Member



Sorry I didn't add a lot of detail as this has been the subject of a number of old threads. If you use the Error document approach you need to send your own '200' header.

for example I send

Header("HTTP/1.1 200 OK");
Header("Status: 200 OK");

Also, you need to use the $_SERVER['REQUEST_URI'] variable to get the requested path, rather than the PHP_SELF variable, which will have been changed to the URL of your error document. One caveat with this approach is that you lose POST data if you are submitting a form. For my email archives I use a structure like /listname/vol/docprefix_docID.html. The PHP code pulls the URL apart, selects the correct list, volume, page type and page from the database, and displays it.
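Putting those pieces together, a 404-handler script might look roughly like this. The /mypage27.html scheme and the two header lines come from this thread; the script and function names are placeholders of mine:

```php
<?php
// handler.php - Apache's ErrorDocument 404 target (name assumed).
function record_id_from_uri(string $uri): ?int {
    // /mypage27.html -> 27
    if (preg_match('#^/mypage(\d+)\.html$#', $uri, $m)) {
        return (int) $m[1];
    }
    return null;
}

function serve(string $uri): void {
    $id = record_id_from_uri($uri);
    if ($id === null) {
        // Genuinely unknown URL: let the 404 status stand.
        return;
    }
    // Overwrite the 404 status Apache queued for the ErrorDocument.
    header("HTTP/1.1 200 OK");
    header("Status: 200 OK");
    // ...pull record $id from the database and echo the page...
}

// Use REQUEST_URI, not PHP_SELF: PHP_SELF now points at this error
// document itself, not at the URL the visitor asked for.
// serve($_SERVER['REQUEST_URI']);
```

The early return for unrecognized URLs is what keeps the site able to send real 404s, addressing the objection raised earlier in the thread.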

As I said in my prior post, Google has 22,000 pages from this archive (5 lists) and the home page and current index pages have PR7s.

Cheers
CharlieR