
Forum Moderators: coopster & jatar k


php and search engines?

5:54 am on Jun 9, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 5, 2004
votes: 0

I finally figured out how to dynamically generate the content on my site using PHP. The way I'm doing it is loading a header PHP file, then the individual page's content, then a footer below it with the rest of the template. Would this hurt my search engine rankings? I'm thinking it wouldn't, since the main content is still right there in the main file itself, but then again it's still a .php3 file instead of .html.
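A minimal sketch of the header/content/footer layout described above. The file names here are just placeholders for illustration, not the poster's actual files:

```php
<?php
// page.php3 -- one page of the site; the header and footer are shared templates.
include 'header.php';   // opens <html>, <head>, and the site navigation
?>
<h1>Page title</h1>
<p>The page's own content lives right here in this file, so a spider
   sees it exactly as if it were hand-written static HTML.</p>
<?php
include 'footer.php';   // closes out the template
?>
```

By the time the response leaves the server, the includes have already been stitched in; the spider only ever sees the finished HTML.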
6:46 am on June 9, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 2, 2004
votes: 0

Fewer pages may be indexed, as some pages are likely to be considered duplicate content, and search engines may not be able to read dynamically generated content.
6:48 pm on June 9, 2004 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 21, 2001
votes: 0

1.) I believe you can make the dynamic URLs look like static URLs using mod_rewrite. There's lots of info at WebmasterWorld about doing this (I haven't done it myself).

2.) Make sure you have a nice site map or other static links for the spider.
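Point 1 might look something like this in an `.htaccess` file. The path and parameter names are made up for illustration:

```apache
# .htaccess -- illustrative only; paths and parameter names are invented.
RewriteEngine On

# Serve a static-looking URL from a dynamic script:
#   /articles/42.html  ->  /article.php?id=42
RewriteRule ^articles/([0-9]+)\.html$ article.php?id=$1 [L]
```

The browser (and the spider) only ever sees `/articles/42.html`; the rewrite to the PHP script happens internally on the server.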

2:33 am on June 10, 2004 (gmt 0)

Full Member

10+ Year Member

joined:Oct 15, 2003
votes: 0


If you are using include statements to insert the header and the footer into your pages, you will still be indexed by the search engines. This is how I do my pages and have never had a problem being indexed or ranking well. The problems start when there are session ids in the URL.
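If session IDs in URLs are the worry, PHP can be told to keep sessions in cookies only and never rewrite the ID into links. A sketch of the relevant php.ini settings (these are standard PHP session directives):

```ini
; php.ini (or php_flag lines in .htaccess) -- keep session IDs out of URLs.
; Never append ?PHPSESSID=... to links:
session.use_trans_sid = 0
; Accept session IDs from cookies only:
session.use_only_cookies = 1
```

With these set, a cookie-refusing client simply gets a fresh session per request instead of session-ID-laden URLs that spiders would otherwise index.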


1:49 am on June 11, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:May 6, 2004
votes: 0

Using includes for page content is an excellent system, wishing you well with the modularisation!

Simply, a page is a page; whether it's created by hand in a text editor, or generated server-side with php, it's still some html to a spider, just some text.

A links page (or pages)/site map that has all your links on it (split into fewer than 100 per page, per Google's recommendations) is useful; it certainly gets my site a deep and thorough indexing from the Googlebot.

coho75's right about the URLs: if your pages generate/require variables in the URLs, don't worry; all the important search engines can handle generated.php?truth=seo-myth&whatever=yaddayadda type pages just fine. Remember, it's the search engines that must strive to keep up with web server technologies, not the other way around; we don't have to regress for them! Anyway, HTTP GET is pretty standard stuff.

True, session data and multiple-parameter URLs will increase spider access problems roughly in line with the URL's complexity, but for simple URLs with just a few parameters, there's no problem. The same goes for sessions: only if a client refuses session cookies will the session ID be forced into the URL (as far as I know), and the bots all seem to handle sessions just fine without that. For sure, my "wisdom of jack handy" page forces a session on every single visit (you only get three helpings a day, you see!), yet still manages to find itself right at the top of around 39,000 Google results.

If your URLs are really long and complex, no matter; it's a piece of cake to get mod_rewrite to translate static-to-dynamic links. Some parts of my own site are accessible by two or three different URL schemes; as I reorganised things, I created rewrite rules for the old URLs, and no one minds, certainly not the bots. And everyone's links work just fine. Similar methods can be used to redirect old URL schemes to new ones, too.
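The old-scheme-to-new redirect mentioned above might look like this in `.htaccess` (the section names are invented for the example):

```apache
# .htaccess -- illustrative redirect from a retired URL scheme.
RewriteEngine On

# Send visitors (and bots) from the old scheme to the new one.
# R=301 is a permanent redirect, so engines update their index
# to the new URL rather than treating the two as duplicates.
RewriteRule ^oldsection/(.*)$ /newsection/$1 [R=301,L]
```

Unlike an internal rewrite, this sends the client a new URL, which is what you want when a scheme has genuinely moved.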

For more info, google for ".htaccess" and "mod_rewrite".