Forum Moderators: coopster
In other words, is it really worth the effort to eliminate URL parameters? If so, is there a way to treat the 10K in the first one as 10K.php?
When using PHP in particular, is there a way to eliminate URL parameters in IIS without having tons of subdirectories, one for each script?
I'm hoping someone out there has experience with this.
Thanks
David Brooks
[edited by: pageoneresults at 7:05 pm (utc) on Mar. 23, 2005]
[edit reason] Examplified URI References and Removed Email Signature [/edit]
SEF URL [google.com]
on my site I had urls like this:
[webmasterworld.com...]
and google would spider most of those pages, but they didn't rank so great, if at all...
changing the urls to look like this:
[webmasterworld.com...]
got me a number 5 ranking in Google for that company's name.
In my case it was as easy as buying an addon for the content management system I use (called Mambo); there are definitely ways of doing it with mod_rewrite, but I am no master at it...
look for jdMorgan under the "Apache Web Server" section of this site, he's the friggin god of mod rewrite...
[webmasterworld.com...]
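For what it's worth, a minimal sketch of the kind of mod_rewrite rule involved (Apache only, not IIS; the URL pattern and script name here are made up for illustration):

```apache
# Map a directory-style URL like /products/12345 onto the real
# parameterized script, so visitors and spiders only ever see
# the clean URL.
RewriteEngine On
RewriteRule ^products/([0-9]+)/?$ /script.php?id=$1 [L]
```

The rewrite happens internally on the server, so the browser's address bar never shows the query string.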
-Jason
(note, as the urls shown above are underlined, it looks like there are spaces in my urls, they are in fact underscores "_")
[edited by: coopster at 4:10 am (utc) on Mar. 24, 2005]
[edit reason] examplified company name [/edit]
Suggestions, please...
I'm not sure on this one...
I'm not using any file extensions at the end; they all look like directories...
I will say that for the company whose name I rank number 5 for on google, number one is www.examplecompany.com, #2 is a .htm, #3 is a .cfm?s_booth=88654, and #4 is a .asp, whereas number 5 (me) and number 6 (yahoo) are both directory style and #6 is a .html
Best I can tell it makes no real difference. Personally I think the directory method looks cleaner, but there are so many factors involved in ranking that I couldn't say for sure whether Google (or others) really care about the file type.
-Jason
Thanks
DRB
In the spider-friendly-keywords.php page simply put:
<?php
$spiderFriendlyParms = array ( 'id' => '12345' );
include 'generic-script-name-with-parameters.php';
?>
Then in 'generic-script-name-with-parameters.php' put:
<?php
if ( isset( $spiderFriendlyParms ) )
{
    foreach ( $spiderFriendlyParms as $key => $value )
    {
        $_GET[$key] = $value;
    }
}
// ... rest of the script as before ...
(of course, I could just use array_merge() as well)
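The array_merge() variant might look like this (a minimal sketch; 'id' => '12345' is just the example value from the stub above):

```php
<?php
// Same effect as the foreach loop: copy the stub's parameters into
// $_GET in one call. With $_GET as the first argument, the stub's
// values override anything that arrived on the real query string.
$spiderFriendlyParms = array( 'id' => '12345' );
$_GET = array_merge( $_GET, $spiderFriendlyParms );
?>
```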
This allows the parameters to be passed into the actual PHP page via a global variable; they are then copied into the $_GET array, so the script can't tell the difference.
It works well. Now I just need to automatically generate one of these spider-friendly-keywords.php stub pages for all the possible parameter values that could be sent to the database query.
In my case that is a few hundred of them.
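If it helps anyone else, a quick sketch of such a generator (the $ids array, file names, and included script name are made up for illustration; in practice the values would come from the same database query the site already runs):

```php
<?php
// Write one stub page per parameter value. Each stub sets
// $spiderFriendlyParms and includes the real parameterized script,
// exactly like the hand-written stub above.
$ids = array( '12345', '12346', '12347' );

foreach ( $ids as $id )
{
    $stub = "<?php\n"
          . "\$spiderFriendlyParms = array( 'id' => '$id' );\n"
          . "include 'generic-script-name-with-parameters.php';\n"
          . "?>\n";
    file_put_contents( "spider-friendly-$id.php", $stub );
}
?>
```

Running it whenever new values are added to the database would keep the set of stub pages in sync.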
David