Right now we do it via URL and GET in the PHP link file. A Perl script parses the content of the whole page and appends variables to the end of each URL. The trouble is that I cannot get PHP parsed inside the HTML, because .htaccess has already done its work by the time the Perl URL parser script “slurps in” and then “spits out” the content of the page. It’s too late.
So I thought about a cookie/session approach where the parser would extract variables from the referring URLs but would not actually “touch” the content itself. Then it would pass the variables on to the PHP link file, or the PHP link file would get the variables from the cookie/session.
Does this concept sound right to you?
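Something like this is what I have in mind on the PHP side (just a sketch; the key names like 'q', 'source', 'kw' are placeholders for whatever the engines actually send):

<?php
// Sketch of the idea: pull tracking variables out of the referring URL
// and park them in the session, without touching the page content at all.
session_start();

if (!empty($_SERVER['HTTP_REFERER'])) {
    $query = parse_url($_SERVER['HTTP_REFERER'], PHP_URL_QUERY);
    if ($query) {
        parse_str($query, $refVars);
        foreach (array('q', 'source', 'kw') as $key) {
            if (isset($refVars[$key])) {
                $_SESSION[$key] = $refVars[$key];
            }
        }
    }
}
?>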
Anyway, I didn't really follow the details of your setup, so in general terms...
Pros of GET params or sessions based on GET
- works even if user disallows cookies
Cons
- messier URLs
- have to take measures to make sure site is crawlable.
Pros of cookies or sessions based on cookies
- easier to keep URLs clean and neat
Cons
- user must allow cookies
Pros of sessions over cookies
- can easily track lots of data from page to page
- can automatically revert to passing session IDs via GET if user blocks cookies (but try to avoid serving up session IDs to bots).
No matter how you cut it, you're using either cookies or GET parameters. Sessions are more or less just another, dynamic way of doing that. Basically, a PHP session works by trying to create a cookie with the session ID and, failing that, creates a GET parameter with a session ID.
The main difference is that it only stores one piece of data in the cookie or GET - the session ID, which is then keyed to data stored in the filesystem. So it's less data to pass around (usually), but uses the same mechanism.
So if your visitor has cookies blocked, sessions will fall back to GET. If you have your sessions set up to use only cookies, the session will simply fail (as would a cookie).
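For what it's worth, the switch between the two behaviours is just a couple of ini settings, roughly like this (a sketch, not a drop-in):

<?php
// Cookies only: if the visitor blocks cookies, the session simply fails.
ini_set('session.use_only_cookies', 1);

// ...or allow the GET fallback instead, so PHP appends the session ID
// to URLs when the cookie cannot be set:
// ini_set('session.use_only_cookies', 0);
// ini_set('session.use_trans_sid', 1);

session_start();
$_SESSION['source'] = 'ppc';   // any per-visitor data, keyed to the session ID stored server-side
?>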
I don't know if that helps at all...
The technical details of how it works now are:
- the parser (Perl) slurps in the content of the page,
- the HTTP referrer gets exploded and variables are created,
- the whole page gets rewritten so the variables are inserted at the end of links (both internal and external),
- if you click an external link (..file.php?m=something), file.php does a GET on the variables, inserts them at the end of the outgoing link, job done (a rough sketch of this step follows below).
That’s all the parser is used for. The problem is that once the parser is done and the HTML output is there, PHP includes will not work.
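Roughly, the link file does something like this (sketch only; the destination URL and the 'm' parameter name here are stand-ins for the real thing):

<?php
// file.php -- read the tracking value from GET and tack it onto the
// end of the outgoing link before redirecting.
$m   = isset($_GET['m']) ? $_GET['m'] : '';
$out = 'http://www.example.com/landing-page';   // outgoing destination

if ($m !== '') {
    $sep  = (strpos($out, '?') === false) ? '?' : '&';
    $out .= $sep . 'm=' . urlencode($m);
}

header('Location: ' . $out);
exit;
?>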
For reference, here is the code from htaccess:
RewriteEngine On
RewriteBase /
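# Hand every request for an .html file to the Perl parser; QSA keeps any existing query string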
RewriteRule ^(.*)\.html$ parser.cgi?file=$1 [QSA,L]
So, if I use sessions (rather than cookies), and the user has cookies enabled, my URLs will be clean?
Thanks again.
Pretty much. Is this data 100% necessary for the function of the site or for your income?
If not, I would set up sessions to use only cookies.
If so, I would use sessions in all cases, but detect Googlebot, Slurp, the MSN bot and any others you really care about, and turn sessions off for them. Menalto Gallery takes this approach fairly effectively, and it's open source, so you could get in there and find out how they do it.
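The bot detection itself can be as simple as something like this (a sketch; the user-agent list is just the usual suspects and would need to be kept current):

<?php
// Skip session_start() for the big crawlers so they never get served a session ID.
$ua    = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$isBot = (bool) preg_match('/googlebot|slurp|msnbot/i', $ua);

if (!$isBot) {
    session_start();
}
?>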
No, the data is not that crucial. At the moment it's all just about search engine source, keywords, PPC referrals, etc.
From that perspective, a cookie seems to be the cleanest way of doing it. If it fails, it's not a big deal. Plus, the failure rate should not be high, since most people have cookies enabled, I believe.
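If I go the plain-cookie route, it would only take something like this (sketch; the cookie name and lifetime are picked arbitrarily):

<?php
// Remember the referral info in a plain cookie for 30 days.
// If the visitor refuses the cookie, nothing breaks -- the data is just lost.
$ref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if ($ref !== '' && !isset($_COOKIE['referral'])) {
    setcookie('referral', $ref, time() + 60 * 60 * 24 * 30, '/');
}
?>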
[us3.php.net...]