Having just learned how to do cookies in Perl, thanks to a tutorial in the Tracking and Logging forum by Brett (thank you), I've been considering how a cookie-based cloaking system could benefit my users.
When they hit the index page, I could give them a page with suggestions for their first visit, e.g. the 'fun features' of the site, so they'll have more impetus to come back.
Also, I could throw a 'bookmark us now' at them, so that only first-timers get it...
Then, if they've been there before, I could point them to the new features on the site since their last visit (if any), or to the page they've visited most often.
I'm not ready to implement this, but I have all the knowledge I'd need to do it. Has anybody else implemented a similar system from the ground up?
How did it work out? Any major issues I should keep in mind? My thought for search engines is that if the visitor is a spider, I'd simply serve the 'no cookie' page, doing away with the need to customize content for them as well.
And a search engine referral will probably be a new user to the site, so the cached page and the user's experience would line up.
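The whole dispatch boils down to one check on the Cookie header: no cookie means a first visit (which is also what a spider sees, since crawlers generally don't send cookies back), while a cookie carrying a last-visit timestamp means a returning user. A minimal sketch of that logic, in Python rather than the Perl I'd actually use, with hypothetical page names and a made-up `last_visit` cookie:

```python
import time
from http import cookies

# Hypothetical page names for illustration only.
FIRST_VISIT_PAGE = "welcome.html"     # 'fun features' tour + 'bookmark us now'
RETURN_VISIT_PAGE = "whats_new.html"  # new features since the last visit

def choose_page(cookie_header):
    """Pick a page variant from the raw Cookie header alone.

    Returns (page, last_visit_timestamp). No cookie -> first-visit page,
    which is also what cookieless spiders get, so their cached copy
    matches what a search-engine referral (likely a new user) will see.
    """
    jar = cookies.SimpleCookie()
    jar.load(cookie_header or "")
    morsel = jar.get("last_visit")
    if morsel is None:
        return FIRST_VISIT_PAGE, None
    return RETURN_VISIT_PAGE, float(morsel.value)

def refresh_cookie():
    """Build the Set-Cookie value recording this visit's timestamp."""
    jar = cookies.SimpleCookie()
    jar["last_visit"] = str(time.time())
    jar["last_visit"]["max-age"] = 60 * 60 * 24 * 365  # remember for a year
    return jar["last_visit"].OutputString()
```

Keeping the timestamp in the cookie itself, rather than a server-side database, is the simplest variant; a real site might instead store an ID and look up the visit history server-side.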
Suggestions? Comments? Voice of experience? I'm finally getting into building interactivity into sites, and I must say, it's hard sometimes to pick out which feature to do next. This seems like a good one though, so I'd appreciate your feedback.
IMO cookies alone can't identify the request accurately enough to know whether you're serving the appropriate content in enough cases.