@topr8: glad I could offer some inspiration.
@engine: I agree, raw log files are the basis for just about everything in site analytics. And site analytics is what is fueling just about everything under the hood going forward.
--------
There is a sea change on the web that many/most webdevs seem unaware is occurring, certainly not of its extent. But it is not actually new. Various sites have been pushing various bits of it for almost a decade now.
It's sort of 'understood' that a URL is unique. Google has, pretty much since forever, petrified the webdev/SEO communities about this by screaming 'cloaking'. Except that they've also been opening that door for years because of mobile. And Local. And... the future...
For years one has been able to use the
Vary Header [tools.ietf.org] (WebmasterWorld seems to ignore/strip page anchors from links [ #section-7.1.4 ]) to, for instance, tell a SE that it needs to vary how it 'sees' a page...
Dynamic serving -> The Vary HTTP Header [developers.google.com].
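To make the mechanism concrete, here is a minimal sketch of dynamic serving with the Vary header: the same URL returns different HTML depending on the User-Agent, and `Vary: User-Agent` tells caches and crawlers that the response legitimately differs by that request header. The function and markup are invented for illustration, not anyone's actual implementation.

```python
def serve_page(request_headers):
    """Serve one URL in UA-dependent variations, declared via Vary."""
    ua = request_headers.get("User-Agent", "")
    if "Mobile" in ua:
        body = "<html><body>mobile layout</body></html>"
    else:
        body = "<html><body>desktop layout</body></html>"
    response_headers = {
        "Content-Type": "text/html",
        # Declare that this response varies by User-Agent so that
        # caches and crawlers keep the variants separate.
        "Vary": "User-Agent",
    }
    return response_headers, body
```

The same pattern extends to other request headers (Accept-Language, Cookie, etc.) by listing them in the Vary value.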
Back in late 2006 and through 2007 I was building customised landing pages for WoM FB campaigns...
Basically I have created a number of broadly personalised (specific small-group targeted rather than individual) landing pages. Each aggregates selected existing site information (mini-portal? start-page? landing-page?) somewhat differently, with about a third being common across all. While part of the domain, there are no in-links from the rest of the site and SEs are disallowed. These new pages do out-link both to each other and to the rest of the site.
...
Slowly the SEs have been overcoming their 'cloaking' phobia so that, for instance, geo-targeting is somewhat allowed. However, delivering 'personalised' niche material remains highly problematic, which is why I deny SE bots from larger and larger portions of my sites. They simply do not handle it well when offered by others.
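Denying bots from the personalised portions can be as simple as a robots.txt exclusion. A hypothetical excerpt (the paths here are invented examples, not the actual site layout):

```
User-agent: *
Disallow: /landing/
Disallow: /personalised/
```

The rest of the site stays crawlable; only the sections the SEs mishandle are walled off.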
---from Cre8 thread at the time
which worked extremely well but were clunky. And then I discovered the Vary Header. And eTags. And some other bits and pieces.
It's been a slow, if steady, play/test/repeat process, but all pages are now served based on presumed visitor context - a year or two ahead of schedule. So a single URL, i.e. www.example.com/somedir/somepage.html, may actually be served in several thousand variations based on all the personalisation data at my fingertips (aka residing in one or more analytics DBs).
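A rough sketch of how one URL fans out into thousands of variants: a variant key is derived from whatever context dimensions the analytics DBs hold, and the key selects a template. The dimension names and values below are invented for illustration; each added dimension multiplies the possible variant count.

```python
def variant_key(context):
    """Derive a variant key from (hypothetical) personalisation dimensions.

    4 locales x 3 visit tiers x 2 spellings already gives 24 variants;
    a handful more dimensions reaches thousands.
    """
    return (
        context.get("locale", "en-US"),
        context.get("visit_tier", "new"),      # new / returning / frequent
        context.get("spelling", "american"),   # american / british
    )

def render(url, context, templates):
    """Serve the variant for this visitor, or the default when none matches."""
    key = variant_key(context)
    return templates.get(key, templates["default"])
```

The point of the default fallback is that an unknown or sparse context still gets a coherent page rather than an error.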
Critical Note: so far the SEs, including Google, have not seemed upset by how/what I'm serving... so far... caveat emptor.
If a visitor is new, or has cleared their cache or otherwise removed the eTag, the page request fails over to more basic personalisation settings and the page may - or may not - look different, contain the same or different content in the same or different order, etc. A simple instance is whether American or British spelling is used.
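The failover described above can be sketched like this: the ETag the server previously issued carries a variant identifier, which comes back on later requests in If-None-Match; when that header is absent (new visitor, cleared cache), selection falls back to basic settings derived from the request alone. The ETag format and variant names are assumptions for illustration only.

```python
def pick_variant(request_headers):
    """Recover the stored variant from the ETag, else fail over to basics."""
    etag = request_headers.get("If-None-Match")
    if etag and etag.startswith('W/"v:'):
        # Returning visitor: the weak ETag encodes the variant id.
        return etag[len('W/"v:'):].rstrip('"')
    # Failover: basic personalisation, e.g. spelling by language tag.
    lang = request_headers.get("Accept-Language", "en-US")
    return "en-GB-basic" if lang.startswith("en-GB") else "en-US-basic"
```

A visitor who clears their cache thus drops back to the coarse variant and is gradually re-profiled on subsequent visits.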
Yes, it has been a long, arduous, often frustrating process; however, it has also been intriguing, fun, and has taken conversion rates to ever greater heights. And so too revenue and profit margin. Google search (minus bots) traffic conversion rate has gone from ~1.2% a decade ago to ~2.5% the past few years; OA average CR from ~5% a decade ago to ~9% last year; return visitors from ~4% a decade ago to ~14% last year.
Also, my comments in:
* While accommodating mobile traffic... Everything changed [webmasterworld.com], October-2016.
* Is it time to overwrite content until it sticks? [webmasterworld.com], August-2016.
* Dynamically Generated Landing Pages help CTR and Conversion %, what about SEO? [webmasterworld.com], November-2015.
By the power of Analytics!