
Forum Moderators: DixonJones & mademetop


Analytics is THE engine of change

     
3:53 am on May 14, 2017 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1044
votes: 218


Important Note: Analytics has become far, far more than mere tracking and logging...

Critical point of reference: three-quarters of visitors are irritated when site content does not align with their interests.
Corollary: SEs get it wrong, a lot. And recovery, on the part of the website/landing page, has been brutal since the advent of 'not provided'.

Personalisation is increasingly one way the web is either becoming a creepy stalker or improving a visitor's experience. The line is not always obvious except in hindsight.
Start simple and add slowly as capability and experience grow. And as always test, test, and test, before rolling out broadly.
Example:
From the current session:
* referrer information.
* geolocation.
* device/OS/browser.

From prior sessions:
Note: much as a game remembers a character and their gained attributes, which affect interactions going forward, so too the site visitor...
* where they've come from, how often.
* where they've been, how often.
* what they've done, how often.

From similar referred, behaving visitors...

And from generalised personas build increasingly personalised profiles, with which one can better target and recommend and create better business rules for cross- and up-selling (a rough sketch of such a profile record follows below)...
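To make that concrete, a minimal Python sketch of the kind of profile record the lists above imply - every field and name here is hypothetical, only the shape matters:

    from dataclasses import dataclass, field

    @dataclass
    class VisitorProfile:
        # From the current session:
        referrer: str = ""
        geolocation: str = ""
        device: str = ""                 # device/OS/browser
        # From prior sessions, accumulating like a game character's attributes:
        referrer_counts: dict = field(default_factory=dict)  # where they've come from, how often
        page_counts: dict = field(default_factory=dict)      # where they've been, how often
        action_counts: dict = field(default_factory=dict)    # what they've done, how often
        persona: str = "generic"         # generalised persona, personalised over time

        def record_hit(self, referrer: str, page: str) -> None:
            """Fold one pageview into the long-term counts."""
            self.referrer_counts[referrer] = self.referrer_counts.get(referrer, 0) + 1
            self.page_counts[page] = self.page_counts.get(page, 0) + 1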

Note: the danger being, obviously, too invasive, too intrusive aka too obvious. As with most things, when well done it appears seamless and natural. Also: don't hide what you are doing, be transparent; especially as more and more jurisdictions are creating regulation and legislation - ignorantia juris non excusat - ignorance of the law is no excuse.

And what is the backbone in the background of all this?
Analytics.

But
this is analytics that
* has stored terabytes of visitor behaviour and referrer data and used it to create base models and profiles that can be recalled instantly...
* is capable of taking real time inputs and making real time decisions to adapt and update said models and profiles...
And
this is an analytics methodology/system at least as complex as a site's web serving infrastructure.

Just as one typically starts simply when publishing a site and builds it up as necessary to meet traffic, bandwidth, and security requirements, so too should one with one's analytics - once one makes the business decision to step past Google Analytics or Piwik, where they become but one of many customised inputs.

If you become as mad as I (and I am lost!) you too may end up with:
input -> massively scalable database (Hypertable) + distributed file system (Quantcast File System née Kosmos File System) + map/reduce implementation (MapReduce-MPI Library) -> permanent cached results + manipulation (Postgres-XL (eXtensible Lattice) Database Cluster) ->
(1) reports.
(2) action, reaction via other software solutions.
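None of those components reduce to a one-liner, but the map/reduce stage at the heart of the pipeline can be sketched in plain Python - a stand-in only, over a hypothetical tab-separated hit log, not the MapReduce-MPI Library itself:

    from collections import Counter

    def map_line(line: str) -> tuple:
        """Map: one raw hit -> (visitor_id, 1). The log format is hypothetical."""
        return (line.split("\t", 1)[0], 1)

    per_visitor = Counter()               # Reduce: fold mapped pairs into totals
    with open("hits.log") as log:         # stand-in for the distributed file system
        for line in log:
            if line.strip():
                visitor_id, count = map_line(line)
                per_visitor[visitor_id] += count

    # per_visitor would then be upserted into the SQL cluster, feeding both the
    # permanently cached base models/profiles and the real-time updates.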

Bog-standard, one-size-fits-all sites are increasingly unfit for competition in many niches. Only enterprise inertia and widespread implementation errors by early adopters are holding back a tsunami-level change in context-driven, personalised information and sales delivery on the web. The URI/URL ain't what it once was; increasingly, it's whatever you need it to be.

By the Power of Analytics!
8:31 am on May 19, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member topr8 is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 19, 2002
posts:3294
votes: 27


thanks iamlost, a timely post ... and another example of how some of the great posts on WebmasterWorld sink without trace!

i've never used google but do use piwik, as well as a very simple custom analytics that i'm slowly developing for my own sites - it hasn't been a priority, but you've enthused me to bump it up the ongoing work list!
10:46 am on May 19, 2017 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month Best Post Of The Month

joined:May 9, 2000
posts:24332
votes: 553


Thanks for the post, iamlost.

I've been a great fan of raw log files since I first got access more years ago than I care to remember. For many, the data in there is a gold mine waiting to be discovered.
By all means, use a tool to add graphs and show trends, which is also valuable. I'll use any source available, no matter if it has "not provided" plastered all over. There's stacks of other data there to be harvested.

However, it's knowing how best to analyse and use that data which can play a key role in developing a site or service.
6:38 pm on May 19, 2017 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1044
votes: 218


@topr8: glad I could offer some inspiration.

@engine: I agree, raw log files are the basis for just about everything in site analytics. And site analytics is what is fueling just about everything under the hood going forward.

--------

There is a sea change underway on the web that many/most webdevs seem unaware is occurring, certainly not its extent. But it is not actually new: various sites have been pushing various bits for almost a decade now.

It's sort of 'understood' that a URL is unique. Google has, pretty much since forever, petrified webdev/SEO communities about this by screaming 'cloaking'. Except that they've also been opening that door for years because of mobile. And Local. And... the future...

For years one has been able to use the Vary Header [tools.ietf.org] (WebmasterWorld seems to ignore/strip page anchors from links [ #section-7.1.4 ]) to, for instance, tell a SE that it needs to vary how it 'sees' a page... Dynamic serving -> The Vary HTTP Header [developers.google.com].
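As a toy illustration (Python stdlib only, nobody's production setup): one URL, two bodies, and a Vary header so caches and crawlers know the response differs by User-Agent.

    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        """Same URL, different body by device class; Vary declares the split."""
        ua = environ.get("HTTP_USER_AGENT", "")
        body = b"mobile variant" if "Mobile" in ua else b"desktop variant"
        start_response("200 OK", [("Content-Type", "text/plain"),
                                  ("Vary", "User-Agent")])
        return [body]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()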

Back in late 2006 and through 2007 I was building customised landing pages for WoM FB campaigns...

Basically I have created a number of broadly personalised (specific small group targeted rather than individual) landing pages. Each aggregated selected existing site information (mini-portal? start-page? landing-page?) somewhat differently, with about a third being common across all. While part of the domain, there are no in-links from the rest of the site and SEs are disallowed. These new pages do out-link both to each other and to the rest of the site.
...
Slowly the SEs have been overcoming their 'cloaking' phobia so that, for instance, geo-targeting is somewhat allowed. However, delivering 'personalised' niche material remains highly problematical, which is why I deny SE bots from larger and larger portions of my sites. They simply do not handle it well when offered by others.
---from Cre8 thread at the time

which worked extremely well but were clunky. And then I discovered the Vary header. And ETags. And some other bits and pieces.

It's been a slow, if steady, play/test/repeat process but all pages are now served based on presumed visitor context - a year or two ahead of schedule. So a single URL, i.e. www.example.com/somedir/somepage.html, may actually be served in several thousand variations based on all the personalisation data at my fingertips (aka residing in one or more analytics DBs).

Critical Note: so far the SEs, including Google, have not seemed upset by how/what I'm serving... so far... caveat emptor.

If a visitor is new, or has cleared their cache or otherwise removed the ETag, the page request fails over to more basic personalisation settings and the page may - or may not - look different, contain different/same content in same/different order, etc. A simple instance is American vs. British spelling.
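In rough Python the failover looks something like this; the ETag-as-profile-key scheme below is purely illustrative, not my actual implementation:

    import hashlib

    profiles = {}  # ETag value -> stored personalisation profile (analytics DB stand-in)

    def issue_etag(visitor_key: str, profile: dict) -> str:
        """Mint an ETag that doubles as the profile lookup key."""
        etag = hashlib.sha1(visitor_key.encode()).hexdigest()[:16]
        profiles[etag] = profile
        return '"%s"' % etag

    def select_profile(request_headers: dict) -> dict:
        """Returning visitors echo the ETag back via If-None-Match; anyone
        without one fails over to basic defaults (e.g. American spelling)."""
        etag = request_headers.get("If-None-Match", "").strip('"')
        return profiles.get(etag, {"spelling": "en-US"})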

Yes, it has been a long, arduous, often frustrating process; however, it has also been intriguing, fun, and taken conversion rates to ever greater heights. And so too revenue and profit margin. Google search (minus bots) traffic conversion rate has gone from ~1.2% a decade ago to ~2.5% the past few years; OA average CR from ~5% a decade ago to ~9% last year; return visitors from ~4% a decade ago to ~14% last year.

Also, my comments in:
* While accommodating mobile traffic... Everything changed [webmasterworld.com], October-2016.
* Is it time to overwrite content until it sticks? [webmasterworld.com], August-2016.
* Dynamically Generated Landing Pages help CTR and Conversion %, what about SEO? [webmasterworld.com], November-2015.

By the power of Analytics!
12:10 am on May 20, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:8900
votes: 401


Thanks iamlost, great post.

Personally, I never did appreciate Google Analytics, preferring to write my own zeitgeist from raw logs. I use pieces of the old Analog software & some grepping to output reports.

I say I couldn't appreciate it, meaning the fault is on me since the vast majority of GA users find it extremely useful. I did use GA for a couple years early on when I first added Adsense, but at that time I wasn't able to see much of a benefit compared to the downside of having the code on my pages and giving Google that much information.

Eventually I removed the code, mostly because of the drag on page loads (although I hear that has improved). Anytime a page must wait for remote code to make a trip across the internet before the entire page can load, that is a self-defeating enterprise IMO. Sometimes the load time hit was so noticeable, I saw a direct correlation in bounce rate. Removing the code site-wide significantly improved page load times & AFAIK played a part in the subsequent boost in ranking that occurred soon after.

There's also the issue I have with the misinformation that GA delivers, especially regarding Proxies, VPNs and identifying User Agents, but those are topics for other forums.

I do see the value of GA in a comprehensive SEO package put together for a customer, especially if that customer is also publishing Adsense & I have installed GA on a number of sites for that purpose. The graphs are pretty with a lot of information.
4:44 pm on May 20, 2017 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1044
votes: 218


I've never used Google Analytics: initially because I was already mucking directly with logfile data and couldn't see the value given available time constraints; later because Piwik came along, which did much the same AND was open source, so I could customise to my little heart's content. Third-party tools push easy default, but not particularly valuable, data points, distracting many/most analytics noobies.

Perhaps the biggest initial step with analytics is to learn to decide on YOUR information needs to meet YOUR business requirements, and only then determine the tools and settings required to accomplish each specific goal.

The second might well be not to get too granular in most instances. So many here at WebmasterWorld seem to be comparing data points sub-hourly; unless they have sufficient cleansed (aka wholly human) traffic for statistical significance over that time period, the results will be meaningless. Further, even should a result be statistically significant, it may not be practically significant.
Note: with site stats a sample size under 400 is usually pointless (a sketch of where that number comes from follows below). Personally I work with 10,000 minimum, which depending on site/page traffic may take some time. For quick and dirty comparisons I simply use YoY: if off by 10% I watch; if it continues, I investigate. Otherwise? No time to sweat the small stuff.
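For the curious, that ~400 floor falls out of the standard proportion-estimate formula, n = z^2 * p * (1 - p) / E^2, worked in a few lines of Python:

    from math import ceil

    def sample_size(p: float, margin: float, z: float = 1.96) -> int:
        """Minimum n to estimate a proportion p within +/- margin at ~95% confidence."""
        return ceil(z * z * p * (1 - p) / (margin * margin))

    # Worst case p=0.5 within 5 points gives the classic ~400 floor:
    print(sample_size(0.5, 0.05))     # 385
    # Pinning down a ~2.5% conversion rate to within half a point needs far more:
    print(sample_size(0.025, 0.005))  # 3746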

Step by business step determine the metrics required, select appropriate tools (or portions of a tool such as GoAn or Piwik) to benchmark and track, and leverage the resulting analytics.
Research, test, analyse, repeat as necessary then trial, analyse, adjust as necessary then put into full production... a never ending interlocking set of operations.

All the while doing the myriads of other things necessary to keep a webdev business rumbling along...

Brother Can You Spare A Time?
9:31 pm on May 21, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:June 20, 2006
posts:1938
votes: 27


Engine, check. Looking now for wheels, and gas.
9:49 pm on May 21, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:8900
votes: 401


RE: Piwik or GA

Well, I guess if you don't mind that extra script on every page (that Google PageSpeed warns about) and the hit on page loads (especially mobile), it's probably the easiest solution to getting analytics. I've already got several scripts on my pages, but they're all essential for content.
10:37 pm on May 21, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:13820
votes: 482


that extra script on every page

In the page itself, or called by the page (equivalent to an external stylesheet that only gets loaded once)?

I don't know about GA, but piwik is huge. Or at least it seems that way because it's bigger than most of my pages. I guess objectively it's no bigger than a typical image file.

:: hasty detour to nearest slab of raw logs to confirm that a multi-page visitor doesn't re-load the whole vast piwik on every new page, leading to interesting discovery that at some time when I wasn't looking, piwik.js got to be significantly smaller than it was just a few years ago (probably with the 2.x to 3.x change) ::
12:09 am on May 22, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:8900
votes: 401


For first time visitors, any page they load must also load the script for GA or Piwik. Some will even need to DL it on every page, depending on their caching settings.

Regardless, Google judges each page on its own merit. Even though CSS and JS are cached in most browser settings, Google still demotes the page speed score. Just one of hundreds of factors, but with the mobile index, page speed has become one of the most important.

Just something to consider, and why I use downloaded raw logs, running them through pieces of the old Analog software & some grepping to output reports from my local machine, with no stat code on the online pages themselves.
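The grepping side amounts to something like the following - a rough Python equivalent over a standard "combined" format access log, not my exact scripts:

    import re
    from collections import Counter

    # Standard Apache/nginx "combined" log format.
    COMBINED = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
        r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
        r'"(?P<referrer>[^"]*)" "(?P<ua>[^"]*)"')

    referrers = Counter()
    with open("access.log") as log:          # the downloaded raw log
        for line in log:
            m = COMBINED.match(line)
            if m and m.group("referrer") not in ("-", ""):
                referrers[m.group("referrer")] += 1

    for ref, hits in referrers.most_common(20):
        print(f"{hits:7d}  {ref}")
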
2:46 pm on May 22, 2017 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1044
votes: 218


@keyplyr:
Yes, there is some overhead with script-based systems such as Piwik, but that was less of a consideration back when and, given my minimal use of JavaScript, is negligible even now. More of a pita is that it requires MySQL and I run (everything else) on PostgreSQL. It sits on my to-get-away-from list but, like many a ratty old possession, I've grown accustomed...

The one analytics step past GoAn/Piwik and logfiles that has always intrigued me, yet is (almost) never taken, is the crawling of regular backlinks. Perhaps the most missed (as in gone) metric ever is that of the search query; the advent of 'not provided' sunk a lot of webdev behaviours. Perhaps the difference is that so many webdevs only chase/think of Google traffic such that they have few/no 'ordinary' backlinks actually referring traffic?

The interesting thing about links is that they are actually link pairs: the referrer and the recipient. Because one's page is the landing side of the equation, its data is readily available. The referrer side is unknown, unless one goes and checks it out. When a human visitor arrives via a previously unknown referrer, a crawler is auto-magically sent to explore: title, description, headers, anchor and surrounding text, etc. By comparing the two pages one can acquire an understanding, similar to that from the defunct search query, of 'why' that visitor came from that referring page to this recipient page (sketched below).
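A bare-bones version of that referrer crawl, stdlib only - a sketch of the idea rather than a production crawler, which would also capture headers and anchor/surrounding text as noted above:

    import urllib.request
    from html.parser import HTMLParser

    class ReferrerPage(HTMLParser):
        """Pull title, meta description and h1 text from a referring page."""
        def __init__(self):
            super().__init__()
            self.title, self.description, self.h1 = "", "", ""
            self._capturing = None   # tag whose text is currently being collected

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "description":
                self.description = attrs.get("content") or ""
            elif tag in ("title", "h1"):
                self._capturing = tag

        def handle_data(self, data):
            if self._capturing == "title":
                self.title += data
            elif self._capturing == "h1":
                self.h1 += data

        def handle_endtag(self, tag):
            if tag == self._capturing:
                self._capturing = None

    def crawl_referrer(url: str) -> dict:
        """Fetch a newly seen referrer and summarise why it might be linking in."""
        with urllib.request.urlopen(url, timeout=10) as resp:
            page = ReferrerPage()
            page.feed(resp.read().decode("utf-8", errors="replace"))
        return {"url": url, "title": page.title.strip(),
                "description": page.description.strip(), "h1": page.h1.strip()}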

If the landing page is not actually the best fit from that referring page, a small addition near the top noting closer-matching pages might save disappointed back-button hits - sort of an in-page 404 redirection suggestion.
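And choosing those closer-matching pages can start as crudely as token overlap between the referrer summary and each candidate page - a toy relevance score, nothing more:

    def suggest_pages(referrer_summary: dict, pages: dict, top_n: int = 3) -> list:
        """Rank site pages by Jaccard token overlap with the referring page.
        `pages` maps URL -> indexable text (title + headings, say)."""
        ref = set((referrer_summary.get("title", "") + " " +
                   referrer_summary.get("h1", "")).lower().split())
        scored = []
        for url, text in pages.items():
            toks = set(text.lower().split())
            union = ref | toks
            scored.append(((len(ref & toks) / len(union)) if union else 0.0, url))
        return [url for score, url in sorted(scored, reverse=True)[:top_n] if score > 0]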

If the traffic volume becomes significant, it may be worth making personalisation/contextual changes within the landing page to ease the flow between the referring page, the landing page, and other site pages, as well as to increase conversion rate. Etc. Think of it like knowing what sort of people your friend Joe likes and those Lisa or Jim or Sally like: there will be similarities but there will also be differences; overlapping Venn diagrams with the tiny common bit being you - unless you can modify and better connect, differently and more broadly, with each.

None of which is possible without analytics and data, data and analytics.