


While accommodating mobile traffic

Everything changed

     
7:37 pm on Oct 15, 2016 (gmt 0)

Senior Member from CA 


joined:Nov 25, 2003
posts:1206
votes: 339


Over in the October AdSense E&O thread, Ironside asked a rather open-ended question following my post:

@Iamlost did you do any radical changes to accommodate mobile traffic?

Short answer: yes.

Long answer summarised:
Note: and probably still too long...
I started seriously thinking and hypothesising and testing 'for mobile' from Wednesday, March 19, 2008. That was the day that Reuters published Google sees surge in Web use on mobile phones.

"We have very much hit a watershed moment in terms of mobile Internet usage," Matt Waddell, a product manager for Google Mobile, said in an interview. "We are seeing that mobile Internet use is in fact accelerating.
...
"Faster is better than slow, especially on a mobile device, where fast is much better than slow," Waddell said. "Not only are we are seeing increased user satisfaction but also greater usage."
...
Waddell said Google had seen iPhone users perform as many as 50 times more Web searches on these computer-phone devices than users of standard mobile feature phones typically do.

A slew of pertinent ideas were shared over the next few years: fault-tolerant design shifting from a primarily graceful degradation perspective to that of progressive enhancement, which nicely segued into designing for mobile first, fixed->fluid->elastic layouts, responsive design, etc. 2008 through 2010 was a mind-altering time for many webdevs including myself.

Even more then than now, bandwidth and speed were critical considerations. Yahoo's User Interface library (YUI) (2006), YSlow (2007), and Google PageSpeed (2010) were suddenly being (re)discovered and widely shared. Google Analytics (late 2005) and Piwik (late 2007) provided insights previously only available to enterprise. The feeling was similar to the exhilaration and angst that accompanied the change from table layout to CSS.

I rebuilt my sites between 2010 and 2012 on the principle of designing mobile-first, with progressive enhancement adding content/features via a variation of RESS (Responsive Web Design + Server Side Components). So my sites have been not simply 'mobile friendly' but extremely mobile-usable for four to six years.
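To make the RESS idea concrete, here is a minimal sketch of server-side component selection layered on a mobile-first default. It assumes an Express-style Node server; deviceClass() and the template helpers are hypothetical stand-ins, not my actual code:

```typescript
// Sketch only: RESS-style serving. deviceClass() and the template helpers
// below are illustrative placeholders, not a real implementation.
import express from "express";

type DeviceClass = "mobile" | "tablet" | "desktop";

// Naive device classification from the User-Agent header.
function deviceClass(userAgent: string): DeviceClass {
  if (/Mobi|Android.*Mobile/i.test(userAgent)) return "mobile";
  if (/iPad|Tablet/i.test(userAgent)) return "tablet";
  return "desktop";
}

const app = express();

app.get("/article/:slug", (req, res) => {
  const device = deviceClass(req.headers["user-agent"] ?? "");

  // Mobile-first: the core content block is always served.
  const parts = [coreArticleHtml(req.params.slug)];

  // Progressive enhancement on the server: heavier components are added
  // only for device classes that can comfortably use them.
  if (device !== "mobile") parts.push(relatedArticlesHtml(req.params.slug));
  if (device === "desktop") parts.push(sidebarHtml());

  res.send(wrapInResponsiveLayout(parts.join("\n")));
});

// Placeholders standing in for real template rendering.
function coreArticleHtml(slug: string): string { return `<article data-slug="${slug}">core content</article>`; }
function relatedArticlesHtml(slug: string): string { return `<aside class="related" data-for="${slug}">related</aside>`; }
function sidebarHtml(): string { return `<aside class="sidebar">sidebar</aside>`; }
function wrapInResponsiveLayout(body: string): string { return `<!doctype html><main>${body}</main>`; }
```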

Until this year all my sites followed all the common 'best' practices for speeding up HTTP/1.1. This past June through July I switched to HTTP/2, which meant that all those 'bests' had to be dropped as hindrances and a mindset adopted of optimising for the new protocol. The result was an average 64% improvement from first connection to full render.
Note: I changed for speed reasons, not for HTTPS, which for straight info sites is not a critical feature, imo.
Note: currently ~80% of traffic is HTTP/2, ~20% is failover to HTTP/1.1.
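For anyone curious how the ~20% HTTP/1.1 failover works in practice, this is a minimal sketch using Node's built-in http2 module with allowHTTP1 enabled (the certificate paths are placeholders, not my setup). Under HTTP/2's multiplexing, the old HTTP/1.1 workarounds such as file concatenation, spriting, and domain sharding generally become counterproductive, which is why they get dropped:

```typescript
// Sketch only: HTTP/2 with automatic HTTP/1.1 failover via Node's http2 module.
import * as http2 from "node:http2";
import * as fs from "node:fs";

const server = http2.createSecureServer(
  {
    key: fs.readFileSync("/path/to/privkey.pem"),   // placeholder path
    cert: fs.readFileSync("/path/to/fullchain.pem"), // placeholder path
    allowHTTP1: true, // clients that cannot negotiate h2 fall back to HTTP/1.1
  },
  (req, res) => {
    // req.httpVersion is "2.0" on h2 connections, "1.1" for fallback clients.
    res.setHeader("content-type", "text/html; charset=utf-8");
    res.end(`<p>Served over HTTP/${req.httpVersion}</p>`);
  }
);

server.listen(443);
```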

In February 2011 I was blown away by Brian O'Leary's (of Magellan Media Partners) keynote address at O'Reilly's Tools of Change conference in New York City, entitled Context First: A unified field theory of publishing.
From a post I made in Cre8asiteforums, last year:

And I went YES!
It just made so much sense, even before I thought about mobile, which (as smartphones) was still only three or four years old. Initially, in my mind, context was demographic, was marketing segments. Within a month of research I understood clearly that context (in all its myriad conglomerations and permutations) was the foundation on which content should be built/delivered.

Instead of thinking structure/semantics -> content -> presentation/behaviour for each target, also remove structure/semantics from the 'page', such that a given URL is totally amorphous. Think instead: context -> content -> structure/semantics -> presentation/behaviour.


After a great deal of other prior and subsequent reading and research, I now serve - to all devices - on context.
From a post I made in Cre8 in 2013:

And... let me tell you that context delivery is neither simple nor easy.
* identify kinds and degrees of user context awareness in specific scenarios.
Think personas on steroids.
* develop from the above a context awareness system that is replicable, reusable, and scalable.
* know and build within existing technical constraints while watching to see where they might be eased in n-time.
To add to the difficulty, remember that context is not static but a dynamic process with a history. In fact, much of what is termed 'context' is in reality a snapshot of a specific moment, a context state; context itself is a process, a flow, a series of context states over time. And that history, the previous context states, influences future ones. Having fun yet? :)
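One minimal way to picture "context as a series of states over time" as a data structure; the field names here are illustrative assumptions, not an actual schema:

```typescript
// Sketch only: context modelled as a time-ordered series of states, not a
// single snapshot. Field names are illustrative, not a real schema.
interface ContextState {
  at: Date;                          // when this snapshot was taken
  url: string;                       // page being viewed
  referer?: string;                  // what brought the visitor here
  device: "mobile" | "tablet" | "desktop";
  interests: Record<string, number>; // topic -> weight inferred so far
  dwellSeconds?: number;             // time spent before the next state
}

interface ContextHistory {
  fingerprintId: string;             // ties states to a (semi-)unique visitor
  states: ContextState[];            // ordered oldest -> newest
}

// Later states are influenced by earlier ones, e.g. interests accumulate:
function currentInterests(history: ContextHistory): Record<string, number> {
  const merged: Record<string, number> = {};
  for (const state of history.states) {
    for (const [topic, weight] of Object.entries(state.interests)) {
      merged[topic] = (merged[topic] ?? 0) + weight;
    }
  }
  return merged;
}
```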

Now one must collect the specified user context information or metadata, build within a framework a (semi-)unique awareness of user context, match it against the user request (i.e. the URL) while considering referer(s), exclude the extraneous, rank what remains, and output a customised contextual result, i.e. the page.
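A sketch of that collect -> match -> exclude -> rank -> output sequence, reusing the hypothetical ContextHistory above; the scoring is deliberately simplistic and the helper functions are stand-ins for DB and template code:

```typescript
// Sketch only: turning a request plus context history into a customised page.
// Reuses ContextHistory / currentInterests from the sketch above.
interface ContentChunk { id: string; topics: string[]; html: string; }

function buildContextualPage(
  url: string,
  referer: string | undefined,
  history: ContextHistory | undefined
): string {
  const chunks = candidateChunks(url);                  // content eligible for this URL
  const interests = history ? currentInterests(history) : {};
  const refererTopics = referer ? topicsOf(referer) : [];

  const ranked = chunks
    .map(chunk => ({
      chunk,
      // score = accumulated interest in the chunk's topics + referer relevance
      score: chunk.topics.reduce(
        (sum, t) => sum + (interests[t] ?? 0) + (refererTopics.includes(t) ? 1 : 0), 0),
    }))
    .filter(r => r.score > 0 || !history)               // exclude the extraneous
    .sort((a, b) => b.score - a.score);                 // rank what remains

  return renderPage(ranked.map(r => r.chunk));          // output the customised page
}

// Trivial stand-ins so the sketch type-checks; real versions would hit a DB.
function candidateChunks(_url: string): ContentChunk[] { return []; }
function topicsOf(_refererUrl: string): string[] { return []; }
function renderPage(chunks: ContentChunk[]): string { return chunks.map(c => c.html).join("\n"); }
```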

There are other considerations too :) The more I think I've got my head wrapped around what I'm doing the more I find that needs to be addressed.

And the process I use, described in a Cre8 post a year ago:

I've talked about what follows a number of times but only as disconnected bits; now I'm going to describe the forest rather than the trees. Some of what follows has already been implemented by a number of sites, some is in testing, some in research; however, it is all real and it all works, if not yet quite as a whole live reality.

Important Note: the following is simplified and limited; the reality is complex and far, far more extensive.

You are warned.

A visitor arrives on site for the (apparent) first time and the default format for that landing page is served. Simultaneously the visitor is being 'fingerprinted' and entered into a database.

As the visitor scrolls, moves the mouse, etc., reading speed and interests, individual on-page tracks, and multi-page click tracks are logged, entered into the DB, and associated with the fingerprint ID.

Visitor leaves.

Visitor returns, is identified by 'fingerprint', prior site interests are checked against landing page content... and a personalised (rather than the default) format is served, based on the context of person, device, history, etc.
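For flavour, this is roughly what the client-side signal collection in that flow might look like: scroll depth and dwell time batched into one beacon the server can tie back to the fingerprint ID. The /track endpoint and the fingerprintId parameter are assumptions for illustration, not my implementation:

```typescript
// Sketch only: logging scroll depth and dwell time against a visitor
// fingerprint. The fingerprintId value and /track endpoint are assumed.
function trackEngagement(fingerprintId: string): void {
  let maxScrollDepth = 0;
  const startedAt = Date.now();

  // Record the deepest point the visitor scrolls to on this page.
  window.addEventListener("scroll", () => {
    const depth =
      (window.scrollY + window.innerHeight) / document.documentElement.scrollHeight;
    maxScrollDepth = Math.max(maxScrollDepth, depth);
  }, { passive: true });

  // On leaving the page, send one small beacon the server can associate
  // with the fingerprint ID and the current URL.
  window.addEventListener("pagehide", () => {
    const payload = JSON.stringify({
      fingerprintId,
      url: location.pathname,
      referer: document.referrer || null,
      maxScrollDepth,
      dwellSeconds: Math.round((Date.now() - startedAt) / 1000),
    });
    navigator.sendBeacon("/track", payload);
  });
}
```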



Mine are pure info sites; what others with eCom sites are doing is taking personal ID from orders and mining SM for additional data to better pinpoint offers. This for your mother's birthday, that for your anniversary, your nephew's high school grad... all apparently serendipitous but in reality your web-shared personal information leveraged. With nary a cookie or registration on site.



I know which books or movies or media devices or what-have-you I sold you through affiliation, and which ones you checked out on my presell pages but didn't buy. So I can better target you each time you return. I know which ad pages for which products interested you the most over time, so I know which coupons and offers to suggest first and second on given days in given seasons.

Back to the beginning: a visitor arrives on site for the (apparent) first time from a backlink, i.e. not via an SE, and that referrer is checked against a DB. If it is not there, the default format is served (and the referrer is logged to be added). If it is already there, I already know the context of what brought the visitor: the referring page's title, heading, description, and the link's anchor/surrounding text and subheading. A semi-custom, context-driven page format and content is delivered, and the visitor is fingerprinted for future reference.
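A sketch of that referrer branch, again with hypothetical helpers; the point is simply default-if-unknown, context-driven-if-known, and log either way:

```typescript
// Sketch only: first-time visitor arriving from a backlink. The referer is
// looked up in a DB of known referring pages; all helpers are hypothetical.
interface RefererContext {
  title: string;
  heading: string;
  description: string;
  anchorText: string;
}

async function pageForNewVisitor(url: string, referer: string | undefined): Promise<string> {
  const context = referer ? await lookupReferer(referer) : undefined;

  if (!context) {
    if (referer) await queueRefererForIndexing(referer); // log it to be added later
    return renderDefaultPage(url);                       // default format
  }

  // Referer already known: deliver a semi-custom, context-driven format.
  return renderContextualPage(url, context);
}

// Hypothetical stand-ins for DB and template code.
async function lookupReferer(_referer: string): Promise<RefererContext | undefined> { return undefined; }
async function queueRefererForIndexing(_referer: string): Promise<void> {}
function renderDefaultPage(url: string): string { return `<main>default format for ${url}</main>`; }
function renderContextualPage(url: string, ctx: RefererContext): string {
  return `<main>content for ${url}, shaped by "${ctx.anchorText}"</main>`;
}
```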

Yes, there is a fly (possibly - still too early to say definitively) in the ointment for those that live and die via Google: not every visitor sees the same page at a given URL; there is a default and then there are the personalised versions. It is using context to change content - remember how innovative CSS was? - this is a similar shift. The page is becoming a recombination of URIs under an overarching URL.

In time (still in research stage) a site may become a recombination of URIs packaged according to personalised requirements. Rather than move from formal page to page one would move through information chunks (I don't have the computer power to go more granular in real time).

The web is still wild and increasingly weird and loads of fun...

And the revenue rivers overflow with goodness and money. :)


And that, Ironside, is what I've done over the years, and am still doing, to accommodate mobile - and to totally alter my serving behaviour/design for all devices.
9:17 pm on Oct 15, 2016 (gmt 0)

Senior Member


joined:Apr 1, 2016
posts: 2354
votes: 625


This sounds very interesting. A lot to think about.

I was going to ask for a link to the keynote address you mentioned, but I found it on YouTube. For others who might be interested, here is the link:
[youtube.com...]
 
