Forum Moderators: Robert Charlton & goodroi


Dynamically Generated Landing Pages help CTR and Conversion %, what about SEO?


Nutterum

8:00 am on Nov 25, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



As the title says: I will be involved in a big website development project built around the idea of personalized landing page content. We will take into consideration data like IP geolocation, IP range, time and date of visit (for behaviour personalization and a better mobile experience), search query (if applicable), and source/medium data, and later on will include remarketing and heatmap data to further personalize the user experience.

In short, we want to build a website that has over 400 variants for each landing page, each triggered when the visitor falls under certain conditions. Our belief is that such an approach will dramatically increase CTR as well as conversions, while providing the most relevant (in our eyes) layout/colors/content/call-to-action to each visitor. Imagine the wall personalization Facebook does for each user, but applied to a website.

There is one problem, though: how to make sure that everything does not turn into one big SEO mess. What are your recommendations? Have you worked on similar projects before, and if so, can you share some best practices? How do you show Google your landing page when that landing page has possibly hundreds of variants, and possibly even more content variants, with titles and calls-to-action matching search queries, intent, or certain remarketing triggers?

aakk9999

11:36 am on Nov 25, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The same landing page variations - will they each have a separate URL? Or will you serve slightly varied content on the same URL?

Also, is there a "vanilla" landing page variant? E.g. if you cannot match the IP to any geolocation and there are no cookies from previous visits, which landing page do you show?

netmeg

1:38 pm on Nov 25, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I use dynamically generated landing pages for PPC, but I keep them out of the search index. I'll be interested in hearing how you do with this; I probably wouldn't try it with a client site, but it might be an interesting experiment with one of my own.

Nutterum

8:20 am on Nov 26, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



We will have a "vanilla" landing page. All other pages will be rel=canonical to that page. Yet all the content including on-page SEO elements will be entirely automated and generated in from a script that will try to show "the best landing page this specific visitor would like to see" . I thought about running the pages loose on the web but I will get pandanized more quickly than I can say Matt Cutts Rocks, so I have to make sure I won't self spam the entire domain.

As for the URL: it will be one URL but with different parameters after it. We are thinking of using in-house parameters we will code ourselves, as well as Google Tag Manager.
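For illustration only, here is a minimal sketch of that setup (Python/Flask; the ?variant parameter, example.com domain and variant copy are invented, not from this thread): every parameterized variant renders its own content but declares the parameter-free "vanilla" URL as canonical.

```python
# Minimal sketch, not the OP's actual stack: any ?variant=... URL renders a
# personalized body but always points rel=canonical at the vanilla URL.
# Parameter name, domain and variant copy are hypothetical.
from flask import Flask, request, render_template_string

app = Flask(__name__)

PAGE = """<!doctype html>
<html>
<head>
  <title>{{ title }}</title>
  <link rel="canonical" href="https://www.example.com/landing">
</head>
<body>{{ body }}</body>
</html>"""

VARIANTS = {
    "corporate": ("Enterprise widgets", "Trust signals, awards, case studies"),
    "freelance": ("Try widgets for free", "Relaxed copy, product bundles"),
}

@app.route("/landing")
def landing():
    # Fall back to the vanilla content when no (or an unknown) variant is requested.
    title, body = VARIANTS.get(request.args.get("variant"), ("Widgets", "Vanilla content"))
    return render_template_string(PAGE, title=title, body=body)
```

As the replies below point out, the canonical is only a hint, so this alone may not hold up once the variants diverge significantly.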

netmeg

1:23 pm on Nov 26, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



All other pages will be rel=canonical to that page.


How many variations are we talking about? I could see this maybe being a red flag to G.

If I were going to try this, I'd probably start with just a couple of variations, see how the users react, and then how Google reacts, before expanding it out too far. If the users don't respond as expected (and I trust you will have some goal defined and a standard for measuring whether it's a success), then you might be going to a lot of trouble and potentially some risk for nothing.

Andy Langton

2:02 pm on Nov 26, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If the content is different, you can't rely on the canonical tag, which is a "hint" to Google to map the pages together. Your success depends on how different the pages are, of course. You certainly don't want Google to find a whole bunch of pages with different keywords inserted in them.

It will be one URL but with different parameters after it


Presumably, this means you redirect users? You should consider blocking Googlebot from both the redirection mechanism (e.g. do it with an external javascript that is blocked in robots.txt), and the landing page variations.

Otherwise, you could dynamically change the content without changing the URL. This avoids having so many errant URLs 'in the wild'. You would then need to make sure that Googlebot is treated the same as any other visitor, noting that Google always crawls from the US, and never sends a referrer. Be cautious about visitors clicking on Google search results being shown something different from the page Googlebot receives.
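A rough sketch of that second approach (Python/Flask; the cookie name and referrer list are invented for illustration): personalization keys only off signals Googlebot never sends (cookies and a referrer), so the bot falls through to the default content on the same URL without any user-agent sniffing.

```python
# Sketch of "same URL, dynamic content": no user-agent sniffing. Googlebot
# sends no cookies and no referrer, so it naturally receives the default.
# The cookie name and referrer list are hypothetical.
from flask import Flask, request

app = Flask(__name__)

TARGETED_REFERRERS = ("facebook.com", "partner-site.example")

@app.route("/landing")
def landing():
    visitor_id = request.cookies.get("visitor_id")    # absent for Googlebot
    referrer = (request.referrer or "").lower()       # empty for Googlebot
    if visitor_id:
        return f"personalized page for returning visitor {visitor_id}"
    if any(site in referrer for site in TARGETED_REFERRERS):
        return "broadly personalized page for this referrer segment"
    return "default (vanilla) landing page"           # what Googlebot sees
```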

lucy24

7:14 pm on Nov 26, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



<tangent>
How does this work with humans? Say Visitor A recommends the site to Visitor B ... and then Visitor B sees content that's different from what Visitor A saw. I could see this leading to confusion and hurt feelings all around.
</tangent>

Nutterum

8:33 am on Nov 27, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



That is my concern, @Andy Langton. The content will not be radically different, but some parts of the layout and the call to action, as well as maybe the colors and perhaps the title, will be different, taking into account the various factors we are going to plug in.

Luckily for us, we are aiming mainly at the US and Europe, so Googlebot should be satisfied with what it is seeing.

@lucy24 - well, yes and no. We are targeting both the big and the small. If our algorithm catches the IP cluster of a big company, the content will be tailored to that big company. Say someone from that company recommends the page to his spouse. When he/she lands on the page, we will treat the visit as a small-tier/freelance visit and show visuals, calls-to-action and products relevant to them. To give a rough example, if someone from the HQ of Microsoft checks the website, we will show the shiniest of landing pages, full of trust signals and awards, but to the spouse we will show a more relaxed, Airbnb-style landing page. The text content will be a bit less corporate and more of the "try us for free" or "check these product bundles" type of deal. I understand we can't personalize for every visitor, but we have over 20 visitor profiles we want to cater for via this approach.

Edit: Imagine the normal A/B testing of a landing page, but with 300-400 variants, where the test continues indefinitely while trying to smart-match visitors to variants via a good amount of scripting.

Andy Langton

9:44 pm on Nov 27, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Search engines still (quite reasonably!) expect the "one item of content per URL" model, so to me this is a canonicalisation problem. If you keep the search engines at the 'default' then you're unlikely to run into problems. That could be redirects or robots exclusion, or a combination of the two. Just don't make 400 pages that are available for indexing.
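One hedged way to keep only the default indexable, assuming the variants live on parameterized URLs (Python/Flask sketch; the parameter names are invented, and this uses a noindex header rather than robots.txt exclusion): mark anything that is not the vanilla URL as noindex.

```python
# Sketch: the parameter-free default stays indexable; any request carrying a
# personalization parameter gets X-Robots-Tag: noindex. Parameter names are
# hypothetical.
from flask import Flask, request

app = Flask(__name__)

PERSONALIZATION_PARAMS = {"variant", "segment"}

@app.after_request
def noindex_variants(response):
    if PERSONALIZATION_PARAMS & set(request.args):
        response.headers["X-Robots-Tag"] = "noindex, follow"
    return response
```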

lucy24

11:39 pm on Nov 27, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If it's always the same googlebot crawling from the same US-based IP, there shouldn't be all that many possibilities. It would be different if all the variable stuff were shoved into visible query strings, and then every time someone gmails a link to a friend, another version gets crawled.

Except, oops, this couldn't be construed as cloaking, could it? The human clicks a link in a SERP and ends up seeing different content than what the search-engine spider saw.

keyplyr

4:47 am on Nov 28, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



IMO the SEO challenge is not to compete against yourself, so if it were me, I would create unique file names as well as page titles, content and possibly even DOM structure (if the script is that sophisticated.)

I've taken over web sites where the former SEO genius sold the customer a sack of failed promises since it usually takes a few months for the new rankings to settle in. By that time the checks were cut and the perp was long gone. The rankings went bad because some pages bumped other pages out of the SERP due to similar content and Google just used the most relevant page, now diluted of course.

iamlost

10:15 pm on Nov 29, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Nutterum: thanks for bringing the topic of page/site personalisation up. It is a topic that has interested me and that I've been working in for years.
Note: I've been writing about this for years over at Cre8asiteforums for those who want to go read.
Note: my sites are evergreen info monetised via direct ad sales, affiliate referral, and AdSense as default filler; search traffic is an average ~40% of total, Google ~22%.

There are two main components:

1. visitor identification.
There are broad and narrow workable IDs: broad targets groups based on referer (requires knowing something about the referring platform/site/page/market); narrow targets individuals (or very small groups, e.g. a family), which I do via device fingerprinting and cross-device usage.
Note: I prefer not to use cookies but others' requirements may vary.
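iamlost doesn't describe his fingerprinting method, so the following is purely an illustrative sketch (Python, server-side signals only): hash a few request headers plus the IP into a coarse visitor ID. Real implementations typically add client-side signals and probabilistic matching, and cross-device matching is harder still.

```python
# Illustrative only - NOT iamlost's method. A coarse server-side fingerprint
# built from request headers and the remote address.
import hashlib

def fingerprint(headers, remote_addr):
    parts = [
        remote_addr or "",
        headers.get("User-Agent", ""),
        headers.get("Accept-Language", ""),
        headers.get("Accept-Encoding", ""),
    ]
    return hashlib.sha256("|".join(parts).encode("utf-8")).hexdigest()[:16]

# Usage in a Flask view (hypothetical): fingerprint(request.headers, request.remote_addr)
```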

2. personalisation implementation.
My first implementation, back in 2007, targeting Facebook users, was a set of multiple broadly personalised landing pages: each with snippets of information from around the site of interest to that subgroup, plus some common general info as filler. There were no site links in, and SEs were blocked; however, there were appropriate links out to the rest of the site. A little band of 'influencers' then posted appropriate mentions with links. It worked quite well, sufficiently so that I replicated it for several other SM platforms and subsequently for top referring sites.

However, as I continued with segment targeting I realised that the 'problem' was that I was building a rather large number of pages to say pretty much the same things in different ways so as to better connect with visitors, and that this was because I was thinking in pages rather than in information delivery. Then I read Brian O'Leary's essay Context First in either late 2010 or early 2011:

...my idea in a nutshell is this: book, magazine and newspaper publishing is unduly governed by the physical containers we have used for centuries to transmit information. Those containers define content in two dimensions, necessarily ignoring that which cannot or does not fit.

Worse, the process of filling the container strips out context – the critical admixture of tagged content, research, footnoted links, sources, audio and video background, even good old title-level metadata – that is a luxury in the physical world, but a critical asset in digital ones. In our evolving, networked world – the world of “books in browsers” – we are no longer selling content, or at least not content alone. We compete on context.

I propose today that the current workflow hierarchy – container first, limiting content and context – is already outdated. To compete digitally, we must start with context and preserve its connection to content.

And I went: YES!
It just made so much sense, even before I thought about mobile, which was still only three or four (smartphone) years old. Initially, in my mind, context was demographic, was marketing segments. Within a month of research I understood clearly that context (in all its myriad conglomerations and permutations) was the foundation on which content should be built/delivered.

Instead of thinking structure/semantics -> content -> presentation/behaviour for each target, also remove structure/semantics from the 'page', such that a given URL is totally amorphous. Think instead: context -> content -> structure/semantics -> presentation/behaviour.

My second implementation, starting back in 2011, was designed around device fingerprinting to identify returning visitors. The foundation idea was to convert as many first time visitors as possible to returning customers, then work at personalising those customers' experiences where feasible.

Note: I know that a URL is actually a subset (with URN) of URI but I view a dynamic personalised page as a recombination of URIs within a URL. You are welcome to use your own worldview.

In time (still in the research stage) a site may become a recombination of URIs packaged according to personalised requirements: rather than move from formal page to page, one would move through information chunks (I don't have the computer power to go more granular in real time). But that is probably 3-5 years off. Currently, I'm testing moving through information chunks in various orderings within each page.

A simplified walk-through (a rough sketch of this dispatch logic follows the notes below):
Possibility 1:
* visitor is not recognised, arriving via an SE or a non-targeted referrer: the default page for that URL is served, while the visitor is fingerprinted and added to the database.
* visitor actions, e.g. scrolling, cursor movement, apparent reading speed and interests, clicktracks, are logged and associated with the fingerprint ID.

Possibility 2:
* visitor is not recognised, arriving via a targeted/recognised referrer: a broadly semi-personalised page is served based on the referring page's title, heading, description, and the link's anchor/surrounding text and subheading.
* visitor actions, e.g. scrolling, cursor movement, apparent reading speed and interests, clicktracks, are logged and associated with the fingerprint ID.

Possibility 3:
* visitor arrives and is recognised: prior site interests are checked against the landing page content, and a personalised format based on the context of person, device, history, etc. is served.
* visitor actions, e.g. scrolling, cursor movement, apparent reading speed and interests, clicktracks, are logged and associated with the fingerprint ID.

Note: I know broadly (P2) and narrowly (P3) which trinkets or widgets or thingamabobs or what-have-you were sold through affiliation or were checked out on my presell pages. So, I can better target each return. I know which ad pages for which products interested you the most over time so I know which coupons and offers to suggest first and second on given days in given seasons. Conversions increase significantly.

Note: only the default (possibility 1) is shown to SEs. All off site links resolve to the default unless (1) the visitor is recognised or (2) a non-SE visitor arrives via a targeted/recognised referrer.
Note: remember that I am NOT changing the URL, just the chunks and their order within, all done server side on the fly.

Important Note: Possibility 1 was implemented in 2011, Possibility 2 was added 2013/14 (held up while sites went responsive with server assist), Possibility 3 is in live testing now. That it works and works well is already apparent.
What is still unknown is how SEs, especially Google, will respond. So far so good (I guess the blocking is sufficient), but it is early days...
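Purely as an illustration of the P1/P2/P3 dispatch walked through above (this is not iamlost's code: the fingerprint scheme, referrer list and assemble_* helpers are invented stand-ins for his fingerprint database and chunk assembly):

```python
# Illustration of the three-possibility dispatch described above. The URL
# never changes; only the assembled chunks differ. All names are hypothetical.
import hashlib
from flask import Flask, request

app = Flask(__name__)
known_visitors = {}                           # fingerprint -> history/interest record
TARGETED_REFERRERS = ("facebook.com", "twitter.com")

def is_bot(req):
    # SEs always get the default, per the note above about blocking them.
    ua = req.headers.get("User-Agent", "").lower()
    return "googlebot" in ua or "bingbot" in ua

def fingerprint(req):
    raw = "|".join([req.remote_addr or "",
                    req.headers.get("User-Agent", ""),
                    req.headers.get("Accept-Language", "")])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()[:16]

def assemble_default(url):            return f"default chunk order for {url}"
def assemble_for_referrer(url, ref):  return f"chunks re-ordered for {ref} traffic on {url}"
def assemble_for_visitor(url, rec):   return f"chunks matched to history {rec} on {url}"

@app.route("/<path:url>")
def page(url):
    if is_bot(request):
        return assemble_default(url)
    fp = fingerprint(request)
    ref = (request.referrer or "").lower()
    if fp in known_visitors:                               # Possibility 3
        body = assemble_for_visitor(url, known_visitors[fp])
    elif any(site in ref for site in TARGETED_REFERRERS):  # Possibility 2
        body = assemble_for_referrer(url, ref)
    else:                                                  # Possibility 1
        body = assemble_default(url)
    known_visitors.setdefault(fp, {})     # log the fingerprint for next time
    return body                           # same URL regardless of what was served
```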

And... let me tell you that context delivery is neither simple nor easy.
* identify kinds and degrees of user context awareness in specific scenarios.
Think personas on steroids.
* develop from the above a context awareness system that is replicable, reusable, and scalable.
* know and build within existing technical constraints while watching to see where they might be eased in n-time.
To add to the difficulty, remember that context is not static but a dynamic process with a history. In fact, much of what is termed 'context' is in reality a snapshot of a specific moment, a context state; context is really a process, a flow, a series of context states over time, and that history, those previous context states, influences future ones. Having fun yet?

To address the OP's question, "what about SEO?", I have to say that the only SEO involved is that I've done my best to block SEs from the personalised pages, so as not to confuse the poor critters and possibly damage the query standings of my non-personalised pages. I also have a query in through various channels to several SEs, including G, but have yet to hear back.

ergophobe

6:38 pm on Nov 30, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Personalization is all the rage - Sitecore can charge what it does for a Sitecore site largely because they are selling personalization hard. Acquia Lift, with Sitecore firmly in its sights, is now pushing personalization as well.

So I feel like this is a solved problem in some respect. I don't personally know how to solve it, mind you, but huge companies where a tiny drop in sales means scores of people go home and don't come back are somehow pulling it off. Though it may well be that they take some level of SEO hit in return for way better conversion rates.

Anyway, mostly I just wanted to comment on this

IP cluster of a big company


In my case, my IP tells you I'm in New York. I'm actually in California, but the company network is run through New York. If I were at one of the Australian locations, I have no idea what it would tell you.... I assume Australia, but I'm not even sure of that.

Nutterum

2:23 pm on Dec 1, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



@iamlost - I am extremely grateful for your post. A lot to think about and a lot I agree with. We are aiming at pretty much the same thing down the line: we want to provide a useful experience to the different people visiting our website. The context-before-content idea is very powerful and something to consider. Perhaps it will allow us to reduce the number of landing page variants as well.

As for SEs, I figured that I cannot rely much on rel=canonical, as this will create a mess. Blocking the personalized pages from the index is the easiest solution, but we still ponder over the idea that if a searcher searches for carrots we should show him a carroty landing page and Google should understand our intent without penalizing us. Time will tell if we can tell such a story to the SEs or will be forced to noindex everything but the generic page and sort of "cheat" our way to personalization.

netmeg

5:26 pm on Dec 1, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



we still ponder over the idea that if a searcher searches for carrots we should show him a carroty landing page and Google should understand our intent without penalizing us.


The problem is, Google isn't there; I dunno if it ever will be. It's not a human or a bunch of humans, it's some code. There's only so much intent it can ferret out. And if your intent *looks* like some kind of spammy intent, that's where it's going to go first. Because there are as many (or more) of them as there are of us.

Keep the pages out of the index to start. And start with a very small test and see what Google does with them - you may need to do multiple tests over time; if it doesn't work the first time, that doesn't mean it won't in the future. That's what I would do.

Andy Langton

1:11 am on Dec 2, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



we should show him a carroty landing page and Google should understand our intent without penalizing us


Google's stance on this is reasonably logical - they want you to show users the page that was ranked in search results. There's only so much wiggle room you can get before this is no longer the case.

As far as algorithmic "cloak detection" goes, I don't think it's as advanced as some might think. Google will visit and render the page (comparing the render to the expected content) and might follow up with a visit that doesn't identify as Googlebot. If US visitors from Google search see the same page as Googlebot, you're unlikely to have any algorithmic difficulties. If you're making extreme changes to pages, a human evaluator might not see it that way.

If your landing page variations are all on different URLs, your risk of issues is much higher. Visitors might link to one or another of the variations (meaning you lose value). The fact that users do not stay on the URL they clicked in search results might raise some eyebrows. If you remove content that was important to rankings, then you're probably cloaking, in the true old-school sense of the word :)

Storiale

1:05 am on Dec 3, 2015 (gmt 0)

10+ Year Member



We are doing the same thing here - not indexing dynamic pages (associated with our internal search), testing on PPC, and then manually creating SEO pages from the ones that do well in PPC. It's the best of both worlds, in my opinion. You can customize a bit, but it's no different from a weather site that provides info for local weather. That's the best-case scenario - I'm not willing to risk cloaking on a site that brings in 100 million a year in SEO revenue.

Nutterum

11:59 am on Dec 3, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



@Storiale - and there is the fine print, really. If I make the most awesome weather site that provides info for local weather, I'd make enough significant changes to the dynamic part of the content that it would either be considered cloaking or duplicate content.

As for testing with PPC and then deploying to organic - our team thought of this approach as well when creating the landing page variants. Thank you for confirming our thought process! We kinda stole the idea from Expedia and Booking.com - two websites that autogenerate landing pages for the purposes of their dynamic PPC campaigns. I am not entirely sure how they managed to do it, but we are aiming to go with a similar approach, in a different business vertical.

netmeg

1:51 pm on Dec 3, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



autogenerate landing pages for the purposes of their dynamic PPC campaigns. I am not entirely sure how they managed to do it, but we are aiming to go with a similar approach, in a different business vertical.


It's a lot easier for PPC than for organic search, actually (and there are services like Unbounce that let you create dynamic landing pages).

Nutterum

8:58 am on Jan 5, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



Just a quick update: we managed to build a pretty robust workaround for the way pages are displayed, using Google Tag Manager as well as IP and cookie data. We are about to test this over the next month. I will update further and provide more information on what worked and where we found problems with our implementation, for those interested in this new type of organic page.