
Google & Traffic Shaping - a hidden method to the quality madness?



9:35 pm on Oct 27, 2010 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member

In the Monthly Thread [webmasterworld.com], I posted about what appeared to me to be an extended period of testing.
Using a multivariate dataset, across a range of different keyphrases, user intents and user types, Google exposed our site in marginal but significant ways (putting us up one place, dropping Universal Search, placing us above or below shopping results, etc.). They did this with (at least) four separate sets.

Over the course of 6 weeks, we experienced a slow churn of referrals, with four discrete datasets:

1. BIG uplift in traffic since 1st Sept (20% above trend), with a corresponding drop in conversion rate, so sales were broadly static (on trend). Referrals shifted at precisely the same time. No visible change in ranking.

2. A referral shift on 16th Sept, and another on 6th October - both traffic- and conversion-neutral (relative to 1st Sept).

3. Then the biggy: 12th October, a huge referral shift. Traffic-neutral, but conversions back at the pre-Sept level. In other words, we are now 20% up on sales. The referrals are NOT the same as (or even particularly similar to) the pre-September set.
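The arithmetic behind those phases is easy to lose track of, so here is a sketch with hypothetical baseline numbers (only the 20% relationships come from the post; the visit and conversion figures are invented for illustration):

```python
# Hypothetical illustration of the traffic/conversion phases described above.
# Baseline numbers are invented; only the +/-20% relationships are from the post.

baseline_visits = 10_000        # pre-Sept daily visits (assumed)
baseline_conv_rate = 0.02       # pre-Sept conversion rate (assumed)
baseline_sales = baseline_visits * baseline_conv_rate   # 200 sales/day

# Phase 1 (from 1st Sept): traffic +20%, conversion rate drops just enough
# that sales stay on trend.
phase1_visits = baseline_visits * 1.20
phase1_conv_rate = baseline_sales / phase1_visits        # rate that keeps sales flat

# Phase 4 (from 12th Oct): traffic unchanged from phase 1, but conversion rate
# returns to the pre-Sept level -> sales land 20% above the old baseline.
phase4_sales = phase1_visits * baseline_conv_rate
print(phase4_sales / baseline_sales)   # 1.2, i.e. "20% up on sales"
```

The point of the sketch: a flat sales line under rising traffic, followed by a conversion-rate recovery under flat traffic, mechanically produces the 20% sales uplift the post describes, with no single step visible in rankings.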

Does anyone else have any experience of what appears to be purposeful traffic shaping, with a definitive end result?

In the past, I've shied away from any theory that requires a "my site is special" mindset, but I am convinced this was outside the normal algo development cycle. My personal point of view is that Google is aggressively profiling users and sites, and trying to match the two within a specific context. Any takers?

I don't want this to become a "Google has no purpose, all my traffic is going to SPAM sites" free-for-all. Please post with qualitative data, or some meta analysis.


9:17 pm on Nov 23, 2010 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

WRT "The Big Five" factors: overlooked there is an increasingly important SE factor, semantic clarity. When your content sends clear semantic signals, there's little need for Google or anyone else to run dynamic testing to find the "right" audience, the "right" query intention, and so on.

It is easier to achieve semantic clarity with an all-div layout - tables tend to juxtapose content in the source code that comes from various parts of the visible page, for example. Does this mean I only work with table-less layouts? Hardly; it's often not practical with an enterprise-level CMS.
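The juxtaposition point is easy to demonstrate: linearize the text of a row-based table layout versus a div layout in source order, roughly as a crawler would. The markup below is invented for illustration:

```python
from html.parser import HTMLParser

class TextOrder(HTMLParser):
    """Collects visible text in source order, roughly as a crawler linearizes it."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

def linearize(html):
    parser = TextOrder()
    parser.feed(html)
    return parser.chunks

# Row-based table layout: sidebar and article cells alternate in the source.
table_html = """
<table>
  <tr><td>Nav link A</td><td>Article paragraph 1</td></tr>
  <tr><td>Nav link B</td><td>Article paragraph 2</td></tr>
</table>"""

# Div layout: the article text stays contiguous in the source.
div_html = """
<div id="sidebar">Nav link A Nav link B</div>
<div id="content">Article paragraph 1 Article paragraph 2</div>"""

print(linearize(table_html))  # navigation text interleaves with the article
print(linearize(div_html))    # article text is one contiguous block
```

In the table version the navigation text interrupts the article in source order, muddying the semantic signal of the content block; the div version keeps it contiguous.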

Still, optimizing semantic and relevance signals in the content area is an important factor in avoiding the dynamic testing that may otherwise be your site's fate.

Many sites (I dare say most sites) do just fine and don't experience the kind of fluctuations this thread is focusing on. But it is an area I'd double-check if you find yourself on a severe conversion roller coaster.


12:15 am on Dec 14, 2010 (gmt 0)

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

I see traffic staying fairly constant, yet trophy terms are not being clicked on, having been driven deeper down the page.

Is the constant nature of traffic caused by traffic shaping, by user queries on deeper long-tail searches (caused by changes to the SERP layouts and Google Instant), or by a combination of both?

It seems to be getting too complicated to analyse, with regional and personal settings also playing into this.


1:06 am on Dec 14, 2010 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

caused from traffic shaping or user queries on deeper long tail search

That seems like a question you can answer via your analytics package, Whitey. In fact, now that you've asked it, that sounds like a question it would be GOOD to answer.
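One crude way to approach that in an analytics export is to bucket referral keywords into head terms versus long tail, then compare the shares across date ranges. The data and the word-count cutoff below are hypothetical, just a sketch of the idea:

```python
from collections import Counter

# Hypothetical referral keywords as they might appear in an analytics export.
referrals = [
    "blue widgets",                            # head / trophy term
    "blue widgets",
    "buy cheap blue widgets free shipping",
    "blue widget replacement parts uk",
    "widgets",
    "how to fix a cracked blue widget",
]

def bucket(query, head_max_words=2):
    """Crude proxy: treat 1-2 word queries as head terms, longer ones as long tail."""
    return "head" if len(query.split()) <= head_max_words else "long tail"

counts = Counter(bucket(q) for q in referrals)
print(counts)
```

Run the same tally over the period before and after a suspected shift: if the long-tail share grows while total traffic stays flat, that points at SERP-layout/Instant effects on query mix rather than traffic shaping.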


10:23 pm on Dec 26, 2010 (gmt 0)

5+ Year Member

quoted from [webmasterworld.com...]
Huge opening for programmers to develop "virtual users" using proxies and loading up "test" info to find SERPS with different parameters and "user settings".

An extension of that would be to again use proxies to "shape" the web page profiles the webmaster is seeking. 10,000 interval visits a day by a proxied bot performing whatever the webmaster sets it to do is very easy.

I see this traffic shaping exercise as totally open to spam and manipulation. I honestly hope big G has a big ACE up its sleeve that we don't know about yet.

I'd like to call upon adventurous programmers to try developing this bot and testing it to see how it can "shape" SERPs. There seems to be increasing evidence from many fronts that traffic shaping is well and truly here to stay. Let's see what we can do with it.
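Before anyone builds a bot, note that half of the experiment is just measurement: deciding whether two "virtual user" profiles are actually being shown different orderings. A minimal, stdlib-only sketch of that comparison step, using a Kendall-tau-style agreement score over hypothetical result lists (the site names are invented):

```python
from itertools import combinations

def rank_agreement(serp_a, serp_b):
    """Kendall-tau-style agreement between two orderings of the same URLs:
    the fraction of URL pairs ranked in the same relative order (1.0 = identical)."""
    pos_a = {url: i for i, url in enumerate(serp_a)}
    pos_b = {url: i for i, url in enumerate(serp_b)}
    common = [u for u in serp_a if u in pos_b]
    pairs = list(combinations(common, 2))
    if not pairs:
        return 1.0
    same = sum(
        1 for u, v in pairs
        if (pos_a[u] - pos_a[v]) * (pos_b[u] - pos_b[v]) > 0
    )
    return same / len(pairs)

# Hypothetical top-5 results as seen by two different "virtual user" profiles.
profile_1 = ["site-a", "site-b", "site-c", "site-d", "site-e"]
profile_2 = ["site-b", "site-a", "site-c", "site-d", "site-e"]

print(rank_agreement(profile_1, profile_1))  # 1.0
print(rank_agreement(profile_1, profile_2))  # 0.9 - one adjacent swap out of 10 pairs
```

Scores consistently below 1.0 across repeated fetches for the same query would be the kind of evidence this thread is looking for; a single divergent fetch proves nothing, given personalization and datacenter differences.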


I like the idea, ryanep!


11:13 pm on Dec 26, 2010 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

This certainly seems like a direction worth studying. If we want to understand what Google is doing, wouldn't the first challenge be to work out the taxonomies Google has established to test various results against?

I see one significant challenge. There's no reason to think that every site gets this "traffic shaping" treatment we've been discussing. Google seems to roll it out only in certain cases - perhaps where they're having trouble "pegging" a website within one of their taxonomies.