Forum Moderators: Robert Charlton & goodroi


Zombie Traffic / Traffic Shaping / Throttling

         

Shaddows

7:40 am on Sep 9, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



For a bit of background, here's the original (short) Traffic Shaping thread from 2010.
[webmasterworld.com...]

And this is the popular thread from 2012
[webmasterworld.com...]

I still get Zombies; they don't convert, they don't interact, they do nothing. I get a lot fewer than I used to, and I'm happy about that.

Many people I respect don't believe in them. Tedster went off the idea after failing to nail down a mechanism. Netmeg has never subscribed. Leosghost thinks it comes down to serving mobile poorly. The problem is, I would never spot it on someone else's site - you only know when traffic patterns change on your own site.

Zombies, on their own, are not a problem - apart from incremental bandwidth consumption, I suppose. They only become a problem when combined with Traffic Shaping / Throttling. That is another controversial topic that many respected webmasters absolutely do not believe in. And again, you have to ask HOW Google would actually implement any perceived shaping.

Well, some answers to that question include Dayparting, personalisation, user intent reinterpretation, and auto-suggestions. Personally, I think Autosuggest is a major factor these days. To take a trivial example, when I type the letter "i" into Google, I am offered either
    ISIS
    Israel
    Iraq

OR
    Iglu
    inuit
    Igloo


My previous search terms having been "middle east" and "Eskimo" respectively.

Also, when you think about it, it's probably more fair to rotate Page 1. I mean, for 10,000 relevant pages of reasonable quality content, is it fair that only 0.1% of them get any traffic? Or would it be more fair to rotate the top 100 sites (with the "natural" top 10 getting proportionately more time), so that 1% of pages get at least a trickle? In this scenario, changing your "time-at-the-top" recipe would appear as traffic shaping.
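If such a rotation existed, it would amount to weighted sampling without replacement. A toy sketch of the idea (the 1/rank weighting, the function name and the numbers are my own assumptions, purely to illustrate how a "time-at-the-top" recipe could work):

```python
import random

def rotated_page_one(sites, k=10, seed=None):
    """Pick k sites for page one, weighted so the 'natural' top
    ranks appear most often but every candidate gets a trickle.

    sites: list ordered by natural rank (best first).
    """
    rng = random.Random(seed)
    # 1/rank weighting: rank 1 carries ~100x the weight of rank 100
    weights = [1 / (rank + 1) for rank in range(len(sites))]
    page = []
    pool = list(zip(sites, weights))
    for _ in range(min(k, len(pool))):
        total = sum(w for _, w in pool)
        r = rng.uniform(0, total)
        acc = 0.0
        for i, (site, w) in enumerate(pool):
            acc += w
            if r <= acc:
                page.append(site)
                del pool[i]
                break
    return page
```

Run it repeatedly and the "natural" top 10 dominate page one, but every site in the top 100 surfaces occasionally - which, from any one webmaster's analytics, would look exactly like shaped traffic.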

Anyway, irrespective of the actual mechanism, many of us are certain we have a pre-set level of traffic. A good morning tapers off to a dull afternoon; a disappointing day suddenly ends with a bang. Any time you're off the glide-path, you get Shaped until you are back within your limits. And then the Zombies come. Same traffic, fewer conversions. Impending apocalypse.

On the bright side, my Zombies now arrive as part of a complete referral shift. That is, I suddenly lose traffic to some pages while getting new traffic to previously quiet pages. I strongly associate this with an impending mid-level core algo change. When the algo change occurs, I see a correlation: the high-converting pages (of both before and after the Shift) get sustained traffic, while the low-converting pages lose out.

I assume my experience is actually what Google is trying to achieve on other sites too. However, listening to the chatter here, they seem to get this wrong far more often than they get it right.

For context, I'm a reasonably sized ecom in the UK, selling stuff you can typically buy on Amazon for less than we charge. We routinely outrank Amazon for generic keywords (vanity phrases), and consistently on product searches and the long tail.

aakk9999

11:56 am on Sep 9, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I am wondering whether how (and which) AdWords ads are served for a query may influence traffic shaping.

There is another thing that I noticed with regards to personalisation - I am bilingual, and at some point Google profiled my "sticky" IP as speaking the other language; from that point on I did not get English ads served any more, they were in the other language. I am wondering how many bilingual searchers get ads in their other language, and how this may influence clicks. E.g. if my sons then search from my PC, they would completely miss the AdWords ads as they do not speak that second language, so more clicks would go to the organic listings.

Leosghost

12:55 pm on Sep 9, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



re bilingual..I'm also..English and French ( I speak and read in some other languages, but rarely search in them )..

Bear in mind that many words are the same ( if we ignore the accented letters and the pronunciation ) in the two languages..

My OS ( on desktops ) is set to English..and so are the browser languages..

My OS on mobile devices ( tablets and phones ) is set to French..and so are the browser languages..

Google defaults all android devices in France to Google.fr..one can get around this with a proxy, few actually bother..

I usually get English ads if I search English Google..in English ..( Google.com for example ..there are others ;)..I usually get French language ads if I search in French in Google.fr..

If I search in French in Google.com..I get French ads
Very occasionally I'll get ads for one language showing, even though I searched in the other..

Our home network ( fixed web facing IP address ) has 3 desktops permanently connected to the web..

All are set to English language OS..and Browser languages..

But one has almost all searches made in French from Google.fr..language matching to ads language is usually very accurate..

We search at odd times of the day or night..I have not seen any indications of "time affected results"..

But, those observations may not apply to all "language pairs" or other "language multiples"..

Unless one is running "user action tracking" scripts, such as following mouse movements or touch gestures, it is impossible to know for sure whether a "zombie" visitor is one who has left a tab open ( but is no longer "focused" on that tab )..or is "interacting" and being satisfied with the page..or is "interacting" but not getting anywhere..having difficulty finding what they want and clicking around, as is the case when mobile traffic arrives at a non mobile optimised or non responsive site or page.

The latter has been the case in every example that I have seen so far when webmasters have said that they have zombie traffic..( traffic arrives, but the site is "desktop" only or not optimised for mobile devices, so visitors move around trying to engage, fail, and eventually leave the site )..including the site of backdraft7 which was referred to ( but not named ) in the previous discussions..

Thus sites which have high SERP position can and do, sometimes get very badly converting "zombie" traffic..

Not an explanation of all of what gets called "zombie traffic"..But ..IME..a very large part of it..

netmeg

2:55 pm on Sep 9, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



When I think zombies, I think bots.

Rob_Banks

4:11 am on Sep 10, 2015 (gmt 0)

10+ Year Member



I've kind of drifted into netmeg's bot camp. I find myself using requests for images and scripts as a more accurate indicator of "real" traffic than raw pageviews in the server logs.

We all know MOZ, Majestic, AHREFS and many others are constantly visiting with (I assume) identifiable user agents, but what about the startups - the under-the-radar, next-big-SE, find-a-new-way-to-spam type people?
I believe Screaming Frog can crawl a site with a spoofed user agent. There must be a ton of private tools with the same capability.

How much of today's traffic isn't really traffic?
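The asset-request heuristic above can be sketched directly from an access log: a visitor whose IP fetched HTML pages but never a single image, stylesheet or script is a likely bot, since real browsers request page assets. A minimal illustration in Python (the Common Log Format layout and the asset-extension list are my assumptions; adjust to your own logs):

```python
import re
from collections import defaultdict

# Matches the client IP and request path out of a Common Log Format line, e.g.
# 1.2.3.4 - - [09/Sep/2015:07:40:00 +0000] "GET /page.html HTTP/1.1" 200 1234
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)')
ASSET_EXTS = ('.png', '.jpg', '.gif', '.css', '.js', '.ico', '.svg')

def likely_bots(log_lines):
    """Return IPs that requested pages but never any page assets."""
    pages = defaultdict(int)
    assets = defaultdict(int)
    for line in log_lines:
        m = LINE_RE.match(line)
        if not m:
            continue
        ip, path = m.groups()
        if path.lower().split('?')[0].endswith(ASSET_EXTS):
            assets[ip] += 1
        else:
            pages[ip] += 1
    return sorted(ip for ip in pages if ip not in assets)
```

It is only a heuristic - crawlers that do fetch assets, and humans behind caching proxies, will both be misclassified - but it separates the obvious scrapers from browser traffic.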

Apologies for the diversion, Shaddows. If I named names that mods feel inappropriate, feel free to slap me. :)

aristotle

4:38 pm on Sep 10, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Evidently Shaddows is mainly talking about traffic from Google search. Occasionally you'll see a bot that tries to fake a referral from Google, but these are usually easy to spot, and in any case not nearly common enough to account for so many complaints about Zombie traffic. So I'm pretty sure that the complainers are talking about human traffic, especially traffic from Google.

In my opinion most of it is actually mis-matched traffic. It's quite common for me to click a Google search result and immediately see that the site I land on doesn't have the information I'm looking for. This is what I mean by mis-matched traffic. When this happens, I quickly leave the site and have no further interaction with it. So in the logs it looks like an example of zombie traffic.
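In raw server logs, a mis-matched human visit and a "zombie" look identical: one Google-referred pageview, then nothing. A rough way to put a number on how much traffic falls into that bucket (the (ip, referrer) tuple shape is my assumption - real logs need parsing first):

```python
from collections import Counter

def single_hit_share(hits):
    """hits: iterable of (ip, referrer) tuples from the access log.

    Returns the fraction of Google-referred visitors that made
    exactly one request. Mis-matched humans and zombies alike land
    in this bucket - the logs alone cannot tell them apart.
    """
    counts = Counter(ip for ip, ref in hits if 'google.' in ref)
    if not counts:
        return 0.0
    singles = sum(1 for n in counts.values() if n == 1)
    return singles / len(counts)
```

Watching this share over time would show whether a "zombie" episode is really a rise in one-and-done visits, or just a fall in conversions among otherwise normal sessions.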

johnhh

5:15 pm on Sep 10, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



many of us are certain we have a pre-set level of traffic

We are 100% sure that this happens. Our graphs, which go back a long way, show that as soon as we hit a certain level of traffic we know we are about to be hit, and we always are.

The question is how and why.
The how I think is a number of factors.
1. Google adding 'words' to the query so if you search for green widgets you get green widgets + household ( for example ). Basically if you are doing well for a query, they just direct people elsewhere.
2. Using ISP data to track users going direct to the site.
3. Personalisation and localisation, as the normal user will not clear cookies.
4. Data from Chrome browser.

All the above gives Google a site overview.

The why is simpler:
$$$$
1. Advertisers, and I mean major advertisers spending millions, get upset if they don't also appear on page one of the SERPS.
2. Promotion of Google properties, or of companies financed by Google Ventures - you may be surprised how many there are.
3. Pro-American bias: select any subject and do a search, then check out who owns whom - and I include equity houses in this. Yes, there are alternatives to Amazon, but you never see them.

If you get too big, you get what we call 'slapped down'.

The result is our traffic from Google decreases and straight-lines, often for months, before increasing only to be hit again.

Only this week, our traffic last Sunday and Monday, normally our best days, decreased, yet Wednesday traffic was up, as were conversions. This of course will not be allowed....

I have seen many comments on this forum about European companies not being 'good enough'. We have some great European companies; the problem is that as soon as they make headway they get taken over, often never to be seen again as the technology is stripped out.

Rob_Banks

6:03 pm on Sep 10, 2015 (gmt 0)

10+ Year Member



Evidently Shaddows is mainly talking about traffic from Google search.


It was much easier to isolate potential zombies when referrer information included keywords. Now I see only about 20% of the search terms used, which means I no longer pay much attention to keywords and SERPS. What I watch instead is overall traffic, the percentage relationship between the major search engines, and individual page performance.
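That percentage relationship between search engines is straightforward to compute from referrer hostnames even without keyword data. A minimal sketch (the engine list is my assumption - extend it to taste):

```python
from collections import Counter
from urllib.parse import urlparse

ENGINES = ('google', 'bing', 'yahoo', 'duckduckgo')

def engine_share(referrers):
    """Return each search engine's share of search-referred visits."""
    counts = Counter()
    for ref in referrers:
        host = urlparse(ref).netloc.lower()
        for engine in ENGINES:
            if engine in host:
                counts[engine] += 1
                break
    total = sum(counts.values())
    return {e: n / total for e, n in counts.items()} if total else {}
```

A sudden shift in these ratios, with overall volume unchanged, is exactly the kind of signal that would suggest one engine's traffic is being shaped.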

FranticFish

4:06 am on Sep 12, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Occasionally you'll see a bot that tries to fake a referral from Google

I've been looking into this lately, and want to do some testing, but I have an idea that with a headless browser and someone else's IP address run via a proxy (I've read that you can search Google via proxy) there's no need to fake the referral. The search, the click, the visit could all be 'real' - just not performed by a human.

Rob_Banks

8:39 am on Sep 12, 2015 (gmt 0)

10+ Year Member



There are programs available that will scrape Google using either the program's own unreliable proxies or your own, hopefully better, ones. They let you search for and collect things like URLs containing footprints such as "viewtopic.php", which might be exploitable for comment spamming.

I'm not aware of any available program that performs a human-mimicking search, click and visit. I'm sure they exist; I just don't know where you can get one. Something like that would also need to randomise many other details.

Good for you on wanting to test, that's what makes the black box more understandable.

FranticFish

9:08 am on Sep 12, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Check out phantom.js - I'm involved in a test where the dev I work with used it to log in to an account and click around inside the user area.

Rob_Banks

4:17 am on Sep 13, 2015 (gmt 0)

10+ Year Member



Thanks for the share.