
"ZOMBIE TRAFFIC" Separating fact from fiction & emotion

4:20 pm on Nov 10, 2015 (gmt 0)

New User

joined:Apr 30, 2015
posts:37
votes: 9


This recent discussion about "ZOMBIE TRAFFIC" is just utter nonsense. What are people saying, anything worthwhile, or just a communal <snip> because sales are down on the norm? The talk is firmly in tin-foil-hat territory.

Are you talking about SERPs? If so, why? If your positions are dropping, then that's that. If positions are not dropping, are you seriously saying Google is sending you people they know will not buy from you!? REALLY?!

Are you talking about PAY PER CLICK? If so, then you're talking possible click fraud, aren't you?

Given any constant period on the internet, people either buy or they don't, and there are many, many factors behind why they will one day and might not the next.

[edited by: goodroi at 5:55 pm (utc) on Nov 10, 2015]
[edit reason] Let's be careful to keep the discussion on a professional level [/edit]

6:16 pm on Nov 10, 2015 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 21, 2004
posts:3238
votes: 195


Imagine if Google never penalized a website and you were part of the first batch of websites to experience a Google penalty. It would probably be hard for you to figure out what exactly was going on and communicate the situation to other site owners who have never been penalized. Just because something is not well documented or experienced by you does not necessarily mean it isn't happening to someone else. It also doesn't mean that everything someone posts on an internet forum is correct or accurate. There have been many self-proclaimed experts who just spout incorrect stuff even after being proven wrong.

I would not say that every comment in the very lengthy zombie traffic thread [webmasterworld.com...] is useful or accurate, but IMHO there is something weird happening to the organic traffic for a subset of websites. This could be a new type of filter/penalty, or it could be an unexpected influence of PPC day bidding taking away the higher converting traffic for a certain time period each day, or a combination of things. I have seen the analytics of a few websites that just leave me puzzled.
8:58 pm on Nov 10, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:7525
votes: 507


I believe there are always a number of cyclic trends pervading all aspects of the web and searches.... but to use only one data set of information (google) is restricting, to say the least. Have you observed the same in Bing, Yandex, Duck Duck Go? You might have to rely on your own logs for this info.
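
If you do go to your own logs, here is a minimal sketch of counting engine referrals (the log path, combined log format, and engine list are assumptions for illustration, not anyone's recommended tooling):

import re
from collections import Counter

# Match search engine referrers by hostname fragment (illustrative list)
ENGINES = {"google.": "Google", "bing.": "Bing",
           "yandex.": "Yandex", "duckduckgo.": "DuckDuckGo"}

# Combined Log Format lines end with "referrer" "user-agent"
referrer_re = re.compile(r'"([^"]*)" "[^"]*"$')

counts = Counter()
with open("access.log") as log:
    for line in log:
        m = referrer_re.search(line.strip())
        if not m:
            continue
        ref = m.group(1).lower()
        for fragment, name in ENGINES.items():
            if fragment in ref:
                counts[name] += 1
                break

for engine, hits in counts.most_common():
    print(engine, hits)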

The bots are getting better at mimicking humans, so there is some amount of "zombie traffic" out there. How much? That's what we are trying to find out.

But in all large data sets there are inner cycles and trends, some obvious, some not so obvious. But the whole data set is the human race, which includes global politics and economics (personal and national) and that data set is much larger than even the Web.

Keep those trends in mind when viewing your "analytics".
9:49 pm on Nov 10, 2015 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:June 26, 2004
posts:379
votes: 33


@FishingDad

Why do you feel the need to start a new thread? If this is not affecting you then why are you so emotionally involved?
10:18 pm on Nov 10, 2015 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:June 26, 2013
posts:454
votes: 69


It would seem this thread has more ranting, name calling and accusations than the one the poster is complaining about.

This thread serves absolutely no purpose and is written in such a way as to provoke a hostile response. If the poster does not agree with the zombie traffic thread, or have anything of value to add to it, he has the freedom to not participate in it. But to demean people who are experiencing a similar problem and sharing their views, by starting a new thread bashing them, is wrong.

This thread should have been deleted by a mod right away. If we were all to start threads ranting about the contents of other threads and opinions of other members, it would be impossible to discuss anything.
11:56 pm on Nov 10, 2015 (gmt 0)

New User

joined:Apr 30, 2015
posts:37
votes: 9


I am interested in clarity about this "problem with Google", and no one is providing it that I can see.

Is it SERPs, is it pay per click, or both?

I am asking these questions here as the other thread is just confused and full of nothing tangible.
12:11 am on Nov 11, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:7525
votes: 507


I know it sounds strange, but I missed the previous thread noted above. I took a look. Whew! All over the place. When I replied above I was speaking to something different, true Zombie traffic which, in general, is bot activity screwing with sites.

As for the other thread's discussion regarding G being in charge of such results (or not, or some magical thing from the Fourth Dimension... I didn't read all of it, gave up!) I don't buy it.

I do see that G is on the verge of delivery failure, that adsense might have seen its day, that adwords is saturated and declining and that some (repeat SOME) webmasters have either not moved on or cling to a fading revenue stream, OR believe in a Myth of Prosperity by playing with G ad serving properties.

Meanwhile, the real world does get involved and some buy cycles are based on that.

Zombie traffic, for me, is BOTS acting like humans... but since they don't have a credit card to follow up, nothing happens to the benefit of the publisher. These days, to me, it is a cost of doing business (or lack of).
10:46 am on Nov 11, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member

joined:Aug 11, 2008
posts:1369
votes: 132


Right, so...
Currently the world is divided into two groups. One believes every single bad thing in the world is an intentional, planned, deliberate and foreseen proactive policy decision designed and implemented by a single, unified, omnipotent, omniscient collective consciousness manifesting itself as a multinational company known as Google.

The other thinks that Google is a successful multinational company that operates like any other, with internal contradictions, misaligned goals, and siloed decision processes, each subsection focussed on departmental goals that may have unintended consequences, particularly in spheres shared with other departments.

I fully admit that the current Zombie discussion is dominated by the former. Or, in the words of Simon_H:
There is plainly a frustrated contingent whose logic begins with "I'm making less money than I have for the past X years", ends with "Google is evil" and the bit in between is just the latest conspiracy theory.


So for clarity, Zombie traffic should not include Bots, but some people seeing Zombies just cannot filter those out. This is no different from the fact that many people do not know the meaning of "statistically significant"- or that I see frequent reference to "Title Tags". Few people are proficient in all subjects discussed on these boards, even if proficiency would directly help them.

Zombies (sans bots) are best observed as a drop in conversions for a stable level of Google-derived traffic. Most people seem to see it site wide, others in subsections.
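
For anyone who wants to test their own numbers against that definition, here is a rough sketch (the thresholds and the input shape are my assumptions, not anything prescribed):

def flag_zombie_days(days, vol_tolerance=0.15, conv_drop=0.5):
    """days: list of dicts with 'date', 'google_sessions', 'conversions'.
    Flags days where Google traffic is near the baseline volume but the
    conversion rate has fallen by more than conv_drop (default 50%)."""
    total_sessions = sum(d["google_sessions"] for d in days)
    base_sessions = total_sessions / len(days)
    base_rate = sum(d["conversions"] for d in days) / total_sessions
    flagged = []
    for d in days:
        stable = abs(d["google_sessions"] - base_sessions) / base_sessions <= vol_tolerance
        rate = d["conversions"] / d["google_sessions"]
        if stable and rate < base_rate * (1 - conv_drop):
            flagged.append(d["date"])  # normal traffic, cratered conversions
    return flagged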

Personally, I think there is a deliberate attempt to deliver a consistent level of traffic to some sites, and further that this is reasonable, as cycling through 100 potential destinations is better than only showing a static 5 above the fold on a single SERP. Especially if you give different frequency weightings to those sites, and use Personalisation data as part of the selection criteria for what subset you surface in any given circumstance.

Going back to my original observation of the tribes of the world. It is an unfortunate fact that for any new or unproven idea, one group will call the other a Fanboy or a conspiracy nut depending on your tribal affiliation. It is not helpful. Sceptics are required in any discussion- they stop groupthink. Also essential are people who explore an idea out loud for other people to contribute - even if they turn out to be wrong. The safety to explore a new idea without being called names by the orthodoxy is the only way new discoveries can be made.

Name calling by either side should stop now.
3:48 pm on Nov 11, 2015 (gmt 0)

Full Member

Top Contributors Of The Month

joined:Sept 28, 2015
posts:273
votes: 171


Thank you @Shaddows for the reference. As @mrengine says, this thread has been created purely for the purpose of generating a hostile response. The original post breaches forum rules, plus we shouldn't split the same topic into 2 separate threads. Could we please get this thread locked so sensible debate can return to the main thread?
8:30 pm on Nov 12, 2015 (gmt 0)

Full Member

10+ Year Member

joined:June 4, 2005
posts:259
votes: 37


The late and legendary member "tedster" (Ted Ulle) of this forum started a thread on WW on Apr 6, 2012 titled
Zombie Traffic from Google and Traffic Shaping/Throttling - Analysis [webmasterworld.com ]

In one of his last tweets, Sep 20, 2012 [twitter.com], @tedulle (tedster) wrote:
After 3 years of analysis, "zombie" traffic identified as mobile traffic to mobile unfriendly pages


The above thread (15 pages long) makes useful reading. The phenomenon is certainly not new.
It has just got worse.

Why?

9:20 am on Nov 13, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member

joined:Aug 11, 2008
posts:1369
votes: 132


The real question is why it turns on and off, without affecting overall traffic.

Even if you assume mobile is the cause, and assume that a statistically significant drop in traffic can occur for one day only on Desktop, and assume that a statistically significant spike in traffic could arrive from mobile, and allow that these two events could happen coincidentally on one day, I cannot see how you can explain the coincidence of two apparently unrelated traffic events repeating over time.
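
Since "statistically significant" keeps coming up, here is one rough way to sanity-check a single-day drop (a sketch only, using a normal approximation to a Poisson count; the numbers are made up):

import math

def is_significant_drop(day_count, baseline_mean, z_threshold=1.96):
    """True if the day sits more than ~2 standard deviations below baseline.
    Assumes visit counts are roughly Poisson: variance equals the mean."""
    std = math.sqrt(baseline_mean)
    z = (day_count - baseline_mean) / std
    return z < -z_threshold

print(is_significant_drop(2420, 2500))  # False: within normal noise
print(is_significant_drop(2350, 2500))  # True: z = -3.0, a real drop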

By the way, I don't think Zombies are increasing. Mine are decreasing as a proportion of traffic, having started around Mayday (2010), increased as discussed here [webmasterworld.com] in October 2010, and peaked by tedster's thread posted above in October 2012.

I think possibly they are becoming more widespread. Maybe people are just looking for them, or are more proficient with analytics (maybe less...). They are certainly the vogue bogeyman du jour.

But if you see permanent Zombies (not classic on/off periods), you are probably watching bots and mobile.
11:55 am on Nov 13, 2015 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Nov 2, 2014
posts:487
votes: 202


this thread has been created purely for the purpose of generating a hostile response. The original post breaches forum rules,

You can say that again, Simon. Apparently the admin here finds it acceptable for members to hate on other members who are critical of Google.

To respond to fishingdad:

Is it SERPs, is it pay per click, or both?

I said "It never fails that when zombie traffic from Google organics arrives, the same low quality (possibly non-human) traffic applies to Adwords too" on October 10th and other members have mentioned the same. And you can see all of the comments about people pausing, stopping, ending their Adwords campaigns.

Maybe if you had spent more time reading the posts, instead of posting this thread calling what we were discussing "nonsense," you would have less to whine about.
12:51 pm on Nov 13, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member

joined:Aug 11, 2008
posts:1369
votes: 132


acceptable for members to hate on other members that are critical of Google.

Almost everyone here is critical of Google.

It just splits into those who critique, and those who already know Google are evil, and just need to fit any given observation into a pre-existing world-view.

Just because Google is doing something, doesn't mean they are doing it deliberately.

Similarly, just because someone disagrees with you, it doesn't mean they are hating.

Intelligent people can disagree. Professional forums are supposed to be a place where people can discuss and debate without name-calling.

One final suggestion. If you have a problem and do not know the solution, and another set of members do not have that problem, perhaps you could learn from them rather than insult them and exclude them from the conversation. At the moment, only people with the unsolved problem are welcome in any given discussion, and that's just crazy.
1:37 pm on Nov 13, 2015 (gmt 0)

New User

joined:Apr 30, 2015
posts:37
votes: 9


"It never fails that when zombie traffic from Google organics arrives, the same low quality (possibly non-human) traffic applies to Adwords too"


My post here started

Are you talking about SERPs? If so, why? If your positions are dropping, then that's that. If positions are not dropping, are you seriously saying Google is sending you people they know will not buy from you!? REALLY?!

Are you talking about PAY PER CLICK? If so, then you're talking possible click fraud, aren't you?


The conclusion, as far as Google goes, can only be one of two things: click fraud, in which case contact the authorities; or SERPs, and therefore nothing to be concerned about, so get on with something else if your positions are the same as the norm. Simple as that. Is this no help to those spiralling up into their own backside looking for answers to the totally irrelevant or the impossible?

The MAJOR and ONLY concern is click fraud, and if this is what's being suggested then I would like to hear of these cases of fraud being put to the authorities. Personally, I think it must be awash with foul play and would love to see Google taken to task on it.

The "org" thread, I tried to read but by the time I got to page 2 my tin foil hat had slipped off and Google had picked me up with its mind powers and shook me like a dawg.

PS

I do not recall any name calling here?
2:11 pm on Nov 13, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member

joined:Aug 11, 2008
posts:1369
votes: 132


I do not recall any name calling here?

Nope, but that's the second time you have quoted Monsters Inc. (also Msg# 4775405 on the recent, locked Zombie thread) <-- Is it still impossible to link to messages?

It's not exactly edifying.

I have never had a problem with PPC, and was not even doing PPC back in 2010.

I've not got a problem with Google, the traffic provider. I have an issue with Google, the company, but usually you won't see me express an opinion on the SERPs boards cos it's not relevant. Hating or loving G is not going to help you rank, or overcome a penalty. And actually, we're doing perfectly well, thank you very much.

I have always seen the Zombie phenomenon as a chance to glimpse a part of an algorithm that we normally do not get. As you say, if you cannot see rankings change, how is it possible for Google to control (limit or boost) your traffic, or your traffic quality? I've never found a mechanism, but my current theory is that Google is heavily into Personalisation, and shows different sites to different cohorts, so one individual cannot track rankings. Quality would be determined by whatever cohort you are currently being shown to, and there is probably no nefarious reason for this other than Google wanting to know what their USERS prefer to see.

And to forestall the obvious "I am depersonalised with VPNs, virtual machines, proxies, TOR, spoofed user agents, rotating browsers and temporal distortions" - well, congratulations. You are in the smallest of all Personalisation cohorts, one that I would feed nonsense to if I were Google.
3:04 pm on Nov 13, 2015 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 21, 2004
posts:3238
votes: 195


If someone is depersonalised from Google using VPNs, TOR, temporal distortions, wormholes and/or time machines, I think that person would have a very hard time understanding & analyzing what Google is doing. I am not saying that person is wrong or not smart. I am saying that person is experiencing Google in a way that almost no real consumer will experience.

I think it is better to have a wide range of machines set up to simulate different real-life users. If you had an account set up for a NYC internet user, another for a Chicago internet user and a third for a Dallas internet user, I think that would give better insights into what Google is or is not doing, compared to depersonalizing yourself to an extreme degree.
3:50 pm on Nov 13, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
posts:2630
votes: 191


In reality, there is no depersonalisation. Everything is personalised, if on no other basis than on an IP basis. Once we bought 10 proxy IP addresses, and three of these were "speaking Mandarin" and a further two some other Asian languages (judging by the language of the AdWords served). IPs were personalised based on who used them before / still uses them.

@Shaddows, what I do not understand is this: the current theory is that Zombies are supposedly human traffic which is mismatched. You say the traffic is constant and the conversions are turned on/off. But if the traffic is mismatched, wouldn't you see it in some other measurements? E.g. time on site, bounce rate, page views per visit? Some other aspect of behaviour other than just not going down the buying funnel?

If all other measurements are similar, then the only other thing I can currently think of is a shift in intent (research vs purchase) - a site is at some times only sent the info/research traffic, but not the traffic using what Google would know are converting keywords.
4:29 pm on Nov 13, 2015 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3125
votes: 212


Sandra -- I proposed a theory in the now-closed thread, according to which Google can determine which sites have relatively low conversion rates for the same or similar products, and uses that information to decide which type of traffic to send to those sites.
4:33 pm on Nov 13, 2015 (gmt 0)

Full Member

Top Contributors Of The Month

joined:Sept 28, 2015
posts:273
votes: 171


If this thread is to (ironically) become the new place for the zombie phenomenon to be discussed, may I please ask the mods to (1) change the sticky at the top to something that properly defines zombie traffic, e.g. msg 4775862 (edited as people wish) to ensure this stays focused and (2) have a zero tolerance of trolls.

@aakk9999 It's a good point about other stats and this was discussed previously. For us, 'bad' traffic and 'good' traffic do indeed appear similar in terms of bounce rate, time on site, etc. Initially, that seemed wrong to me, but I actually think it's to be expected. These are real people, not bots. Let's say that 'good' traffic converts at 10% and 'bad' traffic converts at 1%. The non-converting 99% of 'bad' traffic will behave very similarly to the non-converting 90% of 'good' traffic, so the stats won't be all that different. And poorer matches can often result in a longer time on-site, as they spend time trying to find what they want, whereas good matches are in and out very quickly.
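
To put made-up numbers on that (a sketch of the arithmetic, nothing more):

def blended_time_on_site(conversion_rate, visits=10000,
                         browser_secs=180, buyer_secs=60):
    """Average time on site when only the small converting slice behaves
    differently (here: buyers are in and out quickly, browsers linger)."""
    buyers = visits * conversion_rate
    browsers = visits - buyers
    return (buyers * buyer_secs + browsers * browser_secs) / visits

print(blended_time_on_site(0.10))  # 'good' day: 168 seconds
print(blended_time_on_site(0.01))  # 'bad' day: ~179 seconds
# A 10x collapse in conversions moves the aggregate engagement figure
# by only about 6% - easy to miss in the top-line analytics.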

P.S. Totally agree with @Shaddows and @glakes. Disagreement within intelligent discussion is healthy. That's very different to trolling.
4:57 pm on Nov 13, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member

joined:Aug 11, 2008
posts:1369
votes: 132


The 99% of 'bad' traffic will behave very similar to the 90% of 'good' traffic, so the stats won't be all that different

Great insight! After all, the SERPs must be pretty close to user intent matching to even surface- just not quite as good as usual.

Looked at in the negative (99% vs 90%), it's only a 10% quality shift. Worth thinking about in more detail.
5:39 pm on Nov 13, 2015 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
posts:2630
votes: 191


The 99% of 'bad' traffic will behave very similar to the 90% of 'good' traffic, so the stats won't be all that different

So if the site is receiving 100% traffic, of which 90% does not convert (e.g. research/info phase) and 10% converts, then in the "off" periods this 10% of converting traffic is missing and is replaced with 'bad' non-converting traffic(*)? So whilst the converting traffic and the replacement non-converting traffic have different stats, overall there is a 90% match in behaviour stats, as the first 90% of traffic is of the same kind?

(*)Non-converting traffic or traffic wanting to convert but purchasing something in the 'low converting' section of the site, as aristotle's theory mentioned

It does appear as shaping the traffic with buying intent and sending it elsewhere. But in the 'off' periods, why switch all of it off? (If I understood the 'off' switch correctly.) Why not just reduce converting traffic by e.g. 40%?

The sites I look after do not have the Zombie phenomenon - there is a seasonal traffic fluctuation and a predictable on/off purchasing period (e.g. research during the day, purchase mostly in the evening), so I am not seeing the Zombie phenomenon first hand.
7:11 pm on Nov 13, 2015 (gmt 0)

Full Member

Top Contributors Of The Month

joined:Sept 28, 2015
posts:273
votes: 171


@aakk9999 It's all guesswork, but trying to explain how things work IMO... I don't think Google is identifying and switching in and out just that 10% of converting traffic. I think it's simpler than that. Google gets a wide variety of queries, some where it has a high confidence in understanding search intent and some a low confidence. Google still has to list *something* in serps where it has a low confidence in understanding intent. So, statistically, sites that appear in serps for high-confidence queries will see a higher conversion rate than sites that appear in low confidence serps. But whether high or low confidence results, they're all real users and 90%+ won't convert anyway, hence the average time-on-site, bounce rate will appear similar.

It's logical that Google will want to give sites perceived as high quality more exposure in high-confidence serps than sites perceived as low quality. Combine this with traffic shaping/quotas and I believe Google will be switching sites in and out of certain serps as they near their quotas, resulting in the on/off effect being seen.

This is not dissimilar from @aristotle's theory, which involves an ecommerce quality score being applied to sites.

I think this is further complicated because this may not just be one thing happening. Someone else (@Shaddows?) suggested this may be multiple things conflicting, e.g. traffic shaping with RankBrain A/B testing. And it seems very coincidental that this all kicked off exactly around when iOS9/adblockers went live. The problem with traffic shaping is it's highly susceptible to issues, because any 'bad' traffic sent to your site for whatever reason ends up eating into your allowance and then potentially causing the on/off switching as Google enforces the quotas. So whilst a single cause may not explain what we're seeing, a combination of a few may.

We certainly *do* see this phenomenon. We monitor traffic hourly and overlay graphs in GA of individual days from one week to the next, and the total volume is almost identical to the hour. But conversions are completely different.
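
For anyone wanting to reproduce that overlay outside of GA, a minimal sketch (the CSV export and its column names are assumptions for illustration):

import pandas as pd

df = pd.read_csv("hourly_traffic.csv", parse_dates=["timestamp"])
df["week"] = df["timestamp"].dt.isocalendar().week
df["dow"] = df["timestamp"].dt.dayofweek
df["hour"] = df["timestamp"].dt.hour

# One column per week, indexed by (day-of-week, hour)
sessions = df.pivot_table(index=["dow", "hour"], columns="week",
                          values="sessions", aggfunc="sum")
conversions = df.pivot_table(index=["dow", "hour"], columns="week",
                             values="conversions", aggfunc="sum")

# Week-over-week change, hour by hour: near zero for volume on zombie
# days, but large negative swings for conversions.
print(sessions.pct_change(axis=1).describe())
print(conversions.pct_change(axis=1).describe())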
7:17 pm on Nov 13, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:7525
votes: 507


I fall into the same category as aakk9999, though my zombies are definitely BOTS seeking content. These I manage in .htaccess, except for the ones that get through by doing a good mimic of human activity. Those are the ones I call zombies, which apparently is a distinctly different kind of zombie than the ones under discussion here.

Meanwhile, daily ups and downs will only make you insane. Weekly, 10-day, biweekly, 21-day, monthly, or quarterly windows make more sense for tracking trends, with a nod and a wink to seasonal/school cycles where actual human behavior would realistically change.
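
A minimal sketch of that longer-window smoothing (the daily input file and column names are assumed for illustration):

import pandas as pd

daily = pd.read_csv("daily_sessions.csv", parse_dates=["date"],
                    index_col="date")["sessions"]

# Rolling means over the windows mentioned above; daily noise washes
# out while genuine trend shifts and seasonal cycles remain visible.
for window in (7, 10, 14, 21, 30, 90):
    smoothed = daily.rolling(window=window, min_periods=window).mean()
    print(window, "day mean, latest:", round(smoothed.iloc[-1]))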
7:43 pm on Nov 13, 2015 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:11832
votes: 284


Like aakk9999, I've not observed Zombies on sites I've dealt with, so these observations are based on what I've seen of stats and reports of others. Much of this is necessarily conjecture.

I've noted that this zombie traffic seems to appear around times of Google testing, which are not the exact times of full algo changes. Reports often precede and follow algo or infrastructure changes, and I've come to think that the sites affected may be parts of subsets used in evaluation of heuristics, testing of filters, evaluation of serp layouts, etc.

My emphasis added...
It does appear as shaping the traffic with buying intent and sending it elsewhere. But in the 'off' periods, why switch all of it off? (If I understood the 'off' switch correctly.) Why not just reduce converting traffic by e.g. 40%?
If this Zombie traffic is associated with testing, it may be much simpler (or more dependable) for Google to compare whatever it is comparing at a relatively consistent level of traffic, so it adds or introduces types of traffic not only to keep a consistent level, but perhaps also to compare several different sets of factors. Someone with more background in statistics might be able to describe the mechanisms more clearly.

I've come to envision that Google is dealing with what I call a "multi-dimensional" model of user intent, query meanings, geo-aspects, freshness requirements, and site relevance and quality factors, and to a degree, many of these are interacting with each other to shape what we call "personalization."

As Google tests a variety of factors over time, it may well be that the Zombie effect is an artifact of different kinds of tests... including buying intent, but not necessarily limited to that alone. Reports at one point were clearly associated with, say, geo-specific features in the serps, and complaints about Zombies often suggested that traffic was coming from the wrong country.

As many factors are now localized, geo-filters are now associated with many different factors in the algorithm, and if Google is testing by switching filters on and off, perhaps geo-filters are sometimes, but not necessarily always, among the factors triggered.

On sites which see a small but dependable number of conversions at certain points, this may not be a normal base level of conversions, but rather an amount of ready-to-buy traffic that Google is adding simply to maintain some sort of statistical norm.

Regarding visits which correlate with particular times of day, I can say with almost absolute certainty that these are behavioral patterns associated with those times of day (e.g., lunch hour, bedtime, etc.).
8:36 pm on Nov 13, 2015 (gmt 0)

Full Member

Top Contributors Of The Month

joined:Sept 28, 2015
posts:273
votes: 171


@RobertCharlton That makes a lot of sense. I've assumed that the criterion for sites seeing this phenomenon is that the sites are low quality, hence Google is allocating them a higher proportion of low confidence serps being switched in and out to meet quotas. I've assumed this because, when I asked, the vast majority of affected sites were hit by Penguin or Panda. We've been hit by both, but have been 'clean' for almost a year. As you suggest, the actual reason could be that our profile makes us (and others) perfect test sites for Panda 4.2, Penguin real-time, etc. So it's not that sites perceived as low quality are being allocated this on/off zombie traffic, it's that sites perceived as low quality are being used as test sites and hence having on/off zombie traffic allocated to measure engagement, etc. Much like the 'sinister surges' seen before Panda 4.1.

If that's true, it tells us a lot. It follows that Google would predominantly want to test borderline sites rather than sites that are perceived as either very high quality or very low quality, because that's where any algo would need tweaking/learning. It also potentially tells us that, depending on how those sites respond to the tests, they may see in the near future a recovery or a filter/penalty applied. If you're seeing the zombie phenomenon and are not aware you've been hit by any filters/penalties to date, you may want to start an urgent site clean. On the positive side, it also potentially tells us that the phenomenon is temporary and, assuming the site is now perceived as high quality, it may return to normal/good traffic soon.
8:40 pm on Nov 13, 2015 (gmt 0)

Full Member

Top Contributors Of The Month

joined:Sept 28, 2015
posts:273
votes: 171


Something else to add... This doesn't explain why this phenomenon is being seen on paid as well as organic. I believe there's an indirect/unintentional dependency between organic and paid (e.g. if good results appear in paid, then the user will click before reaching organic results), but any thoughts on that side of things?
9:51 pm on Nov 13, 2015 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:11832
votes: 284


Simon_H - I'm pleased I communicated as well as I did. I didn't discuss the type of heuristic testing that I think Google does, as that's a big topic, but let me say on this that I think you're half right....
That makes a lot of sense. I've assumed that the criteria for sites to see this phenomenon is that the sites are low quality, hence Google is allocating them a higher proportion of low confidence serps being switched in and out to meet quotas.

My guess is that while many sites in a "seed set" are close to the edge, as I understand it, some of the sites/pages are close to the edge in the other direction... and I assume that Google is testing also to make sure that certain sites it's identified as wanting to retain in its index aren't hit by false positives. Thus, I'm thinking, some of the testing is making sure that "good" sites included in the seed set aren't eliminated from the results.

It may also be that some sites have lots of good qualities that attract searchers, but that searchers are leaving at the buying stage, and Google keeps these in to continue testing them. Beyond that, in simply trying to classify certain types of sites, there isn't always a good or bad... just pages that are appropriate for one type of search or searcher or point in the conversion cycle, and not appropriate for another.

If Google was seeing that the wrong sites were getting eliminated for a given test, Google might need to revise its seed set for that test.

I would love feedback on this... These are my assumptions, and I'm not a search engineer.

(As an aside, this may be the kind of problem Google has been having with making both Panda and Penguin self-updating... that the seed sets weren't working as hoped, and known desirable sites/pages were getting eliminated.)
9:52 pm on Nov 13, 2015 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:11832
votes: 284


Something else to add... This doesn't explain why this phenomenon is being seen on paid as well as organic. I believe there's an indirect/unintentional dependency between organic and paid (e.g. if good results appear in paid, then the user will click before reaching organic results), but any thoughts on that side of things?

Again, theory only... I gave this some thought when it appeared in the other thread (which unfortunately got too noisy to have ongoing discussions), and my guess is that this may have to do with RankBrain, and where in the processing chain the query gets rewritten. It may be that so much interpretation is required to refine the query to make it understandable to the rest of the algorithm that Google is using the refinement, rather than the raw text string typed in, both for organic and for AdWords. I'm theorizing that organic and AdWords would otherwise be disconnected... but the Artificial Intelligence used for refinement, which is a self-correcting system, would be going through the same correction cycle for both organic and AdWords.

My guess is that this refinement cycle would not be connected to the AdWords ranking algo (which is what was suggested by Yukko), but there would still be a cyclic correction factor, because AI is cyclic. Google might have to take its RankBrain refinement signals perhaps just from organic... or else to have two separate systems to eliminate the effect of ad click-throughs... and I strongly believe that a degree of separation is implicit in the integrity of the system.

In any case, to keep the systems separate, I am assuming that Google would refine the query before the AdWords algo touched it.

I would love feedback on this too, but please... let's not debate whether Google is connecting the two to sell more ads. That's a separate thread, and it's a Business practices discussion.

In this case, the question I'm asking is how Google keeps them separate enough that the "church/state" separation is maintained.
11:09 pm on Nov 13, 2015 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:7525
votes: 507


Google has an Establishment Clause? (I'll be sending a bill for the coffee sprayed over my monitor!)

These are interesting speculations, Robert, and happy to have them to chew and digest.

As with any organism (and G is one, including a "brain"), things change with age. With any maturity achieved, different signals will produce different results, even if the signals are virtually the same as, say, six months back. This might indicate growing pains as G approaches its "college" years and is working toward graduation before embarking on a PhD or Master's.

Or the data set has become so bulky and cluttered that some form of reason needs be applied.

And we, webmasters and users, are the data set, and that is constantly evolving and expanding, and includes other languages as well. If one thinks about it just a bit, it is a rather large and complex design, guided by faulty humans and (so far) brain-dead but highly accurate and fast computers.
1:51 am on Nov 14, 2015 (gmt 0)

Full Member

Top Contributors Of The Month

joined:Sept 28, 2015
posts:273
votes: 171


@RobertCharlton Regarding Google using seed sets and the choice of sites, yes, we're suggesting the same thing. As you say, there would need to be sites on both sides of the line with some sites being correctly placed and others intending to be moved to ensure an exhaustive test set. However, this does seem to contradict how Google said Panda 4.2 would roll out as the implication was the scores and algo were set in stone a while ago, not that there would be any testing/refinement.

Regarding where RankBrain sits in the query refinement process and whether it impacts paid, I raised a similar point on SER. Search intent (e.g. transactional vs informational) is certainly used to determine which widgets appear (Shopping, Search Ads, Images, Snippets, etc) so if RankBrain is now involved, then it follows that it may impact which/when paid widgets appear. If paid widgets are now intermittently (e.g. to A/B test) appearing in serps that would have been considered purely informational or navigational pre-RankBrain, that would explain an on/off reduced paid conversion rate.

Regarding the cyclic AI correction factor, yes, data arrays would presumably be shared across organic and, say, shopping results, so each would end up learning from the other and there would be an indirect dependency. What isn't clear is (1) why this only seems to affect some sites, (2) the on/off nature of the conversion rate changes.

Perhaps Google has defined a single set of test sites and is using them to test multiple algo changes across both organic and paid.

And there are also people saying this on/off phenomenon is being seen in Amazon, eBay, Etsy, etc. I don't have experience of that but, if true, that would completely change everything again.