Forum Moderators: Robert Charlton & goodroi


"ZOMBIE TRAFFIC" Separating fact from fiction & emotion

         

FishingDad

4:20 pm on Nov 10, 2015 (gmt 0)



This recent discussion about "ZOMBIE TRAFFIC" is just utter nonsense. What are people saying, anything worthwhile, or just a communal <snip> because sales are down from the norm? The talk is firmly in tin-foil-hat territory.

Are you talking about SERPs? If so, why? If your positions are dropping, then that's that. If your positions aren't dropping, are you seriously saying Google is sending you people they know will not buy from you!? REALLY?!

Are you talking about PAY PER CLICK? If so, then you're talking about possible click fraud, aren't you?

Given any period on the internet, people buy or they don't, and there are many, many factors why they will one day and might not the next.

[edited by: goodroi at 5:55 pm (utc) on Nov 10, 2015]
[edit reason] Let's be careful to keep the discussion on a professional level [/edit]

Shaddows

5:30 pm on Dec 10, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Our zombie traffic has declined substantially since 2012, but we have changed an awful lot.

We now include things like videos and tutorials. Also, we have massively increased repeat business, and shifted our offering away from consumer to B2B (used to be 50/50, now 80/20).

All these things will have had an impact. Fewer consumer products means fewer pages for Joe Public to land on (businesses being less likely to be doing research). Videos and tutorials will engage a proportion of traffic that was mismatched. Repeat business means people are looking for us - and yes, people search for our brand name on Google.

glakes

11:44 am on Dec 11, 2015 (gmt 0)



Yesterday I saw Google traffic down significantly and conversions up substantially. This was the first time that traffic went down and conversions went up since late September when the zombie mess started. Others suffering from the same problem had a similar up day as reported in the December Google SEO forum. Today, so far, traffic is up and conversions are down. A return to zombie land it seems.

Strange how I and others seem to be wired to Google's same on/off switch.

masterjoe

4:09 pm on Dec 11, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Seeing the same thing once again, Glakes. No action so far, and I likely won't see anything until the next non-zombie day, which will probably be around the same time as others here... The question is, what else is there to do? It looks like there have been no solutions since ~2012 that will bring back regular traffic. The closest thing to a solution sounds like what Shaddows had to do: change operations to maximize the traffic you already get.

petehall

7:51 pm on Dec 12, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Zombie traffic... generated by negative SEO robots which click around the other 9 sites in the top 10 of the one you're trying to demote.

glakes

9:45 pm on Dec 12, 2015 (gmt 0)



Zombie traffic... generated by negative SEO robots which click around the other 9 sites in the top 10 of the one you're trying to demote.

Very unlikely on my site. We block bots and use a service to block bots. Google blocks bots too. If it were indeed bot traffic, Google would have credited my Adwords account for all the junk clicks. They did not do so on their own and did not do so when I complained about the quality of traffic they were sending. Google stuck to their guns in claiming the traffic was legit, yet offered no explanation as to why traffic quality can vary greatly from day to day. Or, in my case, one good day a week and the rest rubbish.

Furthermore a number of us see similar on and off patterns. The likelihood of bots targeting all of our sites with the same schedule is rather low.

petehall

3:39 pm on Dec 13, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google blocks bots too.


Where are you getting your information from?

And you think it's possible to block robots coming from random IPs via Google's results? Would you care to explain how you block this, as it's something I would like to implement, quickly.

Awarn

10:24 pm on Dec 14, 2015 (gmt 0)

10+ Year Member



Here is just a thought. I know I get more people who say, at times, that they can't add items to the shopping cart. I have traced it at times to IE. Maybe P3P issues or something else. A customer earlier said they couldn't, and that they were using a Kindle and Silk. Could this lack of conversions and these apparent zombies be from shoppers having issues like that? It seems that when IE went through some updates around 2012 to increase privacy, they may have hurt functionality.

glakes

2:06 am on Dec 15, 2015 (gmt 0)



Where are you getting your information from?

Personal experience. Have you ever run a number of Google searches and seen the captcha box pop up? Google does block automated queries. You can verify this by looking at the people complaining about scraping Google results, how many proxies they burn through, etc. It's not 100%, but Google is filtering out a lot of the garbage. What Google doesn't catch, we try to.

And you think it's possible to block robots coming from random IPs via Google's results?

Absolutely. Though we can't catch all of them, we do block server farms and countries that do nothing but spam. We also run a honeypot, and if a bot hits the links it is automatically added to the .htaccess block list. Legit bots are whitelisted.
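For anyone curious how such a honeypot-to-blocklist flow might look, here is a minimal sketch. The trap path, the whitelist entry, and the deny-rule format are illustrative assumptions, not details from this poster's setup; a real deployment would hook into the web server and verify "legit" bots properly.

```python
# Honeypot sketch: any client requesting the hidden trap path gets an
# .htaccess-style "Deny from" rule appended, unless it is whitelisted.
WHITELIST = {"66.249.66.1"}  # e.g. a verified crawler IP (illustrative)

def handle_request(ip, path, htaccess_lines):
    """Return the updated .htaccess lines after seeing one request."""
    if path == "/honeypot-trap/" and ip not in WHITELIST:
        rule = f"Deny from {ip}"
        if rule not in htaccess_lines:  # avoid duplicate rules
            htaccess_lines.append(rule)
    return htaccess_lines

rules = ["Order Allow,Deny", "Allow from all"]
rules = handle_request("203.0.113.7", "/honeypot-trap/", rules)
print(rules[-1])  # → Deny from 203.0.113.7
```

The trap URL would also be disallowed in robots.txt, so only crawlers that ignore robots.txt ever hit it - which is exactly the population you want blocked.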

Would you care to explain how you block this, as it's something I would like to implement, quickly.

There are a lot of bot-blocking services out there. ShieldSquare is one I've seen, and there are many plugins for WordPress. Wordfence is one that claims to block bots and other security threats on the fly, and it's free.

Though nothing is going to be 100%, taking out 80%+ of unwanted bot activity can save a lot of grief down the road - especially when automated scrapers try to repost your content to places like Blogspot.

toidi

1:49 pm on Dec 15, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Google blocks bots too. If it were indeed bot traffic, Google would have credited my Adwords account for all the junk clicks. They did not do so on their own and did not do so when I complained about the quality of traffic they were sending.



You might want to read this

[webmasterworld.com...]

masterjoe

6:10 am on Jan 7, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



Has anyone noticed a slight increase in non-zombie traffic? It seems to be "okay" for me now. I tend to get conversions spaced out over a few hours, every day or at least every other day now... instead of rather nice streaks of conversions on non-zombie days occurring 1-2 times weekly. This is still ongoing, but I feel there has been a slight lift in conversions of late. I have been getting more email signups too.

JS_Harris

12:53 pm on Jan 7, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have a website with 15 years of traffic data available. I can tell you that, without a doubt, the "type" of traffic Google sends can and does change. Not only the type but the timing as well: if Friday afternoon is the peak traffic period today, it might very well be slowly shifted so that your site peaks on Monday afternoon, despite the fact that your own data suggests Friday traffic converts and Monday traffic does not. Another site you own may not receive that same shift, suggesting that in reality Friday is still the peak, just no longer your peak from Google.

All that being said, there is no conspiracy; there are many possible reasons this might happen, especially now with customized and device-separated results.

Nutterum

12:37 pm on Jan 13, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



Something I'd like to ask those seeing zombie traffic: have you seen it more with the weekend update?

timemachined

4:53 pm on Jan 13, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



I have a question for the zombie watchers on here too (I don't know which thread this should go in).

Do you have pages showing in G that are ranking for an on-page anchor keyword link instead of the terms you wrote the article for?

i.e. the anchor text used on a 'Blue Widgets On Jupiter' page is 'Spanish Widgets Blue On Mars', and the Blue Widgets page can be found in search by entering 'Spanish Widgets Blue On Mars', just because G ranked it on that anchor text rather than on the actual Spanish Widgets article, which perhaps has no embedded link on the page (the correct page actually has two related anchor links in the Spanish Widgets article).

Example: Google Page 2 / 12th searching for Spanish Widgets Blue On Mars

Blue Widgets On Jupiter | Website
website / Blue Widgets On Jupiter /
19 Dec 2015 - This post has metas and is all about Blue Widgets On Jupiter. Yet the Big G indexes and ranks the article on the SOLE link at the bottom ... Post navigation. Spanish Widgets Blue On Mars ...

Yes, that page is linked from the home page, and on the 8-year-old website mentioned further down, I used to have latest articles in the sidebar. But again, G indexed pages based on the sole anchor text in the sidebar instead of the actual article, and they were irrelevant to the search. It got so bad that I have had to remove any sidebar mention of other articles with links. This has been going on for at least 18 months. Only numerous Google fetches correct the problem, if at all.

This index fault may create traffic without the user being bothered to find the right page. While the search is related and on the same topic, it's irrelevant when the user is looking for something else.

Why I ask is this: I am getting really fed up of G ranking pages based on the anchor keyword text in another article, or at the bottom of an article, despite keyword metas, an on-page keyword ratio of 3%, and related words.

Yet disturbingly, it chooses to index based on the keyword anchor link instead of the relevant article. How stupid is this? Why on earth rank a page based on an anchor text link to another article, ranking the page the link sits on for that term, rather than the actual article about the subject? This isn't new, but it appears to be worsening.

It's no wonder some websites go as far as to keyword-anchor-link from a page to itself. I'm now going through the WP theme editor removing the previous/next post navigation link HTML, as this is infuriating. My content is good enough to rank the right page; why does it rank the wrong, irrelevant page? It is maddening!

Incidentally, this is on a month-old website, and it also occurs on an 8-year-old website. For all the claims that G is intelligent and very clever... yes, but not on this.

Reasoning: you all have different ways of explaining zombie traffic. Reference the post by heisje, who quoted tedster: "After 3 years of analysis, "zombie" traffic identified as mobile traffic to mobile unfriendly pages"

To me, zombie traffic would be G searches by searchers deemed to be idle. If it's bot traffic, just call it that: log in to Webmaster Tools and see all the traffic generated by bot spammers who claim to have visited your website but never actually have, just to get you to visit their website and sell you something.

Now, G can't control and eradicate that from appearing in stats, but it may be possible for them to display different results to active online buyers versus those just seeking out information and not known to buy. That's what they're all about. What could create more zombie traffic? The idiocy of G, in my view: ranking irrelevant pages based on a sole keyword anchor link for a different page, as detailed above.

8-year-old website with 1000+ indexed pages, or month-old 30-page niche site: same problem. In my view, if your website is indexing in a similar way, based on an anchor text link to an irrelevant page, most users would click the back button rather than figure out that Google mucked up and indexed the anchor link and that they should click that instead. Not only zombie traffic, but sale-sucking vampires.

EditorialGuy

5:50 pm on Jan 13, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Why I ask is this: I am getting really fed up of G ranking pages based on the anchor keyword text in another article, or at the bottom of an article, despite keyword metas, an on-page keyword ratio of 3%, and related words.

A while back--maybe two or three years ago--Matt Cutts spoke of focusing on "things, not strings" (i.e., concepts, not keywords).

As for meta keywords, hasn't it been years since Google and the other major search engines paid any attention to those?

timemachined

6:12 pm on Jan 13, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



Yes, but there's "things", and then there's ranking an entire article based on a single keyword STRING anchor link anywhere on a page, and not even on the best-matching page. G never used to do that; it's appeared in the last few years, and it really screws up a website's indexing.

I don't know why it persists, and it could be responsible for an increase in back-button presses. It's so bad that I'm considering not having any links on the homepage at all (magazine style, where latest posts go). I've already removed them from the sidebar, and I'm also considering removing sitewide links.

Imagine: a new post goes to the homepage and the sitemap. G sees it on the home page, so what does it do? It includes the home page for that new term, or another page linked from the home page, instead of the actual page. What's the result? The post ends up further down the results... Use Google Fetch and hey presto, that page gets indexed based on the actual on-page relevancy, and the correct page ranks. To me, if it takes a manual action to correct, that means G isn't working properly. Even after a fetch, the fetched page can drop out. Stupid situation.

And meta descriptions still feature heavily in search, as do titles. As for meta keywords, I don't know, but imagine them coming back (if they went away) and having to add your own manually - yes, 'All in One' etc. will auto-add according to on-page relevancy, but I only add two anyway, so it doesn't take up much time. G switches new and old features on and off (authorship etc.); you can't rely on what they say at any given time.

EditorialGuy

6:59 pm on Jan 13, 2016 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



As for meta keywords, I don't know, but imagine them coming back (if they went away) and having to add your own manually - yes, 'All in One' etc. will auto-add according to on-page relevancy, but I only add two anyway, so it doesn't take up much time. G switches new and old features on and off (authorship etc.); you can't rely on what they say at any given time.

I doubt if they'd go back to a signal that's so easily spammed. Besides, for every page that has a meta keywords statement for "fuzzy red widgets" or "shiny green whatsits," it's likely that there are many other pages that feature the same keywords or keyphrases. (If everybody's wearing a blue suit, saying "Meet me in Grand Central--I'll be the guy in the blue suit" becomes meaningless.)

Simon_H

7:17 pm on Jan 13, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



@timemachined We see this too on ecommerce category pages. It's annoying. To provide a simplified explanation... say there's a category page on 'trainers', which links to 'red trainers', 'white trainers', etc. Because the 'trainers' page has anchor text to the 'red trainers' page, then the 'trainers' page comes up in the serps when the user searches for 'red trainers', and the 'red trainers' page doesn't come up in the serps.

A couple of things to consider.
(1) Have you been hit by Panda? It could be suppressing the lower level pages, which is why the higher level pages are shown instead.
(2) I don't think this is related to zombie traffic. Zombie traffic switches in and out on a daily or near-daily basis. Whereas what you're talking about is just overall reduced traffic.

timemachined

8:14 pm on Jan 13, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



@Simon_H: and further down the results, whereas if the correct page were being indexed it would be higher up. It makes you want to write about an Einstein term, chuck in a link about Kafka, and rank for a Kafka term. It doesn't make me want to do that, but I might as well, given G's criterion (a single element) for ranking a page right now in this particular situation.

If there was ever such a thing as a sandbox, then yes, maybe Panda - this exact occurrence is happening on a new site too, but I can't recall when it actually started on the 8-year-old site. I do have 'empty' pages which may have caused a penalty, but not on the new site. So it would have to be both a Panda penalty and a new-site penalty, which seems strange.

I think this actually started before I added the empty 'automated' pages, as I removed the sidebar recent-posts widget for this exact reason: it was indexing based on the single link on a page rather than the content. Though that had a bonus too, in that G would index on long strings and I'd show up for content pages I didn't have, via an amalgamation of terms across two pages - yes, it's that silly. If anyone knows for certain this is Panda, then why does it happen on a new site too?

But if it was Panda or some other animal penalty, why would most pages with content rank correctly and be on the first page, whereas new pages, also written to be on topic and linked from the home page, are indexed incorrectly?

I was kind of leaning toward thinking there was an inbound-link versus indexed-page count. I recall something from years back: x amount of links = x amount of indexable pages. But that's what I can't figure out. Some pages do index correctly, some don't. If I use G Fetch, most change immediately and remain.

As for zombies or vampires, I think that would depend on whether the datacenters with the different algorithms were switching between live results, as they seem to do. With zombies, also remember that most people will be on personalized search unless they've switched it off via settings, the app, or manually with &pws=0. If at any point personalized search is tweaked by G, that could cause a rise or fall in a website being shown.

I guess I'll only know more when my 8-year-old site has more content on those empty pages, or when the new site has been up for at least six months, and whether it's still happening then. And yes, I have linked the sites in GA and WMT, and they probably share the same whois, but I took people's advice that this isn't a problem.

At least you're having the same problem as me; Alex Jones won't be saying it's aliens.

Simon_H

8:40 pm on Jan 13, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



@timemachined I'm not sure what you mean by 'empty' pages, but it sounds like you have quality issues which may point to Panda. And if you've been hit by Panda, there's no point trying to apply any logic to what is and isn't ranking well. Panda is a site-wide filter, but it doesn't apply equally to all pages. Some pages may rank fine for competitive terms, other pages may appear to be missing from the serps. You may even find that the offending pages rank well, where higher quality pages have disappeared.

Do you use Google Analytics on your 8-year-old site? If so, show traffic for as many years as possible, note any dates when traffic seems to suddenly drop or increase, and compare these with key Panda dates. That should give you an idea of whether Panda is responsible. Regarding the new site, I don't know any detail, but there are several reasons why a new site could still be hit by Panda.
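That comparison is easy to automate once you've exported daily session counts from GA. A rough sketch follows; the update dates, the traffic figures, and the 30% drop threshold are all made-up illustrations, not real Panda data.

```python
# Flag sudden day-over-day traffic drops that land near known
# algorithm-update dates. All dates and figures are illustrative.
from datetime import date

panda_dates = [date(2014, 5, 20), date(2014, 9, 23)]  # example rollouts

def drops_near_updates(daily, threshold=0.3, window_days=3):
    """Return dates whose traffic fell more than `threshold` versus the
    prior day AND that sit within `window_days` of a known update."""
    hits = []
    days = sorted(daily)
    for prev, cur in zip(days, days[1:]):
        drop = (daily[prev] - daily[cur]) / daily[prev] if daily[prev] else 0
        if drop > threshold and any(
            abs((cur - u).days) <= window_days for u in panda_dates
        ):
            hits.append(cur)
    return hits

traffic = {
    date(2014, 9, 21): 1000,
    date(2014, 9, 22): 980,
    date(2014, 9, 23): 600,   # sharp drop on an update date
    date(2014, 9, 24): 590,
}
print(drops_near_updates(traffic))  # → [datetime.date(2014, 9, 23)]
```

A match only suggests correlation, of course; Simon_H's point stands that you still have to weigh it against other changes made to the site around the same time.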

But, again, probably best to continue on a different thread as I don't think this relates to zombie traffic.

Babadook

2:47 am on Jan 17, 2016 (gmt 0)

10+ Year Member



I often wonder about the zombie traffic we see in search and what some single friends have told me about dating sites. Dating sites have a steady stream of zombies - that is, fake profiles of people who don't exist. The dating zombies show up as people viewing your profile. It's not real, but the illusion is there to make you feel better. The same thing goes on in search. Evidently there is a percentage of bots just playing with themselves. What if a computer creates pages that get a good ranking, as they do, without human interaction, and then a zombie visits them? What would that be called?

timemachined

8:40 am on Jan 17, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



Zombie Love?

Tyme

1:07 pm on Jan 17, 2016 (gmt 0)

10+ Year Member



My theory.
I have a page titled 'Gross big green widgets'. The page is full of gross big green widgets.
Joe Surfer is looking for tiny little blue widgets.
Google, for reasons unknown, chooses to rewrite my title to 'Nice little tiny widgets'.
Joe Surfer clicks on the rewritten title in the serps and lands on the gross big widget page. He stares at the page for a few minutes and tries to find out where the heck the nice little widgets are... Poor bugger never finds them...
Could this be a cause of zombie traffic?

Babadook

1:26 pm on Jan 17, 2016 (gmt 0)

10+ Year Member



@timemachined

Yes, zombie love. But it is most often very unsatisfying for them. The only time you know the pinnacle has been reached is when the server slows or crashes.

Nutterum

2:30 pm on Jan 20, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



I agree with Tyme.

Most times, especially with small-traffic sites (fewer than 100 visits per day on average), even something small, like Google rewriting your title, can cause unwanted traffic that just roams and does not convert. Another possibility I have experienced in the past comes from the company blog I used to run. There was an article that was relevant to the service offered, but in a non-direct way. Suddenly this blog post appeared in 4th spot for a keyword with over 120,000 searches per day. Imagine my surprise and big smile when I found out. However, overconfidence is a slow and insidious killer: the traffic was abysmal. Granted, I had a small increase in total conversions, but the conversion rate from this traffic was <0.1%. I am fairly sure there are people here who prefer to jump on the bandwagon, labeling their lack of in-depth GA experience "zombie traffic".

I do, however, have colleague webmasters who have shown me zombie traffic working like clockwork. They could predict conversions, and almost to the letter traffic and bounce rate, for the next 5-6 (sometimes 10) days in advance. Scary stuff to say the least, and in my mind a distinguishing factor between "fake" and "real" zombie behavior.

dipper

8:55 pm on Jan 20, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



Is it possible that Google wants to give a bit more traffic to your website to see how it performs, and when it doesn't do as well as they would like, they pull it back to give to sites that "convert" better? If you frame your thoughts this way, then you'd always want to be included in the testing Google does, and always get this zombie traffic, just in case you nail it and Google decides you deserve it permanently because you're "converting" it well. Having more traffic from Google, whether it converts well or not, is always a bonus imho.

masterjoe

3:56 pm on Jan 22, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



I spoke too soon; the zombie traffic is back, and I seem to get strings of sales and then days of nothing once again. I noticed for a few keywords that I have pages far more relevant than the pages ranking for those particular terms (to add to the previous discussion). It's beyond stupid; there should definitely be more weight on what's actually on the page, rather than deeming it overoptimized or whatever else they think is wrong with it.

I don't write for certain keyword densities or anything else; the keywords I am trying to rank for are naturally placed within the content. If this is contributing to zombie traffic, then I expect a fix soon, because the results are ridiculous at the moment. Old sites and spammed PBN-linked crapfiliate websites are all over my niche.

Simon_H

4:03 pm on Jan 22, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



@dipper A key criterion of the zombie phenomenon is that traffic *doesn't* increase. The traffic volume stays the same; it's just that during zombie periods, the traffic seems to be junk and doesn't convert.

Simon_H

7:05 pm on Jan 22, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



Google's Andrey Lipattsev did a Q&A very recently, where he specifically revealed that Google is constantly experimenting/testing on live. He adds "We have on one hand the big launches, so an engineer sits down ... runs some tests, see how it affects a sample of results... The launch committee says okay, let’s launch this." The emphasis is on "sample of results", the point being that Google has a very limited test environment, instead using the live organic serps as a permanent beta environment.

I think this may be key to the cause of zombie traffic. Given that Google does significant testing on live, they *must* have testers, whether human or bot or both. It is extremely unlikely they'd deploy an experimental change and then completely ignore it. And their testing would be structured with a set of test sites and some kind of test schedule; I doubt they'd limit their testing to people informally trying random searches and guessing if the results seem ok.

I also think that this leaves Google with a major dilemma. On one hand, they need to test on live, and testing must also involve clicking on results in the serps, because we know that user metrics play a part in ranking. But on the other hand, Google doesn't want sites to suddenly see a huge surge in organic traffic due to testers hitting the site during a period of testing. The only option would be to enforce traffic quotas such that their test traffic replaces real traffic, keeping total daily traffic consistent from day to day. The switch from real to 'fake' users wouldn't be noticeable to many sites, only to those that measure conversions, e.g. ecommerce sites.

I think this may explain the zombie phenomenon. I think that there may be specific test days and test periods when Google's network of testers (human and/or bot) are testing the algo updates on a specific set of test sites. Which is why groups of apparently unrelated sites appear to experience zombie traffic on the same day as each other.

I also think that this testing is leaking through to paid. Certainly, if there is a network of testers (human or bot) that have been asked to run specific searches and then click on a specific test site in the serps, I can see the same site being inadvertently clicked if it appears as a paid result.

Any thoughts?

mrengine

8:55 pm on Jan 22, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



He adds "We have on one hand the big launches, so an engineer sits down ... runs some tests, see how it affects a sample of results... The launch committee says okay, let’s launch this." The emphasis is on "sample of results", the point being that Google has a very limited test environment, instead using the live organic serps as a permanent beta environment.

Any thoughts?

The engineer is not sitting down for months running tests. Anyway, a big algorithm update just occurred, so the sites that were the test subjects should be released back to *normal* traffic patterns. Yet people are still dealing with Google zombie traffic. I don't see how there can be any relationship between Google's live testing and zombies.

Simon_H

9:11 pm on Jan 22, 2016 (gmt 0)

10+ Year Member Top Contributors Of The Month



@mrengine Perhaps I have abbreviated the quote in a confusing way. Andrey implies that the engineers only carry out limited tests in the dev environment before the update is released. Once released, I doubt the engineers would be heavily involved in testing.

And it's a huge assumption of yours that all test subjects should now be seeing normal traffic. Google carries out frequent updates to the core algo, both minor and major, which need testing at any time. Not to mention that Penguin is due imminently. So I disagree that Google no longer has a need for algo testing.
This 396-message thread spans 14 pages.