| 4:39 pm on Nov 29, 2012 (gmt 0)|
Wait a sec, what's happening here is good. TMS has thrown a really excellent, data-based "test" at the theory, and backdraft7 and Shaddows both responded with the relevant information: their sites only started doing this a couple of years ago, and it seems to get worse around known Google updates. Those responses right there: if the rest of the thread hadn't already convinced me what's happening is Google related, those would have.
Now, whether it's something Google's doing on purpose or a side effect of other stuff, I don't know. But does it matter? The effect is the same regardless of the intent.
I think there's a lot of data supporting this theory, and it might be time to accept that it is happening, and it is Google related, and start talking about how to get past it. At the end of the day, that's what matters, isn't it?
|Martin Ice Web|
| 4:42 pm on Nov 29, 2012 (gmt 0)|
diberry, nice writing. Maybe we should look for someone who does not have the zombie and on/off periods? Somebody out there?
| 5:06 pm on Nov 29, 2012 (gmt 0)|
"How to get past it" mainly depends on what causes it... and that, I think, is what we're trying to figure out here, so we can react to it...
| 5:33 pm on Nov 29, 2012 (gmt 0)|
|Maybe we should look for someone who does not have the zombie and on/off periods? Somebody out there? |
I don't have it (anymore at least, not sure I ever did), and I don't mind offering up some anonymous stats as controls in any future experiments.
|"How to get past it" mainly depends on what causes it... and that, I think, is what we're trying to figure out here, so we can react to it... |
Well, yes and no. It's kind of like dealing with a penalty. If you've tried and tried and you can't figure out the negative signal you're sending that Google's penalized you for, you can always try to create more positive signals and see if that helps raise your traffic despite the penalty. It doesn't mean you stop trying to figure out the penalty - it just means you're spending your time on what you know is likely to work (creating more positive signals) and putting the lower ROI work on the backburner.
And we don't all have to take the same approach. I'm not saying we should stop trying to understand why Google's doing this. I'm just saying I think the theory holds water, so now it's time to also tackle the question of what people can do to save and enhance their businesses despite it.
| 6:00 pm on Nov 29, 2012 (gmt 0)|
"I think the theory holds water"
then welcome to the club of deep insight in SE behaviours :-)
My current theory is that some signals or a penalty cause G to test out our site or specific pages. It looks to me like G has some method to simulate users: when looking at heatmap records, the visitor behaviour is similar... and there is never a conversion or sale from those visits... they are not behaving like our average user!
But I think this sounds a little like SF :-) or maybe we're just getting attacked and someone is messing with our user experience / bounce rates.
All the reported visits are G referrals from different search sites, unique IPs, hosts and so on...
It might also be a correction test if some of our pages went straight up to the top of the SERPs; something like G trying to find out whether the page is useful to all the users it is referring to us... and as this needs to be figured out fast... it tests it with 1k zombies...
My suggestion is to monitor the pages the zombies are visiting (are these new posts or old posts, do they have a wide target group or a specific one?) and to monitor user behaviour as well, plus SERP changes and overall traffic to the sites when it's OFF. Maybe it's just that you have created a high-quality post and you reach a wide audience with it, but the post is actually only for a niche...
Well, I know I'm not that smart :-) but nothing more comes to my mind at this moment...
| 6:42 pm on Nov 29, 2012 (gmt 0)|
|1. Why you make a competition out of the observation? |
I didn't... You did.
|2. Overcome sales/loss of zombies or what? Sales/loss of zombies does not exits, as they are zombies or you think you can convert them? |
If you don't have sales for a certain part of the day(s) that you would have sales for if the traffic were the same and not 'zombie', you most certainly are losing sales.
And, if you're not losing anything, then what's the point of this thread? To figure out how to fix something that's not broken and doesn't cost you anything? Doesn't sound very productive.
|You referring with THEY to G or do you? Well, I duno if this phenomenon is controlled by G, G employees, algorithm, animals or has nothing to do at all with G, thatís what I try to figure out here |
Why does it matter? You already stated you aren't losing anything because of it anyway...
| 7:17 pm on Nov 29, 2012 (gmt 0)|
What's the purpose of your postings here? You might want to post your wisdom in some other threads; I'm sure you are welcome there...
Losing bounce / user-experience signals to Google is a big loss, and you would understand that if you were really interested in this thread...
If you feel the thread is not appropriate, or you disagree, you might want to formulate your disagreement a little more maturely and not start insulting people here.
| 7:18 pm on Nov 29, 2012 (gmt 0)|
Either you're losing something and you need to find a way to overcome it (maybe creatively, or through other means), or you're not losing anything. And I cannot imagine why it would matter to anyone whether there's zombie traffic or not when there's no loss, because if there's no loss, figuring out whether it exists will not result in any gain.
| 7:28 pm on Nov 29, 2012 (gmt 0)|
|Whatís the purpose of your postings here? |
To give people a different perspective on the situation, or alternate ideas they can explore. I didn't see the pattern myself until I looked at the numbers from 3 of the 5 businesses and sites I had access to.
Sometimes people get caught up in a situation and 'can't see the forest for the trees', and a different perspective or position can sometimes help them through that.
If you don't like my posts, don't read them ... It's definitely not a requirement for you to read every post in every thread.
|Losing bounce/ user experience signals to google is a big loss and you would understand that if you are relay interested in this threat..... |
Really? So people are reporting zombie traffic and that's impacting their overall rankings... Totally defies logic and reason... First, Google has stated repeatedly that bounce rate is a very noisy signal and that they either don't use it or use it as a very small factor... Second, if all their rankings were impacted due to zombie-traffic bounces, then they wouldn't know they had zombie traffic, because they wouldn't have traffic from the other parts of the day, where it's not zombie traffic, to compare it to.
### # ###
Maybe someday you should read some of my other posts in other threads rather than trying to send me there?
| 7:40 pm on Nov 29, 2012 (gmt 0)|
You might want to rethink the suggestion you made to backdraft to open the I DESERVE MORE GOOGLE TRAFFIC thread, and open an "SE BEGINNER GUIDE" thread instead; that would save us a lot of time reading through your posts, and you would attract the audience you are searching for.
Google is up to something that causes zombies; that's my opinion... the why and how we're going to need to find out...
And if you think that 1k zombies per day doesn't influence your bounce signal / user experience, why don't you visit the blackhat world and find someone who can send you some visits to mess with your bounce and user-experience rating... doesn't harm, right?
Another tip: while I don't worry about possible lost sales, others in this thread might...
So thank you for joining the thread, and I hope you will come up with something useful!
| 7:50 pm on Nov 29, 2012 (gmt 0)|
WTF are you talking about?
If you actually read my posts you would know I said Nothing of the Kind!
Backdraft7's been here for years and I respect his opinion, unlike that of those who don't actually read my posts and then try to put words in my mouth.
| 7:59 pm on Nov 29, 2012 (gmt 0)|
rofl, he edited the post and deleted the statement backdraft and I quoted...
come up with something we can use
| 8:03 pm on Nov 29, 2012 (gmt 0)|
It wasn't me you quoted... I didn't say that. (There's no way I'd make a statement like that, especially to backdraft7.) I tried to figure out who posted it when I saw the quotes, but think as you like... You've obviously got it all figured out.
| 8:12 pm on Nov 29, 2012 (gmt 0)|
Sorry, my bad, I do apologize to you! It prolly was a zombie saying that :-) ... and now he is gone... it's not related to you. Prolly related to the way I'm observing posts and monitoring poster behaviour...
| 8:15 pm on Nov 29, 2012 (gmt 0)|
I can see why so many long-term posters have left here; witch hunt springs to mind.
FYI, I too have run many online and offline businesses. Some days you'll sell 50 of something, some days you won't; that's life.
Unless you can come up with actual facts, you all shouldn't be so harshly discounting anything.
[edited by: Dave_Hybrid at 8:18 pm (utc) on Nov 29, 2012]
| 8:17 pm on Nov 29, 2012 (gmt 0)|
|The general "I deserve more Google traffic" thread might be more appropriate for your post. |
Attention to detail is important.
|I can see why so many long term posters have left here, witch hunt springs to mind. |
Yeah, it's a bit too much for me sometimes too, which is one of the reasons I took quite a while off from posting here ... Someone putting words in my mouth doesn't sit very well, but it says a great deal about the credibility and attention to detail of the poster, which, in my opinion, may very well carry over into other areas.
| 9:34 pm on Nov 29, 2012 (gmt 0)|
This last page is so confusing!
SEChecker - you've been lumping TMS in with bluntforce, whose post is still there, unedited on Page 9. TMS is NOT dismissing the theory or suggesting anyone's looking for excuses or just whining or any of that. He is asking probing questions that helped backdraft7 and Shaddows construct responses which made it even MORE clear that whatever this is, it's Google related.
TMS is trying to help. Please go back and re-read bluntforce's comment and then re-read TMS's separately. He's offering some exceedingly valuable data - patterns observed in not just several websites, but several on- and offline businesses. This is exactly the sort of data we need, and I feel you've misjudged his intentions here completely.
| 9:40 pm on Nov 29, 2012 (gmt 0)|
Thanks diberry ... Glad someone got what I've been doing and I know people like shaddows and backdraft7 do, because I've been reading the level of their knowledge in posts for years and they're both 'thinkers', definitely not the type to throw their hands up in the air and quit or stop trying to find a solution to, or a workaround for, an issue.
I'm actually glad I could help them out a bit, even if it's just to 'spark' some different thoughts or ideas or data sharing, because I'm sure they've helped a ton of people here over the years with their contributions, so I'm happy if I can help them out somehow.
| 10:47 pm on Nov 29, 2012 (gmt 0)|
Glad to see the smoke cleared on that one! TMS is correct in saying that it's not like we're not trying. Most of us have been around since well before Florida and have survived thus far. If we sound cranky, sorry.
| 11:03 pm on Nov 29, 2012 (gmt 0)|
I'd sound cranky too Backdraft7, for sure, and this isn't my first user name either ;)
Part of the reason I jumped in is what I saw years ago but didn't post about at the time. I partly decided to now because this discussion alone is 300+ posts with no solution, fix, or workaround presented from what I've read, and the discussion I skimmed still seems to revolve around the 'if' rather than a 'fix'.
'If' doesn't seem to be anywhere near as important to me in this type of situation as a 'fix', so assuming it's happening and assuming it's going to keep happening, what's the workaround to the situation?
I know the businesses I worked with made some adjustments when we saw the cycles. One was to seriously dampen PPC spending during the 'sales drop' periods, then ramp up again just before the purchasing period and begin tapering off just before the end of it, to save on spending when we knew there would be no corresponding sales. The items they sold had a longer sales cycle than some 'buy it now' type purchases, so that exact pattern might not benefit everyone, but I think it's something to look at, especially if there's a definite, predictable 'cycle' to the on/off periods...
ADDED: Basically, we really 'shuffled' the PPC spend budgeting to maximize the purchasing period we saw and let the 'down time' go... The highest-spend days were actually just before the sales period started and the first couple of days of purchasing, because we knew there was a longer sales cycle, so a heavy spend going out of the purchasing period didn't make much sense.
Shaddows' observation about the InktomiBot is interesting too, in my opinion.
| 11:53 pm on Nov 29, 2012 (gmt 0)|
|And if you think that 1k zombies per day doesnít influence on your bounce signal/ user experience why donít visit the blackhat world and find some1 who can send you some visits to mess with your bounce and user experience rating... doesnít harm or? |
Google doesn't have access to your server logs or visitor data, so having a black-hat send you some 'zombie visitors' that put a visit in your logs with Google as the referrer doesn't harm anything WRT rankings. And again, Google has said for years that bounce rate is a noisy signal they either don't use or only use with little impact on rankings.
Click-thru, click-back, re-click (a subset of bounce rate) is more reliable and could be used. But even if you could find a black-hat with a bot advanced enough to parse the AJAX Google uses now and trick their system into thinking there was a real query, a click, a click back to the results, and another click or query when there wasn't, so the 'chain of events' was tracked by Google (not likely, but the only way they would know it happened), there would be a definite pattern to the clicks, timing, site/page(s) clicked, query (or queries), etc., and it would very likely (most certainly) be discounted or removed from ranking impact. Google doesn't like to have their results manipulated, and something like you're talking about would be very obvious behavior in a large dataset; it's as easy for them to see an 'odd pattern' in their data as it is for you in your stats, so there's no way what you're describing would go unnoticed as 'manipulative' and not be 'thrown out' or 'severely discounted' by their system, especially when they have access to things like toolbar data that would show a distinctly different pattern from real visitors.
Also, assuming they're real people who 'just don't buy', you cannot tell what's happening on the other sites (pages) on the same page of the SERPs for the same queries at the same time as yours. A click-thru, click-back, re-click from a real visitor who doesn't purchase really doesn't tell you anything, because you don't know how many sites (results) the visitor is doing the same thing on, so there's no way of knowing whether it influences your rankings. But I don't see people reporting a total loss of rankings when they report zombie traffic, so unless someone has definitive proof otherwise, overall ranking impact from zombie traffic does not seem at all likely.
ADDED: Third, if there were really an impact, the 'traffic drop' from zombie traffic should not be a 'one-time event' but an ongoing slide, eventually to nothing, because the zombie traffic is an ongoing event. That means someone like Backdraft7, who has been seeing an odd traffic pattern for years, should be totally out of the rankings, or very close, by now, but that's not the case.
BTW: Welcome to WebmasterWorld
| 12:44 am on Nov 30, 2012 (gmt 0)|
I've been thinking lately about the idea behind zombie traffic versus traffic that converts.
One thing I have noticed is that Google keeps a personalized search profile for each of its users and, based on that profile, provides different search results. Based on its content, linguistics, link profile, and user history/associations, each website attracts a specific type of user. This may explain why different niche sites under the same owner often exhibit similar behaviors (maybe because of wording, coding, or design layout).
It seems like websites are categorized and associated with different data sets (or user bases). For example, two different sites might be categorized as:
Site A: Good for 18 year old who loves cars
Site B: Good for 20~30 year olds who love cars
Site A and Site B sell similar items or have information about - cars
If, based on content analysis and whatnot, your site somehow gets thrown into the Site A bucket, it becomes increasingly strongly associated with certain user/age groups (more links from social sites frequented by 18 year olds, or a linguistic comment/content profile that fits an 18 year old). You may end up with tons of page views and zero conversions, because 18 year olds have less money to spend but more time to "surf"; they end up just "browsing" and never buying. Whereas if your site gets bucketed with the 20~30 group, you will in turn get more sales.
However, it is my understanding that Google continuously throws different user bases at individual websites as control and testing groups, to test their behavior on certain sites. That could explain certain periods producing higher sales as user bases turn "on or off": Site A may still occasionally get 20~30 year olds who convert, and Site B will sometimes receive test 18 year olds who do not convert. I think the thought is that if a site is bad for certain user bases, the "suitable" visitors will do the work for Google and help them comb through the SERPs; a site focused on 18 year olds like Site A will put off 20 year olds, and Site B will put off 18 year olds.
This fits @TheMadScientist's idea that our real human visitors do act in a sales cycle in the "real world". It would explain why we see odd patterns of traffic and conversion turning on and off without much apparent logic: we lack the data on when and how Google is directing specific visitor sets to our websites.
Maybe "zombies" are simply human beings who do not have the money, but somehow Google thinks "zombie" traffic is what your website is worthy of (or suited for, for a better word). This thought makes sense when combined with Google's coded message, "make your site for your users". It really means: make a website for users with money who will buy. If you do not, Google will throw your site to the less important user-affinity groups.
With this in mind, traffic shaping and throttling make sense as well. The total search volume from a specific user group is limited, and when you break out of one user group and advance into the next, you may see very little change in your overall traffic, or you can actually lose traffic but gain conversion rate. Throttling exists not as a conspiracy theory, but maybe as a by-product of Google's main purpose of serving the right content to specific user groups.
| 12:54 am on Nov 30, 2012 (gmt 0)|
Those are some really interesting ideas frankleeceo ... Glad you posted, thanks for sharing.
| 1:13 am on Nov 30, 2012 (gmt 0)|
Nice post there.
|It seems like that websites are categorized and associated with different data sets (or user bases). For example, two different sites are categorized as |
Site A: Good for 18 year old who loves cars
Site B: Good for 20~30 year olds who love cars
Site A and Site B sell similar items or have information about - cars
Your above statement may be true, but then why does Google seem to think that I personally love Amazon and that they should be at the top of nearly every search I make?
A) I rarely if ever visit the site (haven't in 6 months).
B) I haven't bought from there in over 5 years.
I think some of the "zombies" I see on our site are page previews. Since Google implemented page previews, our bounce rate has gone through the roof. I think Google is counting those previews as bounces.
On another note, I had a conversation with a Google AdWords employee yesterday, because our products on Google Shopping (now a PPC regime) weren't showing up for 99% of the searches they should have, even though Google's tools said they were
"searchable". This individual stated that they're having serious issues with it: "some sites" weren't showing products if they had set up the correct category taxonomy. Yes, I said CORRECT taxonomy (breaking down the subcategories as they define them). And some weren't showing up if they had a UPC code.
So I uploaded a secondary data feed yesterday evening, omitted the UPC, and set only the very top category for our widgets... I was amazed to see that our PPC budget was blown by noon today... a little pissed too, since those clicks did not result in nearly enough sales to justify the spend. I'm going to parse those logs heavily tonight.
I think there are some serious ghosts in the Google machine.
| 3:18 am on Nov 30, 2012 (gmt 0)|
I guess the only way to confirm or debunk this issue is to run a human/bot test on each and every page affected.
Can anyone suggest a non-intrusive (as much as possible) *must-pass* human/bot test? A DHTML message box with a couple of questions that must be answered in order to proceed and view the page in question... or something along those lines.
The message box would leave a cookie if the human test was passed successfully and would not reappear if the viewer continues to the next page.
That would be the only real/confirmed/factual way of putting this matter to sleep... or not.
So, coders among us, let's get creative and start suggesting some code ideas for the bot/human test.
They look like a duck, they walk like a duck... I'm itching to check if they ga ga too....
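To make it concrete, here's a minimal sketch of the kind of gate I have in mind, in plain JS. The cookie name `human_ok`, the overlay markup, and the one-day expiry are placeholder choices of mine, not a finished implementation:

```javascript
// Minimal human/bot gate: show a dismissible overlay once per visitor.
// Cookie name "human_ok" and all markup here are illustrative assumptions.

function parseCookies(cookieString) {
  // "a=1; b=2" -> { a: "1", b: "2" }
  const jar = {};
  for (const part of cookieString.split(";")) {
    const eq = part.indexOf("=");
    if (eq > -1) jar[part.slice(0, eq).trim()] = part.slice(eq + 1).trim();
  }
  return jar;
}

function passedHumanTest(cookieString) {
  return parseCookies(cookieString)["human_ok"] === "1";
}

function showGate(doc) {
  // Build the overlay; a bot that never clicks never gets the cookie.
  const box = doc.createElement("div");
  box.id = "human-gate";
  box.textContent = "Welcome! Please click to close this message.";
  const btn = doc.createElement("button");
  btn.textContent = "Close";
  btn.addEventListener("click", function () {
    doc.cookie = "human_ok=1; path=/; max-age=86400"; // remember for a day
    box.remove();
  });
  box.appendChild(btn);
  doc.body.appendChild(box);
}

// On page load: only visitors who clicked before carry the cookie,
// so hits with no "human_ok" cookie are visitors who never interacted.
if (typeof document !== "undefined" && !passedHumanTest(document.cookie)) {
  showGate(document);
}
```

Once it has run for a while, a pageview without the `human_ok` cookie is a visitor who hasn't clicked the box yet; a long run of repeat hits that never pick up the cookie during 'zombie hours' would be the bot signature.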
| 5:57 am on Nov 30, 2012 (gmt 0)|
Xcoder, I like that, but might that not just drive away a lot of human visitors who are put off by having to answer questions to look at a web page?
What about heat mapping that shows where the visitor clicked on the page? Do bots leave click patterns on heat maps? I assume their activity would be invisible to the map.
| 6:03 am on Nov 30, 2012 (gmt 0)|
Wow! Now we're getting somewhere in this thread...
Great suggestions and posts here on this page everyone, thanks!
I love the thought and creativity of the suggestions on this page, seriously great ... This page (9 @ 40 posts per page) is close to the WebmasterWorld I remember from way back in the day.
Thoughtful, creative discussion, suggestions & ideas = Awesome! & What makes this place Great!
| 6:54 am on Nov 30, 2012 (gmt 0)|
Let me think about it for a bit and I'll try to cook something solid up tomorrow. To really test, it should probably be complex JS, in case some bots are parsing basic URLs. Something like using jQuery to grab a preset variable from a hidden div (simple, not 'hidden text size'; like <div id="bottest">P14</div>, or the name of the domain, or something similar) on the page, and POSTing the contents of that hidden div (or whatever) to a PHP script that adds to a count in a database only if the variable sent is correct, seems like the direction I would go. But it's late and I'm posting off the top of my head, so I might have a better idea tomorrow, who knows...
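Here's a rough sketch of that flow in plain JS on both ends (instead of jQuery/PHP), just to show the shape of it. The `bottest` id, the `P14` token, and the counting logic are example values only:

```javascript
// Count only visitors whose JS actually executed and read the hidden token.
// "bottest", "P14", and the endpoint idea are illustrative, not site-specific.

const EXPECTED_TOKEN = "P14";

// --- client side (would run in the browser) ----------------------------
function readToken(html) {
  // Pull the token out of <div id="bottest">P14</div>
  const m = html.match(/<div id="bottest"[^>]*>([^<]+)<\/div>/);
  return m ? m[1].trim() : null;
}

// In a browser this would POST { token: readToken(document.body.innerHTML) }
// to the counting script, e.g. via fetch("/bot-test", { method: "POST", ... })

// --- server side (the PHP-script equivalent) ---------------------------
let jsCapableHits = 0;

function handlePost(body) {
  // Only increment when the POSTed token matches; simple bots that never
  // ran the page's JS never send it, so they are never counted.
  if (body && body.token === EXPECTED_TOKEN) {
    jsCapableHits += 1;
    return true;
  }
  return false;
}
```

The point is simply that the counter only moves for clients that executed the page's JS and sent back the right token, so comparing that count against raw pageviews during 'zombie hours' tells you whether the zombies run JS.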
[edited by: TheMadScientist at 7:09 am (utc) on Nov 30, 2012]
| 6:58 am on Nov 30, 2012 (gmt 0)|
It doesn't have to be questions, but it must invoke an action that only a human would carry out.
Perhaps a message like "Welcome to our website. Please help us improve. Tick this box to take a quick survey" OR "Tick this box to close this message"... some kind of action that only a human can perform. The user would not be able to get past this DHTML message box without taking an action (closing it or taking a short survey).
It doesn't have to run 24x7, but it must run long enough to gather a good chunk of data (the last 500 hits or something like that), and especially during obvious zombie "attacks".
Get creative, people... we can crack it.
| 7:06 am on Nov 30, 2012 (gmt 0)|
|because bots don't run JS |
Googlebot does; it's a Chrome hybrid. It executes JavaScript just like any normal browser. I've seen it happen more than once.
| 7:11 am on Nov 30, 2012 (gmt 0)|
Yeah, but we're not talking about Googlebot (I hope), because that should be simple to identify as Googlebot (or Google in general) via IP address, and hopefully we're not missing that...
One of the interesting posts in this thread relates to previews... For a while, I got a Google or Bing IP address (I don't remember which for sure) for preview requests in my stats, rather than the visitor's IP address in an X-Forwarded-For header like there should be... Has anyone tried tracking the 'zombie visits' by omitting the X-Forwarded-For from IP logging and running an IP lookup on the requests for the page(s) in question during 'zombie hours'?
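ADDED: A quick sketch of the log pass I mean, in Node-flavored JS. The log-line format here is an assumption; adjust the regex to whatever your server actually writes:

```javascript
// Tally requests by the *connecting* IP (first field of a combined-style
// log line), ignoring any X-Forwarded-For value, so preview/proxy hits
// from Google or Bing ranges stand out. The log format is assumed.

function directIp(line) {
  // First field of a combined-style log line is the connecting address.
  const m = line.match(/^(\d{1,3}(?:\.\d{1,3}){3})\s/);
  return m ? m[1] : null;
}

function countByIp(lines) {
  const counts = {};
  for (const line of lines) {
    const ip = directIp(line);
    if (ip) counts[ip] = (counts[ip] || 0) + 1;
  }
  return counts;
}
```

Run that over the 'zombie hours' slice of the log and do an IP lookup on the top connecting addresses; if previews or proxies are involved, search-engine ranges should dominate instead of a normal spread of visitor IPs.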