How do you differentiate between human and bot traffic? Clearly, traffic that converts is human, but human traffic can be present alongside bots, and human traffic does not always convert. So a situation may arise where both bots and non-converting humans are online, and the depressed conversion rate gives the impression that only bots are present.
Assume a baseline of 100 human visitors with an average conversion rate of 10%, with enough day-to-day variability that on any given day you can have between 5 and 15 conversions.
Day 1 -> 100 human visitors, 10 conversions, 50 bots => perceived conversion rate = 10/150 = 6.66%, actual = 10/100 = 10%
Day 2 -> 100 human visitors, 15 conversions, 50 bots => perceived conversion rate = 15/150 = 10%, actual = 15/100 = 15%
Day 3 -> 100 human visitors, 5 conversions, 50 bots => perceived conversion rate = 5/150 = 3.33%, actual = 5/100 = 5%
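The arithmetic above can be sketched in a few lines of Python. The figures are the hypothetical ones from the example; the only assumption is that bots never convert, so perceived rate = conversions / (humans + bots) and actual rate = conversions / humans.

```python
# Perceived vs. actual conversion rate when non-converting bots
# inflate the visitor count. Numbers match the three example days.
days = [
    {"humans": 100, "conversions": 10, "bots": 50},  # Day 1
    {"humans": 100, "conversions": 15, "bots": 50},  # Day 2
    {"humans": 100, "conversions": 5,  "bots": 50},  # Day 3
]

for i, d in enumerate(days, start=1):
    perceived = d["conversions"] / (d["humans"] + d["bots"]) * 100
    actual = d["conversions"] / d["humans"] * 100
    print(f"Day {i}: perceived {perceived:.2f}%, actual {actual:.2f}%")
```

Running it prints 6.67% vs. 10% for Day 1, 10% vs. 15% for Day 2, and 3.33% vs. 5% for Day 3: every perceived rate is dragged down by the same factor, humans / (humans + bots).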
Based on the above example, your analytics misleads you into believing that your best day, Day 2, is only average, while the other days are either really bad or just acceptable. I would also feel that something was very wrong on Day 3: where did all the humans go?
But if you were able to filter the bot traffic, your perception would be much different. All three days would appear to be in normal range.
If bots are mixed in with humans and we are unable to detect them, then there is a real problem with making judgments based on our analytics (GA or otherwise).
Assuming the bots bypass Google, rankings should not be impacted, and even if the bots were to arrive via Google I am not sure that would affect rankings; if there were any impact, it would most likely be positive (this is debatable). In regards to ads, again assuming the bots are botnet bots using or mimicking actual browsers, ad impressions should be unaffected; clicks and CTR, on the other hand, will fall just like the conversion rate.
Strange, yes, I agree. But apparently plausible: [webmasterworld.com...]