Forum Moderators: Robert Charlton & goodroi
[edited by: goodroi at 5:55 pm (utc) on Nov 10, 2015]
[edit reason] Let's be careful to keep the discussion on a professional level [/edit]
Zombie traffic... generated by negative-SEO robots that click around the other nine sites in the top 10 alongside the one you're trying to demote.
Where are you getting your information from?
And you think it's possible to block robots coming from random IPs via Google's results?
Would you care to explain how you block this? It's something I would like to implement, quickly.
Google blocks bots too. If it were indeed bot traffic, Google would have credited my AdWords account for all the junk clicks. They did not do so on their own, and did not do so when I complained about the quality of traffic they were sending.
Why I ask is this: I am getting really fed up with G ranking pages based on the anchor keyword text in another article, or at the bottom of an article, despite keyword metas, an on-page keyword ratio of 3%, and the presence of related words.
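For anyone unsure what the 3% figure above refers to: keyword density is usually taken as occurrences of the keyword divided by total words on the page. A minimal sketch of that calculation (the function name and sample text are my own, not from any SEO tool):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of `keyword` (possibly multi-word) divided by total word count."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw_words = keyword.lower().split()
    n = len(kw_words)
    # Slide a window of the keyword's length across the page's words
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == kw_words
    )
    return hits / len(words)

sample = "zombie traffic is junk traffic; zombie clicks skew metrics " * 5
print(f"{keyword_density(sample, 'zombie'):.1%}")
```

A 3% target means roughly 3 uses per 100 words; plugins report it the same way, though each tool tokenizes slightly differently.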
As for meta keywords, I don't know, but imagine them coming back (if they ever went away) and having to add your own manually. Yes, 'All in One' etc. will auto-add them according to on-page relevancy, but I only add two anyway, so it doesn't take up much time. G switches new and old features on and off (authorship, etc.), so you can't rely on what they say at any given time.
He adds: "We have on one hand the big launches, so an engineer sits down ... runs some tests, see how it affects a sample of results... The launch committee says okay, let's launch this." The emphasis is on "sample of results", the point being that Google has a very limited test environment and instead uses the live organic SERPs as a permanent beta environment.
Any thoughts?