|Link velocity... does it matter in 2014, and why?|
I was looking at the above thread, which did not get too many replies. Some sites get thousands of links in a week and rank fine; some sites get 50 in a day and die.
Which signals do you think Google needs to see to conclude that those thousands are natural? There's a bit of a debate going on between those who say it doesn't matter and those who say it does.
Since Google doesn't reveal their secret formula, no one knows 100% for certain how this works.
I would guess that the more links you have the more links you can gain. The New York Times gaining a thousand links overnight is not a big thing. A brand new 20 page site gaining a thousand links overnight is a big potential red flag.
I would also suggest that not all links are equal. I suspect some links are high quality and other links are very poisonous. If you gain a small number of very poisonous links from websites that are obviously exploiting Google, I would expect you will have ranking trouble sooner rather than later.
I value links by the odds that the link will send me real traffic that converts. The better the odds the link will perform, the more effort I exert in developing that link. My focus is building up external traffic sources so I am not dependent on Google. The more I do this, the more traffic Google sends me. People who target low-quality links that don't send any traffic tend to have more ranking issues, IMHO.
Personally, I think Google can tell that all velocity is not created equal. I have a lot of seasonal sites; they really only gain links for a few weeks of the year, and during those weeks they get a TON of them - news media and government and local / municipal and bloggers, and scrapers as well, and this has been going on for better than 15 years and I've never had a link problem.
I work with seasonal sites too, where the difference in traffic between the busy and quiet times of the year is more than tenfold. I am sure Google is aware of it too.
When Google classifies sites, it knows the average behaviour of certain categories/niches with regard to traffic and links, and anything that deviates by more than x% looks unnatural. For some sites, a busy/quiet pattern is natural, and an evenly spread link acquisition could be flagged as the unnatural one.
Must be data driven and it relies on human greed
The issue Google has is that, because of the huge volume of data, it has to rely on some kind of deviation from the link acquisition pattern within the site classification to suspect unnatural links(*). And human greed (greed for links that help ranking) helps Google in this.
Not new - this is how Retail Loss Prevention works too
I worked in Retail IT for years, and this is not dissimilar to the way Loss Prevention works. Take, for example, a supermarket. If a till operator has stolen money once (by, let's say, performing a bogus return/refund), it is difficult to notice because the bogus refund is masked/hidden amongst all the other validly performed refunds.
But once a bogus refund has worked, after some time the till operator does it again (and at that point it is still not noticed). With time they get braver and braver, doing it again and again with shorter and shorter gaps in between. At a certain point they have suddenly performed x times more refunds than the average of all other operators, at which point there is very little doubt that it is anything but deliberate.
This deviation from the average pattern is what then gets highlighted by the Loss Prevention reporting system, and this is how they get caught.
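The deviation check described above can be sketched in a few lines. This is a toy illustration only, with made-up refund counts and an arbitrary twice-the-median threshold; it is not how any real Loss Prevention system works:

```python
import statistics

# Hypothetical weekly refund counts per till operator (invented numbers).
refunds = {"op_a": 12, "op_b": 9, "op_c": 11, "op_d": 10, "op_e": 41}

# The median is a robust baseline: one escalating operator can't drag it up.
baseline = statistics.median(refunds.values())

# Flag anyone performing more than twice the typical refund volume.
flagged = [op for op, n in refunds.items() if n > 2 * baseline]
print(flagged)  # → ['op_e']
```

The median rather than the mean is used as the baseline on purpose: the escalating operator's own inflated count would otherwise pull the average up and help hide them.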
The same principle is used with link acquisition
A webmaster artificially builds a few links and sees a minuscule improvement. They wait for some time, the improvement is still there, and no penalty has arrived. Great, they say... a few more would not hurt, and they perhaps see further ranking improvements... then they say one more would not hurt... and at some point the site's link acquisition pattern is, beyond doubt, totally different from what similarly categorised sites have.
So there are "corridors" of what is acceptable/understandable/expected for a certain site, based not just on Google's classification of the site but, I am sure, on other factors such as site exposure etc.
And then there are False positives (and False negatives)
It is clear that the above can produce false positives, because the average corridor, however wide, is still the "average": there will be sites that acquire links outside this corridor in a completely natural way, or sites that are not quite correctly classified, so a different corridor should really apply to them.
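To make the false-positive point concrete, here is a toy "corridor" check with invented weekly link counts and an arbitrary corridor width. The viral site's spike is entirely natural, yet it is the one that gets flagged:

```python
# Hypothetical weekly new-link counts for sites in the same category (invented).
category_links = {
    "site_a": [5, 6, 4, 7],
    "site_b": [6, 5, 7, 6],
    "site_c": [4, 6, 5, 5],
    "viral_site": [5, 5, 6, 60],  # one week went genuinely viral
}

# Build the corridor from the category's typical weekly rates, widened by 50%.
typical = [n for weeks in category_links.values() for n in weeks if n < 20]
lo, hi = min(typical) * 0.5, max(typical) * 1.5

# Any week outside the corridor gets flagged, natural or not.
flagged = {site: [n for n in weeks if not lo <= n <= hi]
           for site, weeks in category_links.items()}
flagged = {site: weeks for site, weeks in flagged.items() if weeks}
print(flagged)  # → {'viral_site': [60]}
```

The corridor here is derived purely from the category's typical behaviour, so a one-off natural spike and a deliberate link-buying burst look identical to it; that is exactly why the false positives the post describes are unavoidable with this kind of check.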
Discouraging link building [webmasterworld.com...] [webmasterworld.com...] is one way. Webmasters who are involved in SEO will read/hear what John Mueller or Matt Cutts said, and these are the ones that Google wants to discourage. The average Joe Bloggs has no idea who Matt or John are and will keep dropping links in their forum posts just as they did before... and these are the links Google does want to stay, because the algo is still based on links [webmasterworld.com...] .
(*) I have ignored other factors that can also tell Google that links may be unnatural, such as links from link farms etc., since this thread discusses velocity.