Forum Moderators: martinibuster
If the top sites on the internet are gaining 500 to 1,000 backlinks within a given index (just for argument's sake), then another site that gained 5,000 links would obviously indicate to Google that there is something not kosher about that site, no?
Others say the whole idea is goofy, noting that Microsoft has millions of links that say "Microsoft", and even Acme Widgets probably has mostly links that say "Acme Widgets".
A quick jump from 2 to 10K might trigger some kind of link growth rapidity flag, but we don't really know if such a thing exists. Are these 10K links on different sites, or on 10K pages of a single site? Do tell us how you make out... :)
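To make the speculation concrete, here is a purely hypothetical sketch of what a "link growth rapidity flag" might look like: compare a site's newest per-index link gain against its own baseline, and ignore tiny sites where small absolute changes produce huge ratios. Nobody outside Google knows whether anything like this exists; the function name, thresholds, and logic are all invented for illustration.

```python
# Hypothetical sketch of a "link growth rapidity flag" -- no one outside
# Google knows whether such a thing exists. All names and thresholds here
# are made up for illustration only.

def growth_flag(link_counts, ratio_threshold=5.0, min_links=100):
    """Flag a site whose newest per-index link gain far exceeds its baseline.

    link_counts: total backlink counts, one per index, oldest first.
    """
    if len(link_counts) < 2:
        return False
    previous, latest = link_counts[-2], link_counts[-1]
    gained = latest - previous
    # Ignore tiny sites, where a small absolute gain looks like a huge ratio.
    if gained < min_links:
        return False
    # Baseline: average per-index gain over the earlier history.
    gains = [b - a for a, b in zip(link_counts, link_counts[1:-1])]
    baseline = max(sum(gains) / len(gains), 1.0) if gains else 1.0
    return gained / baseline > ratio_threshold

# Steady growth: 100 new links per index -> no flag.
print(growth_flag([1000, 1100, 1200, 1300]))   # False
# Sudden jump of 5,000 links against a 100-link baseline -> flagged.
print(growth_flag([1000, 1100, 1200, 6200]))   # True
# The "quick jump from 2 to 10K" case -> flagged.
print(growth_flag([2, 10000]))                 # True
```

Even a toy version like this shows the tradeoff discussed below: wherever the threshold is set, some legitimately popular sites will sit on the wrong side of it.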
link growth rapidity flag
To all PubCon attendees - can someone ask Mr. Google Representative?
We do know that Google became sensitive to Google bombing, and it's conceivable that some legit links could get devalued if Google somehow kicked them into a "bomb" category.
Yes, but I think their idea of tackling it involved checking the title and body of the site in question. If the keyword was not present in either of these, the links might get categorized as "Google bombing". It becomes dicey when the linked page is blocked by a robots.txt file barring search engines from indexing it.
Coming to the original question: at present, I don't think so, but I wouldn't be surprised if they were experimenting with such an idea. They always run such research on a small subset of the index.
But if the threshold was set high enough, the number of innocent sites that would trip the alarm would be lessened somewhat. Do you think Google accepts a certain amount of collateral damage in its efforts to keep the results free of spam?
Do you think Google accepts a certain amount of collateral damage in its efforts to keep the results free of spam?
Not only do I believe this, I've used those same words in other threads about topics like multi-hyphenated domains. I certainly don't think Google is callous or evil. I DO think they test various algorithms and see which produce the best results. If your site happens to resemble something they are filtering for, too bad. If millions of sites get bombed, they'll probably try something else. If 95% of the SERPs look better, though, and most "good" sites aren't affected, no problem.
One specific example was the PR0 debacle of a couple of years ago. One of the things that triggered the penalty was a "themeindex" file, a signature of Zeus software. It seemed that there was no review to determine whether one's link directory was of high, indifferent, or low quality - they all got clobbered. Link directories of decent quality weren't the only collateral damage. Included in this purge were some pages that had nothing to do with Zeus, but which had the bad luck to have been named "themeindex" by their creators.
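The PR0 story above illustrates how a filter keyed on a software signature, with no quality review, clobbers every match. Here is a toy sketch of that failure mode; the URLs and the filter itself are invented for illustration and do not reflect how Google actually implemented the penalty.

```python
# Toy illustration of signature-based collateral damage: penalize any page
# whose URL contains "themeindex" (the Zeus software signature mentioned
# above), regardless of the page's actual quality. Purely illustrative.

def zeus_style_filter(pages):
    """Return the pages a pure signature match would penalize."""
    return [url for url in pages if "themeindex" in url.lower()]

pages = [
    "http://example.com/themeindex.html",  # Zeus link directory: penalized
    "http://example.org/themeindex.htm",   # unrelated page, unlucky name: penalized
    "http://example.net/links.html",       # no signature match: untouched
]

print(zeus_style_filter(pages))
```

The second URL is the collateral damage: a page that merely happens to share the flagged name gets swept up along with the real targets.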
In short, Google is algorithm-driven. They will tweak the algorithm to produce better results on average. Sometimes these tweaks may adversely affect some sites with great content, but what can one do?