Forum Moderators: open
Reason? They get the rankings because of all the legal stuff they have done, not the illegal. Google identifies optimisation tactics and ignores them.
The reason for this is that many companies employ a third party to optimise their sites. As we all know, many of these optimisation firms use dodgy tactics or are plain stupid. So why should the company suffer forevermore just because they had the misfortune to hire a dodgy optimiser? Seems a bit unfair.
I think we all have a culture of "Google will penalise" or "Google will ban", when in fact the truth is "Google will ignore" and "Google only deals in plus points, not minus points".
Thoughts?
Result was I was bounced to position 65 and the offending site retained position 1.
When it comes to detecting "spam", they seem to do it in large batches. I do believe some will see a "Black Monday" quite soon, though I could be wrong.
They remove sites every now and then, and I've heard the stories many times: "they posted it somewhere, and 15 minutes later the site was gone from the index".
There are a few people doing "spam" who are extremely far ahead of the others in the SEO world. They will probably continue to be ahead, because they use methods that require human editing. But, as Brett has stated many times:
500 employees × a 10-20 hour workday (voluntary, from what I know) = around 100,000-250,000 pages viewed per day, at least.
Imagine if Google employees sat down one day with a request from Eric, Sergey, and Larry: "You have 6 hours to clean the spam out of our index."
They would clean the index better than any spam filter would!
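The arithmetic above can be sketched as a quick back-of-envelope calculation. Note that the per-person review rate is an assumption I've filled in to make the claimed 100,000-250,000 range work out, not a figure from the original poster:

```python
# Back-of-envelope check of the manual-review claim above.
# All figures are illustrative assumptions, not Google data.
EMPLOYEES = 500
HOURS_LOW, HOURS_HIGH = 10, 20        # claimed (voluntary) workday range
PAGES_PER_HOUR_LOW = 20               # assumed slow review rate per person
PAGES_PER_HOUR_HIGH = 25              # assumed fast review rate per person

low = EMPLOYEES * HOURS_LOW * PAGES_PER_HOUR_LOW      # pessimistic estimate
high = EMPLOYEES * HOURS_HIGH * PAGES_PER_HOUR_HIGH   # optimistic estimate

print(f"{low:,} to {high:,} pages reviewed per day")  # 100,000 to 250,000
```

Even the pessimistic end of that range dwarfs what any individual spam report could cover, which is the thrust of the argument.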
Small note: Expedia was penalized in January 2002, and if they are Fortune 500, there is a pretty good chance they would be in the top 10 for their keywords, with or without an SEO.
When you work with Fortune 500 companies today, it's more about making sure the content gets spidered than targeting one keyword for the main page, mostly because Fortune 500 companies tend to have huge sites with a lot of content, which in the end adds up to a huge amount of traffic.
Reading log files, helping them understand their users so they can present better information, using the data for future product development, etc. It's not SPECIFIC keyword targeting anymore; SEO/SEM is so much more (I know, I'm getting "sidebarred" on the subject).
Also, "spam" is a general term; each SE uses a different definition, and we don't know exactly what they are, either.
Hardly penalized.
However, for a small business, we know the rules are different. There are hundreds, thousands of URLs - gone. Completely. ODP listing -> yep. Etc, etc, etc.
The message is pretty clear, imho: small business - not enough revenue potential, so banning them *for the life of the URL* is just fine. Nobody will notice they are missing.
But - fortune 500 company, sure they'll get the proverbial slap on the wrist. And then they'll be back.
Odds are, they will also start spending more on AdWords / premium listings (after the penalty, once they lose the traffic...).
I noticed Expedia is doing so.
So sure, fortune 500 may not be immune, but penalties in that realm, imho, are very different than the 'small business' penalty.
It is because it's reasonable to assume that surfers would want and expect those sites to be indexed. In other words, making Expedia unfindable would degrade Google's results noticeably.
Instead of penalizing sites, they should look at the techniques a site used that got them penalized and adjust their algorithm based on those techniques.
Instead of penalizing sites, they should look at the techniques a site used that got them penalized and adjust their algorithm based on those techniques.
I believe that in general they do just that. Only the really blatant and large-scale cases get individual penalization; spam reports are taken and reviewed to find trends that might be addressed by algorithmic changes (that, of course, could include additions to the PageRank algorithms to apply a PR0 in response to certain linking approaches).
In the past we've seen both good and bad from the approach -- the bad being when fallout from a change made in response to a "spam" technique also degrades the rankings of "innocent" sites. In the long run, though, I personally feel it's the best approach. There are simply too many pages in the index to be able to effectively counter any "spamming" technique by manually applying penalties.