Msg#: 4674337 posted 5:00 pm on May 25, 2014 (gmt 0)
Welcome to WebmasterWorld, elrafei!
Each of these Google algorithms focuses on a different aspect:
The so-called Payday Loans algo targets queries that are heavily spammed. "Payday loans" is an example of a spammy query (one where many sites try to rank using spam techniques).
Panda targets low-quality sites, aiming to remove them from search results.
Penguin targets webspam and sites that violate Google's guidelines (link schemes, keyword stuffing, cloaking, etc.).
Hummingbird is a newer algorithm with which Google tries to "understand the query" when sifting and ranking results, that is, to find the meaning behind the words rather than simply matching the typed words against results.
It is difficult to answer more precisely without knowing why you are asking. Mod's note: I have given one example of a spammy query. At this stage we will NOT welcome more examples (this is to ensure the thread does not itself turn into spam).
Msg#: 4674337 posted 5:17 pm on May 25, 2014 (gmt 0)
It is difficult to answer more precisely without knowing why you are asking.
I agree, and it is very important to note that whilst Google has attempted to resolve the many issues with each of these algos, it has also harmed or killed off many sites that were not at all spammy or low quality and did not violate its guidelines.
In other words, there have been a lot of innocent casualties of Google's "friendly fire," and many site owners do not understand why they were eliminated in this way.
If your question is "how can I avoid these penalty algos," then construct an original site with original text and images, make it as high quality as you possibly can, and do not stuff keywords at every opportunity; keep it as natural as possible.
Yes, I know there are spammy sites all over the place doing precisely the opposite, but if you build a quality site you'll at least sleep better at night rather than wondering whether you've been ousted.
Msg#: 4674337 posted 8:52 pm on May 25, 2014 (gmt 0)
Having separate, sporadic updates is one of the worst aspects of Google's approach: it produces volatile, unstable search results. A good algorithm would produce results that evolve slowly and gradually. All these different pieces should be integrated together, and the updating should be continual.