Forum Moderators: open
Common knowledge of algorithms and mathematics proves there are countless ways to "alter" a website's or a webpage's ranking. Changing even half a line of code on your site might change the way it ranks for the terms you care about, or even make your site disappear from the results you monitor. That is not to say the change hasn't had an equal and opposite effect for obscure term/phrase combinations you are not monitoring, logically ones you couldn't even imagine.
The infinite factors that contribute to one website ranking in a "top" spot for a search term cannot be duplicated. You can make changes and see similar effects, but there are more minute factors at play than you can tally. If you think of everything in basic mathematics: there are too many 1s and 0s.
What I am getting at is that there are too many factors that can cause you to drop in placement on Google. When these factors are triggered, it seems that lately, instead of searching for plausible answers, people are taking the path of least resistance: crying OOP.
If someone from the plex working in the sauce leaked data supporting any such claims, there would be reason to theorize. Until then, is it not simply justification for failed techniques on an ever-changing technology platform?
Waiting over two years to get a link is absurd, and seeing areas with broken links in this "important" directory is obnoxious. It has grown too big to manage, and quality has suffered greatly.
1) Top-ten sites vanished without a trace in a single update (November).
2) Someone discovered that adding -asdf -asdfg (or whatever) to the search caused those sites to return.
The only logical conclusion is that a filter/algo was removing or demoting these sites AFTER the basic search algo had done its job of finding them.
3) It was discovered that sites might appear for a two-word search term but vanish for a three-word term for which they used to rank well.
The only logical conclusion is that an over-optimisation filter exists/existed (or a bug, but that is unlikely in this case).
This OOP filter was given other names (I think of it as a dynamic spam filter), but it was/is not a figment of a collective imagination.
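The two-stage behaviour described above can be sketched in code. This is purely a toy model, not Google's actual pipeline: the scoring function, the `oop_filter` name, and the density threshold are all invented for illustration. The point is the architecture: a base algorithm ranks pages, then a separate post-ranking filter demotes results it flags as over-optimised, which is consistent with sites vanishing only after the base algo has found them.

```python
# Toy model of a two-stage ranking pipeline: a base algorithm scores
# pages, then a separate filter demotes "over-optimised" results.
# All names, scores, and thresholds here are hypothetical.

def base_rank(pages, query):
    """Score pages by naive term-occurrence count (a stand-in for the
    real ranking algo) and sort best-first."""
    terms = set(query.lower().split())
    def score(page):
        return sum(word in terms for word in page["text"].lower().split())
    return sorted(pages, key=score, reverse=True)

def oop_filter(ranked, query, threshold=0.5):
    """Demote pages whose keyword density for the query looks unnatural.
    The 0.5 threshold is arbitrary, chosen only for the demo."""
    terms = set(query.lower().split())
    kept, demoted = [], []
    for page in ranked:
        words = page["text"].lower().split()
        density = sum(word in terms for word in words) / max(len(words), 1)
        (demoted if density > threshold else kept).append(page)
    return kept + demoted  # demoted pages drop to the bottom

def search(pages, query, apply_filter=True):
    """Run the base algo, then (optionally) the post-ranking filter."""
    ranked = base_rank(pages, query)
    return oop_filter(ranked, query) if apply_filter else ranked
```

In this model, a keyword-stuffed page wins the base ranking but is demoted by the second stage, so toggling the filter off (the analogue of the -asdf trick defeating it) brings the page back to the top.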
Kaled.
If you think of everything in basic mathematics: there are too many 1s and 0s.
The problem is that there are many variables (I agree with the word 'factors', but not with the idea that there are too many 1s and 0s).
Interesting problems, like understanding search engines, cause many people to give up. Other people work towards overcoming those problems.
Reverse engineering a search engine requires either environments in which variables can be isolated or discounted, or a sophisticated approach able to infer the underlying rules from the complex output of the black box.