Forum Moderators: Robert Charlton & goodroi
Feb 7, 2017 Google Algorithm Update - studies suggest it was Phantom
Basically, don’t just look at content quality. There’s more to it. Understand the barriers you are presenting to users and address those. For example, aggressive ad placement, deception for monetization purposes, broken UX elements, autoplay video and audio, aggressive popups and interstitials, and more.
Essentially, if you are following the Google guidelines and you are doing things technically right, then that sounds to me like there might simply be quality issues with your site: things you can improve overall when it comes to the quality of the site.
Which means there is no simple answer; there is no meta tag that would make your website higher quality. In general, you probably need to take a step back and get random people who are interested in that topic to review your site compared to other sites, essentially running a survey to see what you could be doing better to improve the quality of your site overall. And ideally, don’t just like tweak things to kind of subtly improve the quality to see if you can get a little higher. Really take a stab at it and try to figure out what you can do to make a significant dent in the quality of the site overall.
85% less traffic...
If the penalty is algorithmic
For us it was 100% a content issue. The report on the other thread describes exactly what John Mueller recommended in MB's quote above:
And ideally, don’t just like tweak things to kind of subtly improve the quality to see if you can get a little higher. Really take a stab at it and try to figure out what you can do to make a significant dent in the quality of the site overall.