Forum Moderators: Robert Charlton & goodroi
Edit: When the core update comes out, don't count on it to do anything in your favor. There is a greater chance it will do further damage. So those of us who were hit by the December core update (like me) should be mentally prepared that it may make things worse.
[edited by: mzb44 at 2:46 pm (utc) on Jun 30, 2021]
Some high-authority websites are abusing their ranking and publishing new posts with fewer than 250 words, knowing very well they are likely to rank highly regardless of the content. It's happening in my niche. They publish dozens of posts per day, all under 250 words each.
About May 2020, it makes sense in a way. If you get penalized for something they released in May, new websites can't be penalized for it because they didn't exist yet. The interesting question is whether they will do it again: once a year, or once every 5 years?
[edited by: Samsam1978 at 3:21 pm (utc) on Jun 30, 2021]
I cannot even face the thought of having to start again. It took me a decade of work, which has been going down the pan since the May update. Copycat (poorly written) websites and news sites are outranking me with little content; they even use my YouTube videos!
This reminds me of something interesting I saw this week. I'm seeing several sites that were launched after the May 2020 core update (the one that hit me massively) now all outranking me by A LOT.
The interesting thing about these sites is that they aren't that much different from my site or other sites that existed before May 2020. In a lot of cases those new sites are almost copies of mine or other older sites. Not scraped, and I wouldn't necessarily call them spam, but they clearly got "inspiration" from all the older sites and essentially rewrote whatever those sites did, and that's that (I recognise their writers from Upwork; all are from "cheap" countries).
I don't buy that Google is intentionally trashing old websites in favor of new ones. I think the more likely explanation is that there is a major bug in the way that Google processes and handles old websites (outside of those few truly massive old players).
My gut is that backlinks are the root problem. So many old websites have a large percentage of their total backlinks that are spam or low quality.
Think about it... the size and complexity of a data set can produce HUGE differences in algorithmic conclusions. I suspect what we're seeing is algorithms coded to work best on perhaps 0 to 10 years of data now choking on sites with 25 years of legacy data. Algorithms tend to have very different output when you throw massive amounts of legacy/historical data at them versus small data sets. I suspect the algorithms look at the smaller data sets of new websites and go "OK, we know how to deal with this. We always have."
However, when the algorithms operate on 25 years of data, with backlink profiles polluted by 25 years of spam, content scraping, and noise, and site structures polluted by 25 years of redesigns and historical 301s that Google *still remembers even though they have not existed or been used in over a decade*, it totally makes sense that they produce erroneous rankings for these older sites. Data from new sites is just so much cleaner and less noisy.
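The intuition here, that the same scoring heuristic can behave very differently on a small clean data set than on a large one diluted by decades of noise, can be sketched with a toy example. This is entirely hypothetical and has nothing to do with Google's actual ranking systems; the scoring function, quality ranges, and link counts are all made up for illustration.

```python
import random

def toy_link_score(link_qualities):
    """Hypothetical heuristic: average quality across every
    backlink ever seen. Not any real search engine's algorithm."""
    return sum(link_qualities) / len(link_qualities)

random.seed(42)

# "New site": a couple of years of mostly clean links (quality ~0.6-1.0).
new_site = [random.uniform(0.6, 1.0) for _ in range(200)]

# "Old site": the SAME clean links, plus decades of accumulated
# spam/scraper links (quality ~0.0-0.2) still sitting in the profile.
old_site = new_site + [random.uniform(0.0, 0.2) for _ in range(2000)]

print(round(toy_link_score(new_site), 2))  # stays high
print(round(toy_link_score(old_site), 2))  # dragged down by legacy noise
```

Even though the old site's good links are identical to the new site's, the naive average is swamped by the legacy junk. A real system is vastly more sophisticated, but the toy illustrates why an algorithm tuned on short, clean histories might misjudge a 25-year-old profile.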
I believe it also correlates with Webmaster Tools showing "Indexed, not submitted in sitemap" instead of "Submitted and indexed". The algorithm stops understanding how to index some large older sites when there is an update.
Are they intentionally pushing down salt-of-the-Internet niche champions to dumb down the SERPs, encourage interaction with googlespam, and boost priority publishers? Or have they failed spectacularly with a brand-new AI-based ranking model, backed by AI Kool-Aid-drinking top managers on the search team?
A (malicious intent) or B (fundamental mistake)? I still can't decide.