I cannot even face the thought of having to start again. It took me a decade of work, which has been going down the pan since the May update. Copycat (poorly written) websites and news sites with little content are outranking me; they even use my YouTube videos!
This reminds me of something interesting I saw this week. I'm seeing several sites that were launched after the May 2020 core update - the one that hit me massively - now all outranking me by A LOT.
The interesting thing about these sites is that they aren't that much different from my site or from other sites that existed before May 2020. In a lot of cases those new sites are almost copies of mine or of other older sites. They're not scraped, and I wouldn't necessarily call them spam, but they clearly got "inspiration" from all the older sites and essentially rewrote whatever those sites did, and that's that (I recognise their writers from Upwork - all from "cheap" countries).
I don't buy that Google is intentionally trashing old websites in favor of new ones. I think the more likely explanation is that there is a major bug in the way that Google processes and handles old websites (outside of those few truly massive old players).
My gut is that backlinks are the root problem. So many old websites have a large % of their total backlinks as spam or low quality.
Think about it... the size and complexity of a data set can result in HUGE differences in algorithmic conclusions. I suspect what we're seeing is the action of algorithms coded to work best on perhaps 0 to 10 years of data now choking on sites with 25 years of legacy data. Algorithms tend to have very different output when you get massive amounts of legacy/historical data thrown at them, vs small data sets. I suspect the algorithms are looking at the smaller data sets of new websites and going "ok, we know how to deal with this. we always have".
However, when the algorithms operate on 25 years of data - backlink profiles polluted by 25 years of spam, content scraping and noise, site structures polluted by 25 years of redesigns and historical 301s that Google *still remembers even though they have not been there or used in over a decade* - it totally makes sense that they will produce erroneous rankings for these older sites. Data from new sites is just so much cleaner and less noisy.
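Just to illustrate what I mean by the noise drowning out the signal, here's a toy Python sketch with completely made-up numbers (this is my own illustration, nothing to do with how Google actually scores links): a naive average link-quality score stays high for a young site with a handful of good links, but collapses for an old site that has the exact same good links plus two decades of accumulated junk.

# Toy illustration (not Google's actual algorithm): how a naive aggregate
# link-quality score gets diluted by years of accumulated spam backlinks,
# even when the "good" links of the old site and the new site are identical.

from statistics import mean

def naive_link_score(backlinks):
    """Average quality of all backlinks, treating every link equally."""
    return mean(quality for quality, _ in backlinks)

# Each backlink is (quality 0.0-1.0, year acquired) - purely invented numbers.
good_links = [(0.9, year) for year in range(2016, 2021)]                      # 5 strong links
legacy_spam = [(0.1, year) for year in range(1998, 2018) for _ in range(3)]   # 20 years of junk, 3 per year

new_site = good_links                  # launched recently, clean profile
old_site = good_links + legacy_spam    # same good links + decades of noise

print(f"new site score: {naive_link_score(new_site):.2f}")   # ~0.90
print(f"old site score: {naive_link_score(old_site):.2f}")   # ~0.16

Obviously the real systems are far more sophisticated than an average, but the point stands: the same aggregation logic fed a 25-year profile and a 2-year profile can land in very different places.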
I believe it also correlates with Search Console (formerly Webmaster Tools) showing "Indexed, not submitted in sitemap" instead of "Submitted and indexed". The algorithm seems to stop understanding how to index some large older sites when there is an update.
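If anyone wants to sanity-check whether those URLs really are missing from their sitemap or whether Google is just mislabelling them, here is a rough local check I'd use (my own approach, not an official tool, and it assumes you've exported the "Indexed, not submitted in sitemap" report from Search Console as a CSV with the URL in the first column):

import csv
import sys
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(path):
    """Return the set of <loc> URLs from a local sitemap.xml file."""
    tree = ET.parse(path)
    return {loc.text.strip() for loc in tree.getroot().iter(f"{SITEMAP_NS}loc")}

def exported_urls(path):
    """Return URLs from the exported CSV (assumes the first column holds the URL)."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[0].strip() for row in csv.reader(f) if row and row[0].startswith("http")}

if __name__ == "__main__":
    sitemap_path, export_path = sys.argv[1], sys.argv[2]   # e.g. sitemap.xml coverage_export.csv
    missing = exported_urls(export_path) - sitemap_urls(sitemap_path)
    print(f"{len(missing)} reported URLs are genuinely absent from the sitemap:")
    for url in sorted(missing):
        print("  " + url)

If the script shows the URLs are actually in your sitemap, that points to the report being off rather than your sitemap.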
Are they intentionally pushing down salt-of-the-Internet niche champions to dumb down the SERPs, encourage interaction with googlespam and boost priority publishers, or have they failed spectacularly with a brand new AI-based ranking model, backed by AI-Kool-Aid-drinking top managers on the search team?
A - malicious intent or B - fundamental mistake? I still can't decide.
I saw something interesting the other day on mobile. I searched for something and it would not let me proceed to the second page; when I tried to load more results, the message was something like "try again". It has not happened again, but it might explain a few things.