Forum Moderators: Robert Charlton & goodroi


Google Updates and SERP Changes - October 2018

         

broccoli

11:36 am on Oct 1, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



The following message was cut out of thread at: https://www.webmasterworld.com/google/4918232.htm [webmasterworld.com] by robert_charlton - 4:08 am on Oct 1, 2018, (PDT -8)


I seem to have recovered most of my rankings from before my suspected mobile-first Fred penalty, apart from the very highest volume ones, where an annoying thin-content site is still pushing me down.

The traffic to my site has doubled to about 4K. I’m still well off the 10K figure I was at before the March update pushed up a bunch of low quality sites in my niche.

No corresponding increase in AdSense earnings, though. Since mine is a viral site, I see weird, unnatural AdSense drops after traffic increases all the time. The CPC is still the same, but the CTR has halved. I hope it settles down; if not, my entire niche may no longer be financially viable.


[edited by: Robert_Charlton at 12:11 pm (utc) on Oct 1, 2018]
[edit reason] Cleanup after thread split to new month [/edit]

Jori

10:09 pm on Oct 20, 2018 (gmt 0)

10+ Year Member Top Contributors Of The Month



I second NickMNS
I'm seeing in my niche competitors with no EAT gaining positions. They are now in front of me most of the time.
I'm thinking maybe it is not an EAT issue, but a mobile issue.

I received, like others here, the message from Google saying my site was now in the mobile-first index just a few days before the September 27th update.
I lost 20% of my traffic. Maybe it's because Google now sees a different internal structure, even though my site is responsive.

The main navigation menu is different on desktop and mobile. On mobile, I don't show the submenus, for readability. If Google now gives priority to the mobile version, all the pages attached to the submenus will be penalized.

My competitor, on the other hand, has a very simple menu: all the links are just thrown at the user without any thought for usability, and it's difficult to read.

Does all of this make sense to those of you being "penalized" by the current updates?

Shepherd

10:11 pm on Oct 20, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



For now, I'm thinking that overall quality requirements on all sites have been turned up a notch, and that "needs met" in the Quality Raters guidelines are most likely a larger factor than before.

Needs met is an "on page" factor. A quality rater reviews 10 (made up number) pages on a topic: 3 of the pages meet needs, 3 do not, and 4 are indifferent. Google takes the common on-page attributes of the 3 pages that meet needs and the 3 that do not, and feeds that information into the machine. Moving forward, pages that share attributes with the needs-met pages get a boost, and pages that share attributes with the needs-not-met pages get demoted.

IF this is the case, we should be able to see a significant pattern of common attributes across boosted and demoted sites. While these attributes could be anything (forms, badges, certain words, images, videos, UGC, bullets, etc.), they would be on-page attributes: something only you (or a quality rater) could see by viewing a live page. It wouldn't be anything like domain authority, inbound links, or any other off-page factor.

Seems to me that IF (I have no idea) this were the case, it would really make for a bloated algo, with so many different factors to look at/for, each set possibly different for millions of different topics.
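For what it's worth, the pipeline Shepherd is guessing at can be sketched in a few lines. Everything here is invented for illustration (the attribute names, the labels, the scoring rule): it is a toy model of the hypothesis, not anything Google has confirmed.

```python
# Toy sketch of the hypothesised "needs met" pipeline: raters label a
# handful of pages, the system extracts the on-page attributes shared by
# each label group, and new pages are scored by which group they resemble.
# All attribute names and labels below are made up for illustration.

def shared_attributes(pages):
    """Return the attributes present on every page in the group."""
    sets = [set(p) for p in pages]
    return set.intersection(*sets) if sets else set()

# Rater-labelled pages, each described by its visible on-page attributes.
rated_met = [
    {"images", "bullets", "author_bio"},
    {"images", "bullets", "video"},
    {"images", "bullets"},
]
rated_not_met = [
    {"popups", "thin_text"},
    {"popups", "thin_text", "auto_ads"},
]

met_signals = shared_attributes(rated_met)          # {'images', 'bullets'}
not_met_signals = shared_attributes(rated_not_met)  # {'popups', 'thin_text'}

def score(page_attrs):
    """Boost a page for resembling the needs-met group, demote otherwise."""
    return len(page_attrs & met_signals) - len(page_attrs & not_met_signals)

print(score({"images", "bullets", "video"}))     # positive -> boosted
print(score({"popups", "thin_text", "images"}))  # negative -> demoted
```

This also shows why, under the hypothesis, the pattern would be visible: any site could compare its own on-page attributes against the boosted and demoted groups in its niche.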

NickMNS

10:25 pm on Oct 20, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I received, like others here, the message from Google saying my site was now in the mobile-first index just a few days before the September 27th update.

I received this message too at the same time, for my second website and traffic increased. I had received that message for my main site in May and it made no difference. Again there seems to be no rhyme or reason.

broccoli

10:51 pm on Oct 20, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



@NickMNS

I have a second website that has been unable to get any traffic since its creation over 18 months ago. It also has high-quality content, but in addition I created a few pages that I'm less than proud of. The pages were created after the initial build, because I felt that logically they needed to exist, but since I had no traction with the site, I invested very little time in them. Since these updates I have started to get traffic to this second site, but to the lower-quality pages that do not fully meet needs. The high-quality pages that do meet specific needs still get no traffic.


This fits my theory of comprehensive site relevance getting heavily rewarded over page relevance/authority, which I’m about to test out.

The rubbish above me in the serps is from highly comprehensive sites, some with hardly any backlinks. Do you think there are any pages missing from your main site?

I also think there’s a tipping point where too many pages or stuff that doesn’t get searched for actually counts against you, even in spite of a massive backlink profile. It’s the only way I can explain the rankings of other sites in my niche.

lostshootingstar

12:36 am on Oct 21, 2018 (gmt 0)

10+ Year Member Top Contributors Of The Month



If you mean what I think you mean, I agree. "mini sites" that are built around a single keyword like "cheap widget repair chicago" rank with hardly any content, links, etc. In fact many of them are just 3-4 page sites with less than a paragraph of main content and still rank great.

This is compared to a larger site that has a page for "cheap widget repair chicago" and "cheap widget repair Atlanta" and "cheap widget repair in Seattle", etc. I think it's a valiant effort to reward these sites, but it's very, very easy to take advantage of as many of my competitors have started doing (as have I).

broccoli

1:42 am on Oct 21, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



Yes, kind of, but it doesn’t relate to local serps.

The site dominating my niche has 25 pages, all top keyword searches and almost no backlinks.

The site coming second is a 900 page site that also covers all the same bases and a lot more, and has thousands of backlinks, but the creator has made a lot of pages that probably don’t ever get searched for in a bit of a cynical attempt to get ranked on indirectly related searches. I think they’re getting punished for it.

Meanwhile my site isn’t as comprehensive but has better content than either competitor. Prior to this Spring I had the top ranking for almost every one of my pages. Then I got hit by both sites at once and everywhere we have crossover they both push me down, which has destroyed my income.

One of my pages is 19 years old, has considerably better content, has been updated with fresh content, and has 40,000 natural backlinks. It ranks second after a cheap knockoff made by site 2.

Another page, one of my most popular memes, has 900 backlinks including several press articles, and it’s now ranking 3rd or 4th after site 1., site 2., and a third site with a similar profile to mine (old, many backlinks, not comprehensive).

So I’m hastily filling in the gaps to make my site the most comprehensive. If that doesn’t work I’ll make a second site. If I outrank myself I’ll know there’s some funny business going on.

EditorialGuy

3:23 am on Oct 21, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I've noticed some odd results that have looked like blasts from the past: skimpy, keyword-focused pages on exact-match domains that I'd never seen before, for example, ranking in first or second place above pages with in-depth content. (Not across the board, but in enough instances to make me think "WTF?" or a PG-rated equivalent.)

Is it possible that Google has cranked up the role of artificial intelligence, and we're seeing strange results here and there as consequences of the machine-learning process? If so, isn't it likely that the results (especially for frequent queries) will improve with time?

Shepherd

11:47 am on Oct 21, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



...will improve with time?

Maybe; but where did they get the seed information for the AI? Quality raters? Did they hire doctors, lawyers, and the like to evaluate YMYL pages? I doubt it; more likely they hired the class of people that writes all of the "spun" garbage cluttering the web. It's hard to expect AI to improve if it thinks, based on the information provided, that it is doing a good job.

It's an ambitious project google is taking on, converting the web from a Democracy to an Autocracy.

broccoli

12:32 pm on Oct 21, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



I looked into the quality raters. They have country specific contracting groups looking for people native to the country who have degree level qualifications or higher.

It’s possible that they’re trying to generalise and extrapolate too much with what they rate, but I honestly think all of this fuss about quality raters and user signals and AI is a distraction. The same search algorithms are running things underneath all this, they’re just using raters and AI and user signals to tweak the strength of the different ranking factors.

What they’ve done is sharpened the pareto distribution so the sites that perform well in the short tail get rewarded in the long tail. It’s all politically motivated to combat fake news. That typically means big brands win, unless you’re in a small or obscure niche where big brands don’t yet exist and then things just get weird and messed up. I think it’s a foolish choice on google’s part, because a small number of sites get rewarded and a large number of sites get put out of business, and the internet gets cleansed of innovators and entrepreneurs and turns into a desert of corporations. Ironic choices for an organisation as far left as google.

MayankParmar

1:38 pm on Oct 21, 2018 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



Anyone seeing a drop today? :/

NickMNS

2:01 pm on Oct 21, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@broccoli
What they’ve done is sharpened the pareto distribution so the sites that perform well in the short tail get rewarded in the long tail.

Distributions such as Pareto do not have a long and a short tail. There is only one tail, a tail that stretches to infinity, hence the name long-tail. Typically when talking about keyword distribution one is referring to the frequency with which a keyword is searched for.

The bell-shaped part of the distribution represents the few keywords that are searched for frequently. This is where most sites focus their energy, because they can take a keyword and build content around it. Given that these terms are searched frequently, the strategy results in frequent visits to the website.

The tail of the distribution represents keywords that appear infrequently: once a year, once a month, and so on. Using the strategy described above on "long-tail" keywords will fail, as the content will rarely ever be visited.

"Long-tail" vs. "short-tail" likely comes from the misconception that the length of the tail is related to the length in characters of the keyword. This stems from the fact that longer keywords tend to sit in the tail of the distribution, because they tend to be more specific and thus used by fewer searchers. But if you compare the term "Where to buy a new Apple iPhone5" to "model xa3c", it is evident that the first keyword will be searched more frequently despite being much longer than the second.
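The head-versus-tail shape NickMNS describes can be sketched with a toy Zipf-style curve, where search volume falls off as 1/rank. The volumes and keyword counts here are invented; real keyword data is messier, but the qualitative point (a few head terms dominate, yet the many rare tail terms still add up) survives.

```python
# Toy model of a keyword-frequency distribution: volume for the keyword at
# rank r is proportional to 1/r (a Zipf-like curve). All numbers invented.

def zipf_volumes(n_keywords, head_volume=10000):
    """Monthly search volume for each keyword rank, from 1 to n_keywords."""
    return [head_volume // rank for rank in range(1, n_keywords + 1)]

volumes = zipf_volumes(1000)

head = sum(volumes[:10])   # the few frequently searched "head" terms
tail = sum(volumes[10:])   # the many rarely searched "tail" terms

print(f"top 10 keywords:  {head} searches")
print(f"remaining 990:    {tail} searches")
```

Run it and the tail's combined volume is of the same order as the head's, which is why a strategy that only works on head terms leaves so much traffic on the table.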

For a site to suddenly capture a larger portion of the keywords, what needs to happen is that the algorithm needs to return results that are less specific. For example, site "one" ranks #1 for "iPhone5" because it provides in-depth information about that specific device. Site "two" has slightly lower "ranking power" (for lack of a better term) but provides detailed information on places where one can purchase the iPhone, so it ranks #1 for "Where to buy a new Apple iPhone5". Now, if the algo is updated to give less importance to the specificity of the content, it would then accord the top rank to site "one" for both of these terms.

This does seem like a likely strategy for Google, because over time we have seen them try to better understand user intention, exactly for the purpose of returning specific and accurate results. But it is possible that over time the algo has become too good, and that what users "intend" with their search queries is not really what they want. So by returning a more general result, they provide broader content that has a higher probability of fulfilling the user's wants. Personally I think this is a stretch, but not impossible.

Strictly speaking this is not really to do with the long tail, as a site that ranks for "model xa3c" should be largely unaffected; this is more to do with specificity. My main site falls under the "model xa3c" pattern and I have been impacted, so again I have my doubts.

Mentat

2:25 pm on Oct 21, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@MayankParmar - yes, big drop today, with no holiday => google update.

aristotle

4:08 pm on Oct 21, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



People who keep complaining about google's search results should keep in mind that oftentimes there isn't much to choose from. The vast majority of websites are low quality. If you're looking for high-quality detailed information on a particular topic, there's a good chance that it doesn't exist anywhere on the web. How can you expect google to find something that doesn't exist?

justpassing

4:43 pm on Oct 21, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



People who keep complaining about google's search results should keep in mind that oftentimes there isn't much to choose from. The vast majority of websites are low quality. If you're looking for high-quality detailed information on a particular topic, there's a good chance that it doesn't exist anywhere on the web. How can you expect google to find something that doesn't exist?

People are complaining that their sites which are the best in the World, in their opinion, are not ranking first, as they think it should ...

MayankParmar

5:49 pm on Oct 21, 2018 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



I would not say my site is the best, but how come thin tag pages and garbage pages from new sites have managed to outrank it?

One recent example: people opened my article ranking #6 because the top 5 results were garbage promoting a fake app for affiliate commission. Google demoted my article to the 3rd page after this/last week's update. I know my article helped people, as they commented and emailed just to thank me for detailing a solution. The article got decent backlinks as well :)

There was another example where a company provided me a statement; all the other big sites picked up the story, but I never appeared in Top Stories. Nothing works when Google singles you out in an update :)

WhoKnows111

6:12 pm on Oct 21, 2018 (gmt 0)

10+ Year Member Top Contributors Of The Month



People are complaining that their sites which are the best in the World, in their opinion, are not ranking first, as they think it should ...


Ignoring my websites... when I search for information (most importantly research when I travel, which is most of the year, or opinions on something I don't fully understand), I don't want to see generic garbage by X Newspaper, whose author probably never visited the place or used the product. They can't even be bothered to have original pictures, and just use the stuff from the media kit.

I like to see the passionate bloggers and people who specialize in the area or products. I often have to go to the 3rd page, or be very creative with the keywords in the search, to find them.

F* to see Booking and Tripadvisor plastered all over.

Btw, I don't have a website in the travel niche; that's purely from my user perspective.

whoa182

7:45 pm on Oct 21, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



Traffic is very low today...

I manually checked some of my keywords and there doesn't seem to be much difference in my rankings for my highest traffic articles. But I did drop a few positions on a handful of articles (main keyword), only to be replaced by generic, non-relevant articles which are published by newspapers and big brands (they are very new as well, like 3 weeks old).

It's as if Google is trying to be a bit too clever and devaluing title/h1 and looking at the whole article (trying to understand it), then ranking bigger authority sites above content that is 'all about a given topic' even if the big authority sites only contain just a sentence or paragraph on the topic.

It's funny to see your articles drop like that when they took many months to climb to the top, presumably from Google testing and ranking factors. To get to the top and stay there for months must've meant that users were happy (and I could tell from conversions).

Lately, it seems like this: Write informative articles, rise to the top spot of google, get kicked by Google to the bottom of page 1 or 2. Slowly rise again to the top, get kicked back down again... Rise again, get kicked back down again.

Anyway, I hope this is just a temporary dip in traffic that sometimes happens after updates.

But it is pretty quiet...

broccoli

9:14 pm on Oct 21, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



@NickMNS

For a site to suddenly capture a larger portion of the keywords, what needs to happen is that the algorithm needs to return results that are less specific.


I believe so, though that may not be the primary intent. I suspect they may have been testing these algorithm changes on medical and news sites which are all huge sites in a crowded marketplace, and they don’t really care what they do to the rest of us because they’re so focussed on crushing fake news and alternative medical advice. I think this is the spam team getting out of control because they ran out of spam to clean up.

I’ll try and explain what I mean about Pareto distribution. I mean they’ve taken the highest ranking sites for the biggest volume keywords and they’ve pushed them UP the rankings for lower volume keywords where they have pages for those keywords.

For example, take the keyword “quiz” (this is NOT my keyword but in a similar niche). It’s a high volume keyword and Google doesn’t know what you want when you search for “quiz”. There are different kinds of quiz sites - e.g. sports, personality, pop culture. So Google generally returns the top sites in each sub niche, and any generic sites. So imagine someone decides to target the search term “quiz” and makes a site with the top thirty or so quiz related keywords, even though they have little in common with each other. So they have a site with sports quizzes, disney princess quizzes, personality quizzes, etc that ranks highly for “quiz.” We’ll call that Site A.

Before the changes this year, Google would not reward that kind of site in the longer tail/lower volume searches. For example, if I had Site B, which is a pop culture quiz site, I would rank top for things like disney princess quiz, harry potter quiz, etc above Site A. But my site would not rank that highly under the generic “quiz” keyword because it isn’t comprehensive for that term, it doesn’t cover sports or personality quizzes and it isn’t supposed to.

But what I’m seeing in my niche is Google has taken Site A that ranks top for “quiz” and has pushed it up to the top above all of Site B’s pages where both sites have a page for “disney princess quiz” or whatever. It’s not just me (Site B) who has been affected. Site C which specialises in sports quizzes, and Site D which specialises in personality quizzes have also been pushed down under Site A where they have the same pages, even though Site A’s content is terrible and Site A has no backlinks.

I imagine that this algorithm change in the medical serps would cause big authority medical sites to rise to the top for all queries, which is not such a bad thing. But in my weird little niche it has really messed things up.

That’s really what I mean by Google sharpening the Pareto distribution. They call it “rewarding previously under rewarded websites.” And when they say “nothing can be done about it, just continue making great content,” what they mean is, make all the pages Site A has that we think your site is missing, because we’ve changed the way we rank topical relevancy.

So I think the only thing I can do is become like Site A but better, and hope my backlinks give me the extra power to kick them off the top spot so I get all of my rankings back in my own sub niche.

As I say, I think it’s foolish. It’s going to take the black hats about 5 minutes to figure this out and in six months we’ll have low effort sites galore dominating the niche serps.

For what it’s worth I think Google are probably doing the opposite in local results because of the nature of the spam there.

NickMNS

10:03 pm on Oct 21, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That’s really what I mean by Google sharpening the Pareto distribution.

I don't get what you mean by "sharpening a distribution"; what are you distributing, exactly? In my post I'm referring to the distribution of keyword frequency, where one graphs all the keywords (along the x-axis) and, for each, counts the number of times a user has searched for that keyword (y-axis). This is something that is outside the control of Google.

I really believe that you are overthinking this. If a site gains in ranking, that gain will be reflected across all the keywords that are relevant to that site, so it is normal that it will begin to appear in the SERPs where it hasn't previously; by the same token, sites that have now fallen in rank will appear in the SERPs for a much smaller set of keywords. That is it: simple and expected behavior.

broccoli

11:15 pm on Oct 21, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



I’m an engineer, it’s my job to figure out how things really work, and the simple explanation doesn’t match the pattern I’m seeing.

I’m talking about the distribution of sites, not keywords. Google scores us on the overall topical relevancy, backlink authority, EAT, technical quality, and content quality of our sites. They changed the way topical relevancy works to reward more generic sites, and they probably upped the importance of site scores over page scores. Those changes steepened the Pareto distribution between sites that win and sites that lose. Fewer sites win, but they win more. That’s all I meant with that comment.

I guess I’m still not doing a good job of explaining it.

But let’s wait and see. If I’m right and I get my rankings back I’ll let you guys know.

aristotle

12:49 am on Oct 22, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



broccoli -- Apparently your intended approach is to try to analyze how google's algorithm works and then design your site accordingly.

In my opinion it's better to design your site to your own liking and your own standards, while mainly thinking about your visitors rather than about google's algorithm.

jmorgan

6:27 am on Oct 22, 2018 (gmt 0)

10+ Year Member Top Contributors Of The Month



Looks like the real-time analytics stats might be back to normal. We can go back to checking it every minute again. :)

broccoli

8:45 am on Oct 22, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



@aristotle That’s what I did previously. My bounce rate is incredibly low so I know users like my site. Google punished me anyway. If I was doing this purely for fun, sure. But I have autoimmune problems and inconsistent health and I need a way to put food on the table without going into employment. My website sustained me with a decent full time income up until spring this year, and I want it back.

JesterMagic

11:36 am on Oct 22, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@aristole and @broccoli

I give the most weight to content, but everything else (design, SEO, etc.) comes in a close second. You have to take Google's algorithm into consideration, as the top 5 positions are really the only ones that generate any traffic (and it decreases exponentially from the first spot). It's not like the old desktop-only days, where even being on the second page would generate some traffic.

Once you lose those top 3 spots to big brands or some site that has come out of nowhere, you need to view your site from the perspective of Google's algorithm to see what else you can change, if you think you are already doing a good job in other areas.

BTW @broccoli I have autoimmune issues as well, life certainly likes to throw curve balls

renatovieira

5:36 pm on Oct 22, 2018 (gmt 0)

10+ Year Member Top Contributors Of The Month



Anyone seeing a big drop today?

tourism, usa

jmorgan

6:02 pm on Oct 22, 2018 (gmt 0)

10+ Year Member Top Contributors Of The Month



@broccoli Google's algorithms are geared toward trying to reward sites that provide value to users. So by focusing on your users, you are indirectly working towards the same ends as Google's algorithms. Of course, the algorithm doesn't always get it right and spammy or low-quality sites may sometimes be rewarded (or always be rewarded if you read this forum). But while Google's algorithms will always change, I presume their goal of at least having the intention to reward good quality sites won't.

Shepherd

6:17 pm on Oct 22, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google's algorithms are geared toward... making google money.

Times are a-changing. Making money from Google organic results is going to take some next-level thinking; one will need to be slick and sly, and not affect Google's money.

ichthyous

9:10 pm on Oct 22, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



But while Google's algorithms will always change, I presume their goal of at least having the intention to reward good quality sites won't.


I fear that your presumption might be outdated, jmorgan. Good sites that rank organically are the last barrier cutting into Google's bottom line. They need to squeeze all of us now to keep revenue growing, and the party is now definitively over.

There is no reason why my site, which runs no ads whatsoever, is 15 years old, has tons of excellent broad-based links, and loads fast, should suddenly reverse course after 15 months of slowly climbing, drop off a cliff, and lose one third of its traffic and almost all its business in the course of a few days. It's not improving; it's getting worse day after day. Nothing changed at all on my site.

Google is running out of ways to increase revenue through search, period. This entire summer I had very high traffic and a suspiciously low amount of inquiries...not the norm for my business. Then suddenly a flood of inquiries in one day, then off again. The patterns keep shifting with clear intent... We are just the lab mice in Google's profit maximization experiments now.

Milchan

9:43 pm on Oct 22, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



@ichthyous - I feel the same. I don't believe quality sites get rewarded anymore. The SERPs are a mess in that regard, and I have also seen many low or lower-quality sites get rewarded in the last months. I also experience inexplicable traffic patterns, like traffic numbers increasing for periods but without any conversions (I'm experiencing this the last few days; last week, when the latest update rolled out, I had less traffic and lots of conversions, but then it settled back into the opposite and is worse than ever now).
I have already given notice on my office and am having to let staff go this week (including someone who is currently in hospital and will have no way to pay the bills, has a young family, etc.; it's breaking my heart). This is the real reality of what Google has done to businesses that performed well for a decade: in a matter of months they have been destroyed, along with the lives of many people. I'm trying to cling on and make enough money to pay my own personal bills now, but with a plan to completely bail next year, as there is no future in this anymore.

Gregorich SEO

9:46 pm on Oct 22, 2018 (gmt 0)

5+ Year Member



@ichthyous

Same thing happened to me.

My question is, how would Google profit from putting some crappy site above mine? By forcing me to pay for AdWords? Well, AdWords isn't allowed in my niche because it's cannabis related. So that theory doesn't work in my case.

What if Google is simply favoring pages that best answer search intent, regardless of their overall quality?

Then we'd need to search the keyword we lost, see what's ranking, and ask what that page has that ours doesn't.

[edited by: Gregorich_SEO at 10:02 pm (utc) on Oct 22, 2018]

This 553 message thread spans 19 pages.