
Forum Moderators: Robert Charlton & goodroi

Google Updates and SERP Changes - November 2018

     
9:14 am on Nov 1, 2018 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Feb 11, 2017
posts:99
votes: 28



System: The following 18 messages were cut out of thread at: https://www.webmasterworld.com/google/4922186.htm [webmasterworld.com] by robert_charlton - 3:04 pm on Nov 1, 2018 (PDT -8)


Facing big changes. Is something happening again? I'm pretty tired of Google updates over the last few months... Niche: IT how-tos, technology...
4:55 pm on Nov 22, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:June 19, 2008
posts:1305
votes: 98


Well, something to be shared by only a few.


Yes, on Google there is nothing, but the users all seem to be at Amazon right now. All the conversions lost from Google are coming back doubled from Amazon. Nice side effect: AdWords campaigns are paused.

This is the way to make users unsatisfied and lose them, Google. Short-sighted, stock-price view.
5:03 pm on Nov 22, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2363
votes: 627


@Cralamarre
Let me add that if you are an AdSense publisher, even if traffic is lower than usual for the next few days, RPM is at its peak in the lead-up to Christmas. Once Christmas arrives, traffic will be low and RPM will be at its lowest level. Today is a great day to make small adjustments and optimizations to take full advantage of the next few weeks.
6:25 pm on Nov 22, 2018 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Jan 19, 2017
posts:635
votes: 235


@NickMNS
I've already optimized my site the best I can, including ad placements, so my strategy is just to keep up with business as usual, writing and publishing new content. The more content in the lead-up to Xmas, the more potential for revenue.

If AdSense in January 2019 ends up being as bad as it was in January 2018, I'll need all the revenue from the next few weeks that I can get.
9:02 am on Nov 23, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:June 19, 2008
posts:1305
votes: 98


I don't know how you are doing, but we got hit again last night. We have now lost 62% since October 25.

And this even though we removed thin content.
Reworked content to be on point.
Decreased keyword density (content and links).
Merged closely related items (color, length, units).

But Google seems to like sites that have multiple entries for closely related items, where you end up with endless lists.
As a user, I prefer to see one item with choices inside it. It is much more clearly presented.
11:46 am on Nov 23, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3526
votes: 325


And this even though we removed thin content.
Reworked content to be on point.
Decreased keyword density (content and links).
Merged closely related items (color, length, units).

When you make a lot of quick changes to a site, oftentimes the INITIAL result is a big drop in rankings and traffic. You might have to wait several months to begin to see any positive effects from the changes.

Also if you keep making big changes again and again, google's algorithm might eventually lose all trust in your site and it could never recover.
12:04 pm on Nov 23, 2018 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Sept 13, 2018
posts:355
votes: 68


Agreed with aristotle.

Part of the trustworthiness / authority / ... that Google takes into consideration is also based on how "stable" a site is. If a site changes too much (I am not talking about new articles/pages/content), this has a negative impact. Why would you trust a site that is deeply different every day (an exaggeration)? Also, removing content is not a good idea in my view. Google indexes a page, just to find out that the page is gone the next day. This can send a signal that the content is not trustworthy.

Also, making many changes, often, can look like an attempt to game the ranking algorithm.
1:18 pm on Nov 23, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2363
votes: 627


@Martin Ice Web, we saw a spike in traffic at 5pm EST yesterday (more than +100% vs. the same day last week), after which traffic continued at about +20% above pre-spike levels. Overall the day ended down 4% (compared to the same day last week). But it was Thanksgiving, so 4% down is like 20% up, and traffic patterns are bound to be unusual. Conclusion: we'll have to wait and see. Things look normal so far today, but it's still early morning.
1:53 pm on Nov 23, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:June 19, 2008
posts:1305
votes: 98


@NickMNS, we are in Germany, and Thanksgiving is not celebrated here.

@just @aristotle
To be more clear: we did not remove content, we merged it onto one page with options.
So instead of having a summary page for, let's say, network cables in red, blue, green... we now have one page with all colors. (You often see this in clothing shops.)
The old pages differed only in color; the description was the same. The old pages are gone, but they 301 to the new page, and the new page is not actually new, it was the subcategory page. All sub-pages have a correct canonical link to the main page.

I know that bigger changes can have the effect of traffic drops. But it was necessary in order to maintain the 70,000 items we sell.
We also got rid of duplicate content, and click depth is one level shallower now.
On a mobile phone you don't have to scroll through long lists.

Only advantages.
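Since a consolidation like this stands or falls on the 301s and canonicals being wired up correctly, a small crawl check is a cheap way to verify nothing was missed. A minimal sketch in Python, assuming the `requests` and `beautifulsoup4` libraries; all URLs are invented stand-ins for the real variant and subcategory pages:

```python
# Minimal check that retired color-variant URLs 301 to the merged page and that
# the merged page declares the expected canonical. All URLs are hypothetical.
import requests
from bs4 import BeautifulSoup

MERGED = "https://example.com/network-cables/"            # hypothetical merged page
OLD_VARIANTS = [
    "https://example.com/network-cables-red/",             # hypothetical old variant pages
    "https://example.com/network-cables-blue/",
    "https://example.com/network-cables-green/",
]

for url in OLD_VARIANTS:
    r = requests.get(url, allow_redirects=False, timeout=10)
    target = r.headers.get("Location", "")
    ok = r.status_code == 301 and target.rstrip("/") == MERGED.rstrip("/")
    print(f"{url} -> {r.status_code} {target or '(no Location)'} {'OK' if ok else 'CHECK'}")

# Confirm the merged page's canonical points at itself
soup = BeautifulSoup(requests.get(MERGED, timeout=10).text, "html.parser")
canonical = soup.find("link", rel="canonical")
print("canonical:", canonical.get("href") if canonical else "MISSING")
```

Variants answering with 302s, redirect chains, or a missing or wrong canonical are the usual suspects when a consolidated page sheds rankings after this kind of restructuring.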
2:15 pm on Nov 23, 2018 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Jan 19, 2017
posts:635
votes: 235


My traffic was down yesterday as expected due to schools being closed in the US, but Google traffic was actually up 26% over Thanksgiving last year. In September, I saw a surge in traffic. Then I lost most of it in October, reverting to normal levels. But November has seen a slow and steady climb upwards again.
6:50 pm on Nov 23, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3365
votes: 707


No big changes here. Just the normal, gradual seasonal slide (which happens in the run-up to Christmas every year), albeit with a slight and unexpected boost in the last couple of days that probably has nothing to do with Google.
10:39 pm on Nov 23, 2018 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Sept 10, 2018
posts: 161
votes: 24


This is so annoying. I've been told to reduce my keyword density on a handful of pages because I look over optimised on them compared to the rest of the serp. Except the rest of the serp is desperately under optimised and most of the pages yammer on about nothing or wander off topic. So when I actually talk on topic, I look over optimised! There are a couple of pages where I'm literally having to refer to tools I invented in euphemisms for their proper names because a bunch of second raters wrote copycat pages with bad copy!
10:49 pm on Nov 23, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2363
votes: 627


I've been told to reduce my keyword density on a handful of pages

Hey 1998 is calling, they want their SEO back.... Lol keyword density.
11:34 pm on Nov 23, 2018 (gmt 0)

Full Member

10+ Year Member Top Contributors Of The Month

joined:Dec 7, 2005
posts:296
votes: 47


If SEMrush is any guide, there wasn't any update in the last few days, so if your site took a hit, it may be due to technical issues on your part, or Google specifically targeted your site for a demotion (which is doubtful).

In addition, when you change the content or design of a page, even if it's an improvement, Google sometimes demotes the page temporarily (I assume to collect more information/signals about it). It's almost like a leap of faith: you have to trust that the improvements you've made are, in fact, improvements and that Google will eventually agree.
2:03 am on Nov 24, 2018 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Apr 15, 2004
posts:541
votes: 81


Did they move the cheese again?
3:05 am on Nov 24, 2018 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Sept 10, 2018
posts: 161
votes: 24


Lol keyword density.


I wish I were joking, believe me, but keyword density is an integral part of Penguin, and Penguin is now an integral part of the core algorithm.

I reckon eighty percent of the people on this forum whose businesses are going down in flames have been hit by Penguin without even knowing it. Google has been dialling it up very slowly since it was released at the end of 2016.
4:29 am on Nov 24, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2363
votes: 627


@broccoli
but keyword density is an integral part of penguin and penguin is now an integral part of the core algorithm.

It's time to wake up. When you're messing with things like keyword density and hoping to regain the traffic lost in the last few updates, it is time to take an introspective look at yourself and your business model. I'll give you the benefit of the doubt and pretend keyword density is a ranking factor (it isn't). How much of a benefit do you expect from optimizing for it? A 10% gain? You're still something like 300% short. How much time and how many resources do you think it will take to find the optimal setting for each and every page? How much money will you waste paying for bogus third-party metrics whose business models revolve around making you question your compliance with BS ranking factors and metrics such as keyword density?

You need to refocus and make substantive changes: new features, expanded or more focused targeting, new positioning in your niche/market, or a whole new business that builds on the strengths of the current one.
4:52 am on Nov 24, 2018 (gmt 0)

New User

joined:Oct 11, 2017
posts:13
votes: 9


@Broccoli You are viewing the results in the wrong light. You say the top pages are under-optimized. Who made you the arbiter of the SERPs? I think Google is showing you exactly what they are rewarding. However, keyword density is only a small part of page optimization. Where you place those keywords is much more important than how many times you use them on a page. You need to run tf-idf on the keywords that have lost rankings and compare them to what Google is rewarding. You cannot dictate what is optimized. You can only take the clues from the SERPs.

And yes, many here will say keywords are dead. Tell that to the guy who ranks for over 20k keywords in one year :)
10:25 am on Nov 24, 2018 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Sept 10, 2018
posts: 161
votes: 24


@NickMNS

Its time to wake up.


I'm not paying any third parties for anything. I can barely afford to eat right now. I just finally got some help.

I didn't say keywords were a ranking factor (they are); I said they were part of Penguin, and Penguin is a penalty that takes away your link juice.

If your keyword density is too much higher than the SERP average, they will treat it as a spam signal, add it up with a bunch of on-page trust signals, and if they don't like the overall picture they take away the power of your exact-match backlinks. They can't tell who is a spammer and who isn't, and they don't even care. They just want their SERPs to look corporate, so they're hitting everyone who doesn't look corporate.

@Pontificus_Maximux

Who made you the arbiter of the SERPs?


Congratulations on your rankings!

In this case, I'm talking about original tools I've written that I talk about naturally, and that people search for by exact-match query. Later, other people came along, made rubbish copies of them, and wrote rubbish content, which has brought the average keyword density of the SERP down. I'm not talking about SEOs or journalists; I'm talking about programmers who don't know how to make user interfaces, write copy, or decide what to write about.

Google is not the arbiter of what it wants the SERP to be; it allows the SERP to be the arbiter. If I have to go through my pages, remove the clear, user-friendly labelling from my forms and the name of my tool from the paragraph copy all but once, and refer to it by euphemisms like "the tool", then yeah, it's a massive weakness in their algorithm. I get a Penguin penalty because everyone else in the SERP is atrocious at user experience.
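For what it's worth, the "density relative to the rest of the SERP" comparison described here is easy to measure yourself, whatever weight Google actually gives it. A rough sketch, assuming `requests` and `beautifulsoup4`; the phrase, the URLs, and the naive tokenization are illustrative only:

```python
# Rough keyword-density comparison: your page vs. pages currently ranking for the
# same query. Density = phrase occurrences / total words. URLs and phrase are made up.
import re
import requests
from bs4 import BeautifulSoup

def density(url: str, phrase: str) -> float:
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ").lower()
    words = re.findall(r"[a-z0-9']+", text)
    return text.count(phrase.lower()) / max(len(words), 1)

PHRASE = "widget analyzer"                        # hypothetical tool name
MY_PAGE = "https://example.com/widget-analyzer"   # hypothetical
COMPETITORS = [                                   # hypothetical ranking pages
    "https://example.org/tools/widget-analyzer",
    "https://example.net/online-widget-analyzer",
]

mine = density(MY_PAGE, PHRASE)
serp_avg = sum(density(u, PHRASE) for u in COMPETITORS) / len(COMPETITORS)
print(f"my density: {mine:.3%}  SERP average: {serp_avg:.3%}")
```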
12:14 pm on Nov 24, 2018 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Sept 10, 2018
posts: 161
votes: 24


I should add that there are pages at the other end of the spectrum, where I'm competing with big corporate businesses who have created professional tools. There I've written just as naturally, but my keyword density is WAAAAY under-optimised compared to the rest of the SERP.

They've made it like a fiat currency. I'm sure there's a gold-standard cut-off at some point, but for the most part it's pure relativity. No wonder my site got wrecked by it.
1:38 pm on Nov 24, 2018 (gmt 0)

New User

joined:Oct 11, 2017
posts:13
votes: 9


@Broccoli I personally have found that tf-idf is different for every niche and even varies with every keyword. However, if you keep your page's keyword usage in the different areas (content, images, h tags, etc.) to within 1.5% of what is being rewarded in the SERPs, you get a bounce upward. One size doesn't fit all.
2:34 pm on Nov 24, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3526
votes: 325


Are you designing your site for google, or are you designing it for your visitors?

Are you writing your articles for google, or are you writing them for your visitors?
3:23 pm on Nov 24, 2018 (gmt 0)

New User

joined:Oct 11, 2017
posts:13
votes: 9


@aristotle That rests on the fallacy that it must be either/or. I design and write, or have written, for both.
3:53 pm on Nov 24, 2018 (gmt 0)

New User

Top Contributors Of The Month

joined:July 24, 2018
posts:28
votes: 2


In my current niches, "beauty and supplements", my competitors are writing for Google and are ranking well. In fact, they have probably made more money in the last 3 months than I have made in a lifetime. Really frustrating. Having said that, I am replicating what they are doing and am finally seeing a little bit of income coming back. This includes buying expired domains, spammy 301 redirects from expired domains, etc. Doing this takes twice the effort, but until my other sites come back (if they ever do) I gotta do what works.

When I see folks fixating on page speed, keyword density, or authorship, it is mind-boggling, because in a lot of niches that are ranking well I am not seeing any of that at all!

I thought the August update would be short-lived and that it was just some ranking adjustments, but it is clearly here to stay for a while.
3:54 pm on Nov 24, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3526
votes: 325


Pontificus Maximux -- You're right - I should have said "primarily".

Oftentimes there's no conflict, but if there is, you should usually give priority to your visitors.
4:30 pm on Nov 24, 2018 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Sept 10, 2018
posts: 161
votes: 24


@Pontificus Maximux Thank you, I will bear that in mind: keep below that limit and treat each element on the page separately.
4:47 pm on Nov 24, 2018 (gmt 0)

New User

joined:Oct 11, 2017
posts:13
votes: 9


@Broccoli, also make sure you compare apples to apples. Don't compare blog posts, info content, or affiliate content to category pages or ecommerce pages. For instance, don't compare your "best widgets for blue people" page to an Amazon category page or a YouTube video. Compare it to what Google is ranking that is similar to your page.

Keep your total word count within 10% of the average of the top 10 results.

I do this:
#1. I compare keywords (keywords = exact match, partial match, semantic match) in the title, H1, meta title, and URL.
#2. I compare keywords in the h2-h4 tags.
#3. I compare keywords in the content.
#4. I compare keywords in <li> tags.
#5. I compare keywords in images (file name and alt tag).
Those are in order of importance. I do this only if I am struggling to rank; I can usually rank for long-tail keywords without link building. If, after doing this, I still can't rank, then I work on link building.
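A quick way to run that element-by-element comparison on a competing page is to parse it and count keyword hits per element. A minimal sketch, assuming `requests` and `beautifulsoup4`; the keyword, URL, and choice of elements are placeholders rather than a claim about what Google weighs:

```python
# Count occurrences of a keyword in the page elements listed above.
# Keyword and URL are placeholders; partial/semantic matches would need more work.
import requests
from bs4 import BeautifulSoup

def element_counts(url: str, keyword: str) -> dict:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    kw = keyword.lower()

    def hits(texts):
        return sum(t.lower().count(kw) for t in texts)

    return {
        "title": hits([soup.title.get_text()] if soup.title else []),
        "h1":    hits(h.get_text() for h in soup.find_all("h1")),
        "h2-h4": hits(h.get_text() for h in soup.find_all(["h2", "h3", "h4"])),
        "body":  hits([soup.get_text(" ")]),
        "li":    hits(li.get_text() for li in soup.find_all("li")),
        "img":   hits(img.get("alt", "") + " " + img.get("src", "")
                      for img in soup.find_all("img")),
        "url":   hits([url]),
    }

# Run it on your own page and on each ranking page, then compare counts per element.
print(element_counts("https://example.com/best-widgets-for-blue-people", "blue widgets"))
```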
8:24 pm on Nov 24, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2363
votes: 627


If your keyword density is too much higher than the SERP average, they will treat it as a spam signal, add it up with a bunch of on-page trust signals, and if they don't like the overall picture they take away the power of your exact-match backlinks

This is not the case. Google has been quite explicit about this, and if I had time, which I don't, I would dig up links to such statements. What Google has said regarding keyword density is that if it is pushed to an extreme, it can be used as a signal of pure spam. That is a signal that would earn your site a manual penalty. Otherwise, keyword density is ignored.

In addition, the concept of tf-idf is not used as a measure of keyword density; it is used as a measure of relevance. One takes a corpus, or collection, of webpages, then for each page in the collection counts the occurrences of the various "terms" on the page, measures the frequency of those terms within the text, and takes an average across the webpages. For example, take a topic like baseball and a term like "red socks": across 100 baseball webpages, the term "red socks" might appear on average 3.2 times per page. So if you have a webpage with the term "red socks" on it 3 times, the probability that the page is about baseball is relatively high, whereas if it appears 10 times, the page might actually be about the socks one wears on one's feet that happen to be red.

Now, it should be obvious that there are problems with this metric, so it can be and has been refined. Maybe count the co-occurrence of terms. Maybe the values differ for short vs. long texts, etc. The best approach would be to take all the variations and feed them into one large neural net; call it RankBrain.

The bottom line is that this measure tries to determine the relevance of a term based on naturally written texts, so that a website about socks is not returned for a search for the "Red Socks" baseball team. There is no way to know what the optimal values are for any given term, so the next best thing is simply to write text naturally, for humans, in proper, grammatically correct English (or whatever language your site is in).

In the end, if your term frequency generates a false positive in the search algo, that will become exceedingly clear from the keywords that are driving traffic to your site. You can then go and tweak your text with a few more references to baseball or whatever. I have never seen such a situation, nor heard of one occurring. Google is really good at determining the relevance of a webpage.

But if you feel that adding one more occurrence of "Red Socks" is going to make the difference, then go ahead, and good luck.
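For readers who want to see the mechanics, the per-term frequency weighting described above is roughly what off-the-shelf tf-idf tooling computes. A toy sketch using scikit-learn's TfidfVectorizer; the three-document corpus is invented purely for illustration, and real retrieval systems obviously work at a very different scale with many refinements:

```python
# Toy tf-idf over a tiny corpus: the same phrase scores differently depending on how
# its frequency on a page compares with the rest of the collection. Invented documents.
from sklearn.feature_extraction.text import TfidfVectorizer

corpus = [
    "red socks beat the yankees in the playoffs last night",     # baseball-ish page
    "the red socks pitching staff dominated the whole season",    # baseball-ish page
    "buy warm red socks and thick wool socks for cold winters",   # literal-socks page
]

vectorizer = TfidfVectorizer(ngram_range=(1, 2))   # unigrams and bigrams
matrix = vectorizer.fit_transform(corpus)

idx = vectorizer.vocabulary_["red socks"]
for i, doc in enumerate(corpus):
    print(f"doc {i}: tf-idf('red socks') = {matrix[i, idx]:.3f}")
```

Even in this toy example, the idf part downweights the bigram because it appears in every document; whether any of this resembles what Google actually runs is, of course, anyone's guess.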
10:58 pm on Nov 24, 2018 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Sept 10, 2018
posts: 161
votes: 24


Thank you @Pontificus Maximux, that's very useful information. I'm going to re-run my tests and make sure to exclude YouTube videos and the like.

@NickMNS

But if you feel that adding one more occurrence of "Red Socks" is going to make the difference then go ahead, and good luck.


Perhaps I could have been clearer: I'm largely removing keywords due to a suspected Penguin penalty. I have a lot of natural exact-match backlinks due to the way people share my site. But TF-IDF *is* a ranking factor.

I don't believe a word Google says about their algorithm anymore. They put out misinformation all the time.

Guys,

I'm going to indulge in a bit of carefully thought out speculation here.

Google is stupid and can't tell the difference between legitimate sites and spam sites (that's why they have manual actions). Penguin assesses the trustworthiness of each of your links individually, based on a confluence of factors, including the page they came from, your current on-page keyword percentage, and other trust signals. The people at Google don't care about collateral damage. Matt Cutts was probably the only person who acted as a voice for webmasters.

A lot of long-established sites have been on a long, slow decline since late 2016, when Penguin 4.0 was introduced, or early 2017, which is when they seem to have added more quality signals into Penguin.

I am convinced that Penguin is affecting a large percentage of sites these days. Google will deign to give you the power from your backlinks if they think you deserve it, and this year that means only if you look like a corporation. If your site is in decline and you can't 100% explain the decline via snippets or other factors, you should look into this.

I'm also convinced that Penguin has an active demotion component as well as the link-ignoring one. Why have some fresh thin-content sites started ranking well? Because they got lucky with a few on-page trust signals and keyword percentages, and they don't have any backlinks, so they can't get demoted by Penguin. Why have thin-content pages from corporations started ranking well? Because corporate signals prevent Penguin penalties.

I also suspect they won't rank you properly in the neural matching / super synonyms algorithm if your percentage of Penguined links is too high, no matter how relevant your page is, which is probably part of the reason why the algorithm doesn't work properly.

A lot of people here seem to have been struggling since March.

We know they did something to Penguin in March, because that's when exact-match domains came out of nowhere, teleported into the top of niches, and even got rewarded with sitelinks. Maybe they took out some legacy code by accident, or maybe they made Penguin even more aggressive, but they have the problem that they can't properly distinguish between EMDs and brands.

Remember the early versions of Penguin penalised a bunch of brand sites and individuals with a high percentage of exact match backlinks? They have trouble telling the two apart. That's why they keep saying to focus on building your brand. They recognise your website as an entity and they have to give exact match links pointing at your website a free pass so they don't penalise companies like Amazon by mistake. So because they can't properly tell EMDs from brands, the EMDs are getting a free ride, while everyone else is getting various percentages of their backlinks penalised. That's why brands and EMDs are doing so well this year.

Why is it taking so long to get new sites indexed at the moment? Why so many apparently link-based updates recently? Because they know Penguin is messed up and they're trying to fix it, and every time they tweak the algorithm they have to recrawl the entire web and reassess everyone's links against their target pages one at a time. The server load must be insane.
12:57 am on Nov 25, 2018 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Jan 24, 2016
posts: 78
votes: 19


@Broccoli, I think this hits the nail right on the head. I'm going to steal it:

> they have the problem that they can't properly distinguish between EMDs and brands

I kind of wish I could share my niche just so people who might not have been affected by the EMD push since March can really see the damage Google has done. It's absolutely mind-blowing how bad they are at figuring this out.
3:01 pm on Nov 25, 2018 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3526
votes: 325


Some members keep speculating about what changes Google has been making to its ranking algorithm. Most of this speculation doesn't look very plausible to me.

In my view the main thing google has done is to increase the weight of perceived trust and authority. Google recognized the importance of trust and authority many years ago, but has always had difficulty finding ways to measure it. Apparently they think that they've finally developed some reliable methods for doing so. But no one here knows what these methods are.
