
Forum Moderators: Robert Charlton & goodroi


Google Updates and SERP Changes - September 2019

     
8:38 am on Sep 2, 2019 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Sept 25, 2017
posts: 161
votes: 38


- Split out of...
Google Updates and SERP Changes - September 2019
https://www.webmasterworld.com/google/4957935.htm [webmasterworld.com]
by robert_charlton - 1:31 am on Sep 2, 2019



I've seen a drop comparable to the one at the first of June. Like a lot of people on here (I'm becoming increasingly frustrated), I feel like a lot of the drops are down to the size of our brand and not the state of our SEO. Conversations internally in the business mean that I am going to have to lean away from some of the principles that I have leaned on and tweaked for maybe 10 years. I'm going to redesign the site.

We have a high proportion of non-indexable pages. Whilst these obviously don't rank, I'm starting to wonder whether the high proportion of these pages across the site is having an impact on our indexable pages. I know that might sound crazy. In my head, I don't understand how that would be an issue, but maybe Google thinks I'm trying to hide something?

[edited by: Robert_Charlton at 1:56 pm (utc) on Sep 2, 2019]
[edit reason] cleanup after splitting & combining threads [/edit]

9:00 pm on Sept 23, 2019 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Dec 18, 2018
posts:89
votes: 75


@NickMNS - Yes, maths is maths.
9:17 pm on Sept 23, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2711
votes: 822


@StupidIntelligent
Yes, maths is maths.

Your math doesn't add up. Feel free to believe in your voodoo traffic throttling, or whatever you would like to call it. The simple fact that Google ranks websites is a kind of "throttling", but beyond that it is simply false attribution of random events. Whether you choose to believe one thing or the other isn't really going to change it (the math is actually pretty clear about this stuff; see the Wikipedia link in my earlier post). But personally, I'm going to spend my time on more constructive things than watching my analytics graphs all day, lamenting and theorizing about how Google is doing me harm.
9:25 pm on Sept 23, 2019 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Dec 18, 2018
posts:89
votes: 75


@NickMNS - No one is stopping you from living your life. Enjoy.
11:50 pm on Sept 23, 2019 (gmt 0)

New User from AU 

joined:Nov 20, 2018
posts:4
votes: 2


Branding is one of the most important aspects of modern-day SEO (although this is highly dependent on your niche, location, and value proposition).

In the current climate of SEO, you are not going to rank for many head terms unless you can compete with the brands that run your niche.

It's an unfortunate truth for small businesses.
1:06 am on Sept 24, 2019 (gmt 0)

Preferred Member

5+ Year Member Top Contributors Of The Month

joined:Oct 29, 2012
posts:552
votes: 98


Traffic shaping (throttling) has been in place for close to a decade. I believe some of us in certain niches noticed it long before everyone else. Call it sandboxing, call it freshness, call it voodoo: there is a definite mechanism to tune or shape traffic volume. The sandbox is the most obvious traffic-shaping mechanism. Maybe the algo is applying it at a much broader scale as time goes by.

The traffic ceiling is not a fixed value; it's a moving target based on various factors. And more often than not, that ceiling is nearly impossible to break without some "voodoo" magic from webmasters, (cough) blackhat, greyhat, viral, whatever (cough). Maybe some people do not accept it because they continue to do magical things (money/marketing magic) that continuously break that ceiling; good for them. I personally hit the ceiling roughly six months to a year after growing a site, and I only do things organically.

The amount of traffic matters when calculating standard deviation: if a site only gets 100 visitors a day, a swing of +/-5 isn't alarming.

But when a site gets 10,000 visitors a day, staying within +/-5 is much more conspicuous month over month, quarter over quarter, or even year over year, despite growing content. What I usually observe is traffic shifting between various pieces of content within a site, resulting in the same overall traffic volume: self-cannibalization across completely different keyword sets. That's where I observe the ceilings.

Math does make a difference based on traffic size.
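
A minimal sketch of that arithmetic (hypothetical numbers; it assumes daily visitors arrive independently, so chance alone produces day-to-day noise on the order of the square root of the mean, Poisson-style):

```python
# Why a +/-5 swing reads differently at different traffic sizes:
# under a simple independent-visitor model, chance alone produces
# day-to-day noise of about sqrt(mean). Numbers are hypothetical.
import math

for mean_daily in (100, 10_000):
    noise = math.sqrt(mean_daily)  # typical random daily swing
    print(f"{mean_daily:>6} visits/day: chance noise ~ +/-{noise:.0f}; "
          f"a +/-5 band is only {5 / noise:.0%} of that")
```

At 100 visits/day, +/-5 sits well inside ordinary noise; at 10,000 visits/day, chance alone predicts swings around +/-100, so a flat +/-5 band would indeed look unusually tight.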

I also believe this mechanism helps some older sites survive as long as they do. Some really old sites using old tech, at super slow speeds, that aren't mobile friendly should not rank as they do now, but they still do. That's the algo holding up their traffic floor. As these #*$!ty sites' floors crumble with time, their lost traffic adds to our newly claimed ceilings. Ceilings exist just as floors exist.

If the algo did not have this type of event / time based damping factor, many #*$!ty sites would die overnight.

I second the view that it's not constructive to look at analytics often; time is best spent making better products and better sites. But denying that traffic throttling or shaping exists is just plain wrong. I also do not think it's a harmful tactic employed by Google, but rather a necessary evil for the overall stability of the net / SERPs.
1:09 am on Sept 24, 2019 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member redbar is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:3329
votes: 546


Branding is one of the most important aspects of modern-day SEO

Well ... I have to say these days, IMHO, this is a fallacy except in the Fast-Moving Consumer Goods (FMCG) or Consumer Packaged Goods (CPG) markets.

This is where Google has firmly planted itself, along with travel, hotels, automotive, books, etc. The kids running the Plex and their algos evidently have zero concept of what actually happens in the real world, since their fingers are stuck to their iPhones!

Call me wrong, but prove otherwise, and I mean prove it.
1:22 am on Sept 24, 2019 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member redbar is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:3329
votes: 546


Traffic shaping (throttling) has been in place for close to a decade.


I'll agree with ten years, maybe slightly longer; however, I do not call it throttling, I call it blatant manipulation. It traverses all industries and countries, and its algo is incredibly selective and BIASED.

Is this a US government instruction? I do not know; all I can do is watch my sites disappear from G.com and see crappy US company sites replace them when they cannot even supply.
2:07 am on Sept 24, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2711
votes: 822


@frankleeceo
Traffic shaping (throttling) has been in place for close to a decade. I believe some of us in certain niches noticed it long before everyone else. Call it sandboxing, call it freshness, call it voodoo,...

You have just identified a major problem with this "throttling" theory: that "call it whatever". Or stated differently, it is defined in terms vague enough that anyone can see their situation as fitting the definition. My traffic is stable: it's throttling. My traffic drops: I'm being throttled. And on and on.
...there is a definite mechanism to tune or shape traffic volume.

Yes, it is the ranking of your website by Google. For any given relative rank one can expect a given amount of traffic. During normal times one sees little fluctuation in traffic; the amount of variability is specific to your niche, but if one observes the data over a statistically significant time period one will see a mean and variance that remain relatively stable. Then, when an update occurs, one may see a sudden shift, after which a new mean and variance emerge, based on your new relative ranking. That is it: simple, no hocus pocus, no conspiracies, nothing but the result of rankings.

Is it unusual to get two or even three days in a row with exactly the same traffic? Yes. Is it impossible, or even improbable? No. Does it prove anything? Absolutely not.
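
A minimal sketch of that point (hypothetical numbers, modeling daily visits as a Poisson draw around a rank-determined mean, with no throttling anywhere in the model):

```python
# Toy simulation: with daily visits drawn independently around a
# fixed mean, how often do two consecutive days match *exactly*?
# Poisson is just an illustrative choice; numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=42)

for mean_visits in (100, 1_000, 10_000):
    days = rng.poisson(lam=mean_visits, size=100_000)
    equal = np.mean(days[1:] == days[:-1])
    print(f"mean {mean_visits:>6}/day: identical back-to-back days "
          f"~{equal:.2%} of the time")
```

For a site in the ~100 visits/day range, that works out to identical consecutive days several times a year by pure chance, with no mechanism required.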

The traffic ceiling is not a fixed value; it's a moving target based on various factors. And more often than not, that ceiling is nearly impossible to break without some "voodoo" magic from webmasters, (cough) blackhat, greyhat, viral, whatever (cough).

Well, I have broken this "traffic ceiling" on several occasions without applying tactics of any sort, in fact without any action on my part. This has occurred when some random current event led many people to start searching for information about a topic that can only be found on my website. I have seen my traffic spike anywhere from 5x to over 10x. By your definition this wouldn't be possible; that traffic would be throttled away into the ether. But here again the counter-argument will be, "well, throttling doesn't apply in such a situation". Of course not. Lesson one in winning an argument is to make sure that your theory is not falsifiable.

Math does make a difference based on traffic size.

No, not in this case. First, one is looking for a change in traffic over a period of one day, so unless Milchan has some kind of time accelerator, the data is still in days. And even if size were an issue, one could collect data over a longer period, so that the data set would be large enough to be statistically significant. But again, we are getting ahead of ourselves discussing methodology when no testable hypothesis has been put forward.

If the algo did not have this type of event / time based damping factor, many #*$!ty sites would die overnight.

The reason that many #*$!ty sites don't die overnight has nothing to do with this at all. The reason is simple: the cost of such a site lies in its creation, building the pages, the CMS, writing (or even stealing) the content, and so on. Once created, the cost of operating the site is negligible, and given the unpredictable nature of Google's algo, your crappy site may one day start to get traffic. If the site flops, the cost of creation is already sunk; why kill it? Who knows, you may eventually start making your money back.
2:43 am on Sept 24, 2019 (gmt 0)

Preferred Member

5+ Year Member Top Contributors Of The Month

joined:Oct 29, 2012
posts:552
votes: 98


@NickMNS

I don't think I am talking about daily fluctuations, and neither are most of the people observing this effect. Try months, quarters, and sometimes years, across millions of visits / pageviews. As the keyword pool, sample size, and time frame get increasingly large, the effect becomes much harder to dismiss.

Good for you that you have broken the ceiling multiple times. Then you should observe traffic trending back to your baseline. Maybe every time you break it, you end up with 5~10% more than your previous base, or back to exactly the same. Yes, the ceiling can be broken easily via viral sharing or marketing; it is not a fixed ceiling. Certain factors have changed whenever a ceiling gets broken: perhaps you gained a new link, got some social mentions, etc.

You must have enough data to go back far enough.

Some of the #*$!ty sites that I have observed should have been hit into oblivion, given their better competitors. I am talking about 20-year-old sites ranking better than the newest competitors.

Yes, I believe we came to the conclusion years back that what I define as "throttling" is what you define as "ranking" :). Traffic volume is based on the combination of various ranking factors.

What I am talking about is more along the lines of rotation within a given set of SERP keywords, with an impression/click cap that lands on the same traffic quota.

An extreme example of what I mean (time frame / click values exaggerated; substitute months for days if you like):
On Monday: the site gets 1000 clicks on keyword A, 0 clicks on keyword B.
On Tuesday: the site gets 0 clicks on keyword A, 1000 clicks on keyword B.
We update the site to add keyword C; on Wednesday, 0 clicks on keywords A and B, and 1000 clicks on keyword C.
Add keyword D; on Thursday, 0 clicks on keywords A, B, and C, and 1000 clicks on keyword D.
One would assume that search volume on those keywords has not changed, but the site no longer ranks for them; some keywords end up cannibalizing the entire quota, hitting the same traffic ceiling.
I call this traffic shaping / throttling / the ceiling.

As time progresses, the same 1000 clicks get rotated between A, B, C, and D, but never break through the overall 1000-click cap, even when one knows the true traffic pool is theoretically at least 4,000 in total, because the site did at one time receive that much traffic via those keywords.

And it's not done by "ranking": keywords A, B, C, and D still show rank 1.0, but each gets just a fraction of the impression volume individually, adding up to the same 100% baseline. Has no one else experienced this?

This rotation / cap is not a random walk.
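
A toy sketch of the pattern being described (all numbers hypothetical, taken from the example above; this simulates the claimed behavior, not anything Google has documented):

```python
# Toy model of the claimed "quota rotation": a fixed daily click
# budget is handed to one rotating keyword, so adding keywords
# changes the spread but never the total. Hypothetical throughout.
import random

DAILY_QUOTA = 1000  # the claimed soft ceiling

def day_of_traffic(keywords):
    """Give the whole quota to one rotating keyword, per the example."""
    winner = random.choice(keywords)
    return {kw: DAILY_QUOTA if kw == winner else 0 for kw in keywords}

keywords = ["A", "B"]
for day in ("Mon", "Tue", "Wed", "Thu"):
    if day == "Wed":
        keywords.append("C")  # site adds new content
    if day == "Thu":
        keywords.append("D")
    clicks = day_of_traffic(keywords)
    print(day, clicks, "total =", sum(clicks.values()))  # total never moves
```

However the clicks rotate, the printed total stays pinned at 1000, which is the distinction being drawn here between per-keyword rank and overall traffic.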
3:28 am on Sept 24, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts: 1484
votes: 599


For any given niche, the AI is learning how to place enough of Google's monetized goodies in front of the first organic results to keep making tens of billions per quarter. It's just that simple.

Proverbs 1:19
6:22 am on Sept 24, 2019 (gmt 0)

Preferred Member

5+ Year Member Top Contributors Of The Month

joined:Oct 29, 2012
posts:552
votes: 98


And to add: I know it's a soft ceiling because I do break it myself, often. Having broken the defined ceilings a couple of times over the years they have occurred, if I go back and study the specific conditions, I can project future occurrences with pretty good certainty: what the traffic / keyword spread would look like if the same conditions occurred again. This holds regardless of whether I have made improvements or added content to the site, as the same type of keyword rotation will occur.

E.g. a social share from certain places, at a certain time or by certain people, would break it to 2x or 3x.
E.g. a specific event would break it to 5x the ceiling value.
E.g. a marketing push would get some multiple of the ceiling.
I see it as a bunch of ranking factors or signals that add up to a specific quota. I stress that it's not a hard ceiling or throttle, but a moving target.

If certain keywords do get spikes, I am fairly certain that some of my other keywords will get significantly less traffic to counteract the spike, so as to land on the predetermined ?x ceiling.

Years back I ran an experiment on a test site, removing 50% of the content that had traffic. I did not lose 50% of the traffic :). Because of the floor effect, keywords and traffic flowed to other pages. I believe some people have done much crazier content removal on their sites and did not see the same percentage of traffic loss.

I don't know about you, but my understanding and testing of the mechanism keeps me sane, without stressing over daily fluctuations. I know that I am doing the right thing by adding 2x, 3x, 4x more content and improving the site. This understanding keeps my expectations low: I do not expect the same percentage return in traffic versus the time / effort invested.

This mindset has kept me going without giving up. I often feel like I am working for free, but I do see positive results over time.
7:52 am on Sept 24, 2019 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Sept 25, 2017
posts: 161
votes: 38


For the first time in months, I had a really good day of quality organic traffic.

Bounce rate down, dwell time improved, average pages per session up, traffic up.

Some changes to rankings over the past 48 hours, but I fear those changes might have already reverted.
8:09 am on Sept 24, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts: 1484
votes: 599


The SERP layout has changed again. There's no way you can expect consistency with this endless flux.

I have the #1 organic position, yet there are 18 assorted content choices above me. That's the problem; I might as well be on page 2.
8:27 am on Sept 24, 2019 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Apr 15, 2004
posts:604
votes: 112


G may have achieved "quantum supremacy", but they still don't get shopping as well as others do.

Brands were Eric Schmidt's easy solution to spam, but what they do not understand is that brands do not need Google: the more branded searches there are, the less money Google makes.

As far as G is concerned, buyers kept disappearing over the years; they ended up on brands' email lists and downloaded their apps too. Amazon Prime members do not waste their time researching products anymore, and in general no one has the patience to click more than 3 ads these days.
9:58 am on Sept 24, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3476
votes: 781


For any given relative rank one can expect a given amount of traffic. During normal times one sees little fluctuation in traffic; the amount of variability is specific to your niche, but if one observes the data over a statistically significant time period one will see a mean and variance that remain relatively stable.

That's long been true in the brick-and-mortar world, too. Hotels, airlines, and cruise lines can predict occupancy rates with remarkable precision. For that matter, my son once worked for the owner of a luggage shop who could predict sales with considerable accuracy on any given day based on historical precedent. Why would anyone expect search to have wider daily variances than the "real world"?
11:15 am on Sept 24, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Nov 2, 2014
posts:741
votes: 424


That's long been true in the brick-and-mortar world, too.

Why would anyone expect search to have wider daily variances than the "real world"?

There are hundreds of algorithmic changes in Google per year, along with personalization, layout changes, etc., that should impact traffic. In the brick-and-mortar world this would be like moving a business to a different point on the street, to a different street, or to a different neighborhood altogether, on an hourly basis.

Despite the constant changes in search, many witness comparable levels of daily traffic, though some notice substantial swings in converting traffic. This leads me to believe Google's AI considers historical traffic data and attempts to fulfill that quota. Why this would be needed is open to debate. My guess is that by maintaining steady daily traffic, webmasters feel some sense of security in a digital world that is being over-harvested by a few select multinational businesses.
11:20 am on Sept 24, 2019 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 24, 2012
posts:94
votes: 40


EditorialGuy: because a hotel cannot add more rooms.
We as publishers can add more content to our websites. How can you have the same traffic with 100 or 1,000 pages? It seems that adding content doesn't matter.
If I add new content, it gets some decent traffic, but Google compensates by taking away traffic elsewhere on the website. In the end, the traffic remains the same.

This is throttling.
12:00 pm on Sept 24, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts: 1484
votes: 599


It's not just throttling, it's redirection of the search. Of the 18 Google affiliate and content choices above the #1 organic listing, only 2 or 3 are directly related to the search. The rest are "somewhat related", because Google had no exact match in their monetized inventory. They are no longer a search engine but an ad engine; but I'm preaching to the choir again.
12:06 pm on Sept 24, 2019 (gmt 0)

New User

joined:Sept 24, 2019
posts:6
votes: 12


(Long-time lurker, first-time poster.) I finally had to throw my hat into the ring after this last BS update. To add to the discussion: this talk of "throttling" sounded ridiculous to me at first. As an SEO, I normally take the route of accepting responsibility for whatever happens to my analytics and performance, based on my own inputs and actions.

But I've been managing an eComm site for a few months that just tanked with absolutely no logical reason. I've been going over every SEO best practice in the book on this site for months: content analyses and rewrites, authoritative new content that beats all its competitors for the topic, pruning bad performers, full audits, technical optimization, speed tests, eliminating keyword stuffing, revamping category pages, mapping good interlinks between content and products, growing total keywords by ~3K in 90 days, *consistently* climbing the most important keyword rankings into spots 5 through 1 (some coming from page 2). This latest "update"/wizardry just destroyed nearly all of my work. Another site I manage is the #1 site in its market. It just lost spot 1 for a few important keywords, too.

Meanwhile, some spammy site floating by with 0-DR backlinks and shady directories, stuffing random tags and categories full of BS keywords, with horrible on-page content and even misspellings in the meta descriptions of its highest-ranking pages, is now outranking us. As an SEO, I'm incredibly frustrated by Google's seeming lack of direction. I'm a pretty seasoned business professional, too. You cannot tell me that if Google really is "doing good work", analyzing business verticals and the competitive landscape and presenting them appropriately, the industry's #1 site and a site that has consistently doubled to nearly tripled its traffic over 3-4 months off every SEO best practice in the book should now be relegated to page 2. /rant
12:16 pm on Sept 24, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts: 1484
votes: 599


Welcome to WebmasterWorld and the bizarro world of Google, Travis.
12:29 pm on Sept 24, 2019 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Dec 18, 2018
posts:89
votes: 75


"Keyword Management" is the term I am introducing. Throttling is the wrong word because that connotes that a particular website ranks in the top positions, for all its keywords on the topic, all the time; and Google is impeding its appearance in search.

Taking frankleeceo's insight, which holds quite some value, I would even add that this phenomenon is more prevalent for searches where Google itself has financial interests, i.e., the top four ads.

System

12:32 pm on Sept 24, 2019 (gmt 0)


The following 22 messages were cut out to a new thread by brett_tabke. New thread at: google/4965610.htm [webmasterworld.com]
9:39 am on Sep 24, 2019 (cst -5)
2:53 pm on Sept 24, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts: 1484
votes: 599


"Keyword Management" is the term I am introducing.


"Smothering Effect" is the term I'd like to introduce.
3:44 pm on Sept 24, 2019 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 24, 2012
posts:94
votes: 40


Remember my Japanese issue? OK, it was a pirate. It's solved.
5:49 pm on Sept 24, 2019 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:July 24, 2018
posts:85
votes: 36


Hang on to your seats, here we go again.

[twitter.com...]
5:51 pm on Sept 24, 2019 (gmt 0)

New User

joined:Aug 6, 2018
posts: 22
votes: 2


Along with the core update... I'm seeing a design tweak to the Top Stories carousel -> the publisher logo now sits between the image and the article title. Looks better.
8:55 pm on Sept 24, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts: 1484
votes: 599


Amazing stretch of zero sales. They must really be messing with steering users by intent. My rank looks the same or even better, but user behavior is off-the-charts poor. The AI is probably using pupil dilation or facial expressions. Am I serious?
11:15 pm on Sept 24, 2019 (gmt 0)

New User from AU 

joined:Nov 20, 2018
posts:4
votes: 2


Call me wrong, but prove otherwise, and I mean prove it.


There's no way I can prove it to you, because it's going to be heavily dependent on industry and location.

I know that in my industry, and in Australia, smaller companies have been wiped out for head terms by big, trusted AU brands.

Improving our brand has gone a long way in helping us compete with entities in search.

And this is the crux of search: everyone's situation is going to be different. So I probably should have said that it depends on how Google is ranking those sources above yours.
12:22 am on Sept 25, 2019 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:June 28, 2018
posts: 363
votes: 204


Amazing stretch of zero sales. They must really be messing with steering users by intent. My rank looks the same or even better, but user behavior is off-the-charts poor


Experiencing the same thing. Traffic looks steady enough, no big boost or anything, and it could end up being a little higher overall, but no sales whatsoever since the update started.
1:02 am on Sept 25, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts: 1484
votes: 599


@milchan - I sincerely feel your pain. God bless.