
Forum Moderators: Robert Charlton & goodroi


Google Updates and SERP Changes - September 2017

     
10:45 am on Sep 1, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14913
votes: 491



System: The following 4 messages were cut out of thread at: https://www.webmasterworld.com/google/4860963.htm [webmasterworld.com] by robert_charlton - 3:25 am on Sep 2, 2017 (PDT -8)


How do you compare your pages with those in the top 10?


User intent.
2:33 pm on Sept 13, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2662
votes: 794


@martinibuster
CTR data is used for training, not ranking

I fail to see the distinction. Granted, it is not used as a direct ranking factor, but given the use of ML and neural networks, nothing is a direct ranking factor. Given that pogosticking is used to "train" the algo, it must have some influence on ranking. What that influence is, and how much of an impact it has, is anybody's guess.
3:29 pm on Sept 13, 2017 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:14913
votes: 491


I fail to see the distinction.


You're not alone. This inability to comprehend the distinction is due to a lack of information about what is going on. It has led to a lot of misinformation about how CTR is used.

I will try to fill you in on this missing information and hopefully you will then be able to see the distinction.

A ranking factor is something that directly affects the ranking of a single website for a specific phrase. Training often has no relation to the sites that are ranked. I'll explain.

  • Training refers, for example, to teaching a machine to understand that a page with the word BUY in it is commercial and not informational.

  • Training can also refer to the example of a query for an entity that is a brick-and-mortar store. Training on millions of these kinds of entities can teach the machine that when this kind of entity is queried in a specific kind of way, a local result is preferred.


When a pogo hop happens, the machine can look at millions of those events, determine that searchers preferred an informational site over a commercial site for certain kinds of queries, and then adjust the algorithm accordingly, because it has learned what the user prefers.

The machine can also learn that users prefer sites that are local to them and adjust the rankings, based on previous searches, as learned through millions of data sets.

THEN, this can be incorporated into the modification engine, which runs AFTER all the sites have been ranked. What happens next is that the sites in positions one through ten could include sites with poor ranking-factor metrics.

They can have very few links for example. But the machine knows that this site with a few links (your friendly neighborhood coffee shop) is the right answer because it is local to you and most users prefer coffee shop results that are local to them.

Traditional ranking factors, like h1s, links, etcetera, did not matter in the above search results. What mattered were factors such as geographic data or entity-type data. And that emphasis was learned from pogo sticking that happened on other sites totally unrelated to the website of the little coffee shop on the corner near your house.

Pogo sticking was NOT a ranking factor for that coffee shop in position 1. Pogo sticking DID provide the data to help the algorithm learn that for coffee shop queries, a local result is best.

Pogo sticking was used for training. It was not used for ranking. Do you understand the distinction now? :)

Does that make sense?
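The training-vs-ranking split described above can be caricatured in a few lines of Python. Everything here is invented for illustration (the event format, the "kind" labels, the ranking rule); it is a toy sketch of the idea, not Google's actual pipeline:

```python
# Toy sketch: pogo events are used to LEARN a per-query-type preference
# (training); ranking then APPLIES that preference to sites the training
# data never saw. All names and data are hypothetical.
from collections import defaultdict

def train(pogo_events):
    """Each event: (query_type, result_kind, pogoed). Learn which result
    kind users settle on (i.e. did NOT pogo back from) per query type."""
    stats = defaultdict(lambda: defaultdict(int))
    for query_type, kind, pogoed in pogo_events:
        if not pogoed:                     # user stayed -> satisfied
            stats[query_type][kind] += 1
    return {qt: max(kinds, key=kinds.get) for qt, kinds in stats.items()}

def rank(candidates, query_type, preference):
    """Apply the learned preference: candidates of the preferred kind come
    first; traditional metrics (links) only break ties."""
    preferred = preference.get(query_type)
    return sorted(candidates,
                  key=lambda c: (c["kind"] == preferred, c["links"]),
                  reverse=True)

# Training data (millions of events in reality, three here):
events = [("coffee shop", "local", False),
          ("coffee shop", "national", True),
          ("coffee shop", "local", False)]
model = train(events)                      # learns: locals are preferred

# A brand-new SERP: the corner cafe has 3 links but wins anyway, because
# the preference was learned from OTHER, unrelated sites.
serp = rank([{"name": "bigchain.example", "kind": "national", "links": 900},
             {"name": "corner-cafe.example", "kind": "local", "links": 3}],
            "coffee shop", model)
print(serp[0]["name"])                     # corner-cafe.example
```

Note that the pogo events never mention corner-cafe.example at all; the clicks only reshaped the methodology, which is the whole distinction.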

[edited by: martinibuster at 3:45 pm (utc) on Sep 13, 2017]

3:33 pm on Sept 13, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 11, 2008
posts:1717
votes: 263


@Nick
It's the difference between the specific and the abstract.

Take @Cralamarre's example. That is specific, and you personally stated that CTR will not result in the positions being swapped.

Training the AI means "situations like this" get "processed like that" - it's abstracted.

That's the theory. I personally think Google would be missing a trick by not using the CTR data, in the way @Cralamarre suggested. I have no evidence, and it would not be actionable even if it were true. I'm not highly persuaded either way, but on balance I think Google could use pogosticking, and if they could, they would.

(And Google has history in the "don't look behind the curtain" misdirection. CTR is too noisy, we don't use it, nothing to see here. Too easy to game. No point trying really. Just ignore Rand, what does he know? And the patents guy, Bill, he never spots anything- too much time at sea. Move along, move along.)
3:45 pm on Sept 13, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Jan 19, 2017
posts: 670
votes: 246


Sorry, I heard my name :) I was actually just thinking, if Google does use pogosticking in their rankings, wouldn't it be very easy for shady competitors to trick the system by having people, or bots, click on your link in the SERPs before clicking the competitor's link? They'd have to do it a lot, probably, but I'm sure there are ways.
4:01 pm on Sept 13, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 11, 2008
posts:1717
votes: 263


wouldn't it be very easy for shady competitors to trick the system by having people, or bots, click on your link in the SERPs before clicking the competitor's link? They'd have to do it a lot, probably, but I'm sure there are ways.
Ask Rand Fishkin!

More sensibly, hitting a particular SERP would be detectable. Even hitting a particular site, or promoting a particular site. Weaponising it as an insight would be a major engineering project.

The development would be onerous. By maintaining the pointlessness of doing so in the specific (as opposed to abstract), Google discourages anyone from even trying.
4:37 pm on Sept 13, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2662
votes: 794


I think you are missing my point. I entered this conversation (top of the last page) saying very clearly that there is no "ranking" when it comes to Google and others. So there is no ranking factor in the strict sense of the term.

But if Google is using it to "train" a neural network, then it must by definition be a feature within that network. Then, when the model (the trained network) is used to return search results to the user, it is using data from the pogosticking in some capacity. It is impossible to know what impact it is having, if any. But it is using the data to return the search results. The rest is simply semantics.

As for gaming the system: since machine learning, most likely neural nets, is being used, it would be very difficult to game this one feature, simply because you don't know how, when, or if the feature is having any impact on the model itself, or how other features outside your control are interacting with it. This was my original point. You don't know how any feature can impact the results: links, keyword density, location, and so on. Like the pogosticking data, they are all fed into the net and then merged through each layer to produce a model. Gaming the system has become impossible.

I would just like to contrast this with more conventional models such as linear regression. In a linear regression, one can know, after the model is trained, what weight is assigned to each feature, so it is more apt there to describe features as ranking factors. When it comes to gaming the system, since one knows the weight and the direction, one can easily add more or less to the required degree to get the desired effect.
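To make that contrast concrete, here is a toy sketch: with ordinary least squares the learned weight of a feature can be read straight off the fitted model, which is what makes "ranking factor" language meaningful there. The data and the "links" feature are made up:

```python
# Sketch (illustrative, not any search engine's actual model): in a linear
# model, each feature's learned weight is directly inspectable. In a deep
# net the same signal is mixed through every layer and no single weight
# corresponding to "links" survives to be read off.

def fit_ols(xs, ys):
    """One-feature ordinary least squares: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

# Hypothetical data: score = 1 + 2 * links, noise-free for clarity.
links = [1, 2, 3, 4, 5]
score = [3, 5, 7, 9, 11]
intercept, weight = fit_ols(links, score)
print(weight)   # 2.0  <- the feature's weight and direction are knowable
```

With the weight and direction known, gaming is mechanical: keep adding links until weight * links clears the target. That inspectability is exactly what disappears once the feature is buried inside a trained network.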
4:59 pm on Sept 13, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 11, 2008
posts:1717
votes: 263


I think you are missing my point. I entered this conversation (top of the last page) saying very clearly that there is no "ranking" when it comes to Google and others. So there is no ranking factor in the strict sense of the term.

Ok, I get you.

But there is a ranking, just not a single Ranking. Your SERP is an actual, reified ranking. The actual ranking methodology is just tweaked depending on personalisation.

Now, in the CTR-as-training paradigm, that methodology is impacted by CTR - but it is the entire methodology that is tweaked, with no reference to any particular result.

Training = refining the methodology
Ranking = applying the current methodology.

The rest of your post is excellent analysis of the direction Google is moving in, and the obsolescence of the "ranking factor" paradigm.
5:15 pm on Sept 13, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3462
votes: 776



Training = refining the methodology
Ranking = applying the current methodology.

That's a beautiful capsule explanation!
10:33 pm on Sept 13, 2017 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Apr 15, 2004
posts:592
votes: 106


All I see is that the machine is trained to make money from the few commercial searches it has left. The algorithm suggests that Amazon is the best to rank even if the searcher is right next to a local shop. Why waste time going into a local shop when Amazon is half price? And why would the local shop advertise if 9 out of 10 sales go to Amazon?

For informational searches there are still opportunities out there; great content on a popular site still does OK if it isn't dragged into the answer box.
11:42 pm on Sept 13, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Nov 2, 2014
posts:711
votes: 376


Why waste time to go in a local shop when Amazon is half price?

Amazon, and eBay as well, are cheap because of the many Chinese knockoffs being sold. That's one part, but the other part is shipping. If I use the US Postal Service to mail an 8 oz package to my neighbor, it costs me $2.66. The US Postal Service collects about $1.50 for a 1 lb package shipped from China. Regardless, we in the USA are subsidizing trade with China through increased domestic shipping rates that offset the low fees charged for inbound international shipments. Thankfully many people want their products now, which helps level the playing field to some degree. But Amazon has capitalized on foreign trade in a big way by warehousing foreign-manufactured goods and fulfilling orders. I'd venture to say a good portion of the small-business goods being shipped from Amazon warehouses originally come from China. So where does this play out in the SERPs?

I've seen many eBay products ranking on the first page of Google that ship from China using the ePacket service. While these Chinese sellers typically pass along the freight, it's < $2. Does this mean Google cares more about the shipping price than about their users having to wait a couple of weeks to receive products they bought? I doubt it. The SERPs are dumbed down so badly for product queries that it's hard to explain away as coincidence or the learning curve of AI. Google's SERPs are nothing more than an AdWords funnel for businesses looking for exposure; meanwhile Google's users get the satisfaction of seeing Amazon at the top (sometimes 2, 3 or more listings) for product queries, so what few shoppers Google has are content with what they found. This is a problem for those of us trying to be found by consumers in a digital market predominantly controlled by two players: Google and Amazon. If the costs of doing business with the big two are too high, relying more heavily on traditional marketing methods may work for some but will exclude many, many other small businesses.
7:26 am on Sept 14, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 15, 2001
posts: 1817
votes: 59


These issues have made me check again just how reliant on G we are.

At the moment 67% of our sessions come from either G organic or adwords.

How reliant on G are you?
2:49 pm on Sept 14, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Jan 19, 2017
posts: 670
votes: 246


Roughly 70% of my traffic is from Google. And speaking of Google, traffic seems to be up today, at least so far.

Actually, I have a question. If Google's search results are now personalized for each user, is it possible anymore to keep track of how your keywords are doing? For example, I just noticed that my site is up 4 spots this morning in the SERPs for its main keyword. In the past, I would look at that and say "Well, that explains the bump in traffic". But these days, I have no way of knowing if what I'm seeing is the same as what anyone else is seeing. So I could be the only one seeing the improvement in the SERPs. For someone else, my site may have dropped off the page. Who knows?

If that's the case, then how does anyone keep track of their keyword rankings? What about all those paid services that claim to keep track of your site's keyword performance? How can they keep track of personalized results?
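For what it's worth, one number that does survive personalisation is an impression-weighted average: Search Console's "average position" metric, for instance, is aggregated this way across all users' individual SERPs rather than reflecting any one person's results. A toy sketch with made-up numbers:

```python
# Sketch: a meaningful aggregate rank under personalisation. Each sample
# is (position_your_page_was_shown_at, how_many_users_saw_it_there).
# The figures below are invented for illustration.

def average_position(samples):
    """Impression-weighted mean position across many personalised SERPs,
    roughly how Search Console's 'average position' is computed."""
    total = sum(imp for _, imp in samples)
    return sum(pos * imp for pos, imp in samples) / total

# Your page shown at #3 to 500 users, #7 to 300, #12 to 200:
samples = [(3, 500), (7, 300), (12, 200)]
print(round(average_position(samples), 1))   # 6.0
```

So a rank tracker checking from one clean browser sees one sample; an aggregate like this is the closest thing to "the" ranking that still exists.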
3:43 pm on Sept 14, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member redbar is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:3229
votes: 496


then how does anyone keep track of their keyword rankings?


Clear out the cache and history of your browser(s) every time you close the window and/or shut down.

Insofar as Google is concerned, I also use each country's own G.tld to compare. However, make sure G actually takes you to the specific G.tld, since they have a naughty habit of sending you to your own local .tld, and you may have to force it to the one you want.

You could also try the Tor browser; that makes things interesting :-)
4:19 pm on Sept 14, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Jan 19, 2017
posts: 670
votes: 246


But clearing my browser cache and history doesn't clear anyone else's, so everyone else could still be seeing different results.
6:25 pm on Sept 14, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member redbar is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:3229
votes: 496


You said it yourself:

But these days, I have no way of knowing if what I'm seeing is the same as what anyone else is seeing.


I thought you wanted to see what the SERPs were like without personalisation?

Whenever I set up any machine, I ensure that on shutdown it clears as much cache and history as possible. That's probably about 0.00000000001% of all users ... maybe!
7:53 pm on Sept 14, 2017 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 24, 2012
posts:85
votes: 25


Simply do your query in private browsing. Ctrl+Shift+N on your keyboard opens a new private window (in Chrome). But even this way, you could get personalized results, to a minor degree.
8:33 pm on Sept 14, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Jan 19, 2017
posts: 670
votes: 246


Sorry, I wasn't clear. What I meant was, if everyone is seeing personalized results in the SERPs, how is it possible these days to track keyword performance overall? Do those websites that offer paid services for tracking your keywords somehow have access to everyone's personalized results?

If I search for my main keyword on my wife's computer (using Google), my site always shows up higher on the page than it does when I do the same search on my computer. What's interesting is, on my wife's search result page, there's a little message below the link to my site that says "You've visited this page many times". So obviously, Google is keeping track of her search history and using it to customize her results. Yet I've only ever seen that message on her computer. I never see it on mine, even though I visit my site a lot more than she does. In fact, she only ever visits it when I ask her to. :P

[edited by: Cralamarre at 9:10 pm (utc) on Sep 14, 2017]

8:59 pm on Sept 14, 2017 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Apr 15, 2004
posts:592
votes: 106


@cralamarre

I think the AI already knows that your wife loves you and needs to show her how well you are doing in the SERPs. But if you really want to know how much the AI loves you, you should search on more computers; especially the first time on a new computer/IP, you will find a lot of love.
1:27 pm on Sept 15, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts:1382
votes: 506


I would love to see the number of sessions these "affected" sites are getting per day. Clearly niche size, "trendability" of topic, and demographics play a huge role in the disparity we see between sites. Expecting the same results we saw 10 or more years ago is not realistic. Mobile and social networks, along with a whole new generation of users, have changed the landscape. If you're not adapting to that landscape then you are certainly in the "slow death" pattern. Don't get me wrong, G's AI push for profits is another huge factor, but moving forward, tapping into the new sources of traffic is probably your best strategy, rather than continually wondering "what happened"....see what I did there? (unintentionally) lol
11:42 pm on Sept 15, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2662
votes: 794


Things are looking strange today: my bounce rate is up, my pages per session are down, and traffic is down marginally. It's too early to know for sure if it is significant. This after a few weeks of slowly but steadily growing traffic.
3:46 pm on Sept 16, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts:1382
votes: 506


@Nick - same pattern here. Three weeks of "almost recovery" followed by a squash this week. It feels like an algorithmic game of "whack a mole". I am wondering what makes bounce rates vary so much and in almost phase transition patterns? Maybe we are witnessing a new human traffic "murmuration" behavior....or maybe that of the internet itself. ;)

I'm sure (someone will say) the hurricanes have affected traffic somewhat, but what about the rest of the world population? Are they all unable to focus on more than one trending topic at a time? Doubtful. Either way, traffic is at ghost town levels right now. Watching GART is like watching a faucet drip...1 visit spaced by almost identically timed dead zones.

Murmuration, for those unfamiliar with the term: [youtube.com...]
3:56 pm on Sept 16, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Jan 19, 2017
posts: 670
votes: 246


No drop in traffic here this week. This past week was actually my best week since May. Still down overall from before the May Google update, but the best it has been in a long time. Even today, with Saturdays always being slow, traffic is up a bit from last Saturday.
4:58 pm on Sept 16, 2017 (gmt 0)

New User

5+ Year Member

joined:June 2, 2014
posts:20
votes: 5


Every "weather" tracker is showing very extreme reds today.
9:15 pm on Sept 16, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3462
votes: 776


We're seeing a strong increase yesterday and today (compared to the same days last week), but it's probably because of the big, big increases in Texas and Florida traffic now that the hurricanes have moved on. (Florida is running 122 percent ahead of last Saturday, for example, and the day isn't over.)
9:35 pm on Sept 16, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts:2662
votes: 794


@samwest it gets even stranger. After I posted, the day had only a few hours left, and my stats mostly recovered: bounce rate finished in a normal range, pages per session remained on the low side, and traffic finished strong. Today everything is back to normal (for a Saturday) and AdSense revenue is doing well too. So maybe it was just randomness or some short-lived Google algo shift.

@Lake I have only ever really followed MozCast, but it has been showing values in excess of 100 for weeks now. In fact, over the past few days it was reporting oddly lower numbers in the high eighties. The fact that it is reporting 106 today is more a sign of things returning to "normal" than an alarm. Or, another way of seeing it: these tools have been reporting rubbish for months.
11:08 pm on Sept 16, 2017 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Apr 15, 2004
posts:592
votes: 106


The last few days I was expecting to see an increase in traffic, but it is not happening.

9 out of 10 are zombies
2:02 pm on Sept 17, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts:1382
votes: 506


GA Insights is now reporting traffic "anomalies" where pageviews exceed their CALCULATED range. Hmmm, no throttling, Google? There's nothing natural about organic traffic anymore.

Pageviews for http/mydomain/yadayadayada.. (btw - technically it's an https page, so they are reporting it wrong): For this Page, we forecast Pageviews of 2.22-22.7 for Sept 15, 2017, and your actual Pageviews of 27 is higher than this range. Click below to learn more about how Analytics Intelligence detects anomalies.

Imagine that, using all their processing power to detect an "overshoot" of 4.3 clicks. Incredible.
Do they act on that delta feedback? Absolutely.

I suppose, if you don't want an ant infestation, you have to take away every grain of sugar from the ants.
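The check GA describes boils down to a forecast interval plus an outside-the-band test. A minimal sketch using the numbers quoted above (the forecasting itself is stubbed out as given constants; this is not Google's actual method):

```python
# Sketch of an anomaly check like the one GA Intelligence reports:
# forecast a range for the day's pageviews, flag the day if the actual
# count falls outside it, and report by how much.

def check_anomaly(forecast_low, forecast_high, actual):
    """Return (is_anomaly, delta): delta > 0 is an overshoot above the
    band, delta < 0 an undershoot below it, 0.0 means within range."""
    if actual > forecast_high:
        return True, actual - forecast_high
    if actual < forecast_low:
        return True, actual - forecast_low
    return False, 0.0

# Forecast band 2.22-22.7 pageviews, actual 27 (figures from the post):
anomaly, delta = check_anomaly(2.22, 22.7, 27)
print(anomaly, round(delta, 1))   # True 4.3
```

Which reproduces the "overshoot of 4.3 clicks" mentioned above; the interesting (and unknown) part is how the band itself gets forecast, not the comparison.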
2:20 pm on Sept 17, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Jan 19, 2017
posts: 670
votes: 246


@samwest, I had the same message on GA a couple of weeks ago when the school year started up and my traffic started increasing. I expected the increase in traffic, but I guess GA was expecting things to remain as slow as they were all summer. Now that traffic has been up for a couple of weeks, I no longer see the "anomalies" message.

I get a similar message in my AdSense reports when earnings are down because of a holiday, especially around Christmas. The message tells me there's something wrong because my earnings were lower than what AdSense predicted and that I should try to fix the problem. Yep, I'll get right on that.
3:41 pm on Sept 17, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3462
votes: 776


For this Page, we forecast Pageviews of 2.22-22.7 for Sept 15, 2017, and your actual Pageviews of 27 is higher than this range.

There's nothing new about forecasts. How do you think hotels, airlines, and cruise lines calculate "dynamic pricing"? For that matter, smart retailers--even small shopowners--were forecasting sales long before computer software came along to help them do it.

If Google can forecast your likely pageviews for a given day, based on the data that it has available, wouldn't you prefer to have Google share that information with you than keep it a secret?
10:21 am on Sept 18, 2017 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:May 27, 2016
posts:68
votes: 22


"Hello the last week I have started to recovery from the FEB UPDATE. I have just been dropping since that date. Starting Last week my traffic has started to go up... 20% up so far average over those days ... and today we are showing another 30% traffic today!.

For the record, I should never have gotten hit in the Feb update. I've changed nothing on my site and have had no more links since last year... I've only posted more posts..

[edited by: 30K_a_month at 1:18 pm (utc) on Sep 11, 2017]"

Things reverted to poor traffic on Friday the 15th and Sat/Sun.. so whatever happened has gone back to crap now. =/
This 363-message thread spans 13 pages.
 
