
Forum Moderators: Robert Charlton & goodroi


Update Maverick : Google Updates and SERP Changes - July 2019

     
3:09 pm on Jul 1, 2019 (gmt 0)

Junior Member

joined:Nov 2, 2018
posts:53
votes: 22



System: The following message was cut out of thread at: https://www.webmasterworld.com/google/4947706.htm [webmasterworld.com] by goodroi - 1:17 pm on Jul 1, 2019 (utc -5)


Many people advocate building a brand to reduce reliance on Google Search traffic. Obviously, that alone is not enough.
5:47 pm on July 5, 2019 (gmt 0)

Senior Member from IN 

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 30, 2017
posts:1491
votes: 284


@seomotionz It's an information/news site, so I don't have to worry about conversions :)

- - - -

I published something of an exclusive story, and six sites that covered it linked back to me, yet my position is 7th :(

It doesn't matter what I do, Google isn't going to rank me.
5:51 pm on July 5, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Oct 24, 2003
posts:741
votes: 74


Yes, especially over the last few days. It shows up as direct traffic from Chicago, and the visitors hit multiple pages on my site throughout the day.


@whoa182 they've been hitting me for weeks now. I blocked the one IP address 23.101.169.3

But I'm still looking for the second IP, the one that shows up as unknown.unknown
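For anyone hunting the same thing: if unknown.unknown means the host has no reverse DNS (that's how most log analyzers label it, though I'm assuming that's the case here), a quick Python sketch can surface candidates from a combined-format access log. The log path is a placeholder.

import re
import socket
from collections import Counter

LOG_PATH = "access.log"  # placeholder; point at your server's access log

# Tally hits per client IP (the first field of a combined-format log line).
hits = Counter()
with open(LOG_PATH) as f:
    for line in f:
        m = re.match(r"(\d{1,3}(?:\.\d{1,3}){3})", line)
        if m:
            hits[m.group(1)] += 1

# Reverse-DNS the busiest IPs; the ones that fail to resolve are the
# visitors that stats packages label unknown.unknown.
for ip, count in hits.most_common(20):
    try:
        host = socket.gethostbyaddr(ip)[0]
    except OSError:
        host = "unknown.unknown"
    print(f"{count:6d}  {ip:15s}  {host}")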
7:02 pm on July 5, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3439
votes: 749


That might have made sense 10 years ago, but the majority of people now use the internet on their phones. Go out to any park or public space full of people enjoying the sun and see how many of them are not glued to their phones.

The fact that people are using their phones for Facebook, Instagram, news apps, etc. doesn't necessarily mean they're using their phones to search Google for Web pages. In any case, the mobile audience doesn't behave the same as the desktop audience:

Mobile vs. Desktop Traffic in 2019
[stonetemple.com...]
7:37 pm on July 5, 2019 (gmt 0)

New User

joined:Dec 18, 2018
posts:37
votes: 27


Nobody uses Facebook, except a few people in their 30s.
8:51 pm on July 5, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:June 18, 2005
posts:1862
votes: 86


"Nobody goes to this club anymore, it's too crowded."
4:23 am on July 6, 2019 (gmt 0)

New User

joined:June 26, 2019
posts:14
votes: 16


I published something of an exclusive story, and six sites that covered it linked back to me, yet my position is 7th :( It doesn't matter what I do, Google isn't going to rank me.


@MayankParmar Google probably thinks you are engaged in buying links and hence has kept your site under the scanner.

This is the problem. Google is having an extremely hard time telling natural links from unnatural ones. One of the main reasons is that people have figured out ways to make unnatural links look so natural that it's the natural links that have started looking unnatural!

For example, if a small or medium site quickly picks up 7 or 8 links because of a great article it wrote, those links start to look unnatural, because the site has no history of gaining links that fast. A bigger website, on the other hand, can keep acquiring unnatural links at a steady pace and make it all look natural. It's an oversimplified example, but you get the point. (See the toy sketch below.)
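To make the oversimplification concrete, here's a purely hypothetical toy check — nothing like whatever Google actually runs — that flags a spike relative to a site's own link history. It would flag the small site's legitimate burst and pass the big site's steady drip:

from statistics import mean, stdev

def velocity_flag(weekly_new_links, threshold=3.0):
    # Toy anomaly check: flag the latest week if it sits more than
    # `threshold` standard deviations above the site's own history.
    # Purely illustrative; not Google's actual logic.
    history, latest = weekly_new_links[:-1], weekly_new_links[-1]
    mu, sigma = mean(history), stdev(history)
    return latest > mu + threshold * max(sigma, 1.0)

# Small site: almost no links, then 8 in one week from a great article.
print(velocity_flag([0, 1, 0, 0, 8]))        # True  -> looks "unnatural"

# Big site: gains ~40 links every single week, perfectly steady.
print(velocity_flag([40, 40, 41, 40, 40]))   # False -> looks "natural"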

So no matter what, it will be impossible for a bot to reliably figure out whether a link is natural or not. Forget the bot, sometimes even a human cannot reliably figure out whether a given link is natural.

So I guess Google is ending up with a lot of false positives, penalizing small and medium websites that are perfectly good while letting the big fish go unscathed. I know several big websites that have a TON of unnatural links, and they are doing just fine in the SERPs. And these guys are spammy too, generally asking their writers to copy and rewrite articles from other websites, and doing it at scale.

The other signals, including social signals, user-behavior signals, etc., can all be manipulated just like backlinks, making it even harder for Google.

The easy way out of this dilemma for Google, of course, is to simply rank the bigger websites higher and drop small and medium websites to the 2nd page. But that is not good for the future of the internet.
8:11 am on July 6, 2019 (gmt 0)

New User from IN 

5+ Year Member

joined:Apr 10, 2013
posts:26
votes: 3


@MayankParmar sorry, I thought you had an eCommerce site.
10:33 am on July 6, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Nov 13, 2016
posts:1011
votes: 244


A lot of people assume that all sites are e-commerce, but there is a huge variety of sites, niches, and content.
1:14 pm on July 6, 2019 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts: 1371
votes: 487


Uhh, if that's how it's always been for certain publishers every year, and their traffic consistently falls during the summer and on public holidays (and rises again when it's not), why do you expect them to deny this fact and blame it on the internet dying instead?


Uhh, please read my OP stating "across-the-board"...which you then toss out of context by pointing to "certain publishers".

Flux is one thing, but I have yet to hear anyone cheering a "RECOVERY" in this group....ever.
Once natural patterns were 'manipulated', the free market web for small publishers like us began to die.
So yeah, it's dead...unless you call pizza money a going concern.
2:43 pm on July 6, 2019 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:May 25, 2018
posts:128
votes: 25


No sign of an update yet; I was expecting one to come around the 4th of July.

SEMRush and other tools are really quiet... it's never been so quiet. Is Google finally happy with their results? :)

Is it possible that Google could raise the bar so high for authority and trust that it inhibits natural movement in the SERPs? We'd have a situation where the whole thing is quite rigid (no movement): the biggest sites maintain their positions on the first page, and little sites (even if they write content that's more relevant) can't compete, because the bar is set so high that they are filtered out.

Is Google creating a situation where it's going to be much easier for big sites to consolidate power?
4:06 pm on July 6, 2019 (gmt 0)

New User

joined:Apr 14, 2019
posts:3
votes: 0


@ichthynous I had these crawling last month. Huge increase in direct traffic from one source.

On another note, still trying to recover from the June Update and looking at two things.

1. Post date. As our posts were from 2011 and might be seen as out of date, we're updating the posts and reviewing the content... Anyone else tried this? I'll review the content anyway; that's just good practice. (A sketch of the markup side is at the end of this post.)

2. Google business listing. We didn't have one of these, and perhaps that means we lose authority? While we can list two of our sites, as we have two physical addresses, the other five or so sites (with a few more in development) will forever be disadvantaged.

The sites are local outdoor activity information for the UK, if that helps...
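On point 1: it's worth checking that the refreshed pages actually advertise the new date in their structured data, since schema.org's Article type carries both datePublished and dateModified. A minimal sketch that emits the JSON-LD; the headline and dates here are made up:

import json

# Hypothetical example values; swap in the real post's details.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example walking route guide",
    "datePublished": "2011-05-14",
    "dateModified": "2019-07-06",  # the refresh date we want Google to see
}

# Paste the output into the page inside <script type="application/ld+json">.
print(json.dumps(article, indent=2))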
4:36 pm on July 6, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3439
votes: 749


@MayankParmar Google probably thinks you are engaged in buying links and hence has kept your site under the scanner.

Or maybe the six linking sites just don't carry much weight with Google:

All Links are Not Created Equal: 20 New Graphics on Google's Valuation of links
[moz.com...]
4:42 pm on July 6, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Nov 13, 2016
posts:1011
votes: 244


I think that @MayankParmar meant that the 6 sites linking to his article are ranking above him ...
5:15 pm on July 6, 2019 (gmt 0)

New User

joined:Apr 21, 2019
posts: 3
votes: 1


Nowadays everyone blogs, even Googlebot.
6:28 pm on July 6, 2019 (gmt 0)

New User

joined:June 26, 2019
posts:14
votes: 16


@EditorialGuy I do agree that not all links are created equal, but irrespective of the backlinks, @MayankParmar said that he published an 'exclusive story'. So he should be ranking number one for that. Why is it that people who copied his story are ranking above him? And what's surprising is that all these guys are linking back to @MayankParmar as the original source. Pretty straightforward as to who should be ranking number one.

I am not sure of the exact scenario, but I am guessing that is the case, and I have seen many such cases of late where people copying an article actually rank above the original.

For instance, I saw an article that was republished on Medium with a link to the original source, yet the Medium article was ranking higher than the original! Mind you, the link from Medium was nofollow, but how does that change the fact that the original article should rank higher than the copy? And this is just one example; I have seen many such cases.
6:38 pm on July 6, 2019 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:May 25, 2018
posts:128
votes: 25


And what's surprising is that all these guys are linking back to @MayankParmar as the original source. Pretty straightforward as to who should be ranking number one.


This is pretty much exactly what I see after being punished by the June core update. They copy me, link to me as the original source, but they rank above me or instead of me.
9:37 pm on July 6, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3439
votes: 749


This is pretty much exactly what I see after being punished by the June core update. They copy me, link to me as the original source, but they rank above me or instead of me.

That's certainly unfortunate (and undesirable). Is it possible that Google hasn't yet figured out where the article(s) originated? I know this kind of thing has happened in the past, but I don't recall anyone giving us updates about whether specific examples have persisted for more than a month or two.
9:52 pm on July 6, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3611
votes: 350


Why is it that people who copied his story are ranking above him?

The most likely reason is that his site is under an algorithmic penalty.

Being outranked by scrapers is a well-known indicator that a site has been penalized.
10:33 pm on July 6, 2019 (gmt 0)

Junior Member

5+ Year Member

joined:Jan 24, 2012
posts:84
votes: 23


If it's a news article, it's completely useless for MayankParmar to rank well one or two months later...
Like aristotle says, the penalty hypothesis should be considered. Being outranked by one or two authority websites, OK, but not by everybody!
1:57 am on July 7, 2019 (gmt 0)

Full Member

5+ Year Member Top Contributors Of The Month

joined:Dec 11, 2013
posts:349
votes: 94


All sites are under some sort of "algorithmic penalty."
3:32 am on July 7, 2019 (gmt 0)

New User

joined:June 26, 2019
posts:14
votes: 16


Is it possible that Google hasn't yet figured out where the article(s) originated?


That could be the case, but for a search engine as advanced as Google, one that actually punishes websites for having duplicate content, it is crucial that it at least know who the original author is. If you cannot figure out who the original author is, on what basis do you recognize duplicate content?
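To be fair, detecting duplication and attributing authorship are different problems: you can spot near-duplicates with something like word shingling without ever deciding who published first. A toy sketch (the texts are made up):

def shingles(text, k=5):
    # The set of k-word shingles of a text, lowercased.
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    # Jaccard similarity of the two shingle sets: overlap / union.
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

original = "our exclusive story about widget prices rising sharply this year"
scraped = "our exclusive story about widget prices rising sharply this year sources say"

# A high score (0.75 here) flags a near-duplicate -- but the score
# says nothing about which copy came first.
print(jaccard(original, scraped))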

Going by the current scenario, it appears that the original author is determined by site authority. So even if the article was published first by a smaller site, when a bigger, authoritative site republishes it, the bigger site gets recognized as the original author, even if it actually links back to the smaller site as the source!

@Aristotle yes I agree the site is under some sort of penalty, and is probably one of many sites that are under penalty for no apparent reason.
7:28 am on July 7, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Nov 13, 2016
posts:1011
votes: 244


That could be the case, but for a search engine as advanced as Google, one that actually punishes websites for having duplicate content, it is crucial that it at least know who the original author is. If you cannot figure out who the original author is, on what basis do you recognize duplicate content?

Which raises the question: do Google and its users care who the original publisher is? Given two articles on the same subject, Google can pick either one; it will be the same for the user (sadly).
8:14 am on July 7, 2019 (gmt 0)

New User

joined:June 26, 2019
posts:14
votes: 16


Which raises the question: do Google and its users care who the original publisher is? Given two articles on the same subject, Google can pick either one; it will be the same for the user (sadly).


Very true. Sad as it is, with the number of publishers rising, Google now has the freedom to start taking publishers for granted. The focus has shifted from publishers to users and advertisers.

This can work short term, but in the long run, this has a huge potential to backfire.
8:26 am on July 7, 2019 (gmt 0)

New User

joined:June 26, 2019
posts:14
votes: 16


The most likely reason is that his site is under an algorithmic penalty. Being outranked by scrapers is a well-known indicator that a site has been penalized.


Ironically, it is the legit site that is under penalty, instead of the scraper sites.

Imagine a school test where the student who writes the original answers is labeled the cheater and penalized, whereas the actual cheaters who copied from that student are declared the toppers. And the teacher receives the teacher-of-the-year award. :)
10:53 am on July 7, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Nov 13, 2016
posts:1011
votes: 244


Ironically, it is the legit site that is under penalty, instead of the scraper sites.

Ranking lower than scrapers doesn't mean it's a penalty. It just means the scrapers are exploiting/abusing/tricking the ranking algorithm to rank higher... the holy "SEO". But it doesn't mean the scrapers will stay there forever.

Also, scrapers appear "after" the original article, which might make them look "fresher". In news-related niches, it's possible that the most recent page gets a boost.
11:48 am on July 7, 2019 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:May 25, 2018
posts:128
votes: 25


While I can't speak to MayankParmar's case, based on what I've seen over the last year I'd assume it's more a feature of the current algorithm than a bug, and not a matter of Google not knowing which copy was published first.

Since August 2018 when I got hit by the core update, I had almost fully recovered *twice*. During these recovery periods, Google correctly ranked my articles and not the copies. During the core updates that negatively hit the website, Google started to rank copied articles that linked to me as the source (above me or instead of me).

Maybe it's an unintended consequence of the way they handle trust and authority.
12:02 pm on July 7, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Nov 13, 2016
posts:1011
votes: 244


Another issue can be that scrapers/copiers feature a lot more content than the original sites. For example, say you run a news site about xxx and publish 5-10 stories per day (random example); the scrapers/copiers may "generate" 100 or 1,000 new pages per day. So from a user's point of view, a scraper site "might" be more "interesting" because, besides the article, it offers plenty of additional "content". I may not be explaining my thoughts well...
1:13 pm on July 7, 2019 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 5, 2004
posts: 581
votes: 71


Google should have little trouble figuring out who the original author is and who the scrapers are. Every time we publish a new article, our website pings the search engines to say the sitemap has been updated. At that point Google should easily be able to tell who the good publishers are and who the bad ones are.
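For what it's worth, that ping is just an HTTP GET against each engine's ping endpoint with the sitemap URL attached. A minimal sketch of the kind of call our site makes on publish; the sitemap URL is a placeholder:

from urllib.parse import quote
from urllib.request import urlopen

SITEMAP = "https://www.example.com/sitemap.xml"  # placeholder URL

# Google's and Bing's sitemap ping endpoints (as they work in 2019).
for endpoint in (
    "https://www.google.com/ping?sitemap=",
    "https://www.bing.com/ping?sitemap=",
):
    with urlopen(endpoint + quote(SITEMAP, safe="")) as resp:
        # A 200 just means the ping was received, not that the
        # sitemap was crawled or the pages indexed.
        print(endpoint, resp.status)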
3:01 pm on July 7, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Oct 24, 2003
posts:741
votes: 74



Is Google creating a situation where it's going to be much easier for big sites to consolidate power?


I don't think so. Large sites were climbing to the top after the March update but then started to drop back in my niche. The reality is that Google has just turned off the organic traffic entirely. SEMrush shows me recovering strongly, but the traffic isn't there and business has come to a complete halt. The real issue now is: who can afford to advertise? If we cannot gain traffic by being the best site and ranking at the top, we also cannot gain traffic by competing against large corporate ad budgets. So consolidation will happen, but because smaller players are being squeezed out of visibility entirely.
3:32 pm on July 7, 2019 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3611
votes: 350


All sites are under some sort of "algorithmic penalty."

Well, most of the sites discussed in this thread are probably under a penalty.

Google created algorithmic penalties as a counter-measure against people who try to "game" their way to higher rankings and more traffic than their site deserves. The algorithm looks for certain characteristics in a site's content, structure, backlink profile, etc., to determine whether a penalty is appropriate.

Examples of what the algorithm might look for are:
-- too much keyword targeting
-- too much artificial link-building
-- churning out a lot of shallow low-quality content at a fast rate
-- subtle signs that the person is posing as more of an expert and authority than he or she actually is