Yes, especially over the last few days. It shows up as direct traffic from Chicago, and they visit multiple pages on my site throughout the day.
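If you want to verify what that "direct" traffic is actually doing, the raw server logs are more telling than the analytics panel. Here is a minimal sketch in Python that groups referrer-less hits by IP, assuming the common combined log format and a hypothetical access.log path:

```python
import re
from collections import Counter

# Combined log format: IP ... [date] "METHOD path HTTP/x.x" status size "referrer" "user-agent"
LINE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+) [^"]*" \d+ \S+ "([^"]*)" "([^"]*)"'
)

hits = Counter()   # referrer-less page views per IP
pages = {}         # distinct paths each IP requested

with open("access.log") as log:  # hypothetical path
    for raw in log:
        m = LINE.match(raw)
        if not m:
            continue
        ip, path, referrer, agent = m.groups()
        # "Direct" traffic arrives with an empty or "-" referrer.
        if referrer in ("", "-"):
            hits[ip] += 1
            pages.setdefault(ip, set()).add(path)

# IPs that hit many pages with no referrer all day are worth a closer look.
for ip, count in hits.most_common(10):
    print(ip, count, "hits,", len(pages[ip]), "distinct pages")
```

An IP that racks up referrer-less page views across many distinct pages all day long is a better candidate for a bot or monitoring service than a real reader.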
That might have made sense 10 years ago but the majority of people use the internet on their phones. Go out to any park or public space full of people enjoying the sun and see how many of them are not glued to their phones.
I published something of an exclusive story, and the 6 sites that covered it all linked back to me, yet my position is 7th :( It doesn't matter what I do, Google isn't going to rank me.
Uhh, if that's how it's always been for certain publishers, with traffic consistently falling during the summer and on public holidays (and rising again afterwards), why would you expect them to deny that pattern and blame it on the internet dying instead?
@MayankParmar Google probably thinks that you are engaged in buying paid links and hence has kept your site under the scanner.
And what's surprising is that all these guys are linking back to @MayankParmar as the original source. Pretty straightforward as to who should be ranking number one.
This is pretty much exactly what I see after being punished by the June core update. They copy me, link to me as the original source, but they rank above me or instead of me.
Is it possible that Google hasn't yet figured out where the article(s) originated?
That could be the case, but for a search engine as advanced as Google, one that actually penalizes websites for duplicate content, it is crucial that it at least knows who the original author is. If you cannot figure out who the original author is, on what basis do you recognize duplicate content?
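For what it's worth, duplicate detection and original attribution are separable problems: an engine can flag near-duplicates purely from content overlap (shingling plus Jaccard similarity is the textbook approach) and then fall back on an independent signal such as first-crawl time to guess the original. Here is a minimal sketch of that idea in Python, with hypothetical URLs and timestamps; it illustrates the general technique, not how Google actually works:

```python
from datetime import datetime

def shingles(text, k=3):
    """k-word shingles: overlapping word windows used for near-duplicate detection."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap of two shingle sets; values near 1.0 mean near-duplicate content."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical crawl records: (url, first_seen, body text)
docs = [
    ("https://original-site.example/story", datetime(2019, 6, 1, 9, 0),
     "full text of the exclusive story as it was first published"),
    ("https://scraper-site.example/copy", datetime(2019, 6, 1, 14, 0),
     "full text of the exclusive story as it was first published"),
]

a, b = (shingles(body) for _, _, body in docs)
if jaccard(a, b) > 0.8:
    # The duplicate is detected from content alone -- no authorship needed.
    # Attribution then has to come from a separate signal, e.g. first-seen time.
    original = min(docs, key=lambda d: d[1])
    print("near-duplicates; earliest crawled:", original[0])
```

First-seen time is itself an imperfect signal (scrapers sometimes get crawled before the source), which is presumably part of why threads like this one exist.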
Which raises the question: do Google and users even care who the original publisher is? Given two articles on the same subject, Google can pick either one, and for the user it makes no difference (sadly).
The most likely reason is that his site is under an algorithmic penalty. Being outranked by scrapers is a well-known indicator that a site has been penalized.
Ironically, it is the legit site that is under penalty, instead of the scraper sites.
So Google is creating a situation where it's going to be much easier for big sites to consolidate power?
All sites are under some sort of "algorithmic penalty."