I wonder: is it still true that (NOINDEX, FOLLOW), if enough time elapses, will eventually be treated by Google as (NOINDEX, NOFOLLOW), i.e., almost like a 404 status?
Selen, this change in the treatment of long-term noindexed pages is something that Google announced in late 2017... so the robots noindex,follow meta tag is no longer good for anything more than temporarily keeping a page's links followed while keeping the page out of the visible index. Barry quotes John Mueller about this here...
Google: Long Term Noindex Will Lead To Nofollow On Links, Dec 28, 2017 [seroundtable.com...]
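For anyone who wants to see what a given page is actually serving, here's a minimal sketch (Python standard library only; the URL and user-agent string are placeholders I made up) that fetches a page and reports any robots meta directives, along with the X-Robots-Tag response header, since the same directives can also be sent that way:

```python
# Minimal sketch: fetch a page and report its robots meta directives.
# The URL and user-agent below are placeholders -- swap in your own.
from html.parser import HTMLParser
from urllib.request import Request, urlopen


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots" ...> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr_map = dict(attrs)
        if (attr_map.get("name") or "").lower() == "robots":
            self.directives.append(attr_map.get("content") or "")


def robots_directives(url):
    # A plain GET; directives may also arrive in the X-Robots-Tag header.
    req = Request(url, headers={"User-Agent": "meta-robots-check/0.1"})
    with urlopen(req) as resp:
        html = resp.read().decode("utf-8", errors="replace")
        header = resp.headers.get("X-Robots-Tag", "")
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives, header


if __name__ == "__main__":
    directives, header = robots_directives("https://example.com/")
    print("meta robots:", directives or "none")
    print("X-Robots-Tag header:", header or "none")
```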
-----
Pjman posted... I manage a bunch of sites that are in a similar niche. Sometimes I get so much content thrown at me from my writers that I lose track of which sites I put the content on. // We are doing an SEO audit and realized that 3 of the sites have 20% of the exact same PDFs live on them...
Pjman, I'd try to keep better track of this content. There are more or less two approaches to handling multiple sites in the same niche, and I've needed to use spreadsheets to track the content when very many sites are involved:
- If they're openly related, like franchises of a large company, I try to keep them as individualized as possible, adding as much differentiation (localized content, for example) as possible. In this situation, I don't necessarily hide the fact that the sites are related, but I might use nofollow links to interlink them.
- If they're not openly related, then I do all I can to keep them separate... avoiding common hosting, common inbound linking sources, and duplicate content... everything as separate as I can keep it.
If your situation is the latter, and if you're competing for the same search queries, then I think it can only hurt you to have very many elements in common, and I'd strive to eliminate dupe content on pages I want to rank. Various factors can determine whether it's natural or unnatural for independent sites to share the same content.
PS: I should add that much of my concern is not the dupe content per se (although if duplication leads to some of the duped pages dropping out of the SERPs, and you have no control over which versions drop out and which stay, then that certainly is a problem).
For me, though, the greater problem would be the signal of coordination that common content across a bunch of niche properties would send to Google.
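Since tracking this by hand gets unwieldy, here's a rough sketch of how I'd audit for exact-duplicate PDFs across a set of sites, assuming you can pull each site's files into a local folder (the folder names below are placeholders). It hashes every PDF and flags any hash that appears under more than one site; note it only catches byte-for-byte duplicates, not near-duplicates or reworded copies:

```python
# Minimal sketch for spotting exact-duplicate files (e.g., PDFs) across sites.
# Assumes one local folder per site; the paths below are made-up examples.
import hashlib
from collections import defaultdict
from pathlib import Path

SITE_DIRS = {
    "site-a": Path("exports/site-a"),
    "site-b": Path("exports/site-b"),
    "site-c": Path("exports/site-c"),
}


def sha256_of(path, chunk_size=1 << 20):
    # Hash the file in chunks so large PDFs don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def find_cross_site_duplicates(site_dirs):
    seen = defaultdict(list)  # hash -> [(site, path), ...]
    for site, root in site_dirs.items():
        for pdf in root.rglob("*.pdf"):
            seen[sha256_of(pdf)].append((site, pdf))
    # Keep only hashes that show up under more than one site.
    return {h: hits for h, hits in seen.items()
            if len({site for site, _ in hits}) > 1}


if __name__ == "__main__":
    for file_hash, hits in find_cross_site_duplicates(SITE_DIRS).items():
        print(file_hash[:12])
        for site, path in hits:
            print(f"  {site}: {path}")
```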