Problems with ranking and traffic in old established sites

     
4:01 pm on Sep 26, 2016 (gmt 0)

Junior Member

joined:Sept 25, 2016
posts:60
votes: 18


Mod's note: Moved from the September 2016 Google Updates and SERPs thread [webmasterworld.com...] as it deserves a separate discussion...

lightnb posted...
@westcoast: what you described is exactly what we're seeing. We're an e-commerce site with over 20 years online, and we always follow the rules and get nowhere. Every time we do what Google says, Google traffic drops. Every time we do nothing, Google traffic drops.

You are not the first to describe this, either. I have now heard of a number of circumstances like ours. The factor that seems to be the same in all of these "constant irrational downward ranking pressure" cases is old AGE and large SIZE. It's my theory that there is some algorithmic bug or algorithmic interaction specific to Google which negatively affects sites in the 15-20 year old range.

Our site's #1 keyword which we held for 19 years suddenly gave way to a new website 2 months ago. This new site has nearly no backlinks and no content. The ONLY thing it has going for it is that its name is "KEYWORD.XYZ". This indicates to me that our site has been hit by massive algorithmic penalties, the type of which we have no insight into.

Our site's second most relevant keyword has slipped from #1 down to #22 over a period of 2 years on Google. Bing still has us #1, Yahoo has us at #2 (and the #1 site on Yahoo has a legitimate claim to that spot). The algorithmic penalties are constant, and ranking pressure is steadily down.

My theories are:

1. Backlink complexity: Webmaster tools shows 836,729 backlinks to our site. The root domain has 118,279 backlinks across 1,988 domains. We have a widget (yes, links in it are NOFOLLOW) which accounts for 300,000 or so links.

120,000 backlinks are assigned to a website that cloned our site a year ago and inserted malware into their copies of our pages (why on earth Google is listing a hacked site that THEY identified as such, and counting backlinks from it when it has cloned our site, is beyond me). That hacked site has been offline for months, but Google still retains their URLs, still retains the backlinks, etc. God knows what sort of penalty we are under for that. God knows why Google ignored our every attempt at having that site's rankings removed. Since Google is listing this hacked site's links as backlinks in WMTools, I have no confidence that they aren't penalizing the crap out of our site.

Anyway, over a period of 20 years you accumulate a crap load of spam links. We are in part an informational/educational site, and so a ton of spammers use links to us to try to make themselves look more real. i.e. they'll spam some allergy medication, and then link to a page on our site somewhat related to allergies. Year after year, it's just what happens. We collect a lot of stuff that is clearly not designed to advantage us.

Using one of those "link detox" type sites, I found 480 domains linking to us that looked either spammy or fishy or whatnot. Some are a little fishy, but may be considered good backlinks. The dilemma is, do I disavow all or some of these? And if yes, which ones? And who knows how many more were missed because of incomplete backlink data? Can we hurt our site by disavowing these? Google says "yes. yes you can hurt your site with disavow". Disavow when you're talking about 1900+ linking domains is a complete nightmare.

It seems to me to be completely unfair (and increasingly, impractical because of scope) that WE are responsible for identifying 20 years of crap links that other people have linked to our site.

So who knows what sort of penalties Google has levied upon us for using and/or not using disavows (I switch back and forth between our 480-domain disavow list and no disavow file every 6 months or so because frankly neither seems to help, and I keep hoping at some point Google will treat this stuff properly).

2. 301 complexity: Old sites like ours have reorganized, moved stuff around, etc. many times. We have extremely large numbers of 301s (all valid, with no more than 2 "hops"). It is my fear that Google has some bug in its 301 code that becomes problematic with large 301 landscapes. Our site has something like a million active 301 URLs. Add in all the 301s that translate domain.com to www.domain.com and other such combos, and we're looking at a few million 301s. Is Google confident that this massive number of valid, legacy 301s is being handled correctly? What about 301s that once existed and were taken down years ago? Are they causing issues today?

3. 404 linking bug: As mentioned, we had 90,000 URLs pop up in WMTools recently. When you click on the "linking from" tab for these, they show URLs that have not contained links to the page in question for YEARS. In fact, some of our "linking from" URLs have NOT EXISTED for 18 years!

Google thus appears to have a memory of every link a 404'd page has EVER had, and does not update the existence or linking status of those referring pages. The only 404 that really matters is one that still has a REAL, LIVE page linking to it. Those are the URLs a webmaster cares about.

So I am truly worried that Google is counting these "phantom linked from pages" against us in some way or another. Google has that data, and shows that data in WMTools. The number of historical 404s old/large sites accumulate over the years can be quite massive too, which is another reason I think this might be a Google bug.

4. Content freshness run amok: We have a TON of content labelled with posted dates starting in 1999 and going forward. We do our best to keep such content up-to-date, and have a good system in place for users to report inaccuracies. So, we keep stuff as accurate as possible. I wonder at times if Google has labelled our entire site as "unfresh" because of the large number of old timestamps on some of our posts. It appears that if Google sees 2 dates on a page (created/posted, last updated/modified), it only cares about the created/posted date. At least, that's the one that seems to pop up in the SERPs. I wonder if having old created dates slowly penalizes a site year after year after year. It might explain the constant downward pressure.

Anyway, just some ideas. It does seem to be that Google has some unintended algorithmic issue with some larger/older websites.

The most demoralizing aspect of this is that there is no way to lay out this issue or case to Google. If you get a manual penalty, at least you are told what it is, and have recourse to converse back with Google. Some of these algorithmic penalties, particularly when layered on top of each other and obfuscated, seem to be even WORSE than manual penalties. The difference is there is no recourse with algorithmic penalties. There's no feedback, no way to say "hey, this penalty makes no sense". There's no feedback-corrective mechanism, so bugs and unintended consequences could last for years. Intentional black-hat (manual action) gives you a feedback mechanism, whereas following the rules and getting caught up in opaque algorithmic webs does not -- which seems a bit upside down.

<Google needs to take a look at how it> treats older websites with large amounts of legacy 301/404/usage/dated content/history data in its systems. As systems have become more complex there is a greater chance for unintended consequences.


[edited by: Robert_Charlton at 6:07 pm (utc) on Sep 26, 2016]
[edit reason] moved from another location, fixed formatting, and some Charter issues [/edit]

6:40 pm on Sept 26, 2016 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12225
votes: 361


westcoast, I feel that this is a topic that deserves a discussion of its own, and have moved it from the Google Updates and SERPs thread [webmasterworld.com...] where the discussion began.

Also, as our Forum Charter [webmasterworld.com...] states, this forum is not a place to send messages to Google. The Charter suggests ways of doing that... and Google now has a number of channels open for feedback.

Regarding your particular issues, it's probably more productive to discuss what you see and get feedback from members here about what they might mean, rather than to complain about them.

It may well be that in a 20 year old site, there's a lot of housekeeping you've neglected to do, and issues from that may be catching up with you.

Again, your 404 issue specifically was at least partially covered in your post in our legacy crawl errors discussion [webmasterworld.com...] but you haven't acknowledged that you've at least read those comments.

...large amounts of legacy 301/404/usage/dated content/history data
If you have large numbers of chained redirects, at some point Google will lose that trail. We've had numerous discussions on the topic, and our site search is your friend. Here's a recent discussion in which I also point to some earlier threads...

Multiple 301 redirects bad?
June 2015
https://www.webmasterworld.com/google/4751457.htm [webmasterworld.com]
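One practical way to keep chains from piling up on a site that size is to flatten the redirect map offline, so every legacy URL 301s straight to its final destination in a single hop. A minimal sketch of the idea in PHP, assuming a hypothetical $redirects array standing in for however the real map is actually stored:

<?php
// Hypothetical legacy redirect map: old path => new path.
// In practice this would come from a database or the server's rewrite rules.
$redirects = [
    '/old-catalog/widget.html' => '/catalog/widget.html',
    '/catalog/widget.html'     => '/products/widget',   // a second hop
    '/about-us.htm'            => '/about',
];

// Resolve each entry to its final destination so no chain is ever served.
function flatten_redirects(array $map): array {
    $flat = [];
    foreach ($map as $from => $to) {
        $seen = [$from => true];
        while (isset($map[$to]) && !isset($seen[$to])) { // follow the chain, guard against loops
            $seen[$to] = true;
            $to = $map[$to];
        }
        $flat[$from] = $to;
    }
    return $flat;
}

print_r(flatten_redirects($redirects));
// '/old-catalog/widget.html' now points straight to '/products/widget'.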

Link rot is another topic that's been discussed from time to time.

Ranking changes in old sites are a topic in themselves. Google has changed the way it looks at many queries. See this discussion...

Google Quality Rater Guidelines Update March 28
April 2016 - July 2016
https://www.webmasterworld.com/google/4799052.htm [webmasterworld.com]

Unfortunately, many old-school SEOs are still not picking up on this issue. Complaining to Google is not going to solve your problems. Beyond technical issues, the problems involve old ways of looking at a site and at online user experience. I think that systematically examining and discussing the likely issues, with less complaining, might help more.

I'm sure members here have lots of insight into problems which can affect an old and perhaps complex site, and I hope we can productively discuss and share them. We're looking for good discussion though... not for dealing with anti-Google rants.


Removed some dupe copy and paste errors.

[edited by: Robert_Charlton at 10:28 pm (utc) on Sep 26, 2016]

6:56 pm on Sept 26, 2016 (gmt 0)

New User

Top Contributors Of The Month

joined:July 14, 2015
posts: 23
votes: 2


Regarding the 404's, Google likes to come back to those to see if they are still gone. We changed our server to return a 410 instead of a 404 and then Google figured it out.
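A minimal sketch of that kind of change at the application level (PHP; the list of removed paths is hypothetical, and the same thing can also be done in server config):

<?php
// Hypothetical list of paths that are permanently gone.
$gone = [
    '/discontinued-product-123',
    '/old-promo-2009',
];

$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

if (in_array($path, $gone, true)) {
    // 410 Gone tells crawlers the removal is permanent, so they tend to
    // stop re-checking sooner than they do with a plain 404.
    http_response_code(410);
    echo 'This page has been permanently removed.';
    exit;
}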

I'm pretty sure we found a bug when we launched our new site. We did https everywhere at the same time we launched the responsive layout update. We think Google traffic dropped because of a duplicate content penalty vs. the non-https version, because Google didn't understand that it was a move (despite 301s) and GWMT has no change of address for http to https moves. Another site had this problem, but because they were a big company, Google just went in and fixed it for them. I posted on the Google forum and tried to message a Google engineer on G+ and got crickets.

And you're right, Google's lack of any type of investigative support desk is really unacceptable. But what can you do? Blast spam at yourself so you can get a manual penalty? I think that even if you get the manual penalties removed, the algo ones still apply anyway. You could always complain to the FTC: [ftc.gov...] If they get enough complaints maybe they will do something.
7:16 pm on Sept 26, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3524
votes: 324


So is this a "content farm"? That's the impression I get from reading between the lines.
8:21 pm on Sept 26, 2016 (gmt 0)

Junior Member

joined:Sept 25, 2016
posts:60
votes: 18


Guys, I was merely responding to another user in that other long thread [the text of which would make this thread make more sense] who stated he was suffering from a similar situation as us. It was not "whining" or "complaining", we do not run a "content farm", and I am most definitely not "anti-google". Please note that I did NOT create this thread, nor was I trying to create a big fuss.

In fact, to the contrary, I'd love to make Google happy. I use it all the time, think it generally offers the best results outside of some edge cases, and try to abide by its guidelines.

My response was merely a suggestion that it is not out of the realm of possibility that over time Google's algorithmic complexity may create unintended consequences when dealing with 20 years of data. Clearly this is not a provable (and perhaps not even probable) circumstance, but it is something that is most definitely possible. If many other large, old sites suffer from degrading rankings in Google and only Google *after taking care of all the standard SEO stuff*, then perhaps there is an issue. If not, then not. It is not at all unreasonable (nor "whiny") to suggest that complex algorithms dealing with complex data may result in unexpected results.

Now, on to the actionable SEO stuff:

- 301 redirects: yes, we try to keep redirects to 1 hop. I think we have some that are 2, but certainly avoid chains where possible.

- content farm: no.

- 404s: My concern lies primarily with the "linked from" data, which shows URLs that have not existed for years. On the surface this would indicate that 404 data is stale. Perhaps it's an irrelevant display bug/issue. Perhaps not. Was just mentioning it.

- 20 year housekeeping: yes, totally agree. We are working to update broken citation URLs that point to external webpages which no longer exist, keep data fresh and accurate, etc. It is an ongoing battle, and clearly something that is our priority.

- link rot: we would appear to have a pretty healthy link profile, even with ancient links disappearing over time. The SEO firm we dealt with thought our backlinks were good and healthy. The 120,000 backlinks from the now-offline hacked site are still a major concern of mine, but there's no way to prove or disprove that it is harming us.

- changing standards over time: while we didn't think Panda affected us, we hired an SEO firm who told us we had pages that should not be in the index. We ended up NOINDEXing a large number of pages which, while useful to our users (like user profiles), were repetitive and template-looking to search engines (a minimal example of the noindex mechanics is sketched after this list). In the old days having a lot of pages mattered and was advantageous. We have shrunk the number of pages indexed to remove some of that non-important stuff. No changes from that Panda-correction stuff though, so perhaps "realtime rolling panda" hasn't gotten to us yet.

- The backlink mess though that accumulates over time is troubling. Given approx 2000 linking domains, which specifically should we disavow and which shouldn't we? I can create a list of over 400 domains with links to us that look spammy or low quality, but Google states that disavow should only be used if absolutely necessary. So is it absolutely necessary for us? I can't tell. Understand that we have never advertised on another site, let alone bought a link or participated in any sort of link networks. Weird, spammy looking links out there were created by people trying to make their sites look more real, or by page-spinning bots that simply did queries and sucked our URLs along with lots of others into spun articles.
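On the NOINDEX point a couple of items up: for reference, the mechanics are small. A minimal sketch (PHP; the thin-page context is just an example) showing the two usual ways to keep a page out of the index while still serving it to users:

<?php
// Hypothetical thin page (e.g. a user profile) we want kept out of the index.

// Option 1: an HTTP response header, handy when the page is generated by a script.
header('X-Robots-Tag: noindex');

// Option 2: a robots meta tag emitted in the page <head>.
echo '<meta name="robots" content="noindex">';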

BTW thanks for posting the "quality rating" guideline document. Will have to read that completely.
9:07 pm on Sept 26, 2016 (gmt 0)

Junior Member

10+ Year Member

joined:Aug 14, 2008
posts:80
votes: 4


Westcoast - what you are describing sounds like you work with us! We are a 16-year-old informational-type site and since Jan 2015 have been getting less and less organic Google traffic no matter what we do. We suffer the exact same pains as you described. Sorry this post isn't more useful but I want to say "me too!" :)
9:32 pm on Sept 26, 2016 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12225
votes: 361


Please note that I did NOT create this thread, nor was I trying to create a big fuss.
westcoast, thanks. I created this thread as the most efficient way, I felt...
a) to move the discussion off the Updates thread, where it was taking the entire thread off topic. I realize you didn't start the discussion, but your post was the easiest place to split the thread...
b) to bring attention to the problems your own site is facing
c) to also bring attention to problems that similar sites are facing.

The comments about complaints and whining were addressing members in general who like to use this forum as a platform. This skews the entire forum, and it needs to be stopped. Sorry if you were an innocent bystander, but thank you for the clarification.

404s: My concern lies primarily with the "linked from" data, which shows URLs that have not existed for years.
Again, you should have no concern, but you will have to do some reading to catch up with the current discussion on the topic here...

Massive jumps in GSC legacy crawl errors - who sees this?
9/8/2016
https://www.webmasterworld.com/google/4817870.htm [webmasterworld.com]

I know you're aware of the discussion because you posted on it, and I'd posted a long answer to your question, along with further article suggestions from John Mueller. Your issue is the one most clearly identified as something not to worry about... unless for some reason these urls are showing up in the first page of the GSC's report, in which case I would contact John Mueller.

less and less organic google traffic no matter what we do
Regarding traffic questions in spite of rankings, I'm coming around more and more to thinking that many members with the problem aren't making good use of analytics.

Topics like conversion optimization and user experience aren't even being mentioned. They are much more important these days than keyword density... and many site owners here aren't getting that point.

[edited by: Robert_Charlton at 10:29 pm (utc) on Sep 26, 2016]

9:35 pm on Sept 26, 2016 (gmt 0)

Moderator This Forum from US 

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 11, 2000
posts:12225
votes: 361


PS, OldFaces, it's also possible that queries for your subject area have dropped off. I don't know that, but I've seen many cases where this is the reason.
9:44 pm on Sept 26, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3524
votes: 324


One possible explanation for a long-term traffic decline is that the competition has steadily gotten better over the years. I've certainly seen this happen in several of the niches that I work in. If someone publishes an article on a big news site or educational institution site, it almost automatically gets a high ranking.
3:35 am on Sept 27, 2016 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Feb 3, 2014
posts:1299
votes: 380


I've told a similar story for years about my 16 year old site. Same problems you describe. So that raises the question: if all these sites are bleeding traffic and internet usage is rising, then where is it all going? Hint: G posts 20% increases almost every quarter. The landscape has changed and what worked 5 or 10 years ago is no more. Adapt or perish.

In my case, competition hardly exists in my vertical, so Google slowly filled in page 1 with mega brand sites that hardly require a search engine to be found... like Pinterest. This pushed us down and out of page 1. Presumably these mega brand sites churn ads better, and apparently ads beat slinging our content for free. Pretty simple. The mechanism seems to reside in the portion of the algo that deals with semantic and long tail search, which is where we continue to slowly lose ground these days.
3:55 am on Sept 27, 2016 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Oct 3, 2015
posts:132
votes: 64


@westcoast,
One of the things I like to consider with my older domains is how well they work in today's environment. Meaning, when I developed them, people were solely using desktops and were looking for information on a subject. That was fun while it lasted. The environment today is increasingly mobile, so a publisher needs to provide their information somewhat based on user location. And here is where we need to stop. And think.
It's probably not enough in today's environment to just use a database and draw city, county, state, country and plug all that together with a product line and expect decent results. You want that user to click your link and never go back to the search results.

Best of luck with it.
4:34 am on Sept 27, 2016 (gmt 0)

New User

Top Contributors Of The Month

joined:July 14, 2015
posts: 23
votes: 2


404s: My concern lies primarily with the "linked from" data, which shows URLs that have not existed for years. On the surface this would indicate that 404 data is stale. Perhaps it's an irrelevant display bug/issue. Perhaps not. Was just mentioning it.


We had this problem too. For us, it was because URLs 301 redirected to 404 pages. Google is not smart enough to treat the source page as a 404, and instead sees the 301ing URL as a "link" to a missing page. We solved this by building a script called GoogleHandHolder.php that gets called whenever the server can't find a page outright. The GHH does a look-ahead using cURL to see if the redirection would go to a 404, and if so, returns a 410 instead of the 301. This cleaned up that issue for us.
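A rough illustration of that look-ahead idea (this is not the poster's actual GoogleHandHolder.php; the lookup_legacy_redirect() helper, the map, and the URLs are hypothetical):

<?php
// Hypothetical: returns the URL the legacy redirect rules would send this
// missing path to, or null if there is no mapping at all.
function lookup_legacy_redirect(string $path): ?string {
    $map = ['/old-cart/item-42.html' => 'https://www.example.com/products/item-42'];
    return $map[$path] ?? null;
}

$path   = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$target = lookup_legacy_redirect($path);

if ($target !== null) {
    // Look ahead with a HEAD request to see what the redirect target returns.
    $ch = curl_init($target);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request only
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow any further hops
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($status === 200) {
        header('Location: ' . $target, true, 301);   // target exists: redirect as usual
        exit;
    }
}

// No mapping, or the mapped target is itself missing: answer "gone"
// rather than 301ing a crawler into a 404.
http_response_code(410);
echo 'This page has been permanently removed.';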
4:42 am on Sept 27, 2016 (gmt 0)

New User

Top Contributors Of The Month

joined:July 14, 2015
posts: 23
votes: 2


Google states that disavow should only be used if absolutely necessary. So is it absolutely necessary for us? I can't tell.


Regarding disavow files, around May of this year we found a competitor (we presume) building lots of comment spam with EMAT for a certain keyword, which dropped from about #20 to #50, then #70 as they built spam. We had never used disavow before. We pulled all ahrefs- and WMT-supplied links, and disavowed several hundred domains, including the perfectly legit sites that had the EMAT comment spam. After about three weeks, this keyword returned to about #25. So I think it does have some limited effect, unless those comment spam links were actually helping us and Google just did their demote-to-confuse-then-boost routine. But I think not, because it fell too fast, whereas normal link building just causes jiggling, not plummeting.

As for Penguin, we have over 700 domains disavowed. Google traffic since last Tuesday is crap, and we were expecting a recovery when Penguin re-runs. But nothing exciting yet. I'll let you know in a couple weeks if we see any recovery.
4:54 am on Sept 27, 2016 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Oct 3, 2015
posts:132
votes: 64


For us, it was because URL's 301 redirected to 404 pages. Google is not smart enough to treat the source page as a 404, and instead sees the 301ing url as a "link" to a missing page

Why would you 301 to a 404 page?
Regardless of how "smart" Google may be, linking a 301 redirect to a 404 page is lame.
5:13 am on Sept 27, 2016 (gmt 0)

New User

Top Contributors Of The Month

joined:July 14, 2015
posts: 23
votes: 2


Why would you 301 to a 404 page?
Regardless of how "smart" Google may be, linking a 301 redirect to a 404 page is lame.


Because when you migrate a huge site from one shopping cart architecture to another, you have to use scripts to do the redirects as a matter of practicality. If a product was on the old cart, but then is no longer carried (new cart page deleted), there is no matching new page and the old page's redirect goes to a new URL that 404s.
6:11 am on Sept 27, 2016 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Oct 3, 2015
posts:132
votes: 64


410 is different from 404, and there isn't much point in a redirect to a page that isn't there.
6:54 am on Sept 27, 2016 (gmt 0)

Junior Member

joined:Sept 25, 2016
posts:60
votes: 18


"We had this problem too. For us, it was because URL's 301 redirected to 404 pages. "

In our case, many of the "Linked From" pages are one of two things:

- Pages that are empty 404s and have been for quite some time. There are no links to the 404 page in question, and haven't been for some time. WMTools reports phantom links... lots of them.

- Pages that are http 200 active pages, but have not had links to the "Linked To" page in many, many years.

It's just very old data. Clearly Google has crawled the "Linked from" pages since, but the WMTools page doesn't reflect links getting updated. It's as if it's only showing links as they appeared when the page was crawled for the very first time. Subsequent crawls in many cases don't appear to get picked up in that data.
9:07 am on Sept 27, 2016 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Apr 15, 2004
posts: 541
votes: 81


@westcoast

120,000 backlinks are assigned to a website that cloned our site a year ago and inserted malware into their copies of our pages (why on earth Google is listing a hacked site that THEY identified as such and are counting backlinks from it when it has cloned our site is beyond me).

Are you seeing links in your WMT from a completely different site that has been hacked and has cloned your site? If so, has the clone done a 301 into your site at all? That would explain the link transfer. Maybe it is a sort of negative SEO attack, and Google thinks your site is not secure either, since a hacked site has been redirected into it?
5:16 pm on Sept 27, 2016 (gmt 0)

Junior Member

joined:Sept 25, 2016
posts:60
votes: 18


"Are you seeing in your WMT links from completely different site that has been hacked and has cloned your site as well? If this is true has the clone done 301 into your site at all that will explain the links transfer? Maybe it is a sort of negative seo attack google thinks that your site is not secure either since a hacked site has been redirected into it?"

The cloned site simply took our pages, copied them exactly, and inserted some javascript malware on their domain.

Googling site:theirdomain.com still shows 45,000 pages from our site still in their results, even though the pages have been 404ing for months now. The screwiest bit is that if you do a google cache lookup of THEIR domain, all of the pages cache-redirect to OUR domain's copies, which don't include the malware insertion.

In our WMTools, 120,000 backlinks ("Links to your site") are being attributed to that hacked/clone site. The reason there are backlinks is that some of our pages used absolute URLs, and since they copied our pages exactly, they ended up having absolute URLs back to our site. So, Google thinks there are 120,000 links from a clone site to our site.

I can only assume we are being penalized, because the last year has been particularly dire for traffic, but there's no way to tell, and it has been completely impossible to get anyone at Google to look into the matter. We tried filing DMCA requests, but they kept asking for one URL at a time. Submitting 48,000 DMCA requests is impossible. I finally got someone in DMCA, but then they told me that they "could not validate the infractions because the infracting pages are not responding". Well yeah, because we had them shut down.

The problem is, the 48,000 page cache is still there. The site:copieddomain.com pages are still there in the index. The 120,000 shady backlinks from there to our site are still there. And there's no way to have anyone at google do anything about it.

We have sent a few notifications to Google's spam / malware forms over the past year, every month or two, but have never heard anything back from them.
6:28 pm on Sept 27, 2016 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Apr 15, 2004
posts: 541
votes: 81


@westcoast

Google prefers to treat spam algorithmically, but a junk site copying your content and having its backlinks attributed to your site is really crazy, and it opens the door to negative SEO. Have you got problems with zombie traffic as well, like a few of us here? We get a lot of traffic that converts one day, and then conversions go dead for a few days.
1:19 pm on Sept 28, 2016 (gmt 0)

Junior Member from US 

10+ Year Member

joined:Oct 31, 2005
posts:196
votes: 11


westcoast - It was really interesting for me to read this post of yours because we had a similar problem, see my post here [webmasterworld.com...]

After a lot of money and time and work - and I mean a lot of all three - my final conclusion on the slow death of a 12-year-old site of ours is that Google has 'tagged' the site and it will never, ever, ever get any significant Google organic visits again, no matter what we do. That is the important part: "no matter what we do'.

Maybe the 'tagging' is because of the 404's, the 301s, the structure, the old content, the link profile, it doesn't really matter. What matters is that it is forever 'tagged'. We changed everything and more (and did disavows) and nothing ever made any impact at all.

It was sooooo hard for me to give up on that site because it was like a baby of mine in a way and I was very fond of it and had put so much into it for so many years, I just didn't want to quit on it. But after years of struggling and never getting anywhere with it (never seeing an improvement), after tons of money, testing, disavows, structure changes, link cleaning, new links, etc., and seeing nothing work, I finally gave up and started a new site with a clean slate.

There is definitely something going on with some old sites and Google's lack of love, I just don't know what exactly. But I am sick of trying to figure it out. I know it's hard to hear, but you ought to start over too. Good luck.
3:06 pm on Sept 28, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Sept 12, 2014
posts:384
votes: 67


Maybe the 'tagging' is because of the 404's, the 301s, the structure, the old content, the link profile, it doesn't really matter. What matters is that it is forever 'tagged'. We changed everything and more (and did disavows) and nothing ever made any impact at all. 



Or maybe your site was tagged as possibly trying to game the search engine and put in limbo for a while. When you responded with everything you mentioned, it confirmed, to Google, that you were indeed gaming.

My site is 21 years old. It has disappeared from the SERPs a number of times over the years, and I just ignored it and kept doing what I do, even though every fiber in my body wanted to react and make changes to fix it. The site always reappeared back in the same places in the SERPs. Now I rarely look at the SERPs because I don't know how they reflect on my site. Maybe when my site disappeared Google was testing to see if another site was a better fit, or maybe they were suspicious and were testing my motivation.

I get to talk to my viewers and they rave about the site, so why would I change anything because of Google? My site was around before Google and it will be around after Google is gone.

If you truly built a site for consumers and you are proud of your work, why would you change it?
6:55 pm on Sept 28, 2016 (gmt 0)

Junior Member from US 

10+ Year Member

joined:Oct 31, 2005
posts:196
votes: 11


Oh yeah, now I remember why I don't post here very often...
7:05 pm on Sept 28, 2016 (gmt 0)

Junior Member

10+ Year Member

joined:Aug 14, 2008
posts:80
votes: 4


Hi Robert- Sorry for the delay. No, queries remain consistent for our keywords. Good point though - I agree that search volume is often overlooked during analysis.

Mosxu brings up another great point. One thing we've noticed with our 16 year old site is that only in the last two years have we begun to see clone websites. We do our best to find them and then send DMCA requests (which normally work), but we're not sure of the impact of these clones.

Another point I'd like to bring up with large scale old sites - we've noticed our overall index rate has declined significantly over the last few years. Pages are updated - but only a small subset, as users find the pages - so maybe it's the age of some of our pages that haven't changed much (besides programmatic layout changes, etc.).

Again, really appreciate you getting so many great minds on this subject WestCoast.
10:11 pm on Sept 28, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3365
votes: 707


My site is 21 years old. It has disappeared from the SERPs a number of times over the years, and I just ignored it and kept doing what I do, even though every fiber in my body wanted to react and make changes to fix it. The site always reappeared back in the same places in the SERPs.

That happened to us once, more than a decade ago, when Google indexed www and non-www versions of our site separately. Apparently a duplicate-content penalty raised its ugly head, because we lost about 90 percent of our Google traffic over a two-month period.

(The problem was solved with a tweak to our .htaccess file. Nowadays it probably wouldn't happen, thanks to a setting in Google Search Console that didn't exist in the olden days.)
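For anyone hitting the same www/non-www duplication today, the usual fix is still a redirect to one canonical hostname (typically a mod_rewrite rule in .htaccess, as described above). An equivalent application-level sketch in PHP, with a hypothetical canonical host:

<?php
// Hypothetical canonical host; the same rule is more commonly written in
// .htaccess / server config, this is just the idea at the application level.
$canonical = 'www.example.com';

if (strcasecmp($_SERVER['HTTP_HOST'], $canonical) !== 0) {
    // Redirect any other hostname (e.g. the bare domain) to the canonical one.
    header('Location: https://' . $canonical . $_SERVER['REQUEST_URI'], true, 301);
    exit;
}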
1:26 pm on Sept 29, 2016 (gmt 0)

Preferred Member

Top Contributors Of The Month

joined:Sept 12, 2014
posts:384
votes: 67


@aok88
Sorry if I offended you, but you don't have a 12-year legacy site after you 'changed everything and more'. You have a 12-year-old domain with a new site, per your description.
9:08 pm on Sept 29, 2016 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 5, 2004
posts: 517
votes: 48


@westcoast A disavow file worked pretty well for my 14 year old site in 2015. I initially disavowed about 100 domains and urls. I only picked the worst offenders.

I just recently disavowed another couple hundred domains (mainly .website and .xyz domains) that were pure spam created by someone pointing links at my site and a bunch of others. A lot of the spam sites did not work, but Google was reporting the links, so I have added them to the disavow file. Hopefully this will make a difference during the recent flux.
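For reference, the disavow file itself is just a plain-text (.txt) file uploaded through Search Console's Disavow Links tool: lines beginning with # are comments, a "domain:" prefix disavows an entire domain, and a bare URL disavows a single page. The entries below are hypothetical, not anyone's actual list:

# spam network pointing at the site, added 2016-09
domain:spammy-links-example.xyz
domain:cheap-meds-example.website
# one bad page on an otherwise legitimate forum
http://forum.example.org/thread/12345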

BTW did you ask Google about the one-URL DMCA requests? This drives me nuts. Why can't we specify an entire site? We get people throwing up Wordpress sites with our content but different layouts, etc. (our site is not Wordpress). These sites have stolen hundreds of pages of content, which makes it impossible to go URL by URL to have all the pages removed from the search results.
10:34 pm on Sept 30, 2016 (gmt 0)

Junior Member

joined:Sept 25, 2016
posts:60
votes: 18


"my final conclusion on the slow death of a 12-year-old site of ours is that Google has 'tagged' the site and it will never, ever, ever get any significant Google organic visits again, no matter what we do. That is the important part: "no matter what we do'. "

YES! This is EXACTLY the problem. We have made MASSIVE changes to our site to improve quality and authority over the years, and it has had no effect whatsoever. I assume if we had done nothing we'd be exactly where we are today. It's crazy. It really does seem like some part of the Google algorithm is just dominating all other signals. The idea of moving to a new domain is an interesting one, but would be one massive hassle. I have no doubt though that if we did that, our rankings would skyrocket.

"BTW did you ask Google about the one url DMCA requests? This drives me nuts. Why can't we specify an entire site? We get people throwing up Wordpress sites with our content but different layouts, etc.. (our site is not Wordpress). These sites have stolen 100's of pages of content which makes it impossible to go url by url to have all the pages removed from the search results."

Yes, it is immensely frustrating. We have now had 2 cases where sites have copied thousands of URLs (the current one is 50,000 URLs). It would take someone at Google precisely 5 seconds to verify that yes, an ENTIRE DOMAIN has copied our site. Heck, the page titles even have our domain name plastered all over them! But no, there is no way to get Google to remove this stuff from the index or cache. They ignore reports sent to the malware / spam addresses they provide. We are 8 months in on the current cloning attack and Google still hasn't done a thing about it.
1:06 am on Oct 1, 2016 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3524
votes: 324


We have made MASSIVE changes to our site to improve quality and authority

What did you do to improve its authority?
1:17 am on Oct 1, 2016 (gmt 0)

New User

Top Contributors Of The Month

joined:July 14, 2015
posts: 23
votes: 2


The idea of moving to a new domain is an interesting one, but would be one massive hassle. I have no doubt though that if we did that, our rankings would skyrocket.


I'm testing this theory on a new site to see how a new one can rank. Just be sure not to use a change of address, 301s, or WHOIS records that will give you away and could pass the "taint" on.