Traffic May 9 dropped 75% after a year's climb - are we penalized?

6:10 am on May 10, 2008 (gmt 0)

New User

10+ Year Member

joined:Dec 31, 2006
posts:29
votes: 0


The site is white hat, the largest authority site in its niche (information). Over the past year or more we have been enjoying a steady climb in traffic and traction in the SERPs. It is an old domain, PR5, and it recently acquired sitelinks.

Yesterday we had some performance problems in the afternoon: a rogue spider decided to crawl our site without throttling and basically took out the server. By the time I got it rebooted, it had been down for an indeterminate amount of time (the logs show it still accepting connections, but it was totally locked up and basically unable to process the requests).
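
(An aside for anyone fighting the same thing: below is a minimal sketch of the kind of per-IP throttle we are now looking at putting in front of the app. It assumes a Python/Flask stack purely for illustration - that is not necessarily what your setup looks like - and the window and threshold numbers are illustrative, not tuned values. It is also in-memory and per-process, so treat it as a starting point rather than a drop-in solution.)

    import time
    from collections import defaultdict, deque

    from flask import Flask, abort, request

    app = Flask(__name__)

    WINDOW_SECONDS = 10        # sliding window length (illustrative)
    MAX_REQUESTS = 50          # requests allowed per IP per window (illustrative)
    hits = defaultdict(deque)  # ip -> timestamps of recent requests

    @app.before_request
    def throttle_aggressive_clients():
        ip = request.remote_addr
        now = time.time()
        recent = hits[ip]
        # Drop timestamps that have fallen out of the sliding window.
        while recent and now - recent[0] > WINDOW_SECONDS:
            recent.popleft()
        recent.append(now)
        if len(recent) > MAX_REQUESTS:
            abort(503)  # overloaded: tells well-behaved crawlers to back off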

So I think, OK, fine, no problem. I look at GA this morning, and traffic is down by 75%. That seemed kind of high for the amount of time I thought we were down, but I was still thinking, OK, a one-day hiccup. Sucks, but what can you do...

Today, though, I noticed that the server's outgoing transfer rate was less than half of what it should be on a typical day. Uh oh...

So then I started looking up some common queries for the site, and suddenly saw that we have been demoted in the SERPs big time: not on every single term, but pretty much across the board. It didn't seem to be specific URLs or subdirectories. The home page is 'penalized' for sure, or at least re-ranked... we went from #1/#2 on a major two-word industry term (where the term is in both the title and the domain) to #7/#8, or even nowhere on some datacenters.

It looks like a really dramatic shift in rankings, and I have no idea what the reason could be. I am kind of at a loss, and I would really appreciate hearing any opinions. I am waiting for Analytics to update this morning; then I will have a better picture of an entire day's traffic without any server problems, but I have a feeling it is not going to be pretty. If nothing else, this post is therapy while I wait for 6 AM :)

Fun facts:
-Acquired and enabled the 'faster' crawl rate in GWT about 3 months ago
-The faster rate expired approximately May 5th? I re-enabled it today (I forgot it was going to expire; I'm wondering if this could be related)
-Acquired a good new PR5 authority link yesterday
-The search function was attacked last week, putting about 1000 stub URLs into Google on URLs like /widget/x100 and /widget/x101 (these now return 404; see the sketch after this list)
-The related-search function was spammed with some <adult> keywords recently (also 404ing now)
-Have been growing pretty steadily and organically for a couple of years
-We have been having performance problems with one large CMS-generated URL on the site lately. On this page (which happens to be in the sitelinks) there is a large generated widget list with links to widget pages. We have lots of these widget lists by category, but this is an especially big one. There is an HTML list in a display:none div for browsers that don't support JavaScript, and the list is then replicated in JavaScript form for browsers that do (with some additional interactivity). In other words, it shouldn't violate the guidelines. On this major URL, about 2 weeks ago, I took out the JavaScript part of the list generation (because the load on the server to generate it was getting to be crazy) but left the HTML linked list in place (still in a hidden div)
-Not aware of any recent loss of major links
-We just added a sitewide function over the last few weeks which is causing thousands of stub URLs to be generated, some with content but many not... but with 200k pages in the index, this seems a safe amount to add without triggering anything?
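
(Re the spammed /widget/x URLs in the list above: here is roughly how the cleanup can work, sketched in the same hypothetical Python/Flask terms. The /widget/x100 pattern comes from our logs; the page-lookup function is a made-up stand-in for the real CMS check. We currently 404 them; returning 410 'Gone' instead is a slightly stronger hint to Google that the URLs are intentionally dead.)

    import re

    from flask import Flask, abort

    app = Flask(__name__)
    SPAM_STUB = re.compile(r"^x\d+$")  # matches x100, x101, ... per the logs

    def page_exists(slug):
        """Hypothetical CMS lookup - replace with your own storage check."""
        return False  # the spammed stubs were never real pages

    @app.route("/widget/<slug>")
    def widget(slug):
        if SPAM_STUB.match(slug) and not page_exists(slug):
            abort(410)  # Gone: a stronger removal signal than 404
        return "real widget page for %s" % slug  # real rendering goes here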

[edited by: tedster at 6:09 pm (utc) on May 15, 2008]

4:30 pm on May 10, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:July 26, 2006
posts:1650
votes: 3


I noticed a drop in traffic for us over the last two days here in the US. No major changes to the site, but we did do pricing updates. Not sure why we've lost traffic; ours was only about a 20% loss. Maybe a new algo for May?

5:33 pm on May 10, 2008 (gmt 0)

New User

10+ Year Member

joined:Dec 31, 2006
posts:29
votes: 0


Update: no change overnight, traffic in the dark, devastated in SERPs.

Wondering if I should file a reconsideration request.

6:05 pm on May 10, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


The biggest suspect you reported is that rogue spider taking down your site for a while. Google is very quick to temporarily move websites out of the rankings when they have server trouble - understandably so, since results that don't resolve are a disservice to the end user. But unless the problems are ongoing, this is not a permanent situation.

I'd suggest waiting a couple more days before jumping into action. If googlebot comes back and all is well, you will probably see your rankings and traffic return soon after.
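
One practical note for next time: if the box can answer at all while it's in trouble, have it return a 503 status with a Retry-After header rather than timing out or serving broken pages. Googlebot treats a 503 as "come back later" instead of "this site is broken". Here's a minimal sketch, assuming a Python/Flask front end purely for illustration (the flag and the retry interval are placeholders, not recommendations):

    from flask import Flask, Response

    app = Flask(__name__)
    MAINTENANCE_MODE = True  # in practice, flip this via config or an env var

    @app.before_request
    def answer_503_while_down():
        # Returning a response here short-circuits normal request handling.
        if MAINTENANCE_MODE:
            return Response(
                "Temporarily unavailable - please retry shortly.",
                status=503,
                headers={"Retry-After": "3600"},  # seconds (illustrative)
            )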

5:23 pm on May 12, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 6, 2003
posts:1493
votes: 135


Strange traffic in GA since Friday: ~20% loss.

6:13 pm on May 12, 2008 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 26, 2005
posts:75
votes: 0


We have a similar problem. The server was down for 2 complete days. Google stopped sending visitors from the 2nd day, and now, 20 days later, it still has not recovered, although Googlebot is regularly crawling. We have moved to better hosting.

8:42 pm on May 12, 2008 (gmt 0)

Junior Member

10+ Year Member

joined:June 20, 2005
posts: 83
votes: 0


I don't think the issue is related to your site being offline. While that's not good, in all honesty I think you're spending too much time worrying about what you've done wrong when the issue is really Google.

My site has been up for 6+ years, and while it's not large, I've been at this long enough to chart exactly how Google's 'improvements' affect its traffic - starting with the 'Jagger' updates a few years back.

What I'm saying is this: any time our site traffic has dropped dramatically, it's been tied back to Google doing one of their 'improvements'.

In the past we've tracked these back to the office of Matt Cutts, and these 'improvements' are usually done in an effort to curtail spam or punish sites that steal content.

Unfortunately, with every one of these 'improvements' the exact opposite has happened - the scraper sites that stole our content rise to the top of the rankings and our original content drops to the bottom.

It gets worse, though. I use Google several times each day for research and often enter the same keywords several times over a week (instead of bookmarking a page, I just search it again).

What I've found over the last few days is that Google's index is completely screwed up - major content from Microsoft that used to rank #1 (the articles I need) is now nowhere to be seen.

So on the one hand, Google has again killed off our site and on the other, they've made their index useless for people using Google as a search engine.

As a webmaster, I feel for you. You've noticed the hit in traffic by seeing a massive drop in server bandwidth utilization. I know what that's like. Unfortunately it's caused you to put your site under a microscope and wonder what you've done wrong when, in fact, you haven't.

What you're combating is Matt Cutts's continual effort to 'improve' Google for everyone, never understanding that his efforts actually make Google worse.

If we could roll back time several years to just before his 'Jagger' updates, most people would be very happy. Google certainly worked better before Matt got his hands on it. What's surprising is that no one has caught on to this fact yet, and he seems to be encouraged in his efforts to 'improve' something that was already nearly perfect.

The fact of the matter is that before the 'Jagger' updates, people stole our content left, right, and center. However, our content was always number 1, with the scrapers several pages back in the listings. This was fair and correct.

The day that Matt released the 'Jagger' updates was the day that our traffic was cut in half, and the immediate result was to reward the scrapers, as their listings now eclipsed ours.

When the next series of 'upgrades' occurred, the remaining traffic was halved again and the scrapers were rewarded even further.

This has continued ever since. With each and every 'upgrade' that Matt undertakes to punish scrapers and 'improve' Google the exact opposite occurs - the scrapers get rewarded and Google's usefulness declines dramatically.

In all honesty, I've never seen anything like it before in my life. The task is to improve Google and yet his very actions result in the exact opposite - every time.

I just don't get it. Google (before the 'jagger' updates) was just perfect. It always yielded excellent results and original content was rewarded with top rankings. Yet for some reason, the company started messing with what was a really good thing.

The net result of that has been to punish original content creators and reward thieves. In the process, Google's usefulness gets Cutt in half.

Sorry about the rant, but I've been here before. There's nothing you can do about it, it's not your fault and it's nothing you've done. This whole recurring mess can always be tracked back to Google, and more specifically, to Matt Cutts.

Google just doesn't get it. Everything was fine before they began 'improving' things. It's very much a case of 'the emperor has no clothes'.

8:55 pm on May 12, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


I understand your feelings, mmiller, but we don't want to turn the forum into a collection of rants, and ranting doesn't get rankings to return, at any rate.

It's possible that what you describe is also what QE's site is suffering. But the timing in his case is such that patience is in order. It's only been three days (weekend days at that) since the server problem was fixed, so it's too soon to know for sure.

12:23 am on May 13, 2008 (gmt 0)

Junior Member

10+ Year Member

joined:June 20, 2005
posts: 83
votes: 0


I understand, Tedster - there's no doubt I'm pretty up front about my feelings on this matter, but please remember, it's based on having been around the block several times with Google and their updates - and seeing the corresponding drop in numbers each time.

I'm by no means alone in this - many honest sites have been damaged by Google's 'updates' as evidenced by the popularity of these threads...

The main thing I wanted to mention to QuantumEntanglement is that unfortunately this has become a 'normal' cycle and the fact that it happened to coincide with some issues on your end is likely to be just that - a coincidence.

[edited by: tedster at 3:25 am (utc) on May 13, 2008]

6:42 am on May 13, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 19, 2004
posts: 1939
votes: 0


Hello Quantum. Maybe some of us can shed some light on some possibilities. I have had a major shakeup before, and it turned out to be an issue the website itself was having that triggered bigger issues, including ranking problems. Here are some insights and suggestions, some of which you may have already arrived at.

If you have been able to fix any issues with CMS performance, and the website performs well when benchmarked under load, then turning on the faster option in G Sitemaps is usually a good idea. I would work to hot-rod the system so that it is very responsive, quick, and hardened.

Do some research on hardening the site against rogue spiders.
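
And when you do throttle or block, make sure the rules can never hit the real Googlebot. Google's documented verification is a reverse DNS lookup followed by a forward confirmation. A minimal sketch in Python (the function name is mine, not a library call):

    import socket

    def is_real_googlebot(ip):
        """Reverse-resolve the IP, check that the host lives under
        googlebot.com or google.com, then forward-resolve the host and
        confirm it maps back to the same IP."""
        try:
            host, _, _ = socket.gethostbyaddr(ip)
            if not host.endswith((".googlebot.com", ".google.com")):
                return False
            _, _, addresses = socket.gethostbyname_ex(host)
            return ip in addresses
        except (socket.herror, socket.gaierror):
            return False

    # Usage: only punish clients claiming to be Googlebot when the
    # double-DNS check fails.
    # if "Googlebot" in user_agent and not is_real_googlebot(client_ip):
    #     throttle(client_ip)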

Generally, if a website has thousands of quality links, then one PR5 link will only help. Make sure the link was not inadvertently placed sitewide on the website, if only to rule this out as an issue. Most times with authority websites, my experience is that even sitewide links will not budge the site in the SERPs, although this is not written in stone.

Double-check the search function and harden any other points of entry on the website. Stub pages will not help rankings, and they can cause ranking issues, although rarely (imho) to that degree.
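
For the search function in particular, one way to harden it is to make sure internal result pages can never become indexable URLs in the first place, for example by stamping a noindex on the endpoint. A sketch, again assuming a hypothetical Flask-style app (a meta robots noindex tag in the page head accomplishes the same thing):

    from flask import Flask, make_response, request

    app = Flask(__name__)

    @app.route("/search")
    def search():
        query = request.args.get("q", "")
        body = "results for %s" % query  # real result rendering goes here
        resp = make_response(body)
        # Keep spammed or arbitrary queries from minting indexable pages.
        resp.headers["X-Robots-Tag"] = "noindex"
        return resp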

"I took out the javascript part of the list generation (cause the load on the server to generate it was getting to be crazy) but left the HTML linked list in place (but it was still in a hidden div)"

Hidden links in a div are not good. I would work to find a better option. How many total links do you estimate are on this page?

"We just added a site wide function over the last few weeks which is causing thousands of stub urls to be generated, some with content but many not ... but with 200k pages in the index this seems a safe amount to add without triggering anything?"

You may have 200K pages in the index, but perhaps only 1500 that are non-supplemental. Looked at this way, adding thousands of further stub pages, which can be seen as duplicate pages, might tip the scales.

As a general rule, it is best not to create stub pages - each page needs to have content to fill it when it goes live. Obviously there are exceptions to the rule, and some authority websites get away with having thousands of ranking stub pages by way of massive trust and PageRank, but for 99% of the websites out there, stub pages do more harm than good.
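
To make that concrete: the gate can live right in the handler for the generated pages, serving a 200 only when the page actually has content and a 404 otherwise, so a stub never enters the index at all. A sketch under the same hypothetical Flask assumption; the threshold and the loader are stand-ins:

    from flask import Flask, abort

    app = Flask(__name__)
    MIN_CONTENT_CHARS = 250  # an illustrative notion of "has real content"

    def load_generated_content(slug):
        """Hypothetical hook into the sitewide generator mentioned above."""
        return ""  # stand-in: a stub with nothing in it

    @app.route("/generated/<slug>")
    def generated_page(slug):
        content = load_generated_content(slug)
        if len(content.strip()) < MIN_CONTENT_CHARS:
            abort(404)  # don't serve thin stubs as indexable 200s
        return content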

I once had a server go down for almost 24 hours. It was a fiasco, and my website dropped about 8 positions as it was spidered while down. But within about a week it came back.

How long was the website down for?

5:34 pm on May 15, 2008 (gmt 0)

New User

10+ Year Member

joined:Dec 31, 2006
posts:29
votes: 0


@mmiller:
The main thing I wanted to mention to QuantumEntanglement is that unfortunately this has become a 'normal' cycle and the fact that it happened to coincide with some issues on your end is likely to be just that - a coincidence.

Thanks for your post, it made a lot of sense. And it let me step back for a minute from tearing my hair out and take in some perspective :D And the frustrating part is, of course, at least if it was something we did, it would be something we could do something about :)

@Cain:

I would work to hot-rod the system so that it is very responsive, quick, and hardened.

Do some research on hardening the site against rogue spiders.

Good suggestions.

Hidden links in a div are not good. I would work to find a better option. How many total links do you estimate are on this page?

~3000

I once had a server go down for almost 24 hours. It was a fiasco, and my website dropped about 8 positions as it was spidered while down. But within about a week it came back.

How long was the website down for?

Hard to say exactly, but probably somewhere between 2 and 12 hours.

The situation is still the same... We are currently taking tedster's 'wait and see' approach, still looking for possible causes and other people affected, and hoping that G will realize that whatever change it made on Friday was all a big mistake and revert the index. :)

One weird thing I'm wondering about is whether the recent GA problems are related... again, I might be looking for an association where none exists. But what got me thinking was logging into GA today: the data for Apr 30-May 2 is corrupted, as per this notice: Google Analytics Bug Dropping E-commerce Data? [webmasterworld.com]. We are not an e-comm site, but for those 3 days our bounce rate shows as 100%, time on site is 0, and avg. pageviews per visit is exactly 1, while visits are normal and seem unaffected. This data DID look fine for those days, but they say they are reprocessing it, so maybe we're in the middle of their reprocessing. Or maybe that data is just lost. My point is: if anybody who has experienced these major traffic drops (20%-75%), particularly on May 8-9th, also sees these anomalies in GA, please speak up.

Thanks for your help everyone.

[edited by: QuantumEntanglement at 5:36 pm (utc) on May 15, 2008]

5:54 pm on May 15, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member bwnbwn is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Oct 25, 2005
posts:3591
votes: 48


-The search function was attacked last week, putting about 1000 stub URLs into Google on URLs like /widget/x100 and /widget/x101 (these now return 404)
-The related-search function was spammed with some <adult> keywords recently (also 404ing now)

The statements above stood out to me like a big red flag. Can you explain this more: how long were the URLs in the index, how did they get there, and what (if anything) is being done with the search function?

[edited by: tedster at 6:09 pm (utc) on May 15, 2008]

3:49 pm on May 17, 2008 (gmt 0)

New User

10+ Year Member

joined:Dec 31, 2006
posts:29
votes: 0


Things seem to be back to normal now, as of sometime during the 15th.

2:42 am on May 18, 2008 (gmt 0)

Junior Member

10+ Year Member

joined:June 20, 2005
posts:83
votes: 0


Our traffic levels are also returning to normal, albeit slowly. There are still quite a few scraper sites ahead of us (with our content), but as mentioned, we believe that over time we'll recover as more and more people click on the proper links (ours) and we get voted back up in the charts.

At least until the next 'upgrade' ;-)

Good to hear you're back on track :-)