
For Those Hit Around September 4th. Recovery Story



4:52 pm on Nov 5, 2013 (gmt 0)

5+ Year Member

One of my sites was hit on September 4th. I came here, poked around a little, and noticed others had also been hit by something around the same time.

So I figured I would come back and post this just in case someone else was hit around the same time, and maybe this could help.

My sites are always pretty clean, so the penalty caught me off guard a little. Here are the before and after stats.

[dl.dropboxusercontent.com ]

[dl.dropboxusercontent.com ]

The only thing I did to recover was start in the HTML Improvements area of Webmaster Tools and clean up the errors. The area I think caused both the initial penalty and the recovery was the 404 errors. Once the 404 errors reached 0, the site recovered two days later.

This is just my opinion though, as everyone knows, with Google, it could be a million things.


6:00 pm on Jan 11, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

4 months later, hundreds of hours spent fixing HTML errors and duplicate or too-short titles/meta descriptions, THOUSANDS of disavowed domains, and NO recovery.

What is more puzzling about the 4th of September Google update is that there is no confirmed recovery.

It's like a massive manual penalty!

To make things "worse" (I already lost 90% of the traffic), I can see new signs of penalties! It's like the site is being removed from the index!


9:58 pm on Jan 11, 2014 (gmt 0)

10+ Year Member

Our site was hit then too, and our traffic has been down a lot since. The only changes I made were: I deleted bad links out of Google (WordPress pages, attachment IDs, etc.),

I also removed sitewide links (that were not varied) from other sites (blogs, etc.).

I added the Yoast plugin.

I'm about to redesign the 10-year-old site from scratch in HTML5 (WordPress Genesis), and try to remove as many plugins as possible.

I'm still stumped, but never give up!

I'm still stumped, but never give up!


7:35 am on Jan 22, 2014 (gmt 0)

I don't want to get too excited just yet, but a check of my stats today shows a 300% rise in visits for this time of day so far. Some of my keywords have had some major jumps too.

Maybe this is recovery... will keep you posted.


7:52 am on Jan 22, 2014 (gmt 0)

I wouldn't get too excited. We've had a nice boost, but the signs of yet another update are there (foreign junk traffic through the roof today). I wouldn't even call these updates any more; they now come nearly weekly, so it must just be how the algo treats us now.


10:54 pm on Jan 22, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

After the 4th of September we are in the junk/ignore/filth category.

Since 15 January I'm going lower and lower...

I didn't think I could get so low, but Google wants to show me:
"you can't even imagine what the bottom will be like"


7:38 am on Jan 23, 2014 (gmt 0)

24 hours later and I've had a day of pre-September 4th traffic. I will give it a week to see if it stays like this and I will post some stat screenshots and a list of all the changes I made.


3:19 pm on Jan 23, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month


One of three affected domains is back!
I must tell you that it is the only subdomain with no content change, only some 404s fixed and a lot of link cleaning...

This is an old subdomain, and I must tell you that most of the links are pre-2009!


P.S. In Analytics, I see a 6x increase on 22 January!


5:20 pm on Jan 23, 2014 (gmt 0)

5+ Year Member

Yesterday my site recovered too. The last things I did were adding some additional domains to the disavow file and removing a few links from sidebars on other sites.


5:30 pm on Jan 23, 2014 (gmt 0)

I'd be more interested to learn whether anyone who was hit and did nothing has seen any recovery.


8:30 pm on Jan 23, 2014 (gmt 0)

5+ Year Member

No recovery here.

My main site was hit badly on September 4th. Visits dropped from 12K/day to 4K/day overnight, mostly valuable US traffic.

This is a unique informational site about "how to disassemble and repair widgets", running since 2006. All content is 100% unique and self-created, with 10-40 unique disassembly photos in each post. The site has only 330 posts, but they are all high-quality content.

I didn't use any dirty SEO techniques to promote my site. All incoming links were built naturally.

Here's what has been done so far:
1. Removed all 404 errors (scanned with Integrity on Mac).
2. Removed outgoing links to what I thought were low quality but related sites. All links had nofollow attributes before the hit.
3. Removed all EPN (eBay) affiliate links. All links had nofollow attributes before the hit.
4. Cleaned up categories. Before, each post appeared in multiple categories; now each post appears in only one category.
5. Removed a few old and not popular posts.

All this was done about 2-3 months ago. No recovery so far. :(

I'm still writing new content once in a while, but I really have to force myself to do it. It's so discouraging. I just don't have much inspiration left.
Before, I could spend countless hours writing new content and I really enjoyed doing it. Now I just don't want to touch it anymore.


8:46 pm on Jan 25, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

1. Download the Latest Links CSV file from Google Webmaster Tools.

2. Open your preferred text editor and do some search and replace. It will help you sample the data in the next step.
I use the pipe character |


.com/ -> .com|/
.net/ -> .net|/
.org/ -> .org|/
.info/ -> .info|/
.biz/ -> .biz|/
.us/ -> .us|/
.de/ -> .de|/
.ru/ -> .ru|/
.nl/ -> .nl|/
.fr/ -> .fr|/
etc. (a lot of TLDs)

3. Open a new Excel workbook and import the data via Data - Get External Data.
Import the CSV file and select the pipe | as the separator.


Now you will have the domain in the first column and the URL structure in the second column.

Select the second column (B) and sort alphabetically!


Now you can easily spot the domains that share the same URL scheme structure.

[u]Use search and clean to exclude, one by one, the domains that you add to the disavow file.[/u]

I used this technique to spot tons of spam domains that had only one link to me. It was like finding a needle in a haystack, but I spotted them by the URL structure.
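For anyone who would rather script the steps above than run the search-and-replace and Excel import by hand, here is a minimal sketch of the same idea in Python. The `links.csv` filename is hypothetical, and it assumes the Webmaster Tools export has one backlink URL per row:

```python
from urllib.parse import urlparse

def split_links(urls):
    """Split each backlink URL into (domain, path) and sort by path,
    so domains sharing the same URL structure end up next to each other
    (the same effect as sorting column B in the Excel version)."""
    rows = []
    for url in urls:
        # urlparse needs "//" before the host to detect it as a netloc
        parsed = urlparse(url if "//" in url else "//" + url)
        rows.append((parsed.netloc, parsed.path or "/"))
    return sorted(rows, key=lambda r: r[1])

# Usage (hypothetical filename exported from Webmaster Tools):
# import csv
# with open("links.csv") as f:
#     urls = [row[0] for row in csv.reader(f) if row]
# for domain, path in split_links(urls):
#     print(domain, path)
```

Spam networks often reuse the same page template across many throwaway domains, which is why sorting by path makes them jump out.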




3:30 pm on Feb 7, 2014 (gmt 0)

@Mentat @my_name

I've found your problems here and wanted to ask you guys something, as I have the same issues.

I own a news website that received a redesign and was moved from WordPress to another platform. In the process, I lost several URLs, and the gallery plugin which showed some f#$d up URLs was replaced, meaning that I had thousands of non-existent URLs. Anyway, I cleaned up the website, etc., and then I found that the non-existent URLs were actually showing a 200 OK status. Solved that too.

Back in November, I got an increase of 57K not found errors. I started to take a closer look, found some bad URLs, and solved those too.

Last week, I used robots.txt to block some old categories I had on the website. Now the errors have been reduced to around 50. Still, I keep getting old URLs that don't exist anymore; I guess these will continue to reappear for a while.

MY QUESTION IS: is it OK to block the old categories with robots.txt so that Google stops showing the old URLs?

I also started to see a small increase in traffic, but too little for the moment.

How did you guys do it?


3:51 pm on Feb 7, 2014 (gmt 0)

Apologies, I know I said I would be back in a week with an update but I've been putting all my time into working on the web site... it's now been 2 weeks and I have recovered.


I didn't block any pages using robots.txt so I can't really comment. For the pages I removed I returned the 410 header, these were slowly removed from Google. Towards the end to speed things up I used the URL removal tool on around 2,000 links (the copy and pasting was painful!). These were removed within 24 hours.

I am still receiving GWT errors for the removed links (around 5 to 20 a day) but I keep marking them fixed and hope they won't show up again.


4:06 pm on Feb 7, 2014 (gmt 0)


My problem with most of the URLs is that they never even existed. They appeared when the website was transferred.

I return a 404 header on all those pages. Why did you use 410? Is it more efficient than 404?


6:27 pm on Feb 7, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

Use 410, not 404


6:45 pm on Feb 7, 2014 (gmt 0)

410 is the "Gone" header... as I understand it, returning this header tells Google that the page is gone for good and will not be returning. The page will naturally fall out of the search results (I did this for about 3 months before using the URL removal tool). All the pages that used this header have been removed from the index.

I also made sure to remove any links to the 410 pages within my web site so Google wouldn't try crawling them again.

Google also says you can return a true 404 in the HTTP header, but I went with the 410 header after reading somewhere that the header exists for exactly this reason. All I can say is that the 410 header works.
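The 410-vs-404 logic described above can be sketched as a small decision function. This is only an illustration, not anyone's actual setup here (in practice this usually lives in .htaccess rules or CMS config), and the set of gone paths is hypothetical:

```python
# Hypothetical set of paths removed for good (should return 410)
GONE_PATHS = {"/old-gallery/photo-1", "/old-gallery/photo-2"}

def status_for(path, existing_paths):
    """410 for pages deliberately removed forever, 200 for pages
    that still exist, 404 for URLs that were never valid."""
    if path in GONE_PATHS:
        return 410   # "Gone": signals the page won't be coming back
    if path in existing_paths:
        return 200
    return 404       # genuinely unknown URL

# Minimal WSGI wrapper around the decision function (sketch)
def app(environ, start_response):
    code = status_for(environ["PATH_INFO"], {"/", "/about"})
    reason = {200: "OK", 404: "Not Found", 410: "Gone"}[code]
    start_response(f"{code} {reason}", [("Content-Type", "text/plain")])
    return [reason.encode()]
```

The key point is the explicit list of removed pages: a 410 is a deliberate statement about a known URL, while 404 remains the fallback for everything else.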

Are any of your non existent URLs indexed?


7:02 pm on Feb 7, 2014 (gmt 0)

@my_name No, these URLs are not indexed anymore. I don't find them in search results; it's just that Google keeps reporting them in WMT.

I want to try using a 410 header instead of 404; maybe it will work on my website too...


1:04 pm on Feb 8, 2014 (gmt 0)

@andrewc it's gotta be worth a shot :)


1:15 pm on Feb 8, 2014 (gmt 0)

@my_name I reduced the number of errors (I still get a few per day), but the traffic is still not up.

I think that blocking the old categories from the WordPress site with robots.txt was not a good idea. I read about it and found out that it is better to use 410.

As John Mueller from Google said:

For large-scale site changes like this, I'd recommend:
- don't use the robots.txt
- use a 301 redirect for content that moved
- use a 410 (or 404 if you need to) for URLs that were removed
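John Mueller's three rules quoted above boil down to a simple lookup. Here is a sketch, with a hypothetical redirect map standing in for a real migration table:

```python
# Hypothetical maps built during the site migration
MOVED = {"/old-category/post-1": "/news/post-1"}   # content that moved
REMOVED = {"/old-category/dead-post"}              # content taken down

def migrate_response(path):
    """Apply the recommendation quoted above:
    301 for content that moved, 410 for URLs that were removed,
    and a normal 200 for everything else (no robots.txt blocking)."""
    if path in MOVED:
        return (301, MOVED[path])   # permanent redirect to the new URL
    if path in REMOVED:
        return (410, None)          # gone for good
    return (200, None)              # serve the page normally
```

The advantage over robots.txt blocking is that crawlers actually see a definitive status for each old URL instead of being forbidden from checking it, so the errors can eventually clear.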


1:22 pm on Feb 10, 2014 (gmt 0)

@my_name, @Mentat

I have a question for you guys: is it OK if I use only 410 for the whole website and drop the 404 header?


2:31 pm on Feb 16, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

The happiness was short-lived.


4th of September 2013 DOWN, middle of January UP, now DOWN again.
So, after 6 months, I can tell you that the 4th of September update was a dedicated Panda for old sites.


2:41 pm on Feb 16, 2014 (gmt 0)


I don't get it... what happened there? Why did the website drop again?


6:32 pm on Feb 16, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

On 13 February there was a Panda (quality algo) refresh. I suspect that on the 4th of September there was a "dedicated" Panda & Penguin just for some (old) sites.


7:31 pm on Feb 16, 2014 (gmt 0)

So what is your next move now? What are you planning to do to make it work again?


10:35 pm on Feb 16, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

The truth is that the only thing I can do right now is rewrite ALL my content on the affected subdomains.

This does not guarantee success, and it is a huge task.
We are talking about 120,000+ pages per subdomain.

Captain Salami

10:01 am on Feb 18, 2014 (gmt 0)

My site is 9 years old. It was also penalized on 4-6 September, and after 3 days it recovered all its positions (I did nothing for this).

After 1 month, on 15 October, my site was penalized by Google again and I can't recover... traffic is down from 20,000 to 5,000 unique visitors from Google. No manual actions.

All these months, to try to recover, I have done a lot of work on the site:
- cleaned up my site's link profile (removed more than 4,000 domains)
- removed some of the ad banners
- did a lot of work on the structure of the site.

But still nothing...
This month I see just very small 5-7% growth from Google.

At this point I also think it was Panda...


3:58 am on Feb 22, 2014 (gmt 0)

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

the only thing I can do right now is to rewrite ALL my content on the affected subdomains.
- This does not guarantee success, but it is a huge task.
We are talking about 120,000+ pages/subdomain.

@Mentat - What hope do you realistically have of rewriting 120k quality pages?

I've seen sites invest hundreds of thousands of dollars covering their sites in new content, with far fewer pages, without making a jot of difference to rankings and/or Panda. So you're right, it guarantees nothing, especially in competitive segments, where variations of the same content are found all over the place.

Why do you have so many pages? I think your focus needs to be on something else, but some further disclosure will help you obtain better quality input. Or else hire a good SEO to compare notes with, or lastly, put your site up for review over here: [webmasterworld.com...]

There's some excellent feedback here for folks, and various tools that can be used to support it. Why don't you give it a go?


9:14 am on Feb 22, 2014 (gmt 0)

My rankings are still back, with pre-September 4th traffic. But I'm on tenterhooks...


9:21 am on Feb 26, 2014 (gmt 0)

Uh ohhh... looks like I've been hit again :(


9:31 am on Feb 26, 2014 (gmt 0)

Sorry to hear that but not surprised one bit.
