| 9:26 pm on Apr 11, 2012 (gmt 0)|
Interesting - I see the exact opposite on a few of my directory sites - a spike in traffic on the 22nd, then a drop back to normal on the 23rd. Same on the 5th - a spike, then back to normal on the 6th.
| 10:13 am on Apr 14, 2012 (gmt 0)|
It seems that the situation is getting even worse: this time the filter wasn't lifted and the traffic has stayed at 25% for four days now.
| 10:20 am on Apr 14, 2012 (gmt 0)|
Google rolled out Caffeine to be able to index the web at lightning speed and then rolled out periodic update systems like Panda. I don't suggest you try too hard to understand them all, because they don't want you to.
Do the dance - see what the site looks like to Googlebot, rule out a hack, check whether your competitors are up to no good, and see if they have done well and passed you in rank (the top 3 get most of the traffic; even a two-spot drop can cut you down 75%), etc.
What you are describing doesn't sound like a penalty to me, but when you've done the dance and are sure you're clean, file a reconsideration request and move on.
| 11:01 am on Apr 15, 2012 (gmt 0)|
It's definitely some kind of filter. One can verify that by searching "site:example.com keyword" or "brandname keyword" - both bring up the regular page - while it's completely removed from the SERPs if you search just for "keyword" (the first result from the domain example.com is now a different page which is not part of that subdirectory).
Btw, the affected subdirectory was created in 2004 and ranked well for many years. For one- and two-word keywords the results are filtered out (it's a filter because they are completely removed - it's not about being passed by competitors and dropping a few positions; e.g. one page was #4 for a single keyword and is now not within the first 1000 results). The remaining traffic (25%) is from long-tail searches, which are not affected.
Anyway, of course I'm trying to overcome the filter - that's more important than understanding it. (My guess is that it might be related to this discussion: Google to Target Overly SEOd Sites [webmasterworld.com]) However, so far my changes don't solve the problem.
| 7:05 pm on Apr 15, 2012 (gmt 0)|
Have you cross-referenced the host server logs with the filtering you experience? Anything unusual there? Any errors in GWT?
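For illustration, that cross-referencing can be sketched in a few lines. This is a minimal Python sketch, not anything from the thread: it assumes the common Apache "combined" log format, and the sample log lines and paths are made up. It counts Googlebot requests per day and flags any 4xx/5xx responses served to Googlebot.

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache "combined" log format; in practice
# you would read your real access log from disk.
SAMPLE_LOG = """\
66.249.66.1 - - [14/Apr/2012:06:12:01 +0000] "GET /dir/page1.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [14/Apr/2012:06:12:05 +0000] "GET /dir/missing.html HTTP/1.1" 404 220 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
10.0.0.7 - - [14/Apr/2012:06:13:00 +0000] "GET /dir/page1.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
"""

# Pulls the date, method, path and status code out of one log line.
LINE_RE = re.compile(r'\[(\d+/\w+/\d+):[^\]]+\] "(\w+) (\S+) [^"]*" (\d{3})')

def googlebot_summary(log_text):
    """Count Googlebot requests per day and collect error responses."""
    per_day = Counter()
    errors = []
    for line in log_text.splitlines():
        if "Googlebot" not in line:
            continue
        m = LINE_RE.search(line)
        if not m:
            continue
        day, method, path, status = m.groups()
        per_day[day] += 1
        if status.startswith(("4", "5")):
            errors.append((day, path, status))
    return per_day, errors

per_day, errors = googlebot_summary(SAMPLE_LOG)
print(per_day)   # crawl volume per day
print(errors)    # pages Googlebot could not fetch cleanly
```

A sudden drop in crawl volume, or errors appearing on exactly the filtered pages, would be the "anything unusual" worth digging into.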
| 10:33 am on Apr 16, 2012 (gmt 0)|
I found absolutely nothing unusual - I checked everything twice, GWT for example. As already said, the pages have been online since 2004 and only small changes were made a few times. No changes were made in the last few months.
I believe that the drop is related to the story "Google To Target Overly SEO'ed Sites Within Weeks", which was published on 03/16 (the first drop was 03/22). Moreover, I now have an idea what caused the problem. I just fixed it and am now waiting to see whether the pages recover. If they reappear in the index I'll let you know.
| 12:42 pm on Apr 16, 2012 (gmt 0)|
|Have you cross-referenced the host server logs |
A better question would be "have you crossed your fingers and hoped for the best?"
This is happening across the board with different types of sites and different types of keywords both single word and long tail.
Every time you think you have one part figured out everything changes!
@doc_z, are you going to tell us what caused the problem and how you think you fixed it? E.g. was it keyword density, links, tags, etc.?
| 3:35 pm on Apr 16, 2012 (gmt 0)|
Have you been doing aggressive link building? Or any link building using blog networks? This could be tied to the recent Google update targeting blog network link building.
| 3:59 pm on Apr 16, 2012 (gmt 0)|
Yes, aggressive link building can cause problems, especially for new domains. But I didn't do any link building at all for the affected subdirectory. There are many internal links to the subdirectory and few external links. Even for the domain as a whole there was no aggressive link building - the domain is old (199x) and has many natural old links.
| 4:32 pm on Apr 24, 2012 (gmt 0)|
The subdirectory is coming back after 13 days - one week after I made some changes. However, I have to wait and see whether the SERPs are stable.
I'm almost sure now that the pages were hit by this update: New Panda Rollout 3/23/12 [webmasterworld.com]
| 8:01 pm on Apr 24, 2012 (gmt 0)|
@doc_z would be very interested to know what you changed. I was hit on the 23rd and also think it was Panda.
| 11:02 pm on Apr 24, 2012 (gmt 0)|
@doc_z
There are two penalties at the moment, from what I can see on my own sites.
The first one we all know.
The second one seems to devalue directory scripts that it thinks are selling outbound links for anchor text or for a niche.
I suspect G wants us to be selective rather than sell to the highest bidder for directories.
Can you add a "nofollow", or would that defeat the purpose of a directory? I bet I can guess the script too.
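Mechanically, adding nofollow to paid or outbound directory listings is straightforward. Here is a hedged Python sketch - the domain name is hypothetical and this simplistic regex approach is for illustration only; a real implementation should use an HTML parser and preserve any existing rel attributes.

```python
import re
from urllib.parse import urlparse

SITE_HOST = "example-directory.com"  # hypothetical directory domain

def nofollow_outbound(html):
    """Add rel="nofollow" to anchors pointing off-site."""
    def fix(match):
        tag, href = match.group(0), match.group(1)
        host = urlparse(href).netloc
        # Only touch absolute links to other hosts that lack a rel attribute.
        if host and host != SITE_HOST and 'rel=' not in tag:
            return tag[:-1].rstrip() + ' rel="nofollow">'
        return tag
    return re.sub(r'<a\s+href="([^"]+)"[^>]*>', fix, html)

html = '<a href="http://sponsor.example.net/">Sponsor</a> <a href="/about.html">About</a>'
print(nofollow_outbound(html))
```

Internal links (relative hrefs) are left alone, so the directory's own navigation keeps passing PageRank.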
| 3:28 pm on Apr 25, 2012 (gmt 0)|
I have to wait at least one week until I'm sure that SERPs are stable. However, so far it looks fine.
Probably more than one thing was targeted within this update. However, one part of the update seems to target very fast and massive internal link structure changes. I guess this is what they meant when they said:
|We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect. |
I have several hints which support this theory. Of course, to verify it I would have to go a step back and see if the filter reappears. However, I'll not do that :-)
| 3:28 pm on Apr 25, 2012 (gmt 0)|
Without going into detail: the theory also explains why the filter appeared periodically. It was some interaction between my own algorithms and Google's filter, leading to an up and down.
| 7:14 pm on Apr 30, 2012 (gmt 0)|
Doc what was it you changed? It'll help us all.
| 9:06 am on May 1, 2012 (gmt 0)|
In my case the solution was to avoid changing too many (internal) links too fast.