| 3:07 pm on Sep 10, 2006 (gmt 0)|
There was a related discussion in August: Is Google penalizing only parts of sites? [webmasterworld.com]
Perhaps some of the filters that make up the "sandbox effect" could be getting applied more selectively within a domain, instead of domain-wide. I haven't seen this effect myself, but a few reports are coming in, so it seems like this may be happening. If so, it's nicer than whacking an entire domain's trust factor just for a boatload of new urls in one directory.
I note that you used the word "ban" in your title. I think it is more like a filter (if we are understanding it correctly) than a ban, and once Google gains some trust on the new pages they could start to show up.
Are you watching your server logs for googlebot requests in that directory?
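If it helps anyone check this for themselves, here's a rough sketch of the kind of log filtering I mean. It assumes an Apache/NGINX "combined" log format and a hypothetical directory path `/widgets/`; adjust both for your own setup. Note it only matches on the user-agent string, which can be spoofed, so treat it as a quick diagnostic rather than proof.

```python
import re

# Matches the start of a "combined" log line:
# ip ident user [timestamp] "METHOD path ..." status
LINE_RE = re.compile(
    r'^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST|HEAD) (\S+)[^"]*" (\d{3})'
)

def googlebot_hits(log_lines, directory="/widgets/"):
    """Return (path, status) pairs for Googlebot requests under `directory`."""
    hits = []
    for line in log_lines:
        if "Googlebot" not in line:
            continue  # cheap pre-filter on the user-agent string
        m = LINE_RE.match(line)
        if m and m.group(3).startswith(directory):
            hits.append((m.group(3), m.group(4)))
    return hits
```

Feed it something like `googlebot_hits(open("access.log"))` and look at the status codes: a run of 200s on September 7 followed by deindexing tells a different story than a run of 404s or 500s.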
| 3:48 pm on Sep 10, 2006 (gmt 0)|
Thanks for the input, tedster. It looks like the GoogleBot did crawl that directory on September 7. Apparently, after that crawl everything from that directory was removed from the index. Now, the question is (not that anyone can answer this exactly, but perhaps someone can offer some relevant experience):
1) Did Google decide that the new content was spam and permanently remove the directory? In this case, I should delete all the new content and possibly move the old content to a new directory.
2) Did Google simply raise a flag because of the number of files added, and that flag will be turned off after some waiting period? If this is the case, then I am probably best just leaving everything as it is.
3) Is Google penalizing the directory because it doesn't like the new content? If so, then perhaps I should delete the new content and just sit on the old content, hoping that Google forgives me.
| 3:54 pm on Sep 10, 2006 (gmt 0)|
Check this out also.
| 4:16 pm on Sep 10, 2006 (gmt 0)|
Quick question, what types of pages are in that directory? Do they have a lot of differentiated content or can they somehow be seen as fairly similar to one another?
| 4:43 pm on Sep 10, 2006 (gmt 0)|
Another issue would be if the urls in that directory have unique title tags and meta descriptions. Even if the rest of the on-page content is differentiated, title and meta description have become essential "quick differentiators" for Google in the past year.
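For anyone with thousands of pages to audit, a quick script can flag shared titles and meta descriptions before Google does. This is just a sketch using naive regexes (it assumes the `name` attribute comes before `content`, and won't handle every HTML variation); the page names and `pages` dict shape are made up for illustration.

```python
import re
from collections import defaultdict

TITLE_RE = re.compile(r"<title>(.*?)</title>", re.I | re.S)
DESC_RE = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']', re.I | re.S
)

def find_duplicates(pages):
    """pages: dict of {url: html source}. Returns a dict mapping each
    (kind, text) pair -- e.g. ("title", "Widgets") -- to the list of urls
    that share it, keeping only entries shared by more than one page."""
    seen = defaultdict(list)
    for url, html in pages.items():
        for regex, kind in ((TITLE_RE, "title"), (DESC_RE, "description")):
            m = regex.search(html)
            text = m.group(1).strip() if m else "(missing)"
            seen[(kind, text)].append(url)
    return {key: urls for key, urls in seen.items() if len(urls) > 1}
```

Anything this flags is a page pair Google sees as having identical "quick differentiators", even if the body copy differs.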
| 5:06 pm on Sep 10, 2006 (gmt 0)|
I have seen this happen in numerous situations. Mod_rewrite, sitemap, new content, consolidated content, etc...
It seems that if you introduce many links at one time, you will almost certainly get flagged. It's definitely an attempt to curb spam, but some legit sites are getting hit when they just try to make simple corrections to their site. Lesson number one is, "take it slow" :)
| 7:59 pm on Sep 10, 2006 (gmt 0)|
jimbeetle - Some of the pages are fairly similar, I must admit. They have legitimate content, but some of the text is duplicated.
tedster - We have unique titles and meta tags on all the pages.
| 8:56 pm on Sep 10, 2006 (gmt 0)|
Back in the June SERPs disaster (I forget what we all named it) I had a sub-directory that previously ranked well totally disappear. So in answer to your question whether a part of a website can get hit, I'd say yes, for sure.
I made it back on July 27th, and actually have a boost in rankings now.
I'll never forget this very unpleasant lesson.
| 10:20 pm on Sep 10, 2006 (gmt 0)|
dibbern2 - did you do anything to get back into the SERPS or did you just wait it out?
| 10:29 pm on Sep 10, 2006 (gmt 0)|
Do you have AdSense spots or affiliate referral links on the several thousand pages you added?
| 10:31 pm on Sep 10, 2006 (gmt 0)|
|I had a sub-directory that previously ranked well totally disappear. So in answer to your question whether a part of a website can get hit, I'd say yes, for sure |
I think we should carefully differentiate something here: Does Google target, for whatever reason, a specific directory for deindexing? Or, is it that pages within a specific directory share certain characteristics that lead to their being deindexed?
My experience tells me it's the second option.
|They have legitimate content, but some of the text is duplicated. |
I'd start with taking care of this. It probably won't make the whole problem go away, but cleaning this part up should help somewhat. After that, more investigation might help to expose the underlying problem. I don't think there's going to be a quick fix, more like a step-by-step process.
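As a starting point for that investigation, here is a minimal sketch of one common way to measure text overlap between two pages: Jaccard similarity over k-word shingles. To be clear, nobody outside Google knows what their duplicate detection actually does; this is just a rough in-house diagnostic, and the threshold you act on is your own judgment call.

```python
def shingles(text, k=5):
    """Set of k-word shingles from `text` (lowercased, whitespace-split)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard similarity of two pages' shingle sets.

    1.0 means the texts share all shingles; 0.0 means none in common."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0  # two texts too short to shingle: treat as identical
    return len(sa & sb) / len(sa | sb)
```

Run it pairwise across the suspect directory and rewrite (or consolidate) whichever pairs score high.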
| 11:32 pm on Sep 10, 2006 (gmt 0)|
reseller - No, no ads or affiliate links at all.
jimbeetle - all of the pages have common navigation, so all of the "safe content" (pages that are undoubtedly high-quality and previously-indexed) point to index pages for the questionable content. I'm wondering if the very fact that they are linked to the other content is causing the problem. Perhaps the best path to take is to just delete all of the new content, restore the old content to its original form, and then resubmit a new sitemap.
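If you do go the resubmission route, generating the sitemap itself is the easy part. A minimal sketch against the sitemaps.org 0.9 schema (the example.com urls and date are placeholders, obviously):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal XML sitemap (sitemaps.org 0.9 schema).

    `urls` is a list of (loc, lastmod) pairs, e.g.
    ("http://example.com/page.html", "2006-09-11")."""
    entries = "\n".join(
        "  <url>\n    <loc>%s</loc>\n    <lastmod>%s</lastmod>\n  </url>"
        % (escape(loc), lastmod)
        for loc, lastmod in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        "%s\n</urlset>" % entries
    )
```

Restrict the url list to the "safe content" first; you can always add the questionable pages back in later, slowly, per the "take it slow" advice above.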
| 2:37 pm on Sep 11, 2006 (gmt 0)|
FYI...we're back in the index this morning. I'm confused.
| 5:24 pm on Sep 12, 2006 (gmt 0)|
Google may be filtering massive new changes so it can compare them to the older copies and see what has actually changed.
I run a site that has a directory, and since my staff is limited to... me, I do updates about once a month to 50k pages. I noticed that traffic took a hit after this last update I pushed out.
Maybe it triggered a filter, maybe not. Glad I don't depend 100% on the SERPs.