
Will Google Ban a Single Sub-Directory?

ryanfromaustin

2:54 pm on Sep 10, 2006 (gmt 0)

10+ Year Member



We have a subdirectory on our site that contains most of our static content. As of about a month ago, we had only about 100 pages in the directory and all were indexed in Google. We then added several thousand pages at once into this directory. Now, Google appears not to be indexing any files in that directory, though it is still indexing files in our root directory as well as other subdirectories.

My question is, has anyone heard of Google removing just a subdirectory from the index? Is it possible that the one directory got flagged for "over optimization" or is the whole site on the way out and we're just seeing the removal of the one directory so far?

tedster

3:07 pm on Sep 10, 2006 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



There was a related discussion in August: Is Google penalizing only parts of sites? [webmasterworld.com]

Perhaps some of the filters that make up the "sandbox effect" could be getting applied more selectively within a domain, instead of domain-wide. I haven't seen this effect myself, but a few reports are coming in, so it seems like this may be happening. If so, it's nicer than whacking an entire domain's trust factor just for a boatload of new URLs in one directory.

I note that you used the word "ban" in your title. I think it is more like a filter (if we are understanding it correctly) than a ban, and once Google gains some trust on the new pages they could start to show up.

Are you watching your server logs for googlebot requests in that directory?
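
If you have shell access, a quick script like this will list every Googlebot request into that directory. It's only a sketch -- the log path, the combined log format, and the "/static/" directory name are placeholders for whatever your server actually uses:

import re

LOG_PATH = "/var/log/apache2/access.log"   # placeholder: your access log
DIRECTORY = "/static/"                     # placeholder: the subdirectory in question

# Apache/nginx "combined" format:
# host - - [date] "METHOD /path HTTP/1.x" status bytes "referer" "user-agent"
line_re = re.compile(
    r'\S+ \S+ \S+ \[([^\]]+)\] "(?:GET|HEAD) (\S+)[^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"'
)

with open(LOG_PATH) as log:
    for line in log:
        m = line_re.match(line)
        if not m:
            continue
        when, path, status, agent = m.groups()
        if "Googlebot" in agent and path.startswith(DIRECTORY):
            print(when, status, path)

That will at least tell you whether the spider is still requesting those urls, even if they aren't showing in the index.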

ryanfromaustin

3:48 pm on Sep 10, 2006 (gmt 0)

10+ Year Member



Thanks for the input, tedster. It looks like Googlebot did crawl that directory on September 7. Apparently, after that crawl everything from that directory was removed from the index. Now, the question is (not that anyone can answer this exactly, but perhaps someone can offer some relevant experience):

1) Did Google decide that the new content was spam and permanently remove the directory? In this case, I should delete all the new content and possibly move the old content to a new directory.

2) Did Google simply raise a flag because of the number of files added, and that flag will be turned off after some waiting period? If this is the case, then I am probably best just leaving everything as it is.

3) Is Google penalizing the directory because it doesn't like the new content? If so, then perhaps I should delete the new content and just sit on the old content, hoping that Google forgives me.

Any thoughts?

Green_Grass

3:54 pm on Sep 10, 2006 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Check this out also.

[webmasterworld.com...]

jimbeetle

4:16 pm on Sep 10, 2006 (gmt 0)

WebmasterWorld Senior Member jimbeetle is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Quick question, what types of pages are in that directory? Do they have a lot of differentiated content or can they somehow be seen as fairly similar to one another?

tedster

4:43 pm on Sep 10, 2006 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Another issue would be if the urls in that directory have unique title tags and meta descriptions. Even if the rest of the on-page content is differentiated, title and meta description have become essential "quick differentiators" for Google in the past year.
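
If you want to double-check, a rough script like this will flag any titles or descriptions used on more than one page. It assumes the pages are plain HTML files sitting in a local "static" folder -- that folder name and the simple regexes are just placeholders, so adjust for how your pages are actually stored:

import re
from collections import defaultdict
from pathlib import Path

title_re = re.compile(r"<title>(.*?)</title>", re.I | re.S)
desc_re = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']', re.I | re.S
)

titles, descs = defaultdict(list), defaultdict(list)

for page in Path("static").rglob("*.html"):     # placeholder folder
    html = page.read_text(errors="ignore")
    for regex, bucket in ((title_re, titles), (desc_re, descs)):
        m = regex.search(html)
        if m:
            # collapse whitespace so trivially different copies still match
            bucket[" ".join(m.group(1).split())].append(page.name)

for label, bucket in (("title", titles), ("meta description", descs)):
    for text, pages in bucket.items():
        if len(pages) > 1:
            print(f"Duplicate {label}: {text!r} on {pages}")

Anything that prints out is a page that isn't giving Google a quick way to tell it apart from its siblings.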

Aforum

5:06 pm on Sep 10, 2006 (gmt 0)

5+ Year Member



I have seen this happen in numerous situations: mod_rewrite changes, sitemaps, new content, consolidated content, etc.

It seems that if you introduce many links at one time, you will get flagged for sure. It is definitely an attempt to curb spam, but some legit sites are getting hit when they just try to make simple corrections to their site. Lesson number one is, "take it slow" :)

[edited by: Aforum at 5:06 pm (utc) on Sep. 10, 2006]

ryanfromaustin

7:59 pm on Sep 10, 2006 (gmt 0)

10+ Year Member



jimbeetle - Some of the pages are fairly similar, I must admit. They have legitimate content, but some of the text is duplicated.

tedster - We have unique titles and meta tags on all the pages.

dibbern2

8:56 pm on Sep 10, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Back in the June SERPs disaster (I forget what we all named it) I had a sub-directory that previously ranked well totally disappear. So in answer to your question whether a part of a website can get hit, I'd say yes, for sure.

I made it back on July 27th, and actually have a boost in rankings now.

I'll never forget this very unpleasant lesson.

ryanfromaustin

10:20 pm on Sep 10, 2006 (gmt 0)

10+ Year Member



dibbern2 - did you do anything to get back into the SERPS or did you just wait it out?

reseller

10:29 pm on Sep 10, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



ryanfromaustin

Do you have AdSense spots or affiliate referral links on the several thousand pages you added?

Thanks!

jimbeetle

10:31 pm on Sep 10, 2006 (gmt 0)

WebmasterWorld Senior Member jimbeetle is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I had a sub-directory that previously ranked well totally disappear. So in answer to your question whether a part of a website can get hit, I'd say yes, for sure

I think we should carefully differentiate something here: Does Google target, for whatever reason, a specific directory for deindexing? Or, is it that pages within a specific directory share certain characteristics that lead to their being deindexed?

My experience tells me it's the second option.

They have legitimate content, but some of the text is duplicated.

I'd start with taking care of this. It probably won't make the whole problem go away, but cleaning this part up should help somewhat. After that, more investigation might help to expose the underlying problem. I don't think there's going to be a quick fix, more like a step-by-step process.
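
One rough way to see how bad the duplication is: compare pages by word "shingles" and flag the pairs that overlap heavily. This is only a sketch -- it assumes the pages are local HTML files under a placeholder "static" folder, and the 8-word shingle size and 50% threshold are arbitrary starting points:

import re
from itertools import combinations
from pathlib import Path

def shingles(page, size=8):
    # strip tags, lowercase, and build overlapping 8-word phrases
    text = re.sub(r"<[^>]+>", " ", page.read_text(errors="ignore"))
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

pages = {p: shingles(p) for p in Path("static").rglob("*.html")}   # placeholder folder

for (a, sa), (b, sb) in combinations(pages.items(), 2):
    if not sa or not sb:
        continue
    overlap = len(sa & sb) / len(sa | sb)      # Jaccard similarity
    if overlap > 0.5:                          # arbitrary threshold
        print(f"{a.name} and {b.name} share {overlap:.0%} of their text")

The pairs it prints are the ones I'd rewrite or consolidate first.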

ryanfromaustin

11:32 pm on Sep 10, 2006 (gmt 0)

10+ Year Member



reseller - No, no ads or affiliate links at all.

jimbeetle - all of the pages have common navigation, so all of the "safe content" (pages that are undoubtedly high-quality and previously-indexed) point to index pages for the questionable content. I'm wondering if the very fact that they are linked to the other content is causing the problem. Perhaps the best path to take is to just delete all of the new content, restore the old content to its original form, and then resubmit a new sitemap.
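
If we do end up pruning and resubmitting, regenerating the sitemap is the easy part. A bare-bones sketch -- the URL list here is just a stand-in for whatever pages survive the cleanup:

from xml.sax.saxutils import escape

urls = [
    "http://www.example.com/",                      # stand-in URLs
    "http://www.example.com/static/old-page.html",
]

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in urls:
        f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
    f.write("</urlset>\n")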

ryanfromaustin

2:37 pm on Sep 11, 2006 (gmt 0)

10+ Year Member



FYI...we're back in the index this morning. I'm confused.

carminejg3

5:24 pm on Sep 12, 2006 (gmt 0)

10+ Year Member



Google may be filtering massive new changes to "compare them to the older copies" and see what changes have taken effect.

I run a site that has a directory, and since my staff is limited to... me, I do updates about once a month to 50k pages. I noticed that traffic has taken a hit after this last update I pushed out.

Maybe it triggered a filter, maybe not. Glad I don't depend 100% on the SERPs.

 
