Google SEO News and Discussion Forum

Will Google Ban a Single Sub-Directory?
ryanfromaustin
msg:3078033 - 2:54 pm on Sep 10, 2006 (gmt 0)

We have a subdirectory on our site that contains most of our static content. As of about a month ago, we had only about 100 pages in the directory, and all were indexed in Google. We then added several thousand pages at once to this directory. Now Google appears to not be indexing any files in that directory, though it is still indexing files in our root directory as well as in other subdirectories.

My question is, has anyone heard of Google removing just a subdirectory from the index? Is it possible that the one directory got flagged for "over optimization" or is the whole site on the way out and we're just seeing the removal of the one directory so far?

 

tedster
msg:3078041 - 3:07 pm on Sep 10, 2006 (gmt 0)

There was a related discussion in August: Is Google penalizing only parts of sites? [webmasterworld.com]

Perhaps some of the filters that make up the "sandbox effect" could be getting applied more selectively within a domain, instead of domain-wide. I haven't seen this effect myself, but a few reports are coming in, so it seems like this may be happening. If so, it's nicer than whacking an entire domain's trust factor just for a boatload of new URLs in one directory.

I note that you used the word "ban" in your title. I think it is more like a filter (if we are understanding it correctly) than a ban, and once Google gains some trust in the new pages, they could start to show up.

Are you watching your server logs for Googlebot requests in that directory?
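
A minimal sketch of that kind of log check, assuming a combined-format access log at access.log and /static/ as the affected directory (both are placeholders, not details from this thread):

    import re

    # Placeholders; point these at your real log file and directory.
    LOG_PATH = "access.log"
    DIRECTORY = "/static/"

    # Note: a serious check would verify Googlebot via reverse DNS;
    # matching the user-agent string is enough for a quick look.
    hits = {}
    with open(LOG_PATH) as log:
        for line in log:
            if "Googlebot" not in line:
                continue
            request = re.search(r'"(?:GET|HEAD) (\S+)', line)
            if request and request.group(1).startswith(DIRECTORY):
                hits[request.group(1)] = hits.get(request.group(1), 0) + 1

    for url, count in sorted(hits.items(), key=lambda item: -item[1]):
        print(count, url)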

ryanfromaustin
msg:3078073 - 3:48 pm on Sep 10, 2006 (gmt 0)

Thanks for the input, tedster. It looks like Googlebot did crawl that directory on September 7. Apparently, after that crawl, everything from that directory was removed from the index. Now the question is (not that anyone can answer this exactly, but perhaps someone can offer some relevant experience):

1) Did Google decide that the new content was spam and permanently remove the directory? In this case, I should delete all the new content and possibly move the old content to a new directory.

2) Did Google simply raise a flag because of the number of files added, and will that flag be turned off after some waiting period? If this is the case, then I am probably best off just leaving everything as it is.

3) Is Google penalizing the directory because it doesn't like the new content? If so, then perhaps I should delete the new content and just sit on the old content, hoping that Google forgives me.

Any thoughts?

Green_Grass
msg:3078079 - 3:54 pm on Sep 10, 2006 (gmt 0)

Check this out also.

[webmasterworld.com...]

jimbeetle
msg:3078105 - 4:16 pm on Sep 10, 2006 (gmt 0)

Quick question: what types of pages are in that directory? Do they have a lot of differentiated content, or can they somehow be seen as fairly similar to one another?
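
One rough way to answer that question with numbers is to compare pages pairwise, e.g. Jaccard similarity over word shingles. A sketch, assuming the pages live in a local static/ directory (the directory, glob pattern, and 0.8 threshold are all assumptions):

    import itertools
    import pathlib
    import re

    def shingles(text, size=5):
        # Lowercased word 5-grams: a crude proxy for shared phrasing.
        words = re.findall(r"[a-z0-9]+", text.lower())
        return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

    pages = {p.name: shingles(p.read_text(errors="ignore"))
             for p in pathlib.Path("static").glob("*.html")}

    for (a, sa), (b, sb) in itertools.combinations(pages.items(), 2):
        union = sa | sb
        if union and len(sa & sb) / len(union) > 0.8:
            print(f"near-duplicates: {a} vs {b}")

    # Caveat: shared navigation and template markup inflate these scores,
    # so strip the boilerplate before comparing if you want a fair number.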

tedster
msg:3078122 - 4:43 pm on Sep 10, 2006 (gmt 0)

Another issue would be whether the URLs in that directory have unique title tags and meta descriptions. Even if the rest of the on-page content is differentiated, title and meta description have become essential "quick differentiators" for Google in the past year.
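
A quick audit for duplicate titles and descriptions, sketched with Python's standard library (the static/ directory is a placeholder, and the regexes assume simple markup with name= before content=; a real HTML parser would be more robust):

    import pathlib
    import re
    from collections import defaultdict

    # Group page filenames by their <title> and meta description text.
    titles = defaultdict(list)
    descriptions = defaultdict(list)

    for page in pathlib.Path("static").glob("*.html"):
        html = page.read_text(errors="ignore")
        title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
        desc = re.search(
            r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
            html, re.I | re.S)
        if title:
            titles[title.group(1).strip()].append(page.name)
        if desc:
            descriptions[desc.group(1).strip()].append(page.name)

    for label, groups in (("title", titles), ("description", descriptions)):
        for text, names in groups.items():
            if len(names) > 1:
                print(f"duplicate {label} on {len(names)} pages: {text[:60]}")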

Aforum
msg:3078141 - 5:06 pm on Sep 10, 2006 (gmt 0)

I have seen this happen in numerous situations: mod_rewrite changes, sitemaps, new content, consolidated content, etc.

It seems that if you introduce many links at one time, you will get flagged for sure. It is definitely an attempt to curb spam, but some legit sites are getting hit when they just try to make simple corrections to their sites. Lesson number one is, "take it slow" :)
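
One mechanical way to take it slow is to roll new URLs out in fixed-size batches instead of all at once. A trivial sketch (the batch size and URL pattern are made up for illustration):

    # Yield a long list of new URLs in small weekly batches.
    def batches(urls, size=100):
        for start in range(0, len(urls), size):
            yield urls[start:start + size]

    new_urls = [f"/static/page-{n}.html" for n in range(3000)]  # hypothetical
    for week, batch in enumerate(batches(new_urls), start=1):
        print(f"week {week}: publish {len(batch)} pages")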


ryanfromaustin
msg:3078235 - 7:59 pm on Sep 10, 2006 (gmt 0)

jimbeetle - Some of the pages are fairly similar, I must admit. They have legitimate content, but some of the text is duplicated.

tedster - We have unique titles and meta tags on all the pages.

dibbern2
msg:3078266 - 8:56 pm on Sep 10, 2006 (gmt 0)

Back in the June SERPs disaster (I forget what we all named it), I had a sub-directory that previously ranked well totally disappear. So in answer to your question of whether a part of a website can get hit, I'd say yes, for sure.

I made it back on July 27th, and actually have a boost in rankings now.

I'll never forget this very unpleasant lesson.

ryanfromaustin
msg:3078317 - 10:20 pm on Sep 10, 2006 (gmt 0)

dibbern2 - did you do anything to get back into the SERPs, or did you just wait it out?

reseller
msg:3078329 - 10:29 pm on Sep 10, 2006 (gmt 0)

ryanfromaustin

Do you have AdSense spots or affiliate referral links on the several thousand pages you added?

Thanks!

jimbeetle
msg:3078330 - 10:31 pm on Sep 10, 2006 (gmt 0)

I had a sub-directory that previously ranked well totally disappear. So in answer to your question of whether a part of a website can get hit, I'd say yes, for sure

I think we should carefully differentiate something here: does Google target, for whatever reason, a specific directory for deindexing? Or is it that pages within a specific directory share certain characteristics that lead to their being deindexed?

My experience tells me it's the second option.

They have legitimate content, but some of the text is duplicated.

I'd start with taking care of this. It probably won't make the whole problem go away, but cleaning this part up should help somewhat. After that, more investigation might help to expose the underlying problem. I don't think there's going to be a quick fix, more like a step-by-step process.

ryanfromaustin
msg:3078367 - 11:32 pm on Sep 10, 2006 (gmt 0)

reseller - No, no ads or affiliate links at all.

jimbeetle - all of the pages have common navigation, so all of the "safe content" (pages that are undoubtedly high-quality and previously-indexed) point to index pages for the questionable content. I'm wondering if the very fact that they are linked to the other content is causing the problem. Perhaps the best path to take is to just delete all of the new content, restore the old content to its original form, and then resubmit a new sitemap.
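
If he does go the rebuild-and-resubmit route, regenerating the sitemap is the easy part. A sketch with placeholder URLs (example.com and the page list are illustrations, not details from this thread):

    import xml.etree.ElementTree as ET

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=NS)
    for path in ["/", "/static/index.html"]:  # hypothetical surviving pages
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = "https://www.example.com" + path
    ET.ElementTree(urlset).write(
        "sitemap.xml", encoding="utf-8", xml_declaration=True)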

ryanfromaustin
msg:3078845 - 2:37 pm on Sep 11, 2006 (gmt 0)

FYI...we're back in the index this morning. I'm confused.

carminejg3
msg:3080374 - 5:24 pm on Sep 12, 2006 (gmt 0)

Google may be filtering massive new changes to "compare them to the older copies" to see what changes have taken effect.

I run a site that has a directory, and since my staff is limited to... me, I do updates about once a month to 50k pages. I noticed that traffic has taken a hit after this last update I pushed out.

Maybe it triggered a filter, maybe not. Glad I don't depend 100% on the SERPs.
