
Massive Rankings Drop for a Single SubDirectory



2:31 pm on Feb 11, 2009 (gmt 0)

5+ Year Member

All, I need your help (or google's help, or someone's help!)

Here's what's happened:

A friend and I run a DIY home improvement blog. We strive for high-quality content that is completely custom (not scraped, not simply duplicated). Part of what we do is review tools, frequently tools available for purchase on Amazon. Almost all of these reviews are in a single subdirectory on our site:


Pages in this directory used to get 2000 hits / week for a wide variety of tool-related terms.

Last Friday, a "bomb went off" and google basically devalued every page in our /tools/ directory, except for one.

The blow has been pretty amazing in terms of lost revenue - I hardly even want to check our affiliate programs...

We've been trying to think of theories - could it be that Google penalized this directory because it has mostly affiliate-type posts in it? We don't think so because the posts don't violate the quality guidelines (e.g. they are rich posts, often with 5-6 meaty paragraphs of useful information).

Could it be that Google is treating the directory as "news" -- e.g. it thinks that the stuff in that directory is now just old news and it doesn't need it anymore?

We've contacted Google through webmaster tools but there has been no response.

We've read all about the supplemental index and I've even used some of the tricks to check which pages are fully indexed (using the site: www.sitename.com/* query). Some of the pages in the tools directory are there, some are not... but none except one of them ranks for anything worthwhile.

For instance, if I search for "this unique eight word phrase in the title" in quotes, I get a whole bunch of other pages on the net (including pages on sites like blogcatalog) ranking ahead of us for our own content. Our page ranks #11!
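In case it helps others run the same checks, these are the two kinds of queries I'm using (the domain and the phrase below are placeholders, of course):

```text
site:www.sitename.com/tools/*
    -> shows which pages in the directory are fully indexed

"this unique eight word phrase in the title"
    -> shows who ranks for our own title text
```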

Please help if you can!



10:30 pm on Feb 11, 2009 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

I've read various Google people mentioning that they can apply a penalty to a domain, a directory, a page, a link, or a keyword.

...devalued every page in our /tools/ directory, except for one.

Now that part seems quite unusual. Is the page that wasn't devalued in some way unique for that directory?


12:08 am on Feb 12, 2009 (gmt 0)

5+ Year Member

Tedster, thanks for the reply....

There are a few differences in the page that stayed in the rankings: 1) It's old. 2) It is an aggregate view of a product type (e.g., think of it as a post that highlights all of the discounted products in a segment... therefore it has a lot of outgoing affiliate links). 3) It has a sitewide link on our site from every individual article (it's in the sidebar of the blog). Also, of all the posts, it might be the one with the least duplicate content elsewhere.

There are other sitewide-linked posts that have fallen into oblivion... despite being everywhere on our site, they are nowhere on Google.

Here are the steps I'm taking:

1. Going back to older articles on our site that rate pretty well in related topics (e.g. How To posts) and linking to our relevant tool reviews.

2. Retooling Robots.txt to ensure no duplicate content is reported to the search engines.

3. Going through our themes (layout) with a fine-tooth comb ensuring that all non-relevant links (e.g. internal, unimportant duplicate links) are nofollowed.

4. Networking with a couple bloggers in our domain and trying to get relevant incoming links into our articles (certainly the hardest part of this... especially when I try to explain what's happened to us to folks that aren't necessarily in it for profit).

5. Reading... reading... reading... trying to find every site that writes about penalization, the supplemental index, "google hell", etc.

6. Writing Google in Webmaster Tools to see if we have been penalized, why, and how we can fix it.

7. Going back to the posts that have basically single outgoing links to Amazon and modifying them to have relevant outbound links to other non-sales sites, and also multiple affiliate targets (e.g. giving readers options for where to buy)

8. Praying... especially that I would learn whatever God is trying to teach me through this time. And also working towards not obsessing, making sure I stay very focused at my day job (which has been extraordinarily blessed despite the 2008-09 recession).
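For anyone wondering what step 3 looks like concretely, it's just adding the attribute in the theme templates. The URLs here are made-up placeholders:

```html
<!-- sidebar / widget link that duplicates navigation: marked nofollow -->
<a href="/tools/example-nailer-review/" rel="nofollow">Example nailer review</a>

<!-- in-line editorial link in a post body: left followed -->
<a href="/tools/example-nailer-review/">our review of this nailer</a>
```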


12:11 am on Feb 12, 2009 (gmt 0)

5+ Year Member

Oh... and one more thing: some of the posts that have been obliterated still self-report PageRank in the toolbar (e.g. some show as PageRank 2 pages).


5:23 am on Feb 12, 2009 (gmt 0)

5+ Year Member

Are your pages dynamically generated or static? What file extension do your pages use?


5:29 am on Feb 12, 2009 (gmt 0)

5+ Year Member

The site is run on WordPress. The content is unique (4 of us write it), but the pages are served dynamically (PHP).

The latest info:

I have done my best to create a robots.txt file that eliminates dupe content. Unfortunately, some major pages that were dupes were in the index (for instance, my "author" page was indexed, which has a copy of all the posts).

I'm waiting for Google to incorporate all my latest robots.txt changes into the index.
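For reference, the sort of rules I've added are along these lines (the exact paths are placeholders; yours depend on your permalink settings):

```text
# keep crawlers out of WordPress archive views that duplicate full posts
User-agent: *
Disallow: /author/
Disallow: /tag/
Disallow: /page/
```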


12:53 pm on Feb 12, 2009 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member

Google started bashing affiliate-based sites a year or more ago. It may have since turned the dial in its algo on what it considers valuable content.

Tell us about the linking structure in your tools/ folder. How many keywords are repeated in the anchor text (e.g., red widgets, blue widgets on the same page)?

Can you see anything in it that looks spammy? (Some old algo penalties arrive late.)



1:10 pm on Feb 12, 2009 (gmt 0)

5+ Year Member


Thanks for discussing with me.

Here are a few things about the content:

1) In general I had a good number of pages (especially recently) that were reviews of products on Amazon. These articles were well-SEO'd (title and h1 text match with descriptive keywords; h2 text supports the review with some additional keywords). Paragraphs are rich with keywords but very human-readable... and written for a human reader. I would not consider it stuffed by any means.

Inter-linking comes a few ways: I have a related-articles plugin at the bottom of every post that automatically links to similar posts on the site (this is on all 447 pages of content we have). It uses the title of the post as the anchor text of the link.

Many of the tool posts were otherwise not linked to anything other than their Amazon target product. I have since started going back into the posts and trying to add more rich outgoing links to authoritative information sites (e.g. The Wood Flooring Association of America, Wikipedia, etc.).

I've also got links to the tool posts coming in from other project posts on the site (e.g. projects where we would use tools like these, if not these very tools). These are generally in-line in the text of those posts.

I tend to not repeat keywords in the same page very much - instead I try to use more adjectives to describe the product we're reviewing. In some cases this leads to some dupe-style content (for instance, there might be a post on "inexpensive pneumatic tool combo kit" and another on "pneumatic three nailer kit by porter cable")

Since my original post I've gotten 4 incoming PR3-4 links to at least one post that focuses on a genre of tools (it gives a list of 8 tools required to do a complex job)... that post has yet to reappear in the index even with the support of those links.


1:12 pm on Feb 12, 2009 (gmt 0)

5+ Year Member

Sorry - just to clarify, I should have said I tend to not repeat keywords in the titles of pages. E.g., I would not have a title like "phillips screwdriver, flat head screwdriver, screwdrivers"; instead I might have "Inexpensive Phillips/Flat Head Reversible Screwdriver by Craftsman" or something like that.


1:57 pm on Feb 12, 2009 (gmt 0)

5+ Year Member

The site is run on WordPress. The content is unique (4 of us write it), but the pages are served dynamically (PHP).

Can you clarify a bit: do your pages end with .php, or with no extension and just a "slash" at the end...?

For example:


2:46 am on Feb 19, 2009 (gmt 0)

5+ Year Member

Took me a while to get back to this. Yes, the pages end with just a slash (/).

I've written Google a more elaborate reconsideration request. Haven't seen any action on that yet. They say 3-4 weeks; we'll see what happens.


4:04 pm on Feb 20, 2009 (gmt 0)

5+ Year Member

Your problem is duplicate content: your pages are accessible both with a trailing slash and without one (not your fault).

To fix this, you need to put a canonical tag on your pages, as discussed in:

I did this a few days ago and got my rankings back just yesterday.
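For anyone who lands here later, the tag goes in the head of each page and names the one preferred URL (the URL below is just an example):

```html
<head>
  <!-- tells search engines which URL is the original version of this page -->
  <link rel="canonical" href="http://www.example.com/tools/sample-review/" />
</head>
```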

Wish you good luck.


7:09 pm on Feb 20, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Many of the tool posts were otherwise not linked to other things other than their amazon target product.

I had a similar situation some years ago, and I think it was penalized as a thin affiliate site, since its only link scheme was to Amazon and internal cross-links. I tried adding more links, and I started to bounce back and forth between penalty and no penalty.

I finally realized G was right: it was a thin site, and served no purpose other than to sell stuff that anyone could find at Amazon on their own. I dropped that business, and am glad of it. There were better things to pursue.

