
Amazon Reviews Affecting Rank



7:59 pm on Jan 13, 2005 (gmt 0)

10+ Year Member

My site, "widget school resources", provides resources for teachers. As part of the site we have reviews pulled from Amazon Web Services (limited to products related specifically to widget educators). The review section also contains eBay affiliate links and has been up since early November.

In the recent Google update my site has dropped from 3rd to 6th. It has also dropped from 1st to 2nd or 3rd for several other keywords.

If I append "&filter=0" to the results URL, my site still shows up 3rd for the main keyword and 1st for the others. I have looked at everything and can't figure out why it would drop unless I am getting some sort of duplicate content/affiliate penalty. Is there a penalty for this? If so, wouldn't it only affect the page with the duplicate content/affiliate links, not the entire site?
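For anyone unfamiliar with the trick: you compare the same query with and without the `filter=0` parameter, which (by long-standing observation, not official documentation) tells Google to skip its duplicate/similar-results filter. A minimal sketch of building the two URLs to compare; the query is just an example:

```python
# Build a Google results URL, optionally with the unofficial
# "filter=0" parameter that disables the duplicate-results filter.
# If a site ranks noticeably better with filter=0, a filter is
# likely being applied to it in normal results.
from urllib.parse import urlencode

def google_results_url(query: str, unfiltered: bool = False) -> str:
    params = {"q": query}
    if unfiltered:
        params["filter"] = "0"  # skip the duplicate/similar-results filter
    return "https://www.google.com/search?" + urlencode(params)

print(google_results_url("widget school resources"))
print(google_results_url("widget school resources", unfiltered=True))
```

Checking your site's position in each result set side by side is what reveals the gap the poster describes.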


1:45 am on Jan 15, 2005 (gmt 0)

10+ Year Member

It fell another spot today. This is really frustrating; any thoughts?


3:41 am on Jan 15, 2005 (gmt 0)

10+ Year Member

My site had always done very well for a variety of keywords. On Sept 23rd I picked up some sort of sitewide penalty: I can't rank higher than 5th for any keyword, and even for my own site name I have ranked anywhere from 5th to 20th since that time. I ran an Amazon product feed on the site, and I believe that has been the main culprit for the site-wide penalty. I'm not sure if it is the sheer amount of duplicate content or the template-based approach of links back to my home page, but for some reason I have only been able to find one site with a significant number of Amazon pages that hasn't had a site-wide penalty. I have seen sites with fewer than 10k pages do okay while still carrying Amazon content, but above that, all seem to have some type of filter/penalty.

After that, I immediately blocked out that section with my robots.txt file and submitted a removal request to Google. My site still hasn't recovered, but there are too many variables to be certain that is the only factor at play. I had been hoping it was only a 90-day penalty, but I'm still waiting.
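For reference, blocking a section like that is a couple of lines of robots.txt. A minimal sketch, assuming the Amazon-fed pages live under a hypothetical /amazon/ directory (substitute your own path):

```
# Keep Googlebot out of the Amazon-generated section
User-agent: Googlebot
Disallow: /amazon/
```

Note that Disallow only stops crawling; pages already indexed had to be removed separately via Google's URL removal tool, as described above.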

So, if a large number of your pages are generated from the Amazon content, the duplicate content may be what is hurting you. If your Amazon pages really are only "widget educator" related, and fewer than 10k pages are being created, I think you need to look elsewhere. The 10k page limit is an arbitrary number; the real threshold may be closer to 4k or less.


5:22 am on Jan 15, 2005 (gmt 0)

10+ Year Member

I'm sorry for the people who've lost money, but this is a welcome change for searchers. Searching for information about books had become nearly impossible, with page after page of Amazon content crowding out reviews and commentary, no matter how obscure the book.

I recently did a search for _In Search of Sunjata_, a book about oral poetry in the Mande language. Very obscure; I'll bet only a few thousand people have read it. But I got NINE pages of Google results, bilge water every one of them. Many promised reviews, interviews with the author, etc.

My favorite was a top-ranked page titled "Oral & Maxillofacial Surgery." Of course, it wasn't about that topic at all, but merely a collection of books, some medical, some not. And how did it pick up a book on West African oral poetry? Oral. Get it? Cute.

So, for those of us who both benefit from and *use* Google, it's a welcome change.


5:11 pm on Jan 15, 2005 (gmt 0)

10+ Year Member

The interesting thing is that if you search for almost any of the books listed on the site, it comes up 1st or 2nd, usually right behind Amazon's own listing.

I would expect that to earn a duplicate penalty for the page, but not for the whole site.

Does anyone know if the damage is already done? Or, if I excluded the reviews via robots.txt, or set up a separate domain and linked to it, would that eliminate the penalty?

Thanks for the info.


2:50 pm on Jan 17, 2005 (gmt 0)

10+ Year Member

Just to update: my site is now back to #4 for its main keyword.

"&filter=0" puts it back at 3rd, so there is still a penalty.

It is still lower on other keywords.

I am thinking about either a) excluding the review section via robots.txt, or b) setting up a separate "widget reviews" site and linking to it via a script that Google can't follow.


2:58 pm on Jan 17, 2005 (gmt 0)

10+ Year Member

Well said. I know the feeling, and it is shared by many people in the book trade. Google gives too much weight to affiliate sites in this area.
It is almost pointless searching on Google for well-known authors.
I am sure that if people want to buy a book on Amazon, they can find the site themselves.

Perhaps if Google only brought them into play when someone searches for 'buy joe soap books' and the like, that would be an improvement.

