
Google News Archive Forum

    
Amazon Reviews Affecting Rank
mcockrel (5+ Year Member)
Msg#: 27521 posted 7:59 pm on Jan 13, 2005 (gmt 0)

My site "widget school resources" provides resources for teachers. As part of the site we have reviews pulled from Amazon Web Services (but including only products related specifically to widget educators). This review section also contains eBay affiliate links. The reviews section has been up since early November.

In the recent Google update my site dropped from 3rd to 6th. It has also dropped from 1st to 2nd or 3rd for several other keywords.

If I append "&filter=0" to the search URL, my site still shows up 3rd for the main keyword and 1st for the others. I have looked at everything and can't figure out why it would drop unless I am getting some sort of duplicate-content/affiliate penalty. Is there a penalty for this? If so, wouldn't it only affect the page with the duplicate content/affiliate links, not the entire site?

 

mcockrel (5+ Year Member)
Msg#: 27521 posted 1:45 am on Jan 15, 2005 (gmt 0)

It fell another spot today. This is really frustrating; any thoughts?

Rick_M (10+ Year Member)
Msg#: 27521 posted 3:41 am on Jan 15, 2005 (gmt 0)

My site had always done very well for a variety of keywords. On Sept 23rd, I picked up some sort of sitewide penalty: I can't rank higher than 5th for any keyword, and even for my own site name I have ranked anywhere from 5th to 20th since that time.

I ran an Amazon product feed on the site, and I believe that has been the main culprit for the site-wide penalty. I'm not sure if it is the sheer volume of duplicate content or the template-based approach of links back to my home page, but for some reason I have only been able to find one site with a significant number of Amazon pages that hasn't had a site-wide penalty. I have seen sites with fewer than 10k pages do okay while still carrying Amazon content, but above that, all seem to have some type of filter/penalty.

After that, I immediately blocked out that section with my robots.txt file and submitted a removal request to Google. My site still hasn't recovered, but there are too many variables to be certain that is the only factor at play. I had been hoping it was only a 90-day penalty, but I'm still waiting.

So if you have a large number of pages generated from the Amazon content, the duplicate content may be what is hurting you. If you really only have Amazon pages that are "widget educator" related, and fewer than 10k pages are being created, I think you need to look elsewhere. The 10k-page limit is an arbitrary number; it may be closer to 4k or less.
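[Editor's note: blocking a generated section via robots.txt, as described above, can be sketched like this. The /reviews/ path and example.com domain are placeholders, not the poster's actual setup; Python's standard-library parser is used only to sanity-check the rule.]

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking an Amazon-generated reviews section
robots_txt = [
    "User-agent: Googlebot",
    "Disallow: /reviews/",
]

rp = RobotFileParser()
rp.parse(robots_txt)

# Pages under /reviews/ are disallowed for Googlebot...
print(rp.can_fetch("Googlebot", "http://example.com/reviews/some-book.html"))  # False
# ...while the rest of the site remains crawlable.
print(rp.can_fetch("Googlebot", "http://example.com/"))  # True
```

Note that robots.txt only stops crawling going forward; pages already indexed had to be removed separately (hence the removal request mentioned above).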

suidas (10+ Year Member)
Msg#: 27521 posted 5:22 am on Jan 15, 2005 (gmt 0)

I'm sorry for the people who've lost money, but this is a welcome change for searchers. Searching for information about books has become nearly impossible, with page after page of Amazon content crowding out reviews and commentary, no matter how obscure the book.

I recently did a search for _In Search of Sunjata_, a book about oral poetry in the Mande language. Very obscure; I'll bet only a few thousand people have read it. But I got NINE pages of Google results, bilge water every one of them. Many promised reviews, interviews with the author, etc.

My favorite was a top-ranked page titled "Oral & Maxillofacial Surgery." Of course, it wasn't about that topic at all; it was merely a collection of books, some medical, some not. And how did it pick up a book on West African oral poetry? Oral. Get it? Cute.

So, for those of us who both benefit from and *use* Google, it's a welcome change.

mcockrel (5+ Year Member)
Msg#: 27521 posted 5:11 pm on Jan 15, 2005 (gmt 0)

The interesting thing is that if you search for almost any of the books I have listed on the site, it comes up 1st or 2nd, usually right behind Amazon's own listing.

I would expect it to give a duplicate penalty to the page but not to the whole site.

Does anyone know if the damage is already done? If I exclude the reviews via robots.txt, or set up a separate domain and link to it, would that eliminate the penalty?

Thanks for the info.

mcockrel (5+ Year Member)
Msg#: 27521 posted 2:50 pm on Jan 17, 2005 (gmt 0)

Just to update: my site is now back to #4 for its main keyword.

"&filter=0" puts it back to 3rd, so there is still a penalty.

It is still lower on other keywords.

I am thinking about either a) trying to exclude the review section via robots.txt, or b) setting up a separate "widget reviews" site and linking to it via a script that Google can't follow.

phantombookman (10+ Year Member)
Msg#: 27521 posted 2:58 pm on Jan 17, 2005 (gmt 0)

Suidas, well said. I know the feeling, and it is shared by many people in the book trade. Google gives too much weight to affiliate sites in this area; it is almost pointless searching on Google for well-known authors. I am sure that if people want to buy a book on Amazon, they can probably find the site themselves.

Perhaps if Google only brought them into play when someone searches for 'buy joe soap books' etc., that would be an improvement.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved