In the recent Google update my site dropped from 3rd to 6th for its main keyword. It has also dropped from 1st to 2nd or 3rd for several other keywords.
If I use "&filter=0" my site still shows up as 3rd for the main keyword and 1st for the others. I have looked at everything and can't figure out why it would drop unless I am getting some sort of duplicate content/affiliate penalty. Is there a penalty for this? If so, wouldn't it only affect the page with the duplicate content/affiliate links, not the entire site?
After that, I immediately blocked out that section with my robots.txt file and submitted it to Google to have the sections removed. My site still hasn't recovered - but there are too many variables to be certain that is the only factor at play. I had been hoping it would only be a 90 day penalty, but I'm still waiting.
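For anyone trying the same thing, a minimal robots.txt that blocks one section of a site looks like the sketch below. The `/reviews/` path is just a placeholder for whatever directory holds the duplicate content:

```
# Block all well-behaved crawlers from the duplicate/affiliate section
User-agent: *
Disallow: /reviews/
```

Note that Disallow matching is by URL path prefix, so every URL starting with `/reviews/` is excluded; you can then request removal of those URLs through Google's removal tool.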
So, if you have a large number of pages that are generated from the Amazon content, then the duplicate content may be what is hurting you. If you really only have Amazon pages that are "widget educator" related - and fewer than 10k pages are being created - I think you need to explore elsewhere. The 10k page limit is an arbitrary number, and it may be closer to 4k or less.
I recently did a search for _In Search of Sunjata_, a book about oral poetry in the Mande language. Very obscure. I'll bet only a few thousand people have read it. But I got NINE pages of Google results--bilge water, every one of them. Many promised reviews, interviews with the author, etc. etc.
My favorite was a top-ranked page titled "Oral & Maxillofacial Surgery." Of course, it wasn't about that topic at all, but merely a collection of books, some medical, some not. And how did it pick up a book on West African oral poetry? Oral. Get it? Cute.
So, for those of us who both benefit from and *use* Google, it's a welcome change.
I would expect it to give a duplicate penalty to the page but not to the whole site.
Does anyone know if the damage is done? Or, if I exclude the reviews via robots.txt, or set up a separate domain and link to it, would that eliminate the penalty?
Thanks for the info.
"&filter=0" puts it back to 3. So there is still a penalty.
It is still lower on other keywords.
I am thinking about either a) trying to exclude the review section via robots.txt, or b) setting up a "widget reviews" site and linking to it via a script that Google can't follow.
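For option b), the usual trick is to replace the plain anchor with a JavaScript navigation, since crawlers follow href attributes but generally don't execute scripts. A rough sketch (the URL is a hypothetical placeholder):

```
<!-- Crawlable link Google will follow: -->
<a href="http://widget-reviews.example.com/">Widget Reviews</a>

<!-- Script-only link crawlers generally won't follow: -->
<span onclick="window.location='http://widget-reviews.example.com/'"
      style="cursor: pointer; text-decoration: underline;">Widget Reviews</span>
```

Keep in mind this isn't guaranteed forever - if Google ever starts reading script-based navigation, the link becomes visible again - so robots.txt on the target site is the more reliable belt-and-braces approach.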
Perhaps if Google could bring them into play when someone searches for 'buy joe soap books' etc., that would be an improvement.