Forum Moderators: open
Nothing can be proven, but my gut feeling is that they're going after duplicate or near-duplicate pages. Don't take my word for it, but check whether your dropped pages come close - estimate what percentage of each page is shared text and see if that's a plausible explanation; it can't hurt.
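If you want to put a rough number on "how close," a quick sanity check is to compare pages by word shingles and Jaccard similarity. To be clear, this is just an illustration of one common near-duplicate measure - nobody outside Google knows what they actually use - and the sample page text below is made up:

```javascript
// Rough near-duplicate check: w-word shingling plus Jaccard similarity.
// Illustrative only -- Google's real duplicate detection is not public.

function shingles(text, w = 4) {
  // Break the text into overlapping runs of w words (case-insensitive).
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  const out = new Set();
  for (let i = 0; i + w <= words.length; i++) {
    out.add(words.slice(i, i + w).join(" "));
  }
  return out;
}

function similarity(a, b, w = 4) {
  // Jaccard similarity of the two shingle sets: |A ∩ B| / |A ∪ B|.
  const sa = shingles(a, w);
  const sb = shingles(b, w);
  let inter = 0;
  for (const s of sa) if (sb.has(s)) inter++;
  const union = sa.size + sb.size - inter;
  return union === 0 ? 1 : inter / union;
}

// Two hypothetical product pages that share boilerplate text:
const pageA = "widget specs plus shared boilerplate about shipping and returns";
const pageB = "gadget specs plus shared boilerplate about shipping and returns";
console.log((similarity(pageA, pageB) * 100).toFixed(0) + "% similar");
```

Run it against the visible text of two of your dropped pages; if most page pairs score very high, shared boilerplate dominating the unique content is at least a candidate explanation.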
Disclaimer: Not intended to be taken as fact. When all else fails, the Accidental_SEO relies on hunches but doesn't recommend the practice to anyone else.
The site is using the same template throughout (as do most shopping sites).
Would that trip the dupe penalty?
I cannot find any other reason why a site I care for has dropped.
The site in question had a PR5 but was reassigned a PR1 (even though the same quality links are still present and visible in its backlinks). It now performs as a PR1 site would - awfully in a competitive area.
Have any other OSC sites suffered?
I wonder if Google is trying to cut out duplicate content generally on the web, and has just gone too far, with innocent sites suffering.
- I did not see a Google deep crawl for about 3-4 weeks.
- The site was down for about 12 hours recently.
- About 6 weeks ago I added a large amount of visible common text to almost all of the pages, after which only 10-15% of each page's content remained unique. The rest of the content is identical throughout the site.
Is this a duplicate penalty or lack of deep crawling or something else?
I have already eliminated the common text. It was great for site users to have that instructional text on each of the individual pages, but if it gets me in trouble with great Google - I guess I have no choice!
Perhaps in the future I will consider 'wrapping' that text inside JavaScript or Flash so that Google would not spider it...
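The 'wrapping' idea could be as simple as keeping the shared instructional text only in a script and injecting it on page load, so a crawler that doesn't execute JavaScript never sees it in the HTML source. A minimal sketch - the element id "common-help" and the help text are hypothetical, and be aware that deliberately hiding text from crawlers can be viewed as cloaking:

```javascript
// Shared instructional text lives only in the script, not in the HTML.
// Hypothetical content and element id, for illustration only.
const COMMON_HELP = "How to order: pick a size, add to cart, then check out.";

function renderCommonHelp() {
  // Build the markup that will replace the empty placeholder element.
  return '<div class="help">' + COMMON_HELP + "</div>";
}

// In a browser this fills <div id="common-help"></div> after the page loads;
// the guard lets the same file run outside a browser without a DOM.
if (typeof document !== "undefined") {
  document.getElementById("common-help").innerHTML = renderCommonHelp();
}
```

Whether this actually helps depends on whether Googlebot executes the script, so treat it as an experiment rather than a fix.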
Any suggestions would be appreciated.