| 7:01 pm on Oct 3, 2005 (gmt 0)|
How big is the text content of each page?
| 7:19 pm on Oct 3, 2005 (gmt 0)|
50 words seems low, but I would still be worried if I were you.
Have you seen a drop?
| 3:07 pm on Oct 4, 2005 (gmt 0)|
I have recently seen that almost all of my pages are on a gradual march towards the supplemental results when I do site:mysite.com. So much so that only 3 pages are displayed outside of the supplemental results. This makes me think that a dup content penalty is being applied to these pages, because all of them have a common disclaimer in the footer.
| 3:09 pm on Oct 4, 2005 (gmt 0)|
|How big is the text content of each page? |
I would say the word count ranges from about 300 to 1000 words per page.
| 3:21 pm on Oct 4, 2005 (gmt 0)|
1) 50 out of 300-1000 words is a lot of duplication.
2) If you're using generic text which is found on tons of sites you might be setting yourself up.
I'd create a link from every page to a disclaimer / ToS page.
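Something as simple as this in the footer would do it (file name hypothetical):

```html
<!-- hypothetical footer link instead of the full 50-word disclaimer -->
<p class="footer"><a href="/disclaimer.html">Disclaimer &amp; Terms of Service</a></p>
```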
| 3:23 pm on Oct 4, 2005 (gmt 0)|
The duplicate filter is getting very sensitive, sometimes beyond any logic. It especially cares about duplicate meta descriptions and the beginning of the BODY content. But a footer, perhaps, could also be a problem. On one site I did everything to prevent duplicate content at the beginning of the document, and Google still used the footer text as the snippet in a site: search!
There might be a bit of controversy over whether this could be considered black hat, but in fact it's just a client-side include, and could be done not only for SEO but for easier updating, like server-side includes. I use this technique to avoid duplicate content from the header menu, and the site ranks well, no penalties.
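A minimal sketch of that client-side include idea, assuming a hypothetical footer.js file (the boilerplate never appears in the raw HTML the crawler fetches):

```html
<!-- In each page: the footer is pulled in by the browser at load time -->
<script type="text/javascript" src="/footer.js"></script>

<!-- Contents of footer.js (hypothetical): writes the shared disclaimer into the page -->
<!-- document.write('<div class="disclaimer">Shared disclaimer text here...</div>'); -->
```

Same maintenance win as a server-side include, except the text is invisible to crawlers that don't execute JavaScript.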
And agreed, 50 words out of 300 is a lot of duplication, at least for Google!
| 3:41 pm on Oct 4, 2005 (gmt 0)|
|And agreed, 50 words out of 300 is a lot of duplication, at least for Google! |
It's a lot for users, too. Why not just have a "Disclaimer" link, as Shri suggests?
| 5:27 pm on Oct 4, 2005 (gmt 0)|
I have the same problem, but I'm thinking of a different way of solving it. My subject is a little touchy, from a legal standpoint, so I really think it's best to have the disclaimer clearly on each page.
I'm thinking I'll just use a .gif of the necessary text. No viewer will be able to tell the difference from real text, so there's no design issue.
Sure, it's still a repeat item, but a link to my disclaimer page would also be a repeating tag.
This opens up a new worry: would G perceive a repeating image element (like a logo) as duplication?
Any thoughts? Thanks.
| 5:32 pm on Oct 4, 2005 (gmt 0)|
|would G perceive a repeating image element (like a logo) as duplication? |
| 6:36 pm on Oct 4, 2005 (gmt 0)|
Yeah, an image is a good idea; I thought about suggesting it in my previous post. However, screen readers for the visually impaired won't read it unless you copy the text into the alt attribute. And whether Google penalizes long duplicate text in an alt attribute is an open question.
Google won't penalize repeating images; almost every site has plenty of them.
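For the image approach, the accessibility fix above just means putting the disclaimer text in the alt attribute (file name and text hypothetical):

```html
<!-- hypothetical disclaimer image; alt text keeps it readable for screen readers -->
<img src="/images/disclaimer.gif"
     alt="Disclaimer: the information on this site is provided for general guidance only."
     width="400" height="60" />
```

Whether that long alt text then trips the same duplicate filter is exactly the open question.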
| 7:15 pm on Oct 4, 2005 (gmt 0)|
|There might be a bit of controversy if this could be considered black hat, but in fact, it's just client side include, and could be done not only for SEO, but for easier updating, like server side includes |
You shouldn't get in trouble with either - server or client. Server-side would be preferable, imo.
| 7:19 pm on Oct 4, 2005 (gmt 0)|
|You shouldn't get in trouble with either - server or client. Server side would be preferable imo. |
Agreed, but there's the problem of hiding duplicate content. Server-side includes are undetectable to crawlers, which also means only client-side includes can be used to hide duplicate content (like a header, footer, or navigation) from them. Unless you apply cloaked server-side includes, but that would clearly be against Google's guidelines.
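For reference, the server-side version is a standard Apache SSI directive; the page is assembled before the crawler ever sees it, so the footer text is part of the HTML (include path hypothetical):

```html
<!--#include virtual="/includes/footer.html" -->
```

That's why it's "undetectable": the output is indistinguishable from hand-pasted text, which is also why it can't hide anything from a crawler.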
| 7:23 pm on Oct 4, 2005 (gmt 0)|
Dump the disclaimer in an IFrame... problem solved.
Having said that, I doubt it is the cause of your Google woes.
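The IFrame suggestion above would look something like this (URL and dimensions hypothetical), with a plain link as a fallback for browsers that don't support frames:

```html
<iframe src="/disclaimer.html" width="100%" height="80" frameborder="0" scrolling="no">
  <!-- fallback for browsers without iframe support -->
  <a href="/disclaimer.html">Disclaimer</a>
</iframe>
```

The disclaimer stays visible on every page, but the text itself lives in one separate document.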
| 9:59 pm on Oct 4, 2005 (gmt 0)|
>>All of my pages are in the supplemental results. I am worried that G is applying duplicate content penalty to my website.
Hmm... it seems that 90% of websites (pages) are in the supplemental results.
Just check this: it's a simple search for the phrase "Obesity surgery lowers heart risk, US study shows".
You can see that 99% of the pages carry the supplemental result mark, even pages from Yahoo and Reuters :-)