I HAD a website with a long menu:
Car lawyers in Florida
Car lawyers in Michigan
Car lawyers in Georgia
(Around 50 states)
As you can see, we have around 50 pages sharing duplicate content: THE STATE MENU. Each page has:
Unique Title tag
Unique Meta tag description
Unique text body of around 150 to 200 words (only a few words)
The pages are about one month old. We got traffic before Christmas, but now many pages have disappeared from the Google index and many show up as Supplemental results.
I used < an online tool > to measure the percentage of duplicated content, and got around 78% to 80% duplication for these pages when comparing Car lawyers in Florida with Car lawyers in Michigan.
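For anyone curious, here is a rough sketch of how such a percentage could be computed: word-shingle overlap between two pages. I have no idea what the tool actually does, so treat the method, the shingle size, and the placeholder page strings as assumptions, not a real implementation.

```python
# A minimal sketch, assuming a duplicate checker compares word shingles.
# This is NOT any known tool's method; the page strings are placeholders.
import re

def shingles(text, k=5):
    """Set of overlapping k-word windows from the page text."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 0))}

def duplicate_percentage(text_a, text_b, k=5):
    """Jaccard similarity of the two pages' shingle sets, as a percentage."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    return 100.0 * len(a & b) / len(a | b) if (a or b) else 0.0

florida = "Car lawyers in Florida ..."    # full page text would go here
michigan = "Car lawyers in Michigan ..."
print(f"{duplicate_percentage(florida, michigan):.0f}% shared content")
```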
Well, I have now got rid of the "STATES MENU", and the percentage of duplicated content is around 40%. I will see if the situation changes…
I have other pages with around 19% duplicated content, and those pages are indexed fine on Google.
I have also checked another website's pages that have around 80% duplicate content, but those are old pages with PR3 or PR4.
Does anyone know what percentage of duplicated content is acceptable?
<Sorry, no tool names. See Forum Charter [webmasterworld.com]>
[edited by: tedster at 12:28 am (utc) on Jan. 8, 2007]
1. No-one knows for sure what is 'acceptable'
2. No-one knows for sure what it's a percentage of: all the visible content? All the code? Code minus JS, etc.? We don't know.
3. In view of the above, it might be a tad unwise to put any reliance on a 'tool', which may (or may not) use methods similar to Google's (see the sketch at the end of this post).
But if you apply the Golden Rule - "content is king" - and you build your site for visitors, as Google strongly advises, then common sense says you really do not need a figure; your visitors will be bored / irritated / confused if you duplicate too much, and avoiding unhelpful duplication isn't difficult.
I'm sorry if that's not what you wanted to hear ... but just try conversing with someone who constantly repeats themselves ... you'll soon be planning your escape ;)
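To put a concrete face on point 2: the figure depends entirely on what you feed the comparison. Here is a crude sketch of one possible pre-processing step (my own assumption, not how Google or any tool is known to work) that strips scripts, styles, and tags, so that "visible text" and "raw HTML" would give quite different numbers.

```python
# A crude sketch: the same pages score differently depending on whether
# you compare raw HTML or only visible text. This naive extraction is an
# assumption, not a documented behaviour of Google or any checking tool.
import re

def visible_text(html):
    """Drop <script>/<style> blocks, then all remaining tags."""
    html = re.sub(r"(?is)<(script|style)\b.*?</\1>", " ", html)
    return re.sub(r"(?s)<[^>]+>", " ", html)

page = '<html><script>var x = 1;</script><p>Car lawyers in Florida</p></html>'
print(visible_text(page).split())  # -> ['Car', 'lawyers', 'in', 'Florida']
```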
re: wanting specific percentages (re: duplicate content, boilerplate stuff, etc.)
There aren't any. Again, too many variables.
Take my lawyers directory example: I'm showing "Lawyers" in a city called Astor in Florida. (I don't know whether Astor really is a small city, but in my example I use this beautiful city as a small town.)
I looked around the yellow pages (paper), the internet, and many other resources; maybe I even went to Astor to find information. Oops! Big problem: I found just two lawyers. OK, fine! I wrote a small summary about these lawyers, around 100 to 150 words in total, covering things like the address, phone number, skills, clients, etc. (UNIQUE CONTENT).
For people living in Astor, this information is "short but gold content", right? Why? Because I'm unique on the internet, and before Christmas I ranked first! Visitors can get the information they need to call about an accident, etc.
But now I have a nice but long MENU showing my 50 states, and this menu is duplicated on every page, for every state and city…
Google says: hey you! You have no quality content; you have around 80% duplicate content on your pages! Off to the Supplemental results you go! My small page cries and cries, my visitors cry, and I am crying LOL.
This is my example about useful but short content.
Solution: maybe I should get rid of the menu, or try to do it with Flash :)
Sorry, sometimes I am crazy. :)
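Just to put rough numbers on my own example (every word count below is a guess, not a measurement from the real site):

```python
# Back-of-envelope arithmetic for the Astor example. All word counts are
# assumptions; the actual pages were never measured this way.
menu_words = 50 * 4   # 50 links like "Car lawyers in Florida" (~4 words each)
body_words = 150      # the unique Astor summary
shared = menu_words / (menu_words + body_words)
print(f"~{shared:.0%} of the visible words are identical menu text")
# ~57% from the menu alone; shared headers, footers, and markup would
# push the overlap toward the ~80% the tool reported.
```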
Repeating 'car lawyer' 50 times is pointless, unsightly and positively suicidal, of course!
Well, that's subjective, not a point of fact.
However, it may well be an example of how you can no longer design just for users (if you ever could) and ignore the search engines. In the old days, if the algo decided that menu was spam, it would most likely just ignore it for ranking purposes. Currently it is most likely to cause a drop in rankings if the algo dislikes it.
Well, that's subjective, not a point of fact.
But of course; I make no claim to know the internal details of the Google algo.
But it does not take rocket science to recognise that
Car lawyers in Florida
Car lawyers in Michigan
Car lawyers in ... x50
is not the way to go.
Unless, of course, you do know the internal details of the Google algo. I'm guessing your comments are equally subjective, as are the comments of every member here when discussing the internal details of the Google algo?
When it comes down to it, we are ALL making best guesses based on our knowledge and experience. But why do we need this conversation at this time, in this thread?
[edited by: tedster at 11:51 pm (utc) on Jan. 8, 2007]
...for 50 rows.
The SEs may treat a repetition of "car lawyers" 50 times as mere template/boilerplate, or they may penalize you for "keyword stuffing" or the like. In any event, I don't think it does you any good, and some members think it harms the site by reducing its credibility: it makes the site look like it was designed for SEO purposes rather than for users.
There was a thread a few months back directly on this same issue; it used "hotels" or some other travel-related keyword as the example. You might want to look for that thread and read the comments.