Forum Moderators: Robert Charlton & goodroi
think about SEs only when you buy links.
That actually made me laugh out loud :)
Duplicate content filtering seems to take two shapes.
1. The stuff that algo finds.
2. The stuff a human reviewer finds.
Older sites seem to suffer more from type 2 filtering since they've been around longer, and competitors have filed complaints against them.
I have a competitor who has 4 sites, all for the same product line, but "creatively named" to look different. The site author uses frames, keyword loading, and cross-linking between all 4 sites. He also hosts them on different services to make them appear different. Pretty tricky, eh?
So far he has not been discovered. It's been a few months now and his secondary sites are being pulled WAY up into page 1 results. In several SERPs, he ranks #1, #3 and #5. The first two are the primary domain; the #5 is his other product (same product line) site. IMHO, these are poor results (not for him though) and are a manipulation of the search engines. However, it continues to pass the algo and go unchecked.
I figure in time, he'll hang himself.
Any similar situations out there?
I was wondering if people report bad behaviour to the IRS with the same energy.
I'm not reporting him. As I said, he'll hang himself. I can guarantee I've been reported. Hence my 5-year #2 Yahoo rank earning an overnight demerit to -30. That was over a year ago. I had two sites, a sales site and a private customer site. At the time, it was OK to do, and even though it was not (IMHO) duplicate content, anyone could make a case that it was, if they wanted to be really picky.
I've long since merged the two sites and discarded the extra. Yahoo seems to have hard coded a penalty and stuck me with an outdated DMOZ title that not even a "noodp" can correct.
BTW, I'm sure people "squeal" on their competitors, neighbors and even friends if it stands to benefit them or their ego. It's become a twisted world, if you have not already noticed.
Either you don't understand the problem, or I don't understand your solution.
Let's say I have 10 different products in the widget category. Each of those product pages contains a unique product description and the widget category description. You're saying I should noindex 9 out of 10 of my widget pages?
9 out of 10 of my widget pages?
Yes, if they are exactly the same. Then do everything you can to optimize the one you are still going to have indexed. Good title, good meta description, some good inbound links if you can get them and so on.
You don't really need to have 9 pages that are the same, or almost the same, indexed anyway. You need one page with the information to rank well for the product or whatever you are featuring.
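For anyone following along: the standard way to keep a page out of the index while still letting its links be followed is a robots meta tag in the page head. A rough sketch of applying that advice - keep one canonical page indexed, noindex the near-duplicates - might look like this (the filenames are made up for illustration):

```python
# Sketch: mark all but one of a set of near-duplicate product pages
# for noindexing. The robots meta tag itself is standard HTML;
# the page list here is a hypothetical example.
NOINDEX_TAG = '<meta name="robots" content="noindex, follow">'

pages = ["widget-red.html", "widget-blue.html", "widget-green.html"]
canonical = "widget-red.html"  # the one page we optimize and keep indexed

for page in pages:
    tag = "" if page == canonical else NOINDEX_TAG
    print(page, "->", tag or "(indexed)")
```

The "follow" in the content attribute lets crawlers still pass through the page's outbound links even though the page itself stays out of the index.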
You've got it exactly right. Sorry to all if I wasn't clear. Each of the 10 pages in my example is for a unique product. Each of those products is in the same product category, so each page includes the category description. The duplicated descriptions are separated a bit from each page's main description.
The category description needs to be included on those pages for the sake of informing my customers, but I don't use frames, javascript, text in images, or pop ups at all and I really don't want to start. It seems like this is a valid, useful thing to be doing and I don't think a human at Google would penalize me for it. The algo apparently hasn't gotten upset over it so far either.
Not sure what to do about this yet.
I did what Annej has already advised you to do: I built a strong 'index' page that included widget product info along with classification/category info. I noindexed all the individual product pages - about 99% of the directory.
It has been one of the most successfully ranked product sections of dozens I've made over the past 5 years. I continue to add new categories using exactly this method.
This seems problematic to you, but I don't understand why it should be. Your alternative - given that you don't like the suggestion of using iframes or art-text and the like - is to just wait and let the duplicate ogre get you. There seems to be agreement that your category description IS duplicate content. The unknown factor is the percentage of the total page content.
So you're suggesting I noindex all of my product pages and draw organic SE traffic with my category pages only? At this point, a good portion of my organic SE traffic lands on my product pages, possibly a majority. Sounds like a crazy idea to be honest, but is that what you did and it worked for you?
The duplicated category description definitely makes up the majority of the content on each of my product pages.
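For what it's worth, the "what percentage of the page is duplicated" question can be roughly estimated by matching each page against the shared category block. A minimal sketch using Python's standard difflib (the strings here are made-up examples, not anyone's real copy):

```python
from difflib import SequenceMatcher

def duplicate_share(page_text, boilerplate):
    """Rough fraction of page_text that matches a shared block,
    e.g. a category description repeated across product pages."""
    matcher = SequenceMatcher(None, page_text, boilerplate, autojunk=False)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return matched / max(len(page_text), 1)

# Hypothetical example: a short unique blurb plus a repeated category blurb.
category = "Widgets are great for everything."
page = "Acme Widget 3000. Unique specs here. " + category
print(round(duplicate_share(page, category), 2))
```

If the result is well over half, that matches the poster's description that the category text is the majority of each product page.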
BTW, how did you know you had been penalized for duplicate content?
Do you actually have a problem with your site? Or is this more of a theoretical worry?
I don't think I have a problem at this point. I am getting traffic, but I thought the duplicate content could be lowering my rankings. Would a penalization look more like a huge drop in referrals than a somewhat lower ranking?
I don't want to tempt the "duplicate ogre", but you say it's pretty rare?