

Google News Archive Forum

    
Does Google Penalise Replicated Pages?
How similar must pages be to be penalised?
LukeC
10+ Year Member
Msg#: 15426 posted 9:15 am on Jul 21, 2003 (gmt 0)

Excuse me for being a complete newb on this issue.

I have read that unless you write completely fresh content on each new page, Google will penalise you for having duplicate content.

Does anyone know to what extent you can get away with changing a few keywords here and there? Or is Google clever enough to notice that you have only swapped the main keywords when creating a new page, and to treat it as spam (if it is spam?) and penalise you?

This intrigues me: on the one hand I can see how it would be spam, and it would make sense for Google to prevent people from creating millions of very similar pages; but on the other hand, surely there will be large chunks of information you might want on every page, because they are genuinely relevant to every page.

Also, I sometimes see people doing this who seem to be getting away with it.

 

lazerzubb
WebmasterWorld Senior Member 10+ Year Member
Msg#: 15426 posted 9:25 am on Jul 21, 2003 (gmt 0)

If it's 100% the same, the most likely outcome is not a penalty: Google will just pick one of the pages and drop all the others (the ones which are exactly the same).

I can't remember which paper it was (it might have been an AltaVista paper) that quoted how many pages on the web were duplicates - I remember it being quite a large proportion.

If you just change the title or something, there is a good chance that Google will either drop the pages because of duplicate filters, or index them but penalise them, so they will have big trouble ranking for any keyword.
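For the curious, the kind of filter lazerzubb describes is usually sketched in the literature as word shingling plus a resemblance (Jaccard) score - quite possibly the AltaVista-era paper he half-remembers is the syntactic-clustering work that measured web-wide duplication. Below is a minimal illustrative sketch that assumes nothing about Google's real implementation; the function names and behaviour at any threshold are made up for the example.

import re

def shingles(text, k=5):
    """All overlapping k-word windows of the page text, as a set."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 0))}

def resemblance(page_a, page_b, k=5):
    """Jaccard similarity of the two shingle sets: 1.0 means identical text."""
    a, b = shingles(page_a, k), shingles(page_b, k)
    return len(a & b) / len(a | b) if (a or b) else 1.0

original = ("our widgets are hand made in small batches and shipped "
            "worldwide with a two year warranty and free returns included")
exact_copy = original
keyword_swap = original.replace("widgets", "gadgets")

print(resemblance(original, exact_copy))    # 1.0 -> an engine keeps one copy, drops the rest
print(resemblance(original, keyword_swap))  # ~0.8 -> a keyword swap still looks near-identical

The point of the sketch is the second number: swapping one keyword on an otherwise unchanged page barely moves the score, which is consistent with the "just changing the title won't save you" experience reported in this thread.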

johnser
WebmasterWorld Senior Member 10+ Year Member
Msg#: 15426 posted 10:02 am on Jul 21, 2003 (gmt 0)

What happens if every page is different by 20% or so?
J
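There is no standard way to quantify "different by 20%", but as a rough yardstick you could compare two pages with Python's standard difflib. This is one arbitrary measure among many, with no suggestion that Google computes anything like it; the page strings are hypothetical.

# One arbitrary way to put a number on "20% different": the inverse of
# difflib's similarity ratio. Purely a yardstick, not any engine's metric.
from difflib import SequenceMatcher

def percent_different(text_a: str, text_b: str) -> float:
    return (1.0 - SequenceMatcher(None, text_a, text_b).ratio()) * 100

page_a = "widget reviews and prices for the uk market, updated weekly"
page_b = page_a.replace("uk", "us")  # the keyword-swap approach discussed above
print(f"{percent_different(page_a, page_b):.1f}% different")  # low single digits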

caine
WebmasterWorld Senior Member 10+ Year Member
Msg#: 15426 posted 10:08 am on Jul 21, 2003 (gmt 0)

Why would you want pages that are only 20% different? It would dilute the content of the pages. I've long had a theory that if you have too many pages that are similar, but not penalised by G, then G will spread across all of those pages the weighting you would prefer to have concentrated on one.

LukeC
10+ Year Member
Msg#: 15426 posted 10:08 am on Jul 21, 2003 (gmt 0)

"If you just change the title or something"

Thanks for your reply; that seems a fair way for Google to deal with spammers.

But I wonder to what extent you need to change new pages - though it's probably a case of how long is a piece of string. If Google seriously penalises you when it thinks you are spamming, then it is not worth experimenting.

What I did was change the title, the headings and the main keywords. Google gave all the new pages good listings, but then with a new freshbot it would drop them, then include them again, then drop them again.

I concluded that either Google is just playing around with my site because it is fairly new, or it could still see that tracts of those pages had fairly similar content and so was penalising them.

Ultimately I suppose I need to stop being lazy and write totally unique content for each page.

chiyo
WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member
Msg#: 15426 posted 10:18 am on Jul 21, 2003 (gmt 0)

I'm with caine. I'm not sure why you would need pages which have 80% common content. Are your pages too short, or is it really necessary to have that many pages - could you consolidate them into fewer pages instead?

I'm of the view, with little hard evidence, that Google just ignores "common text" repeated on all pages for indexing purposes, but still counts it when working out what the focus of a page is for ranking purposes. If that is the case, from an SEO point of view you are just diluting each page's chance of competing for the keyword(s) you are shooting for.

I would look at the site structure and work out what you are doing wrong that means having to repeat so much content on different pages.
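chiyo's guess - that site-wide "common text" is simply ignored at indexing time - would look something like this in miniature. This is pure speculation about Google; a verbatim line-for-line match is far cruder than whatever a real engine does, and the example site is invented.

# Speculative sketch of ignoring "common text" before indexing: drop any
# line that appears verbatim on every page of the site. Illustrative only.
def strip_common_text(pages):
    """pages: list of page texts. Returns copies minus site-wide common lines."""
    line_sets = [set(page.splitlines()) for page in pages]
    common = set.intersection(*line_sets)
    return ["\n".join(line for line in page.splitlines() if line not in common)
            for page in pages]

site = [
    "ACME Widgets - About Us\nBuy widgets online since 1999\nOur story...",
    "ACME Widgets - Blue Widgets\nBuy widgets online since 1999\nBlue specs...",
    "ACME Widgets - Red Widgets\nBuy widgets online since 1999\nRed specs...",
]
for page in strip_common_text(site):
    print(repr(page))  # the shared tagline is gone; each page's unique lines remain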

johnser
WebmasterWorld Senior Member 10+ Year Member
Msg#: 15426 posted 10:30 am on Jul 21, 2003 (gmt 0)

Hi guys - two cloaked sites, one .co.uk and one .com.

I'm trying to make sure the second site doesn't get dropped, so I need to differentiate it sufficiently so as not to trip a dup filter. Each of the hundreds of pages (on both sites) is optimised for a particular phrase.
J

peewhy
WebmasterWorld Senior Member 10+ Year Member
Msg#: 15426 posted 10:45 am on Jul 21, 2003 (gmt 0)

You can easily spend as much time trying to make them 20% or so different as you can making them completely different... so make them completely different. Don't forget that search engines don't see an image; they only know there is one - so that bit isn't too much of a problem.

Body copy, title and description need to be different. The point is that you increase your chances of more business: if you treat the pages as different, so will search engines and directories - take advantage of that.
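A quick self-audit in the spirit of peewhy's advice would be to flag any title or meta description that repeats across your pages. The regex scraping below is deliberately naive (a real audit would use a proper HTML parser), and all the page names and markup are hypothetical.

# Naive self-audit: report titles and meta descriptions that appear on more
# than one page. Regex HTML parsing is fragile; this is only a sketch.
import re
from collections import defaultdict

def audit(pages):
    """pages: dict of {url: html}. Prints any duplicated title/description."""
    seen = defaultdict(list)
    for url, html in pages.items():
        title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
        desc = re.search(
            r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
            html, re.I | re.S)
        for kind, match in (("title", title), ("description", desc)):
            if match:
                seen[(kind, match.group(1).strip())].append(url)
    for (kind, text), urls in seen.items():
        if len(urls) > 1:
            print(f"duplicate {kind} {text!r} on: {', '.join(urls)}")

# Hypothetical pages: two share a title, which the audit flags.
audit({
    "/widgets-blue.html": "<title>Widgets</title>",
    "/widgets-red.html": "<title>Widgets</title>",
    "/about.html": "<title>About ACME</title>",
})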

caine
WebmasterWorld Senior Member 10+ Year Member
Msg#: 15426 posted 10:50 am on Jul 21, 2003 (gmt 0)

If a 20% change is the target, then spend 19.9% of it changing the content, its distribution around the page, the navigational layout, and the structure of the page; the title and metas are not so important when trying to get around the spam filters.

johnser
WebmasterWorld Senior Member 10+ Year Member
Msg#: 15426 posted 11:05 am on Jul 21, 2003 (gmt 0)

Doing a <find & replace> to insert an extra 3 paras is so much easier though ;)

(There are no images on the pages)

OK, point taken. Nav & structure it is.
Thx folks
J

Toasted
10+ Year Member
Msg#: 15426 posted 9:31 pm on Jul 21, 2003 (gmt 0)

There are plenty of legit reasons to have a page that is only 20% different or so - think of all the sites that offer free content... grab an article off one of these sites and you've basically got duplicate content with plenty of other sites, the only difference being your page setup, navigation, etc.

While using free content is a bit of a lazy way out, it's a great site promotion method for the person who writes the article...

...and what about press releases? Plenty of them need to be replicated across multiple sites (though it's doubtful these are optimised for SEs anyway).

SEO practioner
10+ Year Member
Msg#: 15426 posted 10:20 pm on Jul 21, 2003 (gmt 0)

It's always important to refresh the content on a page, be it by 5%, 10%, 20% - whatever. New or fresher content is just that: it creates better value for your users.

chiyo
WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member
Msg#: 15426 posted 12:48 am on Jul 22, 2003 (gmt 0)

>>get an article off one of these sites, and you've basically got duplicate content with plenty of other sites, only difference being your page setup, navigation etc. <<

I can understand that copying articles like this is OK for magazines or books, where distribution is limited, but on the web, with the tiniest of exceptions, each page is available to everyone, and by linking it can be found quickly.

Why gum up the web with multiple copies of the same article when a simple link will do? (Err... on second thoughts, don't answer that... I think I know!)

If I were a search engine, these duplicate filters would be important, as otherwise we would end up presented with SERPs full of the same article, with only the pretty bits on the side to differentiate them. An incredible waste of time on all sides.

>>what about press releases?<<

There is absolutely no reason these need to be on more than one page - or just a few, if heavy traffic is expected. The obvious place is either the site of whoever released it, or their news release aggregator. Again, a simple link will do the job.

Toasted
10+ Year Member
Msg#: 15426 posted 1:01 am on Jul 22, 2003 (gmt 0)

Yep - that's fair enough, I guess; it would be pretty poor if the top 10 places in a SERP were all the same article, just at different sites.

But the SEs still need to be careful how they handle this - these are still legitimate pages. I don't agree with just putting in a link to an article (a lot of the time that's a good idea, but it doesn't apply all the time) - why should a user have to go to 10 different sites to find the info they are looking for? If a site has a good mix of original content plus a few articles (used with permission) from other sites, that creates a far better user experience than needing to jump across several sites.

Also, press releases are specifically written to be distributed and shown in as many places as possible - again, in many cases a link would suffice, but in many cases it would not.

Yes, the SEs need to protect against showing duplicate results in their SERPs, and yes, they need to penalise and protect against spammy doorway pages etc., but they also need to take into account legit reasons for duplicate content and ensure there are no ongoing effects across the rest of the site (the duplicate page itself not ranking is fair enough)...

I'm pretty sure this is how it works, but most of the questions here seem to be "if I duplicate some content, will my whole site be penalised?" - and that is something I don't think should happen... As long as any penalty is kept to the individual pages, I think that adequately serves all purposes.

Herenvardo
10+ Year Member
Msg#: 15426 posted 3:16 pm on Jul 23, 2003 (gmt 0)

I want to ask a new question:
In the rankings for my main keywords, I'm always competing with another site at www.competitor_domain.com. Looking a bit lower in the results, I can see a page with the URL www.competitor_domain.net.
I've visited both pages and they were IDENTICAL - the same content, format, etc.
Is this spamming? If so, can I report it to Google and get the competitor removed?
I don't like playing dirty (like publicising a competitor's dark side); but if they play dirtier...
Thanks
Herenvardo

[edited by: Herenvardo at 3:21 pm (utc) on July 23, 2003]

Herenvardo
10+ Year Member
Msg#: 15426 posted 3:21 pm on Jul 23, 2003 (gmt 0)

Hi! It's me again.
I have another question; this one is more about redirection than duplication:
There is a page with the URL www.domain.org. Lower in the results appears a page with the URL www.otherdomain.com. When I visit otherdomain.com, there is only a big text-and-image link to domain.org.
Is this spamming?
Thanks
Herenvardo
