
Duplicate content and robots.txt

Can a robots.txt disallow prevent duplicate content?

     
8:12 pm on Jan 23, 2005 (gmt 0)

Senior Member from FR 

WebmasterWorld Senior Member henry0 is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 19, 2003
posts:4449
votes: 11


A “print page” duplicate-content question:

Site: PHP and MySQL, using object-oriented programming.
The print page is dynamically generated.

Without debating print-page techniques, I would like to be sure that, if I disallow the print page in robots.txt, my print page will not be considered duplicate content.

Is the print page needed? Absolutely; it is a major factor in this site. I have a program that, upon payment, delivers a set of pages with answers tailor-made to each user’s specific requirements.
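
For reference, the disallow rules I have in mind would look something like this (the paths below are placeholders; the real site has its own print-page URLs):

    # Placeholder paths; substitute the site's actual print-page URLs
    User-agent: *
    Disallow: /print/
    Disallow: /print.php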

<<< edit
While thinking about it, I am wondering whether there is any duplicate content at all.
Actually, I do not think so,
since each page will have the same paragraph headers
but content based on each user's requirements,
so possibly I should not worry?
/edit>>>
Thank you

Regards

Henry

9:15 pm on Jan 24, 2005 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 9, 2000
posts:26471
votes: 1080


Correct, there is no real need to worry. In any case, you don't really want visitors coming in from the search engines and landing on the print page.

9:56 pm on Jan 24, 2005 (gmt 0)

Senior Member from FR 

WebmasterWorld Senior Member henry0 is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Apr 19, 2003
posts:4449
votes: 11


Thank you very much
I can now move forward

Regards

Henry

3:01 pm on Jan 25, 2005 (gmt 0)

Full Member

10+ Year Member

joined:Feb 19, 2004
posts:207
votes: 0


Very interesting idea. I have a site that contains affiliate links and AdSense. I want to provide the same content, without the commercial stuff, to a small group of people in a listserv I belong to. No commercial stuff is allowed there.

The site is making money, so I don't want to take the ads off. I just want to copy it to a subdomain without the ads and tell my colleagues the URL.

If I disallow robots on every page of the copy, can I escape the duplicate-content penalty?
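
I assume the robots.txt at the root of the copy's subdomain would just be something like this (each subdomain is served its own robots.txt, so the main site's file is unaffected):

    # robots.txt on the ad-free subdomain only
    User-agent: *
    Disallow: /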

8:18 pm on Jan 25, 2005 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 9, 2000
posts:26471
votes: 1080


Yes, because the search engines will be banned from spidering the duplicate.

If they can't spider it, they can't run any dupe page checking.

9:06 pm on Jan 26, 2005 (gmt 0)

Full Member

10+ Year Member

joined:Feb 19, 2004
posts:207
votes: 0


Thanks, engine. I was thinking that would be the case, but I wanted to hear it from an expert.