
Are duplicate pages formatted for printing penalty targets?


Jon_King

12:10 am on Feb 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have a good site where the PR has dropped to 0.

The only thing I can figure, besides a possible server outage during a crawl, is that many pages on the site are almost exact duplicates with 'tighter tables' so as to fit a standard page when printing. These are linked from each product page.

Is this a problem for Google?

Macguru

12:15 am on Feb 1, 2003 (gmt 0)




Hi Jon_King,

I don't know about getting PR 0 for this, but similar content in tighter tables is risky.

I keep mine in a disallowed folder and have had no problems yet.
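For reference, a minimal robots.txt sketch along those lines, assuming the print-formatted pages are gathered in a hypothetical /print/ folder (the folder name is illustrative, not from the thread):

```
# Hypothetical robots.txt at the site root.
# Assumes all printer-friendly duplicates live under /print/.
User-agent: *
Disallow: /print/
```

This blocks crawling of everything under that folder for compliant spiders, so the duplicates never get fetched in the first place.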

>>besides a possible server outage during a crawl

Hosted on M$ garbage? ;)

The idea is that if the body text is similar, it's almost the same page to indexers.

Jon_King

1:44 am on Feb 1, 2003 (gmt 0)




Macguru,

It's real easy to hide them in a folder, so I will do that. As for M$ garbage... yes, it is IIS. Oh well.

For the occasional link I don't want spidered, haven't I read about JavaScript links doing the same thing?

Macguru

1:48 am on Feb 1, 2003 (gmt 0)




AFAIK, spiders won't follow links 'cloaked' in JS. Also, if you don't want certain pages to be indexed, you can use the robots.txt file or the robots meta tag.

The Contractor

2:13 am on Feb 1, 2003 (gmt 0)




The easiest way to keep duplicate pages formatted for printing out of the index is simply to add: <meta name="robots" content="noindex,nofollow">
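A sketch of where that tag goes, on a hypothetical printer-friendly page (everything except the robots meta tag is illustrative):

```
<!-- Hypothetical print version of a product page -->
<html>
<head>
<title>Widget X - printer friendly</title>
<meta name="robots" content="noindex,nofollow">
</head>
<body>
<!-- tighter-table print layout here -->
</body>
</html>
```

Note the tag must appear in the <head> of each print page; unlike robots.txt, the spider still has to fetch the page to see it.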

Jon_King

2:26 am on Feb 1, 2003 (gmt 0)




As I thunk myself upside the head, thanks Contractor.