Forum Moderators: open
How about the code and the page template? Are those the same? And are there elements in the site navigation that are the same, like link text in global navigation and page and/or directory names?
U.S. Patent Office page [patft.uspto.gov]
The patent allows them to compare documents by assigning a number of fingerprints to a given document.
I am not sure what they have already implemented, but at least in the future you might get into trouble.
[edited by: Marcia at 4:19 am (utc) on Feb. 18, 2004]
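To make the fingerprinting idea concrete, here is a minimal sketch of shingle-based duplicate detection. This is an illustration of the general technique, not the actual patented algorithm; the function names and the choice of MD5 and a four-word shingle size are assumptions for the example.

```python
import hashlib

def fingerprints(text, shingle_size=4):
    """Hash every run of `shingle_size` consecutive words into a fingerprint."""
    words = text.lower().split()
    prints = set()
    for i in range(len(words) - shingle_size + 1):
        shingle = " ".join(words[i:i + shingle_size])
        prints.add(hashlib.md5(shingle.encode()).hexdigest())
    return prints

def similarity(a, b):
    """Jaccard overlap of the two fingerprint sets (1.0 = identical)."""
    fa, fb = fingerprints(a), fingerprints(b)
    if not fa and not fb:
        return 1.0
    return len(fa & fb) / len(fa | fb)
```

Two pages built from the same template with only the navigation swapped out would share most of their fingerprints, which is why templated affiliate sites score as near-duplicates even when a few elements differ.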
How about the code and the page template? Are those the same? And are there elements in the site navigation that are the same, like link text in global navigation and page and/or directory names?
They are all going to be running the same template, but the nav options are all different. The directory structure is all the same.
Are the sites still in google and ranking well?
Currently we are blocking those pages from being indexed via the robots.txt file until we know for sure that it will not adversely affect the parent site, or the affiliates.
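For reference, blocking a section like that in robots.txt looks something like this (the /syndicated/ path is a placeholder, not the actual site's layout):

```
User-agent: *
Disallow: /syndicated/
```

Note that Disallow only stops compliant crawlers from fetching those URLs; a URL can still appear in the index if other sites link to it.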
I would also recommend using different titles and meta tags.
They all have unique title tags, and we don't use meta tags due to their lack of SEO value.
I would also highly recommend not using the same server and IPs. That is another way duplicate pages can be detected.
Unfortunately, this is not possible. It is all run off the same backend on the same server.
It should help for you to know my purpose in making these syndicated pages available to search engines. Basically, the affiliate sites are not being deep crawled, and one of the suggested reasons is the lack of PR and backlinks from other sites. So what I want to do is make a section of a few web directory pages where every one of the affiliate sites directly and legitimately links to the other 250 sites, hopefully boosting PR. What do you think of this idea, and what would the results of implementing it be? Good or bad?
One objective of gaining good PR is to get your site recognised as having authority on a subject. If you dilute that across several duplicate sites and domains, the whole exercise dilutes the authority you are attempting to gain in the first place.
In days of old we used to split our results across several domains (identical IP or not); however, the G algo soon illustrated that all we were doing was dissolving our own relevance.
We now concentrate on one domain and work it really hard with on-topic content. A far better model - less work - very consistent, and G loves it.
8>)Sunny