Forum Moderators: Robert Charlton & goodroi
The only thing I can figure out is that both are .com [ other sites in the network have regional TLDs ];
both share the same URL structure for the top three levels, e.g.
www.mydomain.com/ABC/XYZ/123/
www.yourdomain.com/ABC/XYZ/123/
both have the same template system, with sort-results navigation bars across the body content.
Is this enough to trigger one of Google's "duplicate content" or "boilerplate"-like filters?
with sort results navigation bars across the body content
By this, do you mean that search results from a database can be re-sorted into different orders after they are first presented to the user? And is each of those "sorted" URLs also allowed to be indexed? If that's what's happening, then this kind of pattern can indeed create duplicate content problems: shuffling the order does not make each URL's content unique.
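One common way to contain this is to collapse all the sort variants down to a single canonical URL and emit that in a rel="canonical" link tag. Here is a minimal sketch in Python; the parameter names "sort" and "order" are assumptions for illustration, so substitute whatever your site actually uses:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url, drop_params=("sort", "order")):
    """Drop sort-order query parameters so every re-sorted variant
    of a listing page maps to one canonical URL."""
    scheme, netloc, path, query, _ = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in drop_params]
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

# All sort variants of the same listing collapse to one URL,
# which is what you would put in the rel="canonical" tag.
print(canonical_url("http://www.mydomain.com/ABC/XYZ/123/?sort=price"))
# -> http://www.mydomain.com/ABC/XYZ/123/
print(canonical_url("http://www.mydomain.com/ABC/XYZ/123/?sort=name&order=desc"))
# -> http://www.mydomain.com/ABC/XYZ/123/
```

Parameters that genuinely change the result set (a page number, say) are kept, so those pages stay distinct.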
The TLD is not a likely problem, and common infrastructure used to present different source content is definitely not a problem; otherwise every site running a common CMS such as WordPress or Joomla would be in trouble.
However, the categorisation is duplicated and heavily repeated on the page, and each entry is hyperlinked to the next level down or directly to the subject page/URL.
Site A:
My Widget Shop - Location A, Suburb B
Site B:
A Different Shop - Location A, Suburb B
(Location A and Suburb B are the same on both sites.)