There's a reasonable consensus that SE algos/filters appear to be - or ought to be - filtering out or down-weighting sites that are mere copies of one another. What good is it to searchers to be offered up the next copy - the 238th copy - of an affiliate site?
There are no doubt other factors that come into play: first to appear in the SERPs, most IBLs, unique IP, ccTLD, hosting/IP within country, whether the searcher chooses to localize their search to a country, etc. There's no clear answer for the masses to your specific question, though consensus suggests "don't do it" - don't make duplicate sites. Those who have earned the insights needed to make well-informed decisions have spent thousands of dollars and many hours testing hundreds, if not thousands, of domains/sites to see how small changes affect SE "digestion". That type of insight is not shared in public forums. If you want the best answer you have to run tests - lots of them.
The bottom line: SEs have an interest in not serving up multiple copies of the same material in their SERPs, and that dictates that something - an algo, a filter, etc. - is needed to sort out the multiple copies.
I'd vote for tailoring the material a bit for the two sites.
If all you are looking for is type-in traffic, then you can have all the duplicate domains you want.