It is different.
Yes and no. Google looks at various kinds of patterns as indicators of coordination, and it regards them as
possibly manipulative. Some patterns are easier to observe and more indicative of spamming than others.
In link building, for example, Google looks not only at anchor-text repetition but also at the rate of link acquisition as an indicator. It also looks at similarity in the text surrounding links as a possible sign of coordinated linking. As your question suggests, there are many more possibilities.
In computational terms, it may or may not be worth Google's while (or dependable enough) to factor every observable pattern into its algorithm. Not all patterns are spam; some are a natural part of link building. When the link building isn't natural, it's generally good practice to avoid extremely frequent repetition of obvious signs that might catch Google's eye.
IMO, repeated keywords in a file path aren't going to give you a large ranking boost, but they are easily observed and look spammy enough that Google is likely to note a possible common source for these articles. We don't know when, or if, Google might do something about it, but I don't think the gain in this case is worth the risk.
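Part of why this pattern is risky is that it's trivial to detect programmatically. As a rough illustration only (the URL, function name, and tokenization rules here are my own made-up sketch, not anything Google has published), here's how easily repeated keywords in a URL path can be counted:

```python
from urllib.parse import urlparse
from collections import Counter
import re

def repeated_path_keywords(url: str) -> dict:
    """Count keyword tokens that appear more than once in a URL's path.

    Tokens are split on slashes, hyphens, underscores, and dots --
    a naive rule chosen purely for illustration.
    """
    path = urlparse(url).path.lower()
    tokens = [t for t in re.split(r"[/\-_.]+", path) if t]
    counts = Counter(tokens)
    return {tok: n for tok, n in counts.items() if n > 1}

# A hypothetical path that repeats the same keyword at every level:
url = "https://example.com/blue-widgets/cheap-blue-widgets/best-blue-widgets.html"
print(repeated_path_keywords(url))  # {'blue': 3, 'widgets': 3}
```

If a few lines of Python can spot the pattern, it's safe to assume a search engine can too, which is the point of the answer above.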
I'd avoid any monolithic strategy.