aakk9999, thanks for finding that article again. I'd been looking for where I'd referenced it.
if you have more than 100-200 landing/product pages their automation begins to fall apart
Nutterum, your observation about the tool helps a lot. I'm sure you're right for most cases, as the process described in the SEL article would be very hard to automate. I get a strong impression from the article that the process is as much an art as a science, and depends a lot on the specifics of the site being analyzed and on the problem being solved.
In the case of the site used in the article, an important clue for the writer was the naming convention of the page URLs, which were keyword-oriented in a way that allowed some correlation between the targeted pages and the keyword traffic. Hard to say how often that situation will come up, or how well it would hold up over a lot of pages (and, more precisely I think, over a lot of keywords).
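To make the idea concrete: if a site's URLs embed the target keyword in the slug (e.g. /blue-widgets/), you can derive a keyword phrase from each URL and match it against query-level traffic data. This is only a sketch of that correlation trick; the URLs, query figures, and function names below are made-up illustrations, not anything from the article or from any tool's actual export.

```python
def slug_keyword(url):
    """Derive a keyword phrase from the last path segment of a URL."""
    slug = url.rstrip("/").rsplit("/", 1)[-1]
    return slug.replace("-", " ")

def correlate(pages, query_clicks):
    """Map each page URL to the clicks its slug-derived keyword received.

    Pages whose derived keyword has no query data get 0, which is exactly
    the kind of 'not performing' signal the article was hunting for.
    """
    return {url: query_clicks.get(slug_keyword(url), 0) for url in pages}

# Hypothetical example data.
pages = ["https://example.com/blue-widgets/", "https://example.com/red-gadgets/"]
query_clicks = {"blue widgets": 120}

print(correlate(pages, query_clicks))
```

Of course this only works as well as the naming convention is consistent, which is the catch noted above.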
The analysis in that article, btw, was for spotting pages that were not performing... not for correlating all WMT/GSC and Google Analytics data.
I've speculated elsewhere that, since Screaming Frog can now connect to the Google Search Analytics API and pull in various data, it might be useful for connecting Analytics data with other data sources... but I haven't been able to look into that further. It would be very helpful to know what might be done.
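The joining step itself is straightforward once the exports are in hand: page-level GSC metrics (however they were pulled, via Screaming Frog or the API directly) can be merged with Analytics page metrics on the URL path. The field names and rows below are invented for illustration and don't reflect the actual export format of either tool.

```python
# Hypothetical GSC-style page metrics.
gsc_rows = [
    {"page": "/blue-widgets/", "impressions": 900, "clicks": 40},
    {"page": "/red-gadgets/", "impressions": 500, "clicks": 2},
]

# Hypothetical Analytics-style page metrics.
ga_rows = [
    {"page": "/blue-widgets/", "sessions": 55, "bounce_rate": 0.35},
    {"page": "/red-gadgets/", "sessions": 3, "bounce_rate": 0.90},
]

def join_on_page(gsc, ga):
    """Merge two lists of per-page metric dicts on the 'page' key."""
    ga_by_page = {row["page"]: row for row in ga}
    merged = []
    for row in gsc:
        combined = dict(row)
        combined.update(ga_by_page.get(row["page"], {}))
        merged.append(combined)
    return merged

for row in join_on_page(gsc_rows, ga_rows):
    print(row)
```

A page like /red-gadgets/ with 500 impressions but 2 clicks and 3 sessions would jump out of a merged report like this, which is the sort of thing I was hoping the combined data might surface.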