There have been a few earlier mentions of Lighthouse from the developer’s side, but none have addressed the question:
Is there any reason not to block the UA by name? Normally the “Chrome-Lighthouse” token sits in my blocked-robots list, out of sight, out of mind. But recently I got a blizzard of visits that looked fully human and at the same time highly suspect (the description is intentionally vague, but you know what I mean), from a tag team of
Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3694.0 Safari/537.36 Chrome-Lighthouse
and
Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3694.0 Mobile Safari/537.36 Chrome-Lighthouse
Both came from the googloid ranges 66.249.81 and ..84; the first few arrived with an X-Forwarded-For header that shed no light (haha).
They caught my attention because they requested the same page, with its supporting files, a total of 46 times in the course of an hour. This strikes me as excessive. There were also multiple requests for robots.txt, where they would have seen a comprehensive Disallow.
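For anyone who does decide to block it, a rough sketch of the two options under discussion. The robots.txt stanza only works if the tool actually honors it (which, given the repeat visits described above, is doubtful); a server-side rule refuses the request outright. The Apache snippet assumes mod_rewrite is enabled and an .htaccess context; adjust for your own server.

```
# robots.txt — advisory only; effective only if the agent obeys it
User-agent: Chrome-Lighthouse
Disallow: /
```

```apache
# .htaccess sketch (assumes Apache with mod_rewrite available)
RewriteEngine On
# Match the "Chrome-Lighthouse" token anywhere in the UA, case-insensitively
RewriteCond %{HTTP_USER_AGENT} Chrome-Lighthouse [NC]
# Return 403 Forbidden for any request from that agent
RewriteRule . - [F]
```

If the audit tool ignores robots.txt, the server-side rule is the only one of the two that actually bites.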