I'll begin by disagreeing: you can use JS, provided that all links are added to the page before the page finishes loading (the "load" event) and the page loads quickly. Googlebot will "typically" see these links and be able to crawl them.
Now for the agreeing: for non-critical content I would be happy relying on the "typically" above, but menu links are critical, and even if it would probably be fine, I would always worry that maybe it isn't. So despite what I said, I would not "program my menu in JS".
Let me add that the statement "programmed in Java[Script]" is very vague.
What is the JS used for? Typically, JS is added to menus by fancy web designers to make pull-down menus scroll smoothly and to add spiffy hover effects. If that is the case and all the menu links are hard-coded in the HTML, I would not be too worried. But if the JS is used to dynamically populate the menu links, then you need to be concerned.
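To illustrate the difference (a hypothetical sketch, not taken from your site):

```html
<!-- Safe: the links are hard-coded in the HTML, and JS only adds cosmetics.
     Googlebot can see these links without ever rendering the page. -->
<nav id="menu">
  <ul>
    <li><a href="/products">Products</a></li>
    <li><a href="/about">About</a></li>
  </ul>
</nav>
<script>
  // Purely cosmetic: animate the markup that is already there.
  document.querySelectorAll('#menu li').forEach(function (li) {
    li.addEventListener('mouseover', function () {
      li.classList.add('hover-effect');
    });
  });
</script>

<!-- Risky: the HTML contains no links at all; JS injects them on load.
     These links exist only if (and when) Googlebot renders the page. -->
<nav id="menu2"><ul></ul></nav>
<script>
  var links = [
    { href: '/products', text: 'Products' },
    { href: '/about', text: 'About' }
  ];
  document.querySelector('#menu2 ul').innerHTML = links
    .map(function (l) {
      return '<li><a href="' + l.href + '">' + l.text + '</a></li>';
    })
    .join('');
</script>
```

In the second case, a crawler that indexes only the raw HTML sees an empty `<nav>`.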
When Googlebot crawls a page, it does not immediately render the JS; it simply collects the raw HTML and indexes it. Googlebot will then render the page at some later point, if it deems that necessary. This can take weeks, or may never happen at all. Google recommends that if you are serving client-side rendered content, you serve a second, server-side pre-rendered version of the page to Googlebot (and other bots).
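That bot-specific serving can be sketched roughly like this (the user-agent substrings and template names are my own illustrative assumptions, not a Google-specified list):

```javascript
// Minimal sketch of "dynamic rendering": detect bots by user-agent and
// hand them a server-side pre-rendered page with hard-coded menu links.
const BOT_PATTERNS = ['Googlebot', 'Bingbot', 'DuckDuckBot'];

function isBot(userAgent) {
  // Crude substring match; a real setup would use a maintained bot list.
  return BOT_PATTERNS.some(function (p) {
    return userAgent.includes(p);
  });
}

// In a (hypothetical) request handler you would branch on the result:
function chooseTemplate(userAgent) {
  // Bots get pre-rendered HTML; regular visitors get the client-side app.
  return isBot(userAgent) ? 'prerendered.html' : 'spa-shell.html';
}
```

Either way, the point is that the crawlable version must contain the menu links in plain HTML.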
As for "have you tried 'fetch and render' in GSC?": given the above, one cannot rely on Fetch and Render, because one cannot be certain that the "render" portion will ever occur during normal crawling and indexing.