Can someone suggest what should be done so that Google crawls the pages properly and frequently?
The best answer is probably to improve the quality and uniqueness of your pages to attract more and better natural links. "Properly" is Google's decision, not yours.
Over the last couple of years, Google has put major limitations on how many manually submitted sites and pages it will consider. Various tools, like the Fetch tool, were being misused by spammers, and also by third-party "SEO" submission services, to the degree that they were creating a considerable load on Google's resources. Google found it necessary to impose severe limitations on manual submissions, and has made it clear that it wants to keep manual submission for "emergencies" only.
While the preferred method of manual submission now, if I remember correctly, is via sitemaps, Google makes no guarantees it will index all pages submitted.
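For what it's worth, if you do go the sitemap route, here's a rough sketch of generating a minimal sitemaps.org-style sitemap.xml with Python's standard library. The example.com URLs are placeholders, not anything from the original question.

# Rough sketch: write a minimal sitemap.xml (sitemaps.org protocol).
# The URLs below are placeholders -- swap in your own pages.
from datetime import date
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)

for loc in ["https://www.example.com/", "https://www.example.com/some-page.html"]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    # lastmod is optional but gives the crawler a hint about freshness
    ET.SubElement(url, "lastmod").text = date.today().isoformat()

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

You would then point Google at the file, either with a "Sitemap: https://www.example.com/sitemap.xml" line in robots.txt or by submitting it in Search Console. Either way, Google still decides which of those URLs actually get crawled and indexed.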
Google does regularly crawl the web looking for pages it considers worth indexing, and while the exact algorithms are complex and secret, I'm sure they include the number and quality of inbound links, and the quality of content... how unique it is, and how useful. I'd guess also that Google keeps some historical notes on the quality of previous submissions from various sources. Very probably, the quality of the "network" with which the pages might be associated is also considered.
The following thread, from early 2018, presents a fairly complete overview of Google's policies and how they evolved, and is worth a read. I suspect, if anything, that the guidelines have only gotten stricter.
Big reductions in crawl-to-index limits on Google Fetch tool (March 2018): https://www.webmasterworld.com/google/4893740.htm