Is adding a meta tag of noindex for google bots only considered cloaking?
We want to add this to our search results page, but only when there are no results to show, so that Google does not index those no-result pages. However, we do not want to serve the tag to users as well, because doing so would break progressive rendering: we would have to hold back every byte (the header, top navigation links, etc.) until the search against our index completes and returns 0 results. Time to first byte would increase for users, resulting in poor site speed.
However, we can afford to wait for the search to complete for bots, since slower responses matter less on the site-speed front for bots.
I am also curious how Google would detect the meta tag difference, if this is indeed considered cloaking.
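To make the question concrete, here is a minimal sketch of the gating logic described above. Everything here is an assumption for illustration: `robots_meta` is a hypothetical helper, and matching the `Googlebot` token in the User-Agent string is only a rough heuristic (Google recommends verifying crawlers via reverse DNS, which this sketch does not do).

```python
import re

# Rough heuristic: match Google's crawler token in the User-Agent header.
# (Hypothetical example; production code should verify via reverse DNS.)
GOOGLEBOT_RE = re.compile(r"googlebot", re.IGNORECASE)

def robots_meta(user_agent: str, result_count: int) -> str:
    """Return a noindex meta tag to inject into <head>, or '' to
    stream the page normally without waiting on the search backend."""
    if result_count == 0 and GOOGLEBOT_RE.search(user_agent):
        return '<meta name="robots" content="noindex">'
    return ""

# For users, the page can be flushed immediately; for Googlebot, the
# server waits for the result count before emitting <head>.
print(robots_meta("Mozilla/5.0 (compatible; Googlebot/2.1)", 0))
print(repr(robots_meta("Mozilla/5.0 Chrome/120.0", 0)))
```

An alternative with the same trade-off is the `X-Robots-Tag: noindex` HTTP response header, though it too must be decided before the first byte is sent.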
Doesn't sound like you'd be dinged for this. The pages with no results would otherwise be flagged as "soft 404" pages, so you're right to take some sort of action to keep the content from being cataloged.
Whether this is the "best" way is a whole other question.