The way I understand it, Google monitors just about every signal we can imagine (and some we can't). That's why they can say whether a signal is "reliable" or "noisy" or whatever.
So in that sense, they do consider everything. But it only makes sense to build out automated infrastructure around a particular signal once it shows a statistical degree of dependability.
In their relevance algo, for example, this process is presumably responsible for the shifting weight of the H1 element over time. So I imagine they'll keep monitoring the location meta tags, and if those show more dependability in the future, they'll get folded into the brew.