Once we intuitively kept count (bigger, smaller, etc.) in our heads. Then came math and accounting systems (stones, seeds and grain, fiat currency, the abacus), tracking systems (sheepskin and ink, parchment and ink, paper and ink, paper and typeset), early analysis tools (slide rules, calculators, etc.), then punch cards and early analog computing - all somewhat tangible and limited in capacity, very analog.
Then came digital computing, Moore's law, supercomputers, and ubiquitous networking.
For years I've been reading about the growing volume of "data" and "information". (Information is not to be confused with data, as it's a bit more loosey goosey, though data, too, can be collected and interpreted in slanted, selective ways.)
It's both amazing and not so amazing that so much data is being sought by tracking, monitoring, listening, etc. Every day there's a new device or system created to collect, record, analyze and report on data from any and all "changes in the system or status quo" - be that a human body, an enterprise, etc.
We've always been awash in data, though we were not always so consciously aware of its presence, role, or significance. However, we sure as heck are being made aware of it now. The NSA spies. Analytics is everywhere. Every point of contact, every action becomes an occasion or opportunity for someone to observe and collect more data.
And decisions must be made.
Who is going to risk deciding without looking at THE DATA . . ALL THE DATA . . AND . . make sure we HAVE ALL the data!
So, given the ever growing glut of data I pause to ask a few questions, likely ones that anyone and everyone might want to pin to a board somewhere . . for the next decade.
Are we reaching the point where answers to questions involving "the data" are approachable, but unanswerable, by humans due to the sheer volume of available data?
Are we at the point where machines are making more decisions than humans? Are machines also taking over the role of "writing the decision-making code" - to process the data, interpret the data, etc. - due to humanity's limited ability to write code complex enough to handle it all?
When we rely on computers to crunch and analyze "the data" DO WE KNOW what, deep down in the code, is going on . . that the machine is "getting it right"?
When there is so much data and so much decision making is being assigned to software . . how do we know that we know, when we are no longer doing the thinking?
With so much data available, will simple questions cease to be respected, meaningful, or even allowed to exist?
Is there such a thing as too much data producing bad results?