Aug 15, 2013 | By George Gantz
Uncertainty – More Data is Not Enough!
Today I read two articles in the online journal Nautilus: Michael Tuts’ “Discovering the Expected”, which discusses the process physicists used to validate the discovery of the Higgs Boson, and Amir D. Aczel’s “Chasing Coincidences”, which highlights our tendency to remember the unexpected. The first article shows the need to throw out data that does not suit our purpose; the second shows how our unconscious attention throws out experience that does not carry meaning. In each case, it is the patterns and relationships that matter. But uncertainty is also significant from a metaphysical perspective.
The two articles both address the practical problem of dealing with very large data sets. For the physicists at CERN, there is an abundance of data about the expected patterns of particle decay from the high-energy collisions they produce. This data must be identified and thrown out in order to allow the researchers to look for the needle in the haystack – the evidence of decay from the elusive and very rare Higgs Boson. For the average human there is an analogous process – we ignore the things that are normal or expected, but vividly recall the very unusual or unexpected event. While the unusual event may be, statistically, a mere random occurrence, it will nevertheless grab our attention because it is something that has meaning for us.
There are critical implications here for how we draw conclusions from large data sets. As the size of our data sets increases, causation takes on a curious ambiguity. How do we know whether something is random or the result of an unknown influence, given the seemingly infinite range of statistical possibilities in the data set? If we let our imagination run, and consider for example the large data set of all quantum events in our universe, the question gets quite profound.
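This ambiguity can be made concrete with a toy example. The short Python sketch below is my own illustration (it does not appear in either Nautilus article): it estimates how often a fair coin, flipped 100 times, produces a streak of eight or more identical results purely by chance – a pattern that, encountered in isolation, can look anything but random.

```python
import random

random.seed(0)  # fixed seed so the simulation is repeatable

def has_streak(flips, length):
    """Return True if `flips` contains a run of `length` identical outcomes."""
    run = 1
    for a, b in zip(flips, flips[1:]):
        run = run + 1 if a == b else 1
        if run >= length:
            return True
    return False

# Out of many 100-flip sequences of a fair coin, count how often a
# streak of 8 or more identical results appears purely by chance.
trials = 10_000
hits = sum(
    has_streak([random.random() < 0.5 for _ in range(100)], 8)
    for _ in range(trials)
)
print(f"Streak of 8+ in 100 flips: {hits / trials:.0%} of sequences")
```

Such streaks turn out to be common – on the order of three in ten sequences – even though any single streak, seen on its own, would strike us as a pattern demanding explanation. In a large enough data set, "meaningful-looking" events are statistically expected.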
To draw on examples from Aczel’s article: if a card player draws four aces three times in a row, I may conclude that the deck was stacked, even though it is statistically possible for that result to be random. As a contrary example, it is possible (though not probable) that an airline reservation agent knowingly put Mr. Aczel and Mr. Scott (who turned out to be a school friend of his wife) together on the plane flight where they, seemingly coincidentally, met. The event may, in fact, not have been random. How do we decide whether something is random – or caused by some agency at work?
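For the stacked-deck example, the arithmetic itself is easy to check. The Python sketch below (my own illustration, not from Aczel's article) computes the probability that a fair four-card draw from a standard 52-card deck yields all four aces, and then the probability of that happening on three independent draws in a row.

```python
from math import comb

# Probability that a single four-card draw from a fair 52-card deck
# is all four aces: C(4,4) / C(52,4) = 1 / 270,725.
p_one_draw = comb(4, 4) / comb(52, 4)

# Probability of that happening on three independent draws in a row.
p_three_in_a_row = p_one_draw ** 3

print(f"One draw:        {p_one_draw:.3e}")        # roughly 3.7e-06
print(f"Three in a row:  {p_three_in_a_row:.3e}")  # roughly 5.0e-17
```

At roughly five chances in a hundred million billion, the three-in-a-row result is "statistically possible" – yet nearly anyone would conclude the deck was stacked, which is exactly the tension the article describes.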
This problem becomes more acute for very high consequence / very low probability events. How can we ever determine if the flap of a butterfly wing changed the course of a storm 1,000 miles away, or if the choice of a single human being changed the course of history? How can we ever determine if miracles are real – as I discussed in “Are Miracles Real” (http://swedenborgcenterconcord.org/are-miracles-real/)?
In the case of large data sets – such as the events in the world at large, or the universe of all possible quantum events – the conclusion is that more data will not and cannot answer key questions about human life, including the question of the existence of God. In practical terms, one cannot determine the truth of key conjectures about the world from more data. And, as I have previously noted <http://swedenborgcenterconcord.org/reflection-and-recursion/>, in logical terms “Some true conjectures can never be proved.”
We have no proof, and we have no data. All we have is faith.
“Anyone can, if he wishes, find support for the Divine idea in the sights of nature…. Those who support the idea of nature admittedly see these facts, but because they have mentally rejected the idea of man’s heavenly state, they call these nothing but the workings of nature.” Swedenborg, True Christian Religion #12.