Human beings are often credited with intuition, the capacity to know something without being able to explain where the knowledge comes from. Intuitions are inherently inscrutable, even to ourselves. Interestingly, as Artificial Intelligence researchers have tackled the challenge of interrogating big data with “deep learning” algorithms, they have found that these algorithms excel at scouring masses of data to make good predictions, yet the programmers cannot explain how the algorithms reach their conclusions. Like their human creators, the algorithms are becoming inscrutable.
Imagine 100 monkeys typing (presumably randomly) on 100 typewriters for a limitless period of time: eventually, hidden somewhere in the seemingly endless streams of nonsense, they would produce all of the works of Shakespeare. This popular thought experiment has been around for more than a century and demonstrates interesting features of both randomness and infinity. It is a useful starting point for discussing unique problems now being encountered with large data sets.
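The intuition behind the thought experiment can be made concrete with a minimal simulation sketch. The code below, in Python, is illustrative only (the 27-key "typewriter," the function names, and the parameters are assumptions, not part of any formal statement of the theorem): it computes how many random typing attempts a match is expected to take, and lets a simulated monkey try.

```python
import random
import string
from typing import Optional

# A toy "typewriter" with 27 keys: lowercase letters plus the space bar.
KEYS = string.ascii_lowercase + " "

def expected_attempts(target: str, n_keys: int = len(KEYS)) -> int:
    """Expected number of independent attempts before a uniformly random
    string of len(target) keystrokes matches `target`: n_keys ** len(target)."""
    return n_keys ** len(target)

def monkey_types(target: str, max_attempts: int, seed: int = 0) -> Optional[int]:
    """Return the attempt number on which the monkey first types `target`
    exactly, or None if it fails within `max_attempts` attempts."""
    rng = random.Random(seed)
    for attempt in range(1, max_attempts + 1):
        typed = "".join(rng.choice(KEYS) for _ in range(len(target)))
        if typed == target:
            return attempt
    return None

# Even a 3-letter word takes about 27**3 attempts on average, and the cost
# grows exponentially with length: 18 characters is already ~5.8e25 attempts.
print(expected_attempts("cat"))                  # 19683
print(expected_attempts("to be or not to be"))
print(monkey_types("cat", max_attempts=200_000))
```

The exponential growth in `expected_attempts` is the point: short strings turn up quickly, but anything Shakespeare-length is astronomically unlikely in any finite run, which is why the thought experiment needs limitless time.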