Oct 02, 2012 | By George Gantz

## P versus NP – a Confounding Issue in Mathematical Complexity

A simple-sounding problem – what is the shortest route for visiting a number of different cities? – is extraordinarily difficult and becomes practically impossible as the number of cities rises. If you can show that the “NP Complete” problems can be solved in a reasonable (“polynomial”, or “P”) amount of time – or prove that they cannot – you can win $1 million, as P versus NP is one of the seven “Millennium Problems” in mathematics for which the Clay Mathematics Institute has offered a cash prize. Personally, I doubt the NP Complete problems can be solved in P time – and I believe their intractability marks a hard stop to what we can “figure out”.

The shortest-route puzzle is known as the traveling salesman problem, and it begins quite innocuously. Two cities – one possible route. Three cities – two routes. Four cities – six. Easy – calculate the length of every route and you know the answer. But 10 cities – almost 400,000 routes. 20 cities – over a hundred quadrillion. At 100 cities – vastly more routes than there are atoms in the observable universe. Oops – the distance for each option cannot be calculated unless you have huge computers and lots of time – and don’t even think about adding another city!
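The brute-force approach described above can be sketched in a few lines of Python. This is only an illustration – the four city names and the distances between them are made up for the example – but it shows why the method collapses: fixing the starting city still leaves (N-1)! orderings to try.

```python
from itertools import permutations

# Symmetric distance table between four hypothetical cities (invented numbers).
dist = {
    ("A", "B"): 10, ("A", "C"): 15, ("A", "D"): 20,
    ("B", "C"): 35, ("B", "D"): 25, ("C", "D"): 30,
}

def d(a, b):
    """Look up the distance between two cities in either order."""
    return dist.get((a, b)) or dist[(b, a)]

def route_length(route):
    """Total length of a round trip visiting the cities in order."""
    legs = sum(d(route[i], route[i + 1]) for i in range(len(route) - 1))
    return legs + d(route[-1], route[0])  # close the loop back to the start

def brute_force_tsp(cities):
    """Fix the first city, then try every ordering of the rest: (N-1)! routes."""
    first, rest = cities[0], cities[1:]
    return min(((first,) + p for p in permutations(rest)), key=route_length)

best = brute_force_tsp(["A", "B", "C", "D"])
print(best, route_length(best))  # shortest round trip, length 80
```

With four cities this checks only six routes; with twenty it would need roughly 10^17 calls to `route_length`, which is the explosion the paragraph describes.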

The complexity of the traveling salesman problem increases so fast because the number of options keeps multiplying – a combinatorial explosion. The number of routes for N cities is (N-1)!, read “N minus one factorial”, which means 1x2x3…x(N-1). Note that this rate of growth is actually far worse than even the exponentials 2^N (2x2x2… N times) or 3^N (3x3x3… N times). Quite unfortunately, many practical problems in scheduling, logistics, networking, computer science, fluid mechanics, genomics and other fields share this characteristic – as the number of data points increases, the complexity escalates without bound.

Are there shortcuts? For many practical problems, you may not need a perfect answer if the difference between perfect and “pretty good” is small. But for highly complex matters such as optimal network configuration, finding a better method of determining the “best” solution may be critical. And for the mathematicians, the question of whether there is any shortcut to the perfect answer is fascinating.

Mathematicians classify the difficulty of a problem by the number of steps its solution requires, and by the difficulty of checking a proposed solution. For problems requiring only a “polynomial” number of steps to solve (the step count grows no faster than a power of N, such as N^2 or N^3 – far smaller, for large N, than exponentials like 2^N or 3^N), we know that as N increases we can still likely solve the problem with a big enough computer. These are the “P” problems referenced in the title. The traveling salesman and other problems that seem to require 2^N, k^N, or N! steps to solve may (or may not) lie outside the P group, depending on whether there are any shortcuts.

As for checking a solution, we can calculate the length of any proposed traveling salesman route quickly. So if we had a lucky computer (mathematically, a “non-deterministic” machine – the “N” in “NP”) that always guessed the best route, we could verify the result in a polynomial number of steps – which makes the problem “NP”. More surprisingly, mathematicians (Stephen Cook in 1971, with Richard Karp extending the list in 1972) identified and proved the existence of a class of hardest NP problems, which came to be called NP Complete. If you can solve any one NP Complete problem in P time, then you can solve every NP problem in P time! Conclusion – if there is ANY P solution to an NP Complete problem, then all NP problems are really just P. That would mean there are shortcuts which would make many computing and modeling tasks vastly simpler – and revolutionize the way we plan and analyze.
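The “checking is easy” half of the story is worth seeing concretely. A minimal sketch, again with an invented distance table: verifying a claimed tour takes just N lookups and N additions – linear in N, comfortably polynomial – even though *finding* the best tour seems to require factorial work.

```python
# Hypothetical symmetric distances between cities 0..3 (invented numbers).
dist = [
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]

def tour_length(tour):
    """Sum the N legs of a round trip, wrapping back to the starting city."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def check_claim(tour, claimed_bound):
    """The 'NP' check: does this tour visit every city exactly once,
    and is it really no longer than the claimed bound?"""
    visits_all = sorted(tour) == list(range(len(dist)))
    return visits_all and tour_length(tour) <= claimed_bound

print(check_claim([0, 1, 3, 2], 80))  # True: a valid tour of length 80
print(check_claim([0, 1, 2, 3], 80))  # False: this ordering is length 95
```

That asymmetry – verification cheap, discovery apparently expensive – is exactly the gap between P and NP.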

However, the prevailing expectation among mathematicians and other experts, ever since P versus NP was formulated, has been that NP Complete problems are not solvable in P steps – the world is not that simple. No one has yet found a P shortcut to any NP Complete problem. But no one has been able to prove the contrary either – that an NP Complete problem cannot be solved in P time. So the mystery continues, with a $1 million Clay prize riding on it.

Many of you may be asking at this point – uh, so what is the big deal? Well consider – what if the world we live in consists mostly of processes that are of type “NP”? That is, the more data we get, and the more we try to analyze what the outcome of those processes will be, the more impossible the task becomes. In any finite amount of time it will be impossible to determine the outcome of the world’s functions with any precision. Are we beginning “to know what we don’t know” – and cannot know?

Seeing the limits of our finite, human capacity to know things about the world we live in is unsettling. Related mathematical findings reinforce this sense of unease: chaos theory has found that in complex systems extremely small variations in initial conditions can produce widely divergent (rather than convergent) outcomes, and in computer science many complex computations are “irreducible” – they cannot be shortened or predicted.

When I was young (and when science was young – say the 19th century), the notion of determinism – the sense that the universe was a complex but ultimately predictable system of billiard-ball-like interactions that mathematical formulas could unravel precisely – was incredibly strong and appealing. It was comforting and grounding to think that we could figure it all out if we just had more time and more data.

The progress of mathematics in the last century has unraveled the determinist’s dream. The world and life are infinitely complex and non-deterministic – we do not get to “figure it out” in advance; we have to live it to see how it turns out.

But we do have a choice in what we believe about the world. Are non-predictability and complexity simply random chaos, our presence here an accident? Or are they a continuing reinforcement of a belief in divine creation and the rightness and proper order of the cosmos? Seems like an easy choice.

Sources:

“Machines of the Infinite”, by John Pavlus, Scientific American September 2012.

The Millennium Problems, by Keith Devlin (2002).

The Lifebox, The Seashell and The Soul, by Rudy Rucker (2005).