Computational Irreducibility in Economics – An Economist’s Review of A New Kind of Science

Posted on July 28, 2012



Beginning in the mid-20th century, economists published articles, books and original research whose content consisted increasingly of systems of equations acting as economic models, and decreasingly of economic reasoning and theory. This was based on the belief that the epistemology of science rested on mathematics as the provider of verifiable theories, on the model of those prevalent in physics. Stephen Wolfram, a prodigy physicist, began writing a software package for physical modeling in the 1980s after studying elementary computational models. That package, Mathematica, instead of providing proof of the supremacy of mathematics in science, showed him that mathematics is only useful in very specific cases, and thus that computational models are more fundamental to science. He published his study of fundamental computations under the title A New Kind of Science. Its conclusions provide not only a refutation of mathematical models of economics, but a vindication of the Austrian method of economic science, praxeology.

Wolfram begins his investigation by generating illustrations of one of the simplest computational systems there is: a one-dimensional, binary cellular automaton. These systems are chosen because their behavior can be displayed clearly on the pages of a book, with each successive row showing the next step in time, forming a neat two-dimensional picture of the system’s evolution.

By enumerating all possible rules within this framework (256 in total), Wolfram classifies them into four categories (a minimal simulation sketch follows the list below).

Repeating patterns (such as rule 1)

Nesting patterns (such as rule 90)

Random patterns (such as rule 30)

Complex patterns (mixing elements of the three above, such as rule 110)
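For concreteness, a one-dimensional, binary cellular automaton is simple enough to simulate in a few lines. The sketch below (written in Python rather than Mathematica, and not taken from the book) runs the four rules named above from a single black cell; the grid width and number of steps are arbitrary illustrative choices.

```python
# A minimal sketch of an elementary (one-dimensional, binary) cellular automaton.
# Each of the 256 rules maps the three-cell neighborhood (left, self, right)
# to the cell's next state, encoded in the bits of the rule number.

def step(cells, rule):
    """Apply one update of the given elementary rule (0-255) to a row of cells."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right   # neighborhood as a 3-bit number
        out.append((rule >> index) & 1)               # look up that bit of the rule number
    return out

def run(rule, width=63, steps=31):
    """Start from a single black cell and print the evolution as rows of text."""
    cells = [0] * width
    cells[width // 2] = 1
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)

for rule in (1, 90, 30, 110):   # repetitive, nested, random, complex
    print(f"--- rule {rule} ---")
    run(rule)
```

Printing the rows one under another reproduces the kind of two-dimensional pictures that fill the book: rule 1 settles into repetition, rule 90 draws a nested triangle, rule 30 looks random, and rule 110 mixes ordered and irregular structures.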

He then proceeds to explore the implications of the existence of random and complex patterns, arising from simple rules and simple initial conditions, for the practice and theory of all science.

He arrives at many conclusions of importance to the science of economics.

Mathematical modeling’s limits

The most significant conclusion concerns the use of models built from mathematical equations. One of the major features of Mathematica is solving such equations, which Wolfram describes as systems based on constraints. He finds that stating constraints is an extremely weak way of generating complex behavior, because the constraints do not specify how that behavior is to be produced. One must therefore try a multitude of different solution strategies, and it is then the strategy, not the constraints, that determines whether complex behavior will be modeled.

Mathematics, or systems based on constraints such as equations, is only really useful for modeling simple repeating and nesting systems: these involve a great deal of repetition, so the repetition can be abstracted away and the outcome guessed before the system arrives at it. Complex systems, however, exhibit no such repetition, and are therefore called computationally irreducible: the only way to predict or model their behavior is to carry out the computation itself.

This explains why mathematical models can predict the orbital paths of planets, but not the weather patterns on the planets themselves: weather is subject to randomness.

In economics, the concept of price arbitrage reflects this phenomenon of predicting repetitive patterns. A successful arbitrageur is one who recognizes when prices are about to fall into a specific pattern, and by his buying and supplying he modifies that very pattern. This means that any economic theory that attempts to predict prices can only result in an arbitrage action that destroys the predictive power of the model, and thus that such models cannot be used to establish economic laws. It also implies that economic action is itself not repetitive, but reacts to repetitive behavior. A modeling system that is constrained to repetitive patterns must therefore fail to explain the bulk of economic phenomena and can arrive only at trivial conclusions.

Randomness and economics

Chaos theory identified a phenomenon called sensitivity to initial conditions, colloquially known as the “butterfly effect”, as a cause of a model’s predictive failure. Wolfram demonstrates it by changing a single cell in the starting conditions of each elementary cellular automaton and comparing the results. Sure enough, for simple, repetitive rules the change is averaged out and the system’s outcome is more or less the same, allowing mathematical prediction. For complex patterns, however, the “error” propagates outward, completely altering the result.
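This experiment is easy to reproduce. The sketch below (illustrative Python, not Wolfram’s code) runs one repetitive rule and one random rule from two random initial rows that differ in a single cell, and counts how many cells disagree at each step; the choice of rule 250 as the repetitive example is mine.

```python
import random

# Same elementary-rule update as in the earlier sketch, written compactly.
def step(cells, rule):
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)]

def divergence(rule, width=63, steps=31, seed=0):
    """Run two copies of the same rule whose initial rows differ in one cell."""
    random.seed(seed)
    a = [random.randint(0, 1) for _ in range(width)]
    b = list(a)
    b[width // 2] ^= 1                       # flip a single cell in the initial condition
    print(f"--- rule {rule} ---")
    for _ in range(steps):
        print("cells that differ:", sum(x != y for x, y in zip(a, b)))
        a, b = step(a, rule), step(b, rule)

divergence(250)   # repetitive class: the difference dies out
divergence(30)    # random class: the difference spreads across the row
```

Under rule 250 the count quickly drops to zero as both runs settle into the same uniform pattern; under rule 30 it grows until the two histories have effectively decorrelated.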

But that is not sufficient for Wolfram, since it still implies that the initial conditions are randomized by some outside process. He therefore identifies another form of randomness, the intrinsic production of randomness by the computation itself, which explains the source of randomness in nature. He demonstrates it by applying methods of perception and analysis to random-class cellular automata started from simple initial conditions, and finding no discernible pattern in their perpetual growth.
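The classic illustration is the center column of rule 30. The sketch below (again illustrative, not from the book) runs rule 30 from a single black cell, with no random input anywhere, and collects the center cell at each step; the resulting bit sequence shows no simple regularity, which is reportedly why early versions of Mathematica used it as a pseudorandom generator.

```python
# Intrinsic randomness: rule 30 from a single black cell, center column collected
# as a bit sequence. The width is chosen large enough that the boundaries never
# influence the center within the number of steps taken.

def step(cells, rule):
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)]

def center_column(rule, steps):
    width = 2 * steps + 3
    cells = [0] * width
    cells[width // 2] = 1
    bits = []
    for _ in range(steps):
        bits.append(cells[width // 2])
        cells = step(cells, rule)
    return bits

bits = center_column(30, 200)
print("".join(map(str, bits)))
print("fraction of 1s:", sum(bits) / len(bits))   # roughly 0.5, with no simple pattern
```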

This recalls Hayek’s principle of tacit knowledge. Hayek argued that scientific knowledge was not the only knowledge relevant to an economy, that the particular circumstances of time and place mattered just as much, and that the only way to reveal this information was through a process of exchange based on market prices. This is the same principle at work in a cellular automaton, where each cell samples the particulars of its neighbors to decide its own next state. The resulting random movement of adjustment can thus explain, for instance, the fractal movement of financial prices, as each market actor reacts to the latest actions of other market actors.

Wolfram generalizes computational irreducibility to the question of free will, and to why human action seems immune to prediction by any law. On this view, the processes taking place in the human mind are maximally complex, and there is no shortcut computation that arrives at their results any faster, which gives the impression of free will from the perspective of any other system in the universe.

That is to say, each starting pattern of a complex process, or initial condition, produces a unique behavior that cannot be abstracted away by a simpler process, and this makes any such pattern necessarily unpredictable and, in effect, free of external determination.

A methodology for synthetic science

Ludwig von Mises classified scientific knowledge using a Kantian model with two axes: the a priori versus the a posteriori, and the synthetic versus the analytic. A priori knowledge comes prior to experience and is used to interpret our experiences, while a posteriori knowledge is derived from experience and observation. Analytic knowledge is extracted by reducing a large body of facts, while synthetic knowledge is derived by taking elementary subjects and combining them in interesting ways. An example of a synthetic a priori fact is the equation 1 + 2 = 3: knowing the meanings of 1, 2 and + does not by itself yield anything until their synthesis is computed to be 3, and the experience of observing 1 apple alongside 2 other apples does not reveal the meaning of 3.

An example of an analytic, a posteriori theory is the practice known as econometrics. Econometricians believe that they can use techniques of regression analysis to extract constants from large bodies of economic data, which Austrians hold to be impossible, since economic data consists of reported historical events rather than precise measurements. Econometricians begin by sampling a large amount of price information for a particular good, say barrels of oil, over a specific time span, and then use regression equations to see whether there is any relationship between time and the price of oil. If such a relationship existed as a law, however, the price of oil would not be free to fluctuate (its movement being computationally irreducible); thus all that econometricians can discover is the relationship between time and particular prices of oil over a particular historical span.
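To make the criticism concrete, here is a sketch of the kind of exercise described above, written in Python with made-up numbers: a linear trend is fitted by ordinary least squares to a synthetic price series, once over a shorter window and once over the full sample. The series, the windows and every figure are hypothetical.

```python
import random

random.seed(1)
# Hypothetical monthly oil prices: a drifting series with irregular shocks.
prices = [60.0]
for _ in range(59):
    prices.append(prices[-1] + random.gauss(0.3, 2.5))

t = list(range(len(prices)))

def ols_slope_intercept(x, y):
    """Ordinary least squares for y = a + b*x, written out explicitly."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Fit on the first 40 months, then on the full 60 months.
a1, b1 = ols_slope_intercept(t[:40], prices[:40])
a2, b2 = ols_slope_intercept(t, prices)
print(f"trend fitted on months 0-39 : {b1:+.2f} per month")
print(f"trend fitted on months 0-59 : {b2:+.2f} per month")
```

The fitted “constant” shifts as soon as the sample changes, illustrating the point that the regression summarizes a particular stretch of price history rather than uncovering an economic law.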

According to Mises, economics, which he generalized as praxeology, is a synthetic a priori science, like arithmetic. It involves taking economic axioms such as action and the scarcity of means, synthesizing them with initial conditions such as a man alone on an island, or two men alone on an island, and then using logical deduction to illustrate their choices of action.

While this practice has been derided as unscientific by neoclassical and Keynesian economists, it is exactly the method Wolfram proposes for doing fundamental science in physics and the other natural sciences. Wolfram claims that, by computationally modeling simple systems, we can simply “search” through their results for patterns that replicate our experience, thereby verifying patterns that we know to be “apodictically certain” (Mises’ terminology) from their computational rules and initial conditions.

Wolfram himself is searching for the ultimate rule of the universe, but his theory of computational irreducibility, combined with the universality of computation, allows for an infinity of other simple rule systems emerging within the universe. The science of economics can therefore claim to have found such rule systems as early as the 18th century, and even earlier in ancient texts.

So how would an economist inspired by Wolfram’s science continue his research? He would do it as Wolfram studied his cellular automata: enumerating the possibilities of economic behavior, testing them under a variety of sufficiently comprehensible initial conditions, and searching the behaviors that emerge for patterns that occur in the real world, constrained by computational irreducibility. This is precisely the method of economics practiced by the Austrian school, the method used in the 19th century by the original economists, and the method that will bring us as clear an understanding of economic behavior as axiom systems have provided of mathematics.
