Since the mid-20th century, economists have published articles, books, and research whose content consists increasingly of systems of equations acting as economic models, and decreasingly of economic reasoning and theory. This practice rests on the belief that the epistemology of science relies on mathematics to provide verifiable theories, such as those prevalent in physics. Stephen Wolfram, a physics prodigy, began writing a computer package for physical modeling in the 1980s after studying elementary computational models. The package, called Mathematica, instead of providing proof of the supremacy of mathematics in science, showed that mathematics is meaningful only in very specific cases, and thus that computational models are more fundamental to science. He published his study of fundamental computations under the title *A New Kind of Science*. Its conclusions provide not only a refutation of mathematical models of economics, but a vindication of the Austrian method of economic science, praxeology.

Wolfram begins his theory by generating illustrations of one of the simplest computational systems there are: the one-dimensional, binary cellular automaton. These systems are selected because their behavior displays clearly on the pages of a book, forming a neat two-dimensional progression pattern.

By enumerating all possible rules within this framework (256 in total), Wolfram classifies them into four categories.

1. Repeating patterns (such as rule 1)
2. Nesting patterns (such as rule 90)
3. Random patterns (such as rule 30)
4. Complex patterns (mixtures of the three above, such as rule 110)
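These systems are simple enough to sketch in a few lines of code. The following is a minimal illustration (not Wolfram's own code) of an elementary cellular automaton, using Wolfram's standard rule numbering, in which the 8 bits of the rule number give the next cell state for each of the 8 possible three-cell neighborhoods:

```python
def step(cells, rule):
    """Apply one update of an elementary CA with wraparound edges."""
    n = len(cells)
    out = []
    for i in range(n):
        # Neighborhood = (left, self, right), read as a 3-bit number.
        idx = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((rule >> idx) & 1)
    return out

def run(rule, width=31, steps=15):
    """Start from a single black cell and return all rows."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for _ in range(steps):
        row = step(row, rule)
        history.append(row)
    return history

if __name__ == "__main__":
    for row in run(30):
        print("".join("#" if c else "." for c in row))
```

Running it with rule 90 prints the nested Sierpinski triangle, while rule 30 prints an irregular, random-looking pattern from the very same single-cell start.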

He then explores the implications, for the theory and practice of all science, of the emergence of random and complex patterns from simple rules and simple initial conditions.

He arrives at many important conclusions concerning the science of economics.

### Mathematical modeling’s limits

The most significant conclusion concerns the use of models built from mathematical equations. One of the major features of Mathematica is solving mathematical equations, which Wolfram calls systems based on constraints. He finds that enumerating constraints is an extremely weak way of generating complex behavior, because the constraints do not explicitly detail how that behavior is to be generated. One must therefore attempt a multitude of different solution strategies, and it then becomes the strategy, not the constraints, that determines whether complex behavior will be modeled.

Systems based on constraints, such as mathematical equations, are really useful only for modeling simple repeating and nesting systems: these involve a significant amount of repetition, so it is possible to abstract the repetition away and guess the outcome before the system arrives at it. In complex systems, however, such repetition does not occur, and these systems are therefore called *computationally irreducible*: the only way to predict or model their behavior is to carry out the computation itself.

This explains why mathematical models can predict the orbital paths of planets, but not the weather patterns on the planets themselves: weather is subject to randomness.

In economics, the concept of price arbitrage reflects the phenomenon of predicting repetitive patterns. A successful arbitrageur is one who knows when prices are about to fall into a specific pattern, and by his buying and selling he modifies that very pattern. Any theory of economics that attempts to predict prices can therefore only result in arbitrage that ends the predictive power of the model, and so such models cannot be used to establish economic laws. It also implies that economic action is itself not repetitive, but rather reacts to repetitive behavior. A modeling system constrained to repetitive patterns can therefore only fail to explain the bulk of economic phenomena, and arrive at nothing but trivial conclusions.

### Randomness and economics

Chaos theory identified a phenomenon called sensitivity to initial conditions, colloquially known as the “butterfly effect”, as a cause of a model’s predictive failure. Wolfram demonstrates it by changing a single cell in the starting conditions of each elementary cellular automaton and comparing the results. Sure enough, for simple, repetitive rules the change is averaged out and the system’s outcome is more or less the same, allowing mathematical prediction. For complex patterns, however, the “error” propagates outwards, completely altering the result.
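A hedged sketch of this experiment (the rule numbers and parameters here are chosen for illustration, not taken from the book): run a simple rule and a random-class rule twice each, from initial conditions differing in a single cell, and count the disagreements after many steps.

```python
import random

def step(cells, rule):
    """One update of an elementary CA with wraparound edges."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1)
                      | cells[(i + 1) % n])) & 1 for i in range(n)]

def divergence(rule, width=101, steps=50, seed=1):
    """Number of cells that differ after `steps` updates of two runs
    whose initial conditions differ in exactly one cell."""
    rng = random.Random(seed)
    a = [rng.randint(0, 1) for _ in range(width)]
    b = list(a)
    b[width // 2] ^= 1  # the single-cell perturbation
    for _ in range(steps):
        a, b = step(a, rule), step(b, rule)
    return sum(x != y for x, y in zip(a, b))

if __name__ == "__main__":
    # Rule 254 fills any nonzero pattern in to solid black, so the
    # perturbation is averaged away; under rule 30 it keeps spreading.
    print("rule 254:", divergence(254))
    print("rule  30:", divergence(30))
```

Under the simple rule the two runs end up identical; under rule 30 the single flipped cell leaves differences that never die out.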

But that is not sufficient for Wolfram, since it still implies that the initial conditions are randomized by some outside process. He therefore identifies another form of randomness, the intrinsic production of randomness by the computation itself, which explains the source of randomness in nature. He demonstrates it by applying methods of perception and analysis to random-class cellular automata started from simple initial conditions, and finding no discernible pattern in their perpetual growth.
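Wolfram's canonical example of intrinsic randomness is the center column of rule 30 grown from a single black cell: a fixed simple rule and a fixed simple initial condition produce a bit sequence with no discernible pattern (a sequence Mathematica itself long used as a pseudorandom source). A minimal sketch:

```python
def rule30_center_column(n):
    """First n bits of rule 30's center column, from one black cell."""
    cells = {0: 1}  # sparse row: position -> 1 for black cells
    bits = []
    for t in range(n):
        bits.append(cells.get(0, 0))
        nxt = {}
        # After t steps the pattern reaches at most +/- t; widen by one.
        for i in range(-t - 1, t + 2):
            l = cells.get(i - 1, 0)
            c = cells.get(i, 0)
            r = cells.get(i + 1, 0)
            if (30 >> ((l << 2) | (c << 1) | r)) & 1:
                nxt[i] = 1
        cells = nxt
    return bits

if __name__ == "__main__":
    print("".join(str(b) for b in rule30_center_column(32)))
```

Nothing outside the rule and the single starting cell enters the computation, yet the printed bits show no repetition or nesting.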

This recalls Hayek’s principle of tacit knowledge. Hayek claimed that scientific knowledge is not the only knowledge relevant to an economy, that the particular circumstances of time and place matter just as much, and that the only way to reveal this information is through a process of exchange based on market prices. This is the same principle at work in a cellular automaton, where each cell samples the particulars of its neighbors to decide its behavior. The resulting random movement of adjustment can thus explain, for instance, the fractal movement of financial prices, as each market actor reacts to the last action of the last market actor.

Wolfram generalizes computational irreducibility to the question of free will, and of why human action seems free from prediction by any law. Because of computational irreducibility, the processes taking place in the human mind are maximally complex, and there is no faster computation that arrives at their results; this gives the impression of free will from the perspective of any other system in the universe.

That is to say, each starting pattern of a complex process, or initial condition, produces a unique behavior that cannot be abstracted away by a simpler process, which makes any such pattern necessarily unpredictable and free of external determination.

### A methodology for synthetic science

Ludwig von Mises classified scientific knowledge using a Kantian model of two axes: the *a priori* versus the *a posteriori*, and the *synthetic* versus the *analytic*. *A priori* knowledge comes prior to experience and is used to interpret our experiences, while *a posteriori* knowledge is derived from experience and observation. Analytic knowledge is reduced from a large body of facts, while synthetic knowledge is derived by taking elementary subjects and combining them in interesting ways. An example of a synthetic *a priori* fact is the equation 1 + 2 = 3: knowing the meanings of 1, 2 and + does not imply anything until their synthesis is computed to be 3, and the experience of observing 1 apple next to 2 other apples does not reveal the meaning of 3.

An example of an analytic, *a posteriori* theory is the practice known as econometrics. Econometricians believe they can use techniques of regression analysis to extract constants from large bodies of economic data, which Austrians claim is impossible, since economic data consist of reported historical events rather than precise measurements. Econometricians begin by sampling a large amount of price information for a particular good, say oil barrels, over a specific timeline, and then use regression equations to look for a relationship between time and the price of oil. If such a relationship existed as a matter of fact, however, the price of oil would not be free to fluctuate (the price process would be computationally reducible); thus all econometricians can discover is the relationship between time and a particular price of a barrel of oil over a particular time span.
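The point can be illustrated with a deliberately artificial sketch (simulated prices, not real oil data): fit a linear "law" of price against time over one window of a driftless random walk, and the fitted coefficients describe only that sample, not the underlying process.

```python
import random

def ols_fit(xs, ys):
    """Ordinary least squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Hypothetical "price" series: a driftless random walk, used purely
# for illustration.
rng = random.Random(7)
prices = [100.0]
for _ in range(199):
    prices.append(prices[-1] + rng.gauss(0, 1))

# Fit the "law" on the first 100 observations only.
a, b = ols_fit(list(range(100)), prices[:100])

def mae(ts):
    """Mean absolute error of the fitted line over the times in ts."""
    return sum(abs(prices[t] - (a + b * t)) for t in ts) / len(ts)

print(f"in-sample MAE: {mae(range(100)):.2f}")
print(f"out-of-sample MAE: {mae(range(100, 200)):.2f}")
```

Whatever slope the regression finds in the first window is a fact about that window's recorded history; nothing guarantees it carries over to the next window.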

According to Mises, economics, which he generalized as praxeology, is a synthetic *a priori* science, like arithmetic. It involves taking economic axioms such as action and the scarcity of means, synthesizing them with initial conditions such as one man alone on an island, or two men alone on an island, and then using logical deduction to illustrate their choices of action.

While this practice has been derided as unscientific by neoclassical and Keynesian economists, *it is exactly the same method that Wolfram proposes for doing fundamental science in physics and the other natural sciences.* Wolfram claims that, by doing computational modeling of simple systems, we can simply “search” through their results for patterns that replicate our experience, thus verifying patterns that we know to be “apodictically certain” (Mises’ terminology) from their computational rules and initial conditions.

Wolfram is searching for the ultimate rule of the universe, but his theory of computational irreducibility, combined with the theory of the universality of computation, allows for an infinity of other simple rule systems emerging within the universe. The science of economics can therefore claim to have found such rule systems as early as the 18th century, and even earlier in ancient texts.

So how would an economist inspired by Wolfram’s science continue his research? He would do it the way Wolfram studied his cellular automata: enumerating the possibilities of economic behavior, testing them under a variety of sufficiently comprehensible initial conditions, and searching the behaviors that emerge for patterns that occur in the real world, constrained by computational irreducibility. *This is precisely the method of economics practiced by the Austrian school, the method used in the 19th century by the original economists, and the method that will bring us as clear an understanding of economic behavior as axiom systems have provided of mathematics.*


SocialScientist

July 29, 2012

Wolfram is wrong in his exaggerated claims, and you’re wrong in your application of them here. The entire enterprise of social science (economics included) is to find theories which take really complicated systems and strategic interactions and reduce them to useful simplifying statements which generally explain variation in the empirical record. There’s nothing special about randomness: most theories these days are probabilistic. There’s nothing assumption/constraint-free about this approach: it’s a simple version of agent-based models. There are loads of real-world problems which can be represented with formal models, without resorting to fancy emergent behavior from an agent-based model. The proposal to “enumerat[e] all the possibilities of economic behavior” is a particularly silly suggestion. Leaving the number of possibilities aside, it’s a waste of time. We target limited resources at the most interesting questions with the best methods and data available. If you have a toy agent-based model that gives you traction, then great, but it’s hardly a paradigm.

strangerousthoughts

July 29, 2012

The problem with probabilistic models is that they imply a different kind of randomness than the complex randomness Wolfram is describing, namely a distribution of events. By the central limit theorem, as Wolfram describes, the outline of a random pattern will even out to a smooth shape, but that is only true for the third category of systems (random patterns). Systems in the fourth category (complex patterns) always remain irregular, so they don’t even have a distribution, which is why they can be described as computationally universal.

This is why, for instance, Nassim Nicholas Taleb warns of “black swan events” wiping out financiers who rely on assumed distributions for their trading strategies. There is no way to establish the actual distribution of financial prices, as doing so would result in action that changes the distribution.

As for your other point, yes, there are some aspects of social science that can be modeled with mathematics, but Wolfram accounts for that. He does not disagree that some physics can be modeled with mathematics, but he does demonstrate why not all of physics can be, and the interesting parts cannot be.

SocialScientist

July 29, 2012

In the abstract there doesn’t seem to be a disagreement, there’s some partition between events that are better (more easily) described with a closed form solution and there are others where an emergent solution makes sense (the fun El Farol Bar problem for example).

Where there is disagreement is on where the partition lies, particularly with statements like “the interesting parts cannot be,” “fail to explain the bulk of economic phenomena, and arrive only at trivial conclusions,” “the method that will bring us as clear an understanding of economic behavior.”

To put meat on the argument: Where exactly in social science is there a poverty of useful explanations, and how exactly is this approach going to fill that hole better than a research agenda full of functional forms? I ask because a debate over one’s favorite technology is rarely productive (i.e. the pathetic exchanges between the qualitative and f/q camps in recent decades).

strangerousthoughts

July 29, 2012

There is no poverty of useful explanations in economics – Austrian economic theory provides an extremely lucid description of economic organization. The issue here is that a large establishment of economists consider this theory to be unscientific. And they’re dead wrong about that; its methodology is in fact the only available means of tackling the problem.

dnarby

January 29, 2013

Strangerous, I hope you are well. I miss your writing, and although I understand it takes considerable time to craft words, perhaps you could from time to time comment on various current events. Cheers! -Dave

strangerousthoughts

January 30, 2013

I’ve unfortunately run out of things to discuss, because most of the controversies that come up are repeats of the arguments already made on this site.

I’m usually hunting for new debates on Reddit, and you can find me in the anarcho-capitalism section.