Problems With Modern Applied Economic Analysis

In many ways the modern movement toward applied modeling is laudable. It analyzes evidence empirically through modeling and attempts to avoid the pontificating that characterized earlier periods. But it has problems. Because the connection with general equilibrium theory has been severed, there is no theoretical core to limit assumptions. Put simply, one can argue that modern applied economics is mostly data mining with a semblance of "scientific empirical testing" added to make it seem less ad hoc. Don't misunderstand us; there is nothing wrong with data mining. One can discover a great deal about the economy by examining the data. But such an approach undercuts one's ability to test the results formally and statistically: if the choice of model is ad hoc, then the results are ad hoc. That doesn't mean the models can't be informally tested and compared with reality, but the major emphasis in modern economics is on formal empirical testing of models, and while there is much seemingly formal testing, for many economists the testing is not satisfying because the requisites of formal testing are not met.

The problem is exacerbated by the incentives to publish that exist within the profession. These incentives often lead economists to choose ad hoc pragmatic models, selected for their likelihood of being published and of yielding a positive test of statistical significance, rather than for the reasonableness of their results. These problems are serious, but they are not the problems of neoclassical economics. In fact, they are problems that developed because modern economics has moved away from the neoclassical assumptions and become more eclectic.

One way modern economics is dealing with this problem is through the work of complexity theorists, whose work provides an alternative to a general equilibrium foundation. In the complexity approach, one takes the position that something as complex as the aggregate economy cannot have formal analytic foundations; hence our understanding of it must proceed through alternative means. In complex systems, order develops spontaneously as patterns emerge. The simplicity of complex systems is to be found in the study of dynamics and iterative processes, not in structural simplicity. In the complexity approach, everything is data mining, but it is a highly sophisticated data mining done under specific rules, rules that are just now being developed. It is still a modeling approach, but the work is done with computer simulations. In thinking about these recent developments, one individual stands at the center of modern economics: John von Neumann. His 1944 book on game theory with Oskar Morgenstern pointed the way to expanding general equilibrium via game theory, and his work on artificial life and computers is at the foundation of the complexity approach to economics. The ever-falling costs of computing will push this approach forward in the twenty-first century, and in future editions we will likely see much more discussion of von Neumann.