Evolving Techniques: Methods of Evolution

The evolution of microeconomics has entailed a progression from one mathematical language to another, each of which has been able to resolve some of the ambiguities that marred its predecessor. Initially, economists such as Paul Samuelson (1915-2009) and John Hicks (1904-1989) translated the geometry of the 1930s into the multivariate calculus of the 1960s. The partial derivatives of calculus represented the interrelationships among sectors; the signs of the second partial derivatives illustrated stability conditions; and the signs of the cross-partial derivatives captured the interactive effects. Cross-price elasticities of demand, linear homogeneous production functions, homothetic demands, and constant elasticity of substitution (CES) production functions all entered the microeconomic vocabulary.

The results of the mathematical reformulation of microeconomics were impressive. As economists worked through the problems, they began to perceive the relationship between prices and Lagrangian multipliers (the marginal values of the constraints). The question of whether prices were inherent in economic systems had been debated previously, but now mathematical economists could show that prices emerge naturally from a maximization process and that even in the absence of markets, constrained maximization still yields a "price" (called a shadow price). If explicit prices do not exist, another rationing device must take their place.
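To make the shadow-price idea concrete, here is a minimal textbook-style sketch; the utility function U, prices p_x and p_y, and income m are generic placeholders rather than notation from the text. The consumer's problem and its Lagrangian are

\[
\max_{x,\,y}\; U(x,y) \quad \text{subject to} \quad p_x x + p_y y = m,
\qquad
\mathcal{L} = U(x,y) + \lambda\,(m - p_x x - p_y y).
\]

The first-order conditions U_x = \lambda p_x and U_y = \lambda p_y pin down the choice, and at the optimum \lambda = \partial V / \partial m, the gain in maximized utility from one more unit of income. That marginal value is the shadow price: it emerges from the maximization itself, whether or not an explicit market sets it.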

They also showed how one can easily reformulate a maximization problem subject to a constraint as a constrained minimization problem: by switching the constraint and the objective function, the problem "Maximize output subject to technical production constraints" becomes equivalent to the problem "Minimize cost subject to producing a given output." Such a reformulation, known as "analyzing the dual," affords insight into the nature of the maximization problem by showing how slight changes in the output or the constraints alter the solution (a sketch of this primal-dual pairing follows the quotation below). These developments had both practical and theoretical significance. On the practical side, the understanding of shadow prices and duals led to significant developments in modern management techniques. On the theoretical side, the analysis of the dual added a symmetry to economists' analysis of scarcity that deepened their understanding of the problem. What previously took volumes to present (often incorrectly) could now be covered in one or two pages (for those who knew the language). Given the earlier misuse of informal models and the confusion about their implications, most economists saw these developments as a significant gain. The 1987 winner of the Nobel Prize in economics, Robert Solow, commented:

I detect a tendency ... to idealize the old, nonformalist days in economics. I lived through those days and I was educated when that was the way economics was done, and let me tell you—they were not so great at all. They were pretty awful, in fact. My nonformalist education was full of vagueness and logical inconsistency and wishful thinking and mere prejudice and post hoc propter hoc, and pontification was everywhere in the classes I took and the lectures I went to.
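
The primal-dual pairing described above can be sketched in a single line; the production function f, input prices w, cost budget C, and output target Q are generic placeholders rather than notation from the text:

\[
\max_{x}\; f(x) \ \ \text{subject to} \ \ w \cdot x \le C
\qquad \Longleftrightarrow \qquad
\min_{x}\; w \cdot x \ \ \text{subject to} \ \ f(x) \ge Q .
\]

Under the usual monotonicity assumptions, when Q is the largest output attainable within the budget C, the two problems select the same input bundle, and the multiplier on the output constraint in the cost-minimization problem is marginal cost, the shadow price of one more unit of output.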

But the reformulation of microeconomic theory in terms of multivariate calculus also had problems. Multivariate calculus requires an assumption of continuity and poses the maximization problem in a highly rarefied way. In response to these shortcomings of calculus, economists modified the maximization problem in a number of ways—some of which made microeconomics more practical and useful in business, while others provided deeper understanding of the economy.
By the 1970s the possibilities of comparative static calculus had begun to be exhausted, and the cutting edge of theoretical work was being done in dynamic calculus, in which time is explicitly taken into account. To see why dynamic calculus is relevant, consider the production problem. The intermediate microeconomic approach is to say that the firm faces a production problem: given a set of inputs and relative prices, it chooses an optimal quantity of output. But where is time in the model? It is suppressed, so how the model actually works is unclear. Adopting a comparative static interpretation provides a limited temporal dimension: the problem is considered twice, before and after a single change, so the analysis compares two points in time. No consideration is given, however, to how one gets from one point to the other or to how long that time period is.
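
A minimal comparative-static sketch, with an illustrative profit function rather than anything from the text: a firm choosing an input x to maximize profit \pi = p f(x) - w x satisfies the first-order condition p f'(x^{*}) = w, and differentiating that condition with respect to the wage w gives

\[
\frac{dx^{*}}{dw} \;=\; \frac{1}{p\, f''(x^{*})} \;<\; 0 \qquad \text{(assuming } f'' < 0\text{)}.
\]

The comparison is between the optimum before and after the wage change; nothing in it describes the adjustment path or how long the adjustment takes, which is precisely the gap dynamic methods address.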

For a better analysis of the process of getting from one point to the other, the mathematical formulation of the problem must explicitly include the time path along which one moves from the initial state to the end state. The calculus that accomplishes this is optimal control theory. Students typically learn optimal control theory in the calculus course that follows differential equations, which in turn follows multivariate calculus. The solution conditions are similar, but instead of being expressed in terms of Lagrangian multipliers, they are expressed in terms of Hamiltonians and bordered Hessians.
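As an illustrative sketch of the kind of problem optimal control handles, consider a standard Ramsey-style saving problem; the notation (u, c, k, f, \rho) is a textbook convention, not drawn from the text. The planner chooses a consumption path c(t) to

\[
\max_{c(t)} \int_{0}^{T} e^{-\rho t}\, u\big(c(t)\big)\, dt
\quad \text{subject to} \quad \dot{k} = f(k) - c, \qquad k(0) = k_{0}.
\]

The current-value Hamiltonian and the conditions of the maximum principle are

\[
H = u(c) + \mu\big(f(k) - c\big), \qquad
\frac{\partial H}{\partial c} = u'(c) - \mu = 0, \qquad
\dot{\mu} = \rho\,\mu - \frac{\partial H}{\partial k} = \mu\big(\rho - f'(k)\big).
\]

The costate variable \mu plays the role the Lagrangian multiplier plays in the static problem: it is the shadow price of capital along the entire optimal path.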

After pushing its calculus to ever greater complexity, microeconomic analysis also expanded away from calculus, for both practical and theoretical reasons. Practically, it moved toward linear models, because linear algorithms existed by which one could more easily compute numerical solutions. A simple linear formulation was thus often more relevant to real-world problems, and linear, network, and dynamic programming were added to the economist's tool kit. In theoretical work, the formulation of the general equilibrium problem soon went beyond calculus to set theory and game theory. Economists preferred these approaches because they were more precise and did not require assumptions of continuity, as calculus did. As the techniques changed, so did the terminology; terms such as upper semicontinuous and Cournot-Nash equilibrium became commonplace in graduate microeconomic theory courses.
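A minimal sketch of the kind of linear formulation meant here, using SciPy's linear-programming routine; the two products, their profit coefficients, and the resource limits are invented for illustration:

```python
# Small production-planning linear program: maximize profit 3*x1 + 5*x2
# subject to three resource constraints (illustrative numbers only).
from scipy.optimize import linprog

c = [-3, -5]                        # linprog minimizes, so negate profits
A_ub = [[1, 0],                     # resource A:   x1         <= 4
        [0, 2],                     # resource B:         2*x2 <= 12
        [3, 2]]                     # resource C:  3*x1 + 2*x2 <= 18
b_ub = [4, 12, 18]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")

print("optimal plan:", res.x)       # -> [2. 6.]
print("maximum profit:", -res.fun)  # -> 36.0
# With the HiGHS backend, res.ineqlin.marginals reports the dual values of
# the resource constraints: the shadow prices discussed earlier, up to the
# solver's sign convention.
```

The dual of such a program prices the scarce resources, which is the linear-programming counterpart of the shadow-price and dual analysis described above.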

Another significant change in microeconomics is evident in its handling of uncertainty. Economic decisions must be made in the face of an uncertain future. Marshall did not attempt to tackle the uncertainty problem directly. Modern microeconomics, however, formally confronts uncertainty, often by modeling economic processes as stochastic rather than deterministic. To analyze such models, microeconomics uses applied statistical decision theory, a blend of statistics, probability theory, and logic.
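
A minimal sketch of the expected-utility reasoning on which such decision theory builds; the square-root utility function and the 50-50 gamble are illustrative. An agent with u(x) = \sqrt{x} faces a gamble paying 100 or 0 with equal probability:

\[
EU = \tfrac{1}{2}\sqrt{100} + \tfrac{1}{2}\sqrt{0} = 5,
\qquad
u^{-1}(EU) = 25 \;<\; 50 = E[x].
\]

The agent would accept as little as 25 for certain in place of a gamble worth 50 on average; this certainty-equivalent calculation is the kind of reasoning that applied statistical decision theory formalizes and extends.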

As is often the case, one can look at developments in a field in both a positive and a negative light. Take game theory, for example, which we have presented briefly as simply an alternative, more elegant and precise, way of performing general equilibrium analysis. It is that, but it is also much more. It is the most general analysis of human interaction that exists, and it offers ways in which economists can analyze interdependent actions that they would otherwise have to assume away. Thus, it offers practical models for understanding oligopoly behavior, which characterizes large portions of most Western economies. Similarly, it offers enormous insight into social problems, as it does in Thomas Schelling's (1921-2016) work, such as The Strategy of Conflict (1960). Alternatively, it offers a method of synthesizing all the social sciences, as it does in Martin Shubik's (1926-2018) Game Theory in the Social Sciences (1982).
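
A minimal sketch of the interdependence game theory captures: a brute-force search for pure-strategy Nash equilibria in an invented two-firm pricing game (the payoff numbers are purely illustrative):

```python
# Find the pure-strategy Nash equilibria of a 2x2 pricing game by checking,
# cell by cell, that neither firm gains from a unilateral deviation.
# Payoffs are (row firm, column firm); strategies are "high" and "low" price.
payoffs = {
    ("high", "high"): (10, 10),
    ("high", "low"):  (2, 12),
    ("low",  "high"): (12, 2),
    ("low",  "low"):  (5, 5),
}
strategies = ["high", "low"]

def is_nash(row, col):
    """True if neither firm can raise its own payoff by switching strategy."""
    r_pay, c_pay = payoffs[(row, col)]
    row_best = all(payoffs[(r, col)][0] <= r_pay for r in strategies)
    col_best = all(payoffs[(row, c)][1] <= c_pay for c in strategies)
    return row_best and col_best

print([cell for cell in payoffs if is_nash(*cell)])   # -> [('low', 'low')]
```

Both firms cutting price is the unique equilibrium even though both would prefer the high-price outcome, exactly the kind of interdependent result that the competitive model has to assume away.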

Thus, the logic of game theory is as compelling today as it was when John von Neumann and Oskar Morgenstern first published Theory of Games and Economic Behavior in 1944. Modern graduate education strongly emphasizes game-theoretic approaches, and games are part of the modern economist's tool kit.