- Our new model includes many new features: For example, instead of using a representative household, it includes a synthetic population of millions of households, accurate in terms of age, education, race, and consumption habits. Instead of using a representative firm, we model the behavior of tens of thousands of large firms, in one-to-one correspondence with real firms….
- J. Doyne Farmer, Making Sense of Chaos: A Better Economy for a Better World, p. 258
Mainstream economics is willing to build economic models based on the “representative agent.” A single hypothetical consumer represents all households. One representative firm represents all businesses. Many different types of workers are lumped together as “labor.” Many different types of equipment and other productivity-enhancing factors (such as business reputation or know-how) are lumped together as “capital.” I have long been skeptical of this approach to economics, which I call the “GDP factory” analysis.
For decades, J. Doyne Farmer and his small team of like-minded researchers have pursued a different approach. Borrowing from the field of ecology, they build models that include agents pursuing different strategies within a system as a whole.
The representative-agent method involves the economist carefully selecting a set of assumptions about individual behavior, writing them down as equations, and solving those equations. It predates the computer age.
Farmer’s method, described in his new book Making Sense of Chaos, requires a very different modeling strategy, called “agent-based modeling.” It begins by looking at how different people choose livelihood, consumption, and investment strategies. The goal is to see how these strategies interact over time. This requires computer simulation.
For example, consider the stock market. The “representative agent” approach assumes a single investor with full knowledge and a single strategy to maximize return relative to risk. Farmer’s approach instead begins by looking at the kinds of strategies that different investors actually use. Some focus on fundamentals. Others try to spot trends. Everyone has different experience and uses different heuristics.
Representative-agent models of the stock market tend to have uninteresting and unrealistic dynamic properties. They predict little market movement, much less trading than we actually see, and nothing like the pattern of booms and crashes that characterizes real markets. Models with heterogeneous investors are able to replicate the patterns we observe in the stock market.
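A toy version of such a heterogeneous-agent market can be sketched in a few lines. This is my own illustration, not Farmer’s actual model: two hypothetical trader types interact through a simple price-impact rule, with “fundamentalists” pulling the price toward a fixed fundamental value and “trend-followers” extrapolating recent moves.

```python
import random

# Illustrative sketch (not Farmer's model): a market driven by two
# hypothetical trader types whose demands feed a price-impact rule.
random.seed(0)

FUNDAMENTAL = 100.0
price = 100.0
history = [price]

def fundamentalist_demand(p):
    # Buy when price is below fundamental value, sell when above.
    return 0.05 * (FUNDAMENTAL - p)

def trend_follower_demand(hist):
    # Extrapolate the most recent price change.
    if len(hist) < 2:
        return 0.0
    return 0.8 * (hist[-1] - hist[-2])

for _ in range(200):
    demand = fundamentalist_demand(price) + trend_follower_demand(history)
    noise = random.gauss(0, 0.5)
    price += demand + noise  # net demand moves the price
    history.append(price)

returns = [history[t + 1] - history[t] for t in range(len(history) - 1)]
print(f"final price: {price:.2f}")
print(f"largest one-step move: {max(abs(r) for r in returns):.2f}")
```

Even this crude interaction of two strategies produces oscillations and occasional large moves that a single representative investor, by construction, cannot generate.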
One of the most interesting findings from agent-based models is that as the influence of players using a single strategy increases, the dynamics of the financial market change. Strategies that dampen short-term volatility can produce sudden crashes.
For example, Farmer points out that in the late 1990s major investment banks adopted “value at risk” (VaR) as a metric to manage market exposure. VaR measures the loss from, say, an adverse price movement of two standard deviations. Using such a metric, a risk manager can increase risk exposure when market volatility falls, and must reduce it when volatility rises. In good times, this creates a self-reinforcing feedback loop that drives up prices as banks expand their portfolios. But a small shock leads everyone using VaR to try to sell at the same time, creating a tight self-reinforcing downward loop. Farmer argues that this explains what happened in the financial markets before and during the 2008 financial crisis.
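The mechanics of that feedback loop can be made concrete with a toy calculation. The following is my own sketch of VaR-style position sizing, not the banks’ actual risk systems: a desk targets a fixed dollar VaR budget, so the permitted position is inversely proportional to measured volatility, and a volatility spike forces everyone sized this way to shrink positions at once.

```python
import statistics

# Toy sketch of VaR-based sizing (illustrative numbers, not a real system).

def two_sigma_var(returns, position):
    """Loss implied by a two-standard-deviation adverse move."""
    sigma = statistics.stdev(returns)
    return 2 * sigma * position

def position_for_var_limit(returns, var_limit):
    """Largest position whose two-sigma VaR stays within the budget."""
    sigma = statistics.stdev(returns)
    return var_limit / (2 * sigma)

calm   = [0.001, -0.002, 0.0015, -0.001, 0.002]  # low-volatility returns
stress = [0.02, -0.03, 0.025, -0.02, 0.03]       # high-volatility returns

limit = 1_000_000  # hypothetical $1M VaR budget
print(position_for_var_limit(calm, limit))    # large position allowed
print(position_for_var_limit(stress, limit))  # forced to a much smaller one
```

When measured volatility jumps, the same VaR budget mandates selling, and the selling itself raises volatility further, which is the downward loop described above.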
Farmer and his colleagues also use computer simulations of various agents’ strategies to analyze the energy market, with a particular focus on assessing the prospects for an “energy transition” to avert climate change. According to their analysis, the main cost of switching to renewable energy sources is upgrading the electricity grid. But actually producing the energy will cost less, so a faster energy transition would be good for the economy.
- In 2050, for example, the estimated average annual cost worldwide for the Fast Transition electricity network is about $670 billion per year, compared to $530 billion per year for No Transition. However, the total system cost in 2050 is estimated at $5.9 trillion for the Fast Transition and $6.3 trillion for No Transition. So, while the additional cost of $140 billion may seem expensive, it is far less than the savings from cheaper energy. p. 253
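The figures in the quoted passage are internally consistent, which a quick check confirms (the numbers below are the book’s as reported above; the calculation is just arithmetic):

```python
# Consistency check of the quoted 2050 cost figures.
fast_network = 670e9   # annual network cost, Fast Transition
no_network   = 530e9   # annual network cost, No Transition
fast_total   = 5.9e12  # total annual system cost, Fast Transition
no_total     = 6.3e12  # total annual system cost, No Transition

extra_network_cost = fast_network - no_network
net_system_savings = no_total - fast_total

print(f"extra network cost: ${extra_network_cost / 1e9:.0f}B/yr")  # $140B/yr
print(f"net system savings: ${net_system_savings / 1e9:.0f}B/yr")  # $400B/yr
```

The extra $140 billion per year of grid spending buys roughly $400 billion per year in total system savings, which is the book’s point.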
The traditional way of doing economic theory will always have the advantage of being easy to communicate and replicate. If someone shows the results of a standard model, you can solve the math yourself and get a feel for what’s driving the results.
For empirical work, replication is less reliable. Farmer reports that when he ran a company interested in exploiting stock market inefficiencies, his team looked at published papers on stock market anomalies.
- In about half of the papers, we were unable to reproduce the results, even when examining subtle deviations from efficiency using the same data. p. 146
The simulation process is opaque to those of us who are not on the team building the model. We cannot reproduce the results ourselves. If economists are going to use agent-based modeling, they will need to develop ways to describe, explain, and justify the choices they make in building models.
I think of economic models as maps. With an old-fashioned paper map, if the map said to take the George Washington Bridge from where I live to Boston, I would be stuck with that route. With the map on my smartphone, I can consider alternatives and make adjustments in real time based on traffic conditions.
For economists, large amounts of data are now available, and computing power has increased by orders of magnitude. This probably favors agent-based modeling relative to the representative-agent approach.
But as maps for policymakers, agent-based models are far from proven. I would be careful not to assume that they make top-down management of the economy a good idea. One should not bet the farm on the Farmer.