The Clunkers Program

Mark Thoma points to an analysis by Ted Gayer and Emily Parker.

The $2.85 billion program provided a short-term boost in vehicle sales, but the small increase in employment came at a far higher implied cost per job created ($1.4 million) than other fiscal stimulus programs, such as increasing unemployment aid, reducing employers’ and employees’ payroll taxes, or allowing the expensing of investment costs.
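
For scale, $2.85 billion at an implied cost of $1.4 million per job works out to roughly 2,000 jobs created.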

Although this analysis supports a view that this program was not a very effective stimulus, I think this sort of analysis has to be somewhat tenuous. Any government spending involves a diversion of funds from some other use. Any government spending redistributes income. As far as I can tell, Gayer and Parker assume that the subsidies accrued to car buyers. But maybe the subsidies accrued to auto companies or auto workers, in which case the multiplier effects would show up rather indirectly. The concept of what is seen and what is not seen casts suspicion on any calculations of the sort attempted here.

I am not trying to defend the Cash for Clunkers program, of course. I am just trying to point out how difficult it is to draw firm conclusions about the effect of macroeconomic policy.

A DY2PVSC Post I Wish I Had Written

From someone who prefers to blog anonymously.

Economics is a science, but it is a very politicized science. The Medicaid study, with its ambiguous results, offered justification for the policy proposals of both supporters and opponents of ACA, for example. Both sides were offering an incomplete picture of the study in this debate, but both sides were also correct in the claims they made, even if they strategically left out inconvenient findings.

Pointer from Tyler Cowen.

Read the whole thing. He is reacting to a column by Raj Chetty, and I had a similar reaction. While proclaiming the scientific virtues of economics, Chetty was sneaking in his own biases, through a selective presentation of results.

The more important point is that we all are tempted to do this, and we need to work hard to resist such temptation. One of the reasons for my occasional DY2PVSC posts (“Did you two people visit the same country”) is to try to pair up research that supports one side with research that supports the other.

Sentences to Ponder

From Jason Brennan.

Certain Austrian economists, take note. You ain’t gonna win in economics by doing philosophy of economics. It’s not because the world’s unfair, but because it’s fair.

Read the whole post. I think that it is fair to offer methodological criticisms of specific papers and even of entire lines of research. But…well, what Jason Brennan said.

Some Important Principles of Economics

What would I consider to be some important principles of economics that I want my students to walk away with?

1. Market processes promote long-term growth. The market processes of specialization, exploiting comparative advantage, and creative destruction have benefits that are widely dispersed in the long run. These processes impose short-term costs on those who have invested in physical and human capital made obsolete.

2. Competition is a regulatory mechanism. For example, the ability of a business to exploit consumers or workers is attenuated by competitive forces. Businesses do not like competition. They lobby for policies that stifle competition. Often, such lobbying is successful. Competition is far from perfect as a regulatory mechanism. It is possible to be too optimistic about how well it can work. It is also possible to be too optimistic about the prospects for fixing the flaws in markets using government regulation.

3. Human cooperation is difficult to achieve. The conditions under which large organizations can operate without internal friction are never satisfied. Aligning incentives is more difficult in the real world than it might appear to be in the abstract.

4. Economists need two hands. The conditions under which markets will produce optimal outcomes are never satisfied. The conditions under which a government official can act as an omniscient, benevolent central planner are never satisfied.

The 2013 Nobel Laureates Fama, Hansen, and Shiller

What they have in common is the “second moment.” In statistics, the first moment of a distribution is the mean, a measure of central tendency. The second moment is the variance, or spread. Politically, their views have a high second moment: ask the three of them the same policy questions in interviews, and the answers should differ widely.
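
In symbols, for a random variable $X$,

$$\mu = \mathbb{E}[X], \qquad \sigma^2 = \operatorname{Var}(X) = \mathbb{E}\big[(X - \mu)^2\big].$$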

Shiller is known for looking at “variance bounds” for asset prices. Previously, economists had tested the efficient market hypothesis by looking at mean returns on stocks or bonds. Shiller suggested comparing the variance of stock prices with the variance of discounted dividends. Thus, the second moment. He found that the variance of stock prices was much higher than that of discounted dividends, and this led him to view stock markets as inefficient. This in turn made him a major figure in behavioral finance.
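
In outline (my compressed restatement, not Shiller’s exact setup): let $p_t^*$ be the ex post rational price, the discounted value of the dividends actually paid afterward. If the market price is the rational forecast of that value, then

$$p_t = \mathbb{E}_t[p_t^*] \quad\Longrightarrow\quad p_t^* = p_t + u_t,$$

where the forecast error $u_t$ is uncorrelated with the forecast $p_t$, so that

$$\operatorname{Var}(p_t^*) = \operatorname{Var}(p_t) + \operatorname{Var}(u_t) \ \geq\ \operatorname{Var}(p_t).$$

Market efficiency thus caps the variance of prices, and that is the bound Shiller found violated.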

Fama was the original advocate for efficient markets. However, he was an empiricist. He verified an important implication of Shiller’s work: if stock prices vary too much, stock returns should exhibit long-run “mean reversion.” Basically, when the ratio of stock prices to a smoothed path of dividends is high, you should sell. Conversely, when the ratio is low, you should buy. Mean reversion, too, is a statement about second moments: it shows up as negative autocorrelation in long-horizon returns.
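
To see the logic, here is a small simulation sketch of mine (not Fama’s actual tests; every number is made up): if the log price/dividend ratio mean-reverts, a high ratio today predicts low cumulative returns over the next few years, which shows up as a negative slope in a long-horizon regression.

```python
import numpy as np

# Toy mean-reversion simulation: AR(1) log price/dividend ratio.
rng = np.random.default_rng(0)
T, horizon = 600, 36            # months of data; 3-year return horizon
phi = 0.98                      # persistence of the log P/D ratio

pd_ratio = np.zeros(T)
for t in range(1, T):
    pd_ratio[t] = phi * pd_ratio[t - 1] + rng.normal(scale=0.05)

# Stylized monthly returns: driven by changes in the ratio, plus noise.
ret = np.diff(pd_ratio) + rng.normal(scale=0.02, size=T - 1)

# Regress cumulative future returns on the current ratio.
future = np.array([ret[t:t + horizon].sum() for t in range(T - horizon)])
slope = np.polyfit(pd_ratio[:T - horizon], future, 1)[0]
print(f"slope of {horizon}-month future return on log P/D: {slope:.2f}")
# The slope comes out negative: high P/D today, low returns ahead.
```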

Finally, Hansen is the developer of the “generalized method-of-moments” estimator. This is a technique that is most useful if you have a theory that has implications for more than one moment of the distribution. For example, Shiller’s work shows that the efficient markets hypothesis has implications for both the first and second moment (mean and variance) of stock market returns.
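
As a toy illustration of the idea (a deliberately simple sketch, nothing like the full generality of Hansen’s estimator), here is an estimate that uses two moment conditions at once, one for the mean and one for the variance:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=1.5, size=1000)   # true mu=2, sigma^2=2.25

def g(theta):
    """Sample analogues of E[x - mu] = 0 and E[(x - mu)^2 - sigma2] = 0."""
    mu, sigma2 = theta
    return np.array([np.mean(x - mu), np.mean((x - mu) ** 2 - sigma2)])

def objective(theta):
    m = g(theta)
    return m @ m        # quadratic form with an identity weighting matrix

theta_hat = minimize(objective, x0=np.array([0.0, 1.0])).x
print(f"mu-hat = {theta_hat[0]:.2f}, sigma2-hat = {theta_hat[1]:.2f}")
```

With more moment conditions than parameters, the weighting matrix starts to matter, and choosing it well is part of what Hansen worked out.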

Although Tyler and Alex are posting about this Nobel, I think that John Cochrane is likely to offer the best coverage. As of now, Cochrane has written two posts about Fama.

In one post, Cochrane writes,

“efficient markets” became the organizing principle for 30 years of empirical work in financial economics. That empirical work taught us much about the world, and in turn affected the world deeply.

In another post, Cochrane quotes himself,

empirical finance is no longer really devoted to “debating efficient markets,” any more than modern biology debates evolution. We have moved on to other things. I think of most current research as exploring the amazing variety and subtle economics of risk premiums – focusing on the “joint hypothesis” rather than the “informational efficiency” part of Gene’s 1970 essay.

Cochrane’s point that efficient market theory is to finance what evolution is to biology is worth pondering. I do not think that all economists would agree. Would Shiller?

Some personal notes about Shiller, whom I encountered a few times early in my career.

1. His variance-bounds idea was simultaneously discovered by Stephen LeRoy and Dick Porter of the Fed. The reference is Stephen F. LeRoy and Richard D. Porter (1981), “The Present-Value Relation: Tests Based on Implied Variance Bounds,” Econometrica, Vol. 49, May, pp. 555-574. Some of the initial follow-up work on the topic cited LeRoy and Porter along with Shiller, but over time their contribution has been largely forgotten.

2. When Shiller’s Journal of Political Economy paper appeared (eventually his American Economic Review paper became more famous), I sent in a criticism. I argued that his variance bound was based on actual, realized dividends (or short-term interest rates, because I think that the JPE paper was on long-term bond prices) and that in fact ex ante forecasted dividends did not have such a bound. Remember, this was about 1980, and his test was showing inefficiency of bond prices because short-term interest rates in the 1970s were far, far higher than would have been implied by long-term bond prices in the late 1960s. I thought that was a swindle.

He had the JPE reject my criticism on the grounds that all I was doing was arguing that the distribution of dividends (or short-term interest rates) is unstable, and that if you use a long enough data series, that takes care of such instability. I did not agree with his view, and I still don’t, but there was nothing I could do about it.

3. When I was at Freddie Mac, we wanted to use the Case-Shiller-Weiss repeat-sales house price index as a check against fraudulent appraisals. (The index measures house price inflation in an area by looking at the prices recorded when the same house is sold in two different years.) I contacted Shiller, who referred me to Weiss. Weiss was arrogant and unpleasant during negotiations, and we gave up and decided to create our own index using the same methodology and our loan database. Weiss was so difficult that we actually had an easier time cooperating with Fannie on pooling our data, even though they had much more data at the time because they bought more loans than we did. Eventually, our regulator took over the process of maintaining our repeat-sales price index.
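
For the curious, here is a minimal sketch of the repeat-sales methodology (my own toy illustration with made-up numbers, not the CSW or Freddie Mac implementation): regress each house’s log price change on dummy variables for its two sale years, and read the index off the coefficients.

```python
import numpy as np

# Each row: (year of first sale, year of second sale, price1, price2).
sales = [
    (0, 2, 100_000, 110_000),
    (0, 3, 120_000, 140_000),
    (1, 3, 150_000, 165_000),
    (1, 2,  90_000,  95_000),
]
n_periods = 4                   # year 0 is the base period (index = 1)

# Design matrix: -1 in the first-sale year, +1 in the second-sale year.
X = np.zeros((len(sales), n_periods - 1))
y = np.zeros(len(sales))
for i, (t1, t2, p1, p2) in enumerate(sales):
    if t1 > 0:
        X[i, t1 - 1] = -1.0
    X[i, t2 - 1] = 1.0
    y[i] = np.log(p2 / p1)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
index = np.exp(np.concatenate([[0.0], beta]))
print("house price index by year:", np.round(index, 3))
```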

4. Here is my review of Shiller’s book on the sub-prime crisis. Here is my review of Animal Spirits, which Shiller co-wrote with George Akerlof.

Finally, note that Russ Roberts had podcasts with Fama on finance and Shiller on housing.

Pseudo-physics Backlash Watch

Bart Wilson quotes from the conclusion of Frank Knight’s The Ethics of Competition.

man’s relations with his fellow man are on a totally different footing from his relations with the objects of physical nature and to give up, except within recognized and rather narrow limits, the naïve project of carrying over a technique which has been successful in the one set of problems and using it to solve another set of a categorically different kind.

In 1924, this statement evidently did not affect the direction taken by the economics profession. Perhaps we might be more receptive to it today.

Macroeconomic Methodology

Here is the draft introduction and conclusion to my chapter on economics as history rather than physics.

Economists love to dress up as physicists. We like to put theories into the form of equations. However, there are important differences between physics and economics, and these differences are particularly pronounced in the case of macroeconomics…

1. Politicians and journalists want answers to questions such as, “How many jobs will (or did) a certain stimulus proposal create?” However, it is not possible to give reliable answers to such questions. Economists who purport to do so are misrepresenting the state of knowledge that actually exists.

2. Economists would like to know which theories are ruled out by the data and which theories are supported by the data. However, our ability to make statements along these lines is quite limited.

3. I believe that the study of macroeconomic events is going to have to be comparable to the study of revolutions, wars, or other historical events. There will be many plausible causal factors per event.

4. It will not necessarily be the case that the best explanations for macroeconomic events will be a single “model” that uses the same causal factors for every event. Instead, each important macroeconomic event may have important idiosyncratic elements involved.

5. Many very different explanations for an event will be consistent with the data.

6. Neither the use nor non-use of equations will ensure clarity or logical consistency. Confusion may be embedded in verbal descriptions of macroeconomic theories. Confusion also may be embedded in equations.

7. Neither verbal descriptions nor equations express verifiable relationships. Macroeconomic hypotheses will contain assumptions that will be highly contestable.

Pattern-Seeking

Temple Grandin and Richard Panek write,

Michael Shermer, a psychologist, historian of science, and professional skeptic – he founded Skeptic magazine – called this property of the human mind patternicity. He defined patternicity as “the tendency to find meaningful patterns in both meaningful and meaningless data.”

What all these examples tell me is that in society, the three kinds of minds — visual, verbal, pattern thinkers — naturally complement one another. When I recall collaborations in which I’ve successfully participated, I can see how different kinds of thinkers worked together to create a product that was greater than the sum of its parts.

I don’t know why it is written in the first person. I suppose that means that the thoughts belong to Grandin?

Anyway, finding meaningful patterns in both meaningful and meaningless data, which Grandin says describes people on the autistic spectrum, might also describe macroeconomists.

Noah Smith Picks Up the Theme

He writes,

In macro, most of the equations that went into the model seemed to just be assumed. In physics, each equation could be – and presumably had been – tested and verified as holding more-or-less true in the real world. In macro, no one knew if real-world budget constraints really were the things we wrote down. Or the production function. No one knew if this “utility” we assumed people maximized corresponded to what people really maximize in real life. We just assumed a bunch of equations and wrote them down. Then we threw them all together, got some kind of answer or result, and compared the result to some subset of real-world stuff that we had decided we were going to “explain”. Often, that comparison was desultory or token, as in the case of “moment matching”.

In other words, the math was no longer real. It was all made up. You could no longer trust the textbook. When the textbook told you that “Households maximize the expected value of their discounted lifetime utility of consumption”, that was not a Newton’s Law that had been proven approximately true with centuries of physics experiments. It was not even a game theory solution concept that had been proven approximately sometimes true with decades of economics experiments. Instead, it was just some random thing that someone made up and wrote down because A) it was tractable to work with, and B) it sounded plausible enough so that most other economists who looked at it tended not to make too much of a fuss.

I think that this is a well-expressed criticism, which Paul Krugman sidesteps in his response. I understand Krugman’s point to be that it is possible when expressing ideas verbally to say something that would be incoherent or self-contradictory if you were to try to express it in mathematical terms.

However, let us reflect on Smith’s point. Macroeconomic equations are not proven and tested. They are instead tentative and speculative. And macroeconomists have not been able to avoid allowing math to disguise this tentative, speculative quality of theory. Indeed, in the very same post in which Krugman defends math, he writes,

The basics of what happens at the zero lower bound aren’t complicated, but people who haven’t worked through small mathematical models — of both the IS-LM and New Keynesian type — generally get all tied up in verbal and conceptual knots.

In fact, it is pretty easy to understand the liquidity-trap argument without mathematical models. However, the idea embedded in IS-LM models that there is only one interest rate is controversial (in fact, it is downright false). The idea that the Federal Reserve runs out of things to buy when the Fed Funds rate is zero is controversial. The idea that an interest rate that is “close to zero” is the same as an interest rate that is zero is controversial. Yet Krugman appears so persuaded by his math that he cannot come to terms with anyone who disagrees with his view that the liquidity trap is an important characterization of the current U.S. economy.

I think that Noah Smith has expressed clearly and profoundly that macroeconomists who dress up like physicists are being tragically foolish. I think it is one of the best blog posts that I have ever read.

The idea of freeing macro from its pseudo-physics pretensions came up in Jag Bhalla’s post that I mentioned the other day. Perhaps it is something “in the air” right now. I hope so.

Falkenstein on Happiness Research

He makes three interesting points. (Pointer from Jason Collins)

I note many writers I otherwise admire, usually libertarian leaning, are quite averse to the Easterlin conclusion, thinking it will lead us to adopt luddite policies because growth would not matter in such a world

I am one of those libertarian writers who is averse to happiness research, but my aversion holds regardless of the conclusions reached. Happiness research embodies the claim that you, the researcher (I am not referring here to Falkenstein), can know more than me, the subject, about what gives me happiness. I believe that claim is false. Further, from a libertarian perspective, I believe that claim almost surely will lead you to devalue my liberty.

When an economist tells you a symmetric ovoid contains a highly significant trend via the power of statistics, don’t believe them: real effects pass the ocular test of statistical significance (ie, it should look like a pattern).

See his charts to understand his point. Putting Falkenstein’s point in more colloquial language, I would say that when the data consists of a blob of points, just because the computer can draw a line of best fit does not mean that you have demonstrated the existence of a meaningful linear relationship.
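
A quick simulation makes the point (a sketch of mine, not Falkenstein’s data): with enough observations, an almost patternless blob still yields a “statistically significant” slope, even though the fit explains essentially none of the variance.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 100_000
x = rng.normal(size=n)
y = 0.02 * x + rng.normal(size=n)          # almost pure noise

fit = stats.linregress(x, y)
print(f"slope   = {fit.slope:.4f}")
print(f"p-value = {fit.pvalue:.1e}")       # tiny: "highly significant"
print(f"R^2     = {fit.rvalue ** 2:.5f}")  # near zero: no visible pattern
```

The p-value passes any conventional threshold, but a scatter plot of the same data would fail the ocular test.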

evolution favors a relative utility function as opposed to the standard absolute utility function, and the evidence for this is found in ethology, anthropology, and neurology. Economists from Adam Smith, Karl Marx, Thorstein Veblen, and even Keynes focused on status, the societal relative position, as a motivating force in individual lives