A Shortage Explained Using Textbook Economics

Alex Tabarrok writes,

California has plenty of water…just not enough to satisfy every possible use of water that people can imagine when the price is close to zero. As David Zetland points out in an excellent interview with Russ Roberts, people in San Diego County use around 150 gallons of water a day. Meanwhile, in Sydney, Australia, with a roughly comparable climate and standard of living, people use about half that amount. Trust me, no one in Sydney is going thirsty.

I often complain about textbook economics, but certainly this is an example where it offers important insights. When you see a “shortage,” look for the artificially low price.

The price system fosters order. Repression of the price system leads to disorder.
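
The mechanics fit in a few lines. Here is a minimal sketch in Python with hypothetical linear demand and supply curves; the numbers are invented for illustration, not calibrated to California:

```python
# Textbook shortage: quantity demanded exceeds quantity supplied when
# the price is held below the market-clearing level. Hypothetical
# linear curves; all numbers are made up.

def demand(price):
    """Gallons per person per day demanded at a given price."""
    return 200 - 100 * price

def supply(price):
    """Gallons per person per day supplied at a given price."""
    return 50 + 50 * price

# Market clears where 200 - 100p = 50 + 50p, i.e. p* = 1.0
p_star = (200 - 50) / (100 + 50)
print(p_star, demand(p_star), supply(p_star))      # 1.0 100.0 100.0

# Hold the price near zero instead:
p_low = 0.1
print(demand(p_low), supply(p_low))                # 190.0 55.0
print("shortage:", demand(p_low) - supply(p_low))  # 135.0
```

At the capped price, people want 190 gallons while only 55 are supplied; the 135-gallon gap is the “shortage,” and it disappears when the price is allowed to rise.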

MIT Economics and Academic Prejudice

The MIT economics department’s dominance was fading just as I entered grad school there. David Warsh, himself a long-time chronicler of the department, reviews a book edited by E. Roy Weintraub on the golden age of economics at the Institute.

A sixth factor, advanced by Weintraub in the Transformation volume, argues that the rise of MIT stemmed from its willingness to appoint Jewish economists to senior positions, starting with Samuelson himself. Anti-Semitism was common in American universities on the eve of World War II, and while most of the best universities had one Jew or even two on their faculties of arts and sciences, to demonstrate that they were free of prejudice, none showed any willingness to appoint significant numbers until the flood of European émigrés after World War I began to open their doors. MIT was able to recruit its charter faculty – Maurice Adelman, Max Millikan, Walt Rostow, Paul Rosenstein-Rodan, Solow, Evsey Domar and Franco Modigliani were Jews – “not only because of Samuelson’s growing renown,” writes Weintraub, “…but because the department and university were remarkably open to the hiring of Jewish faculty at a time when such hiring was just beginning to be possible at Ivy League universities.”

Pointer from Mark Thoma. My Swarthmore College professor Bernie Saffran emphasized the anti-Semitism factor as well. Bernie’s version was that Harvard’s anti-Semitism made Samuelson feel that he would be better off at MIT, and that once there he set about hiring Jewish economists to build a superior department, pointedly punishing Harvard. It took almost three decades (roughly from the end of World War II to the late 1970s) for Harvard to come back.

Economists generally view prejudice by a firm as unsustainable, because that firm will lose out to competitors. The lesson I take from the Harvard-MIT story is that in academia prejudice can persist for a while, with long-term detrimental effects. Consider that as you read stories about prejudice against conservatives.

Read Warsh’s entire article, which covers much more ground.

UPDATE: For more on the economics of discrimination, check out the links on David Henderson’s post.

Should the CBO Use Dynamic Scoring?

John Cochrane writes,

Greg Mankiw has a nice op-ed on dynamic scoring.

The issue: When the Congressional Budget Office “scores” legislation, figuring out how much it will raise or lower tax revenue and spending, it has been using “static” scoring. For example, it assumes that a tax cut has no effect on GDP, even if the whole point of the tax cut is to raise GDP.

My thoughts.

1. I am against dynamic scoring. Dynamic scoring means using an economic model, and I think that politicians and the press give too much credence to economic models as it is. Even static scoring requires some modeling, but that modeling is closer to spreadsheet arithmetic than to claiming to predict economic behavior.

2. To the extent that the CBO has to predict economic behavior, I think it should present several scenarios, as opposed to a point estimate or a range. Cochrane says it well:

It’s a fact, we don’t know the elasticities, multipliers, and mechanisms that well. So stop pretending. Stop producing only a single number, accurate to three decimals. Instead, present a range of scenarios spanning the range of reasonable uncertainty about responses.
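
To make the contrast concrete, here is a sketch of what scenario-style scoring might look like. The baseline revenue, the size of the tax cut, and the feedback fractions are all invented for illustration:

```python
# Scenario-style scoring: report the revenue effect of a tax cut under
# a range of assumed behavioral feedbacks rather than a single point
# estimate. All figures are hypothetical.

baseline_revenue = 1000.0   # annual revenue, $ billions (made up)
rate_cut = 0.10             # a 10 percent across-the-board rate cut

# feedback = fraction of the static revenue loss recouped via higher
# GDP; a feedback of 0.0 reproduces conventional static scoring.
for feedback in (0.0, 0.2, 0.4, 0.6):
    static_loss = baseline_revenue * rate_cut
    scored_loss = static_loss * (1 - feedback)
    print(f"feedback {feedback:.1f}: revenue loss ${scored_loss:.0f} billion")
```

The honest output is the whole table, not any one row.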

Responding to another point from Cochrane, Mankiw writes,

you need to specify how the government is going to satisfy its present-value budget constraint. You might be tempted to ask the model what happens if the government cuts taxes and never does anything else. But you won’t get very far. The model will tell you that the government has to do something else eventually, and it won’t tell you what will happen if the government tries to do something impossible.
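
For reference, one standard way to write the constraint Mankiw has in mind (the notation is mine, not his) is: with $T_t$ tax revenue, $G_t$ non-interest spending, $r$ the interest rate, and $B_0$ the outstanding debt,

$$\sum_{t=0}^{\infty} \frac{T_t - G_t}{(1+r)^t} = B_0.$$

A permanent tax cut with nothing else changed pushes the left-hand side below $B_0$; that is the “something impossible” the model refuses to evaluate.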

What I hear Greg saying is that to properly do dynamic scoring, you would need to include a model of future policy responses. That is a point well taken, but I am not sure that I would restrict those policy responses to things that are possible. Policy makers are doing impossible (that is, unsustainable) things now. The challenge is to predict the outcome of undertaking unsustainable policies until you no longer can.

Of course, the traditional “static” scoring does not solve the problem of how to predict the outcome of unsustainable policies, either.

Paul Krugman Sentences I Might Have Written

I certainly agree with this:

the professional economists who either play important roles in making policy or appear to have influence on the discussion got their Ph.Ds from MIT in the second half of the 1970s. An incomplete list, with dates of degree:

Ben Bernanke 1979
Olivier Blanchard 1977
Mario Draghi 1976
Paul Krugman 1977
Maurice Obstfeld 1979
Kenneth Rogoff 1980

Larry Summers was at Harvard during the same period, but he was an MIT undergrad and very much part of that intellectual circle. Also, just about everyone on the list studied with Stan Fischer, who remains very much in the middle of policy-making.

Note that we are talking about macroeconomic policy. But some important microeconomic policy makers came out of that period as well. Carl Shapiro comes quickly to mind.

Of course, Krugman has other sentences that I could not have written.

Analytically, empirically, the MIT style has had an astonishing triumph.

As you know, I think that macroeconomic data can be twisted to “prove” any theory. You can look at reasonable, credible blog posts by Scott Sumner or Tyler Cowen pointing out many discrepancies between recent macroeconomic performance and the Krugman-style Keynesian analysis. Empirical macroeconomics seems to me to boil down to a pure exercise in confirmation bias.

As you also know, I have a less exalted view of MIT’s approach to economics and of Stan Fischer’s role as the Genghis Khan of macro. See my recent post on academic hiring networks, my memoirs of a would-be macroeconomist, or my recent essay on camping-trip economics. Read that essay next to Krugman’s post.

Reluctant Heroes Austan Goolsbee and Alan Krueger

They write,

It is fair to say that no one involved in the decision to rescue and restructure GM and Chrysler ever wanted to be in the position of bailing out failed companies or having the government own a majority stake in a major private company. We are both thrilled and relieved with the result: the automakers got back on their feet, which helped the recovery of the U.S. economy. Indeed, the auto industry’s outsized contribution to the economic recovery has been one of the unexpected consequences of the government intervention.

Pointer from Tyler Cowen.

I guess there is no such thing as the seen and the unseen. For those of you who do not know, Goolsbee and Krueger were officials in the Obama Administration as the bailout was being executed. Here, if their arms do not break from patting themselves on the back, it won’t be for lack of trying.

Timothy Taylor, I question the editorial decision to publish this piece, even if you also include an article that challenges the auto bailouts. Could you not find a neutral party to tell the pro-bailout side? If not, then what does that tell us?

The Closed Network of Faculty Hiring

Colleen Flaherty reports on a study of academic hiring.

The study, published this week in Science Advances, is based on hand-curated data about placements of 19,000 tenure-line faculty members in history, business and computer science at 461 North American institutions with doctoral programs. Using a computer-aided, network-style analysis, the authors determined that just 25 percent of those institutions produced 71 to 86 percent of tenure-line professors, depending on discipline.

My guess is that if you were to study the economics field, the concentration would be even higher.

You can think of this as a very natural equilibrium. A few graduate schools get good reputations, so they attract a huge share of the best students. With the best students, they place their graduates well, which in turn reinforces their ability to attract the best students, and so on.
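
Here is a toy simulation of that feedback loop. The parameters (20 departments, 100 students a year, reputation proportional to past placements) are made up, and this is a sketch of the mechanism, not of the model in the study:

```python
import random

# Toy model of the reputational feedback loop in faculty hiring:
# students choose departments in proportion to past placement success,
# and placement success in turn tracks enrollment.

random.seed(0)
reputation = [1.0] * 20            # 20 departments start out identical

for year in range(50):
    # 100 new students pick departments weighted by current reputation
    picks = random.choices(range(20), weights=reputation, k=100)
    for dept in picks:
        reputation[dept] += 0.1    # placements accrue with enrollment

top5_share = sum(sorted(reputation)[-5:]) / sum(reputation)
print(f"top 5 of 20 departments: {top5_share:.0%} of total reputation")
```

Starting from perfect equality, small early luck compounds until a handful of departments dominate, which is the flavor of concentration the study reports.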

The actual quality of teaching does not matter in this equilibrium. In fact, MIT, where I went, had a reputation for caring more about teaching than other top departments. In hindsight, the few classmates with whom I keep in touch think that the classroom instruction was really poor. Some of the courses in the required sequences were a complete waste of time (Ray Fair, I’m looking at you.) Tom Rothenberg, who visited for a semester from Berkeley (and then quickly fled the Boston weather), was by far the best teacher we had. Indeed, the core was taught much more by visiting professors (Fair, Dixit, Begg, Rothenberg) than by permanent faculty, which tells you how much the permanent faculty really cared about teaching.

I don’t have a problem with an equilibrium in which the best students are attracted to a few departments. But when only a few dissertation advisers control entire sub-fields, you get a dreadful monoculture. I have said many times that I think that macro suffered from this. The field became completely dominated by students of Thomas Sargent at Minnesota and Stan Fischer and Rudi Dornbusch at MIT. Together, they produced nothing but a giant garbage factory.

Response to a Comment

Concerning Gregory Clark’s findings that multigenerational mobility is low, a commenter writes,

I still can’t believe things are quite as static as he makes them out to be, but I don’t know enough to dispute any of his specific findings. The model of human social behavior I carry around in my brain just doesn’t match the one he presents.

One thing we know is that there is high variance in outcomes across siblings. Back when people had many children, it may have been that if you were well off, it was very likely that at least one of your grandchildren would be well off, but not so likely that every one of them would be. With people having fewer children, either multigenerational mobility will go up, or other forces (such as stronger assortative mating) will offset what would otherwise be an increase in random variation across generations.
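
A back-of-the-envelope version of that point, treating each grandchild’s outcome as an independent draw with some probability p of being well off (the value of p is made up):

```python
# With many grandchildren, it is very likely that at least one is well
# off and very unlikely that all are; with few grandchildren, even the
# at-least-one probability falls. p is invented for illustration.

p = 0.4   # assumed chance that any one grandchild ends up well off

for n in (2, 4, 8, 16):   # number of grandchildren
    at_least_one = 1 - (1 - p) ** n
    all_of_them = p ** n
    print(f"{n:2d} grandchildren: P(at least one) = {at_least_one:.2f}, "
          f"P(all) = {all_of_them:.4f}")
```

Shrinking family size pulls the at-least-one probability down toward p itself, which is the sense in which smaller families should raise measured mobility unless something else offsets it.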

Macroeconomics and Expertise

As part of a general blog conversation, Noah Smith writes,

Scott Sumner expresses incredible confidence that NGDP targeting is best. Paul Krugman expresses incredible confidence that fiscal stimulus is effective and that austerity is counterproductive. John Cochrane expresses incredible confidence that structural reform – removing “sand in the gears” – is the best medicine for an economy in recession. Robert Lucas said that the “central problem of depression prevention has been solved.” And so on, and so forth.

I think normal people realize that that certitude is basically never warranted. Yes, those economists often (but not always) have some evidence to back up their claims. But not the kind of evidence that people have in disciplines where data is more abundant, controlled, and replicable.

Pointer from Mark Thoma. I disagree that this is how normal people think. I believe that the main problem with non-expert opinion in macroeconomics is that normal people can be just as dogmatic in their macro views as self-proclaimed experts.

I would amend the last paragraph to substitute “thoughtful economists” for “normal people.” With that amendment, the quoted passage would have my vote.

Probability and Causal Density

“Scott Alexander” writes,

If ten different factors caused the decline in crime, that would require that ten different things suddenly changed direction, all at the same time in 1994. That’s a pretty big coincidence…we should give some credibility penalty to a story with ten factors.

I do not buy this argument. I do not think that one should automatically penalize a study more for claiming that there are ten factors rather than one prominent factor.

My view would be that when there is a lot of causal density, one should be skeptical of any study that claims to have the answer, whether the answer consists of one factor or many. Take as an example the financial crisis of 2008. There are many plausible causal factors. Should we prefer a study that attributes the crisis entirely to one factor rather than a study that attributes it to a combination of factors? I think not. Instead, given the non-experimental nature of the problem, I think we need to accept the fact that we will have to live with some uncertainty about what exactly caused the crisis.
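
A simulated illustration of the problem: with many correlated candidate factors and only observational data, a regression can fit the outcome well while the estimated contribution of each factor swings wildly from sample to sample. Everything below is synthetic; it is not crime or crisis data:

```python
import numpy as np

# Ten correlated candidate factors, one outcome, no experiment.
# Bootstrap refits show how unstable the factor attributions are,
# even though every fit looks respectable.

rng = np.random.default_rng(0)
n, k = 40, 10                        # few observations, many factors
true_beta = rng.normal(0, 1, k)

common = rng.normal(0, 1, (n, 1))    # shared trend makes factors correlated
X = 0.9 * common + 0.1 * rng.normal(0, 1, (n, k))
y = X @ true_beta + 0.1 * rng.normal(0, 1, n)

for trial in range(3):               # refit on bootstrap resamples
    idx = rng.integers(0, n, size=n)
    beta_hat, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    print(np.round(beta_hat[:5], 1))  # attributions jump around each time
```

The data cannot adjudicate among the factors, whether you insist on one of them or allow all ten.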

For a phenomenon that is amenable to replicable experiments, it may be possible to obtain evidence against causal density and in favor of an explanation based on one or two factors. But not for something like the drop in crime over the past two decades.

For Econ Grad Students

Nick Rowe explains the overlapping generations model.

Imagine an infinite line of people, each holding one beer. One equilibrium is where each person drinks one beer. But there is a second equilibrium, where each person gives his beer to the person in front. The person at the front of the line drinks two beers, and everyone else drinks one. The second equilibrium is Pareto Superior to the first, because the person at the front of the line drinks more beer, and everyone else is the same. You can imagine the first person in line giving the person second in line a bit of paper, in exchange for the beer. That bit of paper (money) travels down the line in exchange for the beers traveling up the line.
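
Taking Rowe’s story completely literally, here is the pass-the-beer equilibrium computed for the first N people of the line. The truncation at N is my own simplification; the infinite tail is what makes the trick work, and it shows up as the zero at the end:

```python
# Two equilibria of Rowe's beer line, computed for the first N people
# of the (conceptually infinite) line.

N = 10
autarky = [1] * N      # equilibrium 1: everyone drinks their own beer

trade = [1] * N        # equilibrium 2: pass your beer forward
for i in range(1, N):
    trade[i] -= 1      # hand beer to the person in front (get paper back)
    trade[i - 1] += 1  # receive the beer from the person behind

print(autarky)  # [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
print(trade)    # [2, 1, 1, 1, 1, 1, 1, 1, 1, 0]
# The trailing zero is an artifact of cutting the line off at N: in the
# infinite line there is no last person, which is exactly why the trade
# equilibrium can be Pareto superior.
```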

I think that the OLG model is perhaps the stupidest idea in monetary economics. But if you have to learn it, at least go to Nick’s post in order to understand it.

The OLG model is probably a reasonably good way to think about Social Security. I think that this was probably the original intent. I think that the profession would be better off if it had never been viewed as a way to think about money.

[update: Roger Farmer has an excellent overview, and he points out that Samuelson did originally intend it as a model of money.]