Category Archives: PSST and Macro
Essay Backup: Measure Occupational Satisfaction
Why we tend to be negative and paranoid
Psychologists Paul Rozin and Edward Royzman coined the term negativity bias to describe this asymmetry: “Negative events are more salient, potent, dominant in combinations, and generally efficacious than positive events.”
. . .We tend to focus on the constellation of threats as signifying some systematic program aimed at doing us harm. This is a manifestation of what I call “agenticity”: our tendency to infuse patterns (especially patterns of threat or harm) with meaning, intention, and agency. And so we imagine that disconnected misfortunes are commonly directed by intentional agents, sometimes operating invisibly. Souls, spirits, ghosts, gods, demons, angels, aliens, governments, religious officials and big corporations all have played this role in conspiracist lore (and, in the case of the latter three entries, real life, too, it must be conceded). Taken together, patternicity and agenticity form the cognitive basis of conspiratorial cognition.
There are many other paragraphs in the essay that I wanted to excerpt.
We automatically search for patterns and for stories (preferably involving supposedly culpable individuals) to explain those patterns. Recall that Ed Leamer’s macroeconomics textbook is titled Macroeconomic Patterns and Stories. Blaming the Fed is the simplest conspiracy-theory-style explanation, which I try to resist.
Blaming every weather event on climate change would be another example.
Aggregate economic data
For the essay I am writing on Edward Leamer, I have been re-reading Macroeconomic Patterns and Stories. He is really good at diving into data. Most academic macroeconomists think that actually studying the way that statistics are collected and looking for patterns would be beneath them.
For example, Leamer found the pattern of momentum in payroll employment growth. From that standpoint, the period from October 2011 through September 2019 has been really uninteresting. There have been no sequences of three months of employment gains that were either well above or well below average. My rule of thumb is that less than 50,000 net jobs is a low-growth month and more than 350,000 is a high-growth month. Over the past eight years or so we have had only three low-growth months and one high-growth month (which barely made the cut at 355,000). So no real chance to test the momentum pattern, although I suppose you could count the persistence of middling numbers as “momentum.”
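To make that rule of thumb concrete, here is a minimal sketch in Python (with made-up monthly figures, not actual BLS payroll data) of how one might classify months and scan for the three-month runs that would test the momentum pattern:

```python
# Classify monthly net payroll changes (in thousands) with the rule of
# thumb above, and scan for runs of three consecutive high- or
# low-growth months -- the kind of sequence that would test momentum.
# The figures below are made up; a real check would use BLS data.

def classify(net_jobs):
    """Label a month by its net payroll change, in thousands."""
    if net_jobs < 50:
        return "low"
    if net_jobs > 350:
        return "high"
    return "middling"

def momentum_runs(series, run_length=3):
    """Return (start_index, label) for each run of consecutive
    high- or low-growth months of the given length."""
    labels = [classify(x) for x in series]
    runs = []
    for i in range(len(labels) - run_length + 1):
        window = labels[i:i + run_length]
        if len(set(window)) == 1 and window[0] != "middling":
            runs.append((i, window[0]))
    return runs

monthly_gains = [180, 220, 45, 265, 310, 355, 190, 240, 160]
print([classify(x) for x in monthly_gains])
print(momentum_runs(monthly_gains))  # [] -- a middling stretch gives no test
```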
Anyway, I have the following thoughts, not all of which come from Leamer.
1. The short-term fluctuations in GDP and net employment changes that we call recessions, even deep ones, are really small relative to: long-term growth; seasonal fluctuations; secular shifts between sectors, e.g. employment rising in health care and education but declining in manufacturing; and gross job flows, with 4 million workers leaving jobs and 4 million workers starting new jobs every month. On seasonal fluctuations, Leamer points out that real GDP on average drops at a 20 percent annual rate in the first quarter of the year, before the Commerce Department does its “seasonal adjustment” of the data. That is more than twice the rate at which seasonally adjusted real GDP falls during a bad recession. (A note on reading annualized rates follows these three points.)
2. Can we say that the process for calculating real GDP in 2019 resembles the process in 2009? In 1999? In 1979? In 1959? In 1929? It would be an interesting exercise to go back to the raw data collected by the Commerce Department (or by Simon Kuznets prior to that) to estimate GDP. Fifty years ago, the government statisticians were calling manufacturing firms and getting counts of steel production or automobile assemblies or what have you. Now those figures are a much smaller fraction of GDP. How are the statisticians calculating the output of hospitals, of non-profit organizations, etc.? Academic economists don’t want to know how the statistical sausage gets made, but that seems to me to be a serious oversight.
3. Overall, I recommend being very wary of macroeconomic analysis that purports to give trends in productivity or to compare real income today with real income in past decades. The range of different answers that you can get using reasonable alternative methods for constructing the data far exceeds the variation in the phenomena that you are trying to assess.
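Since annualized rates are easy to misread, here is the arithmetic behind that seasonal comparison in point 1 (the standard compounding convention, nothing specific to Leamer): a 20 percent annual rate of decline sustained for one quarter lowers the level by

$$(1 - 0.20)^{1/4} - 1 \approx -0.054,$$

i.e., roughly 5.4 percent within the quarter itself, not 20 percent.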
A non-virtue of nationalism
Alberto Mingardi writes (in an email to followers of the Bruno Leoni Institute that he heads),
Nationalism, as political rhetoric, is quite incompatible with a concept that is key to classical liberalism: that of the sovereignty of the consumer, as it has been called, perhaps with an infelicitous term. It is difficult to ‘nationalize’ consumers, very easy to ‘nationalize’ producers.
In Specialization and Trade, I pointed out that we are much more concerned with conditions in the one market in which we produce than with conditions in any one of many markets in which we consume. If we are going to use our political voice in a market, it is going to be as a producer rather than as a consumer. Economic nationalism, by strengthening political voice rather than markets, is bound to favor producers rather than consumers. In short, Mingardi has a point.
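To put rough, made-up numbers on that asymmetry: suppose you earn all of your income in the one market where you produce and spread your spending evenly across fifty markets. A 10 percent price movement then changes your real income by about

$$0.10 \times 1 = 0.10 \ \text{as a producer}, \qquad 0.10 \times \tfrac{1}{50} = 0.002 \ \text{as a consumer in any one market},$$

a fifty-to-one difference in stakes, which is why political voice organizes around production.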
A review of Specialization and Trade
Niepelt writes,
Kling’s criticism of contemporaneous macroeconomics reads like a criticism of the kind of macroeconomics still taught at the undergraduate level. But modern macroeconomics has moved on—it is general equilibrium microeconomics. Its primary objective is not to produce the one and only model for economist-engineers or “experts” to use, but rather to help us understand mechanisms. A good expert knows many models, is informed about institutions, and has the courage to judge which of the models (or mechanisms they identify) are the most relevant in a specific context. We don’t need a new macroeconomics. But maybe we need better “experts.”
To me, this sounds as though Niepelt is happier than I am with what he calls general equilibrium microeconomics and what Paul Krugman (rightly) calls Dark Age Macro. Of course, Krugman wants to go back to old-time Keynesian religion, which I reject. I would refer Niepelt to my macro memoir essay.
Wages and productivity
Scott Alexander writes,
Median wages tracked productivity until 1973, then stopped. Productivity kept growing, but wages remained stagnant.
This is called “wage decoupling”. Sometimes people talk about wages decoupling from GDP, or from GDP per capita, but it all works out pretty much the same way. Increasing growth no longer produces increasing wages for ordinary workers.
Is this true? If so, why?
He makes a valiant effort to summarize and assess the economic literature. But this is where orthodox economics is hopeless.
Productivity by definition is output divided by the amount of labor input. Let me make three points:
1. You can’t measure the numerator very well.
2. You can’t measure the denominator very well.
3. The U.S. is not just one big GDP factory. Both the numerator and the denominator are affected by shifts in the composition of the economy, even if actual productivity and wages are not changing at all.
The numerator is output. How many people work in businesses with measurable output? Scott Alexander doesn’t. I never have. Most of my readers never have. There are entire industries, like health care, education, and finance, where we do not have any idea how to measure output. And even within an industry that has quantifiable output, we still have the issue that, as Garett Jones pointed out many years ago, most workers are not engaged in actual production; they are building organizational capabilities. Even if the factory managers can count widget production, they cannot measure the productivity of the tax accountants or of the team developing a new marketing initiative.
The denominator is labor input. But most of labor input consists of human capital. To measure labor input, you need to be able to measure quality, not just quantity. What is the incremental value of X years of schooling and Y years of experience? We do not have a reliable way to do that. One approach uses wage rates as an indicator of quality, but that amounts to assuming that productivity and wage rates are tightly coupled, which assumes away the very question that Alexander is raising.
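To see the circularity, here is a minimal sketch of the wage-weighting approach (the notation is mine, not from any particular study): with hours $h_i$ and wage $w_i$ for worker type $i$, and average wage $\bar{w}$, quality-adjusted labor input is

$$L^{*} = \sum_i \frac{w_i}{\bar{w}}\, h_i ,$$

so measured productivity $Y / L^{*}$ has the wage structure built into its denominator. The wage-productivity link is imposed by construction rather than tested.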
We are not in a GDP factory. As the share of GDP devoted to health care and education goes up and the share devoted to manufacturing goes down, we are giving more weight to a sector where real output and the quality of labor input are extremely difficult to measure.
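A minimal decomposition makes the composition point precise (again, notation mine): with sector labor shares $s_i = L_i / L$ and sector productivities $p_i = Y_i / L_i$,

$$\frac{Y}{L} = \sum_i \frac{L_i}{L} \cdot \frac{Y_i}{L_i} = \sum_i s_i\, p_i ,$$

so measured aggregate productivity moves whenever labor shifts among sectors, even if every within-sector $p_i$ stays constant. Shifting weight toward hard-to-measure sectors compounds the problem.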
I think that for economists to say anything useful about productivity and wages, they should try to study individual units of individual firms. My guess is that if you were to undertake such a study, you would be overwhelmed by doubts about the precision of your measurements and the difficulty of obtaining a decent signal-to-noise ratio. It is perverse to instead look at the aggregate statistics cranked out by the Commerce Department and the Labor Department and pretend that they are 100 percent signal.
Klassic: my macro memoir
My idiosyncratic take on the intellectual history of macroeconomics. I’ve gotten a couple of emails from academic economists praising it. But intellectual history is much under-valued in the economics profession, and this long essay/short book deserves a much bigger audience than it has had. To me, it seems like a jolly good read.
Kotlikoff on PSST
Kotlikoff writes,
Economics has many theories of economies rapidly flipping from good to bad. They go under the headings multiple equilibrium, contagion, self-fulfilling prophecy, panics, coordination failures, strategic complementarities, sun spot equilibria, collective action, social learning, and herding.
Pointer from John Cochrane, who offers extensive comments. Read his whole post.
Kotlikoff argues that the mere perception that the economy was in trouble was enough to cause trouble.
Employers laid off their workers in droves to lower their payrolls before their customers stopped arriving. This was the worst of the many types of multiple equilibria associated with the GR [Great Recession].
…The slow recovery is hard to explain except as the result of everyone expecting a slow recovery.
I see this as a PSST story. Patterns of specialization and trade depend on business managers’ confidence that those patterns will continue.
I, too, have been thinking a lot about the contingent nature of economic outcomes. I am mulling an essay that will strongly criticize the view that the market acts like a set of equations providing a deterministic solution for given tastes, technology, and resource endowments. Instead, there are multiple equilibria that depend on people’s perceptions and beliefs.
Part of my argument is that hardly anyone in the economy has a measurable marginal product. Most of us are producing intangible output, even if we work in goods industries. For example, relatively few of the employees at a pharmaceutical company are actually bottling pills.
Businessmen operate in a world of high fixed costs, with mostly overhead labor. If they doubt that revenue is going to remain at current levels, one response is to cut costs by reducing overhead labor. If enough firms do this at once, their fears of recession become self-fulfilling. This sort of self-fulfilling recession is particularly easy to fall into when there is a financial crisis.
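Here is a toy simulation of that story (a sketch of my own, not Kotlikoff’s or Cochrane’s model, with made-up parameters): each firm keeps its overhead workers only if it expects demand above a threshold, and demand in turn depends on how many firms keep their workers. Initial beliefs select which equilibrium the economy lands in.

```python
# Toy model of a self-fulfilling recession: identical firms each
# retain their overhead workers only if expected demand clears a
# threshold, and realized demand rises with the share of firms that
# retain. All parameters are made up for illustration.

THRESHOLD = 0.6  # demand level needed to justify keeping overhead labor

def realized_demand(share_retaining):
    """Demand rises with employment: a simple linear spillover."""
    return 0.3 + 0.7 * share_retaining

def iterate(expected_demand, rounds=50):
    """Firms act on beliefs; beliefs then update to realized demand."""
    share = 0.0
    for _ in range(rounds):
        share = 1.0 if expected_demand >= THRESHOLD else 0.0
        expected_demand = realized_demand(share)
    return expected_demand, share

print(iterate(0.9))  # optimism:  (1.0, 1.0) -- everyone retains workers
print(iterate(0.4))  # pessimism: (0.3, 0.0) -- a self-fulfilling slump
```

The same mechanics support both outcomes; only the starting beliefs differ.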
Kotlikoff’s main point, which Cochrane emphasizes and supports, is that the financial crisis of 2008 was itself a sudden shift in perceptions that took place in the context of an inherently fragile financial system.
The drop in potential GDP
Robert Murphy points me to Paul Krugman doing economic analysis.
The analysis concerns potential GDP, which is a concept that plays a big role in a paradigm that I have come to reject, that of aggregate supply and demand. But let’s roll with it, at least for a while. Potential GDP is what GDP would be at full employment, meaning that there is no shortfall in aggregate demand. Another term for potential GDP is long-run aggregate supply.
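In the standard textbook notation (nothing specific to Krugman’s analysis), the object of interest is the output gap,

$$\text{gap}_t = \frac{Y_t - Y_t^{*}}{Y_t^{*}},$$

where $Y_t$ is actual real GDP and $Y_t^{*}$ is potential GDP; a negative gap is read as a shortfall of aggregate demand.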
More below the fold.