How to Change Minds

Maria Popova writes,

Nearly half a millennium before modern psychologists identified the three elements of persuasion — attunement, buoyancy, and clarity — French physicist, philosopher, inventor, and mathematician Blaise Pascal (June 19, 1623–August 19, 1662) intuited this mechanism as he arrived at a great truth about the secret of persuasion: Pascal came to see that the surest way of defeating the erroneous views of others is not by bombarding the bastion of their self-righteousness but by slipping in through the backdoor of their beliefs.

Pointer from Olivia Goldhill.

Borrowing a Hansonian locution, I would say that argument is not about changing minds. Instead, it is about playing status games. You make points that lower the status of those with whom you disagree, and this in turn raises your status among those with whom you agree.

As Popova’s article explains, if your goal is to change someone’s mind, then the best approach is to start by talking about what seems right about the person’s beliefs. Then allow the person to come around to the problems with their thinking and, ultimately, to the better alternative.

Perhaps my Three Languages of Politics can be useful in this regard.

Price Discrimination Explains Everything

A commenter writes,

Standard economics that is taught in intro economics says that higher prices induce greater supply. Under certain restrictive assumptions that is generally true.

But there are exceptions. One is when the bulk of the firm’s costs are fixed and variable costs are a minor cost of bringing the product to market.

I tend to think of this as the rule rather than the exception, and I agree that introductory economics classes should give it more attention. I would say to my high school econ students that price discrimination explains everything. By that, I mean that in real-world business, the challenge is often recovering fixed costs. Marginal cost is often low, so you want to charge a low price for use on the margin. But you also want to extract a high price from the customers who are willing to pay more.

For one example among many, if you are a cell phone service provider, rather than charge the same price per unit of data for all customers, you try pricing tiers: $X per month for 0-2 gigs, $Y per month for 2-4 gigs, etc. You are trying to recover more of your fixed costs from the big users, and at the same time your users feel that the marginal cost is zero, at least until they bump up against the next tier of usage.
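
Here is a minimal sketch of that tiering logic, with invented tiers and prices (the $X and $Y above are placeholders, so every number below is purely illustrative):

```python
# Illustrative tiered pricing with hypothetical numbers. Within a tier the
# marginal price is zero; heavy users contribute more toward fixed costs.

TIERS = [
    (2.0, 30.0),            # up to 2 GB: $30/month (hypothetical)
    (4.0, 45.0),            # up to 4 GB: $45/month (hypothetical)
    (float("inf"), 60.0),   # above 4 GB: $60/month (hypothetical)
]

def monthly_bill(gigs_used: float) -> float:
    """Return the flat monthly fee for the tier covering the usage."""
    for cap, fee in TIERS:
        if gigs_used <= cap:
            return fee

# Light and heavy users pay different effective prices per gigabyte,
# even though each faces a zero marginal price inside their own tier.
for usage in (1.5, 3.9, 10.0):
    bill = monthly_bill(usage)
    print(f"{usage:>5.1f} GB -> ${bill:.0f}/month (${bill / usage:.2f}/GB)")
```

The heavy user pays the most in total, recovering more of the fixed cost, yet faces a zero marginal price until bumping into the next tier.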

Cable company bundling is another classic example of how price discrimination explains everything. So are store coupons. So is the price of popcorn at movies.

Evidence-Based Policy

Robert Doar writes,

Momentum for evidence-based policymaking is building at all levels of government, from federal legislation funding rigorous evaluations to the bipartisan Commission on Evidence-Based Policymaking to counties looking to make funding decisions based on results.

I am afraid that my reaction is to be cynical. When you make funding decisions for programs based on evidence, what will change will be the reported evidence, not the programs.

During the Vietnam War, Secretary of Defense Robert McNamara was famous for demanding statistical evidence that strategies were working. He got what he was asking for, but the statistical evidence did not capture what was really happening.

To cite another example, the Stiglitz-Orszag paper on Fannie Mae and Freddie Mac appeared to be evidence-based. Recall that they wrote,

This analysis shows that, based on historical data, the probability of a shock as severe as embodied in the risk-based capital standard is substantially less than one in 500,000 – and may be smaller than one in three million. Given the low probability of the stress test shock occurring, and assuming that Fannie Mae and Freddie Mac hold sufficient capital to withstand that shock, the exposure of the government to the risk that the GSEs will become insolvent appears quite low.

Within any organization, including a profit-seeking business, one has to be cynical about “evidence.” Show me a CEO who always believes every report he or she receives from middle management, and I will show you a company that is at high risk for going bankrupt very soon. I have never been a large-company CEO, but if I were I would make a point of setting up internal checks and balances so that I did not have to rely on any one set of carefully crafted reports.

You are entitled to ask, “How can you be against evidence? Evidence is bound to make policies better than if evidence is ignored.”

My response is that I am afraid that evidence will be distorted to make spending programs and regulations appear better than they really are. I will take public choice theory over misleading evidence, any day.

A Searle Thought to Ponder

In Making the Social World, John R. Searle writes,

If we assume that democracies are defined in part by majority rule as expressed in elections, then another feature of successful stable democracies is that few, if any, of the important problems of life are determined by elections. Such questions as who will live and who will die, who will be rich and who will be poor, cannot be decided by elections if the country is to be stable. Why not? Elections are too unpredictable for people to be able to plan and live their lives based on the outcome of elections. If you knew that if your opponents won the next election, you were likely to be thrown into a concentration camp, or executed, or have all your property confiscated, you could not make stable and enduring life plans. In successful democracies, it does not matter who gets elected. . .I have noticed that life pretty much goes on after the election as it did before, regardless of who gets elected.

Some thoughts.

1. It is of course in the interest of political activists and journalists to argue otherwise–that “this is the most important election in history,” that the wrong choice will lead to disaster, etc. Their warnings typically do not turn out to be valid, although some day that could change.

2. This is an argument for keeping the stakes in politics low, and thus the argument tends to weigh in on the libertarian side of things. However, I doubt that those who favor activist government will think much of the point that “elections are too unpredictable.”

What I’m Reading

1. Philosopher John R. Searle’s Making the Social World, published in 2010. One excerpt:

How do governments, so to speak, get away with it? That is, how does the government manage to be accepted as a system of status functions superior to other status functions?. . .governmental power is a system of status functions and thus rests on collective recognition or acceptance, but the collective recognition or acceptance, though typically not itself based on violence, can continue to function only if there is a permanent threat of violence

…All political power is a matter of status functions, and for that reason all political power is deontic power.

For some reason, my brain keeps wanting to read “deontic” as “demonic.”

Anyway, I think of a status function as a social convention that assigns people or objects certain properties. I think of a deontic power as a right or obligation.

So, imagine a busy intersection. We could put up a traffic light and by general consent give it a status function to regulate traffic flow. Or we could let an individual direct traffic. For the status function to work, we need to be willing to follow the social convention of obeying the signals, either from the stoplight or from the individual.

Next, suppose that the individual wears a uniform and a badge, and that we recognize this individual as permitted to impose fines on people who do not obey. These are stronger deontic powers, and they will deter drivers from trying to cheat the system. We can think of that move as a metaphor for government by consent (although the consent may not be explicit or universal).

As of this writing, I have yet to finish the book. By the time this post goes up, I may have finished a first read, but the book will require some re-reading. It seems to me that Searle is likely to turn out to be on my side of a disagreement with Michael Huemer.

2. Ryan Avent’s new book (not yet out) The Wealth of Humans. I attended a discussion of the book the other night. As the conversation jumped around, I found myself frequently thinking, “Show me the model.” That is out of character for me, because I have spent a lot of the last few years criticizing economists’ use of formal models. But as people tried to speculate about capital accumulation, wealth distribution, and productivity differentials, I found that I could not follow what was being said. I needed to think in terms of supply and demand curves crossing, income adding up to output, and output equal to labor input times output-per-worker. It was hard to get that in a purely verbal discussion, particularly when people were speaking extemporaneously.
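
For what it’s worth, here is a toy version of the accounting I had in mind, with invented numbers; it shows the identities and nothing more:

```python
# Toy accounting with invented numbers: output equals labor input times
# output-per-worker, and income (wages plus the residual to capital)
# adds up to output by construction.

labor = 100.0          # workers
productivity = 50.0    # output per worker
output = labor * productivity

wage = 40.0                             # output units paid per worker
labor_income = wage * labor
capital_income = output - labor_income  # residual claimed by capital

assert labor_income + capital_income == output
print(f"output = {output:.0f}, labor share = {labor_income / output:.0%}")
```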

James Tobin’s Presidential Address

Robert Waldmann brings it up. Tobin delivered it in 1971, and it was published in 1972.

Pointer from Mark Thoma.

As I remember it, Tobin suggested thinking of an economy with two industries and wages rigid downwards. Suppose that demand shifts away from industry X to industry Y. Because wages do not fall in industry X, you get unemployment there. Because wages do rise in industry Y, the overall rate of inflation goes up. Note, however, that with more inflation, the real wage in X falls, which means less unemployment there than otherwise.
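
Here is a back-of-the-envelope sketch of that story, with invented wages and a crude inflation index (this is my illustration, not Tobin’s own formulation):

```python
# Tobin's two-industry story with invented numbers: demand shifts away
# from industry X toward industry Y, and nominal wages can rise but not fall.

w_x, w_y = 10.0, 10.0                    # nominal wages before the shift
w_x_clearing, w_y_clearing = 8.0, 12.0   # market-clearing wages after it

w_x_new = max(w_x, w_x_clearing)   # stuck at 10.0 -> unemployment in X
w_y_new = max(w_y, w_y_clearing)   # rises to 12.0 -> wage inflation in Y

# Crude economy-wide wage inflation: equal weights on the two industries.
inflation = 0.5 * (w_x_new / w_x - 1) + 0.5 * (w_y_new / w_y - 1)

# Inflation erodes X's real wage, moving it back toward the clearing level.
real_w_x = w_x_new / (1 + inflation)

print(f"X: nominal wage stuck at {w_x_new:.1f} vs. clearing {w_x_clearing:.1f}")
print(f"economy-wide wage inflation: {inflation:.0%}")
print(f"X: real wage falls to {real_w_x:.2f}, trimming unemployment there")
```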

This simple story gives you an explanation for both stagflation and the Phillips Curve. The point that Waldmann is making is that macroeconomists did not need to take the detour that they took in the 1970s. They could have stayed on the path that Tobin laid out for them. My thoughts:

1. It is amazing how much better you can do if you break up the GDP factory into two industries. I think you can do even better with more disaggregation, but the modeling would be much hairier.

2. I agree that macro would have done better to follow this path. However, macro still would not be very good. The problem of too many plausible causal factors chasing too little data is insurmountable. See my science of hubris paper, as well as the recent Paul Romer screed.

3. The sociology-of-economists question of how macro remained (and continues to be) stuck for so long is quite interesting. See Daniel Drezner’s piece (for which I also thank Thoma). As you know, my explanation is that Stan Fischer became the Genghis Khan of macro.

Haidt, Cosmides, and Tooby on Socialism’s Attraction

Self-recommending. I went to the event with high expectations, and I was not disappointed. I will post on the substance once I have watched a re-run. Each of the speakers had problems. Jonathan Haidt was flustered by technical difficulties which delayed the start of his talk. Leda Cosmides had a sore throat from a cold. And John Tooby reminded me of Paul Samuelson, in that it appeared that his mind was working much faster than he could talk, giving the listener the feeling of missing out on insights that were in the speaker’s head but never made it out of his mouth.

In general, I wish the event had been longer.

Paul Romer, Macroeconomics, and Trouble

Romer writes,

In the last three decades, the methods and conclusions of macroeconomics have deteriorated to the point that much of the work in this area no longer qualifies as scientific research. The treatment of identification in macroeconomic models is no more credible than in the first generation large Keynesian models, and is worse because it is far more opaque. On simple questions of fact, such as whether the Fed can influence the real fed funds rate, the answers verge on the absurd. . .The larger concern is that macroeconomic pseudoscience is undermining the norms of science throughout economics. If so, all of the policy domains that economics touches could lose the accumulation of useful knowledge that is characteristic of true science, the greatest human invention.

Pointer from Mark Thoma. I am on board with the above passage, but soon Romer writes

To appreciate how far backwards our conclusions have gone, consider this observation, from a paper published in 2010, by a leading macroeconomist:

… although in the interest of disclosure, I must admit that I am myself less than totally convinced of the importance of money outside the case of large inflations.

Romer could be talking about me, except for the “leading macroeconomist” part.

Anyway, he goes on to argue that the disinflation that took place in the early 1980s is evidence that monetary policy matters. My comments.

1. I agree that for those (few) of us who doubt the importance of monetary policy, the “Volcker disinflation” represents the most difficult data point.

2. Still, Romer appears to me to distort things. He calculates a rise in the real interest rate of 5 percent. But I believe that a lot of that rise comes from inflation falling, not just from the Fed raising nominal rates. (See the arithmetic sketch after this list.)

3. Long-term interest rates rose dramatically as well. Arguably, the “Volcker disinflation” should be called the “bond-market vigilante disinflation.”

4. In general, although much of Romer’s critique focuses on the identification problem and the challenge of teasing out causality, it is impossible for him (or anyone) to demonstrate that changes in the money supply are exogenous rather than endogenous.
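
Here is the arithmetic behind point 2, with hypothetical rates chosen only for illustration (they are not the actual data from the episode):

```python
# Real rate = nominal rate - inflation. Hypothetical numbers chosen only
# to show how a 5-point rise in the real rate can come mostly from
# disinflation rather than from the Fed raising nominal rates.

nominal_before, inflation_before = 0.10, 0.09   # real rate = 1%
nominal_after,  inflation_after  = 0.12, 0.06   # real rate = 6%

real_before = nominal_before - inflation_before
real_after  = nominal_after  - inflation_after

rise = real_after - real_before                      # 5 points total
from_nominal   = nominal_after - nominal_before      # 2 points
from_inflation = inflation_before - inflation_after  # 3 points

assert abs(rise - (from_nominal + from_inflation)) < 1e-12
print(f"real rate rises {rise:.0%}: {from_nominal:.0%} from the nominal "
      f"rate, {from_inflation:.0%} from falling inflation")
```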

Overall, I agree with Romer that the methodological challenges in empirical macro are daunting–I would say overwhelming. For my take, see Macroeconometrics: The Science of Hubris.

I am just quibbling with the one instance that he argues demonstrates an empirical truth.

Book Discussions

If any readers are willing/able to organize a group interested in Specialization and Trade, I am willing/able to travel to talk with such a group. I think about 10-20 people would be a good size. I am particularly interested in speaking to autodidacts in their 20s and 30s.

There are several topics in the book which, in hindsight, could have been developed further. One of them that I have been thinking a lot about recently is the long shadow cast by World War II on economic thinking and policy. In the book, I do mention that all of the major nations fighting the war used central planning to a considerable extent. But other points are worth noting, including:

1. In Great Britain, major industries were nationalized from the post-war period all the way up to the late 1970s, when Margaret Thatcher took over as Prime Minister.

2. In the U.S., price controls were used during the war to fight inflation, and the belief in price controls died hard. If I recall correctly, many in the Truman Administration wanted to continue controls after the war, and they were disappointed when Congress abolished them. As late as the early 1970s, the Nixon Administration attempted to go the price-control route, with disastrous results.

3. Another challenge during the war and the post-war period was the potential for labor unions in key industries, such as steel and coal, to bring the economy to its knees. In the decades following the war, Presidents had to resolve major strikes by cajoling (or even forcing) industry and labor leaders to accept settlements. Finally in the 1980s, both Thatcher (coal miners) and President Reagan (air traffic controllers) won important confrontations with striking workers. Many on the left are still bitter about this. They long for the days when unions were more of a force.

4. Because the wartime economies were centrally planned, a lot of economic research involved developing tools for such planning. Prior to the war, the idea of representing an entire economy’s inputs and outputs with a system of mathematical equations was adapted from the Soviet Union by Wassily Leontief, who was awarded the Nobel Prize in 1973. (A sketch of the Leontief calculation appears after this list.) After the war, MIT economists, notably Robert Solow (who had studied with Leontief at Harvard), thought that Leontief’s model of production was both too detailed and too rigid. They worked on solutions to the problem of optimizing output that involved linear programming, resulting in an important textbook on programming techniques by Robert Dorfman (Harvard), Paul Samuelson, and Solow.

5. Also, the MIT economists developed and elaborated on the concept of an aggregate production function. This eliminates the detail by aggregating “capital” and “labor” inputs and treating the economy as a GDP factory. This generated an extensive, but now largely forgotten, literature, including the so-called Cambridge Capital Controversy.

6. The advantage of the aggregate production function is that there are mathematically tractable ways to represent substitution between capital and labor. The Constant Elasticity of Substitution production function, which includes Cobb-Douglas as a special case, was another topic that filled the journals of the early 1960s with now-forgotten articles. I recall that in the early 1970s one of my undergraduate professors, Bernie Saffran, pointed out to us that econometricians trying to estimate the CES production function were trying to tease second and third derivatives out of data where you could be lucky merely to find that the first derivative had the correct sign.