Government Debt, Off Balance Sheet

John Cochrane writes,

Guaranteeing more than half of financial sector liabilities is impressive. But most of us don’t know how large financial sector liabilities are. GDP is about $17 Trillion. $43 Trillion is a lot.

This is only financial system guarantees. It doesn’t include, for example, the federal debt. It doesn’t include student loans, small business loan guarantees, direct loan guarantees to businesses, the ex-im bank and so on and so forth. It doesn’t include non-financial but likely bailouts like auto companies, states and local governments, their pensions, and so on.

He cites analysis from the Richmond Fed.

Not to worry, of course. Debt is just something we owe to ourselves.

Paul Romer Issues a Clarification

He writes,

I wrote that the economists I criticize for using mathiness are engaged in a campaign of ACADEMIC politics, not one of national politics. Whatever was true in the past, the fight now is over ACADEMIC group identity.

Pointer from Mark Thoma. Read the whole thing. My remarks:

1. As I wrote in my earlier comment on Romer, I see monopolistic competition as prevalent. Perhaps the Chicago school would want to argue that even though in practice we do not see perfect competition, if you make predictions assuming perfect competition, you will typically be correct. But I do not want to speak for Chicago.

2. Romer seems to want to march under the banner of “science” in economics, and I am skeptical of that. Reader Adam Gurri pointed me to an entire book of essays that take such a skeptical position. I am not sure that the essays speak to me, but I am still pondering.

3. In my view, as the economics profession has grown stronger in math, it has grown weaker in epistemology. That is, the generations of economists that came after Samuelson and Solow lost the ability to ask “How do we know that?” They are content to re-use equations simply because they can be found in prominent publications, but (as Noah Smith has pointed out) not because they have been verified empirically, as they would be in physics or another hard science.

There is a slight overlap between Romer’s critique and mine. Romer is saying that economists are choosing models in order to maintain “group cohesion.” I say that they are choosing models based on appeals to authority.

What I wish to claim is that epistemology in economics is really difficult. It is more difficult than in physics. We have a much harder time testing our theories experimentally. We face insurmountable levels of causal density. We do not have a neat, clean answer to the question “How do you know that?” It appears to me that physicists can answer that question in ways that are much more straightforward and compelling. (I am thinking of physics at a high school level. Maybe at the research frontier physics also faces epistemological challenges.)

Because epistemology in economics is really difficult, I think that if you care about epistemology, you are going to find much published research in economics frustrating. That will be true for articles that avoid math as well as for articles that use math.

The Hopeless Argument over Productivity Stagnation

Scott Sumner writes,

Do I believe these numbers? Not really, as I don’t believe the government’s price level numbers. Lots of this “growth” occurred in the 1990s and is just Moore’s Law in computers, not the US actually producing more “stuff.” I don’t consider my current office PC to be 100 times better than my 1990 office PC.

Well, Scott, your current PC’s hard disk capacity is measured in gigabytes. Your 1990 PC’s hard drive was measured in megabytes. Let me know which of your applications and documents to wipe out to get what’s on your current office PC to fit on your 1990 PC.

Oh, and have fun surfing the net with that 2400 baud modem. That is, if you can figure out how to install a TCP/IP stack on Windows.

In fact, as James Pethokoukis points out, there are those who argue that the official statistics have more recently under-stated the improvement in computers. (The Commerce Department numbers no longer track Moore’s Law.)

Of course, nobody is arguing that we don’t have better personal computers. What we are trying to assess is the size of the multiple. But I do not know how to make that assessment, or what fraction (or multiple) of a 1990 PC’s value to assign to a smart phone.

To argue for or against stagnation, you have to assign a value to total output in a year and divide it by what you think is an appropriate measure of (labor) input. Then you have to take the second difference of that ratio. Cyclically-adjusted, of course.

Seems futile to me.
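The computation described above can be sketched in a few lines. This is a hypothetical illustration with made-up index numbers, not actual data: productivity is output divided by labor input, its first difference (in logs) is the growth rate, and a "stagnation" claim amounts to a persistently negative second difference.

```python
# Hypothetical sketch of the productivity-stagnation computation.
# The output and hours series below are invented for illustration.
import math

output = [100.0, 103.0, 106.1, 108.2, 110.0]   # real output index, by year
hours  = [50.0, 50.5, 51.0, 51.5, 52.0]        # labor input index, by year

productivity = [o / h for o, h in zip(output, hours)]
log_prod = [math.log(p) for p in productivity]

# First difference: productivity growth rate per year.
growth = [b - a for a, b in zip(log_prod, log_prod[1:])]

# Second difference: the change in the growth rate. Persistently
# negative values are what a "stagnation" claim amounts to.
accel = [b - a for a, b in zip(growth, growth[1:])]

print(growth)
print(accel)
```

The cyclical adjustment the post mentions would be yet another layer on top of this, which is part of why the exercise seems futile: every step depends on a contestable measurement choice.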

The EITC in Practice

Timothy Taylor writes,

The EITC adds a lot of complexity to the tax forms of the working poor, who are often not well-positioned to cope with that complexity, nor to hire someone else to cope with it. About 20% of EITC payments go to those who don’t actually qualify, which seems to happen because low-income people hand over their tax forms to paid tax preparers who try to get them signed up. Of course, there’s another group, not well-measured as far as I know, of working-poor households who would be eligible for the EITC but don’t know how to sign up for it.

One of the advantages of a universal benefit is that you give the money to everyone. My idea is that you would then tax some of it back at a marginal rate of 20 or 25 percent. That is, for every dollar that someone earns in the market, they lose 20 cents or 25 cents in universal benefits. Compared to a marginal tax rate of zero, 25 percent is more complex and has a disincentive. But it is much less complex and de-motivating than our current system of sharp cut-off points for benefits like food stamps and housing assistance. And having a non-zero tax rate allows you to have a higher basic benefit at lower overall budget cost.
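The arithmetic above can be made concrete with a small sketch. The benefit level and cutoff here are hypothetical numbers chosen for illustration, not figures from the post: a universal benefit clawed back at 25 cents per dollar earned, compared with a cliff-style program that cuts off entirely at an eligibility threshold.

```python
# Illustrative sketch of a universal benefit with a 25% clawback,
# versus a cliff-style benefit. All dollar figures are made up.

BASIC_BENEFIT = 10_000.0   # hypothetical annual universal benefit
CLAWBACK_RATE = 0.25       # implicit marginal tax rate on earnings

def universal_net_income(earnings):
    """Earnings plus benefit; benefit phases out at 25 cents per dollar earned."""
    benefit = max(0.0, BASIC_BENEFIT - CLAWBACK_RATE * earnings)
    return earnings + benefit

def cliff_net_income(earnings, cutoff=20_000.0):
    """Cliff-style program: full benefit below the cutoff, nothing above it."""
    benefit = BASIC_BENEFIT if earnings < cutoff else 0.0
    return earnings + benefit

# Under the clawback, an extra dollar of earnings always raises net
# income (by 75 cents during the phase-out). Under the cliff, crossing
# the cutoff can leave the worker worse off.
print(universal_net_income(19_999), universal_net_income(20_001))
print(cliff_net_income(19_999), cliff_net_income(20_001))
```

The sharp cut-off is the point: under the cliff, a worker earning one dollar over the threshold loses the entire benefit, which is the disincentive the smooth clawback avoids.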

Social Science, Dogma, and Steven Pinker

Why am I re-reading The Blank Slate? First, because on first reading I marked it as one of the all-time great non-fiction books. Second, because I am thinking about possible parallels between the psychology of B.F. Skinner and the economics of Paul Samuelson. Some remarks:

1. Both Skinner and Samuelson dominated their fields around 1960. For those of you who do not know, Skinner’s view was that all behavior is learned, through the process of reward and punishment. We do what is rewarded and avoid what is punished.

2. Both took a very mechanistic view of, respectively, human behavior and the economy. We should not be surprised to see intellectuals in the aftermath of World War II seeing the world in terms of simple machines. What won that war? T-34 tanks. B-17 bombers. LCA’s that carried soldiers to the beaches held by Germans or Japanese. Relatively simple machines, built in enormous numbers, by countries whose economies were under considerable central control. Neither Skinner nor Samuelson would have thought in terms of personal computers or the Internet as metaphors.

3. Psychology eventually escaped the clutches of Skinner’s restricted research paradigm. Economics succumbed to Samuelson’s.

4. Pinker’s goal in the book is to dispel the dogma that human beings are shaped entirely by environmental factors, especially arbitrary social circumstances, and that social scientists have the power to re-shape society to achieve any desired outcome. On p. 19, he writes,

In behaviorism [Skinner’s psychology], an infant’s talents and abilities didn’t matter because there was no such thing as a talent or an ability. . .To a behaviorist, the only legitimate topic for psychology is overt behavior and how it is controlled by the present and past environment.

According to what I see as the central dogma of (progressive) social science, individual characteristics and choices bear little or no responsibility for differences in life outcomes. Instead, people who are successful owe their achievements to being born into power and privilege. People who are unsuccessful owe their deprivation to being born into poverty and discrimination. These outcomes are entirely changeable, through the application of social science.

Of course, this dogma is still prevalent. President Obama and many academics appear to be wedded to it.

5. I am starting to think about doing a work with the tentative title: Specialization and Trade: A Re-Introduction to Economics. The idea will be to shift focus from the issues of scarcity and choice that are now considered absolutely central in economics textbooks. Although these are certainly important, I want to argue that the central social phenomenon is specialization. In some respects, my goal is like Pinker’s. I want to emphasize the weaknesses in the simple, mechanistic views of Samuelsonian economics, particularly Keynesianism, and instead offer a way of thinking of the economy that owes more to computational metaphors. Consider these sentences from Pinker (p. 31, p. 39, p. 40):

The mental world can be grounded in the physical world by the concepts of information, computation, and feedback.

The mind is a complex system composed of many interacting parts.

It is now simply misguided to ask whether humans are flexible or programmed, whether behavior is universal or varies across cultures, whether acts are learned or innate, whether we are essentially good or essentially evil. Humans behave flexibly because they are programmed: their minds are packed with combinatorial software that can generate an unlimited set of thoughts and behavior.

After listing a set of physical brain differences that are associated with different mental capacities and behavioral traits, Pinker writes (p. 44-45),

These gross features of the brain are almost certainly not sculpted by information coming in from the senses, which implies that differences in intelligence, scientific genius, sexual orientation, and impulsive violence are not entirely learned. . .There is much we don’t understand about how the brain is laid out in development, but we know that it is not indefinitely malleable by experience.

Similarly, I want to point out that economic outcomes are not indefinitely malleable by fiscal policy, monetary policy, and policies intended to correct market failures.

Steven Pinker on Money as a Consensual Hallucination

He writes,

Life in complex societies is built on social realities, the most obvious examples being money and the rule of law. But a social fact depends entirely on the willingness of people to treat it as a fact. It is specific to a community, as we see when people refuse to honor a foreign currency or fail to recognize the sovereignty of a self-proclaimed leader. And it can dissolve with changes in the collective psychology, as when a currency becomes worthless through hyperinflation or a regime collapses because people defy the police and army en masse.

That is from p. 65 of The Blank Slate, which I am re-reading.

I would quibble that you do not get hyperinflation from a sudden loss of confidence in the currency. You get it when the government spends more than it taxes and loses the ability to borrow, so its only choice is to print money–and then people lose confidence in the currency.

The important social reality is that people are willing to lend to the government at affordable interest rates. That is what has the potential to suddenly change (see Greece) and that is why large deficits create potential instability.

Michael Tanner on a Guaranteed National Income

He writes,

As strong as the argument in favor of a guaranteed income may be, there are simply too many unanswered questions to rush forward with any such plan. Opponents of the welfare state have long criticized its supporters for believing that even good intentions justified failed programs. In considering some form of a universal basic income, we should avoid falling into the same trap. Instead we should pursue incremental steps: consolidate existing welfare programs, move from in-kind to cash benefits, increase transparency, and gather additional data. This would allow us to reap some of the gains from a universal income without the costs or risks.

It is a comprehensive, well-balanced paper. Nonetheless, I disagree with his conclusion. I think that the incentive problems with the current system are so bad that I would like to see the next Administration take its best shot at something better. As you know, my preference is for a negative-income-tax type system, but with the added administrative issue of having the grants be in the form of flexible-benefit dollars that can only be used for food, housing, medical care, and education. In terms of the trade-offs Tanner discusses, I am willing to incur higher administrative costs in order to keep the overall cost of the grants lower while trying to keep the implicit marginal tax rate below 25 percent.

Because the current system discourages marriage and work, I think that the larger mistake would be to leave it in place rather than try something that is likely to be better, knowing that it will be imperfect.

Mathiness, Starting in 1937

Noah Smith writes,

Macroeconomic theory is chock full of mathiness. It’s not just Lucas and Prescott, it’s the whole scientific culture of the field.

I think you find this going all the way back to John Hicks’ famous 1937 paper, “Mr. Keynes and the Classics.”

The “i” in this model could be a short-term interest rate, or it could be a long-term interest rate. It could be a risk-free rate, or it could be a risky rate. It could be a nominal rate, or it could be a real rate.

And, as Smith points out once again, none of the equations in the IS-LM model, or any other mathematical macro model, has any demonstrated empirical validity. The equations are, at best, a way of organizing and expressing the economist’s opinions about macro.

My own opinion, as you know, is that thinking about the economy as if it were a single business (or as a single consumer who also runs a single business) is wrong-footed from the very start. Instead, I believe that it is in the shifting kaleidoscope of patterns of specialization and trade among multitudes of businesses that employment fluctuations take place.

It is fascinating to me that there are critics who will not buy the PSST story until they see it expressed using math. To me, that is as beside the point as arguing that it has no validity unless it can be told in Latin or Swahili or Yiddish.

Housing and the Punch Bowl

On Wednesday, I appeared on a panel discussing the state of credit underwriting in the housing market. I raised two questions:

1. Are national credit standards, set by Freddie, Fannie, and FHA, appropriate, or do they throw out too much local information?

I made a Four Forces argument that there are too many divergences in economic performance that make local information valuable. On the other hand, you could argue that simply by tracking search data, Google and Zillow have better information on local trends than would an on-site mortgage underwriter. Interestingly, the session chairman, Bob Van Order, presented information showing that after the crash loans under-performed relative to their known characteristics (including ex post home price performance) and over-performed more recently. This suggests that it is possible for underwriting to be looser or tighter than it appears based on observable characteristics, which in a way suggests that there is local information that is important.

2. Are we in 2004? That is, is the stage set for another housing bubble, and all that is needed is a loosening of credit standards?

One of the speakers, Sam Khater of CoreLogic, re-iterated what he wrote here, that “price-to-income and price-to-rent ratios are high.”

Very few mortgages originated since 2009 have defaulted. There are two reasons for this. One is that credit standards were tightened. The other is that the trend of house prices has been up. Now, there is all sorts of talk about the need to loosen standards. I pointed out that both the private sector and public officials tend to be very procyclical when it comes to mortgage credit–when the market is going up, they want to loosen standards, and after it crashes they want to tighten standards.

I would be OK with loosening standards on credit scores now, provided that the industry holds the line on down payments, meaning that we do not see an increase in the proportion of loans with down payments below 10 percent. This is not the time for the FHA to make a big expansion in its high-LTV lending (Ed Golding, can you hear me?)

To encourage high-LTV lending now would be adding alcohol to the punch bowl just as the party is getting good.
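The 10 percent line drawn above is easy to state as a rule. Here is a minimal sketch, with an invented loan pool, of flagging loans whose down payment falls below 10 percent of the purchase price (i.e., loan-to-value above 90 percent) and computing the share of such loans:

```python
# Hedged sketch of the down-payment rule: flag loans with a down
# payment under 10% of the purchase price. The loan pool is made up.

def is_high_ltv(price, down_payment, threshold=0.10):
    """True if the down payment is under 10% of price (LTV above 90%)."""
    return down_payment < threshold * price

# Hypothetical loan pool: (purchase price, down payment)
loans = [
    (300_000, 60_000),   # 20% down
    (250_000, 12_500),   # 5% down  -> flagged
    (400_000, 40_000),   # exactly 10% down, not flagged
    (200_000, 7_000),    # 3.5% down -> flagged
]

high_ltv_share = sum(is_high_ltv(p, d) for p, d in loans) / len(loans)
print(high_ltv_share)
```

Tracking a statistic like this share over time is one way to tell whether "holding the line on down payments" is actually happening as credit-score standards loosen.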

Paging James Bennett and Michael Lotus

Ola Ollson and Christopher Paik write,

We outline an agricultural origins-model of cultural divergence where we claim that the advent of farming in a core region was characterized by collectivist values and eventually triggered the out-migration of individualistic farmers towards more and more peripheral areas. This migration pattern caused the initial cultural divergence, which remained persistent over generations. The key mechanism is demonstrated in an extended Malthusian growth model that explicitly models cultural dynamics and a migration choice for individualistic farmers. Using detailed data on the date of adoption of Neolithic agriculture among Western regions and countries, the empirical findings show that the regions which adopted agriculture early also value obedience more and feel less in control of their lives. They have also had very little experience of democracy during the last century. The findings add to the literature by suggesting the possibility of extremely long lasting norms and beliefs influencing today’s socioeconomic outcomes.

Pointer from Tyler Cowen.