The parrot is not dead, insists Ricardo Reis.
these dissertation theses are fairly representative of what modern research in macroeconomics looks like. . .
used micro data to show that it is mostly young people who adjust their consumption when monetary policy changes interest rates. Younger people are more likely to obtain a new mortgage once interest rates change, either to buy a new home or to refinance an old one, and to spend the newly available funds. Her research combines painstaking empirical work that focuses on the role of mortgages and their refinancing features with a model with much heterogeneity across households…
There is more at the link. Pointer from Tyler Cowen.
Work of the sort described above sounds promising. It differs from traditional macroeconomics in a refreshing way.
Traditional macroeconomists take some very dodgy averages and call them “aggregates.” If that practice comes to be replaced by work that takes seriously the variation that underlies the averages, then we will have reason to celebrate. Unfortunately, many of the other papers Reis describes sound to me more like traditional macroeconomics.
What is wrong with the aggregation exercise? Just off the top of my head:
1. Wages and unemployment rates vary by demographic group more than the aggregate wage and unemployment rate vary over the medium run (which is the typical period for macroeconomic analysis).
2. Inflation rates vary more across industries (health care vs. computer chips) than the average inflation rate varies over the medium run.
3. Saving rates vary more by household characteristics (including cultural background) than they vary over the medium run.
4. Much of today’s work does not produce final output. Instead, much of the labor force has become Garett Jones workers, producing organizational capability.
5. There has been a steady increase in hard-to-measure components of the economy: the value of medical services; the value of employee benefits; the value of consumers’ surplus derived from information and communication technology; etc.
My preferred alternative to traditional macro is PSST. Traditional macroeconomists are more likely to think in terms of a single labor market. In the PSST view, unemployment is a phenomenon that results when patterns of specialization need to be reconfigured. Thinking in terms of a single labor market model is wrong-footed from the get-go.
In terms of taking Macro 2 in 1990, it appears Japan has disproven everything I learned.
1) I do believe the hardest point of Macro Economic theories is that the whole economy interacts with social and political realities. One idea Keynes had really wrong was that workers would end up working 15 hours a week, which has not happened. While the average person’s work hours over a lifetime have decreased, we now have the average person in school a lot longer and more people enjoying retirement.
2) I think the main value of Macro is you have to measure the whole economy. I don’t think The Great Recession or Japan supports Keynesian economics or Friedman Monetarism. However, PSST economics is leading to huge breakdowns in areas of the Rust Belt, which probably led to Trump’s win.
3) For conservatives, I find the hardest contradiction is that PSST/Creative Destruction economics spills over into social and political realities. Look at coal mining jobs since 1950, which have seen a lot of creative destruction but disrupted a lot of local communities. I don’t see how a conservative can say they support both PSST/Creative Destruction and local church & government institutions. Endless PSST weakens local churches.
“Endless PSST weakens local churches.”
I’m not sure that’s necessarily the case. I think declining population and losses in human capital in mining towns have more to do with weakening churches in those communities than anything else.
Just like the economy adapts to changes, churches do too. Perhaps not in the same way, because profit isn’t the primary goal (or shouldn’t be). In theory, churches should be attempting to adapt to optimize religious objectives, whatever those may be. Success isn’t necessarily measured in membership numbers or tithes.
You could argue that even though attendance and membership are down, those who still do attend regularly are more devout. Maybe churches are actually more effective than they used to be, but there’s not a good way to tell. Companies optimize for profit, but what do churches optimize for? In the economy you can measure dollars, but I’m not sure what the measure is for organizations like churches.
A few PSST adaptations I’ve noticed about churches in my Midwestern city of about 500,000:
1) In the 1950s, mega-churches weren’t really a thing. Where you used to have 10 different churches from the same denomination in one community, now you might only have two churches with more resources and larger memberships. Economies of scale, so to speak.
2) A lot of denominational churches do appear to be dying, but in our area they are being replaced with independent “community” churches which don’t appear to be associated with any specific denomination. I can’t speak for total membership in the aggregate.
3) I’ve seen churches meeting in nontraditional buildings lately. The traditional steeple and bell tower are becoming a thing of the past, at least in my area. Churches are renting school buildings on weekends and leasing commercial spaces instead of owning traditional church buildings. Relatedly, a lot of old church buildings that used to be owned by denominational churches are being repurposed into unique single-family or multi-family homes or business spaces.
4) I’ve seen a lot of entrepreneurial ideas being experimented with in our area. We have a church that operates a coffee shop as a community outreach project. Mission work has changed quite a bit, too. I’ve seen churches supporting missionaries that operate in places like San Francisco and Minneapolis instead of Africa or South America. I’ve also heard of online outreach projects by a few churches here. I’ve also seen churches that run addiction recovery programs.
But while individual changes are much larger than aggregate changes, this means aggregate changes are even more important, since it takes vast numbers of individual changes to alter aggregates. More knowledge of individual changes should result in better aggregate predictions, though.
One might be interested in modeling and understanding the effects of compositional constraints. For example, in a game of musical chairs, no matter what the behavior of individual players, when the music stops one of the players will be left standing. In principle, one might be able to understand this phenomenon by carefully considering all of the dependencies among different players’ movements. In practice, the phenomenon is more easily understood by considering the aggregate number of chairs and aggregate number of players rather than the variations across players and configurations and positions of chairs.
Under the PSST view, when chairs’ positions are altered, players must change their walking strategies to find new chairs and some players will more easily adjust than others. That may be true but doesn’t really address the compositional constraint aspects. I see macroeconomists’ focus on aggregates as their attempt to understand compositional constraints, which may be why market monetarists often talk about “musical chairs models” and “hot potato” effects.
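The musical-chairs point can be put in a few lines of code. This is just a toy simulation (not from the post): however the individual races play out, the number left standing is pinned down by the aggregate counts alone.

```python
import random

def play_round(num_players, num_chairs):
    """One round: players claim chairs in a random order (individual
    behavior varies round to round), but the aggregate outcome does not."""
    order = list(range(num_players))
    random.shuffle(order)          # who moves fastest varies by round
    free_chairs = num_chairs
    standing = 0
    for _ in order:
        if free_chairs > 0:
            free_chairs -= 1       # this player finds a seat
        else:
            standing += 1          # no chair left for this player
    return standing

# However individuals behave, the compositional constraint fixes
# the count left standing at players minus chairs:
results = {play_round(10, 7) for _ in range(1000)}
print(results)  # always {3}
```

Studying the shuffled order tells you nothing about the aggregate outcome; counting chairs and players tells you everything.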
By the way, compositional constraints also arise in finance theory, where one must incorporate constraints such as every share of stock must be held by someone, the aggregation of all individuals’ portfolios must sum up to the market portfolio, etc. When there is variation across subgroups such as varying risk preferences in finance or points 1-3 in Kling’s post, it’s all the more important to understand constraints on aggregates to avoid fallacies of composition.
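The adding-up constraint on portfolios works the same way. In this hypothetical two-group, two-stock example (the numbers are made up), once one group’s holdings are chosen, the constraint that every share is held by someone pins down the other group’s:

```python
# Every share must be held by someone, so group holdings
# sum to shares outstanding.
shares_outstanding = {"A": 100.0, "B": 200.0}

# If group 1 chooses to overweight stock A...
group1 = {"A": 70.0, "B": 80.0}

# ...the compositional constraint forces group 2 to hold the remainder:
group2 = {k: shares_outstanding[k] - group1[k] for k in shares_outstanding}
print(group2)  # {'A': 30.0, 'B': 120.0}, so group 2 must underweight A
```

A story about one subgroup’s preferences that ignores this constraint commits exactly the fallacy of composition mentioned above.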
http://fxdiebold.blogspot.com/2017/04/on-inefficiency-of-pseudo-out-of-sample.html
Mark Thoma listed this reference. It describes an incremental algorithm that finds the minimum number of aggregate samples that typically represent the sustainable portion of the model. Stability is the proportion of out-of-band samples that seem not to be part of a sustainable set of samples. So instead of blindly measuring the whole, it finds how the out-of-band samples add risk to your model. In other words, find the set that fits your model; if too many points are left out, your model sucks.
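One way to read that idea in code. This is a hypothetical sketch of the general "count the out-of-band points" notion, not the algorithm from the linked Diebold post: fit a simple model on part of the sample, then check what proportion of the held-out points fall outside a tolerance band.

```python
import statistics

def fit_and_check(data, train_frac=0.7, band=2.0):
    """Fit a trivial model (here, just the sample mean) on part of the
    data, then return the proportion of held-out points that fall more
    than `band` standard deviations from it. A large proportion of
    out-of-band points signals a poor model."""
    split = int(len(data) * train_frac)
    train, held_out = data[:split], data[split:]
    mu = statistics.mean(train)
    sigma = statistics.stdev(train)
    outside = [x for x in held_out if abs(x - mu) > band * sigma]
    return len(outside) / len(held_out)

stable = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9, 10.0, 10.1]
print(fit_and_check(stable))  # 0.0: held-out points fit the model
```

If that proportion comes back high, the "sustainable" set the model captures is too small, which is the "your model sucks" verdict in aggregate form.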