on a per capita basis, Medicare costs are rising faster for those at later ages. In 2000, for example, Medicare typically spent about 2.4 times as much for a 90-year-old as for a 65-year-old. By 2011, it was spending about 2.8 times as much.
There is much more at the link. He cites a Kaiser Family Foundation analysis.
Something to bear in mind is that the mix of ages within Medicare can change. If you get an increase in “young old” in a given year, then average spending could decline, even though on a lifetime basis Medicare spending is not falling, and may even be rising.
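The cohort-shift effect described here is a form of Simpson's paradox. A toy sketch (all enrollment and cost figures below are invented purely for illustration) shows how average per-capita spending can fall even while every age group's costs rise:

```python
# Hypothetical illustration (invented numbers): each cohort's per-enrollee
# cost rises from year 1 to year 2, yet the overall average falls because
# the mix shifts toward cheaper "young old" enrollees.

def average_spending(cohorts):
    """Weighted average cost per enrollee across age cohorts.
    cohorts: list of (enrollees, cost_per_enrollee) tuples."""
    total_cost = sum(n * c for n, c in cohorts)
    total_enrollees = sum(n for n, _ in cohorts)
    return total_cost / total_enrollees

# Year 1: (enrollees in millions, annual cost per enrollee in $)
year1 = [(10, 8_000),   # ages 65-74 ("young old")
         (6, 12_000),   # ages 75-84
         (2, 20_000)]   # ages 85+

# Year 2: every cohort's per-enrollee cost is 5% HIGHER, but a wave of
# new 65-74 retirees tilts the mix toward the cheapest group.
year2 = [(16, 8_400),
         (6, 12_600),
         (2, 21_000)]

print(average_spending(year1))  # ≈ 10,666.67
print(average_spending(year2))  # = 10,500.00 -- lower, despite rising costs
```

Nothing about the falling average tells you that any individual's expected lifetime costs went down; it only tells you the denominator got younger.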
Yes, the bubble of baby-boom retirees is surely driving the average Medicare age down (and per-capita costs down with it). But in 10-15 years things could get ‘unexpectedly’ ugly pretty fast as the oldest boomers reach their mid-80s.
What about the OOGs?
You mean the Old Old Geezers? I don’t think they exist.
There needs to be a statistical-analysis-terminology equivalent for “adjusting for inflation,” or “real vs. nominal,” or “in 2005 dollars” — something that guards against Simpson’s paradox and fallacies of composition by adjusting for shifts in granular cohorts, and so forth. Related to your recommendation for ‘scenario analysis’.
For instance, this year government workers got their raise, but it fell below the rate of CPI inflation. You can respond to “Hurray, another raise!” with “Only in nominal terms; in real terms, we are getting paid less.”
Likewise, when someone reports “Hurray! Average Medicare costs per capita are decreasing! We’re bending the cost curve!”, someone else can easily respond with “That’s only in composed terms; in decomposed terms, expected lifetime expenditures are going up, and that’s why actuarial predictions of future costs are continuing to climb.”