Robin Hanson interviewed by Brian Beckcom. Worth listening to the whole thing or reading the whole transcript. Some excerpts:
one of the most important things that happens to us is that we might get accused of violating a norm. And in that case, we want to be ready to defend ourselves to say that we aren’t violating a norm. And that’s overwhelmingly important. So important that your conscious mind is not really the president or king of your mind. It’s the press secretary. Its job is mainly to keep track of what you’re doing and always have a story about why what you’re doing was okay and not violating norms.
. . .we are quite ready to open to the idea that other people are making mistakes and other people are biased and other people have hidden motives. And that’s going to be the case in politics. Politics is obviously a scenario where we have conflicts. And so, the other side is going to be a plausible candidate to us of people who are falling for biases and mistakes and who have motives that they aren’t aware of because we’re happy to attribute the other side to their terrible mistakes and motives. Whereas for our side, we don’t think that needs to be invoked because we’re doing the reasonable thing and they’re doing the unreasonable thing.
He talks about the difference between elites, who have influence, and experts who have, well, expertise.
There’s a whole bunch of complicated things going into choosing elites, but basically they are two different games played by different rules that overlap. And so, one of the more interesting things is elites often try to hide the elite game they play and pretend to be experts or pretend to be something else. And, but often, you know, and in some sense, the Nobel prize winner shows you that, in fact, the elite game is the game most people would like to join if they could. Even Nobel prize winners say, “Too bad I’m only an expert. I’m not an elite. Because I want to go try to be one of these elites.”
Consider the pandemic.
Pandemic experts have had their standard story about what to do in a pandemic that goes back decades. And, you know, you can look on all their standard writings about, you know, what to do about travel bans or what to do about masks or what to do about quarantines and all those sorts of things. And they’ve had their standard story about what to do in a pandemic. And there was no particularly new information that showed up except as soon as we had a pandemic, all the experts, all the elites in the world suddenly decided, “That’s a subject to talk about.” The elites went wild talking to each other about pandemics and the elites decided that they did want masks and they did want quarantines and lockdowns, and they did want travel bans. And so, the elites declared that was the better thing. And they, the experts, caved immediately. As soon as the elites declared that that was better, the experts changed their mind about what the expert judgment was just like in 1984.
On how to control your biases:
Change your incentives. So, for example, one way to change your incentives about almost any topic is to make a bet about it. As soon as someone says to you, after something you said, “Do you want to bet,” your mental process immediately switches. You suddenly – well, from the moment you said it, it sounded clear and clean and believable and obvious even, and as soon as someone says “wanna bet,” you immediately start to wonder how you could be wrong.
. . .Have fewer opinions. And in each topic, ask yourself, “Do I need an opinion on this? Am I, you know, especially good at this?” And if you don’t need an opinion or you don’t have any special expertise compared to other people you could rely on, then don’t have an opinion on it.
And yes, of course, he is on the list for Fantasy Intellectual Teams (FITs).
Arnold, you are right that Robin Hanson is a great thinker. Let me add that he is a great one for the same reason that Tyler Cowen is a charlatan (read again the last paragraph that you quote from the interview).
It’s irrelevant that RH is a great thinker, however. You want to take him out of what he does to do something else that most likely he cannot do and/or doesn’t want to do. If he wants to become a politician (that’s what you are looking for), please tell him to test himself by running for a position in his local government. I bet you that he would end up like G. Warren Nutter in the 1960s (if you don’t know about him, there must be someone in GMU’s Econ who knows Nutter’s story).
I don’t think that Arnold wants them to become politicians. He just wants people to pay more attention to someone like Robin Hanson than to someone like Ibram Kendi.
It’s hard to name anyone but blogosphere-celebrities since “this guy I know” doesn’t communicate much, but of those, Hanson would be in my top five for sure, and Greg Cochran and Steve Hsu would be near the top of my list too.
Probably by means of reinforced operant conditioning, I can’t help but want to play a version of fantasy sports in which each player is assigned a particular position, and create a kind of “shadow cabinet” in contrast to certain government positions. (Ideally those positions in a rationally-reorganized structure as opposed to the present mess, but you go to fantasize with the structure you have.)
So, I might put Tanner Greer as Secretary of State and Scott Sumner as Chairman at the Fed, but not vice versa. Bryan Caplan as the Secretary of Education, but not of Homeland Security (Mark Krikorian, with Stewart Baker at CISA). Eric Lofgren as Under Secretary of Defense for Acquisition, with Jeff Bezos multi-hatted as Secretary of Defense, Postmaster General, and Administrators of GSA and FEMA, at least. Hell, probably DNI too. You get the picture.
Still, I look forward to the blog post explaining the connection between “intellectual status inversion” and the loss of institutional reputation and competence, and until then I will hold off on any anticipatory comments about the theory.
That being said, from many personal observations, I can say that there are plenty of sufficiently, tolerably good thinkers near the top of many of those institutions; many ordinary institutions functioned quite well and were justifiably well-regarded in the past without especially good thinkers at the helm (or in influence via their status); and the precipitous decline in trust and competence seems to have occurred without enough change in the structure of the status of good intellectuals to account for the rapid degradation we observe and, alas, experience first hand.
Here’s a question. Was what happened with McNamara and Vietnam a result of giving too little status to “whiz kid” intellectuals, or too much? If one starts going down the road of “well, there are of course good-thinker intellectuals and bad-thinker intellectuals, and it’s the good ones that get too little status, while the bad ones get too much,” then I think that really shifts the question to whatever axis or criteria one is using to differentiate the intellectuals, that is, the status *between* thinkers according to quality and accuracy, not *of* thinkers in general.
I like Robin, and went to see him when he came to talk in Bratislava maybe 2 years ago. (Great lecture about internal bias)
But he’s wrong about what to do on topics where one is not an expert:
“then don’t have an opinion on it.”
Instead, have a weak opinion about it, and try to know what the important facts are that you don’t know.
He’s absolutely right about making a bet on something to focus on what you think you know, and a Strong Opinion might be a $100 bet or $50, while a weak opinion might be $10, or $5, or only $1.
I’m thinking GameStop will be down to about $22 in 6 months, so if I buy a share now for about $325 it’s to donate about $300 to the goal of making short sellers take it in the shorts for shorting 138% of the shares. The short sellers were willingly taking bets that GME would drop – and they’re about to lose their shirts, because as their shorts come due they HAVE to have the shares they’ve sold, or default, breach contract, or go bankrupt.
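The bet-size heuristic above ($100 for a strong opinion, $10 for a weak one) can be made systematic with the Kelly criterion, which sizes a stake from your subjective probability and the offered odds. A minimal sketch, with illustrative probabilities not taken from the post:

```python
def kelly_fraction(p, b):
    """Kelly criterion: fraction of bankroll to stake on a bet you
    believe wins with probability p, at net odds b (payout per $1
    staked). A negative result means the bet isn't worth taking."""
    return p - (1 - p) / b

bankroll = 100  # dollars you are willing to put at risk

# Even-odds bets (b = 1): opinion strength maps to stake size.
strong = kelly_fraction(0.80, 1.0)  # a strongly held opinion
weak = kelly_fraction(0.55, 1.0)    # a weakly held opinion
print(round(bankroll * strong), round(bankroll * weak))  # → 60 10
```

An 80% belief at even odds stakes $60 of a $100 bankroll; a 55% belief stakes $10, roughly the strong/weak spread the comment suggests.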
When studying decision analysis, with the idea of helping get better Bayesian “prior probabilities” about outcomes, it was helpful to have this approach:
Assume bad outcome X happened.
What do you think caused it?
Those answering would then have higher probabilities of the bad event.
Decision making under uncertainty is the most important under-discussed issue. It is usually an unstated but significant factor in most discussions of any problem, including COVID.
. . .Have fewer opinions. And in each topic, ask yourself, “Do I need an opinion on this? Am I, you know, especially good at this?” And if you don’t need an opinion or you don’t have any special expertise compared to other people you could rely on, then don’t have an opinion on it.
I think that’s wrong. We never take in information as some isolated fact. Always–always, always, always–it has to fit into some framework, some “narrative”, some way of understanding the world. That means we have to have an opinion about everything (or just ignore anything we don’t have an opinion about). However, the opinion we do have can be loosely held and easy to change (a weak Bayesian prior). That is probably what most opinions should be.
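A “loosely held and easy to change” opinion can be modeled as a weak Bayesian prior. With a conjugate Beta prior over a probability, the same handful of observations moves a weak prior much further than a strong one. A minimal sketch, with illustrative counts:

```python
def beta_mean(a, b):
    """Mean of a Beta(a, b) belief about an unknown probability."""
    return a / (a + b)

def beta_update(a, b, successes, failures):
    """Conjugate Beta-Bernoulli update: data simply adds to the counts."""
    return a + successes, b + failures

data = (8, 2)  # 8 "successes" in 10 new observations

weak = beta_update(1, 1, *data)      # weak prior: Beta(1, 1), near-uniform
strong = beta_update(50, 50, *data)  # strong prior: confident it's ~0.5

print(round(beta_mean(*weak), 2), round(beta_mean(*strong), 2))  # → 0.75 0.53
```

Both start with a mean of 0.5, but ten observations pull the weakly held belief to 0.75 while the strongly held one barely budges, which is the “weak prior” stance the comment recommends.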
Or you can have several contradictory opinions at once and see how new information fits or doesn’t fit into them. I personally have 3 opinions on why black Americans do so much worse than white Americans: 1) white racism, and then a system of white privilege, 2) black conduct–more fatherless children, more present orientation, etc., 3) genetics–whites have higher IQ, conscientiousness, etc, with Ashkenazim having the highest and being most successful.