The scout welcomes contrary information as helping to correct one’s map of reality. The soldier uses reasoning to defend that map and fights contrary information as if to stave off defeat.
It looks as though this is going to be a good year for new books. Galef’s is one of the best so far.
Does Galef’s book address how the scout mindset should deal with misinformation and disinformation? That mindset is much more vulnerable to being misled by those things.
“That mindset is much more vulnerable to being misled by those things.”
Your assertion has no support. I recommend trying to read the book.
It stands to reason that the scout mindset would be more vulnerable to being misled by misinformation and disinformation. Making the effort to “drop your intellectual defenses” (i.e., scepticism) results in credulity, and credulity leads to exactly that result. If on the other hand Arnold is only talking about an unthinking or irrational rejection of other views, and not including a sceptical approach, then the scout/soldier dichotomy is banal – just basic rationality.
Obviously a ‘soldier’ mindset renders one much more susceptible to misinformation and disinformation from one’s own tribe. I don’t think dropping your intellectual defenses means abandoning one’s scepticism. Doesn’t it mean almost the opposite: accepting greater scepticism of one’s original position?
It may be basic rationality, but then basic rationality is rare. People do tend to react with hostility toward contrary positions and evidence and to treat the intellectual process as a deliberate pursuit of evidence supportive of their current position, rather than a dispassionate pursuit of whatever position the evidence favors, and the former definitely exposes one to misinformation rather than insulating one from it, imo.
By definition, the soldier mindset resists changing its position based on new information, so it tends to reject both good and bad information. The scout mindset more readily accepts new information as updates to its priors. If that information is wrong, the updates will reduce the accuracy of the scout’s understanding.
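To put the updating point in concrete terms, here is a minimal sketch in Python (my own illustration, not from Galef’s book; the prior, the likelihoods, and the stream of bad reports are all assumed numbers) of how an honest Bayesian updater, fed systematically biased reports, becomes confidently wrong:

    # Toy illustration, not from the book: a "scout" who updates on every report,
    # including fabricated ones, ends up more confident in a falsehood.
    def update(prior, p_report_if_true, p_report_if_false):
        """One Bayesian update of P(claim is true) after seeing a single report."""
        numerator = prior * p_report_if_true
        return numerator / (numerator + (1 - prior) * p_report_if_false)

    belief = 0.5  # the scout starts undecided about a claim that is in fact true
    # Ten pieces of disinformation, each twice as likely to circulate if the claim
    # were false as if it were true (assumed likelihoods of 0.3 vs. 0.6).
    for _ in range(10):
        belief = update(belief, p_report_if_true=0.3, p_report_if_false=0.6)
    print(f"Belief in the true claim after the bad reports: {belief:.4f}")  # ~0.001

The updating rule itself is sound; fed a biased diet of reports, it converges on the wrong answer, which is exactly the vulnerability being described.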
The soldier/scout dichotomy closely resembles advice to keep an open mind, and all the usual witticisms about that apply:
“The trouble with having an open mind, of course, is that people will insist on coming along and trying to put things in it.” … “Your assumptions are your windows on the world. Scrub them off every once in a while, or the light won’t come in.” … “If you don’t believe in something, you’ll fall for anything.” … “Flexibility requires an open mind and a welcoming of new alternatives.”
As Thucydides mentions, encouraging an open mind is not very novel or helpful by itself.
Perhaps the answer to my question is yes, that Galef does explain how scouts should recognize and handle bad information, whether that bad information is malicious or simply in error. However, I struggle to draw a clear line between that — which implicitly has some core epistemology that resists easy change — and aspects of a soldier mindset.
1) Cogent punchline in your review!
2) Re:
“Galef suggests that one good habit is to cultivate friends who model the scout mindset.
‘One of the biggest things you can do to change your thinking is to change the people you surround yourself with. We humans are social creatures, and our identities are shaped by our social circles, almost without our noticing.’ (p. 219)”
Here, too, there is a tension between (a) individual strategy to maximize one’s scout mindset and (b) aggregate social dynamics. If scouts self-sort into a closed group, then they thereby relegate soldiers to a closed group. Won’t soldiers, too, become ‘more so’? My intuition is that ongoing social interaction and dialogue, however imperfect, across mindset types/hybrids is crucial to a healthy modern society.
Or perhaps I generalize unduly by analogy to Charles Murray’s dichotomy of Belmont and Fishtown?
I have to say that I am getting tired of dualism everywhere. Dividing the world into the Good and the Bad.
Mr. Kling, you have condemned such either/or thinking in other places.
But now you have been taken with the scout vs soldier dichotomy, and we are back to dualism. Scout good, Soldier bad. I don’t like it.
It is like Charles Reich’s Consciousness Two versus Consciousness Three.
I want a classification system that doesn’t make one approach morally and mentally superior to another. In ecology, the health of the biosphere depends on diversity. Many different styles are needed. I think Jonathan Haidt understood this.
So make up your mind. You dump on populists for lacking an ideology, but when never-Trumpers are all over the map, that is “scout mentality.” Sadly, that seems to be where FIT has ended up too: “my clique good, everybody else bad.”
“It looks as though this is going to be a good year for new books.”
Reminder 1: the new Charles Murray book will drop next week (6/15).
Reminder 2: “The Inconvenient Minority” drops in mid-July (7/13).
Looking forward to reading both of them.
“And why beholdest thou the mote that is in thy brother’s eye, but considerest not the beam that is in thine own eye?”
Sure, most people should be more humble, self-critical, open-minded, and charitable. The trouble is, not only do people not do this often enough, they *are really bad at it* even when they try. Even when they try really hard and have a lot of practice and training, they are still really bad at it, because they are fighting an uphill battle with their own psychological nature. It’s very easy to fool yourself into thinking you have been a good scout and thus stick to attractive error.
On the other hand, people are naturally *really good* at finding flaws and poking holes in theories with which they disagree, or claims from people they don’t like. As easy as it is to fool yourself, it is really hard to fool your competition or opponents or enemies into going along with a weakness in your arguments.
The question is which sets of norms and designs for epistemic institutions should one favor in the light of this stark asymmetry in human nature and cognition.
If a scout is fooling herself and clinging to error, she would need – indeed want – a more accurate scout to come along and vigorously argue her out of her view. How are these encounters going to happen? What are the incentives driving these processes that will gradually bring most scouts into accord with the most accurate theory?
In the marketplace, there is only one obvious answer to any of this, which is adversarial competition. In our legal tradition, we have found no better way to guarantee fairness, accuracy, and due process. In publishing, we have editors, reviewers, and proof-readers, all looking our work over for errors and mistakes. Whenever results matter, we have murder boards, scrub-downs, stress-tests, etc., to try to avoid disaster by finding as many problems and loose ends as we can before implementation, and in these matters one wants a separate set of eyes and a certain amount of ruthlessness in one’s inspectors.
In the book’s terminology, epistemic soldiers are bad without intellectually redeeming features, while scouts are purely good. But the problem is, aside from the tell-tale signs of nasty interaction, how does one tell the difference between a scout arguing a different belief in good faith from a soldier arguing in bad faith, especially since everyone *thinks* they are being a good scout?
My concern is that this is just yet another way for people to abuse language and prematurely dismiss opposing viewpoints: unfairly labeling them with a denigrating epithet and applying “asymmetric insight,” avoiding engagement with contrary points on the merits and instead personalizing debates by diagnosing bad motivations, pretexts, and inadequate levels of self-awareness. We already have too much of that going on.
“In the book’s terminology, epistemic soldiers are bad without intellectually redeeming features, while scouts are purely good.”
Intellectually, I think this is pretty much true. The issue is that for knowledge to be actionable may require a degree of ‘soldierlyness.’ If everyone in an army is independently looking for the best place to camp, it may ‘on average’ find the best site to camp, but will be scattered about it, whereas if soldiers are somewhat prone to fall in line behind a suboptimal but good enough campsite, the cost of suboptimality is made up for by greater unity. There are some things, in fact, where it’s more important we all do the same thing than do the right thing (e.g., it’s more important that we all follow the same traffic laws than that any of us follow the optimal set of traffic laws).
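For what it’s worth, the trade-off in the campsite analogy can be made concrete with a toy simulation (my own sketch, not from the book or the comment above; the site values, noise level, and coordination bonus are all assumed numbers):

    # Toy model: when does "falling in line" behind a good-enough choice beat
    # everyone independently chasing the best one? All numbers are assumptions.
    import random
    random.seed(0)

    N_SOLDIERS = 100
    SITES = {"best": 10.0, "good_enough": 8.0, "poor": 3.0}  # true campsite values
    COORDINATION_BONUS = 3.0  # assumed extra value per soldier for camping together
    NOISE = 4.0               # assumed error in each individual's estimate

    def noisy_pick():
        """Pick the site that looks best after adding estimation noise."""
        noisy = {site: value + random.gauss(0, NOISE) for site, value in SITES.items()}
        return max(noisy, key=noisy.get)

    def scattered_army():
        """Every soldier judges independently; only the largest cluster gets the bonus."""
        picks = [noisy_pick() for _ in range(N_SOLDIERS)]
        quality = sum(SITES[p] for p in picks)
        biggest_cluster = max(picks.count(site) for site in SITES)
        return quality + COORDINATION_BONUS * biggest_cluster

    def falls_in_line():
        """Everyone follows one scout's (possibly suboptimal) pick and gets the bonus."""
        pick = noisy_pick()
        return N_SOLDIERS * (SITES[pick] + COORDINATION_BONUS)

    trials = 2000
    print("scattered:", sum(scattered_army() for _ in range(trials)) / trials)
    print("in line:  ", sum(falls_in_line() for _ in range(trials)) / trials)

With these assumed numbers, the unified army does better on average even though its chosen site is sometimes not the best one; shrink the coordination bonus and the scattered scouts win. The sketch only illustrates that which side wins depends on the size of that bonus.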
But with public policy, I don’t think unified action really has that much value. The system imposes the necessary degree of regimentation. If each voter does the math and picks the best candidate, only one wins; we all end up at the same campsite anyway.
“But with public policy, I don’t think unified action really has that much value.”
Certainly it has value to people pushing the interest of their group (and, by extension, themselves). There are, for instance, many voting blocs/interest groups which can reliably provide nearly uniform support for a candidate/party/bill. They tend to get the results they want, even when those results are unfair to less coordinated outsiders.
Haidt’s literal example of this from his book is a coordinated phalanx that surrenders its decision making to the group overwhelming an uncoordinated horde of individual fighters.
“If each voter does the math and picks the best candidate”
Best for Whom?
Democracy isn’t disinterested people choosing the best candidate for the nation. It’s interest groups trying to seize power for their own ends. The purpose of democracy is to provide a less violent way for those power struggles to take place, but it doesn’t make them go away.