
Stubborn Fools and the Arrogantly Open-minded

Published online by Cambridge University Press:  09 September 2025

Laura Frances Callahan*
Affiliation:
Department of Philosophy, University of Notre Dame, Notre Dame, IN, USA

Abstract

The thought that intellectual arrogance consists in, roughly, overconfident resilience in one’s beliefs has been influential in philosophy and psychology. This thought is in the background of much of the philosophical literature on disagreement as well as some leading psychological scales of intellectual humility. It is not true, however. This paper highlights cases (of “stubborn fools” and the “arrogantly open-minded”) that cause trouble for equating intellectual arrogance with overconfident belief resilience. These cases are much better accommodated if we see intellectual arrogance as, instead, a form of vicious intellectual distraction by the ego.

Information

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of The Canadian Journal of Philosophy, Inc

1. Introduction

Bob is at Thanksgiving dinner, and the topic has turned to politics. He finds an occasion to trot out his current favorite claim—“The whole border wall would have only cost each American $0.03, if we’d just gone ahead and done it when Trump wanted to the first time around!” His nieces and nephews express skepticism and try to give him reasons. But Bob does not listen. Those idiots do not know what they are talking about.

Bob seems intellectually arrogant. But what, exactly, makes him arrogant? Is his arrogance precisely his overconfidence and stubbornness in the face of disagreement—his disposition to hold his view confidently and to resist reconsidering, even when challenged by others who disagree? As Levy (2023b, 3143) has it, "It's a truism that intellectually humble people moderate their views in the light of evidence." Perhaps, then, intellectually arrogant people are precisely those who do not. This is a simple, powerful sort of view. And although it is not a view that has been explicitly defended in philosophical debates about the nature of intellectual humility and intellectual arrogance,Footnote 1, Footnote 2 it nonetheless seems to be influential in psychology as well as some other corners of philosophical literature (more on which in Section 2). It will be helpful to have a label and official definition for this type of view, so here goes:

Hold Fast. Intellectual arrogance consists in dispositions to be overly confident in one’s beliefs, and/or to be overly resilient in one’s beliefs.

I’m going to argue that Hold Fast is the wrong way to think about intellectual arrogance. And Levy’s “truism” is false.Footnote 3 Overconfidence relative to one’s evidence and stubbornness in the face of evidential or interpersonal belief challenges are neither necessary nor sufficient for intellectual arrogance, which is essentially something else. My particular suggestion is that arrogance is a flavor of the viciously prideful disposition to be distracted in one’s thinking by one’s intellectual ego. Bob is not arrogant simply in virtue of being overconfident or resilient in his beliefs. He is arrogant in virtue of being intellectually distracted in his responses to disagreement, allowing attention to his own ego to hinder a fair and appropriate response to his epistemic environment—that is, his family members.

So, one aim of this paper is to support conceiving of (intellectual humility and) intellectual arrogance in terms of distraction. I have argued elsewhere that intellectual humility is freedom, in one's thinking, from prideful distraction by one's intellectual ego (Callahan 2024). Here, I will suggest that intellectual arrogance is one particular viciously prideful disposition, at odds with intellectual humility. There are also other flavors of vicious and distracting intellectual pride: intellectual vanity, intellectual domination, etc.Footnote 4 But these will not concern me here.

My second aim in the paper is to highlight a variety of cases—cases of “stubborn fools” and those who are “arrogantly open-minded”—that seem to me to deserve much more attention in the literature, regardless of the merits of my particular analyses of intellectual humility and intellectual arrogance. These are cases that complicate the Bob-inspired thought that there’s a very tight connection between being intellectually arrogant and sticking to your guns, or being overly confident and resilient in your views.

I’ll start in Section 2 by reviewing some of the strands in the literature that support or express a Hold Fast view of intellectual arrogance. This is particularly important because, again, Hold Fast is not a particular account of arrogance that anyone has defended as such. It’s also important because the objections of Sections 3 and 4 might seem so decisive that one might worry Hold Fast is a bit of a straw man. My aim in Section 2 is to show that Hold Fast is an influential background thought in the philosophy of disagreement, and, perhaps more worryingly, it is roughly the account of arrogance suggested by the existing psychological scales of intellectual humility.

With this established, I’ll argue in Section 3 that not all overconfident, resilient believers are arrogant, and in Section 4 I’ll highlight cases of non-resilient, arrogant intellectual behavior. Sections 3 and 4, then, suggest that Hold Fast is the wrong way of thinking about intellectual arrogance. In Section 5, I’ll sketch an alternative, distraction-based account and explain how it correctly diagnoses the cases reviewed in Sections 3 and 4.

2. Arrogance, Steadfastness, and Resilience in the Literature

Let us start with the dialectic in the disagreement literature that casts steadfasters as necessarily arrogant—or at least, unhumble. The picture is that steadfasters are doomed to be (or at least to risk seeming) dogmatic or arrogant, whereas conciliationists are (or at least often seem) admirably modest or humble. Or, as Matheson (2024, 326) puts it, "Considerations involving intellectual humility have often been used to motivate a conciliatory view of disagreement."

One quick note on the terminology here: "dogmatic" and "arrogant" are often used synonymously in this literature.Footnote 5 For example, whereas Hazlett (2012) interdefines intellectual humility with intellectual dogmatism, Hazlett (2017) interdefines intellectual humility with intellectual arrogance, calling the latter account of intellectual humility a "variant" on the former. Similarly, modesty and humility have long been treated nearly interchangeably, with philosophers working on humility interacting seamlessly with, for example, Driver's (1989) and Ben-Ze'ev's (1993) work on modesty (and vice versa). So, where we see conciliationists claiming "modesty" or accusing steadfasters of "dogmatic" thinking, I think we are also seeing a background way of thinking about arrogance (and, relatedly, humility).

This is exactly what we do see, for example, in some of Christensen's work defending conciliationism. Christensen claims, "Maintaining my belief in the face of this sort of disagreement (i.e., peer disagreement) can seem dogmatic" (2010, 206, parenthetical added). And Christensen (2013) links conciliation, in contrast, with epistemic modesty:

[T]he disagreement of others who have assessed the same evidence differently provides at least some reason to suspect that we have in fact made such a mistake; and that reason to suspect that we have made a mistake in assessing the evidence is often also reason to be less confident in the conclusion we initially came to. The rationale for revision, then, expresses a certain kind of epistemic modesty. (Christensen 2013, 77)

According to Christensen, then, maintaining one's view despite peer disagreement would instantiate overconfidence in the face of new reasons. And because maintaining one's view would be overly confident, epistemic modesty (which, again, I'm assuming is synonymous with intellectual humility) precludes it. Note that Richard Feldman (2006, 213) too links conciliation and the giving up of one's beliefs in response to persistent disagreement to humility.Footnote 6

But worries about the arrogance of remaining confident in the face of disagreement are not limited to card-carrying conciliationists. Hazlett (2012) argues that it may be reasonable to be steadfast in some peer disagreement cases—that is, to maintain one's first-order belief—in spite of a seeming flirtation with intellectual dogmatism or arrogance, because what would really be viciously arrogant would be maintaining the higher-order belief that one's initial belief is reasonable. For Hazlett, then, steadfastness about higher-order attitudes would indeed be arrogant, although garden-variety steadfastness needn't be.

Again, Carter and Pritchard (2016), though not conciliationists themselves, seem to identify conciliation toward a disagreeing party with an exercise of intellectual humility. They write:

A widely shared insight in the disagreement literature is that, in the face of a disagreement with a recognised epistemic peer … the epistemically virtuous agent should adopt a stance of intellectual humility—that is, a stance where one exhibits some measure of epistemic deference by reducing one’s initial confidence in the matter of contention. (51)Footnote 7

Finally, echoing this theme—and indeed, attempting to provide empirical support for it—a recent paper by Beebe and Matheson (2023) analyzes data aimed at testing the hypothesis that intellectual humility is linked with conciliation in the face of peer disagreement. They find correlations between conciliatory responses to disagreement and intellectual humility, which they suggest bolsters the connections posited in much of the literature just reviewed.Footnote 8

So, much of the disagreement literature links intellectual arrogance with a disposition to belief resilience/stubbornness and overconfidence. Much of the disagreement literature, then, assumes roughly a Hold Fast picture of intellectual arrogance. But it's not just philosophers of disagreement who seem attracted to something like Hold Fast. Psychologists developing measures of intellectual humility (and hence, indirectly, intellectual arroganceFootnote 9, Footnote 10) also seem to be in Hold Fast's grip. For the prominent psychological scales of intellectual humility all seem to home in on subjects' willingness to change their beliefs or give them up—that is, subjects' lack of belief resilience and (in some cases) low confidence in their beliefs.

For example, Krumrei-Mancuso and Rouse’s influential scale (2016) calls “openness to revising one’s viewpoint” one of the four factors that enter into overall intellectual humility. Similarly, Leary et al.’s scale (2017) asks participants to rate their agreement with the following six items:

  • I question my own opinions, positions, and viewpoints because they could be wrong.

  • I reconsider my opinions when presented with new evidence.

  • I recognize the value in opinions that are different from my own.

  • I accept that my beliefs and attitudes may be wrong.

  • In the face of conflicting evidence, I am open to changing my opinions.

  • I like finding out new information that differs from what I already think is true.

Notice that most of these items—the third and possibly the sixth being exceptions—explicitly reference a willingness to abandon one's current beliefs. The intellectually humble person, as painted by this scale (and, perhaps to a lesser extent, as commonly conceived), is a person who does not hold on to their views tightly, who is willing to revise. Given the (significant, if not complete) interdependency of intellectual humility and intellectual arrogance, the arrogant person will thus be a person who holds their beliefs tightly and is not open to changing their mind.

Finally, Hoyle et al.'s scale (2016) includes the items: "My views about _______ are just as likely to be wrong as other views," "I am open to new information in the area of _______ that might change my view," "My views about _______ today may someday turn out to be wrong," and "My views about _______ may change with additional evidence." Here again, we see the intellectually humble person painted as one who is open to changing their beliefs or turning out to have been wrong, whereas the intellectually arrogant person—one assumes—will necessarily lack such openness.

In the following two sections, I’ll argue that, contrary to what these conversations in philosophy and psychology suppose, belief resilience and overconfidence are neither necessary nor sufficient for intellectual arrogance. Hold Fast is the wrong way to think about intellectual arrogance, and this is important because it leads us both to miss other manifestations of arrogance (and other varieties of vicious pride) and also to improperly diagnose the failings of people who are stubborn or overconfident in their beliefs. I’ll start with this latter issue by highlighting cases of “stubborn fools”—people who are overconfident or resilient in their beliefs but not intellectually arrogant.

3. Not All Overconfident/Resilient Beliefs Are Intellectually Arrogant

My point in this section is, I hope, an intuitive one. There are many ways of being overconfident or stubborn in one's beliefs, and many different motivations or causes driving such epistemic behavior. This is so in general and also in the specific cases of responding to disagreement, with which our quick survey of philosophers in Section 2 was primarily concerned. I'll briefly canvass four types of overconfident or resilient beliefs that are not necessarily arrogant. Some of these beliefs—perhaps all—are liable to censure, including epistemic censure. But, just as not all bad actions are arrogant actions, not all bad beliefs are arrogant beliefs.

3.1. Rational Mistakes

Suppose Jim does not realize that the gambler’s fallacy is a fallacy. Jim’s just totally convinced that the roulette wheel has to come up black (or whatever) the next time around because it’s been so long since it last did. Jim is clearly overconfident, relative to his evidence, in the proposition that the roulette wheel will come up black next time.

We can make the situation worse. Suppose you try to explain to Jim why his reasoning is fallacious, and he fails to reduce his confidence that the wheel will come up black. That is, suppose Jim is not only overconfident but stubborn or resilient. Still, I think, Jim might not be intellectually arrogant; he might be simply exhibiting more of the same, boring kind of psychological failures of which the gambler’s fallacy is an instance. Jim might fail to adjust and reconsider his belief because he’s lazy or exhausted and does not want to think hard right now. Jim might fail to adjust because, despite trying to fairly consider your point, he just cannot do the logic properly because of incompetence or ignorance—he’s drunk, undereducated, logically challenged, or confused about the concepts you are using. Or Jim might be just so fixated on his soon-to-come good luck streak that he cannot properly listen.

In all of these versions of the case of overconfident and/or resilient Jim, he is foolish and irrational. He’s clearly set to lose a lot of money. Is he intellectually arrogant, though, given his overconfidence and belief resilience? It seems not—or not necessarily anyway, given the brief descriptions here. Jim just seems to be making mistakes, the same way we all make rational mistakes. Sure, Jim’s particular mistakes result in overconfidence and belief resilience; he ends up behaving as one might expect intellectually arrogant people often to behave when challenged in their views. But this seems insufficient to make Jim arrogant, because it’s clear that Jim is not motivated by any arrogant concerns for his status or inflated self-image. We can think of Jim as being as humble as we like—he’s just confused, or drunk, or in the grips of a gambling addiction. These are problems, but they are certainly not the same problem as intellectual arrogance.

3.2. Jamesian Truth-Seekers

Next, consider the fabled agent who prizes James’s epistemic goal of believing truly much more highly than the goal of avoiding error. This is the agent who, legend has it, just tries to believe everything. Call this agent Saf.

Saf would be systematically overconfident about just about everything. Saf would presumably also be fairly unmoved by counterevidence, including in the form of disagreement, because, recall, they just do not care much about the possibility that they are wrong. But would they thereby be arrogant? Do we want to say that Saf—who simply does not care very much about avoiding error and so is ready to believe and maintain just about any view—should thereby be classed with the jerks of the world who interrupt people and distinctly lack the virtue of intellectual humility? I do not think so. Saf is weird and arguably irrational, probably incoherent—definitely criticizable somehow. But, again, not all doxastic failings are arrogant failings.

3.3. Epistemic Partiality

I’ll use myself as the next example. I take myself to have pretty great evidence for the following propositions:

  • My partner does not tell lies at work for no reason.

  • My friend Madeline does not steal from the grocery store.

I am confident in these beliefs, and, again, I take some degree of confidence to be well-grounded. But I may also be overconfident, and indeed prepared to continue believing these propositions, even in some evidential situations that would cast rational doubt on their truth. Were I to get some evidence that my partner was lying at work or that my friend has been stealing tomatoes, I might be sluggish to update on that evidence because of the personal relationships involved.

Now, of course, we can argue about whether there are genuine, moral reasons to overestimate our friends.Footnote 11 Perhaps instead our tendencies to overestimate our friends are ultimately (relatable) failings, as Hawley (2014, 2033) has it:

These (the phenomena used as examples in the epistemic partiality literature) are genuine phenomena, which indicate how likely we are to be led astray epistemically with respect to our friends. But these phenomena do not show that we have reasons of friendship to resist bad news (about our friends) in this way, or that norms of friendship require us to do so. (parentheticals added)

If Hawley is right, then perhaps there is no interesting difference between my partiality to believing in my friend’s goodness and, for example, Jim’s partiality toward the belief that the roulette wheel will come up black. But thankfully I do not need to take any stand on whether there are genuine norms of friendship that require epistemically partial behavior; it is not important to me whether partiality represents a truly distinct kind of case from Jim’s. What matters for my purposes is that partiality cases belong in the growing list of cases of overconfident, stubborn belief, where the belief in question just does not seem arrogant.

3.4. Strategically Resilient Beliefs

A final category of "stubborn fools" may hold their beliefs overly confidently and/or resiliently for what I will call strategic reasons. An agent may judge at some point in an inquiry, however tacitly, that it is time to close a particular question—to treat her answer as settled and to cease to inquire or consider evidence bearing on the question, at least absent some unforeseen occurrence demanding she do so. She may judge that she is better off treating the question as closed in this way and, indeed, that she is more likely to realize epistemic values by doing so. At a certain point, it is not worth investing further epistemic resources to keep an inquiry open, and there are significant epistemic and practical benefits to being able to treat a proposition as given in one's reasoning.

A sub-type of strategically resilient belief is faithful belief. The person with theistic faith, for example, will believe in God’s existence resiliently, according to many of the leading theorists of faith.Footnote 12 She will not be fully sensitive to changes in her evidential situation. While unforeseen circumstances may demand that she reevaluate—faithful beliefs needn’t be totally fixed and rigid—the faithful person as such will shrug off and fail to engage with a range of more routine challenges.

I claim that the faithful person whose beliefs are sticky or resilient in this way is not necessarily arrogant. Again, notice that we have a distinctly non-arrogant motive here, for resilience. Sticking with theism needn’t be about coming out right or maintaining some kind of status (though of course there are cases in which coming out right can motivate dogmatic religious beliefsFootnote 13). The faithful theist might persist in her faith commitment because it has the somewhat-unwavering quality of a faith commitment. Perhaps she consciously recalls that she at some point took herself to have strong reasons to go all-in on theism, and she recognizes that this is not the kind of unforeseen, in-your-face situation that would make reopening basic questions about the meaning of her life rational for her. Or perhaps her faith commitment simply operates at this point subreflectively, maintaining belief through routine challenges without her needing to re-certify the propriety of such moves.

A second type of strategically resilient belief might extend far beyond those propositions in which we have faith, to all of our proper beliefs. Lawlor (2014), drawing on Broome (2013) and Holton (2014), claims that the functional role of belief demands a degree of stability. She writes:

The functional role of belief is to coordinate thought with action. If you believe something, then the question of its truth is closed, and you are disposed to act without further investment of time and energy in gathering evidence. … For a belief to play its functional role, it must be stable—it must stick around so it is there when needed. (ibid., 1)

If that’s right—if all beliefs require at least a modicum of stability—then there should be some, perhaps minimal, range of evidence deflection that we employ on behalf of all our proper beliefs (as opposed to, say, our credences). In Holton’s view, this ends up looking like simply resisting reconsidering our beliefs. He claims:

Once I have formed a belief about some matter, it will sometimes be rational for me not to reconsider it, even though, were I to reconsider it, it would be rational to revise the belief. In particular, it will sometimes be rational for me to disregard the opinions of a peer, even if, were I to reconsider my belief in the light of those opinions, it would be rational for me to revise my own. (Holton 2014, 18)

Here too, I claim that resilient belief, maintained for the sake of navigating one’s life and making decisions, need not be arrogant. Constantly recalibrating our attitudes to our evidence would take considerable time, attention, and effort. And while we may want to say that failing so to recalibrate is epistemically regrettable and irrational, my point here is not to defend resilient or stubborn beliefs, full stop. I just want to argue that some of them are not arrogant. It’s not arrogant to have practical reasons for getting on with one’s life and avoiding some reconsideration of one’s beliefs. And at the risk of sounding like a broken record, not all sins are sins of arrogance.

3.5. Two Objections

I’ve suggested that non-arrogant people who are drunk or incompetent, or have weird epistemic values, or are partial to their friends, or have faith commitments, might be unmoved not only by counterevidence in general but by the disagreement of others. But you might think these people must be arrogant, insofar as they represent others’ opinions as inconsequential or unimportant for the question of what they ought to believe.

In response, I want to say first that, as I fill out the details of these cases in my own mind, these agents do not represent others’ opinions as inconsequential. Rather, the agents I’ve sketched are largely flummoxed by others’ opinions and maintain their original views by avoiding representing the views of others at all, rather than discrediting them.

Here’s an illustration of the kind of distinction I have in mind. Consider two Christians who are aware of all the same facts about religious diversity in the world but remain unmoved by disagreement in the sense of remaining confidently Christian. The first Christian, rather dizzied by trying to think through the issues posed by religious diversity, has decided to set this issue aside as too “lofty” for her; now she more or less avoids thinking about it. The second Christian decides, contrary to her evidence, that all the non-Christians in the world who seem to be smart and intellectually virtuous are actually idiots or villains or both, certainly not her epistemic peers and not deserving of any conciliation when it comes to her own beliefs.

The agents in my cases, as I imagine them, are more like the first Christian. They may be intellectually lazy. They may be deeply confused. They may even be self-deceived. But they do not actually regard themselves as epistemically superior to those whose beliefs they dismiss. And arrogance does not seem to be an apt criticism.

The second thing to say is that representing others’ views as inconsequential may be a hallmark of arrogance, but is no guarantee of it. Some people’s views are inconsequential. My three-year-old’s beliefs about most topics, for example. And even incorrectly representing someone’s view as inconsequential can be innocent, or non-arrogant. Sometimes we get misleading evidence about others’ levels of epistemic merit. Even if the agents in my cases do represent their interlocutors’ views as to-be-dismissed, this does not make them arrogant. Crucially, one wants to know why they represented their interlocutors’ views in this way—whether this was an arrogant mistake, an innocent mistake, or perhaps a non-arrogant-though-criticizable mistake fueled more by love or faith than by ego.

Take a partiality case. I could arrogantly dismiss your opinion that Madeline is a shoplifter because I want to think of myself as more perceptive than you or because I cannot stand not to come out right in a disagreement. But I think there’s another version of this kind of case, where, to be frank: it’s really not about you, okay? That is, I could simply ignore or dismiss what you have to say not because of anything about you—I may actively try not to think about how or why I’m dismissing your testimony—but rather because I simply struggle to think poorly of Madeline.

A second sort of objection I’ve encountered goes as followsFootnote 14:

Sure, many of these stubborn believers do not seem arrogant, per se. But many of them do seem overly self-centered or self-regarding. The partiality and strategic reasons cases especially look like cases where agents adopt particular epistemic stances or policies because these benefit the agents somehow. Maybe there’s an arrogance-adjacent vice here in allowing too much of the self or one’s individual goals and preferences into one’s epistemic life.

In response, I want to make a distinction between vices of greed and vices of pride. I want to say that some of the stubborn fools just surveyed are indeed open to charges of greed, by virtue of prioritizing their desires for strong friendships or stable frameworks for guiding decision-making. But they are not similarly open to charges of vicious pride or arrogance.

The greedy person overweights or overprioritizes her desires. Because she wants chocolate cake, she proceeds to eat all of it and leave none for others, for example. Perhaps the partial friend or the loyal theist is open to this kind of charge. Because she wants strong friendships, or a stable framework for guiding her life, she is willing to set aside epistemic duties like responding to all of her evidence. Greed is a self-involving vice, to be sure, because it involves inordinate concern with the desires that are one’s own.

And yet pride is different. In vicious pride, the self or ego is itself the object of one’s inordinate concern. One wants to be, not comfortable or happy or fulfilled, but excellent, worthy, important. Of course, a desire to be important can feed a desire to have the things to which important people are seemingly entitled.Footnote 15 Pride can fuel greed. And where what one desires is specifically status or importance, greed shades into pride. But at least in principle, and sometimes in practice, these vices come apart. The proud person needn’t be prone to eating all the chocolate cake, even when she really wants to. And the cake-hogger needn’t be particularly ego-obsessed. Even if many of the agents in this section are overly self-centered and greedy in prioritizing their idiosyncratic epistemic goals or practical ends, I maintain that they aren’t intellectually arrogant.

The overall point I’ve tried to make here in Section 3 is that it’s possible to be overconfident and resilient in one’s beliefs for reasons (or causes) that have nothing to do with an arrogant sense of superiority over one’s peers or an arrogant concern to be great or to best others. It’s possible for steadfast belief resilience to not really be about an interlocutor’s inferiority or one’s own greatness. And this poses a real problem for Hold Fast and related views on which overconfidence and resilience are sufficient for arrogance.

In the next section, I argue that overconfidence and resilience are, moreover, unnecessary for intellectual arrogance.

4. Not All Arrogant Beliefs/Believers Are Overconfident or Resilient

Remember Bob, from the introduction? Bob's intellectual arrogance was manifest in the confident resilience of his beliefs. I think it's focusing on cases of arrogance like Bob's that has led to the widespread assumption that belief resilience or stubbornness is an essential and indeed central aspect of intellectual arrogance.

Nonetheless, matters seem more complicated when we consider a broader range of cases. Take the following:

Sam has believed that fluoride is good for people’s teeth since she learned this as a child. She’s never thought much about it. One day she takes some clickbait and ends up, via obscure Reddit threads, gaining a ton of evidence that actually fluoride has no beneficial effects on teeth; it’s been promoted by the government for some sinister purpose. Sam finds these sources easy to believe; she immediately discards her previous, childhood-grounded belief that fluoride is good for teeth and embraces a new, “conspiratorial” view. In particular, Sam finds it easy to trust her own, independent take on the matter she’s arrived at via “doing her own research,” and she’s attracted to the way that this countercultural belief marks her as special.

Gio was raised in a nonreligious household, believing that Christianity was false. While in college, however, he wanted a way to stand out from the (secular) campus crowd. He declared himself a Christian and started going to church, much to the surprise and dismay of his family. Gio quickly became an apologist for his new faith; having seen through all the worldly, materialist bunk he was fed as a child, he felt especially well-placed to take on atheists’ attacks.

Just like Bob, Sam and Gio also seem intellectually arrogant. Why? Well, one Hold Fast-friendly aspect of their arrogance is manifest in the confident, resistant beliefs they end up holding. Take Sam’s conspiracy-theoretic case in particular. Conspiracy-style beliefs are famously entrenched and resistant to counterevidence. Similarly, in Gio’s case, we noted in the previous section that faith-based beliefs are often resilient in the face of (some degree of) counterevidence.

But I do not think this is the whole story. It seems to me that Sam and Gio are intellectually arrogant—not just in the confident, resistant beliefs in fluoride-conspiracies or Christianity they end up holding—but also in their conversion processes. I want to say they manifest intellectual arrogance in their very readiness to abandon their prior, conventional, or long-held beliefs. They are, I claim, arrogantly “open-minded”;Footnote 16 it’s because they are intellectually arrogant that they are so quick to jump on these novel-to-them bandwagons.

After all, if our intuitions about intellectual arrogance in Sam’s and Gio’s cases were simply driven by the confidence and resilience of their final beliefs, we ought to have similar intuitions about intellectual arrogance in the following two cases:

Sam* has believed that fluoride is a government conspiracy since she learned this as a child. She's never thought too critically about it—the government is always out to get you, and you just have to protect yourself. She's aware that millions of people blithely drink the fluoridated water and think it's fine, but for her this is a closed question. Sam* tries not to think about how so many people could be so wrong and just gets on with her RO filtered-water life.

Gio* was raised in a Christian household and cannot imagine living as though there were no God. Christianity shapes all of his moral beliefs and his general outlook on life; he never wants to stop following Jesus, no matter what. He tries to maintain Christian company and steeps himself in Christian media to protect his faith from the deceit and temptation of “the world.”

Sam* and Gio* remind me of the stubborn fools from the previous section. They may be irrational in their resilient, stubborn beliefs. But Sam* and Gio* do not seem intellectually arrogant in the same way (or to the same degree) as Sam and Gio, who forged their own, triumphalist way to arrive at the same confident, resilient beliefs. There’s something additional going on in Sam’s and Gio’s cases, then, that Hold Fast cannot diagnose.

I'm interested in the intellectual arrogance that can be specifically manifest in an eagerness to forge one's own way in one's beliefs, to "do one's own research," and to reject previously held beliefs in a triumphalist way. I want to be clear that I do not think all desires to do one's own research are intellectually arrogant. Levy (2022) suggests, for example, that prioritizing the acquisition of understanding over the acquisition/maintenance of knowledge and justification might motivate a disposition to do one's own research; this does not seem intellectually arrogant per se. Neither does a kind of self-reliance and distrust of others/authorities borne of psychological damage and fear. My claim here is that some instances of eagerness to revise one's beliefs or desires to do one's own research manifest intellectual arrogance, and this claim is interesting in the present context because it's at odds with a Hold Fast picture of what intellectual arrogance is.Footnote 17

In the next section, I’ll propose an alternative way of thinking about intellectual arrogance that can help explain why Sam’s and Gio’s conversions manifest arrogance (and why Sam* and Gio* and the stubborn fools from Section 3 do not). First, however, I just want to make two points that might help a still-skeptical reader see Sam’s and Gio’s conversion processes as intellectually arrogant.

First, and focusing on Sam, psychologists have linked the attraction to conspiracy-style beliefs with arrogance-adjacent traits like narcissism, entitlement, and "need for uniqueness." For example, Imhoff and Lamberty (2017) found that:

Across three studies, the desire to see oneself as unique was weakly but reliably associated with the general mindset behind conspiratorial thinking, conspiracy mentality, and the endorsement of specific conspiracy beliefs. These results thus point to a hitherto neglected function of conspiracy theories: to present oneself as distinct from the crowd.

Indeed, the need for uniqueness appears to play a modest causal and not merely correlational role in conspiracy belief formation. Lantian et al. (2017) experimentally manipulated need for uniqueness and increased conspiracy belief in study participants. Other studies, including those described in Cichocka et al. (2016), have found positive associations between conspiracy beliefs and individual narcissism.Footnote 18 Finally, Neville, Fisk, and Ens (2025) find that psychological entitlement—that is, "the dispositional tendency to claim excessive and unearned rewards and resources, and to demand undeserved special treatment"—is associated with conspiracy theory endorsement and conspiracy theorizing as an overarching cognitive style.

Now, of course the psychological study of conspiracy beliefs and mindsets is ongoing; it would be unwise to put much stock in individual studies. And only some of these studies suggest that arrogance-adjacent traits motivate conspiracy theoretic beliefs, as opposed to merely correlating with them. But insofar as psychological research suggests that arrogance-adjacent traits not only result from but motivate some conversions to conspiracy beliefs, we should not be surprised to find Sam intuitively arrogant—and arrogant in a way that outstrips any arrogance we see in Sam*.

The second point that may help readers see Sam as arrogant is that, although a Hold Fast picture of intellectual arrogance (and perhaps also broader tendencies in epistemology) leads us to look to agents' beliefs for any manifestations of intellectual virtue or vice, it seems clear on reflection that other mental states and processes can evince intellectual virtue/vice too. Think about virtuous or vicious ways of gathering evidence, asking questions, or more broadly directing our attention and conducting inquiry; think about virtuous or vicious ignorance. Not only in our beliefs but also in the ways we open or close questions, we can be biased or brave, curious or careless. And I think it's clear that arrogance and humility in particular can be manifest in some of this wider intellectual activity. Take, for example, arrogant ignorance: ignorance of one's limitations, ignorance of the best reasons for one's opponents' views, perhaps also the carefully preserved "white ignorance" that Mills (2007) explores.

Once we are looking at the broader mental states and inquiry-relevant dispositions of agents like Sam and Gio, we might see that they are keen to ask questions like: Where does conventional wisdom go astray? What are other people missing here? Sam and Gio seem to give a lot of weight to their independent evaluations of relevant evidence and to be relatively dismissive of “authorities.” And importantly, at least in the way I fill out the details of these vignettes in my mind, they both seem biased in favor of concluding inquiry in a way that allows them to feel special and smart.

In this section and the previous one, I’ve highlighted cases of stubborn fools (Section 3) and the arrogantly “open-minded” (Section 4) that make trouble for a Hold Fast picture of intellectual arrogance (which, as we saw in Section 2, has been influential). It remains to present an alternative, positive sketch of intellectual arrogance.

5. The No-Distraction Account of Humility and Implications for Intellectual Arrogance

I have motivated and argued for a particular view of intellectual humility elsewhere, called the no-distraction account.Footnote 19 According to the no-distraction account of intellectual humility, that virtue fundamentally consists in freedom, in one’s thinking, from prideful distraction by one’s intellectual ego. The person with intellectual humility enjoys a certain relative ability to think clearly or focus; her intellectual energies are not often diverted by thoughts about her own status or intellectual abilities, and her thinking does not suffer distortion from the need to validate her own beliefs, as such.

In contrast, the person distracted by pride may struggle to, for example, pay attention in a philosophy talk because they are too preoccupied with coming up with a brilliant question to ask in front of the august audience members. Or they may struggle to write or research because they are consumed by monitoring to make sure they are writing well (and, perhaps, because they fear they are writing poorly). Or they may struggle to fairly appreciate an argument, whether written or in conversation, because they are running a background cognitive load of worry about how this argument will make them look foolish or brilliant, given their previous doxastic commitments. As these examples illustrate, we do not always know when and/or feel how we are distracted by pride; in the viciously proud person, such distractions can operate at both conscious and subconscious levels.

This way of thinking about intellectual humility draws on a tradition of thinking about humility in general as a kind of transparent virtue, marked by the ability of its possessor to simply do or see other, more important things. For Murdoch, for example, humility is a centrally important human virtue that allows us to see reality beyond ourselves. In her words, "Humility is not a peculiar habit of self-effacement, rather like having an inaudible voice, it is selfless respect for reality and one of the most difficult and central of all virtues" (2014, 93).

Now, I realize this little sketch of the view leaves many questions unanswered. I cannot and will not try to rehearse my previous arguments or respond to all pressing objections. What I want to do here is explore, first, how this view suggests we think about intellectual arrogance, and, second, how the resulting picture of intellectual arrogance can explain and predict the kinds of cases I’ve presented in the present paper.

Intellectual arrogance, in my view, is a species of vicious and distracting pride. There are other species. In particular, arrogance seems to me to involve an inflatedFootnote 20 view of oneself and one’s merits, whereas other forms of vicious pride like vanity do not, or not necessarily. But it is not the specifics of arrogance that I think do the explanatory work of correctly sorting stubborn fools and the arrogantly open-minded; it is, rather, that intellectually arrogant agents are among the viciously proud agents distracted by their egos. Intellectually arrogant agents are concerned with having high status and importance as intellectual agents, in a way that hinders or precludes valuable intellectual activity. And we can thus expect intellectually arrogant agents, as such, to seek status and importance in ways that are epistemically unproductive or counterproductive, distracting. This status-seeking tendency can, I believe, explain why overconfidence and resilience in the face of challenges to our beliefs can manifest arrogance, but it can also explain how certain conversion processes manifest arrogance too. I’ll take these in turn.

5.1. Belief Challenge as a Threat to Status

To encounter evidence that throws doubt on a previously held belief, or to encounter a person or source who believes differently, is to encounter a prima facie threat to one’s status and importance as an intellectual agent. This is because, by the lights of the new challenge, the belief one currently holds or previously held is wrong.

Now, it is no sin to be wrong. To be wrong is not to be intellectually unimportant; it is not even necessarily to be irrational or blameworthy in one’s particular, false belief. But being wrong or in error does tell in some way, however large or small, against one’s being particularly epistemically excellent and important. When one’s belief is false, it seems very likely that either one’s epistemic position—roughly speaking, one’s evidence or one’s perspective on the issue at hand—or one’s epistemic conduct or evaluation of one’s evidence was wanting.

To have evaluated one’s evidence poorly is clearly an epistemic defect bearing on the intellectual excellence or importance of an agent. If the reason one previously held a false view is, in part, that one was sloppy, flatfooted, or confused in one’s thinking, then, well, perhaps one is not really an important, excellent thinker of high status. If instead only one’s epistemic position was wanting—if one simply wasn’t in a position to see the relevant facts bearing on a question—this might not initially seem like a blow to intellectual status or importance. Why would it damage my status to admit that I previously had access to a bad batch of evidence, over which I may have had no control? But I think the answer is that agents do (feel they) have some control over the evidence they have, and it is more impressive, ceteris paribus, if an epistemic agent is not only highly competent but highly effective, having transcended or avoided any limitations stemming from a poor informational environment.Footnote 21

Another dimension of variation, relevant to the cases presented earlier in this section, is also important. Agents feel more ownership over some of their beliefs than others. Beliefs that are actively, rather than passively, acquired, where agents have put time and reflection into making up their minds, may be more tightly bound up with agents’ epistemic status than beliefs acquired passively or casually. Going back to the introduction: consider Bob’s pet political view on the border wall, which he’s presumably been thinking about for years. Bob’s belief about the border wall is something he had to glean from his own special sources, and it’s something he’s decided to take a stand for and to associate with his identity as a political thinker. If this theory turns out to look stupid, then, well, Bob turns out to look stupid.

On the other hand, Sam and Gio feel no very strong degree of ownership over their beliefs about fluoride or atheism. These are just things they accepted, passively and reflexively, as kids. They believed them, yes. But we can feel more or less agency over our beliefs, and Sam and Gio felt little agency over these. They would not have said that these beliefs manifested much of their intellectual greatness or importance (though, given that they seem to be generally intellectually arrogant, there were presumably other beliefs that they thought did manifest it).

So, the extent to which a challenge to one of our beliefs constitutes a threat to our status seems to depend in part on (i) the available explanations as to why our initial belief would have been wrong (Do these explanations tend to paint us as grossly epistemically irresponsible, or just understandably epistemically unlucky?); and also (ii) the extent to which the belief in question is bound up with our sense of epistemic agency (How passively versus actively has this belief been acquired, and to what extent do we feel this belief manifests our intellectual character?). All challenges to one's beliefs seem to constitute some threat to one's status. But the extent of the threat will vary from belief to belief and from agent to agent. The theory that a scientist has spent her career defending is very different from her casual, passive belief that fluoride is good for teeth. Challenges to the former typically present a stronger threat to her intellectual status than challenges to the latter.Footnote 22

In general, then, it seems we should expect arrogant people to be more resilient in those of their beliefs over which they feel more ownership, and in those of their beliefs whose error cannot be easily, "innocently" explained.

This will not, however, fully explain our cases in the previous section. Recall that Sam's and Gio's arrogance seems to be manifest in their very eagerness to revise. Here, I've defended the claim that all challenges to one's beliefs constitute some threat to status, and this would seem to support the claim that all arrogant people tend toward belief resilience—in some cases more than others. But in the following subsection, I want to complicate the picture by extending our reflection on how belief revision can sometimes present opportunities for greater agency over one's beliefs.

5.2. Belief Challenge as a Status-Gaining Opportunity

Arrogant people, as such, love importance and status. But arrogant people needn’t be (small c) conservative. They needn’t care whether they attain/maintain importance and status by protecting their preexisting views or by adopting novel, different ones. And challenges to one’s beliefs can also present status-gaining opportunities.

We've already noted that passively acquired beliefs seem less bound up with one's intellectual status than actively acquired views. Passive beliefs, ceteris paribus, also seem to contribute less to one's intellectual excellence and importance. It is better, from the perspective of the arrogant, not only to possess truth and knowledge but to possess them actively, by virtue of having figured things out "for oneself."Footnote 23

This would seem to generate a pro tanto preference for actively acquired beliefs, among the arrogant. And challenges to our beliefs—calls to revise and rethink—can pose opportunities precisely for taking active ownership over those beliefs. Where previous views were passive or unreflective, a new view stands to redound much more strongly to the believer’s credit.

I think something like this is going on in the cases of Sam and Gio. As we reviewed in the previous subsection, it does not cost them much to admit that they were wrong about fluoride or atheism. And, moreover, notice that they actually get a lot of (self-perceived, and possibly other-perceived as well) status out of having seen through, for themselves, their previous errors. Sam and Gio face situations where abandoning their prior beliefs affords the opportunity to take a more active and important role, as epistemic agents, relative to the role they play simply maintaining the beliefs they acquired passively in their families. Whereas Bob saw his status as an intellectual agent as bound up with defending his pre-existing position, Sam and Gio perceived that a yet greater status was available to them as independently-minded apostates.

Here, we see a quite general way in which challenges to passively held beliefs not only pose less of a threat to the arrogant, but present positive opportunities for gaining intellectual status. It makes sense, then, that the intellectually arrogant would display a pro tanto eagerness to root out passive, unreflective beliefs that would compete, in individual cases, with the general arrogant reluctance to admit error surveyed in the previous subsection.

This arrogant eagerness to revise might be supplemented in some cases by the specific attractions of iconoclastic views. Unusual views that run counter to “mainstream” narratives may stand to put one in a position of influence or esteem. Think about Sam and the non-helpfulness of fluoride, or the general attraction to conspiracy theories. An arrogant person, particularly concerned to see themselves as intellectually important, will be particularly susceptible to the promise of being in an elite club of knowers.

One further wrinkle here. Revising one’s beliefs affords an opportunity to take active ownership over them. Revising one’s beliefs to adopt iconoclastic positions affords an opportunity to join an elite and fashionable group. But, in addition, revising one’s beliefs sometimes affords us the opportunity to tell triumphalist narratives, about how we overcame—presumably, once and for all—our previously limited perspectives or confused thinking. In particular, I’m thinking about cases in which one’s previous error seems to have been engineered by some deceptive third party. For Sam, it’s the government who propounds the fluoride “hoax.” For Gio, maybe it’s just “those Godless heathens” who dominate secular culture. But the specter of a force that one has to overcome or beat in order to gain the truth seems to add a special importance and excellence to such gains—at least, that is, by the lights of the arrogant who characteristically want not only to be important and high-status but more important and higher status than others. Arrogant people love winning. And while we sometimes win by maintaining a belief in the face of challenges, we can also win by triumphally seeing through the deceptions that lead us to our original, passive views.

All of this is consistent with the obvious fact that arrogant people are often stubborn—and arrogantly stubborn. My claim in this section is that arrogant people, as such, will display certain patterns of stubbornness but also of “open-mindedness,” or willingness to abandon prior beliefs.

6. Conclusion

I’ve argued that the relationship between intellectual arrogance and confidence or resilience in one’s views is complex, and I’ve highlighted certain kinds of cases that deserve greater consideration: cases of non-arrogant resilience, on the one hand, and arrogant “open-mindedness” on the other.

Separating belief resilience from arrogance, both in general and also in the specific case of responding to disagreement, has at least three important implications.

First, blanket tendencies to criticize steadfast and resilient believers as arrogant are wrongheaded. Many resilient beliefs—even many resilient beliefs that seem liable to other forms of epistemic criticism—have nothing to do with arrogance and do not manifest it.

Second, avoiding intellectual arrogance does not necessarily require one to be non-resilient in one’s views. This is important in allaying a concern one might have about trendy claims to the effect that intellectual humility is the cure to what ails us as a society (including myside bias, polarization, etc.) and that we all need to figure out how to have more of it. Namely: it is plausible that there are some things we really should believe resiliently—things like the moral equality of all persons, for example. Intellectual humility can seem downright dangerous (and intellectual arrogance can seem downright adaptive) if we think of intellectual humility as traveling with a general readiness to revise one’s beliefs, and if we think that a lot of our current beliefs are, well, true and important.

Finally, some intellectually arrogant people, who deserve censure as such, aren't stubborn and fixed in their views. Changing one's mind in response to a challenge or a dissenting peer can be a mark of virtue. But it can also manifest or breed vice. DiPaolo (forthcoming, 20) offers a case, for example, of a religious convert who quickly becomes extremely closed-minded; the fact that they converted "open-mindedly" comes to serve as a license for them to stop listening to further challenges to their beliefs. My cases here are a bit different because I've been interested in cases where belief revision manifests rather than breeds vice. But the spirit of the warning is similar: conversion is no guarantee of epistemic holiness. And we need to be careful not to miss the arrogance of the Sams and Gios of the world, who are indeed intellectually arrogant in their very willingness to change their minds—when doing so affords the ability to tell a self-aggrandizing story.Footnote 24

Laura Frances Callahan is an Associate Professor of Philosophy at the University of Notre Dame. Her research interests cluster within epistemology, ethics, and the philosophy of religion, and she led a 2022–2024 grant from the John Templeton Foundation on Intellectual Humility and Oppression.

Footnotes

1 Here, I’ll draw on the (large) literature about intellectual humility in order to think about intellectual arrogance, since these phenomena are closely related. Intellectual humility is commonly cast either as a mean between intellectual servility/timidity/diffidence and intellectual arrogance/dogmatism (cf. Whitcomb et al. (2017), Hazlett (2012, 2017), Tanesini (2018)), or simply as the opposite of intellectual arrogance and other vicious forms of intellectual pride (cf. Roberts & Wood, 2007). Now, there is some danger in equating intellectual arrogance with intellectual humility’s prideful opposite; namely, we risk missing distinctions among varieties of vicious pride such as intellectual vanity, intellectual grandiosity, etc. (Again, cf. Roberts & Wood, 2007). Still, discussions about intellectual humility often suggest something about how we should think about intellectual arrogance as well.

2 Kidd’s (2016) view of intellectual humility suggests an account of arrogance along these lines. According to Kidd, intellectual humility is well-regulated confidence in one’s beliefs—regulated, that is, according to whether one meets various “confidence conditions,” that is, conditions of epistemic capacity, access to quality evidence and testimony, and so forth. Basically, the intellectually humble person is not overconfident (or underconfident), relative to her epistemic position. This suggests that the arrogant person is precisely someone who is (or is disposed to become) overconfident, relative to her epistemic position.

3 I am reading the statement as equivalent to: “Intellectually humble people always (or perhaps, at least when they are acting in accordance with their characteristic intellectual humility) moderate their views in light of the evidence.” I’ll argue that’s false. But, of course, statements involving generics can be understood variously, and perhaps Levy meant something else, something arguably true.

4 Roberts and Wood (2007, 236) catalogue a number of traits opposed to humility which I could call varieties of vicious pride: “arrogance, vanity, conceit, egotism, hyper-autonomy, grandiosity, pretentiousness, snobbishness, impertinence (presumption), haughtiness, self-righteousness, domination, selfish ambition, and self-complacency.”

5 Though perhaps they should not be—see again Roberts and Wood (2007, 236), quoted in fn. 4. For an additional datum on synonymity: Kidd (2016) repeatedly refers to “arrogance and dogmatism” in one sweep as vices opposed to intellectual humility, suggesting a synonymous reading.

6 The Feldman, Christensen, and Carter and Pritchard references in this section are also discussed in Beebe and Matheson (2023, 429).

7 This despite the fact that Pritchard (2018, 2019) explicitly disavows the necessity of conciliation for intellectual humility.

8 Beebe and Matheson found significant correlations between conciliationist tendencies and intellectual humility only using two of the three measurement scales with which they assessed participants’ intellectual humility. Namely, these correlations arose when using scales from Krumrei-Mancuso and Rouse (2016) and Leary et al. (2017), discussed below. Given these measures, it seems highly unsurprising that conciliationist tendencies would be positively correlated with intellectual humility.

9 Peters et al. (2025) also discuss the way that psychological scales of intellectual humility sometimes bake in a link between intellectual humility and open-mindedness, in the sense of readiness to revise beliefs.

10 Here, I’m ignoring the finer distinctions noted in fn. 5 to see what psychological scales suggest about intellectual arrogance in particular.

11 See Stroud (2006) for an influential defense of the view that there are such reasons.

12 See, for example, Howard-Snyder and McKaughan (2022), Buchak (2012), and Jackson (2021).

13 See Callahan (forthcoming).

14 Thanks to Stephen Ogden for raising this concern.

15 See Roberts and Wood (2007, 243–250) on arrogance as a “disposition to ‘infer’ some illicit entitlement” (ibid., 243).

16 I take no stand on the best way to define open-mindedness. See Baehr (2011, 162) for one interesting view. Here, I am employing something like the view of Peters et al. (2025), on which open-mindedness centrally involves the tendency to engage with others and “a willingness to revise one’s extant beliefs” through such engagement. But given that I do not want to be committed to that view, I also do not want to claim that Sam and Gio are genuinely open-minded. When I call them “arrogantly open-minded,” what I mean is that their eagerness or readiness to abandon current beliefs—whether or not that’s exactly the same thing as open-mindedness—is of an arrogant kind.

17 Levy (2023a) makes a case for another, related reason to be attracted to contrarian positions that suggest intellectual autonomy: one is engaged in intellectual virtue signaling. Virtue signaling in general involves a display partly or significantly motivated by the desire to impress others. Intellectual virtue signaling involves a display of one’s putative intellectual virtues—often in the form of a “hot take” on some controversial issue, accompanied by complex argument, produced quickly—motivated (partly or significantly) by the desire for “the spotlight” (Levy, 2023a, 314). Intellectual virtue signaling, in Levy’s telling, sounds like a necessarily intellectually vain activity. But there are some subtle distinctions between vanity and arrogance (cf. Roberts & Wood, 2007), so I am not sure whether the virtue signaler is necessarily intellectually arrogant as well.

18 See also Cosgrove and Murphy (2023).

19 See Callahan (2024).

20 Inflated, that is, relative to the evidence one has about one’s merits. Not inflated relative to one’s actual merits, since one can “innocently”—that is, nonarrogantly—respond to misleading evidence to the effect that one has more merits than one in fact does.

21 Some epistemologists claim we have epistemic responsibilities not only to evaluate evidence in good ways but also to gather and acquire evidence in good ways. Cf. Flores and Woodard (2023).

22 This is not to say that we cannot feel status-threatened with respect to passively and casually acquired beliefs. Very arrogant agents ought to be particularly sensitive to status threats and may feel highly threatened by belief challenges across the board.

23 Of course, Sam’s and Gio’s cases do not involve them figuring things out on their own. They get testimony that prompts them to believe in Christianity or the badness of fluoride. But these are cases of deliberate and agential testimonial acceptance, in contrast to the merely passive acceptance of their previous views. And as Zagzebski (2012) argues, deliberate and reflective deference to authority manifests an important form of autonomy.

24 Many thanks to participants in a January 2024 Center for Philosophy of Religion discussion group at the University of Notre Dame as well as a March 2024 Humility and Arrogance conference in Tempe, Arizona, for comments and discussion on earlier versions of this project. Thanks especially to Nathan Ballantyne and Thomas Kelly for encouragement to write the paper, and also thanks to Katelyn O’Dell for editorial help and two anonymous referees for this journal. This publication was made possible, in part, through the support of Grant 62636 from the John Templeton Foundation. The opinions expressed in this publication are those of the author(s) and do not necessarily reflect the views of the John Templeton Foundation.

References

Baehr, J. S. (2011). The inquiring mind: On intellectual virtues and virtue epistemology. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199604074.001.0001
Beebe, J. R., & Matheson, J. (2023). Measuring virtuous responses to peer disagreement: The intellectual humility and actively open-minded thinking of conciliationists. Journal of the American Philosophical Association, 9(3), 426–449. https://doi.org/10.1017/apa.2022.8
Ben-Ze’ev, A. (1993). The virtue of modesty. American Philosophical Quarterly, 30(3), 235–246.
Broome, J. (2013). Rationality through reasoning. Wiley-Blackwell. https://doi.org/10.1002/9781118609088
Buchak, L. (2012). Can it be rational to have faith? In Chandler, J., & Harrison, V. S. (Eds.), Probability in the philosophy of religion (pp. 225–248). Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199604760.003.0012
Callahan, L. (2024). Intellectual humility: A no-distraction account. Philosophy and Phenomenological Research, 108, 320–337. https://doi.org/10.1111/phpr.12965
Callahan, L. (forthcoming). Intellectual humility and Christian faith. Religious Studies, 1–15. https://doi.org/10.1017/S0034412524000362
Carter, A., & Pritchard, D. (2016). Intellectual humility, knowledge-how, and disagreement. In Chienkuo, M., Slote, M., & Sosa, E. (Eds.), Moral and intellectual virtues in Western and Chinese philosophy: The turn toward virtue (pp. 49–63). Routledge.
Christensen, D. (2010). Higher order evidence. Philosophy and Phenomenological Research, 81(1), 185–215. https://doi.org/10.1111/j.1933-1592.2010.00366
Christensen, D. (2013). Epistemic modesty defended. In Christensen, D., & Lackey, J. (Eds.), The epistemology of disagreement: New essays (p. 77). Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199698370.001.0001
Cichocka, A., Marchlewska, M., & Golec de Zavala, A. (2016). Does self-love or self-hate predict conspiracy beliefs? Narcissism, self-esteem, and the endorsement of conspiracy theories. Social Psychological and Personality Science, 7(2), 157–166. https://doi.org/10.1177/1948550615616170
Cosgrove, T. J., & Murphy, C. P. (2023). Narcissistic susceptibility to conspiracy beliefs exaggerated by education, reduced by cognitive reflection. Frontiers in Psychology, 14, 1164725. https://doi.org/10.3389/fpsyg.2023.1164725
DiPaolo, J. (forthcoming). “I’m, like, a very smart person”: On self-licensing and perils of reflection. Oxford Studies in Epistemology.
Driver, J. (1989). The virtues of ignorance. Journal of Philosophy, 86(7), 373–384. https://doi.org/10.2307/2027146
Feldman, R. (2006). Reasonable religious disagreements. In Antony, L. (Ed.), Philosophers without Gods: Meditations on atheism and the secular life (pp. 194–214). Oxford University Press.
Flores, C., & Woodard, E. (2023). Epistemic norms on evidence-gathering. Philosophical Studies, 180(9), 2547–2571. https://doi.org/10.1007/s11098-023-01978-8
Hawley, K. (2014). Partiality and prejudice in trusting. Synthese, 191(9), 2029–2047. https://doi.org/10.1007/s11229-012-0129-4
Hazlett, A. (2012). Higher-order epistemic attitudes and intellectual humility. Episteme, 9(3), 205–223. https://doi.org/10.1017/epi.2012.11
Hazlett, A. (2017). Intellectual pride. In Gordon, E. C., & Carter, J. A. (Eds.), The moral psychology of pride. Rowman and Littlefield.
Holton, R. (2014). Intention as a model for belief. In Vargas, M., & Yaffe, G. (Eds.), Rational and social agency: The philosophy of Michael Bratman. Oxford University Press.
Howard-Snyder, D., & McKaughan, D. J. (2022). Faith and resilience. International Journal for Philosophy of Religion, (3). https://doi.org/10.1007/s11153-021-09820-z
Hoyle, R. H., Davisson, E. K., Diebels, K. J., & Leary, M. R. (2016). Holding specific views with humility: Conceptualization and measurement of specific intellectual humility. Personality and Individual Differences, 97, 165–172. https://doi.org/10.1016/j.paid.2016.03.043
Imhoff, R., & Lamberty, P. K. (2017). Too special to be duped: Need for uniqueness motivates conspiracy beliefs. European Journal of Social Psychology, 47(6), 724–734. https://doi.org/10.1002/ejsp.2265
Jackson, E. (2021). Belief, faith, and hope: On the rationality of long-term commitment. Mind, 130(517), 35–57. https://doi.org/10.1093/mind/fzaa023
Kidd, I. J. (2016). Intellectual humility, confidence, and argumentation. Topoi, 35(2), 395–402. https://doi.org/10.1007/s11245-015-9324-5
Krumrei-Mancuso, E. J., & Rouse, S. V. (2016). The development and validation of the comprehensive intellectual humility scale. Journal of Personality Assessment, 98(2), 209–221. https://doi.org/10.1080/00223891.2015.1068174
Lantian, A., Muller, D., Nurra, C., & Douglas, K. M. (2017). I know things they don’t know! Social Psychology, 48(3), 160–173. https://doi.org/10.1027/1864-9335/a000306
Lawlor, K. (2014). Exploring the stability of belief: Resiliency and temptation. Inquiry: An Interdisciplinary Journal of Philosophy, 57(1), 1–27. https://doi.org/10.1080/0020174x.2014.858414
Leary, M. R., Diebels, K. J., Davisson, E. K., Jongman-Sereno, K. P., Isherwood, J. C., Raimi, K. T., … Hoyle, R. H. (2017). Cognitive and interpersonal features of intellectual humility. Personality and Social Psychology Bulletin, 43(6), 793–813. https://doi.org/10.1177/0146167217697695
Levy, N. (2022). Do your own research! Synthese, 200(5), 1–19. https://doi.org/10.1007/s11229-022-03793-w
Levy, N. (2023a). Intellectual virtue signaling. American Philosophical Quarterly, 60(3), 311–324. https://doi.org/10.5406/21521123.60.3.07
Levy, N. (2023b). Too humble for words. Philosophical Studies, 180(10), 3141–3160. https://doi.org/10.1007/s11098-023-02031-4
Matheson, J. (2024). Epistemic autonomy and intellectual humility: Mutually supporting virtues. Social Epistemology, 38(3), 318–330. https://doi.org/10.1080/02691728.2023.2258093
Mills, C. W. (2007). White ignorance. In Sullivan, S., & Tuana, N. (Eds.), Race and epistemologies of ignorance (pp. 11–38). State University of New York Press.
Murdoch, I. (2014). The sovereignty of good. Routledge.
Neville, L., Fisk, G. M., & Ens, K. (2025). Psychological entitlement and conspiracy beliefs: Evidence from the COVID-19 pandemic. The Journal of Social Psychology, 165(1), 65–87. https://doi.org/10.1080/00224545.2023.2292626
Peters, K., Turner, C., & Battaly, H. (2025). Intellectual humility without open-mindedness: How to respond to extremist views. Episteme, 22, 1–23. https://doi.org/10.1017/epi.2024.63
Pritchard, D. (2018). Intellectual humility and the epistemology of disagreement. Synthese, 198(Suppl. 7), 1711–1723. https://doi.org/10.1007/s11229-018-02024-5
Pritchard, D. (2019). Disagreement, intellectual humility and reflection. In Silva-Filho, W. J., & Tateo, L. (Eds.), Thinking about oneself: The place and value of reflection in philosophy and psychology (pp. 59–71). Springer Verlag. https://doi.org/10.1007/978-3-030-18266-3_5
Roberts, R. C., & Wood, W. J. (2007). Intellectual virtues: An essay in regulative epistemology. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199283675.001.0001
Stroud, S. (2006). Epistemic partiality in friendship. Ethics, 116(3), 498–524. https://doi.org/10.1086/500337
Tanesini, A. (2018). Intellectual humility as attitude. Philosophy and Phenomenological Research, 96(2), 399–420. https://doi.org/10.1111/phpr.12326
Whitcomb, D., Battaly, H., Baehr, J., & Howard-Snyder, D. (2017). Intellectual humility: Owning our limitations. Philosophy and Phenomenological Research, 94(3), 509–539. https://doi.org/10.1111/phpr.12228
Zagzebski, L. T. (2012). Epistemic authority: A theory of trust, authority, and autonomy in belief. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199936472.001.0001