What can human cognition tell us about intellectual humility? From developmental science, we might expect intellectual humility to emerge as a smooth process: we assign value to various ideas and theories about the world as experience dictates, and this gradually leads us toward the truth. However, it's not that simple. Human beings are notoriously, and apparently naturally, disposed to overestimate their capacity to know the truth and to underestimate their weaknesses. Furthermore, we're susceptible to all sorts of biases that make knowing difficult. For example, we tend to favor evidence or data received early in our inquiries, and we tend to discount the weight of evidence that counts against hypotheses we endorse. Evolutionary psychologists have offered some intriguing arguments that these dispositions are embedded within our cognitive structure in ways that can systematically lead to biased thinking. In some cases, for adaptive reasons, we have developed what are known as cognitive heuristics: mental shortcuts that allow us to process information quickly and so make decisions more efficiently. The downside of this efficiency is that heuristics don't always help us track the truth; that is, they can lead to biased thinking. Cognitive science has come a long way toward understanding the roots of these biases and how our cognitive system uses them to efficiently store and sort knowledge, knowledge on which we base our actions. One clear discovery in cognitive science is that these biases govern our decisions and actions without always being consciously held or analyzed; in fact, most of the time they're not. The evidence from cognitive science is pretty clear: our natural tendency is toward intellectual arrogance, not intellectual humility. There is an extensive body of research into how chronic and systematic biases can result from the cognitive heuristics that speed processing.
Out of this research into heuristics and biases has grown a number of what are called dual-process theories of human cognition. While each theory uses different names and assigns different features to each process, these theories broadly share a distinction between fast, automatic, intuitive processes, called type 1 processes, and slow, deliberative, analytic processes, known as type 2 processes. Research into type 1 and type 2 thinking is both relevant and impactful for the study of intellectual humility. The heuristics of type 1 thinking evaluate ideas from within one's own perspective and therefore favor ideas that are readily accessible, easily discerned, and conformable to prior experience. Type 2 processing, though more deliberate, usually analyzes what type 1 thinking offers; it too can favor what one already knows, believes, and intuits, even as it decouples from type 1 representations. In many instances this self-reliant type 1 thinking is adequate, but problems and biases arise when reliance on what one knows and intuits does not provide enough information, or the right kind of information, for the task, which in turn produces biased judgments. Many of these biases are well known and well documented. One of the best known is confirmation bias: the tendency to seek confirmation for opinions and beliefs already held and to ignore disconfirming evidence. We see evidence of this in the way people seek information about political candidates, watching news networks that reflect their existing political leanings. So how might we overcome these biases and exercise intellectual humility? Well, it takes some intentional effort. Certain habits and attitudes can help mitigate the natural intellectual arrogance that can result from the cognitive shortcuts we take to make our thinking more efficient.
One of these habits is rule-based thinking, that is, directing one's attention to processes, objective criteria, and rules of analysis, like logic, that can aid in reducing systematic biases. Accuracy and accountability can also help us. Situations that call for accuracy in judgment promote slower, more deliberative thinking, and if we value accuracy and are held accountable for our conclusions, we tend to exhibit more intellectual humility. Perspective taking can help as well. One aspect of an open and flexible thinking disposition that helps mitigate biases is the capacity to weigh evidence for and against a strongly held belief, including the opinions and beliefs of others who hold a position different from one's own. This requires a certain capacity for perspective taking: a movement from a focus on one's own thoughts to include the perceptions, thoughts, and ideas of others. Because intellectual humility is a virtue that requires effortful control, it seems that this virtue is mostly found in the conscious exercise of type 2 thinking, or more specifically in the proper collaboration of type 1 and type 2 processes. Investigating ways to attenuate cognitive biases along the lines outlined here, and investigating the traits that appear to create a natural disposition toward intellectual humility, would be fruitful areas of research. Just as habits play a role in cultivating any virtue, habits of the mind can bring about intellectual humility and other epistemic virtues. They serve to curb the vice of intellectual arrogance even in the face of a cognitive system that can be prone to biased thinking. Perhaps our cognitive system is not in and of itself vicious, but it may take a conscious effort and virtuous habits of the mind not to make it so.