What do you call someone who only hears what they want to hear?

There are a few terms that can describe someone who only hears what they want to hear. This selective listening can be intentional or unintentional, but the end result is that the person misses important information by tuning out anything that contradicts their own views or perspectives.

Selective Listening

Selective listening refers to the tendency to focus on particular sounds while ignoring or filtering out others. It’s a common phenomenon – we all engage in some degree of selective listening every day. For example, a parent may immediately hear their child’s voice calling them, even if they’re in a crowded room full of other noises. Or you might easily pick up on your name being mentioned in a group conversation that you’re not fully paying attention to.

This type of selective listening is often unintentional and subconscious. Our brains naturally tune into information that seems relevant or important to us. But sometimes selective listening can become more intentional and problematic. If someone consistently tunes out certain voices or viewpoints, that’s selective listening at its most extreme.

Confirmation Bias

Confirmation bias refers to the tendency to search for, interpret, favor, and recall information in a way that confirms or strengthens one’s prior beliefs or values. It’s a type of cognitive bias that can reinforce preexisting notions and lead to errors in judgment.

Someone exhibiting confirmation bias will often tune out or dismiss any evidence that contradicts their worldview. They hear what they want to hear – information that confirms their preconceptions. But they may miss important facts that don’t align with their beliefs.

Examples of Confirmation Bias

  • Only visiting websites and reading articles that reinforce your political views
  • Discounting critics of someone you admire as being biased or jealous
  • Focusing on evidence that proves your theory while ignoring any evidence that disproves it

The Backfire Effect

The backfire effect refers to a phenomenon where individuals, when confronted with information that contradicts their beliefs, will strengthen their original position rather than modify it. Essentially, providing facts that dispute someone’s core beliefs may simply reinforce those views even more strongly.

This effect helps explain why it can be so difficult to change someone’s mind on polarizing issues. When their beliefs are challenged, they dig in deeper rather than opening themselves up to new information. In this case, they hear what they want to hear – information aligned with their existing narrative – and shut out contradictory facts.

Examples of the Backfire Effect

  • Presenting climate change data to someone who denies climate change often reinforces their denial rather than convincing them.
  • Correcting misinformation on social media can sometimes increase its spread as people double down on false beliefs.
  • Providing evidence against a conspiracy theory may just entrench believers’ views.

Bias Blind Spot

The bias blind spot refers to people’s tendency to recognize cognitive biases and motivated reasoning in others while failing to see them in themselves. Essentially, we’re all much better at identifying biases in other people than we are at recognizing our own.

So someone exhibiting the bias blind spot will call out selective listening and confirmation bias in people who disagree with them, but won’t acknowledge that they also suffer from similar biases. They can clearly hear flaws in other people’s thinking but remain oblivious to flaws in their own.

Examples of the Bias Blind Spot

  • Thinking your political opponents only believe propaganda, while considering your own views rational and well-informed.
  • Criticizing others for dismissing evidence that contradicts their beliefs, while rigidly holding your own positions in the face of contrary evidence.
  • Believing other people justify bad behavior due to cognitive dissonance while insisting your own rationalizations are reasonable.

Closed-Mindedness

Closed-mindedness refers to an unwillingness to consider alternatives or other viewpoints. Closed-minded people have often already made up their minds on an issue. They are resistant to information that contradicts their pre-established positions, dismissing or ignoring any evidence that challenges their beliefs.

A closed-minded person engages in selective listening by tuning out any voices or facts that threaten their worldview. They hear what confirms their biases and filter out anything that creates cognitive dissonance with their existing narrative and opinions.

Signs of Closed-Mindedness

  • Unwillingness to listen to different perspectives
  • Quickness to judge or criticize alternate viewpoints
  • A rigid mindset and fixed beliefs
  • A focus only on evidence that aligns with current thinking

Motivated Reasoning

Motivated reasoning refers to forming conclusions based on personal wishes or preferences rather than objective logic and facts. People selectively evaluate information in a biased manner to reach desired conclusions.

Instead of weighing evidence objectively, people engaging in motivated reasoning reach the conclusions most consistent with their emotions, group identities, and other non-objective factors. They essentially hear what benefits them rather than comprehensively evaluating the information available.

Examples of Motivated Reasoning

  • Supporting politicians who align with your cultural identity regardless of their track record or qualifications
  • Dismissing logical arguments against your desired career path and focusing only on the upsides
  • Believing your child is innocent of misbehavior by selectively focusing on evidence of their good character

The Dunning-Kruger Effect

The Dunning-Kruger effect refers to the cognitive bias whereby people with low expertise or ability in a given area will often overestimate their competence and knowledge. Essentially, some people are too uninformed to realize their own ignorance.

This bias leads people to make misinformed judgments: their lack of expertise deprives them of the metacognitive ability to recognize their own shortcomings. They hear only their own half-formed opinions rather than objective facts.

Examples of the Dunning-Kruger Effect

  • A poorly qualified leader overestimating their own expertise and drowning out well-informed advisors
  • Someone performing poorly on a test being completely confident they aced it
  • An unskilled debater confidently dominating a discussion despite their ignorance of the topic

Cognitive Dissonance

Cognitive dissonance refers to the mental discomfort that arises when someone’s beliefs conflict with new information or when two of their beliefs contradict each other. People will often try to reduce this discomfort by rejecting or avoiding the new information causing the psychological tension.

Cognitive dissonance leads people to selectively hear information that restores consonance while tuning out conflicting facts that increase discomfort. This allows people to stick with their pre-existing narratives by dismissing contradicting evidence as unreliable or unimportant.

Examples of Cognitive Dissonance

  • Smokers hearing about the health risks but tuning it out because they don’t want to quit
  • Voters supporting a candidate whose policies go against their self-interest to maintain loyalty to their party
  • Avoiding news stories that present negative information about a company someone works for or invests in

Groupthink

Groupthink refers to flawed decision-making that can occur when people place a higher priority on group harmony and consensus than critical thinking. It causes members of a group to unconsciously minimize conflict by reaching quick unanimous decisions without properly evaluating alternative options.

Because opposing viewpoints get suppressed, groups affected by groupthink do not consider all relevant facts when making decisions. The drive for agreement leads members to unconsciously bias information processing and engage in selective listening. They hear what reinforces group unity rather than objectively weighing all perspectives.

Examples of Groupthink

  • A corporate team moving forward with an ill-conceived product because no one feels comfortable challenging the boss
  • Friends agreeing on a restaurant choice because no one wants to argue
  • Political parties rallying around specific platforms without any dissent or nuanced discussion

Echo Chambers

Echo chambers refer to situations where people are exposed only to information or opinions that conform with and reinforce their existing views, amplifying selective listening and confirmation bias. This can occur both offline through closed social groups and online via recommendation algorithms.

Within echo chambers, people are deprived of contrary perspectives and shielded from anything that challenges their preconceptions. They exist in information bubbles that bounce their own views back at them while blocking out differing opinions and facts. This selective exposure further strengthens confirmation bias.

Examples of Echo Chambers

  • Only watching news channels aligned with your political ideology
  • Social media feeds becoming dominated by viewpoints you already agree with
  • Belonging to groups or organizations with very homogeneous belief systems

How to Avoid Selective Listening

Here are some tips to reduce selective listening and overcome the various cognitive biases that reinforce it:

  • Seek out alternate sources: Make an effort to expose yourself to views that challenge your own by following news sources you disagree with, reading opposing literature, and befriending people with different perspectives.
  • Question your assumptions: Don’t simply accept your pre-existing narrative. Routinely fact check information you want to believe and scrutinize whether your beliefs align with objective facts.
  • Watch for backfire effects: Note when you have a strong emotional response to information that contradicts your beliefs. That’s a sign of cognitive dissonance and an indicator that you should reflect on whether your views are objective.
  • Cultivate intellectual humility: Remind yourself that no one has all the answers, and maintaining an attitude of openness and curiosity will lead to learning.
  • Avoid echo chambers: Recognize when your groups, news sources, and social circles lack diversity of thought. Proactively follow, listen to, and engage with people outside your “bubble.”

Conclusion

Selective listening leads people to tune out information that contradicts their preconceived narrative. This natural human tendency is exacerbated by a number of cognitive biases, such as confirmation bias, the backfire effect, motivated reasoning, and groupthink. Being aware of these biases and making an effort to expose oneself to alternate perspectives is key to thinking critically and avoiding the trap of only hearing what you want to hear.
