Confirmation bias


swansont

A must read, especially if you are ideologically predisposed to believing in confirmation bias.

 

http://youarenotsosmart.com/2010/06/23/confirmation-bias/

 

Punditry is a whole industry built on confirmation bias.

 

Rush Limbaugh and Keith Olbermann, Glenn Beck and Arianna Huffington, Rachel Maddow and Ann Coulter – these people provide fuel for beliefs; they pre-filter the world to match existing world-views.

 

If their filter is like your filter, you love them. If it isn’t, you hate them.

 

Whether or not pundits are telling the truth, or vetting their opinions, or thoroughly researching their topics is all beside the point. You watch them not for information, but for confirmation.


Electronic media is a culprit too.

 

It can now monitor who reads its news, what they're reading, and when they're reading it (something papers, radio and TV can't do). Media thinks it's catering to our concerns when it feeds us more of the same. So if a cruelty-to-a-kitten story gets a lot of hits and outlets scrounge through every tidbit of news for similar ones, it suddenly seems to us as though the whole world is doing cruel things to kittens. I don't need commentators or media warping my perspective of reality.


There's a whole group of these related biases:

 

Prior attitude effect. Subjects who feel strongly about an issue - even when encouraged to be objective - will evaluate supportive arguments more favorably than contrary arguments.

 

Disconfirmation bias. Subjects will spend more time and cognitive resources denigrating contrary arguments than supportive arguments.

 

Confirmation bias. Subjects free to choose their information sources will seek out supportive rather than contrary sources.

 

Attitude polarization. Exposing subjects to an apparently balanced set of pro and con arguments will exaggerate their initial polarization.

 

Attitude strength effect. Subjects voicing stronger attitudes will be more prone to the above biases.

 

Sophistication effect. Politically knowledgeable subjects, because they possess greater ammunition with which to counter-argue incongruent facts and arguments, will be more prone to the above biases.

 

from http://lesswrong.com/lw/he/knowing_about_biases_can_hurt_people/


I have a conservative friend who has gradually built up his anti-Obama crusade over time, building each event one on top of the other, as if that actually makes sense. Every time something happens, he just tacks it onto the end of the long chain of "reason" he's put together, as if he had an open mind at the beginning and only gradually came by his considered opinion.

 

Unfortunately I think this sort of thing is actually commonplace. I suspect my friend is just more open about it than most people are.

 

(I think I'll send him this article and see what he says.) :)


There's a whole group of these related biases.

Thanks, ecoli, this is great! Many people suffer from "disconfirmation bias" ... they can give a million reasons why the other side is wrong instead of giving one reason why they're right. Now I know what to call it.


Nice,

 

Just like ewmon, I now have a name for what I try to convince people they're doing! (Though we all know they'll deny it and keep on truckin', but that's the fun part.)


I have a conservative friend who has gradually built up his anti-Obama crusade over time, building each event one on top of the other, as if that actually makes sense. Every time something happens, he just tacks it onto the end of the long chain of "reason" he's put together, as if he had an open mind at the beginning and only gradually came by his considered opinion.

 

Unfortunately I think this sort of thing is actually commonplace. I suspect my friend is just more open about it than most people are.

 

(I think I'll send him this article and see what he says.) :)

 

Perhaps you can get your friend to write down his predictions in advance as an informal Bayesian test.

 

It seems that errors in reasoning of this kind often arise because people are more interested in defending their priors (their probability estimates before making an observation) than in updating them based on new observations.

 

The basic concept here is that expected evidence must be conserved: if you expect an observation to confirm your hypothesis, then failing to make that observation has to count against it.
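
In symbols, this is just the law of total probability (a standard identity, stated here for reference):

```latex
P(H) = P(H \mid E)\,P(E) + P(H \mid \lnot E)\,P(\lnot E)
```

Your current confidence is already the weighted average of what you would believe after seeing the evidence and after not seeing it, so you can't arrange things so that every possible observation confirms your hypothesis.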

To continue your example, your friend is convinced that Obama is not a good president and points to a cherry-picked list to prove that point. Having to make predictions prevents that kind of cherry-picking because it forces you to test your model against observable evidence.

 

According to our informal Bayesian test we have to ask ourselves two different questions: what's the probability that we'd observe evidence x given that hypothesis y is true, P(x|y), and what's the probability that hypothesis y is true given that we observe evidence x, P(y|x)?

 

So ask, as an example: what's your estimated probability (confidence) that Barack Obama will raise taxes on the rich if he is a bad president, and what's your estimated probability that Obama is a bad president if we observe him raising taxes on the rich?

 

If we don't observe Obama raising taxes on the rich, your friend (if he's honest with himself) has to lower his confidence that Obama is a bad president and update his model about the Obama administration.

 

The other useful part of this test is that updates in priors shift wildly only when your prediction is wrong or when you're not very confident about your beliefs. If I am very confident in my belief that Barack Obama is a terrible president, then observing him doing something I consider terrible isn't going to change my opinion much. But if I see him doing something that I consider very good, then I know that my original confidence about his 'badness' was much too high and I should adjust accordingly.
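
Here's a minimal sketch of that informal test in code. All of the numbers (the 0.7 and 0.4 likelihoods, the 0.95 prior) are made up for illustration; the point is only the shape of the update:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h, observed):
    """Update P(H) after checking whether evidence E was observed.

    prior           -- P(H), confidence in the hypothesis before looking
    p_e_given_h     -- P(E|H), how strongly H predicts the evidence
    p_e_given_not_h -- P(E|~H), how likely the evidence is anyway
    observed        -- True if E was observed, False if it wasn't
    """
    if observed:
        num = p_e_given_h * prior
        denom = num + p_e_given_not_h * (1 - prior)
    else:
        num = (1 - p_e_given_h) * prior
        denom = num + (1 - p_e_given_not_h) * (1 - prior)
    return num / denom

# Hypothesis H: "Obama is a bad president" (the friend's belief).
# Evidence E: "Obama raises taxes on the rich."
# Made-up likelihoods: a bad president raises taxes on the rich with
# probability 0.7, a good one with probability 0.4.
prior = 0.95  # a very confident friend

print(bayes_update(prior, 0.7, 0.4, observed=True))   # ~0.97: expected evidence barely moves a strong prior
print(bayes_update(prior, 0.7, 0.4, observed=False))  # ~0.90: a failed prediction forces the prior down
```

With a weaker prior of 0.6, the same two observations move the estimate to about 0.72 and 0.43 respectively, which is exactly the asymmetry described above: strong priors barely budge on expected evidence but must give ground when a prediction fails.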

 

A liberal uncle of mine pointed to Rand Paul's comments about the Civil Rights Act as fantastic proof of what he knew all along: that republicans are racist. I pointed out (though I was probably ignored) that if you already believe that republicans are racist, and you expect to observe evidence of republicans demonstrating their racism, then Rand Paul's comments are expected and therefore not very strong evidence; the observation shouldn't raise your confidence by all that much. (It is telling that the many examples of republicans not being racist were ignored, however. This is because he never made a prediction like: out of a sample of 1000 republican speeches, how many will contain racist content, given the hypothesis that republicans are racist?)
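
To put a number on "expected and therefore not very strong evidence": in odds form, Bayes' theorem says an observation multiplies your prior odds by the likelihood ratio.

```latex
\frac{P(H \mid E)}{P(\lnot H \mid E)}
  = \frac{P(E \mid H)}{P(E \mid \lnot H)} \cdot \frac{P(H)}{P(\lnot H)}
```

If (to pick illustrative figures) an inflammatory comment turns up with probability 0.9 when the hypothesis is true, but would also turn up with probability 0.6 when it isn't, the likelihood ratio is only 1.5, and already-confident prior odds barely move. The 1000-speeches tally is the better test precisely because the two hypotheses predict very different counts for it.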

Edited by ecoli

That's an interesting idea. For a question as broad as "good president", obviously you would need a fairly large number of question-answer samples; otherwise they can just weasel out with an "everyone makes mistakes", etc. I like it.

 

It might be interesting to try and create a formula for rating pundits based on the degree to which they leverage confirmation bias. It strikes me that Jon Stewart doesn't use it as much as Glenn Beck, for example, but that's a completely subjective (and rather obvious) opinion. But if it could be quantified it might make an interesting (and possibly useful) unit of measure.
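
For what it's worth, here is one back-of-the-envelope shape such a measure could take. Everything in it is hypothetical: the function, the idea of tallying confirming versus disconfirming claims, and all the numbers; actually coding claims objectively would be the hard part.

```python
def confirmation_index(confirming, disconfirming, hours):
    """Hypothetical pundit rating: how lopsidedly a pundit's claims
    favor their known position, scaled by claims per hour of airtime."""
    total = confirming + disconfirming
    if total == 0:
        return 0.0
    lopsidedness = (confirming - disconfirming) / total  # ranges from -1 to 1
    rate = total / hours                                 # claims per hour
    return lopsidedness * rate

# Made-up tallies for two hypothetical pundits over 10 hours each:
print(confirmation_index(confirming=48, disconfirming=2, hours=10))   # -> 4.6
print(confirmation_index(confirming=20, disconfirming=15, hours=10))  # -> 0.5
```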


That's an interesting idea. For a question as broad as "good president", obviously you would need a fairly large number of question-answer samples; otherwise they can just weasel out with an "everyone makes mistakes", etc. I like it.

 

Agreed. In general, the more specific the hypothesis, the better this works. Anyone can come up with a general hypothesis such as "things happen", but of course this is not very useful for understanding the world, since its predictions can't discriminate between outcomes.

 

Still, I find it to be a useful heuristic when trying to avoid bad heuristics (which seems to be how people wind up employing confirmation bias - and others - in the first place).

 

 

 

It might be interesting to try and create a formula for rating pundits based on the degree to which they leverage confirmation bias. It strikes me that Jon Stewart doesn't use it as much as Glenn Beck, for example, but that's a completely subjective (and rather obvious) opinion. But if it could be quantified it might make an interesting (and possibly useful) unit of measure.

 

I would agree with this as well. It seems as though when Stewart employs these biases, it's in jest, and obviously so.

 

However, we can test this hypothesis: how many instances of confirmation bias would we expect to observe in x hours of watching Glenn Beck? I'd say maybe 5 per hour (arbitrary). So what's my confidence (prior probability) that I'll observe about 5 per hour, given the hypothesis that Glenn Beck seriously employs lots of confirmation bias? I'll say 80%.

 

If I fail to see that, I'll have to downgrade my confidence and/or devise a better test (to avoid sampling biases - we should expect an uneven distribution of confirmation bias use and shouldn't assume that 1 hour of Glenn Beck is representative).
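
Here's a rough version of that test in code, treating instances per hour as Poisson-distributed. The 5-per-hour rate and 80% prior come from the post above; the 1-per-hour alternative rate and the hourly tallies are made up for illustration:

```python
import math

def poisson_pmf(k, rate):
    """Probability of seeing exactly k instances in an hour at the given hourly rate."""
    return rate ** k * math.exp(-rate) / math.factorial(k)

def posterior(prior, counts, rate_h, rate_alt):
    """Update P(H) from hourly counts, treating the hours as independent.

    H:   'he leans on confirmation bias a lot' (rate_h instances/hour)
    alt: 'he mostly doesn't' (rate_alt instances/hour)
    """
    odds = prior / (1 - prior)
    for k in counts:
        odds *= poisson_pmf(k, rate_h) / poisson_pmf(k, rate_alt)
    return odds / (1 + odds)

# Four separate hours are sampled to guard against one unrepresentative show.
print(posterior(0.8, [6, 4, 7, 5], rate_h=5, rate_alt=1))  # ~1.0: counts match the hypothesis
print(posterior(0.8, [1, 0, 2, 1], rate_h=5, rate_alt=1))  # ~0.0003: time to downgrade
```

Running the same tally on a comparable sample of The Daily Show gives the comparison suggested next.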

 

We can apply the same tools to watching Jon Stewart, too, for comparison.


We can apply the same tools to watching Jon Stewart, too, for comparison.

 

That Stewart lampoons both the left and right should be an indication that he exhibits less confirmation bias than most pundits.

 

 

As with Pangloss's anecdote, I see a whole lot of behavior that amounts to hating someone first and then coming up with reasons to justify that feeling.


That Stewart lampoons both the left and right should be an indication that he exhibits less confirmation bias than most pundits.

 

 

As with Pangloss's anecdote, I see a whole lot of behavior that amounts to hating someone first and then coming up with reasons to justify that feeling.

 

I think it really says more about what type of confirmation bias the Daily Show employs than about where it rates from low to high - his bias is just less partisan than most pundits', but not necessarily less intense overall.

 

He doesn't even really try to hide it either - he has a comedy show and tries to highlight people in politics who are easy to lampoon. He shows us a world where politics is riddled with bumbling, stumbling, self-contradicting fools, and that's the world people expect to see when they tune in, whether it's populated from the right or the left.

He's also quite happy to cherry-pick or take things somewhat out of context to set up the laugh he's going for, so it's not really objective coverage of events either - it's just less politically biased, since stupid is an equal-opportunity selector.


That Stewart lampoons both the left and right should be an indication that he exhibits less confirmation bias than most pundits.

 

As an axiom, I'm not sure that works. Just to give a counter-example, Bill O'Reilly defends/attacks both liberal and conservative viewpoints, but still exhibits a significant amount of confirmation bias (IMO).

 

But perhaps this is due to his having a third political position, distinct from liberals and conservatives, which I usually describe as "populist". Jon Stewart has a pretty obvious ideological preference, but I don't think it could really be described as, say, an agenda. Maybe that's part of it.


I think the view most reinforced by Jon Stewart's confirmation bias is "politicians are ridiculous."

 

My guess is that Stewart is quite aware of the problems of sampling biases in only running clips and commentary of politicians acting stupidly.

 

So I don't think he's saying: "Watch these ten clips of politicians acting stupidly; we conclude that the vast majority of politicians act stupidly."

 

To analyze the situation better we should take a step back and look at the source of confirmation and other biases, which I think is the narrative fallacy.

 

From Nassim Taleb's The Black Swan:

 

The narrative fallacy addresses our limited ability to look at sequences of facts without weaving an explanation into them, or, equivalently, forcing a logical link, an arrow of relationship upon them. Explanations bind facts together. They make them all the more easily remembered; they help them make more sense. Where this propensity can go wrong is when it increases our impression of understanding.

 

When people think about 'truths' and 'facts' (political and otherwise), they don't do so abstractly. We embed our image of the world in narratives and stories.

 

We fall easy prey to confirmation bias because it is very easy to update a narrative we already have with new information. Furthermore, we seek out information that fits pre-existing narratives, making it difficult to update our mental models or change our minds about things.

 

Given that we can't escape using narratives, being honest that we're using a narrative goes a long way toward being honest about our understanding.

 

Getting back to the original point, I think this is why the Daily Show (and the Colbert Report, despite not being as funny) works where other networks fail:

 

Where the mainstream media will deny to the death that they're telling stories, that their "facts" and "fair and balanced" reporting are really just narratives that only approximate the 'truth', The Daily Show embraces the 'News as Narrative' concept.

 

As a result, they make the news funny and interesting in a way that people feel comfortable with. Stewart may be guilty of confirmation biases, but by placing the news inside a specific story, he doesn't become guilty of drawing larger conclusions that looking outside the narrative might disprove.

 

I think this is why he comes across as ideologically neutral as well. Not because the man himself is neutral, but because he's knowingly trying to tell a story (and so tries to make it a good one), where everyone else is trying to tell 'the truth' and is actually telling a bad story.

