
Confirmation bias: How can you prevent it in yourself?



Is simple awareness of confirmation bias enough to reduce its impact, so that one can keep an open mind and consider all information critically, on its own merit? I find myself overwhelmed by information these days, and I need to filter what I should consider from what I shouldn't. Often I select by source, but I have sometimes seen the same article posted by two separate sources, one which I would typically disregard and one which I would typically consider reading.

I have a feeling some sources will be selective in their information, for a particular agenda or because of a particular ethos; some credible information may fit with their agenda, and this is a facet of their confirmation bias. I also feel that some sources will liberally sprinkle rational articles in with a multitude of pseudoscience, so that it can be hard to tell the difference when glossing over the information: some of it appears true, and thus disguises the lies via a herd effect.

Finally, suppose confirmation bias isn't as important an effect as is currently thought, and someone points this out in their research, which is then overwhelmed by research showing it is important. Would confirmation bias paradoxically lead us to disregard the research that contradicts its effect? :P


I think one way to help avoid confirmation bias is to phrase your initial questions or statements in such a way that they do not prejudice the undertaking towards a certain outcome, i.e. the results that you want to see.

Edited by StringJunky

  • 2 years later...

Sorcerer,

 

This is an incredibly important topic, for a reason I just recently noticed. It ties in with our dopamine reward system. As StringJunky says, you try to be objective by avoiding phrasing the question in a way that would prejudice the undertaking toward results you "want" to see. Except, according to my current thinking about dopamine and its role in our evolution, consciousness, and thinking abilities, it is important to be right, to get a match between what you thought the world was like and what it objectively turns out to be. That is, the whole reason for the undertaking, the theory, the test, the study, is to confirm the match, to "be right" about the world. That is what provides the good feeling: the feeling of victory, the feeling of correctness, the getting of the joke, the understanding of reality, the having of an insight (as precipitated this post in my case), the finding that the thought fit the place. It is the whole reason we want to survive and live and enjoy the place in the first place.

 

So it would be unlikely that one would structure a study, or even undertake a study if one was not expecting to find a successful match between model and reality.

 

This idea, confirmation bias, is thus unavoidable, and actually there is no real reason to avoid it completely, since finding out the world is not what you thought it was is usually depressing, and people in all areas of endeavor "want" to strengthen their narrative, to prove they are right and the other wrong.

 

So, although complex, being right is a natural survival requirement, one that is closely tied to human socialization and to personal feelings of accomplishment and worth.

 

We have no way to avoid it, without becoming something other than human. And confirmation of one's worldview is not such a bad thing.

 

It does cause great rifts in societies, as people tend to double down when they are proven wrong, in an effort to "be right" in the end; but it gets ridiculous, as people often cut off their nose to spite their face. Marriage counselors will often point out that you sometimes have to decide whether it is more important to be happy or to be right.

 

So it is complex, but being "objectively" right fulfills no purpose on its own. If it doesn't make a person subjectively right, there is no dopamine, and the whole idea of "being right" is bypassed.

 

Regards, TAR


 

So it would be unlikely that one would structure a study, or even undertake a study if one was not expecting to find a successful match between model and reality.

 

 

 

Would it be similar to someone posing the same hypothesis over and over, and ignoring people who present evidence that contradicts it?


If I am at work and am not sure about something I have tested and found (or wanted) to be good, I usually give the test to someone else to repeat, to see if they get the same result. Sometimes (if it is an additive or something similar being tested), I give someone else the test without telling them what I suspect the outcome to be. I do this to avoid my own bias and also that of the tester.

Edited by DrP

DrP,

 

At my work, a couple of years ago, I tested software. The tests we would run were written very specifically: perform a certain set of conditions and expect a certain outcome. The expected outcome was part of the testing process. Pass or fail. Correct or incorrect. Fulfills requirements, or a bug has been identified.

 

What kind of tests do you run if you are not expecting a certain thing to occur or not occur in some measured time to some measured degree?

 

Regards, TAR


Am I just going to mix two random things together and write down what happens or does not happen?

 

Don't I have to specify what it is I am testing for?
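A test of the kind described here — fixed conditions, an expected outcome specified in advance, a binary pass or fail — might be sketched as follows. The function under test and its expected value are hypothetical, purely for illustration:

```python
# Minimal sketch of an expected-outcome software test: run a fixed
# set of conditions and compare the result against an expectation
# written down in advance. The function under test is hypothetical.

def discount_price(price, rate):
    """Function under test: apply a fractional discount to a price."""
    return round(price * (1 - rate), 2)

def test_discount_price():
    # The expected outcome is part of the test itself.
    # Pass or fail; fulfills the requirement or a bug is identified.
    expected = 90.0
    actual = discount_price(100.0, 0.10)
    assert actual == expected, f"expected {expected}, got {actual}"

test_discount_price()  # silent on pass, raises AssertionError on failure
```

The point of the sketch is that the test cannot run at all without specifying what is being tested for.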


SwansonT,

 

People expected GWs to show up in LIGO.

 

And everyone felt wonderful when Einstein was again proven correct and the predictions of GR and SR were found again to be workable theories that fit reality.

 

Regards, TAR


Tar,

 

The tests I was thinking of here are ones where the improvements are very subtle, in a system that kicks out a range of results from run to run. It isn't exactly reproducible every time, so we have to do dozens of runs and average the result. There are also tests that aren't so clear cut. Let's say you add an ingredient to improve the 'tastiness' of something: how do you measure 'tastiness'? It is subjective to the people doing the testing, so you need to average the results over thousands of tests to even out people's spectrum of tastes. I would get others to test the 'tastiness' of something I think I have improved, as I wouldn't want my personal bias to influence the results (this is a little more subjective than the tests I am talking about, but it is just an example). I would also not want to tell the person which one I thought was tastier, in case it biased their opinion of the 'tastiness' of the product.

 

Maybe tastiness isn't a good example, but it is the first one that came to mind. We do small-scale tests in the lab and have to design tests and comparisons between things that can only really be tested at large scale, costing thousands. If we can get some idea of how a system will work before paying for large tests costing tens of thousands, then that is good.
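The averaging described above can be sketched in a few lines: simulate two samples whose true difference is small compared with the run-to-run noise, and observe that single head-to-head runs are unreliable while the averages separate cleanly. All numbers and the Gaussian noise model are invented for illustration:

```python
import random
import statistics

# Sketch of averaging noisy, run-to-run measurements. A single run of
# a system with this much noise can rank the worse sample above the
# better one; averaging many runs recovers the true ordering.
random.seed(42)

def one_run(true_quality, noise=1.0):
    # One measurement (or one tester's subjective score):
    # the true quality plus random run-to-run variation.
    return true_quality + random.gauss(0, noise)

runs = 500
sample_a = [one_run(7.0) for _ in range(runs)]  # baseline formulation
sample_b = [one_run(7.5) for _ in range(runs)]  # slightly improved one

# Roughly a third of individual head-to-head runs go the "wrong" way...
reversals = sum(b < a for a, b in zip(sample_a, sample_b)) / runs

# ...but the averages clearly separate the two samples.
print(statistics.mean(sample_a), statistics.mean(sample_b), reversals)
```

In a blind version of this comparison, the testers producing the individual scores would not know which sample is which, mirroring the practice of not telling the tester the suspected outcome.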


SwansonT,

 

People expected GWs to show up in LIGO.

 

And everyone felt wonderful when Einstein was again proven correct and the predictions of GR and SR were found again to be workable theories that fit reality.

 

Regards, TAR

That is a massively simplistic view of the experiment, and does not sufficiently support your hypothesis.

 

If the motivation is solely to get an answer that matches predictions, you would be satisfied with false positives. The fact that scientists (including in LIGO) take great pains to exclude false positives is evidence against your position.


That is a massively simplistic view of the experiment, and does not sufficiently support your hypothesis.

 

If the motivation is solely to get an answer that matches predictions, you would be satisfied with false positives. The fact that scientists (including in LIGO) take great pains to exclude false positives is evidence against your position.

This is exemplified by the other GW experiment, whose name escapes me, when they put out their results for external examination and an electrical fault was discovered.


SwansonT,

 

Not inconsistent with my thought, to want to be sure about the match. We absolutely do not like to be fooled about reality. We "want" to be sure.

 

Regards, TAR


We want to be right. Consider again how people defend their own narrative. Consider, in terms of my thought, how people divide other people into we and they categories. Those that are right and those that are wrong. I am talking about psychology. How we think about ourselves and others.


This is exemplified by the other GW experiment, whose name escapes me, when they put out their results for external examination and an electrical fault was discovered.

 

 

Gran Sasso neutrino experiment collaboration with CERN?

 

 

Not inconsistent with my thought, to want to be sure about the match. We absolutely do not like to be fooled about reality. We "want" to be sure.

The inconsistency is with your motivation. Much like a scientific inquiry, you need to remove other possibilities from consideration. To not do so and assert that you are right is...confirmation bias.

 

"it would be unlikely that one would structure a study, or even undertake a study if one was not expecting to find a successful match between model and reality"

 

One could also say it would be unlikely because it would be grossly stupid and incompetent to design an experiment that did not have the possibility or capability of confirming a hypothesis.

 

I know scientists and engineers who work on projects that are unlikely to succeed. I have worked on such a project (it was looking for physics beyond the standard model. It found none). Where's the confirmation bias and where's the dopamine reward?

 

We want to be right. Consider again how people defend their own narrative. Consider, in terms of my thought, how people divide other people into we and they categories. Those that are right and those that are wrong. I am talking about psychology. How we think about ourselves and others.

You keep changing the narrative. Wanting to be right is not confirmation bias. Wanting to think you are right and thereby not considering all information is.

 

————

 

I have worked on such a project (it was looking for physics beyond the standard model. It found none).

Here's a story about that. I was a postdoc at TRIUMF. We were trying to trap radioactive potassium (K-37, with a half-life of 1.226 s) in order to do nuclear theory experiments with it.

 

Nobody knew the salient details for trapping it, because nobody had done spectroscopy on it. We searched where the theory said it should be, and after several days, eventually we saw a bright blob in the camera at some frequency of light. We told the others in the group about it, but that was it. A day or two later at lunch, someone said they had heard a rumor that we had trapped the isotope. My fellow postdoc was adamant that we could not confirm it publicly until we had finished our tests to make sure it was not some anomaly, like scattered light.

 

You will find the accelerator community, with its "5-sigma" criterion for reporting new phenomena, is especially careful in announcing results. Other groups, who have to rely less on raw statistics, have other criteria.
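For scale, the 5-sigma criterion corresponds to a one-sided tail probability of about 3 in 10 million under a normal distribution — roughly the chance that a pure statistical fluctuation produced the signal. It can be computed with only the standard library:

```python
import math

def tail_probability(n_sigma):
    # One-sided upper-tail probability of a standard normal:
    # P(Z > n) = erfc(n / sqrt(2)) / 2
    return math.erfc(n_sigma / math.sqrt(2)) / 2

# The 5-sigma discovery threshold is about 2.9e-7,
# i.e. roughly one chance in 3.5 million.
print(tail_probability(5))
```

This is why a 5-sigma result is treated as a discovery rather than a fluctuation, provided systematic errors (like a loose cable) have also been excluded.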

 

That's pretty much the opposite of confirmation bias.


 

 


 

SwansonT,

 

The dopamine reward is in the confirmation of the standard model.

 

Take for instance, every conversation on this board, at least the threads I have been part of, where people choose up sides and either defend a hypothesis or seek to find fault with it. There is, according to my thesis, a need for people to be right. To look at the evidence that backs up their claim, and doubt the evidence that contradicts it. I am not saying I do not have confirmation tendencies in terms of my thesis, I am saying that when I see agreement with my thesis, it makes me feel good, makes me feel smart, makes me feel like I have an insight to share with you folks, thereby increasing your grasp of reality and your ability to help confront the difficulties we face, as a group.

 

You have not yet here, in this thread, provided any evidence that my thesis is faulty. You gave an example of running a test to find a match with reality beyond the current standard model. The example actually coincides exactly with my thought. We want to know that our model matches the place. We trust others to tell us what they see, to be more sure that our eyes are not deceiving us. Edison tested thousands of materials to find the one that burned the longest. The failures were successes in that he could cross this or that material off the list. He was after the best fit to the problem of what to send electricity through, in a vacuum, to provide light for the longest period of time reasonably possible.

 

Regards, TAR


 

Perhaps you are thinking of OPERA?

 

https://en.wikipedia.org/wiki/Faster-than-light_neutrino_anomaly

 

 

 

 

Gran Sasso neutrino experiment collaboration with CERN?

That's the one.

 

 

....Take for instance, every conversation on this board, at least the threads I have been part of, where people choose up sides and either defend a hypothesis or seek to find fault with it. There is, according to my thesis, a need for people to be right.....

 

Only in some people, and they are the ones likely to fall prey to confirmation bias. Obviously the desire exists, but good scientists are acutely aware of things like this. That's why peer review is such a good mechanism; the scientists in the OPERA experiment are fine examples of using it, and they clearly negate your hypothesis that it is a ubiquitous trait. A lot of the regulars here know that facts can change with new evidence, and they only defend a stance in the face of current evidence.


SwansonT,

 

The dopamine reward is in the confirmation of the standard model.

But that wasn't the goal of the experiment. Basically you're claiming that whether an experiment succeeds or fails, you get a dopamine reward.

 

Take for instance, every conversation on this board, at least the threads I have been part of, where people choose up sides and either defend a hypothesis or seek to find fault with it. There is, according to my thesis, a need for people to be right. To look at the evidence that backs up their claim, and doubt the evidence that contradicts it. I am not saying I do not have confirmation tendencies in terms of my thesis, I am saying that when I see agreement with my thesis, it makes me feel good, makes me feel smart, makes me feel like I have an insight to share with you folks, thereby increasing your grasp of reality and your ability to help confront the difficulties we face, as a group.

But we're talking about confirmation bias. There is an implication that confirmation bias must be involved.

 

You have not yet here, in this thread, provided any evidence that my thesis is faulty. You gave an example of running a test to find a match with reality beyond the current standard model. The example actually coincides exactly with my thought. We want to know that our model matches the place. We trust others to tell us what they see, to be more sure that our eyes are not deceiving us. Edison tested thousands of materials to find the one that burned the longest. The failures were successes in that he could cross this or that material off the list. He was after the best fit to the problem of what to send electricity through, in a vacuum, to provide light for the longest period of time reasonably possible.

 

Regards, TAR

Our model was that the standard model was wrong. Basically you've recast everything as a success, and as a result your idea is not falsifiable. Everything can be seen as supporting it. Which is...confirmation bias.


String Junky, SwansonT,

 

Understood that good scientists purposely try to minimize confirmation bias. And understood that a major tenet of science is the effort to falsify a claim, particularly a belief of your own. But both considerations fit my thesis: that dopamine flows when we are correct about reality, when the match between our model and reality is secure. I know I feel good when I think I am correctly understanding the place, and I assume in this idea that if I feel good about a thing (get dopamine), it is very possible, if not probable, that other human beings, with the same evolutionarily developed serotonin, norepinephrine, and dopamine motivation-and-reward system, would find pleasure and good feelings of comfort, security, and victory if they experienced circumstances similar to the ones in which I felt good, smart, alive, and victorious.

 

On this view, it is not wrong to think yourself right. It is human. And sometimes, especially with the help of others, you feel right twice: once, when your model matches the place, and again, when your model matches someone else's model of the place.

 

Regards, TAR


SwansonT,

 

Well then go ahead and discuss.

 

I thought it central to the discussion. But go ahead and discuss it without human neurotransmitters involved. I personally don't see how that is going to work out for you.

 

Regards, TAR

