Since confirmation bias can arise in various decision-making tasks and counter-argument may serve as a useful tool for eliminating it, future research should extend the proposed concept to other decision-making areas to test the effect of counter-argument on decision-making.
Just one example is what psychologists call confirmation bias: once we have formed an opinion, we tend to accept new evidence that supports it but disregard new evidence that challenges it.
Study after study demonstrates reason's deficiencies, such as the oft-noted confirmation bias (the tendency to recall, select, or interpret evidence in a way that supports one's preexisting beliefs) and people's poor performance on straightforward logic puzzles.
Lastly, their data show that model-tracing can detect interesting patterns of student inquiry, such as confirmation bias and overcoming confirmation bias.
An interesting example of the latter is the confirmation bias, which causes us to look for evidence confirming already existing beliefs and to ignore or reinterpret evidence countering those beliefs.
* confirmation bias: confirming what you expect to find
In this context, the most relevant of these is 'confirmation bias,' which is the human tendency to find ways to confirm preconceptions rather than refute them.
This error is typically called 'confirmation bias', sometimes also 'the availability error', 'the primacy effect', 'belief persistence', 'positivity bias', or the 'congruence heuristic' (Gilovich, 1991; Sutherland, 1992; Nickerson, 1998).
Confirmation bias leads people to embrace new information that reinforces their existing assumptions and to reject information that challenges them.
They include the affirmation bias (our tendency to agree with one another, or "go along to get along"), the confirmation bias (the tendency to overweigh evidence that supports what we already believe and to underweigh or ignore evidence that does not), the endowment effect (the tendency to overvalue things that are our own), the fixed-pie perspective (the belief that the totality of benefits to be distributed among participants is like a pie that cannot be enlarged, so every piece our opponent gets is a piece that we don't get), the fundamental attribution error (attributing a different and usually more admirable cause to our own behavior than to someone else's behavior), and loss aversion (the tendency to give undue weight to losses so that they loom larger than gains).
One inherent danger associated with theories is the tendency to search for or interpret information in a way that confirms one's preconceptions, an error referred to as confirmation bias.