Overconfidence effect

From Wikipedia, the free encyclopedia

The overconfidence effect is a well-established bias in which someone's subjective confidence in their judgments is reliably greater than their objective accuracy, especially when confidence is relatively high.[1] For example, in some quizzes, people rate their answers as "99% certain" but are wrong 40% of the time. It has been proposed that a metacognitive trait mediates the accuracy of confidence judgments,[2] but this trait's relationship to variations in cognitive ability and personality remains uncertain.[1] Overconfidence is one example of a miscalibration of subjective probabilities.


Demonstration

The most common way in which overconfidence has been studied is by asking people how confident they are of specific beliefs they hold or answers they provide. The data show that confidence systematically exceeds accuracy, implying people are more sure that they are correct than they deserve to be. If human confidence had perfect calibration, judgments with 100% confidence would be correct 100% of the time, 90% confidence correct 90% of the time, and so on for the other levels of confidence. By contrast, the key finding is that confidence exceeds accuracy so long as the subject is answering hard questions about an unfamiliar topic. For example, in a spelling task, subjects were correct about 80% of the time when they were "100% certain."[3] Put another way, the error rate was 20% when subjects expected it to be 0%. In a series where subjects made true-or-false responses to general knowledge statements, they were overconfident at all levels. When they were 100% certain of their answer to a question, they were wrong 20% of the time.[4]
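The calibration comparison described above can be made concrete with a short computation: group responses by their stated confidence level and compare each group's stated confidence with its observed accuracy. The data below are invented purely for illustration; a perfectly calibrated judge would show zero gap at every level.

```python
from collections import defaultdict

# Hypothetical responses (invented for illustration):
# each tuple pairs a stated confidence with whether the answer was correct.
responses = [
    (1.0, True), (1.0, True), (1.0, True), (1.0, False),   # "100% certain"
    (0.8, True), (0.8, True), (0.8, False), (0.8, False),  # "80% certain"
]

# Group answers by stated confidence.
by_conf = defaultdict(list)
for conf, correct in responses:
    by_conf[conf].append(correct)

# Compare stated confidence with observed accuracy in each group.
for conf in sorted(by_conf, reverse=True):
    hits = by_conf[conf]
    accuracy = sum(hits) / len(hits)
    print(f"stated {conf:.0%}, observed {accuracy:.0%}, gap {conf - accuracy:+.0%}")
```

A positive gap at a given level indicates overconfidence at that level, mirroring the spelling-task finding above (100% stated, roughly 80% observed).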

In a confidence-intervals task, where subjects had to judge quantities such as the total egg production of the U.S. or the total number of physicians and surgeons in the Boston Yellow Pages, they expected an error rate of 2% when their real error rate was 46%.[5] Once subjects had been thoroughly warned about the bias, they still showed a high degree of overconfidence.

Overprecision is the excessive confidence that one knows the truth. For reviews, see Harvey (1997) or Hoffrage (2004).[6][7] Much of the evidence for overprecision comes from studies in which participants are asked about their confidence that individual items are correct. This paradigm, while useful, cannot distinguish overestimation from overprecision; the two are one and the same in these item-confidence judgments. After making a series of item-confidence judgments, if people try to estimate the number of items they got right, they do not tend to systematically overestimate their scores. Yet the average of their item-confidence judgments exceeds the count of items they claim to have gotten right.[8] One possible explanation is that the item-confidence judgments were inflated by overprecision, while the set-level estimates show no systematic overestimation.

Confidence intervals

The strongest evidence of overprecision comes from studies in which participants are asked to indicate how precise their knowledge is by specifying a 90% confidence interval around estimates of specific quantities. If people were perfectly calibrated, their 90% confidence intervals would include the correct answer 90% of the time.[5] In fact, hit rates are often as low as 50%, suggesting people have drawn their confidence intervals too narrowly, implying that they think their knowledge is more accurate than it actually is.
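The interval-calibration check described above can be sketched as a hit-rate computation: count how often the true value falls inside the stated 90% interval. All quantities and intervals below are invented for illustration.

```python
# Each entry pairs a stated 90% confidence interval (lower, upper)
# with the true value of the quantity being estimated.
# All numbers are invented for illustration.
intervals = [
    ((50, 70), 65),     # hit: the truth falls inside the interval
    ((100, 120), 150),  # miss: the interval was drawn too narrowly
    ((10, 30), 8),      # miss
    ((200, 400), 350),  # hit
]

# A well-calibrated judge's 90% intervals would contain the truth ~90% of the time.
hits = sum(lo <= truth <= hi for (lo, hi), truth in intervals)
hit_rate = hits / len(intervals)
print(f"hit rate: {hit_rate:.0%}")  # here 50%, well below the stated 90%
```

A hit rate well below the stated 90% is the signature of overprecision: the intervals are too narrow for the judge's actual knowledge.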

Planning fallacy

The planning fallacy describes the tendency for people to overestimate their rate of work or to underestimate how long it will take them to get things done.[9] It is strongest for long and complicated tasks, and disappears or reverses for simple tasks that are quick to complete.

Illusion of control

Illusion of control describes the tendency for people to behave as if they might have some control when in fact they have none.[10] However, evidence does not support the notion that people systematically overestimate how much control they have; when they have a great deal of control, people tend to underestimate how much control they have.[11]

Contrary evidence

Wishful thinking effects, in which people overestimate the likelihood of an event because of its desirability, are relatively rare.[12] This may be partly because people engage in more defensive pessimism in advance of important outcomes,[13] in an attempt to reduce the disappointment that follows overly optimistic predictions.[14]

Overplacement

Overplacement is the false belief that one is better than others. For a review, see Alicke and Govorun (2005).[15]

Better-than-average effects

Perhaps the most celebrated better-than-average finding is Svenson’s (1981) finding that 93% of American drivers rate themselves as better than the median.[16] The frequency with which school systems claim their students outperform national averages has been dubbed the “Lake Wobegon” effect, after Garrison Keillor’s apocryphal town in which “all the children are above average.”[17] Overplacement has likewise been documented in a wide variety of other circumstances.[18] Kruger (1999), however, showed that this effect is limited to “easy” tasks in which success is common or in which people feel competent. For difficult tasks, the effect reverses itself and people believe they are worse than others.[19]

Comparative-optimism effects

Some researchers have claimed that people think good things are more likely to happen to them than to others, whereas bad events are less likely to happen to them than to others.[20] But others (Chambers & Windschitl, 2004; Chambers, Windschitl, & Suls, 2003; Kruger & Burrus, 2004) have pointed out that prior work tended to examine good outcomes that happened to be common (such as owning one’s own home) and bad outcomes that happened to be rare (such as being struck by lightning).[21][22][23] Event frequency thus accounts for a share of prior findings of comparative optimism. People think common events (such as living past 70) are more likely to happen to them than to others, and rare events (such as living past 100) are less likely to happen to them than to others.

Positive illusions

Taylor and Brown (1988) have argued that people cling to overly positive beliefs about themselves, illusions of control, and beliefs in false superiority, because it helps them cope and thrive.[24] While there is some evidence that optimistic beliefs are correlated with better life outcomes, most of the research documenting such links is vulnerable to the alternative explanation that their forecasts are accurate. The cancer patients who are most optimistic about their survival chances are optimistic because they have good reason to be.

Contrary evidence

Recent work has critiqued the methodology used in older research on overplacement, calling some of the effects documented in prior research into question.[25]

Practical implications

"Overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion."

Daniel Kahneman[26]

Overconfidence has been called the most “pervasive and potentially catastrophic” of all the cognitive biases to which human beings fall victim.[27] It has been blamed for lawsuits, strikes, wars, and stock market bubbles and crashes.

Strikes, lawsuits, and wars could arise from overplacement. If plaintiffs and defendants were prone to believe that they were more deserving, fair, and righteous than their legal opponents, that could help account for the persistence of inefficient, enduring legal disputes.[28] If corporations and unions were prone to believe that they were stronger and more justified than the other side, that could contribute to their willingness to endure labor strikes.[29] If nations were prone to believe that their militaries were stronger than those of other nations, that could explain their willingness to go to war.[30]

Overprecision could have important implications for investing behavior and stock market trading. Because Bayesians cannot agree to disagree,[31] classical finance theory has trouble explaining why, if stock market traders are fully rational Bayesians, there is so much trading in the stock market. Overprecision might be one answer.[32] If market actors are too sure that their estimates of an asset’s value are correct, they will be too willing to trade with others who have different information than they do.

Oskamp (1965) tested groups of clinical psychologists and psychology students on a multiple-choice task in which they drew conclusions from a case study.[33] Along with their answers, subjects gave a confidence rating in the form of a percentage likelihood of being correct, allowing confidence to be compared against accuracy. As the subjects were given more information about the case study, their confidence increased from 33% to 53%, yet their accuracy did not significantly improve, staying under 30%. The experiment thus demonstrated overconfidence that grew as subjects gained more information on which to base their judgment.[33]

Even if there is no general tendency toward overconfidence, social dynamics and adverse selection could conceivably promote it. For instance, those most likely to have the courage to start a new business are those who most overplace their abilities relative to those of other potential entrants. And if voters find confident leaders more credible, then contenders for leadership learn that they should express more confidence than their opponents in order to win election.[34]

Overconfidence can be beneficial to individual self-esteem and can give a person the will to persist toward a desired goal. Simply believing in oneself may carry one's endeavours further than those of people who lack that belief.[35]

Related biases

  • Overconfidence bias often serves to increase the effects of escalating commitment, causing decision makers to refuse to withdraw from a losing situation or to continue to throw good money, effort, time, and other resources after bad investments.[citation needed]
  • People often tend to ignore base rates or undervalue their effect. For example, if one is competing against individuals who are already winners of previous competitions, one's odds of winning should be adjusted downward considerably. People tend to fail to do so sufficiently.[citation needed]

Core self-evaluations

Very high levels of core self-evaluations (CSE), a stable personality trait composed of locus of control, neuroticism, self-efficacy, and self-esteem,[36] may lead to the overconfidence effect. People who have high core self-evaluations will think positively of themselves and be confident in their own abilities,[36] although extremely high levels of CSE may cause an individual to be more confident than is warranted.


Notes

  1. ^ a b Pallier, Gerry, et al. "The role of individual differences in the accuracy of confidence judgments." The Journal of General Psychology 129.3 (2002): 257+.
  2. ^ Stankov, L. (1999). Mining on the "no man's land" between intelligence and personality. In P.L. Ackerman, P.C. Kyllonen, & R.D. Roberts (Eds.), Learning and individual differences: Process, trait, and content determinants (pp. 314–337). Washington, DC: American Psychological Association.
  3. ^ Adams, P. A., & Adams, J. K. (1960). Confidence in the recognition and reproduction of words difficult to spell. The American Journal of Psychology, 73(4), 544-552.
  4. ^ Lichtenstein, Sarah; Baruch Fischhoff, Lawrence D. Phillips (1982). "Calibration of probabilities: The state of the art to 1980". In Daniel Kahneman, Paul Slovic, Amos Tversky. Judgment under uncertainty: Heuristics and biases. Cambridge University Press. pp. 306–334. ISBN 978-0-521-28414-1. 
  5. ^ a b Alpert, Marc; Howard Raiffa (1982). "A progress report on the training of probability assessors". In Daniel Kahneman, Paul Slovic, Amos Tversky. Judgment under uncertainty: Heuristics and biases. Cambridge University Press. pp. 294–305. ISBN 978-0-521-28414-1. 
  6. ^ Harvey, N. (1997). Confidence in judgment. Trends in Cognitive Sciences, 1(2), 78-82.
  7. ^ Hoffrage, Ulrich (2004). "Overconfidence". In Rüdiger Pohl. Cognitive Illusions: a handbook on fallacies and biases in thinking, judgement and memory. Psychology Press. ISBN 978-1-84169-351-4. 
  8. ^ Gigerenzer, G. (1993). The bounded rationality of probabilistic mental models. In K. I. Manktelow & D. E. Over (Eds.), Rationality (pp. 127-171). London: Routledge.
  9. ^ Buehler, R., Griffin, D., & Ross, M. (1994). Exploring the "planning fallacy": Why people underestimate their task completion times. Journal of Personality and Social Psychology, 67(3), 366-381.
  10. ^ Langer, E. J. (1975). The illusion of control. Journal of Personality and Social Psychology, 32(2), 311-328.
  11. ^ Gino, F., Sharek, Z., & Moore, D. A. (2011). Keeping the illusion of control under control: Ceilings, floors, and imperfect calibration. Organizational Behavior & Human Decision Processes, 114, 104-114.
  12. ^ Krizan, Z., & Windschitl, P. D. (2007). The influence of outcome desirability on optimism. Psychological Bulletin, 133(1), 95-121.
  13. ^ Norem, J. K., & Cantor, N. (1986). Defensive pessimism: Harnessing anxiety as motivation. Journal of Personality and Social Psychology, 51(6), 1208-1217.
  14. ^ McGraw, A. P., Mellers, B. A., & Ritov, I. (2004). The affective costs of overconfidence. Journal of Behavioral Decision Making, 17(4), 281-295.
  15. ^ Alicke, M. D., & Govorun, O. (2005). The better-than-average effect. In M. D. Alicke, D. Dunning & J. Krueger (Eds.), The self in social judgment (pp. 85-106). New York: Psychology Press.
  16. ^ Svenson, O. (1981). Are we less risky and more skillful than our fellow drivers? Acta Psychologica, 47, 143-151.
  17. ^ Cannell, J. J. (1989). How public educators cheat on standardized achievement tests: The "Lake Wobegon" report.
  18. ^ Dunning, D. (2005). Self-insight: Roadblocks and detours on the path to knowing thyself. New York: Psychology Press.
  19. ^ Kruger, J. (1999). Lake Wobegon be gone! The "below-average effect" and the egocentric nature of comparative ability judgments. Journal of Personality and Social Psychology, 77(2), 221-232.
  20. ^ Weinstein, N. D. (1980). Unrealistic optimism about future life events. Journal of Personality and Social Psychology, 39(5), 806-820.
  21. ^ Chambers, J. R., & Windschitl, P. D. (2004). Biases in social comparative judgments: The role of nonmotivational factors in above-average and comparative-optimism effects. Psychological Bulletin, 130(5).
  22. ^ Chambers, J. R., Windschitl, P. D., & Suls, J. (2003). Egocentrism, event frequency, and comparative optimism: When what happens frequently is "more likely to happen to me". Personality and Social Psychology Bulletin, 29(11), 1343-1356.
  23. ^ Kruger, J., & Burrus, J. (2004). Egocentrism and focalism in unrealistic optimism (and pessimism). Journal of Experimental Social Psychology, 40(3), 332-340.
  24. ^ Taylor, S. E., & Brown, J. D. (1988). Illusion and well-being: a social psychological perspective on mental health. Psychological Bulletin, 103(2), 193-210.
  25. ^ Harris, A. J. L., & Hahn, U. (2011). Unrealistic Optimism about Future Life Events: A cautionary note. Psychological Review, 118(1), 135-154.
  26. ^ Kahneman, Daniel (19 October 2011). "Don't Blink! The Hazards of Confidence". New York Times. Retrieved 25 October 2011. 
  27. ^ Plous, S. (1993). The psychology of judgment and decision making. New York: McGraw-Hill.
  28. ^ Thompson, L., & Loewenstein, G. (1992). Egocentric interpretations of fairness and interpersonal conflict. Organizational Behavior and Human Decision Processes, 51(2), 176-197.
  29. ^ Babcock, L., & Olson, C. (1992). The causes of impasses in labor disputes. Industrial Relations, 31, 348-360.
  30. ^ Johnson, D. D. P. (2004). Overconfidence and war: The havoc and glory of positive illusions. Cambridge, MA: Harvard University Press.
  31. ^ Aumann, R. J. (1976). Agreeing to disagree. Annals of Statistics, 4, 1236-1239.
  32. ^ Daniel, K. D., Hirshleifer, D. A., & Subrahmanyam, A. (1998). Investor psychology and security market under- and overreactions. Journal of Finance, 53(6), 1839-1885.
  33. ^ a b Oskamp, Stuart (1965). "Overconfidence in case-study judgements". The Journal of Consulting Psychology (American Psychological Association) 2: 261–265.  reprinted in Kahneman, Daniel; Paul Slovic, Amos Tversky (1982). Judgment under uncertainty: Heuristics and biases. Cambridge University Press. pp. 287–293. ISBN 978-0-521-28414-1. 
  34. ^ Radzevick, J. R., & Moore, D. A. (2011). Competing to be certain (but wrong): Social pressure and overprecision in judgment. Management Science, 57(1), 93-106.
  35. ^ Fowler, James, and Dominic Johnson. "On Overconfidence." Seed Magazine. Seed Magazine, January 7, 2011. Web. 22 Jul 2011. http://seedmagazine.com/content/article/on_overconfidence/
  36. ^ a b Judge, T. A., Locke, E. A., & Durham, C. C. (1997). The dispositional causes of job satisfaction: A core evaluations approach. Research in Organizational Behavior, 19, 151–188.

References

  • Adams, P. A., & Adams, J. K. (1960). Confidence in the recognition and reproduction of words difficult to spell. The American Journal of Psychology, 73(4), 544-552.
  • Cannell, J. J. (1989). How public educators cheat on standardized achievement tests: The "Lake Wobegon" report.
  • Johnson, D. D. P. (2004). Overconfidence and war: The havoc and glory of positive illusions. Cambridge, MA: Harvard University Press.
  • Larrick, R. P., Burson, K. A., & Soll, J. B. (2007). Social comparison and confidence: When thinking you're better than average predicts overconfidence (and when it does not). Organizational Behavior & Human Decision Processes, 102(1), 76-94.

Further reading

  • Moore, D. A., & Healy, P. J. (2008). The trouble with overconfidence. Psychological Review, 115(2), 502-517.
  • Baron, Jonathan (1994). Thinking and Deciding. Cambridge University Press. pp. 219–224. ISBN 0-521-43732-6. 
  • Gilovich, Thomas; Dale Griffin, Daniel Kahneman (Eds.). (2002). Heuristics and biases: The psychology of intuitive judgment. Cambridge, UK: Cambridge University Press. ISBN 0-521-79679-2
  • Sutherland, Stuart (2007). Irrationality. Pinter & Martin. pp. 172–178. ISBN 978-1-905177-07-3.