
Decision theory

From Wikipedia, the free encyclopedia
[Image: Wagrez's The Judgement of Paris] The mythological judgement of Paris required selecting from three incomparable alternatives (the goddesses shown).

Decision theory or the theory of rational choice is a branch of probability, economics, and analytic philosophy that uses the tools of expected utility and probability to model how individuals would behave rationally under uncertainty.[1][2] It differs from the cognitive and behavioral sciences in that it is mainly prescriptive and concerned with identifying optimal decisions for a rational agent, rather than describing how people actually make decisions. Despite this, the field is important to the study of real human behavior by social scientists, as it lays the foundations for the rational agent models used to mathematically model and analyze individuals in fields such as sociology, economics, criminology, cognitive science, and political science.

Normative and descriptive

Normative decision theory is concerned with identifying optimal decisions, where optimality is often determined by considering an ideal decision maker who is able to calculate with perfect accuracy and is in some sense fully rational. The practical application of this prescriptive approach (how people ought to make decisions) is called decision analysis and is aimed at finding tools, methodologies, and software (decision support systems) to help people make better decisions.[3][4]

In contrast, descriptive decision theory is concerned with describing observed behaviors, often under the assumption that those making decisions are behaving under some consistent rules. These rules may, for instance, have a procedural framework (e.g. Amos Tversky's elimination-by-aspects model) or an axiomatic framework (e.g. stochastic transitivity axioms) that reconciles the von Neumann–Morgenstern axioms with behavioral violations of the expected utility hypothesis, or they may explicitly give a functional form for time-inconsistent utility functions (e.g. Laibson's quasi-hyperbolic discounting).[3][4]
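
The quasi-hyperbolic (beta–delta) functional form named above can be written down compactly. The following is a minimal Python sketch for illustration only; the parameter values β = 0.7 and δ = 0.95 are arbitrary assumptions, not estimates from the literature.

```python
def exponential_discount(t, delta=0.95):
    """Time-consistent exponential discounting: D(t) = delta**t."""
    return delta ** t

def quasi_hyperbolic_discount(t, beta=0.7, delta=0.95):
    """Beta-delta (quasi-hyperbolic) discounting: D(0) = 1, D(t) = beta * delta**t for t >= 1.
    The one-off factor beta < 1 applied to every future period produces present bias and
    time-inconsistent preferences."""
    return 1.0 if t == 0 else beta * delta ** t

for t in range(6):
    print(t, round(exponential_discount(t), 3), round(quasi_hyperbolic_discount(t), 3))
```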

Prescriptive decision theory is concerned with the predictions about behavior that positive (descriptive) decision theory produces, in order to allow further tests of the kind of decision-making that occurs in practice. In recent decades, there has also been increasing interest in "behavioral decision theory", contributing to a re-evaluation of what useful decision-making requires.[5][6]

Types of decisions

Choice under uncertainty

The area of choice under uncertainty represents the heart of decision theory. Known from the 17th century (Blaise Pascal invoked it in his famous wager, which is contained in his Pensées, published in 1670), the idea of expected value is that, when faced with a number of actions, each of which could give rise to more than one possible outcome with different probabilities, the rational procedure is to identify all possible outcomes, determine their values (positive or negative) and the probability of each outcome resulting from each course of action, and multiply the two to give an "expected value", or the average expectation for an outcome. The action to be chosen should be the one that gives rise to the highest total expected value. In 1738, Daniel Bernoulli published an influential paper entitled Exposition of a New Theory on the Measurement of Risk, in which he uses the St. Petersburg paradox to show that expected value theory must be normatively wrong. He gives an example in which a Dutch merchant is trying to decide whether to insure a cargo being sent from Amsterdam to St. Petersburg in winter. In his solution, he defines a utility function and computes expected utility rather than expected financial value.[7]
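
Bernoulli's argument can be reproduced numerically. The sketch below contrasts the expected monetary value of the St. Petersburg gamble, which grows without bound, with its expected utility under a logarithmic utility of the payoff alone (a simplification of Bernoulli's treatment, which worked from the gambler's total wealth); the cutoff of 40 rounds is an arbitrary truncation for illustration.

```python
import math

# St. Petersburg gamble: with probability 2**-k the game ends on toss k and pays 2**k.
def expected_value(max_rounds):
    # each round contributes (1/2**k) * 2**k = 1, so the sum grows without bound
    return sum((0.5 ** k) * (2 ** k) for k in range(1, max_rounds + 1))

def expected_log_utility(max_rounds):
    # the series (1/2**k) * k * ln(2) converges to 2 * ln(2)
    return sum((0.5 ** k) * math.log(2 ** k) for k in range(1, max_rounds + 1))

ev = expected_value(40)            # 40.0 here; unbounded as max_rounds grows
eu = expected_log_utility(40)      # about 1.386
print(ev, eu, math.exp(eu))        # exp(eu) ~ 4: the gamble's certainty equivalent under log utility
```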

In the 20th century, interest was reignited by Abraham Wald's 1939 paper pointing out that the two central procedures of sampling-distribution-based statistical theory, namely hypothesis testing and parameter estimation, are special cases of the general decision problem.[8] Wald's paper renewed and synthesized many concepts of statistical theory, including loss functions, risk functions, admissible decision rules, antecedent distributions, Bayesian procedures, and minimax procedures. The phrase "decision theory" itself was used in 1950 by E. L. Lehmann.[9]
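
These concepts can be illustrated on a toy problem. In the hedged Python sketch below, the loss matrix, the prior, and the available actions are all invented for illustration; it simply computes a Bayes procedure (minimize expected loss under a prior over the states of nature) and a minimax procedure (minimize the worst-case loss).

```python
# Toy decision problem: two unknown states of nature, three available actions.
# loss[state][action] is the loss from taking that action when that state holds.
loss = {
    "state_1": {"a": 0.0, "b": 2.0, "c": 5.0},
    "state_2": {"a": 6.0, "b": 3.0, "c": 1.0},
}
actions = ["a", "b", "c"]

def bayes_action(prior):
    """Bayes procedure: minimize expected loss with respect to a prior over states."""
    expected = {act: sum(prior[s] * loss[s][act] for s in loss) for act in actions}
    return min(expected, key=expected.get), expected

def minimax_action():
    """Minimax procedure: minimize the worst-case loss over states."""
    worst = {act: max(loss[s][act] for s in loss) for act in actions}
    return min(worst, key=worst.get), worst

print(bayes_action({"state_1": 0.7, "state_2": 0.3}))  # picks 'a' -- depends on the prior
print(minimax_action())                                # picks 'b' -- ignores the prior
```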

The revival of subjective probability theory, from the work of Frank Ramsey, Bruno de Finetti, Leonard Savage and others, extended the scope of expected utility theory to situations where subjective probabilities can be used. At the time, von Neumann and Morgenstern's theory of expected utility[10] proved that expected utility maximization followed from basic postulates about rational behavior.
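
In symbols, a lottery L that yields outcome x_i with probability p_i is ranked by its expected utility. The following is a standard textbook statement of the representation rather than a quotation from any particular source:

```latex
% Expected utility of a lottery L = (x_1, p_1; \dots; x_n, p_n):
U(L) \;=\; \sum_{i=1}^{n} p_i \, u(x_i)
% The von Neumann--Morgenstern theorem: a preference relation over lotteries satisfying
% completeness, transitivity, continuity, and independence can be represented by such a U,
% with u unique up to a positive affine transformation  u'(x) = a\,u(x) + b,\quad a > 0.
```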

The work of Maurice Allais and Daniel Ellsberg showed that human behavior has systematic and sometimes important departures from expected-utility maximization (the Allais paradox and the Ellsberg paradox).[11] The prospect theory of Daniel Kahneman and Amos Tversky renewed the empirical study of economic behavior with less emphasis on rationality presuppositions. It describes a way by which people make decisions when all of the outcomes carry a risk.[12] Kahneman and Tversky found three regularities in actual human decision-making: "losses loom larger than gains"; people focus more on changes in their utility states than on absolute utilities; and the estimation of subjective probabilities is severely biased by anchoring.
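
A way to make the first two regularities concrete is the prospect-theory value function, which is defined over gains and losses relative to a reference point and is steeper for losses. The Python sketch below uses the power-function form with the parameter estimates commonly cited from Tversky and Kahneman's 1992 paper (α ≈ 0.88, λ ≈ 2.25); it omits the probability-weighting part of the theory.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Value of a gain/loss x measured from a reference point: concave for gains,
    convex for losses, and steeper for losses by the loss-aversion factor lam."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# "Losses loom larger than gains": a $100 loss weighs about 2.25 times a $100 gain.
print(prospect_value(100), prospect_value(-100))   # ~57.5 and ~-129.4
```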

Intertemporal choice

Intertemporal choice is concerned with the kind of choice where different actions lead to outcomes that are realized at different stages over time.[13] It is also described as cost-benefit decision making since it involves the choices between rewards that vary according to magnitude and time of arrival.[14] If someone received a windfall of several thousand dollars, they could spend it on an expensive holiday, giving them immediate pleasure, or they could invest it in a pension scheme, giving them an income at some time in the future. What is the optimal thing to do? The answer depends partly on factors such as the expected rates of interest and inflation, the person's life expectancy, and their confidence in the pensions industry. However even with all those factors taken into account, human behavior again deviates greatly from the predictions of prescriptive decision theory, leading to alternative models in which, for example, objective interest rates are replaced by subjective discount rates.
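
The windfall example can be made numerical. In the Python sketch below, every figure (a $5,000 windfall, a 3% annual return, a 30-year horizon, and a 6% subjective discount rate) is an assumption chosen purely for illustration; it shows that discounting the future pension at the objective interest rate leaves the decision-maker indifferent, while a higher subjective discount rate makes immediate spending look better.

```python
def future_value(amount, rate, years):
    """Value of investing `amount` at a fixed annual rate for `years` years."""
    return amount * (1 + rate) ** years

def present_value(amount, discount_rate, years):
    """Value today of receiving `amount` after `years` years, given a discount rate."""
    return amount / (1 + discount_rate) ** years

windfall, annual_return, horizon = 5_000, 0.03, 30
pension_payout = future_value(windfall, annual_return, horizon)   # about 12,136

# Discounted at the objective rate, investing and spending are worth the same today...
print(present_value(pension_payout, 0.03, horizon))   # ~5,000: indifferent
# ...but with a higher subjective discount rate the delayed reward shrinks.
print(present_value(pension_payout, 0.06, horizon))   # ~2,113: spend it now
```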

Interaction of decision makers

[Image: an electronic simulation room at the Naval War College during a 1958 wargame] Military planners often conduct extensive simulations to help predict the decision-making of relevant actors.

Some decisions are difficult because of the need to take into account how other people in the situation will respond to the decision that is taken. The analysis of such social decisions is more often treated under the label of game theory, rather than decision theory, though it involves the same mathematical methods. In the emerging field of socio-cognitive engineering, the research is especially focused on the different types of distributed decision-making in human organizations, in normal and abnormal/emergency/crisis situations.[15]

Complex decisions

Other areas of decision theory are concerned with decisions that are difficult simply because of their complexity, or the complexity of the organization that has to make them. Individuals making decisions are limited in resources (i.e. time and intelligence) and are therefore boundedly rational; the issue is thus not so much the deviation between real and optimal behavior as the difficulty of determining the optimal behavior in the first place. Decisions are also affected by whether options are framed together or separately; this is known as the distinction bias.

Heuristics

[Image: a ball inside a spinning roulette wheel] The gambler's fallacy: even when the roulette ball repeatedly lands on red, it is no more likely to land on black the next time.

Heuristics are procedures for making a decision without working out the consequences of every option. Heuristics decrease the amount of evaluative thinking required for decisions, focusing on some aspects of the decision while ignoring others.[16] While quicker than step-by-step processing, heuristic thinking is also more likely to involve fallacies or inaccuracies.[17]

One example of a common and erroneous thought process that arises through heuristic thinking is the gambler's fallacy: believing that an isolated random event is affected by previous isolated random events. For example, if flips of a fair coin give repeated tails, the coin still has the same probability (i.e., 0.5) of tails on future flips, though intuitively it might seem that heads becomes more likely.[18] In the long run, heads and tails should occur equally often; people commit the gambler's fallacy when they use this heuristic to predict that a result of heads is "due" after a run of tails.[19] Another example is that decision-makers may be biased towards preferring moderate alternatives to extreme ones. The compromise effect operates under a mindset that the most moderate option carries the most benefit. In an incomplete-information scenario, as in most daily decisions, the moderate option will look more appealing than either extreme, independent of the context, based only on the fact that it has characteristics that can be found at either extreme.[20]
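
The claim that a run of tails does not make heads "due" is easy to check by simulation. The short Python sketch below estimates the probability of heads immediately after five consecutive tails; the run length and the sample size are arbitrary choices for illustration.

```python
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(1_000_000)]   # True = heads, fair coin

# Collect the flip that immediately follows every run of five tails.
after_tail_runs = [flips[i] for i in range(5, len(flips))
                   if not any(flips[i - 5:i])]
print(sum(after_tail_runs) / len(after_tail_runs))   # close to 0.5, not higher
```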

Alternatives

A highly controversial issue is whether one can replace the use of probability in decision theory with something else.

Probability theory

Advocates for the use of probability theory point to:

  • the work of Richard Threlkeld Cox for justification of the probability axioms,
  • the Dutch book paradoxes of Bruno de Finetti as illustrative of the theoretical difficulties that can arise from departures from the probability axioms (a worked illustration follows this list), and
  • the complete class theorems, which show that all admissible decision rules are equivalent to the Bayesian decision rule for some utility function and some prior distribution (or for the limit of a sequence of prior distributions). Thus, for every decision rule, either the rule may be reformulated as a Bayesian procedure (or a limit of a sequence of such), or there is a rule that is sometimes better and never worse.
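
As a worked illustration of the Dutch book point above (the credences and the stake are invented for this sketch): an agent whose degrees of belief in an event and its complement sum to more than one will accept two bets, each priced as fair by those credences, that together guarantee a loss.

```python
# The agent assigns P(A) = 0.6 and P(not A) = 0.6 -- incoherent, since they sum to 1.2.
p_A, p_not_A = 0.6, 0.6
stake = 1.0   # each bet pays `stake` if the event it backs occurs

# The agent regards price = credence * stake as fair, so buys both bets.
total_cost = (p_A + p_not_A) * stake    # 1.20

# Exactly one of the two bets pays out, whichever way A turns out.
payoff_if_A     = stake - total_cost    # -0.20
payoff_if_not_A = stake - total_cost    # -0.20
print(payoff_if_A, payoff_if_not_A)     # a guaranteed loss: the Dutch book
```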

Alternatives to probability theory

The proponents of fuzzy logic, possibility theory, Dempster–Shafer theory, and info-gap decision theory maintain that probability is only one of many alternatives and point to many examples where non-standard alternatives have been implemented with apparent success. Notably, probabilistic decision theory can sometimes be sensitive to assumptions about the probabilities of various events, whereas non-probabilistic rules, such as minimax, are robust in that they do not make such assumptions.
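
The sensitivity point can be seen on a small made-up loss table: a modest change in the assumed probability of the bad event flips the expected-loss recommendation, while the minimax choice never consults the probabilities at all. Every number in the Python sketch below is an assumption for illustration.

```python
# Losses for two actions under two possible events (all values invented for illustration).
loss = {"insure":    {"no_flood": 10.0, "flood": 10.0},
        "uninsured": {"no_flood": 0.0,  "flood": 100.0}}

def expected_loss(action, p_flood):
    return (1 - p_flood) * loss[action]["no_flood"] + p_flood * loss[action]["flood"]

for p in (0.05, 0.15):   # a small shift in the assumed flood probability...
    best = min(loss, key=lambda a: expected_loss(a, p))
    print(p, best)       # ...flips the recommendation (uninsured, then insure)

# Minimax compares only worst cases, so probabilities never enter.
minimax_choice = min(loss, key=lambda a: max(loss[a].values()))
print("minimax:", minimax_choice)   # 'insure' (worst case 10 vs 100)
```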

Ludic fallacy

A general criticism of decision theory based on a fixed universe of possibilities is that it considers the "known unknowns", not the "unknown unknowns":[21] it focuses on expected variations, not on unforeseen events, which some argue have outsized impact and must be considered – significant events may be "outside the model". This line of argument, called the ludic fallacy, is that there are inevitable imperfections in modeling the real world by particular models, and that unquestioning reliance on models blinds one to their limits.

References

  1. ^ "Decision theory Definition and meaning". Dictionary.com. Retrieved 2022-04-02.
  2. ^ Hansson, Sven Ove (2005). "Decision theory: A brief introduction". Section 1.2: A truly interdisciplinary subject.
  3. ^ a b MacCrimmon, Kenneth R. (1968). "Descriptive and normative implications of the decision-theory postulates". Risk and Uncertainty. London: Palgrave Macmillan. pp. 3–32. OCLC 231114.
  4. ^ a b Slovic, Paul; Fischhoff, Baruch; Lichtenstein, Sarah (1977). "Behavioral Decision Theory". Annual Review of Psychology. 28 (1): 1–39. doi:10.1146/annurev.ps.28.020177.000245. hdl:1794/22385.
  5. ^ For instance, see: Anand, Paul (1993). Foundations of Rational Choice Under Risk. Oxford: Oxford University Press. ISBN 0-19-823303-5.
  6. ^ Keren GB, Wagenaar WA (1985). "On the psychology of playing blackjack: Normative and descriptive considerations with implications for decision theory". Journal of Experimental Psychology: General. 114 (2): 133–158. doi:10.1037/0096-3445.114.2.133.
  7. ^ For a review see Schoemaker, P. J. (1982). "The Expected Utility Model: Its Variants, Purposes, Evidence and Limitations". Journal of Economic Literature. 20 (2): 529–563. JSTOR 2724488.
  8. ^ Wald, Abraham (1939). "Contributions to the Theory of Statistical Estimation and Testing Hypotheses". Annals of Mathematical Statistics. 10 (4): 299–326. doi:10.1214/aoms/1177732144. MR 0000932.
  9. ^ Lehmann EL (1950). "Some Principles of the Theory of Testing Hypotheses". Annals of Mathematical Statistics. 21 (1): 1–26. doi:10.1214/aoms/1177729884. JSTOR 2236552.
  10. ^ Neumann Jv, Morgenstern O (1953) [1944]. Theory of Games and Economic Behavior (third ed.). Princeton, NJ: Princeton University Press.
  11. ^ Allais, M.; Hagen, G. M. (2013). Expected Utility Hypotheses and the Allais Paradox: Contemporary Discussions of the Decisions Under Uncertainty with Allais' Rejoinder. Dordrecht: Springer Science & Business Media. p. 333. ISBN 9789048183548.
  12. ^ Morvan, Camille; Jenkins, William J. (2017). Judgment Under Uncertainty: Heuristics and Biases. London: Macat International Ltd. p. 13. ISBN 9781912303687.
  13. ^ Karwan, Mark; Spronk, Jaap; Wallenius, Jyrki (2012). Essays In Decision Making: A Volume in Honour of Stanley Zionts. Berlin: Springer Science & Business Media. p. 135. ISBN 9783642644993.
  14. ^ Hess, Thomas M.; Strough, JoNell; Löckenhoff, Corinna (2015). Aging and Decision Making: Empirical and Applied Perspectives. London: Elsevier. p. 21. ISBN 9780124171558.
  15. ^ Crozier, M. & Friedberg, E. (1995). "Organization and Collective Action. Our Contribution to Organizational Analysis" in Bacharach S.B, Gagliardi P. & Mundell P. (Eds). Research in the Sociology of Organizations. Vol. XIII, Special Issue on European Perspectives of Organizational Theory, Greenwich, CT: JAI Press.
  16. ^ Bobadilla-Suarez S, Love BC (January 2018). "Fast or frugal, but not both: Decision heuristics under time pressure" (PDF). Journal of Experimental Psychology: Learning, Memory, and Cognition. 44 (1): 24–33. doi:10.1037/xlm0000419. PMC 5708146. PMID 28557503.
  17. ^ Johnson EJ, Payne JW (April 1985). "Effort and Accuracy in Choice". Management Science. 31 (4): 395–414. doi:10.1287/mnsc.31.4.395.
  18. ^ Roe RM, Busemeyer JR, Townsend JT (2001). "Multialternative decision field theory: A dynamic connectionist model of decision making". Psychological Review. 108 (2): 370–392. doi:10.1037/0033-295X.108.2.370. PMID 11381834.
  19. ^ Xu J, Harvey N (May 2014). "Carry on winning: the gamblers' fallacy creates hot hand effects in online gambling". Cognition. 131 (2): 173–80. doi:10.1016/j.cognition.2014.01.002. PMID 24549140.
  20. ^ Chuang SC, Kao DT, Cheng YH, Chou CA (March 2012). "The effect of incomplete information on the compromise effect". Judgment and Decision Making. 7 (2): 196–206. CiteSeerX 10.1.1.419.4767. doi:10.1017/S193029750000303X. S2CID 9432630.
  21. ^ Feduzi, A. (2014). "Uncovering unknown unknowns: Towards a Baconian approach to management decision-making". Decision Processes. 124 (2): 268–283.
