A quantum geometric model of similarity
No other study has had as great an impact on the development of the similarity literature as that of Tversky (1977), which provided compelling demonstrations against all the fundamental assumptions of the popular, and extensively employed, geometric similarity models. Notably, similarity judgments were shown to violate symmetry and the triangle inequality, and to be subject to context effects, so that the same pair of items would be rated differently depending on the presence of other items. Quantum theory provides a generalized geometric approach to similarity and can address several of Tversky's (1977) main findings. Similarity is modeled as quantum probability, so that asymmetries emerge as order effects, and the triangle inequality violations and the diagnosticity effect can be related to the context-dependent properties of quantum probability. We thus demonstrate the promise of the quantum approach for similarity and discuss the implications for representation theory in general.
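The order-effect account of asymmetry can be illustrated with a minimal sketch. This is a hypothetical two-dimensional example with rank-1 projectors and angles chosen purely for illustration (the published model uses subspaces of possibly different dimensionality); the similarity of A to B is taken as the probability of sequentially projecting a neutral state onto A, then onto B:

```python
import numpy as np

def unit(theta):
    # unit vector at angle theta (radians) in a 2-D real Hilbert space
    return np.array([np.cos(theta), np.sin(theta)])

def projector(v):
    # rank-1 projector onto the ray spanned by unit vector v
    return np.outer(v, v)

# Hypothetical angles, chosen only for illustration
psi = unit(0.0)                        # initial (neutral) cognitive state
P_A = projector(unit(np.deg2rad(10)))  # concept A
P_B = projector(unit(np.deg2rad(70)))  # concept B

sim_A_to_B = np.linalg.norm(P_B @ P_A @ psi) ** 2  # project onto A, then B
sim_B_to_A = np.linalg.norm(P_A @ P_B @ psi) ** 2  # project onto B, then A

print(sim_A_to_B > sim_B_to_A)  # True: projections do not commute
```

Because the two projection orders pass through different intermediate states, the two "similarities" differ, mirroring Tversky's asymmetric judgments.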
A quantum theoretical explanation for probability judgment errors
A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction, disjunction, inverse, and conditional fallacies, as well as unpacking effects and partitioning effects. Quantum probability theory is a general and coherent theory based on a set of (von Neumann) axioms which relax some of the constraints underlying classic (Kolmogorov) probability theory. The quantum model is compared and contrasted with other competing explanations for these judgment errors including the representativeness heuristic, the averaging model, and a memory retrieval model for probability judgments. The quantum model also provides ways to extend Bayesian, fuzzy set, and fuzzy trace theories. We conclude that quantum information processing principles provide a viable and promising new way to understand human judgment and reasoning.
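A minimal sketch of how a quantum model produces a conjunction fallacy: in a Linda-style setup, judging "feminist and bank teller" is modeled as a sequential projection through the likely conjunct first. The angles below are hypothetical and chosen only for illustration, not taken from the paper:

```python
import numpy as np

def unit(theta):
    # unit vector at angle theta (radians) in a 2-D real Hilbert space
    return np.array([np.cos(theta), np.sin(theta)])

def projector(v):
    return np.outer(v, v)

# Hypothetical Linda-style setup (illustrative angles only)
psi = unit(0.0)                        # belief state after the vignette
P_F = projector(unit(np.deg2rad(40)))  # "feminist" (the likely conjunct)
P_B = projector(unit(np.deg2rad(85)))  # "bank teller" (the unlikely event)

p_bank = np.linalg.norm(P_B @ psi) ** 2        # judged P(bank teller)
p_conj = np.linalg.norm(P_B @ P_F @ psi) ** 2  # judged P(feminist and bank teller)

print(p_conj > p_bank)  # True: the conjunction is judged more likely
```

The intermediate projection onto the likely conjunct rotates the state closer to "bank teller", so the sequential probability exceeds the direct one, violating the classical rule P(A and B) <= P(B).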
The dynamics of decision making when probabilities are vaguely specified
Consider a multi-trial game with the goal of maximizing a quantity Q(N). At each trial N, the player doubles the accumulated quantity, unless the trial number is Y, in which case all is lost and the game ends. The expected quantity for the next trial favors continuing play as long as the probability that the next trial is Y is less than one half. Y is vaguely specified (e.g., someone is asked to fill a sheet of paper with digits, which are then permuted to produce Y). Conditional on reaching trial N, we argue that the probability that the next trial is Y is extremely small (much less than one half), and that this holds for any N. Thus, single-trial reasoning recommends that one should always play, but this guarantees eventual ruin in the game. It is necessary to stop, but how can a decision to stop on trial N be justified, and how can N be chosen? The paradox, and the conflict between what seem to be two equally plausible lines of reasoning, are caused by the vagueness in the specification of the critical trial Y. Many everyday reasoning situations involve analogous vagueness in specifying probabilities, values, and/or alternatives, whether in the context of sequential decisions or single decisions. We present a computational scheme for addressing the problem of vagueness in the above game, based on quantum probability theory. The key aspect of our proposal is the idea that the range of stopping rules can be represented as a superposition state, in which the player cannot be assumed to believe in any specific stopping rule. This scheme reveals certain interesting properties regarding the dynamics of when to stop playing.
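The single-trial argument is just the expected value E = 2Q(1 - p), which exceeds Q whenever p < 1/2. The tension with eventual ruin can be made concrete under a toy assumption the paper does not make: suppose Y were known to be uniform on {1, ..., M}. Then a best finite stopping trial exists even though single-trial reasoning keeps saying "continue":

```python
def p_ruin_next(n, M):
    # P(next trial is Y | survived the first n trials), Y uniform on {1..M}
    return 1.0 / (M - n)

def expected_payoff(k, M):
    # stop after trial k: payoff 2**k is kept iff Y > k
    return 2 ** k * (M - k) / M

M = 20
# single-trial reasoning says "continue" whenever P(ruin next) < 1/2 ...
assert p_ruin_next(5, M) < 0.5
# ... yet always playing is certain ruin, and the best fixed stop is finite
best_k = max(range(M), key=lambda k: expected_payoff(k, M))
print(best_k)  # 18 (ties with 19 for this M)
```

The uniform-Y assumption is ours, purely for illustration; the paper's point is precisely that a vaguely specified Y supports no such prior, which motivates its quantum superposition over stopping rules instead.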
The conjunction fallacy, confirmation, and quantum theory: comment on Tentori, Crupi, & Russo
The conjunction fallacy refers to situations in which a person judges a conjunction to be more likely than one of the individual conjuncts, which is a violation of a key property of classical probability theory. Recently, quantum probability theory has been proposed as a coherent account of these and many other findings on probability judgment 'errors' that violate classical probability rules, including the conjunction fallacy. Tentori, Crupi, and Russo (2013) present an alternative account of the conjunction fallacy based on the concept of inductive confirmation. They present new empirical findings consistent with their account, and they also claim that these results are inconsistent with the quantum probability theory account. This comment proves that our quantum probability model for the conjunction fallacy is completely consistent with the main empirical results from Tentori et al. (2013). Furthermore, we discuss experimental tests that can distinguish the two alternative accounts.
Social Projection and a Quantum Approach for Behavior in Prisoner's Dilemma
Quantum probability updating from zero priors (by-passing Cromwell's rule)
Cromwell's rule (also known as the zero priors paradox) refers to the constraint of classical probability theory that if one assigns a prior probability of 0 or 1 to a hypothesis, then the posterior has to be 0 or 1 as well (this is a straightforward implication of how Bayes' rule works). Relatedly, hypotheses with a very low prior cannot be updated to a very high posterior without a tremendous amount of new evidence to support them (or to make other possibilities highly improbable). Cromwell's rule appears at odds with our intuition of how humans update probabilities. In this work, we report two simple decision-making experiments which seem to be inconsistent with Cromwell's rule. Quantum probability theory, the rules for how to assign probabilities from the mathematical formalism of quantum mechanics, provides an alternative framework for probabilistic inference. An advantage of quantum probability theory is that it is not subject to Cromwell's rule and can accommodate changes from zero or very small priors to significant posteriors. We outline a model of decision making, based on quantum theory, which can accommodate the changes from priors to posteriors observed in our experiments.
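The contrast can be sketched in a few lines (an illustration under our own toy assumptions, not the paper's model): classical Bayesian updating can never raise a zero prior, whereas a quantum state with zero amplitude on a hypothesis can acquire probability under a unitary, evidence-driven rotation:

```python
import numpy as np

def bayes_update(prior, likelihood):
    # classical Bayes: posterior proportional to prior * likelihood
    unnorm = prior * likelihood
    return unnorm / unnorm.sum()

prior = np.array([0.0, 1.0])       # hypothesis H0 assigned a zero prior
likelihood = np.array([0.9, 0.1])  # evidence strongly favoring H0
posterior = bayes_update(prior, likelihood)
print(posterior[0])  # 0.0 -- Cromwell's rule: a zero prior stays zero

# Quantum updating: rotate the state vector instead of multiplying priors
theta = np.deg2rad(60)             # illustrative rotation driven by evidence
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
psi = np.array([0.0, 1.0])         # zero amplitude on H0
p_H0 = (U @ psi)[0] ** 2           # ~0.75: significant posterior from zero
```

The rotation angle here is arbitrary; the substantive point is structural: multiplication by a likelihood preserves zeros, while unitary evolution need not.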
Bridging the gap between subjective probability and probability judgments: the Quantum Sequential Sampler
One of the most important challenges in decision theory has been how to reconcile the normative expectations from Bayesian theory with the apparent fallacies that are common in probabilistic reasoning. Recently, Bayesian models have been driven by the insight that apparent fallacies are due to sampling errors or biases in estimating (Bayesian) probabilities. An alternative way to explain apparent fallacies is by invoking different probability rules, specifically the probability rules of quantum theory. Arguably, quantum cognitive models offer a more unified explanation for a large body of findings that are problematic from a baseline classical perspective. This work addresses two major corresponding theoretical challenges: first, a framework is needed which incorporates both Bayesian and quantum influences, recognizing the fact that there is evidence for both in human behavior; second, there is empirical evidence which goes beyond any current Bayesian or quantum model. We develop a model for probabilistic reasoning that seamlessly integrates Bayesian and quantum models of reasoning, augmented by a sequential sampling process which maps subjective probabilistic estimates to observable responses. Our model, called the Quantum Sequential Sampler, is compared to the currently leading Bayesian model, the Bayesian Sampler (Zhu, Sanborn, & Chater, 2020), using a new experiment producing one of the largest datasets in probabilistic reasoning to date. The Quantum Sequential Sampler embodies several new components, which we argue offer a more theoretically accurate approach to probabilistic reasoning. Moreover, our empirical tests revealed a new, surprising systematic overestimation of probabilities.
The rational status of quantum cognition
Classical probability theory (CPT) is generally considered the rational way to make inferences, but there have been empirical findings showing a divergence between reasoning and CPT principles, inviting the conclusion that humans are irrational. Perhaps the most famous of these findings is the conjunction fallacy (CF). Recently, the CF has been shown to be consistent with the principles of an alternative probabilistic framework, quantum probability theory (QPT). Does this imply that QPT is irrational, or does QPT provide an alternative interpretation of rationality? Our presentation consists of three parts. First, we examine the putative rational status of QPT using the same argument as is used to establish the rationality of CPT, the Dutch Book (DB) argument, according to which reasoners should not commit to bets guaranteeing a loss. We prove the rational status of QPT by formulating it as a particular case of an extended form of CPT, with separate probability spaces produced by changing context. Second, we empirically examine the key requirement for whether a CF can be rational or not; the results show that participants indeed behave rationally, at least relative to the representations they employ. Finally, we consider whether the conditions for the CF to be rational are applicable in the outside (non-mental) world. Our discussion provides a general and alternative perspective for rational probabilistic inference, based on the idea that contextuality requires either reasoning in separate CPT probability spaces or reasoning with QPT principles.
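The Dutch Book argument can be made concrete with a toy enumeration (our illustration, with made-up prices): an agent who prices a conjunction above one of its conjuncts, as in the CF, accepts a pair of bets that loses money in every possible world:

```python
from itertools import product

price_B  = 0.30   # agent's price for a $1 bet on B
price_AB = 0.40   # incoherent: conjunction priced above the conjunct B

def agent_payoff(a, b):
    # a bookie sells the agent the (A and B) bet and buys the B bet
    buy_conjunction = -price_AB + (1.0 if (a and b) else 0.0)
    sell_conjunct   = +price_B  - (1.0 if b else 0.0)
    return buy_conjunction + sell_conjunct

payoffs = [agent_payoff(a, b) for a, b in product([False, True], repeat=2)]
print(all(p < 0 for p in payoffs))  # True: a sure loss in every world
```

Since (A and B) implies B, the bought bet can never pay out more than the sold one, so the price gap of 0.10 is a guaranteed loss; the paper's contribution is to ask whether such books can still be made when the two judgments arise in different contexts, i.e., in separate probability spaces.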
Modeling Concept Combinations in a Quantum-theoretic Framework
We present modeling for conceptual combinations which uses the mathematical formalism of quantum theory. Our model faithfully describes a large amount of experimental data collected by different scholars on concept conjunctions and disjunctions. Furthermore, our approach sheds new light on long-standing drawbacks connected with the vagueness, or fuzziness, of concepts, and puts forward a completely novel possible solution to the 'combination problem' in concept theory. Additionally, we introduce an explanation for the occurrence of quantum structures in the mechanisms and dynamics of concepts and, more generally, in cognitive and decision processes, according to which human thought is a well-structured superposition of a 'logical thought' and a 'conceptual thought', with the latter usually prevailing over the former, at variance with some widespread beliefs.