
    Contextuality-by-Default: A Brief Overview of Ideas, Concepts, and Terminology

    This paper is a brief overview of the concepts involved in measuring the degree of contextuality and detecting contextuality in systems of binary measurements of a finite number of objects. We discuss and clarify the main concepts and terminology of the theory called "contextuality-by-default," and then discuss a possible generalization of the theory from binary to arbitrary measurements.
    Comment: Lecture Notes in Computer Science 9535 (with the corrected list of authors) (2016)

    A Qualified Kolmogorovian Account of Probabilistic Contextuality

    We describe a mathematical language for determining all possible patterns of contextuality in the dependence of stochastic outputs of a system on its deterministic inputs. The central notion is that of all possible couplings for stochastically unrelated outputs indexed by mutually incompatible values of inputs. A system is characterized by a pattern of which outputs can be "directly influenced" by which inputs (a primitive relation, hypothetical or normative), and by certain constraints imposed on the outputs (such as Bell-type inequalities or their quantum analogues). The set of couplings compatible with these constraints represents a form of contextuality in the dependence of outputs on inputs with respect to the declared pattern of direct influences.
    Comment: Lecture Notes in Computer Science 8369, 201-212 (2014)

    Contextuality and noncontextuality measures and generalized Bell inequalities for cyclic systems

    Cyclic systems of dichotomous random variables have played a prominent role in contextuality research, describing such experimental paradigms as the Klyachko-Can-Binicioglu-Shumovsky, Einstein-Podolsky-Rosen-Bell, and Leggett-Garg ones in physics, as well as conjoint binary choices in human decision making. Here, we understand contextuality within the framework of the Contextuality-by-Default (CbD) theory, based on the notion of probabilistic couplings satisfying certain constraints. CbD allows us to drop the commonly made assumption that systems of random variables are consistently connected (i.e., it allows for all possible forms of "disturbance" or "signaling" in them). Consistently connected systems constitute a special case in which CbD essentially reduces to the conventional understanding of contextuality. We present a theoretical analysis of the degree of contextuality in cyclic systems (if they are contextual) and the degree of noncontextuality in them (if they are not). By contrast, all previously proposed measures of contextuality are confined to consistently connected systems, and most of them cannot be extended to measures of noncontextuality. Our measures of (non)contextuality are defined by the L1-distance between a point representing a cyclic system and the surface of the polytope representing all possible noncontextual cyclic systems with the same single-variable marginals. We completely characterize this polytope, as well as the polytope of all possible probabilistic couplings for cyclic systems with given single-variable marginals. We establish that, in relation to the maximally tight Bell-type CbD inequality for (generally, inconsistently connected) cyclic systems, the measure of contextuality is proportional to the absolute value of the difference between its two sides. For noncontextual cyclic systems, the measure of noncontextuality is shown to be proportional to the smaller of the same difference and the L1-distance to the surface of the box circumscribing the noncontextuality polytope. These simple relations, however, do not generally hold beyond the class of cyclic systems, and noncontextuality of a system does not follow from noncontextuality of its cyclic subsystems.
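    The Bell-type CbD inequality mentioned in this abstract admits a compact numerical sketch. In the CbD literature, a cyclic system of rank n with within-context product expectations c_k and total inconsistency of single-variable marginals Delta is contextual precisely when s_odd(c) exceeds n - 2 + Delta, where s_odd is the maximum of sum(±c_k) over sign assignments with an odd number of minuses. The fragment below (function names and interface are illustrative, not the authors' code) computes the difference between the two sides of that inequality:

    ```python
    def s_odd(xs):
        """Max of sum(s_k * x_k) over sign choices s_k = +/-1 with an odd
        number of minus signs."""
        total = sum(abs(x) for x in xs)
        n_negative = sum(1 for x in xs if x < 0)
        if n_negative % 2 == 1:
            return total  # put minuses exactly on the negative entries
        return total - 2 * min(abs(x) for x in xs)  # flip the cheapest term

    def cbd_cyclic_difference(products, delta=0.0):
        """Difference between the two sides of the CbD inequality for a
        cyclic system of rank n = len(products); a positive value indicates
        contextuality, and per the abstract the contextuality measure is
        proportional to its absolute value."""
        n = len(products)
        return s_odd(products) - (n - 2) - delta

    # A PR-box-like rank-4 (CHSH-paradigm) system is maximally contextual:
    # cbd_cyclic_difference([1, 1, 1, -1]) -> 2
    # A deterministic classical system sits exactly on the boundary:
    # cbd_cyclic_difference([1, 1, 1, 1]) -> 0
    ```

    A nonzero `delta` models inconsistent connectedness ("signaling"); with `delta = 0` the criterion reduces to the conventional Bell-type bound for consistently connected systems, as the abstract notes.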

    A New Approach for Assessment of Mental Architecture: Repeated Tagging

    A new approach to the study of a relatively neglected property of mental architecture, namely whether and when the already-processed elements are separated from the to-be-processed elements, is proposed. The process of numerical proportion discrimination between two sets of elements defined either by color or by orientation can be described as sampling with or without replacement (characterized by binomial or hypergeometric probability distributions, respectively), depending on whether an element can be tagged once or repeatedly. All empirical psychometric functions were approximated by a theoretical model showing that the ability to keep track of the already tagged elements is not an inflexible part of the mental architecture but rather an individually variable strategy which also depends on the conspicuity of perceptual attributes. Strong evidence is provided that in a considerable number of trials observers tagged the same element repeatedly, which can only be done serially, at two separate time moments.
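    The with/without-replacement distinction in this abstract maps onto two standard probability mass functions. A minimal sketch (function names and the example numbers are illustrative, not taken from the paper):

    ```python
    from math import comb

    def binomial_pmf(k, n, K, N):
        """P(k target elements in n draws) WITH replacement: the same
        element can be tagged repeatedly, so the target fraction K/N
        stays constant across draws."""
        p = K / N
        return comb(n, k) * p**k * (1 - p)**(n - k)

    def hypergeometric_pmf(k, n, K, N):
        """P(k target elements in n draws) WITHOUT replacement:
        already-tagged elements are excluded from further sampling."""
        return comb(K, k) * comb(N - K, n - k) / comb(N, n)

    # Example: N = 20 elements, K = 10 of the target color, n = 10 draws.
    # Without replacement the count concentrates more tightly around
    # n * K / N, which is what makes the two strategies distinguishable
    # in psychometric data.
    ```

    Fitting which of the two pmfs better predicts a subject's proportion-discrimination performance is, in effect, a test of whether that subject tags elements once or repeatedly.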

    Decision Making for Inconsistent Expert Judgments Using Negative Probabilities

    In this paper we provide a simple random-variable example of inconsistent information, and analyze it using three different approaches: Bayesian, quantum-like, and negative probabilities. We then show that, at least for this particular example, both the Bayesian and the quantum-like approaches have less normative power than the negative probabilities one.
    Comment: 14 pages, revised version to appear in the Proceedings of the QI2013 (Quantum Interactions) conference
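    As a toy illustration of how a signed ("negative") probability distribution can accommodate classically inconsistent information, consider three +/-1 variables whose pairwise correlations are all required to equal -1; no ordinary joint distribution can realize this, since X = -Y and Y = -Z force X = Z. This example is mine, not the one in the paper, and is only a sketch of the general idea:

    ```python
    from itertools import product

    def quasi_joint(x, y, z):
        """Signed 'joint' for three +/-1 variables with zero means and all
        three pairwise expectations equal to -1. It reproduces the required
        (inconsistent) marginals at the price of negative masses."""
        return (1 - x*y - y*z - x*z) / 8

    outcomes = list(product([-1, 1], repeat=3))
    total = sum(quasi_joint(*o) for o in outcomes)                 # masses sum to 1
    exy = sum(x * y * quasi_joint(x, y, z) for x, y, z in outcomes)  # equals -1
    negatives = [o for o in outcomes if quasi_joint(*o) < 0]
    # The all-equal outcomes (1,1,1) and (-1,-1,-1) carry mass -1/4 each.
    ```

    In the negative-probabilities approach, one typically selects among such signed joints the one minimizing total negative mass, and uses that minimum as a normative gauge of the inconsistency.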

    What is Quantum? Unifying Its Micro-Physical and Structural Appearance

    We can recognize two modes in which 'quantum appears' in macro domains: (i) a 'micro-physical appearance', where quantum laws are assumed to be universal and they are transferred from the micro to the macro level if suitable 'quantum coherence' conditions (e.g., very low temperatures) are realized, (ii) a 'structural appearance', where no hypothesis is made on the validity of quantum laws at a micro level, while genuine quantum aspects are detected at a structural-modeling level. In this paper, we inquire into the connections between the two appearances. We put forward the explanatory hypothesis that 'the appearance of quantum in both cases' is due to 'the existence of a specific form of organisation, which has the capacity to cope with random perturbations that would destroy this organisation when not coped with'. We analyse how 'organisation of matter', 'organisation of life', and 'organisation of culture' play this role each in their specific domain of application, point out the importance of evolution in this respect, and put forward how our analysis sheds new light on 'what quantum is'.
    Comment: 10 pages

    Integration across time determines path deviation discrimination for moving objects.

    Background: Human vision is vital in determining our interaction with the outside world. In this study we characterize our ability to judge changes in the direction of motion of objects, a common task which can allow us either to intercept moving objects, or else avoid them if they pose a threat.

    Methodology/Principal Findings: Observers were presented with objects which moved across a computer monitor on a linear path until the midline, at which point they changed their direction of motion, and observers were required to judge the direction of change. In keeping with the variety of objects we encounter in the real world, we varied characteristics of the moving stimuli such as velocity, extent of motion path, and object size. Furthermore, we compared performance for moving objects with the ability of observers to detect a deviation in a line which formed the static trace of the motion path, since it has been suggested that a form of static memory trace may form the basis for these types of judgment. The static line judgments were well described by a 'scale invariant' model in which any two stimuli that possess the same two-dimensional geometry (length/width) result in the same level of performance. Performance for the moving objects was entirely different: irrespective of path length, object size, or velocity of motion, path deviation thresholds depended simply upon the duration of the motion path in seconds.

    Conclusions/Significance: Human vision has long been known to integrate information across space in order to solve spatial tasks such as judgment of orientation or position. Here we demonstrate an intriguing mechanism which integrates direction information across time in order to optimize the judgment of path deviation for moving objects.

    Funding: Wellcome Trust, Leverhulme Trust, NI