
    Noncontextuality with Marginal Selectivity in Reconstructing Mental Architectures

    We present a general theory of series-parallel mental architectures with selectively influenced, stochastically non-independent components. A mental architecture is a hypothetical network of processes aimed at performing a task, of which we only observe the overall time it takes under variable parameters of the task. It is usually assumed that the network contains several processes selectively influenced by different experimental factors, and the question is then how these processes are arranged within the network, e.g., whether they are concurrent or sequential. One way of answering this question is to consider the distribution functions for the overall processing time and compute certain linear combinations thereof (interaction contrasts). The theory of selective influences in psychology can be viewed as a special application of the interdisciplinary theory of (non)contextuality, which has its origins and main applications in quantum theory. In particular, lack of contextuality is equivalent to the existence of a "hidden" random entity of which all the random variables in play are functions. Consequently, for any given value of this common random entity, the processing times and their compositions (minima, maxima, or sums) become deterministic quantities. These quantities, in turn, can be treated as random variables with (shifted) Heaviside distribution functions, for which one can easily compute various linear combinations across different treatments, including interaction contrasts. This mathematical fact leads to a simple method, more general than the previously used ones, to investigate and characterize the interaction contrast for different types of series-parallel architectures. Comment: published in Frontiers in Psychology: Cognition 1:12, doi: 10.3389/fpsyg.2015.00735 (special issue "Quantum Structures in Cognitive and Social Science").
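    To make the Heaviside-distribution device concrete, the sketch below (Python; the component times, factor levels, and function names are hypothetical choices of ours, not taken from the paper) computes the interaction contrast of distribution functions for a serial (sum) and a parallel exhaustive (max) composition at one fixed value of the common hidden entity, where each overall time is deterministic and its distribution function is a shifted Heaviside step.

        import numpy as np

        def heaviside_cdf(t, v):
            # Distribution function of a deterministic processing time v: a Heaviside step shifted to v.
            return np.where(t >= v, 1.0, 0.0)

        def interaction_contrast(t, overall):
            # Pointwise contrast F11(t) - F12(t) - F21(t) + F22(t), with `overall` mapping
            # factor-level pairs (i, j) to the deterministic overall processing time.
            return (heaviside_cdf(t, overall[(1, 1)]) - heaviside_cdf(t, overall[(1, 2)])
                    - heaviside_cdf(t, overall[(2, 1)]) + heaviside_cdf(t, overall[(2, 2)]))

        # Hypothetical component times for one fixed value of the hidden random entity:
        # component a is selectively influenced by factor alpha, component b by factor beta.
        a = {1: 3.0, 2: 5.0}
        b = {1: 4.0, 2: 6.0}

        serial   = {(i, j): a[i] + b[j] for i in (1, 2) for j in (1, 2)}      # sequential: sum
        parallel = {(i, j): max(a[i], b[j]) for i in (1, 2) for j in (1, 2)}  # concurrent, exhaustive: max

        t = np.linspace(0.0, 15.0, 3001)
        dt = t[1] - t[0]
        print("serial contrast integral   ~", interaction_contrast(t, serial).sum() * dt)
        print("parallel contrast integral ~", interaction_contrast(t, parallel).sum() * dt)

    In this toy run the serial (sum) contrast integrates to approximately zero for any fixed value of the hidden variable; since the observable distribution functions are averages of such Heaviside functions over that variable and the contrast is linear in them, properties of this kind carry over to the observable interaction contrasts, which is the sort of signature the method exploits.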

    A Bayesian Account of Quantum Histories

    We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach describes an ordered series of measurements in terms of history propositions with non-additive 'probabilities'. The non-standard approach defines multi-time measurements as sets of exclusive and exhaustive history propositions, recovering the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely the linearly positive histories originally introduced by Goldstein and Page. We argue that this lends a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory. Comment: 24 pages, accepted for publication in Annals of Physics, minor corrections.
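    For reference, the linear positivity condition of Goldstein and Page that the abstract appeals to can be stated as follows (the notation is ours, assuming Heisenberg-picture projectors P^{k}_{\alpha_k}(t_k) at times t_1 < ... < t_n and an initial state \rho). A history \alpha = (\alpha_1, ..., \alpha_n) is represented by the class operator

        C_\alpha = P^{n}_{\alpha_n}(t_n) \cdots P^{1}_{\alpha_1}(t_1),

    and is assigned the candidate probability

        p(\alpha) = \operatorname{Re}\,\operatorname{Tr}\!\left( C_\alpha\, \rho \right),

    the set of histories being linearly positive when p(\alpha) \ge 0 for every \alpha. Because \sum_\alpha C_\alpha = \mathbb{1}, these candidate probabilities are additive over coarse-grainings and sum to one, which is what makes the Bayesian reading discussed above a natural candidate for this class.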

    Almost quantum correlations

    Quantum theory is not only successfully tested in laboratories every day but also constitutes a robust theoretical framework: small variations usually lead to implausible consequences, such as faster-than-light communication. It has even been argued that quantum theory may be special among possible theories. Here we report that, at the level of correlations among different systems, quantum theory is not so special. We define a set of correlations, dubbed 'almost quantum', and prove that it strictly contains the set of quantum correlations but satisfies all but one of the principles proposed to capture quantum correlations. We present numerical evidence that the remaining principle is satisfied too. © 2015 Macmillan Publishers Limited
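    The almost quantum set is commonly characterized as an intermediate level (often written '1+AB') of the Navascués-Pironio-Acín semidefinite hierarchy, so membership questions reduce to semidefinite programming. As a loosely related illustration only, the sketch below (Python with the cvxpy library; the variable names are ours) bounds the CHSH correlation using the simpler level-1 relaxation, which contains the almost quantum set and hence upper-bounds its CHSH value as well.

        import cvxpy as cp

        # Level-1 moment matrix for two parties with two +/-1-valued measurements each,
        # indexed by the operators {1, A0, A1, B0, B1}; unit diagonal since A_x^2 = B_y^2 = 1.
        G = cp.Variable((5, 5), symmetric=True)
        one, A0, A1, B0, B1 = range(5)

        chsh = G[A0, B0] + G[A0, B1] + G[A1, B0] - G[A1, B1]
        prob = cp.Problem(cp.Maximize(chsh), [G >> 0, cp.diag(G) == 1])
        prob.solve()
        print("CHSH upper bound:", prob.value)   # ~ 2*sqrt(2), Tsirelson's bound

    For CHSH the level-1, almost quantum, and quantum bounds all coincide at Tsirelson's value, so this particular inequality does not separate the sets; it only illustrates the kind of semidefinite test involved.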

    Entanglement, intractability and no-signaling

    We consider the problem of deriving the no-signaling condition from the assumption that, as seen from a complexity-theoretic perspective, the universe is not an exponential place. A fact that disallows such a derivation is the existence of polynomial superluminal gates, hypothetical primitive operations that enable superluminal signaling but not the efficient solution of intractable problems. If this assumption is a basic principle of physics, it therefore follows either that it must be supplemented with additional assumptions to prohibit such gates or, improbably, that no-signaling is not a universal condition. Yet a gate of this kind is possibly implicit, though not recognized as such, in a decade-old quantum optical experiment involving position-momentum entangled photons. Here we describe a feasible modified version of the experiment that appears to explicitly demonstrate the action of this gate. Some obvious counter-claims are shown to be invalid. We believe that the unexpected possibility of polynomial superluminal operations arises because some practically measured quantum optical quantities are not describable as standard quantum mechanical observables. Comment: 17 pages, 2 figures (REVTeX 4).

    The Contextualization of Islamic Law Paradigms in the Pandemic Time Covid-19 as the Word of Religious Moderation

    This study investigates and analyzes the adaptive and contextual values of Islamic law during the Covid-19 pandemic in order to arrive at a more dynamic concept of religious moderation at the level of paradigm and thought. The design is library research, drawing on a range of library sources as the research data, combined with a descriptive-normative approach that treats the pandemic as a real phenomenon. Primary and secondary sources are processed using Fatwa No. 14 of 2020 and the principles of fiqhiyyah as the analytical instruments, through a descriptive-analytic content method, and conclusions are drawn by inductive and deductive reasoning. The contextualization in question is a form of embodiment of the value of religious moderation that moves beyond a textual dichotomy: it integrates the moderation paradigm that has so far been concentrated in the washatiyah discourse, or confined to tolerance among religious communities. The reflection of the paradigm of Islamic law during the pandemic is an ijtihad that configures and elaborates normative texts through a progressive mode of interpretation, and that mediates and harmonizes the interrelation and domination of texts over existing realities, so as to produce an actual understanding that reflects the principles of Islamic law as flexible, comprehensive, and beneficial.

    Overcoming the Problem of Induction: Science and Religion as Ways of Knowing
