
    The quantum measurement problem and physical reality: a computation theoretic perspective

    Is the universe computable? If yes, is it computationally a polynomial place? In standard quantum mechanics, which permits infinite parallelism and the infinitely precise specification of states, a negative answer to both questions is not ruled out. On the other hand, empirical evidence suggests that NP-complete problems are intractable in the physical world. Likewise, computational problems known to be algorithmically uncomputable do not seem to be computable by any physical means. We suggest that this close correspondence between the efficiency and power of abstract algorithms on the one hand, and of physical computers on the other, finds a natural explanation if the universe is assumed to be algorithmic; that is, that physical reality is the product of discrete sub-physical information processing equivalent to the actions of a probabilistic Turing machine. This assumption can be reconciled with the observed exponentiality of quantum systems at microscopic scales, and the consequent possibility of implementing Shor's quantum polynomial-time algorithm at that scale, provided the degree of superposition is intrinsically and finitely upper-bounded. If this bound is associated with the quantum-classical divide (the Heisenberg cut), a natural resolution to the quantum measurement problem arises. From this viewpoint, macroscopic classicality is evidence that the universe is in BPP, and both questions raised above receive affirmative answers. A recently proposed computational model of quantum measurement, which relates the Heisenberg cut to the discreteness of Hilbert space, is briefly discussed. A connection to quantum gravity is noted. Our results are compatible with the philosophy that mathematical truths are independent of the laws of physics.
    Comment: Talk presented at "Quantum Computing: Back Action 2006", IIT Kanpur, India, March 2006
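    A minimal sketch may help fix the complexity claim (our illustration, not the paper's model): if the degree of superposition is capped at some finite K, an n-qubit state can be stored as a sparse map from basis labels to amplitudes with at most K entries, so simulating its evolution stays polynomial in n; this is the intuition behind reading macroscopic classicality as evidence that the universe is in BPP. The bound K and the truncate-and-renormalize rule below are assumptions made purely for illustration.

        K = 8  # hypothetical intrinsic bound on the degree of superposition

        def apply_hadamard(state, qubit, k_max=K):
            """Apply a Hadamard gate to one qubit of a sparse state
            {basis_int: amplitude}, then truncate and renormalize to at most
            k_max branches (the assumed finite superposition bound)."""
            s = 2 ** -0.5
            new_state = {}
            for basis, amp in state.items():
                bit = (basis >> qubit) & 1
                b0 = basis & ~(1 << qubit)   # branch with the qubit set to 0
                b1 = basis | (1 << qubit)    # branch with the qubit set to 1
                new_state[b0] = new_state.get(b0, 0.0) + s * amp
                new_state[b1] = new_state.get(b1, 0.0) + (-s if bit else s) * amp
            # Enforce the finite bound: keep the k_max largest branches.
            top = sorted(new_state.items(), key=lambda kv: abs(kv[1]), reverse=True)[:k_max]
            norm = sum(abs(a) ** 2 for _, a in top) ** 0.5
            return {b: a / norm for b, a in top}

        n = 30                 # an unbounded simulation would need 2**30 amplitudes
        state = {0: 1.0}       # start in |00...0>
        for q in range(n):
            state = apply_hadamard(state, q)
        print(len(state))      # <= K: the cost stayed polynomial in n throughout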

    Modeling practical thinking

    Intellectualists about knowledge how argue that knowing how to do something is knowing the content of a proposition (i.e., a fact). An important component of this view is the idea that propositional knowledge is translated into behavior when it is presented to the mind in a peculiarly practical way. Until recently, however, intellectualists have not said much about what it means for propositional knowledge to be entertained under thought's practical guise. Carlotta Pavese fills this gap in the intellectualist view by modeling practical modes of thought after Fregean senses. In this paper, I take up her model and the presuppositions it is built upon, arguing that her view of practical thought is not positioned to account for much of what human agents are able to do.

    The challenges of purely mechanistic models in biology and the minimum need for a 'mechanism-plus-X' framework

    Ever since the advent of molecular biology in the 1970s, mechanistic models have become the dogma in the field, where a "true" understanding of any subject is equated with a mechanistic description. This has been to the detriment of the biomedical sciences, where, barring some exceptions, notable new feats of understanding have arguably not been achieved in normal and disease biology, including neurodegenerative disease and cancer pathobiology. I argue for a "mechanism-plus-X" paradigm, where mainstay elements of mechanistic models such as hierarchy and correlation are combined with nomological principles such as general operative rules and generative principles. Depending on the question at hand and the nature of the inquiry, X could range from proven physical laws to speculative biological generalizations, such as the notional principle of cellular synchrony. I argue that the "mechanism-plus-X" approach should ultimately aim to move biological inquiries out of the deadlock of oft-encountered mechanistic pitfalls and reposition biology to its former capacity of illuminating fundamental truths about the world.

    Can hierarchical predictive coding explain binocular rivalry?

    Hohwy et al.’s (2008) model of binocular rivalry (BR) is taken as a classic illustration of predictive coding’s explanatory power. I revisit the account and show that it cannot explain the role of reward in BR. I then consider a more recent version of Bayesian model averaging, which recasts the role of reward in BR in terms of optimism bias. If we accept this account, however, then we must reconsider our conception of perception. On this latter view, I argue, organisms engage in what amounts to policy-driven, motivated perception.
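    To make the optimism-bias reading concrete, here is a toy sketch (our construction, not Hohwy et al.'s model nor the account under review): in Bayesian model averaging over two rivaling percepts, reward can enter as a tilt on the priors, so one percept comes to dominate even when the sensory evidence is perfectly ambiguous. The beta parameter, the reward values, and all function names are hypothetical.

        import math

        def percept_posterior(likelihoods, rewards, beta=1.0):
            """Reward-weighted Bayesian averaging over candidate percepts.
            beta (hypothetical parameter) scales how strongly expected reward
            tilts the priors, i.e. the 'optimism bias'."""
            priors = [math.exp(beta * r) for r in rewards]
            z = sum(p * l for p, l in zip(priors, likelihoods))
            return [p * l / z for p, l in zip(priors, likelihoods)]

        ambiguous = [0.5, 0.5]  # equal sensory evidence for each rival percept
        print(percept_posterior(ambiguous, rewards=[0.0, 0.0]))  # [0.5, 0.5]: pure rivalry
        print(percept_posterior(ambiguous, rewards=[1.0, 0.0]))  # ~[0.73, 0.27]: rewarded percept dominates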

    Ten reasons why a thermalized system cannot be described by a many-particle wave function

    It is widely believed that the underlying reality behind statistical mechanics is a deterministic and unitary time evolution of a many-particle wave function, even though this is in conflict with the irreversible, stochastic nature of statistical mechanics. The usual attempts to resolve this conflict, for instance by appealing to decoherence or eigenstate thermalization, are riddled with problems. This paper considers the theoretical physics of thermalized systems as it is done in practice and shows that all approaches to thermalized systems presuppose in some form limits to linear superposition and deterministic time evolution. These considerations include, among others, the classical limit, extensivity, the concepts of entropy and equilibrium, and symmetry breaking in phase transitions and quantum measurement. In conclusion, the paper argues that the irreversibility and stochasticity of statistical mechanics should be taken as a true property of nature. It follows that a gas of a macroscopic number N of atoms in thermal equilibrium is best represented by a collection of N wave packets of a size of the order of the thermal de Broglie wavelength, which behave quantum mechanically below this scale but classically sufficiently far beyond it. In particular, these wave packets must localize again after scattering events, which requires stochasticity and indicates a connection to the measurement process.
    Comment: Drastically rewritten version, with more explanations, with three new reasons added and three old ones merged with other parts of the text
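    The scale claim can be checked against the standard textbook formula for the thermal de Broglie wavelength, lambda = h / sqrt(2 pi m kB T) (a general relation, not something derived in the paper): for helium at room temperature it comes out around half an angstrom, far below the mean interatomic spacing of a gas at atmospheric pressure, which is roughly where the quantum-to-classical handover described above would sit.

        import math

        h  = 6.62607015e-34    # Planck constant, J s
        kB = 1.380649e-23      # Boltzmann constant, J/K

        def thermal_de_broglie(mass_kg, temperature_K):
            """lambda = h / sqrt(2 * pi * m * kB * T): the scale below which
            the wave packets behave quantum mechanically."""
            return h / math.sqrt(2 * math.pi * mass_kg * kB * temperature_K)

        m_helium = 6.6464731e-27                  # helium-4 atomic mass, kg
        lam = thermal_de_broglie(m_helium, 300.0)
        print(f"{lam:.2e} m")  # ~5.0e-11 m, versus ~3e-9 m mean spacing at 1 atm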

    Perceptual Consciousness, Short-Term Memory, and Overflow: Replies to Beck, Orlandi and Franklin, and Phillips

    A reply to commentators -- Jake Beck, Nico Orlandi and Aaron Franklin, and Ian Phillips -- on our paper "Does perceptual consciousness overflow cognitive access?"

    Laws, Causation and Dynamics at Different Levels

    I have two main aims. The first is general and more philosophical (Section 2). The second is specific and more closely related to physics (Sections 3 and 4). The first aim is to state my general views about laws and causation at different 'levels'. The main task is to understand how the higher levels sustain notions of law and causation that 'ride free' of reductions to the lower level or levels. I endeavour to relate my views to those of other symposiasts. The second aim is to give a framework for describing dynamics at different levels, emphasising how the various levels' dynamics can mesh or fail to mesh. This framework is essentially that of elementary dynamical systems theory. The main idea will be, for simplicity, to work with just two levels, dubbed 'micro' and 'macro', which are related by coarse-graining. I use this framework to describe, in part, the first four of Ellis' five types of top-down causation.
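    The 'meshing' of levels has a compact formal core that a toy example can exhibit (our illustration of the general dynamical-systems framework, not an example taken from the paper): a macro dynamics G meshes with a micro dynamics f under a coarse-graining pi exactly when pi(f(x)) = G(pi(x)) for every micro state x, i.e. when coarse-graining then evolving agrees with evolving then coarse-graining.

        def f(x):                              # micro dynamics: both counters advance
            x1, x2 = x
            return (x1 + 1, x2 + 1)

        def pi_sum(x):  return x[0] + x[1]     # a coarse-graining that meshes
        def pi_prod(x): return x[0] * x[1]     # a coarse-graining that fails to mesh

        def meshes(pi, G, sample):
            """The levels mesh iff pi(f(x)) == G(pi(x)) for every micro state x."""
            return all(pi(f(x)) == G(pi(x)) for x in sample)

        sample = [(i, j) for i in range(5) for j in range(5)]
        print(meshes(pi_sum, lambda m: m + 2, sample))   # True: autonomous macro law m -> m + 2
        print(meshes(pi_prod, lambda m: m + 2, sample))  # False: (x1+1)*(x2+1) also depends on
                                                         # the sum, which pi_prod discards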

    Perceptual Consciousness and Cognitive Access from the Perspective of Capacity-Unlimited Working Memory

    Theories of consciousness divide over whether perceptual consciousness is rich or sparse in specific representational content and whether it requires cognitive access. These two issues are often treated in tandem because of a shared assumption that the representational capacity of cognitive access is fairly limited. Recent research on working memory challenges this shared assumption. This paper argues that abandoning the assumption undermines post-cue-based “overflow” arguments, according to which perceptual consciousness is rich and does not require cognitive access. Abandoning it also dissociates the rich/sparse debate from the access question. The paper then explores attempts to reformulate overflow theses in ways that do not require the assumption of limited capacity. Finally, it discusses the problem of relating seemingly non-probabilistic perceptual consciousness to the probabilistic representations posited by the models that challenge conceptions of cognitive access as capacity-limited.