
    From statistical proofs of the Kochen-Specker theorem to noise-robust noncontextuality inequalities

    The Kochen-Specker theorem rules out models of quantum theory wherein projective measurements are assigned outcomes deterministically and independently of context. This notion of noncontextuality is not applicable to experimental measurements because these are never free of noise and thus never truly projective. For nonprojective measurements, therefore, one must drop the requirement that an outcome is assigned deterministically in the model and merely require that it is assigned a distribution over outcomes in a manner that is context-independent. By demanding context-independence in the representation of preparations as well, one obtains a generalized principle of noncontextuality that also supports a quantum no-go theorem. Several recent works have shown how to derive inequalities on experimental data which, if violated, demonstrate the impossibility of finding a generalized-noncontextual model of this data. That is, these inequalities do not presume quantum theory and, in particular, they make sense without requiring an operational analogue of the quantum notion of projectiveness. We here describe a technique for deriving such inequalities starting from arbitrary proofs of the Kochen-Specker theorem. It significantly extends previous techniques, which worked only for logical proofs (based on sets of projective measurements that fail to admit of any deterministic noncontextual assignment), to the case of statistical proofs (based on sets of projective measurements that do admit of some deterministic noncontextual assignments, but not enough to explain the quantum statistics). Comment: 14 pages, 4 figures, published version.
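
    A logical proof of the kind the abstract contrasts with statistical proofs can be checked by brute force. The sketch below uses the well-known Mermin-Peres magic square (chosen here as an illustration; the paper itself treats arbitrary proofs): nine two-qubit observables arranged in a 3x3 grid, where each row of commuting observables multiplies to +I and each column to +I except the last, which multiplies to -I. No deterministic noncontextual assignment of +/-1 values can satisfy all six constraints:

    ```python
    from itertools import product

    # A deterministic noncontextual assignment gives a value v in {+1, -1}
    # to each of the 9 observables of the Mermin-Peres square, indexed
    # (row, col). Constraints: every row multiplies to +1; columns 0 and 1
    # multiply to +1; column 2 multiplies to -1.
    def satisfies(v):
        rows_ok = all(v[r][0] * v[r][1] * v[r][2] == 1 for r in range(3))
        cols_ok = all(v[0][c] * v[1][c] * v[2][c] == (1 if c < 2 else -1)
                      for c in range(3))
        return rows_ok and cols_ok

    # Exhaustively check all 2^9 = 512 assignments
    count = sum(satisfies([vals[0:3], vals[3:6], vals[6:9]])
                for vals in product((1, -1), repeat=9))
    print(count)  # 0: no deterministic noncontextual assignment exists
    ```

    The contradiction is visible without enumeration: multiplying all row constraints gives +1 for the product of all nine values, while multiplying all column constraints gives -1 for the same product.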

    The Einstein-Podolsky-Rosen Argument and the Bell Inequalities

    In 1935 Einstein, Podolsky, and Rosen (EPR) published an important paper in which they claimed that the whole formalism of quantum mechanics, together with what they called the "Reality Criterion," implies that quantum mechanics cannot be complete. That is, there must exist some elements of reality that are not described by quantum mechanics. There must be, they concluded, a more complete description of physical reality behind quantum mechanics. There must be a state, a hidden variable, characterizing the state of affairs in the world in more detail than the quantum mechanical state, something that also reflects the missing elements of reality. Under some further but quite plausible assumptions, this conclusion implies that in some spin-correlation experiments the measured quantum mechanical probabilities should satisfy particular inequalities (Bell-type inequalities). The paradox consists in the fact that quantum probabilities do not satisfy these inequalities. And this paradoxical fact has been confirmed by several laboratory experiments in the last three decades. The problem is still open and hotly debated among both physicists and philosophers. It has motivated a wide range of research from the most fundamental quantum mechanical experiments through foundations of probability theory to the theory of stochastic causality as well as the metaphysics of free will.
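
    As a worked illustration (not drawn from the paper), the violation can be checked numerically for the CHSH form of a Bell-type inequality. For the singlet state, quantum mechanics predicts the spin correlation E(a, b) = -cos(a - b) for measurement angles a and b; any local hidden-variable model must satisfy |S| <= 2, while the standard angle choice below gives 2*sqrt(2):

    ```python
    import math

    def E(a, b):
        # Singlet-state spin correlation for measurement angles a and b
        return -math.cos(a - b)

    # Standard CHSH angle choice maximizing the quantum value
    a, a2 = 0.0, math.pi / 2
    b, b2 = math.pi / 4, 3 * math.pi / 4

    S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
    print(S)  # ~2.828 = 2*sqrt(2), exceeding the local bound of 2
    ```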

    Credit Valuation Adjustment

    Credit risk has become a topical issue since the 2007 Credit Crisis, particularly for its impact on the valuation of OTC derivatives. This becomes critical when the credit risks of the entities involved in a contract, whether as underlying or counterparty, become highly correlated, as is the case during macroeconomic shocks. It impacts the valuation of such contracts through an additional term, the credit valuation adjustment (CVA), which can become large under such correlation. This thesis outlines the main approaches to credit risk modelling, intensity and structural. It gives important examples of both, and particular examples useful in the calculation of CVA: the intensity model of Brigo and the structural model of Hull and White. It details Brigo's market-standard, model-independent framework for derivatives valuation with CVA, both in its unilateral form, where only one counterparty is credit risky, and in its bilateral form, where both counterparties are credit risky. This thesis then shows how these frameworks can be applied to the valuation of a credit default swap contract (CDS). Finally, it shows how Brigo's and Hull and White's models for credit risk apply to the valuation of the CVA of CDS and draws comparisons, especially based on their ability to capture correlation effects.
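
    A minimal sketch of the unilateral CVA calculation the abstract refers to, under toy assumptions that are not taken from the thesis: a flat hazard rate (a simple intensity model), a flat recovery and discount rate, and a driftless lognormal proxy for the contract's mark-to-market. CVA is then the loss-given-default times the sum, over time buckets, of discounted expected positive exposure weighted by the default probability in each bucket:

    ```python
    import math
    import random

    # Toy parameters (illustrative only, not calibrated):
    # lam = hazard rate, R = recovery, r = discount rate,
    # sigma = exposure volatility, T = maturity in years.
    random.seed(0)
    lam, R, r, sigma, T = 0.02, 0.4, 0.03, 0.2, 5.0
    steps, paths = 20, 10_000
    dt = T / steps

    # Expected positive exposure EE(t_i) by Monte Carlo over lognormal paths
    ee = [0.0] * (steps + 1)
    for _ in range(paths):
        v = 1.0  # unit-notional mark-to-market proxy
        for i in range(1, steps + 1):
            v *= math.exp(-0.5 * sigma**2 * dt
                          + sigma * math.sqrt(dt) * random.gauss(0, 1))
            ee[i] += max(v - 1.0, 0.0) / paths  # positive part of exposure

    # CVA = (1-R) * sum_i D(t_i) * EE(t_i) * P(default in (t_{i-1}, t_i])
    cva = 0.0
    for i in range(1, steps + 1):
        t_prev, t = (i - 1) * dt, i * dt
        pd = math.exp(-lam * t_prev) - math.exp(-lam * t)
        cva += (1 - R) * math.exp(-r * t) * ee[i] * pd

    print(cva)
    ```

    The bilateral form adds a symmetric debit term for the other counterparty's default; capturing the correlation effects the abstract emphasizes would require correlating the default time with the exposure path, which this independent-default sketch deliberately omits.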

    Multipartite Nonlocal Quantum Correlations Resistant to Imperfections

    We use techniques for lower bounds on communication to derive necessary conditions, in terms of detector efficiency or amount of super-luminal communication, for being able to reproduce with classical local hidden-variable theories the quantum correlations occurring in EPR-type experiments in the presence of noise. We apply our method to an example involving n parties sharing a GHZ-type state on which they carry out measurements, and show that for local hidden-variable theories, the amount of super-luminal classical communication c and the detector efficiency eta are constrained by eta * 2^(-c/n) = O(n^(-1/6)), even for constant general error probability epsilon = O(1).

    From the Kochen-Specker theorem to noncontextuality inequalities without assuming determinism

    The Kochen-Specker theorem demonstrates that it is not possible to reproduce the predictions of quantum theory in terms of a hidden variable model where the hidden variables assign a value to every projector deterministically and noncontextually. A noncontextual value-assignment to a projector is one that does not depend on which other projectors - the context - are measured together with it. Using a generalization of the notion of noncontextuality that applies to both measurements and preparations, we propose a scheme for deriving inequalities that test whether a given set of experimental statistics is consistent with a noncontextual model. Unlike previous inequalities inspired by the Kochen-Specker theorem, we do not assume that the value-assignments are deterministic; therefore, in the face of a violation of our inequality, noncontextuality can no longer be salvaged by abandoning determinism. Our approach is operational in the sense that it does not presume quantum theory: a violation of our inequality implies the impossibility of a noncontextual model for any operational theory that can account for the experimental observations, including any successor to quantum theory. Comment: 5+8 pages, 4+3 figures. Comments are welcome.

    Dynamical system analysis and forecasting of deformation produced by an earthquake fault

    We present a method of constructing low-dimensional nonlinear models describing the main dynamical features of a discrete 2D cellular fault zone, with many degrees of freedom, embedded in a 3D elastic solid. A given fault system is characterized by a set of parameters that describe the dynamics, rheology, property disorder, and fault geometry. Depending on the location in the system parameter space, we show that the coarse dynamics of the fault can be confined to an attractor whose dimension is significantly smaller than the space in which the dynamics takes place. Our strategy of system reduction is to search for a few coherent structures that dominate the dynamics and to capture the interaction between these coherent structures. The identification of the basic interacting structures is obtained by applying the Proper Orthogonal Decomposition (POD) to the surface deformation fields that accompany strike-slip faulting accumulated over equal time intervals. We use a feed-forward artificial neural network (ANN) architecture for the identification of the system dynamics projected onto the subspace (model space) spanned by the most energetic coherent structures. The ANN is trained using a standard back-propagation algorithm to predict (map) the values of the observed model state at a future time given the observed model state at the present time. This ANN provides an approximate, large-scale, dynamical model for the fault. Comment: 30 pages, 12 figures.
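
    The POD step described above is, in practice, a singular value decomposition of a snapshot matrix whose columns are the field at successive times. A minimal sketch on synthetic data (standing in for the paper's surface-deformation snapshots; all signals below are invented for illustration):

    ```python
    import numpy as np

    # Synthetic snapshot matrix: rows = spatial points, columns = snapshots
    # at equal time intervals. Two planted coherent structures plus noise.
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 200)
    t = np.linspace(0.0, 1.0, 50)
    snapshots = (np.outer(np.sin(2 * np.pi * x), np.cos(2 * np.pi * t))
                 + 0.3 * np.outer(np.sin(4 * np.pi * x), np.sin(6 * np.pi * t))
                 + 0.01 * rng.standard_normal((200, 50)))

    # POD modes = left singular vectors; singular values rank them by energy
    U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
    energy = s**2 / np.sum(s**2)
    print(energy[:3])  # the two planted modes dominate the energy

    # Model space: project the dynamics onto the leading k modes; an ANN
    # would then be trained to map coeffs[:, i] -> coeffs[:, i+1]
    k = 2
    coeffs = U[:, :k].T @ snapshots  # time series of modal amplitudes
    ```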

    Stochastic Gravity

    Gravity is treated as a stochastic phenomenon based on fluctuations of the metric tensor of general relativity. By using a (3+1) slicing of spacetime, a Langevin equation for the dynamical conjugate momentum and a Fokker-Planck equation for its probability distribution are derived. The Raychaudhuri equation for a congruence of timelike or null geodesics leads to a stochastic differential equation for the expansion parameter theta in terms of the proper time s. For sufficiently strong metric fluctuations, it is shown that caustic singularities in spacetime can be avoided for converging geodesics. The formalism is applied to the gravitational collapse of a star and the Friedmann-Robertson-Walker cosmological model. It is found that owing to the stochastic behavior of the geometry, the singularity in gravitational collapse and the big bang have a zero probability of occurring. Moreover, as a star collapses, the probability of a distant observer seeing an infinite red shift at the Schwarzschild radius of the star is zero. Therefore, there is a vanishing probability of a Schwarzschild black hole event horizon forming during gravitational collapse. Comment: Revised version. Eq. (108) has been modified. Additional comments have been added to text. RevTeX, 39 pages.
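
    A one-dimensional caricature of the stochastic expansion equation can be simulated with the Euler-Maruyama scheme. The toy SDE below, d(theta) = -(theta^2/3) ds + sigma dW, keeps only the self-focusing term of the Raychaudhuri equation (the real equation also carries shear, vorticity, and Ricci terms, and the noise structure here is invented for illustration). Deterministically, a converging congruence with theta(0) = theta0 < 0 develops a caustic at s = -3/theta0:

    ```python
    import math
    import random

    random.seed(1)

    def evolve(theta0, sigma, ds=1e-3, s_max=10.0, blowup=-1e3):
        # Euler-Maruyama for d(theta) = -(theta^2/3) ds + sigma dW.
        # Returns the proper time of the caustic (theta -> -infinity),
        # or None if no caustic forms before s_max.
        theta, s = theta0, 0.0
        while s < s_max:
            theta += (-(theta**2) / 3) * ds \
                     + sigma * math.sqrt(ds) * random.gauss(0.0, 1.0)
            s += ds
            if theta < blowup:
                return s
        return None

    # Noise-free converging congruence: caustic near s = -3/theta0 = 3
    print(evolve(-1.0, sigma=0.0))
    ```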