
    Spacetime Information

    In usual quantum theory, the information available about a quantum system is defined in terms of the density matrix describing it on a spacelike surface. This definition must be generalized for extensions of quantum theory that do not have a notion of state on a spacelike surface; in particular, it must be generalized for the generalized quantum theories appropriate when spacetime geometry fluctuates quantum mechanically or when geometry is fixed but not foliable by spacelike surfaces. This paper introduces a four-dimensional notion of the information available about a quantum system's boundary conditions in the various sets of decohering histories it may display. The idea of spacetime information is applied in several contexts: When spacetime geometry is fixed, the information available through alternatives restricted to a spacetime region is defined. The information available through histories of alternatives of general operators is compared to that obtained from the more limited coarse-grainings of sum-over-histories quantum mechanics. The definition of information is considered in generalized quantum theories. We consider as specific examples time-neutral quantum mechanics with initial and final conditions, quantum theories with non-unitary evolution, and the generalized quantum frameworks appropriate for quantum spacetime. In such theories, complete information about a quantum system is not necessarily available on any spacelike surface but must be searched for throughout spacetime. The information loss commonly associated with the "evolution of pure states into mixed states" in black hole evaporation is thus not in conflict with the principles of generalized quantum mechanics. Comment: 47 pages, 2 figures, UCSBTH 94-0
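
    For orientation, the spacelike-surface notion of information that the abstract says must be generalized is the standard one built from the density matrix; the minimal textbook statement below is a reader's gloss, not the paper's generalized definition, which is formulated for decoherent sets of histories and spacetime regions.

    \[
      S(\rho) = -\,\mathrm{Tr}\bigl(\rho \ln \rho\bigr),
      \qquad
      I(\rho) = S_{\max} - S(\rho) = \ln\bigl(\dim \mathcal{H}\bigr) - S(\rho),
    \]

    i.e., the information available on a spacelike surface is the deficit of the von Neumann entropy of the density matrix from its maximum possible value.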

    The Response of Earth's Electron Radiation Belts to Geomagnetic Storms: Statistics From the Van Allen Probes Era Including Effects From Different Storm Drivers

    A statistical study was conducted of Earth's radiation belt electron response to geomagnetic storms using NASA's Van Allen Probes mission. Data for electrons with energies ranging from 30 keV to 6.3 MeV were included and examined as a function of L-shell, energy, and epoch time during 110 storms with SYM-H [...] 1 MeV also revealed a marked increase in likelihood of a depletion at all L-shells through the outer belt (3.5 [...] 1-MeV electrons throughout the outer belt, while storms driven by full CMEs and stream interaction regions are most likely to produce an enhancement of MeV electrons at lower (L ~ 4.5) [...] L-shells, respectively. CME sheaths intriguingly result in a distinct enhancement of ~1-MeV electrons around L ~ 5.5, and on average, CME sheaths and stream interaction regions result in double outer belt structures.
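
    As an illustration of the kind of superposed-epoch statistics described above, the sketch below classifies each storm's response at each L-shell as an enhancement or a depletion by comparing post-storm to pre-storm flux. The array layout, the synthetic stand-in data, and the factor-of-two threshold are assumptions made for the example, not details taken from the study.

# Hedged sketch: superposed-epoch classification of the storm-time electron
# response per L-shell. Synthetic data and the factor-of-two threshold are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_storms, n_epoch, n_L = 110, 240, 25            # 110 storms, hourly epoch bins, L-shell bins
L_grid = np.linspace(2.5, 6.5, n_L)
flux = rng.lognormal(mean=0.0, sigma=1.0, size=(n_storms, n_epoch, n_L))

pre = flux[:, :48, :].mean(axis=1)               # two-day pre-storm average
post = flux[:, -48:, :].mean(axis=1)             # two-day post-storm average
ratio = post / pre

enhanced = (ratio > 2.0).mean(axis=0)            # fraction of storms with a >2x increase
depleted = (ratio < 0.5).mean(axis=0)            # fraction of storms with a >2x decrease

for L, e, d in zip(L_grid, enhanced, depleted):
    print(f"L = {L:4.1f}: enhanced {e:6.1%}, depleted {d:6.1%}")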

    Correlation Entropy of an Interacting Quantum Field and H-theorem for the O(N) Model

    Following the paradigm of Boltzmann-BBGKY, we propose a correlation entropy (of the nth order) for an interacting quantum field, obtained by 'slaving' (truncation with causal factorization) of the higher, (n+1)th-order correlation functions in the Schwinger-Dyson system of equations. This renders an otherwise closed system effectively open, in which dissipation arises. The concept of correlation entropy is useful for addressing issues related to thermalization. As a small yet important step in that direction, we prove an H-theorem for the correlation entropy of a quantum mechanical O(N) model with a Closed Time Path Two Particle Irreducible Effective Action at the level of the next-to-leading-order large-N approximation. This model may be regarded as a field theory in 0 space dimensions. Comment: 22 pages
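
    Schematically, and only as a reader's gloss with notation that is not the paper's, the order-n correlation entropy and the H-theorem proved for the n = 2 truncation can be written as

    \[
      S_n(t) = -\,\mathrm{Tr}\bigl[\hat\rho_n(t)\,\ln \hat\rho_n(t)\bigr],
      \qquad
      \frac{d S_2(t)}{dt} \ge 0,
    \]

    where \(\hat\rho_n(t)\) is the density matrix that reproduces the correlation functions up to order n while the higher correlations are slaved to them.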

    Measurement-based quantum foundations

    I show that quantum theory is the only probabilistic framework that permits arbitrary processes to be emulated by sequences of local measurements. This supports the view that, contrary to conventional wisdom, measurement should not be regarded as a complex phenomenon in need of a dynamical explanation but rather as a primitive -- and perhaps the only primitive -- operation of the theory. Comment: 8 pages, version to appear in Found. Phys.

    Anthropogenic Space Weather

    Anthropogenic effects on the space environment started in the late 19th century and reached their peak in the 1960s, when high-altitude nuclear explosions were carried out by the USA and the Soviet Union. These explosions created artificial radiation belts near Earth that resulted in major damage to several satellites. Another, unexpected impact of the high-altitude nuclear tests was the electromagnetic pulse (EMP), which can have devastating effects over a large geographic area (as large as the continental United States). Other anthropogenic impacts on the space environment include chemical release experiments, high-frequency wave heating of the ionosphere, and the interaction of VLF waves with the radiation belts. This paper reviews the fundamental physical processes behind these phenomena and discusses observations of their impacts. Comment: 71 pages, 35 figures

    Origins of the Ambient Solar Wind: Implications for Space Weather

    The Sun's outer atmosphere is heated to temperatures of millions of degrees, and solar plasma flows out into interplanetary space at supersonic speeds. This paper reviews our current understanding of these interrelated problems: coronal heating and the acceleration of the ambient solar wind. We also discuss where the community stands in its ability to forecast how variations in the solar wind (i.e., fast and slow wind streams) impact the Earth. Although the last few decades have seen significant progress in observations and modeling, we still do not have a complete understanding of the relevant physical processes, nor do we have a quantitatively precise census of which coronal structures contribute to specific types of solar wind. Fast streams are known to be connected to the central regions of large coronal holes. Slow streams, however, appear to come from a wide range of sources, including streamers, pseudostreamers, coronal loops, active regions, and coronal hole boundaries. Complicating our understanding even more is the fact that processes such as turbulence, stream-stream interactions, and Coulomb collisions can make it difficult to unambiguously map a parcel measured at 1 AU back down to its coronal source. We also review recent progress -- in theoretical modeling, observational data analysis, and forecasting techniques that sit at the interface between data and theory -- that gives us hope that the above problems are indeed solvable. Comment: Accepted for publication in Space Science Reviews. Special issue connected with a 2016 ISSI workshop on "The Scientific Foundations of Space Weather." 44 pages, 9 figures
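
    As one concrete example of the source-mapping difficulty mentioned above, the sketch below performs the common first-order ballistic (Parker-spiral) back-mapping of a parcel observed at 1 AU to a source longitude near the Sun; the constant-speed assumption, the 2.5-solar-radii source surface, and the numerical values are illustrative choices, not results from the review.

# Hedged sketch: ballistic (Parker-spiral) back-mapping of a solar-wind parcel
# from 1 AU to a source Carrington longitude, assuming constant radial speed and
# rigid solar rotation. Constants and the source-surface height are assumptions.
import numpy as np

OMEGA_SUN = 2.0 * np.pi / (25.38 * 86400.0)   # sidereal solar rotation rate [rad/s]
AU_KM = 1.496e8                               # 1 AU [km]
R_SOURCE_KM = 2.5 * 6.957e5                   # assumed source surface at 2.5 solar radii [km]

def backmap_longitude(phi_1au_deg, v_sw_kms):
    """Source Carrington longitude for a parcel seen at phi_1au_deg with speed v_sw_kms."""
    travel_time_s = (AU_KM - R_SOURCE_KM) / v_sw_kms        # radial transit time [s]
    delta_phi_deg = np.degrees(OMEGA_SUN * travel_time_s)   # Sun rotates during the transit
    return (phi_1au_deg + delta_phi_deg) % 360.0

print(backmap_longitude(100.0, 700.0))   # fast stream: shift of roughly 35 degrees
print(backmap_longitude(100.0, 350.0))   # slow stream: roughly twice the shift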

    Are biological systems poised at criticality?

    Many of life's most fascinating phenomena emerge from interactions among many elements -- many amino acids determine the structure of a single protein, many genes determine the fate of a cell, many neurons are involved in shaping our thoughts and memories. Physicists have long hoped that these collective behaviors could be described using the ideas and methods of statistical mechanics. In the past few years, new, larger-scale experiments have made it possible to construct statistical mechanics models of biological systems directly from real data. We review the surprising successes of this "inverse" approach, using examples from families of proteins, networks of neurons, and flocks of birds. Remarkably, in all these cases the models that emerge from the data are poised at a very special point in their parameter space -- a critical point. This suggests there may be some deeper theoretical principle behind the behavior of these diverse systems. Comment: 21 pages
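
    A minimal sketch of the "inverse" route from data to a statistical-mechanics model is given below, using the naive mean-field inversion of a pairwise maximum-entropy (Ising) model. The synthetic binary data and the choice of this particular approximation are assumptions made for illustration; the review surveys more careful inference methods.

# Hedged sketch: naive mean-field fit of a pairwise maximum-entropy (Ising) model
# to binary data, as one simple instance of the "inverse" approach. The random
# samples stand in for real spike trains, protein sequences, or flight directions.
import numpy as np

rng = np.random.default_rng(1)
samples = rng.choice([-1, 1], size=(5000, 40))     # (observations, elements)

m = samples.mean(axis=0)                           # magnetizations <s_i>
C = np.cov(samples, rowvar=False)                  # covariance matrix C_ij
J = -np.linalg.inv(C)                              # mean-field couplings J_ij ~ -(C^-1)_ij
np.fill_diagonal(J, 0.0)
h = np.arctanh(np.clip(m, -0.999, 0.999)) - J @ m  # mean-field local fields

print("coupling matrix:", J.shape, "largest |J_ij|:", np.abs(J).max())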

    A representation theorem and applications

    We introduce a set of transformations on the set of all probability distributions over a finite state space, and show that these transformations are the only ones that preserve certain elementary probabilistic relationships. This result provides a new perspective on a variety of probabilistic inference problems in which invariance considerations play a role. Two particular applications we consider in this paper are the development of an equivariance-based approach to the problem of measure selection, and a new justification for Haldane's prior as the distribution that encodes prior ignorance about the parameter of a multinomial distribution.
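
    For reference, and as a standard form rather than a result of the paper, Haldane's prior for a binomial success probability and its multinomial generalization are the improper densities

    \[
      \pi(p) \propto p^{-1}(1-p)^{-1},
      \qquad
      \pi(p_1,\dots,p_k) \propto \prod_{i=1}^{k} p_i^{-1},
      \qquad
      \sum_{i=1}^{k} p_i = 1,
    \]

    i.e., the Beta(0, 0) and Dirichlet(0, ..., 0) limits, for which the paper offers a new justification.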

    A Bayesian framework for model calibration, comparison and analysis: application to four models for the biogeochemistry of a Norway spruce forest

    Four different parameter-rich process-based models of forest biogeochemistry were analysed in a Bayesian framework consisting of three operations: (1) Model calibration, (2) Model comparison, (3) Analysis of model–data mismatch. Data were available for four output variables common to the models: soil water content and emissions of N2O, NO and CO2. All datasets consisted of time series of daily measurements. Monthly averages and quantiles of the annual frequency distributions of daily emission rates were calculated for comparison with equivalent model outputs. This use of the data at a model-appropriate temporal scale, together with the choice of heavy-tailed likelihood functions that accounted for data uncertainty through random and systematic errors, helped prevent asymptotic collapse of the parameter distributions in the calibration. Model behaviour and how it was affected by calibration was analysed by quantifying the normalised RMSE and r² for the different output variables, and by decomposition of the MSE into contributions from bias, phase shift and variance error. The simplest model, BASFOR, seemed to underestimate the temporal variance of nitrogenous emissions even after calibration. The model of intermediate complexity, DAYCENT, simulated the time series well but with large phase shift. COUP and MoBiLE-DNDC were able to remove most bias through calibration. The Bayesian framework was shown to be effective in improving the parameterisation of the models, quantifying the uncertainties in parameters and outputs, and evaluating the different models. The analysis showed that there remain patterns in the data – in particular infrequent events of very high nitrogenous emission rate – that are unexplained by any of the selected forest models, and that this is unlikely to be due to incorrect model parameterisation.
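
    The bias/variance/phase-shift split mentioned above can be illustrated with one common decomposition of the MSE into a squared-bias term, a difference-in-standard-deviations term, and a lack-of-correlation (phase) term; the sketch below uses that form with synthetic data, and the paper's exact decomposition may differ in detail.

# Hedged sketch: decomposition of the mean squared error into squared bias,
# a variance-error term, and a phase/correlation term. Data are synthetic.
import numpy as np

def decompose_mse(sim, obs):
    """MSE = (mean bias)^2 + (sigma_sim - sigma_obs)^2 + 2*sigma_sim*sigma_obs*(1 - r)."""
    bias2 = (sim.mean() - obs.mean()) ** 2
    var_err = (sim.std() - obs.std()) ** 2
    r = np.corrcoef(sim, obs)[0, 1]
    phase = 2.0 * sim.std() * obs.std() * (1.0 - r)
    return {"bias^2": bias2, "variance error": var_err, "phase error": phase,
            "MSE": bias2 + var_err + phase}

t = np.arange(365.0)
rng = np.random.default_rng(2)
obs = np.sin(2 * np.pi * t / 365) + 0.1 * rng.normal(size=t.size)  # daily observations
sim = 0.8 * np.sin(2 * np.pi * (t - 20) / 365) + 0.2               # biased, damped, shifted model

parts = decompose_mse(sim, obs)
print(parts)
print("direct MSE:", np.mean((sim - obs) ** 2))   # matches the sum of the three terms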