Decoherence, Re-coherence, and the Black Hole Information Paradox
We analyze a system consisting of an oscillator coupled to a field. With the
field traced out as an environment, the oscillator loses coherence on a very
short {\it decoherence timescale}; but, on a much longer {\it relaxation
timescale}, predictably evolves into a unique, pure (ground) state. This
example of {\it re-coherence} has interesting implications both for the
interpretation of quantum theory and for the loss of information during black
hole evaporation. We examine these implications by investigating the
intermediate and final states of the quantum field, treated as an open system
coupled to an unobserved oscillator.
Comment: 23 pages, 2 figures included; figures 3.1 - 3.3 available at
http://qso.lanl.gov/papers/Papers.htm
Environment-Induced Decoherence and the Transition From Quantum to Classical
We study dynamics of quantum open systems, paying special attention to those
aspects of their evolution which are relevant to the transition from quantum to
classical. We begin with a discussion of the conditional dynamics of simple
systems. The resulting models are straightforward but suffice to illustrate
basic physical ideas behind quantum measurements and decoherence. To discuss
decoherence and environment-induced superselection (einselection) in a more
general setting, we sketch perturbative as well as exact derivations of several
master equations valid for various systems. Using these equations we study
einselection employing the general strategy of the predictability sieve.
Assumptions that are usually made in the discussion of decoherence are
critically reexamined along with the ``standard lore'' to which they lead.
Restoration of quantum-classical correspondence in systems that are classically
chaotic is discussed. The dynamical second law, it is shown, can be traced to
the same phenomena that allow for the restoration of the correspondence
principle in decohering chaotic systems (where it is otherwise lost on a very
short timescale). Quantum error correction is discussed as an example of an
anti-decoherence strategy. Implications of decoherence and einselection for the
interpretation of quantum theory are briefly pointed out.
Comment: 80 pages, 7 figures included. Lectures given by both authors at the
72nd Les Houches Summer School on "Coherent Matter Waves", July-August 1999
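The lectures derive master equations and use them to study how coherence is lost to an environment. As a toy numerical illustration (our own construction, not a derivation from the lectures), one can integrate a pure-dephasing master equation for a qubit, dρ/dt = −γ[σ_z, [σ_z, ρ]], under which off-diagonal elements of the density matrix decay while populations are preserved, the basic mechanism of decoherence:

```python
import numpy as np

# Pure-dephasing master equation for a qubit:
#   drho/dt = -gamma * [sz, [sz, rho]]
# Off-diagonal (coherence) terms decay as exp(-4*gamma*t);
# the diagonal (populations) is exactly preserved.
sz = np.diag([1.0, -1.0])

def dephase(rho0: np.ndarray, gamma: float, t: float,
            steps: int = 10_000) -> np.ndarray:
    """Integrate the dephasing equation with a simple Euler scheme."""
    rho = rho0.astype(complex)
    dt = t / steps
    for _ in range(steps):
        comm = sz @ rho - rho @ sz
        rho = rho - gamma * dt * (sz @ comm - comm @ sz)
    return rho

# Equal superposition |+><+|: starts fully coherent.
rho0 = 0.5 * np.ones((2, 2))
rho_t = dephase(rho0, gamma=1.0, t=1.0)
# Diagonal unchanged; |rho_01| decays toward 0.5 * exp(-4)
```

The analytic solution ρ_01(t) = ρ_01(0) e^(−4γt) follows by evaluating the double commutator on a general 2×2 matrix, which provides a direct check on the integration.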
What Physical Processes Drive the Interstellar Medium in the Local Bubble?
Recent 3D high-resolution simulations of the interstellar medium in a
star-forming galaxy like the Milky Way show that supernova explosions are the
main driver of the structure and evolution of the gas. Its physical state is
largely controlled by turbulence, owing to the high Reynolds numbers of the
average flows. For a constant supernova rate, a dynamical equilibrium is
established within 200 Myr of simulation as a consequence of the setup of a
galactic fountain. The resulting interstellar medium reveals a typical
density/pressure pattern, i.e. a distribution of so-called gas phases, on
scales of 500-700 pc, with interstellar bubbles being a common phenomenon,
just like the Local Bubble and the Loop I superbubble, which are assumed to be
interacting. However, modeling the Local Bubble is special, because it is
driven by a moving group passing through its volume, as inferred from the
analysis of Hipparcos data. A detailed analysis reveals that between 14 and 19
supernovae have exploded during the last 15 Myr. The age of the Local Bubble
is derived from comparison with HI and UV absorption line data to be
14.5 (+0.7/-0.4) Myr. We further predict the merging of the two bubbles in
about 3 Myr from now, when the interaction shell starts to fragment. The Local
Cloud and its companion HI clouds are the consequence of a dynamical
instability in the interaction shell between the Local and the Loop I bubbles.
Natriuretic peptides and integrated risk assessment for cardiovascular disease: an individual-participant-data meta-analysis
BACKGROUND: Guidelines for primary prevention of cardiovascular diseases focus on prediction of coronary heart disease and stroke. We assessed whether or not measurement of N-terminal-pro-B-type natriuretic peptide (NT-proBNP) concentration could enable a more integrated approach than at present by predicting heart failure and enhancing coronary heart disease and stroke risk assessment.
METHODS: In this individual-participant-data meta-analysis, we generated and harmonised individual-participant data from relevant prospective studies via both de-novo NT-proBNP concentration measurement of stored samples and collection of data from studies identified through a systematic search of the literature (PubMed, Scientific Citation Index Expanded, and Embase) for articles published up to Sept 4, 2014, using search terms related to natriuretic peptide family members and the primary outcomes, with no language restrictions. We calculated risk ratios and measures of risk discrimination and reclassification across predicted 10 year risk categories (ie, <5%, 5% to <7·5%, and ≥7·5%), adding assessment of NT-proBNP concentration to that of conventional risk factors (ie, age, sex, smoking status, systolic blood pressure, history of diabetes, and total and HDL cholesterol concentrations). Primary outcomes were the combination of coronary heart disease and stroke, and the combination of coronary heart disease, stroke, and heart failure.
FINDINGS: We recorded 5500 coronary heart disease, 4002 stroke, and 2212 heart failure outcomes among 95 617 participants without a history of cardiovascular disease in 40 prospective studies. Risk ratios (for a comparison of the top third vs bottom third of NT-proBNP concentrations, adjusted for conventional risk factors) were 1·76 (95% CI 1·56-1·98) for the combination of coronary heart disease and stroke and 2·00 (1·77-2·26) for the combination of coronary heart disease, stroke, and heart failure. Addition of information about NT-proBNP concentration to a model containing conventional risk factors was associated with a C-index increase of 0·012 (0·010-0·014) and a net reclassification improvement of 0·027 (0·019-0·036) for the combination of coronary heart disease and stroke and a C-index increase of 0·019 (0·016-0·022) and a net reclassification improvement of 0·028 (0·019-0·038) for the combination of coronary heart disease, stroke, and heart failure.
INTERPRETATION: In people without baseline cardiovascular disease, NT-proBNP concentration assessment strongly predicted first-onset heart failure and augmented coronary heart disease and stroke prediction, suggesting that NT-proBNP concentration assessment could be used to integrate heart failure into cardiovascular disease primary prevention.
FUNDING: British Heart Foundation, Austrian Science Fund, UK Medical Research Council, National Institute for Health Research, European Research Council, and European Commission Framework Programme 7.
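The C-index quoted in the findings measures risk discrimination: the probability that, for a randomly chosen pair in which one person has an event and the other does not, the model assigns the higher predicted risk to the case. A minimal sketch for binary outcomes with toy, made-up data (the paper's survival-analysis version also accounts for censoring and follow-up time):

```python
from itertools import product

def c_index(risks, events):
    """Concordance index for binary outcomes: fraction of (case, non-case)
    pairs in which the case received the higher predicted risk
    (ties count as 1/2)."""
    cases = [r for r, e in zip(risks, events) if e]
    controls = [r for r, e in zip(risks, events) if not e]
    pairs = list(product(cases, controls))
    concordant = sum(1.0 if rc > rn else 0.5 if rc == rn else 0.0
                     for rc, rn in pairs)
    return concordant / len(pairs)

# Toy illustrative data (not from the study):
risks = [0.1, 0.3, 0.2, 0.6]   # predicted 10-year risks
events = [0, 0, 1, 1]          # observed outcomes
print(c_index(risks, events))  # -> 0.75
```

A C-index increase of 0.012, as reported when NT-proBNP is added to the conventional risk factors, means the augmented model correctly orders an additional ~1.2% of case/non-case pairs.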
The Atacama Cosmology Telescope: A Catalog of >4000 Sunyaev–Zel’dovich Galaxy Clusters
We present a catalog of 4195 optically confirmed Sunyaev–Zel'dovich (SZ) selected galaxy clusters detected with signal-to-noise ratio >4 in 13,211 deg2 of sky surveyed by the Atacama Cosmology Telescope (ACT). Cluster candidates were selected by applying a multifrequency matched filter to 98 and 150 GHz maps constructed from ACT observations obtained from 2008 to 2018 and confirmed using deep, wide-area optical surveys. The clusters span redshifts from z = 0.04 to beyond z = 1, and a total of 868 systems are new discoveries. Assuming an SZ signal versus mass-scaling relation calibrated from X-ray observations, the sample has a 90% completeness mass limit of M500c > 3.8 × 1014 M⊙, evaluated at z = 0.5, for clusters detected at signal-to-noise ratio >5 in maps filtered at an angular scale of 2.4 arcmin. The survey has a large overlap with deep optical weak-lensing surveys that are being used to calibrate the SZ signal mass-scaling relation, such as the Dark Energy Survey (4566 deg2), the Hyper Suprime-Cam Subaru Strategic Program (469 deg2), and the Kilo Degree Survey (825 deg2). We highlight some noteworthy objects in the sample, including potentially projected systems, clusters with strong lensing features, clusters with active central galaxies or star formation, and systems of multiple clusters that may be physically associated. The cluster catalog will be a useful resource for future cosmological analyses and studying the evolution of the intracluster medium and galaxies in massive clusters over the past 10 Gyr.
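Cluster candidates above are found with a multifrequency matched filter. A minimal single-frequency, one-dimensional sketch of the idea, with a made-up Gaussian source profile and white noise (the ACT pipeline uses multifrequency maps and the full noise covariance):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "map": a Gaussian-profile source buried in white noise.
x = np.arange(512)
template = np.exp(-0.5 * ((x - 256) / 5.0) ** 2)   # assumed source profile
data = 0.5 * np.roll(template, 60) + rng.normal(0.0, 0.2, x.size)

# Matched filter for white noise: correlate the data with the template,
# normalised so the output is in signal-to-noise units.
tpl = template - template.mean()
snr = np.correlate(data - data.mean(), tpl, mode="same")
snr /= 0.2 * np.sqrt((tpl ** 2).sum())   # sigma_noise * ||template||

peak = int(np.argmax(snr))   # should land near sample 316 (256 + 60)
```

Candidates are then kept above an SNR threshold (>4 in the catalog above); the multifrequency generalisation weights each band by the expected SZ spectral signature.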
DES Y3 + KiDS-1000: Consistent cosmology combining cosmic shear surveys
We present a joint cosmic shear analysis of the Dark Energy Survey (DES Y3)
and the Kilo-Degree Survey (KiDS-1000) in a collaborative effort between the
two survey teams. We find consistent cosmological parameter constraints between
DES Y3 and KiDS-1000 which, when combined in a joint-survey analysis, constrain
the amplitude parameter S_8. The mean marginal value lies below the maximum a
posteriori estimate, owing to skewness in the marginal distribution and
projection effects in the multi-dimensional parameter space. Our results are
consistent with constraints from observations of the cosmic microwave
background by Planck.
We use a Hybrid analysis pipeline, defined from a mock survey study quantifying
the impact of the different analysis choices originally adopted by each survey
team. We review intrinsic alignment models, baryon feedback mitigation
strategies, priors, samplers and models of the non-linear matter power
spectrum.
Comment: 38 pages, 21 figures, 15 tables, submitted to the Open Journal of
Astrophysics. Watch the core team discuss this analysis at
https://cosmologytalks.com/2023/05/26/des-kid
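The parameter constrained by cosmic shear analyses such as this one is conventionally S_8 = σ_8 (Ω_m/0.3)^0.5. A one-line sketch of the definition, with illustrative inputs only (not the paper's results, which are not reproduced here):

```python
def s8(sigma8: float, omega_m: float) -> float:
    """Cosmic shear amplitude parameter: S_8 = sigma_8 * (Omega_m / 0.3)**0.5."""
    return sigma8 * (omega_m / 0.3) ** 0.5

# Illustrative values only (not the DES Y3 + KiDS-1000 constraint):
print(round(s8(0.8, 0.3), 3))  # -> 0.8
```

The 0.3 pivot is chosen so that S_8 is the combination of σ_8 and Ω_m that cosmic shear measures most directly, which is why the joint analysis quotes its constraint in terms of S_8.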
All-sky search for long-duration gravitational wave transients with initial LIGO
We present the results of a search for long-duration gravitational wave transients in two sets of data collected by the LIGO Hanford and LIGO Livingston detectors between November 5, 2005 and September 30, 2007, and July 7, 2009 and October 20, 2010, with a total observational time of 283.0 days and 132.9 days, respectively. The search targets gravitational wave transients of duration 10-500 s in a frequency band of 40-1000 Hz, with minimal assumptions about the signal waveform, polarization, source direction, or time of occurrence. All candidate triggers were consistent with the expected background; as a result we set 90% confidence upper limits on the rate of long-duration gravitational wave transients for different types of gravitational wave signals. For signals from black hole accretion disk instabilities, we set upper limits on the source rate density between 3.4×10^-5 and 9.4×10^-4 Mpc^-3 yr^-1 at 90% confidence. These are the first results from an all-sky search for unmodeled long-duration transient gravitational waves. © 2016 American Physical Society
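In the simplest zero-background Poisson picture, a 90% confidence rate-density upper limit from a non-detection follows from R_90 = −ln(1 − 0.9) / (ε V T) ≈ 2.303 / (ε V T). A sketch of that arithmetic with made-up sensitivity numbers (the actual search derives its limits from the full pipeline, including detection efficiency as a function of waveform):

```python
import math

def poisson_rate_density_ul(confidence: float, efficiency: float,
                            volume_mpc3: float, live_time_yr: float) -> float:
    """Zero-detection Poisson upper limit on a rate density [Mpc^-3 yr^-1].

    With zero observed events, the upper limit on the expected count at a
    given confidence is -ln(1 - confidence), e.g. ~2.303 at 90%; dividing
    by the sensitive volume-time product gives the rate density limit.
    """
    mu_ul = -math.log(1.0 - confidence)   # ~2.303 events at 90% confidence
    return mu_ul / (efficiency * volume_mpc3 * live_time_yr)

# Illustrative inputs only (not the search's actual sensitive volume):
limit = poisson_rate_density_ul(0.90, efficiency=0.5,
                                volume_mpc3=1e4, live_time_yr=1.0)
```

The spread of quoted limits (3.4×10^-5 to 9.4×10^-4 Mpc^-3 yr^-1) then reflects how the sensitive volume varies across the different signal waveforms considered.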