
    Problems with Time-Varying Extra Dimensions or "Cardassian Expansion" as Alternatives to Dark Energy

    It has recently been proposed that the Universe might be accelerating as a consequence of extra dimensions with time-varying size. We show that although these scenarios can lead to acceleration, they run into serious difficulty when limits on the time variation of the four-dimensional Newton's constant are taken into account. On the other hand, the models of "Cardassian" expansion based on extra dimensions that have been constructed so far violate the weak energy condition for the bulk stress energy, for parameters that give an accelerating universe. Comment: 8 pages, minor changes. To appear in Physical Review

    Gauge-gravity correspondence in de Sitter braneworld

    We study braneworld solutions based on a solvable model of 5d gauged supergravity with two scalars of conformal dimension three, which correspond to fermion bilinear operators in the dual N=4 super Yang-Mills theory on the boundary. An accelerating braneworld solution is obtained when both scalars take the form of deformations of the super Yang-Mills theory and the bulk supersymmetry is broken. This solution is smoothly connected to the Poincaré-invariant brane in the limit of vanishing cosmological constant. The stability of this brane solution and the correspondence to the gauge theory are addressed. Comment: 16 pages, 1 figure

    Constraining dark energy with Sunyaev-Zel'dovich cluster surveys

    We discuss the prospects of constraining the properties of a dark energy component, with particular reference to a time-varying equation of state, using future cluster surveys selected by their Sunyaev-Zel'dovich effect. We compute the number of clusters expected for a given set of cosmological parameters and propagate the errors expected from a variety of surveys. In the short term these surveys will constrain dark energy in conjunction with future observations of type Ia supernovae, but may in time do so in their own right. Comment: 5 pages, 3 figures, 1 table, version accepted for publication in PR

    Quantum phase transitions of light

    Recently, condensed-matter and atomic experiments have reached a length scale and temperature regime where new quantum collective phenomena emerge. Finding such physics in systems of photons, however, is problematic, as photons typically do not interact with each other and can be created or destroyed at will. Here, we introduce a physical system of photons that exhibits strongly correlated dynamics on a mesoscale. By adding photons to a two-dimensional array of coupled optical cavities, each containing a single two-level atom in the photon-blockade regime, we form dressed states, or polaritons, that are both long-lived and strongly interacting. Our zero-temperature results predict that this photonic system will undergo a characteristic Mott insulator (excitations localised on each site) to superfluid (excitations delocalised across the lattice) quantum phase transition. Each cavity's impressive photon out-coupling potential may lead to actual devices based on these quantum many-body effects, as well as observable, tunable quantum simulators. We explicitly show that such phenomena may be observable in micro-machined diamond containing nitrogen-vacancy colour centres and in superconducting microwave strip-line resonators. Comment: 11 pages, 5 figures (2 in colour)

    Experimental loophole-free violation of a Bell inequality using entangled electron spins separated by 1.3 km

    For more than 80 years, the counterintuitive predictions of quantum theory have stimulated debate about the nature of reality. In his seminal work, John Bell proved that no theory of nature that obeys locality and realism can reproduce all the predictions of quantum theory. Bell showed that in any local realist theory the correlations between distant measurements satisfy an inequality and, moreover, that this inequality can be violated according to quantum theory. This provided a recipe for experimental tests of the fundamental principles underlying the laws of nature. In the past decades, numerous ingenious Bell inequality tests have been reported. However, because of experimental limitations, all experiments to date required additional assumptions to obtain a contradiction with local realism, resulting in loopholes. Here we report on a Bell experiment that is free of any such additional assumption and thus directly tests the principles underlying Bell's inequality. We employ an event-ready scheme that enables the generation of high-fidelity entanglement between distant electron spins. Efficient spin readout avoids the fair-sampling assumption (detection loophole), while the use of fast random basis selection and readout combined with a spatial separation of 1.3 km ensures the required locality conditions. We perform 245 trials testing the CHSH-Bell inequality S ≤ 2 and find S = 2.42 ± 0.20. A null hypothesis test yields a probability of p = 0.039 that a local-realist model for space-like separated sites produces data with a violation at least as large as observed, even when allowing for memory in the devices. This result rules out large classes of local realist theories, and paves the way for implementing device-independent quantum-secure communication and randomness certification. Comment: Raw data will be made available after publication
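As a rough illustration (not the authors' actual statistical analysis, which is a dedicated null-hypothesis test), the quoted CHSH value and uncertainty can be compared against the local-realist bound with a simple Gaussian back-of-the-envelope check:

```python
# Sketch: distance of the reported CHSH statistic from the local-realist bound.
# Uses only the numbers quoted in the abstract; treating the uncertainty as
# Gaussian is a simplification of the paper's analysis.
S = 2.42                   # measured CHSH statistic
sigma = 0.20               # reported uncertainty
bound = 2.0                # local-realist (CHSH) bound
tsirelson = 2 * 2 ** 0.5   # quantum-mechanical maximum, about 2.828

excess_sigmas = (S - bound) / sigma
print(f"S exceeds the local-realist bound by {excess_sigmas:.1f} standard deviations")
```

The measured value sits between the classical bound and Tsirelson's bound, about two standard deviations above the former, which is consistent with the reported p = 0.039.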

    An investigation of the apparent breast cancer epidemic in France: screening and incidence trends in birth cohorts

    Background: Official descriptive data from France showed a strong increase in breast-cancer incidence between 1980 and 2005 without a corresponding change in breast-cancer mortality. This study quantifies the part of the incidence increase due to secular changes in risk-factor exposure and to overdiagnosis arising from organised or opportunistic screening. Overdiagnosis was defined as non-progressive tumours diagnosed as cancer at histology, or progressive cancers that would have remained asymptomatic until death from another cause. Methods: Comparison between age-matched cohorts from 1980 to 2005. All women residing in France and born in 1911-1915, 1926-1930 and 1941-1945 were included. Sources are official data sets and published French reports on screening by mammography, age- and time-specific breast-cancer incidence and mortality, hormone replacement therapy, alcohol and obesity. Outcome measures include breast-cancer incidence differences adjusted for changes in risk-factor distributions between pairs of age-matched cohorts that had experienced different levels of screening intensity. Results: There was an 8-fold increase in the number of mammography machines operating in France between 1980 and 2000. Opportunistic and organised screening increased over time. In comparison with age-matched cohorts born 15 years earlier, recent cohorts had adjusted 11-year incidence proportions that were 76% higher [95% confidence limits (CL) 67%, 85%] for women aged 50 to 64 years and 23% higher [95% CL 15%, 31%] for women aged 65 to 79 years. Given that mortality did not change correspondingly, this increase in adjusted 11-year incidence proportion was considered an estimate of overdiagnosis. Conclusions: Breast cancer may be overdiagnosed because screening increases diagnosis of slowly progressing, non-life-threatening cancer and increases misdiagnosis among women without progressive cancer. We suggest that these effects could largely explain the reported "epidemic" of breast cancer in France. Better predictive classification of tumours is needed in order to avoid unnecessary cancer diagnoses and subsequent procedures.

    A frequentist framework of inductive reasoning

    Reacting against the limitation of statistics to decision procedures, R. A. Fisher proposed for inductive reasoning the use of the fiducial distribution, a parameter-space distribution of epistemological probability transferred directly from limiting relative frequencies rather than computed according to the Bayes update rule. The proposal is developed as follows using the confidence measure of a scalar parameter of interest. (With the restriction to a one-dimensional parameter space, a confidence measure is essentially a fiducial probability distribution free of complications involving ancillary statistics.) A betting game establishes a sense in which confidence measures are the only reliable inferential probability distributions. The equality between the probabilities encoded in a confidence measure and the coverage rates of the corresponding confidence intervals ensures that the measure's rule for assigning confidence levels to hypotheses is uniquely minimax in the game. Although a confidence measure can be computed without any prior distribution, previous knowledge can be incorporated into confidence-based reasoning. To adjust a p-value or confidence interval for prior information, the confidence measure from the observed data can be combined with one or more independent confidence measures representing previous agent opinion. (The former confidence measure may correspond to a posterior distribution with frequentist matching of coverage probabilities.) The representation of subjective knowledge in terms of confidence measures rather than prior probability distributions preserves approximate frequentist validity. Comment: major revision
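The coverage-matching property invoked above is the standard one for confidence intervals: the intervals a confidence measure induces should cover the true parameter at their nominal rate. A minimal simulation of that property for a normal mean with known standard deviation (all numbers below are illustrative, not taken from the paper):

```python
import random
import statistics

# Empirically check that central 95% confidence intervals for a normal mean
# cover the true value at roughly the nominal rate (illustrative parameters).
random.seed(0)
mu, sigma, n = 5.0, 2.0, 25   # true mean, known sd, sample size
z = 1.959964                  # 97.5% standard-normal quantile
trials = 2000
hits = 0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = statistics.fmean(xs)
    half = z * sigma / n ** 0.5          # half-width of the 95% interval
    hits += (xbar - half <= mu <= xbar + half)
print(f"empirical coverage: {hits / trials:.3f}")  # close to the nominal 0.95
```

The confidence measure here is simply the normal distribution centred at the sample mean with standard deviation sigma/sqrt(n); the simulation verifies the frequentist matching that the abstract's betting-game argument formalises.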

    Experimental demonstration of quantum correlations over more than 10 km

    Energy-time entangled photons at a wavelength of 1310 nm are produced by parametric downconversion in a KNbO3 crystal and are sent into all-fiber interferometers using a telecom fiber network. The two interferometers of this Franson-type test of the Bell inequality are located 10.9 km apart from one another. Two-photon fringe visibilities of up to 81.6% are obtained. These strong correlations support the nonlocal predictions of quantum mechanics and provide evidence that entanglement between photons can be maintained over long distances. Comment: 5 pages, REVTeX, 3 postscript figures included
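For Franson-type interferometry, a standard rule of thumb (not stated in the abstract itself) maps a two-photon fringe visibility V onto a CHSH-like statistic S = 2√2·V, so a Bell violation requires V > 1/√2 ≈ 70.7%. A quick check with the reported visibility:

```python
import math

V = 0.816                      # reported two-photon fringe visibility
threshold = 1 / math.sqrt(2)   # visibility needed for a Bell violation, ~0.707
S = 2 * math.sqrt(2) * V       # implied CHSH value under the standard mapping

print(f"implied S = {S:.3f} (values above 2 violate the CHSH bound)")
```

With V = 81.6% the implied S is about 2.31, comfortably above the local-realist bound of 2, which is why the abstract can describe the correlations as strongly nonlocal.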

    Vitamin D supplementation and breast cancer prevention : a systematic review and meta-analysis of randomized clinical trials

    In recent years, the scientific evidence linking vitamin D status or supplementation to breast cancer has grown notably. To investigate the role of vitamin D supplementation in breast cancer incidence, we conducted a systematic review and meta-analysis of randomized controlled trials comparing vitamin D with placebo or no treatment. We used OVID to search MEDLINE(R), EMBASE and CENTRAL until April 2012. We screened the reference lists of included studies and used the "Related Article" feature in PubMed to identify additional articles. No language restrictions were applied. Two reviewers independently extracted data on methodological quality, participants, intervention, comparison and outcomes. Risk Ratios and 95% Confidence Intervals for breast cancer were pooled using a random-effects model. Heterogeneity was assessed using the I2 test. In sensitivity analysis, we assessed the impact of vitamin D dosage and mode of administration on treatment effects. Only two randomized controlled trials fulfilled the pre-set inclusion criteria. The pooled analysis included 5372 postmenopausal women. Overall, the pooled Risk Ratio was 1.11 (95% Confidence Interval 0.74-1.68). We found no evidence of heterogeneity. Neither vitamin D dosage nor mode of administration significantly affected breast cancer risk. However, treatment efficacy was somewhat greater when vitamin D was administered at the highest dosage and in combination with calcium (Risk Ratio 0.58, 95% Confidence Interval 0.23-1.47 and Risk Ratio 0.93, 95% Confidence Interval 0.54-1.60, respectively). In conclusion, vitamin D use seems not to be associated with a reduced risk of breast cancer development in postmenopausal women. However, the available evidence is still limited and inadequate to draw firm conclusions. Study protocol code: FARM8L2B5L
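The pooling step described above works on the log scale: each trial's standard error is recovered from the width of its 95% confidence interval, and the log risk ratios are combined with inverse-variance weights. The sketch below shows the fixed-effect version of that calculation (with no heterogeneity, as reported, the random-effects result coincides with it); the two trial rows are hypothetical stand-ins, not the actual study data:

```python
import math

# Inverse-variance (fixed-effect) pooling of risk ratios on the log scale.
# Trial RRs and CIs below are hypothetical placeholders for illustration only.
trials = [(1.20, 0.75, 1.92),   # (RR, lower 95% limit, upper 95% limit)
          (0.95, 0.55, 1.64)]

weights, log_rrs = [], []
for rr, lo, hi in trials:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from the CI width
    weights.append(1 / se ** 2)
    log_rrs.append(math.log(rr))

pooled_log = sum(w * l for w, l in zip(weights, log_rrs)) / sum(weights)
pooled_se = 1 / sum(weights) ** 0.5
pooled_rr = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * pooled_se),
      math.exp(pooled_log + 1.96 * pooled_se))
print(f"pooled RR {pooled_rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

As in the abstract's result, a pooled confidence interval that straddles 1 indicates no statistically significant effect of the intervention.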

    Testing foundations of quantum mechanics with photons

    The foundational ideas of quantum mechanics continue to give rise to counterintuitive theories and physical effects that are in conflict with a classical description of Nature. Experiments with light at the single-photon level have historically been at the forefront of tests of fundamental quantum theory, and new developments in photonics engineering continue to enable new experiments. Here we review recent photonic experiments testing two foundational themes in quantum mechanics: wave-particle duality, central to recent complementarity and delayed-choice experiments; and Bell nonlocality, where recent theoretical and technological advances have allowed all controversial loopholes to be separately addressed in different photonics experiments. Comment: 10 pages, 5 figures, published as a Nature Physics Insight review article