1,276 research outputs found

    A Note on the Area under the Gains Chart

    The Receiver Operating Characteristic (ROC) chart is well known in medicine and machine learning. In particular, the area under the ROC chart measures the probability of correct selection in a two-alternative forced choice (2AFC) scenario. The gains chart is closely related to the ROC curve but carries extra information about the rate at which the classifier identifies response, information that is not carried by the ROC chart. In this note, we point out that the appropriate area under the gains chart is identical to the analogous area under the ROC chart and that the gains chart is therefore to be preferred as a summary of classifier success.
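
    As a minimal illustration of the 2AFC interpretation mentioned above (a sketch under the usual definition, not code from the note; the function name and the synthetic scores are hypothetical): the area under the ROC curve equals the probability that a randomly chosen responder is scored above a randomly chosen non-responder, with ties counted as one half.

    ```python
    # Sketch only: AUC estimated as the 2AFC probability
    # P(score_pos > score_neg) + 0.5 * P(tie), not code from the paper.
    import numpy as np

    def auc_2afc(scores_pos, scores_neg):
        """Estimate AUC as the probability of correct selection in a 2AFC task."""
        sp = np.asarray(scores_pos)[:, None]
        sn = np.asarray(scores_neg)[None, :]
        return np.mean((sp > sn) + 0.5 * (sp == sn))

    rng = np.random.default_rng(0)
    pos = rng.normal(1.0, 1.0, size=500)   # hypothetical scores for responders
    neg = rng.normal(0.0, 1.0, size=500)   # hypothetical scores for non-responders
    print(auc_2afc(pos, neg))              # ~0.76 expected for these toy scores
    ```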

    Efficient Bayesian Nonparametric Modelling of Structured Point Processes

    This paper presents a Bayesian generative model for dependent Cox point processes, alongside an efficient inference scheme which scales as if the point processes were modelled independently. We can handle missing data naturally, infer latent structure, and cope with large numbers of observed processes. A further novel contribution enables the model to work effectively in higher dimensional spaces. Using this method, we achieve vastly improved predictive performance on both 2D and 1D real data, validating our structured approach. Comment: Presented at UAI 2014. Bibtex: @inproceedings{structcoxpp14_UAI, Author = {Tom Gunter and Chris Lloyd and Michael A. Osborne and Stephen J. Roberts}, Title = {Efficient Bayesian Nonparametric Modelling of Structured Point Processes}, Booktitle = {Uncertainty in Artificial Intelligence (UAI)}, Year = {2014}}
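
    For background on the basic object involved (a toy sketch only; it simulates a generic one-dimensional Cox process by Lewis-Shedler thinning and is not the paper's structured model or its inference scheme; the random intensity below is made up):

    ```python
    # Background sketch only: a Cox process is a Poisson process whose intensity
    # is itself random. Sample one 1D realisation on [0, T] by thinning.
    import numpy as np

    rng = np.random.default_rng(1)
    T = 10.0

    # Toy random intensity: an exponentiated random sinusoid (illustrative only).
    a, phi = rng.normal(1.0, 0.3), rng.uniform(0, 2 * np.pi)
    intensity = lambda t: np.exp(a * np.sin(t + phi))
    lam_max = np.exp(abs(a) + 0.1)                 # upper bound on the intensity

    # Homogeneous candidates at rate lam_max, kept with probability intensity/lam_max.
    n_cand = rng.poisson(lam_max * T)
    cand = np.sort(rng.uniform(0, T, n_cand))
    keep = rng.uniform(0, 1, n_cand) < intensity(cand) / lam_max
    events = cand[keep]
    print(f"{len(events)} events on [0, {T:.0f}]")
    ```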

    Estuary environmental flows assessment methodology for Victoria

    This report sets out a method to determine the environmental water requirements of estuaries in Victoria. The estuary environmental flows assessment method (EEFAM) is a standard methodology which can be applied consistently across Victorian estuaries. The primary objective of EEFAM is to define a flow regime to maintain or enhance the ecological health of an estuary. The method is used to inform Victorian water resource planning processes. The output of EEFAM is a recommended flow regime for estuaries. This recommendation is developed from the known dependence of the estuary’s flora, fauna, biogeochemical and geomorphological features on the flow regime. EEFAM is an evidence-based methodology. This bottom-up or ‘building block’ approach conforms to the asset-based approach of the Victorian River Health Strategy and regional river health strategies. EEFAM is based on and expands on FLOWS, the Victorian method for determining environmental water requirements in rivers. The list of tasks has been modified and re-ordered in EEFAM to reflect environmental and management issues specific to estuaries. EEFAM and FLOWS can be applied simultaneously to a river and its estuary as part of a whole-of-system approach to environmental flow requirements. Like the FLOWS method, EEFAM is modular, and additional components can be readily incorporated.

    A luminosity distribution for kilonovae based on short gamma-ray burst afterglows

    The combined detection of a gravitational-wave signal, kilonova, and short gamma-ray burst (sGRB) from GW170817 marked a scientific breakthrough in the field of multi-messenger astronomy. But even before GW170817, there have been a number of sGRBs with possible associated kilonova detections. In this work, we re-examine these "historical" sGRB afterglows with a combination of state-of-the-art afterglow and kilonova models. This allows us to include optical/near-infrared synchrotron emission produced by the sGRB as well as ultraviolet/optical/near-infrared emission powered by the radioactive decay of r-process elements (i.e., the kilonova). Fitting the lightcurves, we derive the velocity and the mass distribution as well as the composition of the ejected material. The posteriors on kilonova parameters obtained from the fit were turned into distributions for the peak magnitude of the kilonova emission in different bands and the time at which this peak occurs. From the sGRBs with an associated kilonova, we found that the peak magnitude in the H band falls in the range [-16.2, -13.1] (95% confidence) and occurs within 0.8-3.6 days after the sGRB prompt emission. In the g band instead we obtain a peak magnitude in the range [-16.8, -12.3] occurring within the first 18 hr after the sGRB prompt. From the luminosity distributions of GW170817/AT2017gfo, the kilonova candidates GRB130603B, GRB050709 and GRB060614 (with the possible inclusion of GRB150101B), and the upper limits from all the other sGRBs not associated with any kilonova detection, we obtain for the first time a kilonova luminosity function in different bands. Comment: Published in MNRAS, 24 pages, 14 figures
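
    The quoted ranges are credible intervals obtained from posterior samples; as a minimal illustration of that summarising step (synthetic samples, not the paper's posteriors):

    ```python
    # Illustration only, with synthetic samples (not the paper's posteriors):
    # summarising posterior draws of the kilonova peak absolute magnitude and
    # peak epoch as 95% credible intervals, as quoted in the abstract.
    import numpy as np

    rng = np.random.default_rng(2)
    peak_mag_H = rng.normal(-14.7, 0.8, size=10_000)          # hypothetical H-band peak magnitudes
    peak_time_d = rng.lognormal(np.log(1.8), 0.4, size=10_000) # hypothetical peak times [days]

    lo_m, hi_m = np.percentile(peak_mag_H, [2.5, 97.5])
    lo_t, hi_t = np.percentile(peak_time_d, [2.5, 97.5])
    print(f"H-band peak magnitude in [{lo_m:.1f}, {hi_m:.1f}] (95% interval)")
    print(f"peak occurs {lo_t:.1f}-{hi_t:.1f} days after the prompt emission")
    ```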

    Quantum Monte Carlo study of the phase diagram of solid molecular hydrogen at extreme pressures.

    Establishing the phase diagram of hydrogen is a major challenge for experimental and theoretical physics. Experiment alone cannot establish the atomic structure of solid hydrogen at high pressure, because hydrogen scatters X-rays only weakly. Instead, our understanding of the atomic structure is largely based on density functional theory (DFT). By comparing Raman spectra for low-energy structures found in DFT searches with experimental spectra, candidate atomic structures have been identified for each experimentally observed phase. Unfortunately, DFT predicts a metallic structure to be energetically favoured at a broad range of pressures up to 400 GPa, where it is known experimentally that hydrogen is non-metallic. Here we show that more advanced theoretical methods (diffusion quantum Monte Carlo calculations) find the metallic structure to be uncompetitive, and predict a phase diagram in reasonable agreement with experiment. This greatly strengthens the claim that the candidate atomic structures accurately model the experimentally observed phases. We thank Dominik Jochym for help with the implementation of the BLYP density functional. Financial support was provided by the Engineering and Physical Sciences Research Council (EPSRC), U.K. This research used resources of the Oak Ridge Leadership Computing Facility at the Oak Ridge National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC05-00OR22725. Additional calculations were performed on the Cambridge High Performance Computing Service facility Darwin and the N8 high-performance computing facility provided and funded by the N8 consortium and EPSRC (Grant No. EP/K000225/1). This is the final version of the article. It first appeared from Nature Publishing Group via http://dx.doi.org/10.1038/ncomms879
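
    As generic background on how "energetically favoured" is read off in such studies (a toy sketch with made-up numbers, not the paper's DMC data): at a given pressure the structure with the lowest enthalpy is stable, and a phase boundary sits where two enthalpy curves cross.

    ```python
    # Generic sketch (toy numbers, not the paper's results): at each pressure the
    # structure with the lowest enthalpy H(P) is favoured; a transition pressure
    # is where the enthalpy difference between two candidates changes sign.
    import numpy as np

    P = np.linspace(100.0, 400.0, 301)            # pressure grid [GPa]
    H_A = 0.10 + 1.00e-3 * P                      # toy enthalpy of structure A [eV/atom]
    H_B = 0.00 + 1.45e-3 * P                      # toy enthalpy of structure B [eV/atom]

    dH = H_A - H_B
    crossings = np.where(np.diff(np.sign(dH)) != 0)[0]
    for i in crossings:
        # linear interpolation for the crossing pressure
        P_t = P[i] - dH[i] * (P[i + 1] - P[i]) / (dH[i + 1] - dH[i])
        print(f"A and B swap stability near {P_t:.0f} GPa")
    ```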

    Quantum Information Theory of Entanglement and Measurement

    We present a quantum information theory that allows for a consistent description of entanglement. It parallels classical (Shannon) information theory but is based entirely on density matrices (rather than probability distributions) for the description of quantum ensembles. We find that quantum conditional entropies can be negative for entangled systems, which leads to a violation of well-known bounds in Shannon information theory. Such a unified information-theoretic description of classical correlation and quantum entanglement clarifies the link between them: the latter can be viewed as "super-correlation" which can induce classical correlation when considering a tripartite or larger system. Furthermore, negative entropy and the associated clarification of entanglement pave the way to a natural information-theoretic description of the measurement process. This model, while unitary and causal, implies the well-known probabilistic results of conventional quantum mechanics. It also results in a simple interpretation of the Kholevo theorem limiting the accessible information in a quantum measurement. Comment: 26 pages with 6 figures. Expanded version of PhysComp'96 contribution
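
    A concrete instance of the negative conditional entropy statement (a worked sketch using the standard definitions, not code from the paper): for a Bell pair the joint state is pure, so S(AB) = 0, while the reduced state of B is maximally mixed, so S(B) = 1 bit and S(A|B) = S(AB) - S(B) = -1 bit.

    ```python
    # Worked sketch of the negative-conditional-entropy claim, using standard
    # definitions (not code from the paper): for a Bell state S(A|B) = -1 bit.
    import numpy as np

    def von_neumann_entropy(rho):
        """Entropy in bits: -sum_i p_i log2 p_i over the eigenvalues of rho."""
        p = np.linalg.eigvalsh(rho)
        p = p[p > 1e-12]
        return float(-np.sum(p * np.log2(p)))

    bell = np.array([1, 0, 0, 1]) / np.sqrt(2)                      # (|00> + |11>)/sqrt(2)
    rho_AB = np.outer(bell, bell)                                   # pure joint state
    rho_B = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=0, axis2=2)  # partial trace over A

    S_AB = von_neumann_entropy(rho_AB)    # 0 bits (pure state)
    S_B = von_neumann_entropy(rho_B)      # 1 bit (maximally mixed)
    print("S(A|B) =", S_AB - S_B)         # -1 bit for the entangled pair
    ```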

    The XMM Cluster Survey: Evidence for energy injection at high redshift from evolution of the X-ray luminosity-temperature relation

    We measure the evolution of the X-ray luminosity-temperature (L_X-T) relation since z~1.5 using a sample of 211 serendipitously detected galaxy clusters with spectroscopic redshifts drawn from the XMM Cluster Survey first data release (XCS-DR1). This is the first study spanning this redshift range using a single, large, homogeneous cluster sample. Using an orthogonal regression technique, we find no evidence for evolution in the slope or intrinsic scatter of the relation since z~1.5, finding both to be consistent with previous measurements at z~0.1. However, the normalisation is seen to evolve negatively with respect to the self-similar expectation: we find E(z)^{-1} L_X = 10^{44.67 +/- 0.09} (T/5)^{3.04 +/- 0.16} (1+z)^{-1.5 +/- 0.5}, which is within 2 sigma of the zero evolution case. We see milder, but still negative, evolution with respect to self-similar when using a bisector regression technique. We compare our results to numerical simulations, where we fit simulated cluster samples using the same methods used on the XCS data. Our data favour models in which the majority of the excess entropy required to explain the slope of the L_X-T relation is injected at high redshift. Simulations in which AGN feedback is implemented using prescriptions from current semi-analytic galaxy formation models predict positive evolution of the normalisation, and differ from our data at more than 5 sigma. This suggests that more efficient feedback at high redshift may be needed in these models. Comment: Accepted for publication in MNRAS; 12 pages, 6 figures; added references to match published version
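
    The quoted best-fit relation can be evaluated directly; a minimal sketch using the central values only (E(z) below assumes a flat LambdaCDM cosmology with Omega_m = 0.3, which is an assumption of this sketch rather than something stated in the abstract, and the quoted uncertainties are ignored):

    ```python
    # Minimal sketch: evaluating the quoted best-fit L_X-T relation,
    #   E(z)^{-1} L_X = 10^{44.67} (T/5)^{3.04} (1+z)^{-1.5}  [erg/s, T in keV],
    # using central values only. Flat LambdaCDM with Omega_m = 0.3 is assumed here.
    import numpy as np

    def E(z, omega_m=0.3):
        """Dimensionless Hubble parameter H(z)/H0 for a flat universe."""
        return np.sqrt(omega_m * (1 + z) ** 3 + (1 - omega_m))

    def L_X(T_keV, z):
        """Predicted cluster X-ray luminosity [erg/s] from the central fit values."""
        return E(z) * 10 ** 44.67 * (T_keV / 5.0) ** 3.04 * (1 + z) ** -1.5

    for z in (0.1, 0.5, 1.0):
        print(f"T = 5 keV, z = {z}: L_X ~ {L_X(5.0, z):.2e} erg/s")
    ```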

    HIV Prevalence in a Gold Mining Camp in the Amazon Region, Guyana

    The prevalence of HIV infection among men in a gold mining camp in the Amazon region of Guyana was 6.5%. This high percentage of HIV infection provides a reservoir for the virus in this region, warranting immediate public health intervention to curb its spread. As malaria is endemic in the Amazon Basin (>30,000 cases/year), the impact of coinfection may be substantial.