
    Correlation effects in ionic crystals: I. The cohesive energy of MgO

    High-level quantum-chemical calculations, using the coupled-cluster approach and extended one-particle basis sets, have been performed for (Mg²⁺)ₙ(O²⁻)ₘ clusters embedded in a Madelung potential. The results of these calculations are used to set up an incremental expansion for the correlation energy of bulk MgO. In this way, 96% of the experimental cohesive energy of the MgO crystal is recovered. It is shown that only 60% of the correlation contribution to the cohesive energy is of intra-ionic origin, the remaining part being caused by van der Waals-like inter-ionic excitations. Comment: LaTeX, 20 pages, no figures
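
    The "incremental expansion" referred to here is the method of increments, which assembles the bulk correlation energy from correlation energies of small embedded fragments. A generic sketch of the expansion, with notation assumed for illustration rather than taken from the abstract:

        % Incremental (many-body) expansion of the bulk correlation energy.
        % \epsilon_A: correlation energy of the embedded ion A alone;
        % \Delta\epsilon_{AB} = \epsilon_{AB} - \epsilon_A - \epsilon_B, and so on
        % for higher orders; the series is truncated after a few terms.
        E_{\mathrm{corr}} = \sum_{A} \epsilon_A
                          + \sum_{A<B} \Delta\epsilon_{AB}
                          + \sum_{A<B<C} \Delta\epsilon_{ABC} + \cdots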

    Bayesian joint estimation of non-Gaussianity and the power spectrum

    We propose a rigorous, non-perturbative, Bayesian framework which enables one to jointly test Gaussianity and estimate the power spectrum of CMB anisotropies. It makes use of the Hilbert space of a harmonic oscillator to set up an exact likelihood function, dependent on the power spectrum and on a set of parameters α_i, which are zero for Gaussian processes. The latter can be expressed as series of cumulants; indeed, they perturbatively reduce to cumulants. However, they have the advantage that their variation is essentially unconstrained. Any truncation (i.e., a finite set of α_i) therefore still produces a proper distribution, something which cannot be said of the only other such tool on offer, the Edgeworth expansion. We apply our method to Very Small Array (VSA) simulations based on signal Gaussianity, showing that our algorithm is indeed not biased. Comment: 11 pages, 4 figures, submitted to MNRAS
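
    The point about the Edgeworth expansion can be made concrete: truncating it at finite order does not, in general, yield a proper (non-negative) density, which is the defect the α_i parametrization is claimed to avoid. A minimal numerical illustration of the Edgeworth failure, with a hypothetical skewness value:

        import numpy as np

        # First-order Edgeworth correction to a standard normal density:
        # f(x) = phi(x) * (1 + (gamma1 / 6) * He3(x)), with He3(x) = x**3 - 3*x.
        # For any nonzero skewness gamma1 the bracket turns negative somewhere,
        # so the truncated "density" is not a proper distribution.
        def edgeworth_density(x, gamma1):
            phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
            return phi * (1 + (gamma1 / 6) * (x**3 - 3 * x))

        x = np.linspace(-6, 6, 2001)
        f = edgeworth_density(x, gamma1=0.8)  # hypothetical skewness
        print("minimum of the truncated density:", f.min())  # negative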

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made possible the numerical evaluation of sophisticated statistical models, these models are still designed by humans because there is currently no known recipe or algorithm for dividing the design of a statistical model into a sequence of arithmetic operations. Indeed, enabling computers to think as humans do when faced with uncertainty is challenging in several major ways: (1) finding optimal statistical models has yet to be formulated as a well-posed problem when information on the system of interest is incomplete and comes in the form of a complex combination of sample data, partial knowledge of constitutive relations, and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite-dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification, and Information-Based Complexity. Comment: 37 pages
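
    The Wald connection invoked in the title treats estimation as a minimax game: choose the estimator whose worst-case risk over the admissible scenarios is smallest. A toy sketch of that selection rule on a finite grid; all numbers are hypothetical, and the paper's actual optimization spaces are infinite-dimensional:

        import numpy as np

        # Rows = candidate estimators/models, columns = admissible scenarios,
        # entries = risk (expected loss) of the estimator under the scenario.
        risk = np.array([
            [0.9, 0.2, 0.5],  # estimator 0
            [0.4, 0.6, 0.3],  # estimator 1
            [0.5, 0.5, 0.4],  # estimator 2
        ])

        worst_case = risk.max(axis=1)    # worst-case risk of each estimator
        best = int(worst_case.argmin())  # minimax-optimal choice
        print(f"minimax estimator: {best}, worst-case risk: {worst_case[best]:.2f}")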

    Tabu assisted guided local search approaches for freight service network design

    The service network design problem (SNDP) is a core problem in freight transportation. It involves the determination of the most cost-effective transportation network and the characteristics of the corresponding services, subject to various constraints. The scale of the problem in real-world applications is usually very large, especially when the network contains both the geographical information and the temporal constraints which are necessary for modelling multiple service classes and dynamic events. The development of time-efficient algorithms for this problem is therefore crucial for successful real-world applications. Earlier research indicated that guided local search (GLS) was a promising solution method for this problem. One of the advantages of GLS is that it makes use of both the information collected during the search and any special structures present in solutions. Building upon earlier research, this paper carries out an in-depth investigation into several mechanisms that could potentially speed up the GLS algorithm for the SNDP: a tabu list (as used in tabu search), short-term memory, and an aspiration criterion. An efficient hybrid algorithm for the SNDP is then proposed, based upon the results of these experiments. The algorithm combines a tabu list within a multi-start GLS approach, together with an efficient feasibility-repairing heuristic. Experimental tests on a set of 24 well-known service network design benchmark instances show that the proposed algorithm is superior to a previously proposed tabu search method, reducing the computation time by over a third. In addition, we show that far better results can be obtained when a faster linear programming solver is adopted for the sub-problem solution. The contribution of this paper is an efficient algorithm, along with detailed analyses of the mechanisms which help to increase the speed of the GLS algorithm for the SNDP.
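
    A minimal sketch of how a guided local search loop and a tabu list can be combined, in the spirit the abstract describes; the toy problem, features, and parameters below are hypothetical and not the paper's actual SNDP model:

        import random

        # Toy problem: open at least K of N "services", minimizing total cost.
        # GLS penalizes costly features of local optima; the tabu list blocks
        # immediate reversal of recent moves (a hypothetical sketch).
        random.seed(0)
        N, K = 20, 8
        w = [random.uniform(1.0, 10.0) for _ in range(N)]  # service costs

        def cost(x):  # true objective with an infeasibility penalty
            return sum(wi for wi, xi in zip(w, x) if xi) + 100.0 * max(0, K - sum(x))

        def aug(x, pen, lam):  # GLS-augmented objective
            return cost(x) + lam * sum(p for p, xi in zip(pen, x) if xi)

        def gls(iters=300, lam=1.5, tenure=5):
            x, pen, tabu = [1] * N, [0] * N, [0] * N
            best_c = cost(x)
            for it in range(iters):
                cand = [i for i in range(N) if tabu[i] <= it]  # non-tabu moves
                def flip_cost(i):
                    y = x[:]
                    y[i] ^= 1
                    return aug(y, pen, lam)
                i = min(cand, key=flip_cost)      # steepest single-bit flip
                improving = flip_cost(i) < aug(x, pen, lam)
                x[i] ^= 1
                tabu[i] = it + tenure             # forbid flipping i back
                best_c = min(best_c, cost(x))
                if not improving:                 # local optimum reached:
                    # penalize the open feature with maximal GLS utility
                    util = [wi / (1 + p) if xi else 0.0
                            for wi, p, xi in zip(w, pen, x)]
                    if max(util) > 0:
                        pen[util.index(max(util))] += 1
            return best_c

        print(f"best cost found: {gls():.2f}")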

    Genuine Correlations of Like-Sign Particles in Hadronic Z0 Decays

    Correlations among hadrons with the same electric charge produced in Z0 decays are studied using the high-statistics data collected from 1991 through 1995 with the OPAL detector at LEP. Normalized factorial cumulants up to fourth order are used to measure genuine particle correlations as a function of the size of phase-space domains in rapidity, azimuthal angle, and transverse momentum. Both all-charge and like-sign particle combinations show strong positive genuine correlations. One-dimensional cumulants initially increase rapidly with decreasing size of the phase-space cells but saturate quickly. In contrast, cumulants in two- and three-dimensional domains continue to increase. The strong rise of the cumulants for all-charge multiplets is increasingly driven by that of like-sign multiplets. This points to the likely influence of Bose-Einstein correlations. Some of the recently proposed algorithms to simulate Bose-Einstein effects, implemented in the Monte Carlo model PYTHIA, are found to reproduce reasonably well the measured second- and higher-order correlations between particles with the same charge, as well as those in all-charge particle multiplets. Comment: 26 pages, 6 figures, submitted to Phys. Lett.
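
    For reference, the normalized factorial moments and the cumulant combinations conventionally used in such analyses, up to fourth order (standard definitions, not quoted from the paper):

        % Normalized factorial moments of the multiplicity n in a phase-space
        % cell, and the genuine-correlation (factorial cumulant) combinations.
        F_q = \frac{\langle n(n-1)\cdots(n-q+1) \rangle}{\langle n \rangle^{q}}, \qquad
        K_2 = F_2 - 1, \quad
        K_3 = F_3 - 3F_2 + 2, \quad
        K_4 = F_4 - 4F_3 - 3F_2^{2} + 12F_2 - 6 .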

    An Integrated TCGA Pan-Cancer Clinical Data Resource to Drive High-Quality Survival Outcome Analytics

    For a decade, The Cancer Genome Atlas (TCGA) program collected clinicopathologic annotation data along with multi-platform molecular profiles of more than 11,000 human tumors across 33 different cancer types. TCGA clinical data contain key features reflecting the democratized nature of the data collection process. To ensure proper use of this large clinical dataset associated with genomic features, we developed a standardized dataset named the TCGA Pan-Cancer Clinical Data Resource (TCGA-CDR), which includes four major clinical outcome endpoints. In addition to detailing major challenges and statistical limitations encountered during the effort of integrating the acquired clinical data, we present a summary that includes endpoint usage recommendations for each cancer type. These TCGA-CDR findings appear to be consistent with cancer genomics studies independent of the TCGA effort and provide opportunities for investigating cancer biology using clinical correlates at an unprecedented scale. Analysis of clinicopathologic annotations for over 11,000 cancer patients in the TCGA program led to the generation of the TCGA Clinical Data Resource, which provides recommendations on clinical outcome endpoint usage for the 33 cancer types.
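
    A minimal sketch of how such a resource is typically consumed for outcome analysis; the file name, column names, and endpoint labels below are assumptions for illustration, not the actual TCGA-CDR schema:

        import pandas as pd
        from lifelines import KaplanMeierFitter  # pip install lifelines

        # Hypothetical columns: "type" (cancer type), "OS" (1 = death event,
        # 0 = censored), "OS.time" (follow-up time); the real table may differ.
        df = pd.read_excel("TCGA-CDR.xlsx")
        cohort = df[df["type"] == "LUAD"].dropna(subset=["OS.time", "OS"])

        km = KaplanMeierFitter()
        km.fit(durations=cohort["OS.time"], event_observed=cohort["OS"],
               label="LUAD overall survival")
        print(km.median_survival_time_)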

    Innovating clinical trials for amyotrophic lateral sclerosis: challenging the established order

    Development of effective treatments for amyotrophic lateral sclerosis (ALS) has been hampered by disease heterogeneity, a limited understanding of the underlying pathophysiology, and methodologic design challenges. We have evaluated two major themes in the design of pivotal, phase 3 clinical trials for ALS, namely (1) patient selection and (2) analytical strategy, and discussed potential solutions with the European Medicines Agency. Several design considerations were assessed using data from 5 placebo-controlled clinical trials (n = 988), 4 population-based cohorts (n = 5,100), and 2,436 placebo-allocated patients from the Pooled Resource Open-Access ALS Clinical Trials (PRO-ACT) database. The validity of each proposed design modification was confirmed by means of simulation and illustrated for a hypothetical setting. Compared with a classical trial design, the proposed design modifications reduce the sample size by 30.5% and the placebo exposure time by 35.4%. By making use of prognostic survival models, one can include a larger proportion of the population and maximize generalizability. We propose a flexible design framework that naturally adapts the trial duration when inaccurate assumptions are made at the design stage, such as the enrollment or survival rate. In case of futility, the follow-up time is shortened and patient exposure to ineffective treatments or placebo is minimized. For diseases such as ALS, optimizing the use of resources, widening eligibility criteria, and minimizing exposure to futile treatments and placebo are critical to the development of effective treatments. Our proposed design modifications could circumvent important pitfalls and may serve as a blueprint for future clinical trials in this population.
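
    To make the sample-size arithmetic concrete, a standard back-of-the-envelope tool for survival endpoints is Schoenfeld's events formula; the numbers below are generic illustrations, not the paper's actual simulations:

        import math
        from scipy.stats import norm

        def required_events(hr, alpha=0.05, power=0.8, alloc=0.5):
            """Schoenfeld's formula: events needed to detect hazard ratio `hr`
            at two-sided level `alpha` with the given power and allocation."""
            za = norm.ppf(1 - alpha / 2)
            zb = norm.ppf(power)
            return (za + zb) ** 2 / (alloc * (1 - alloc) * math.log(hr) ** 2)

        print(f"HR 0.70: {required_events(0.70):.0f} events")
        print(f"HR 0.80: {required_events(0.80):.0f} events")  # smaller effects need far more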

    Comprehensive analysis of epigenetic clocks reveals associations between disproportionate biological ageing and hippocampal volume

    The concept of age acceleration, the difference between biological age and chronological age, is of growing interest, particularly with respect to age-related disorders such as Alzheimer's disease (AD). Whilst studies have reported associations with AD risk and related phenotypes, there remains a lack of consensus on these associations. Here we aimed to comprehensively investigate the relationship between five recognised measures of age acceleration, based on DNA methylation patterns (DNAm age), and cross-sectional and longitudinal cognition and AD-related neuroimaging phenotypes (volumetric MRI and amyloid-β PET) in the Australian Imaging, Biomarkers and Lifestyle (AIBL) study and the Alzheimer's Disease Neuroimaging Initiative (ADNI). Significant associations were observed between age acceleration using the Hannum epigenetic clock and cross-sectional hippocampal volume in AIBL, and these were replicated in ADNI. In AIBL, several other findings were observed cross-sectionally, including a significant association between hippocampal volume and the Hannum and PhenoAge epigenetic clocks. Further, significant associations were also observed between hippocampal volume and the Zhang and PhenoAge epigenetic clocks within amyloid-β-positive individuals. However, these were not validated within the ADNI cohort. No associations between age acceleration and other AD-related phenotypes, including measures of cognition or brain amyloid-β burden, were observed, and there was no association with longitudinal change in any phenotype. This study presents a link between age acceleration, as determined using DNA methylation, and hippocampal volume that was statistically significant across two highly characterised cohorts. The results presented in this study contribute to a growing literature that supports the role of epigenetic modifications in ageing and AD-related phenotypes.
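
    Age acceleration in this literature is usually computed as the residual of DNAm-predicted age regressed on chronological age. A minimal sketch of that step and a downstream association test, on synthetic data with hypothetical effect sizes:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 500
        age = rng.uniform(55, 90, n)                      # chronological age
        dnam_age = age + rng.normal(0, 4, n)              # DNAm-predicted age
        hippo = 4.0 - 0.02 * age + rng.normal(0, 0.3, n)  # synthetic volumes

        # Age acceleration = residual of DNAm age on chronological age.
        accel = sm.OLS(dnam_age, sm.add_constant(age)).fit().resid

        # Association of hippocampal volume with age acceleration,
        # adjusting for chronological age.
        X = sm.add_constant(np.column_stack([accel, age]))
        fit = sm.OLS(hippo, X).fit()
        print(fit.params)      # intercept, age-acceleration effect, age effect
        print(fit.pvalues[1])  # p-value for the age-acceleration term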