
    Introductory Chapter: The Patient Presenting with Chest Pain


    China

    The mere notion of bankruptcy, liquidation or reorganisation of industrial enterprises was long considered anathema in the People's Republic of China (PRC or China), and directly contrary to the underlying logic of a centrally planned, state-owned economy and industrial system. The state's reluctance to allow bankruptcies was rooted in the ideology of the governing Communist Party but also reflects fiscal constraints with respect to payments to unemployed workers and the recapitalisation of state-owned commercial banks forced to write off loans as bad debts. However, such notions have gained wider acceptance concurrent with: China's ongoing transformation to a socialist market economy; reform of the 'state-owned enterprise' (SOE) system; corporatisation and the gradual removal of the state from control and ownership of enterprises; and the requirement that industrial enterprises survive as independent economic entities without government allocations of capital.

    Application of Bayesian Principles to the Evaluation of Coronary Artery Disease in the Modern Era

    The number of testing modalities available for the diagnosis of significant coronary artery disease has grown over the last few decades. Inappropriate utilization of these tests often leads to: (i) further investigation, (ii) physician and patient uncertainty, (iii) harm and poor outcomes, and (iv) increased health care costs. An informed approach to the evaluation of patients with stable ischemic chest pain can lead to efficient use of resources and better outcomes. Throughout the course of this chapter, we will explain how the application of age-old statistical principles is still relevant in this modern era of technological advancement.
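    The Bayesian principle underlying this approach can be sketched numerically: a test result updates a pre-test probability of disease into a post-test probability via the likelihood ratio. The sketch below uses illustrative sensitivity and specificity figures chosen for the example, not values taken from the chapter.

    ```python
    def post_test_probability(pre_test, sensitivity, specificity, positive=True):
        """Bayes' theorem in odds form: post-test odds = pre-test odds x LR."""
        lr = (sensitivity / (1 - specificity) if positive
              else (1 - sensitivity) / specificity)
        pre_odds = pre_test / (1 - pre_test)
        post_odds = pre_odds * lr
        return post_odds / (1 + post_odds)

    # Illustrative figures: a positive test with sensitivity 0.68 and
    # specificity 0.77 in a patient with a 10% pre-test probability.
    print(round(post_test_probability(0.10, 0.68, 0.77), 3))  # → 0.247
    ```

    The example shows why test selection must account for pre-test probability: the same positive result only raises a low-risk patient to about 25%, leaving substantial diagnostic uncertainty.
    
    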

    Targeting the Microbiota to Address Diet-Induced Obesity: A Time Dependent Challenge

    Links between the gut microbiota and host metabolism have provided new perspectives on obesity. We previously showed that the link between the microbiota and fat deposition is age- and time-dependent, subject to microbial adaptation to diet over time. We also demonstrated reduced weight gain in diet-induced obese (DIO) mice through manipulation of the gut microbiota with vancomycin or with the bacteriocin-producing probiotic Lactobacillus salivarius UCC118 (Bac+), with metabolic improvement achieved in DIO mice receiving vancomycin. However, two phases of weight gain were observed, with effects most marked early in the intervention phase. Here, we compare the gut microbial populations at the early relative to the late stages of intervention using a high-throughput sequencing-based analysis to understand the temporal relationship between the gut microbiota and obesity. This reveals several differences in microbiota composition over the intervening period. Vancomycin dramatically altered the gut microbiota composition, relative to controls, at the early stages of intervention, after which time some recovery was evident. It was also revealed that Bac+ treatment initially resulted in the presence of significantly higher proportions of Peptococcaceae and significantly lower proportions of Rikenellaceae and Porphyromonadaceae relative to the gut microbiota of L. salivarius UCC118 bacteriocin-negative (Bac-) administered controls. These differences were no longer evident at the later time point. The results highlight the resilience of the gut microbiota and suggest that interventions may need to be monitored and continually adjusted to ensure sustained modification of the gut microbiota. The authors are supported in part by Teagasc, Science Foundation Ireland (in the form of a research centre grant to the Alimentary Pharmabiotic Centre and PI awards to PWOT and PC) and by Alimentary Health Ltd.

    Adam through a Second-Order Lens

    Research into optimisation for deep learning is characterised by a tension between the computational efficiency of first-order, gradient-based methods (such as SGD and Adam) and the theoretical efficiency of second-order, curvature-based methods (such as quasi-Newton methods and K-FAC). We seek to combine the benefits of both approaches into a single computationally-efficient algorithm. Noting that second-order methods often depend on stabilising heuristics (such as Levenberg-Marquardt damping), we propose AdamQLR: an optimiser combining damping and learning rate selection techniques from K-FAC (Martens and Grosse, 2015) with the update directions proposed by Adam, inspired by considering Adam through a second-order lens. We evaluate AdamQLR on a range of regression and classification tasks at various scales, achieving competitive generalisation performance vs runtime.
    Comment: 28 pages, 15 figures, 4 tables. Submitted to ICLR 202
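    The "second-order lens" on Adam can be seen in the standard update itself: dividing by the square root of the second-moment estimate acts as a diagonal preconditioner, loosely analogous to a curvature matrix. The following is a minimal sketch of plain Adam (Kingma and Ba, 2015), not of the AdamQLR algorithm the abstract proposes, whose damping and learning-rate selection details are not given here.

    ```python
    import numpy as np

    def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
        """One Adam update; dividing by sqrt(v_hat) acts as a diagonal
        preconditioner, which motivates the second-order reading."""
        m = b1 * m + (1 - b1) * grad        # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * grad ** 2   # second-moment estimate
        m_hat = m / (1 - b1 ** t)           # bias corrections
        v_hat = v / (1 - b2 ** t)
        return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

    # Toy check: minimise f(theta) = theta^2 from theta = 1.
    theta, m, v = np.array([1.0]), np.zeros(1), np.zeros(1)
    for t in range(1, 501):
        theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.05)
    ```

    Because v_hat only captures squared gradients, not true curvature, methods like K-FAC replace this implicit preconditioner with an explicit curvature approximation, which is the gap AdamQLR aims to bridge.
    
    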

    Robust estimation: limit theorems and their applications

    This thesis is concerned with the asymptotic theory of general M-estimators and some minimal distance estimators. Particular attention is paid to uniform convergence theory, which is used to prove limit theorems for statistics that are usually implicitly defined as solutions of estimating equations. The thesis is divided into eight chapters and three main sections. In Section A the theory of convergence is studied as a prelude to validating the use of the particular M-estimators given in Sections B and C. Section B initially covers the view of robustness of Hampel (1968) but places more emphasis on the application of the notions of differentiability of functionals and on M-estimators of a general parameter that are robust against "tail" contamination. Sections A and B establish a base for a comparison of robustness and application aspects of minimal distance estimators, particularly with regard to their application to estimating mixtures of normal distributions. An important application of this is illustrated for the analysis of seismic data. This constitutes Section C. Chapter 1 is devoted to the study of uniform convergence theorems over classes of functions and sets, allowing also the possibility that the underlying probability mechanism may be from a specified family. A new Glivenko-Cantelli type theorem is proved which has applications later to weakening differentiability requirements for the convergence of loss functions used in this thesis. For implicitly defined estimators it is important to clearly identify the estimator. By uniform convergence, asymptotic uniqueness in regions of the parameter space of solutions to estimating equations can be established. This then justifies the selection of solutions through appropriate statistics, thus defining estimators uniquely for all samples. This comes under the discussion of existence and consistency in Chapter 2.
Chapter 3 includes central limit theorems and the law of the iterated logarithm for the general M-estimator, established under various conditions, both on the loss function and on the underlying distribution. Uniform convergence plays a central role in showing the validity of approximating expansions. Results are shown for both univariate and multivariate parameters. Arguments for the univariate parameter are often simpler or require weaker conditions. Our study of robustness is both of a theoretical and quantitative nature. Weak continuity and also Fréchet differentiability with respect to Prokhorov, Lévy and Kolmogorov distance functions are established for multivariate M-functionals under similar but necessarily stronger conditions than those required for asymptotic normality. Relationships can be shown between the conditions imposed on the class of loss functions in order to attain Fréchet differentiability and those necessary and sufficient conditions placed on classes of functions for which uniform convergence of measures holds. Much weaker conditions exist for almost sure uniform convergence, and this goes part way to explaining the restrictive nature of this functional derivative approach to showing asymptotic normality. In Chapter 5 the notion of a set of null influence is emphasized. This can be used to construct M-functionals robust (in terms of asymptotic bias and variance) against contamination in the "tails" of a distribution. This set can depend on the parameter being estimated, and in this sense the resulting estimator is adaptive. Its construction is illustrated in Chapter 6 for the estimation of scale. Robustness against "tail" contamination is illustrated by numerical comparison with other M-estimators. Particular applications are given to inference in the joint estimation of location and scale, where it is important to identify the root of the M-estimating equations. Techniques justified by uniform convergence are used here.
Uniform convergence also lends itself to the use of a graphical method of plotting "expectation curves". It can be used either for identifying the M-estimator from multiple solutions of the defining equations or, in large samples (e.g. > 50), as a visual indication of whether the fitted model is a good approximation for the underlying mechanism. Theorems based on uniform convergence are given that show a domain of convergence (in the numerical analysis sense) for the Newton-Raphson iteration method applied to M-estimating equations for the location parameter when redescending loss functions are used. The M-estimator theory provides a common framework whereby some minimal distance methods can be compared. Two established L₂ minimal distance estimators are shown to be general M-estimators. In particular, a Cramér-von Mises type distance estimator is shown to be qualitatively robust and to have good small sample properties. Its applicability to some new mixture data from geological recordings, which clearly requires robust methods of analysis, is demonstrated in Chapters 7 and 8.
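The Newton-Raphson approach to M-estimating equations described above can be sketched for the simplest case: a location M-estimator with the monotone Huber psi function, a known fixed scale, and the median as starting point. This is a generic illustration, not the thesis's construction; in particular it uses the non-redescending Huber psi, for which the root of the estimating equation is unique, sidestepping the multiple-root issue the thesis addresses for redescending losses.

```python
import numpy as np

def huber_location(x, c=1.345, scale=1.0, tol=1e-8, max_iter=100):
    """Newton-Raphson solution of sum psi((x_i - theta)/scale) = 0
    for the Huber psi; scale is assumed known and fixed here."""
    theta = np.median(x)                        # robust starting value
    for _ in range(max_iter):
        u = (x - theta) / scale
        psi = np.clip(u, -c, c)                 # Huber psi function
        dpsi = (np.abs(u) <= c).astype(float)   # psi', 0 outside [-c, c]
        step = scale * psi.sum() / max(dpsi.sum(), 1e-12)
        theta += step                           # Newton-Raphson update
        if abs(step) < tol:
            break
    return theta

# A gross outlier barely moves the estimate, unlike the sample mean.
print(huber_location(np.array([0., 1., 2., 3., 4., 100.])))
```

With a redescending psi in place of `np.clip`, the estimating equation can have several roots, which is exactly why the thesis's uniform-convergence arguments and expectation-curve plots are needed to select the right solution.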