
    Applications of Bregman Divergence Measures in Bayesian Modeling

    This dissertation focuses on the development of statistical theory, methodology, and applications from a Bayesian perspective using a general class of divergence measures (or loss functions) called Bregman divergences. Bregman divergences have played a key role in recent advances in machine learning; the goal here is to turn the spotlight on Bregman divergence and its applications in Bayesian modeling. Because the Bregman family includes many well-known loss functions, such as squared error loss, Kullback-Leibler divergence, Itakura-Saito distance, and Mahalanobis distance, the theoretical and methodological developments unify and extend many existing Bayesian methods. The combined generality of Bregman divergences and the Bayesian approach can handle diverse types of data, including circular, high-dimensional, multivariate, and functional data. Furthermore, the developed methods are flexible enough to be applied in various scientific fields, including biology, the physical sciences, and engineering.
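    As a minimal sketch of the unifying idea described above: a Bregman divergence is generated by any differentiable convex function φ via D_φ(x, y) = φ(x) − φ(y) − ⟨∇φ(y), x − y⟩, and different choices of φ recover the familiar losses the abstract lists. The generic helper and the two generators below are illustrative assumptions, not code from the dissertation.

    ```python
    import math

    def bregman(phi, grad_phi, x, y):
        """Bregman divergence D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
        inner = sum(g * (xi - yi) for g, xi, yi in zip(grad_phi(y), x, y))
        return phi(x) - phi(y) - inner

    # phi(x) = sum x_i^2 recovers squared Euclidean error.
    sq = lambda v: sum(t * t for t in v)
    sq_grad = lambda v: [2.0 * t for t in v]

    # phi(x) = sum x_i log x_i recovers (generalized) Kullback-Leibler divergence;
    # on probability vectors it equals KL(x || y).
    negent = lambda v: sum(t * math.log(t) for t in v)
    negent_grad = lambda v: [math.log(t) + 1.0 for t in v]

    x, y = [0.2, 0.8], [0.5, 0.5]
    print(bregman(sq, sq_grad, x, y))          # squared error: 0.09 + 0.09 = 0.18
    print(bregman(negent, negent_grad, x, y))  # KL(x || y) for these probability vectors
    ```

    The same helper yields Itakura-Saito distance with φ(x) = −Σ log xᵢ and Mahalanobis distance with φ(x) = xᵀAx for a positive-definite A, which is what makes a single Bayesian treatment of the whole family possible.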

    Worldwide Measles Vaccination: Trends, Outliers, and Projections
