130 research outputs found

    Data based identification and prediction of nonlinear and complex dynamical systems

    Get PDF
    We thank Dr. R. Yang (formerly at ASU), Dr. R.-Q. Su (formerly at ASU), and Mr. Zhesi Shen for their contributions to a number of original papers on which this Review is partly based. This work was supported by ARO under Grant No. W911NF-14-1-0504. W.-X. Wang was also supported by NSFC under Grants No. 61573064 and No. 61074116, as well as by the Fundamental Research Funds for the Central Universities and the Beijing Nova Programme. Peer reviewed. Postprint.

    From Dark Matter to the Earth's Deep Interior: There and Back Again

    Get PDF
    This thesis is a two-way transfer of knowledge between cosmology and seismology, aiming to substantially advance imaging methods and uncertainty quantification in both fields. I develop a method using wavelets to simulate the uncertainty in a set of existing global seismic tomography images to assess the robustness of mantle plume-like structures. Several plumes are identified, including one that is rarely discussed in the seismological literature. I present a new classification of the most likely deep mantle plumes from my automated method, potentially resolving past discrepancies between deep mantle plumes inferred by visual analysis of tomography models and other geophysical data. Following on from this, I create new images of the uppermost mantle and their associated uncertainties using a sparsity-promoting wavelet prior and an advanced probabilistic inversion scheme. These new images exhibit the expected tectonic features, such as plate boundaries and continental cratons. Importantly, the uncertainties obtained are physically reasonable and informative, in that they reflect the heterogeneous data distribution and also highlight artefacts due to an incomplete forward model. These inversions are a first step towards building a fully probabilistic upper-mantle model in a sparse wavelet basis. I then apply the same advanced probabilistic method to the problem of full-sky cosmological mass-mapping. However, this is severely limited by the computational complexity of high-resolution spherical harmonic transforms. In response, I use, for the first time in cosmology, a trans-dimensional algorithm to build galaxy cluster-scale mass-maps. This new approach performs better than the standard mass-mapping method, with the added benefit that uncertainties are naturally recovered. With more accurate mass-maps and uncertainties, this method will be a valuable tool for cosmological inference with the new high-resolution data expected from upcoming galaxy surveys, potentially providing new insights into the interactions of dark matter particles in colliding galaxy cluster systems.
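
    The sparsity-promoting wavelet prior can be illustrated with a toy linear inversion. The sketch below is not the thesis code: it substitutes a simple iterative soft-thresholding (ISTA) loop for the advanced probabilistic scheme, and the forward operator, data, and regularization weight are all invented for illustration.

```python
# Minimal sketch: linear inversion with a sparsity-promoting wavelet prior,
# solved by ISTA (gradient step on the data misfit + wavelet soft-thresholding).
# The operator A, data y, and lambda are illustrative assumptions.
import numpy as np
import pywt

rng = np.random.default_rng(0)
n = 256
A = rng.normal(size=(128, n)) / np.sqrt(n)    # toy forward operator
m_true = np.zeros(n)
m_true[60:70] = 1.0                           # blocky "slowness anomaly"
y = A @ m_true + 0.01 * rng.normal(size=128)  # noisy synthetic data

lam = 0.01                                    # sparsity weight (assumed)
step = 1.0 / np.linalg.norm(A, 2) ** 2        # ISTA step size
m = np.zeros(n)
for _ in range(200):
    g = A.T @ (A @ m - y)                     # gradient of the data misfit
    coeffs = pywt.wavedec(m - step * g, 'db4', level=4)
    coeffs = [pywt.threshold(c, lam * step, mode='soft') for c in coeffs]
    m = pywt.waverec(coeffs, 'db4')[:n]       # proximal step = wavelet shrinkage
```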

    Machine learning in solar physics

    Full text link
    The application of machine learning in solar physics has the potential to greatly enhance our understanding of the complex processes that take place in the atmosphere of the Sun. By using techniques such as deep learning, we are now in a position to analyze large amounts of data from solar observations and identify patterns and trends that may not have been apparent using traditional methods. This can help us improve our understanding of explosive events like solar flares, which can have a strong effect on the Earth's environment; predicting such hazardous events is crucial for our technological society. Machine learning can also improve our understanding of the inner workings of the Sun itself by allowing us to go deeper into the data and to propose more complex models to explain them. Additionally, machine learning can help to automate the analysis of solar data, reducing the need for manual labor and increasing the efficiency of research in this field. Comment: 100 pages, 13 figures, 286 references, accepted for publication as a Living Review in Solar Physics (LRSP).
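
    As a concrete, hypothetical illustration of the kind of deep-learning pipeline such a review surveys, the sketch below defines a small convolutional network mapping magnetogram-like image cutouts to a flare/no-flare probability. The architecture, input shapes, and data are illustrative assumptions, not any specific model from the review.

```python
# Toy PyTorch CNN for binary flare forecasting from 64x64 image cutouts.
# All layer sizes and the fake input batch are illustrative choices.
import torch
import torch.nn as nn

class FlareCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 16 * 16, 1))

    def forward(self, x):                # x: (batch, 1, 64, 64) cutouts
        return torch.sigmoid(self.head(self.features(x)))

model = FlareCNN()
x = torch.randn(8, 1, 64, 64)            # fake batch of magnetogram cutouts
p_flare = model(x)                       # per-image flare probability
```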

    Approximate inference methods in probabilistic machine learning and Bayesian statistics

    Get PDF
    This thesis develops new methods for efficient approximate inference in probabilistic models. Such models are routinely used in different fields, yet they remain computationally challenging as they involve high-dimensional integrals. We propose different approximate inference approaches addressing some challenges in probabilistic machine learning and Bayesian statistics. First, we present a Bayesian framework for genome-wide inference of DNA methylation levels and devise an efficient particle filtering and smoothing algorithm that can be used to identify differentially methylated regions between case and control groups. Second, we present a scalable inference approach for state space models by combining variational methods with sequential Monte Carlo sampling. The method is applied to self-exciting point process models that allow for flexible dynamics in the latent intensity function. Third, a new variational density motivated by copulas is developed. This new variational family can be beneficial compared with Gaussian approximations, as illustrated on examples with Bayesian neural networks. Lastly, we make progress on gradient-based adaptation of Hamiltonian Monte Carlo samplers by maximizing an approximation of the proposal entropy.
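
    To make the sequential Monte Carlo ingredient concrete, here is a minimal bootstrap particle filter on an assumed linear-Gaussian state space model; the thesis's algorithms (particle smoothing, variational SMC) build on this same propagate-weight-resample loop. All model parameters below are illustrative.

```python
# Bootstrap particle filter sketch for x_t = 0.9 x_{t-1} + v_t, y_t = x_t + e_t.
# The AR coefficient and noise scales are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(1)
T, N = 100, 500                                    # time steps, particles
x_true = np.zeros(T)
for t in range(1, T):                              # latent AR(1) state
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal(scale=0.5)
y = x_true + rng.normal(scale=1.0, size=T)         # noisy observations

particles = rng.normal(size=N)
filt_mean = np.zeros(T)
for t in range(T):
    particles = 0.9 * particles + rng.normal(scale=0.5, size=N)  # propagate
    logw = -0.5 * (y[t] - particles) ** 2          # Gaussian log-likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()                                   # normalized importance weights
    filt_mean[t] = w @ particles                   # filtering posterior mean
    particles = particles[rng.choice(N, size=N, p=w)]  # multinomial resampling
```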

    Sparse Approximation via Penalty Decomposition Methods

    Full text link
    In this paper we consider sparse approximation problems, that is, general $\ell_0$ minimization problems in which the $\ell_0$-"norm" of a vector appears in the constraints or in the objective function. In particular, we first study the first-order optimality conditions for these problems. We then propose penalty decomposition (PD) methods for solving them, in which a sequence of penalty subproblems is solved by a block coordinate descent (BCD) method. Under suitable assumptions, we establish that any accumulation point of the sequence generated by the PD methods satisfies the first-order optimality conditions of the problems. Furthermore, for problems in which the $\ell_0$ part is the only nonconvex part, we show that such an accumulation point is a local minimizer of the problems. In addition, we show that any accumulation point of the sequence generated by the BCD method is a saddle point of the penalty subproblem. Moreover, for problems in which the $\ell_0$ part is the only nonconvex part, we establish that such an accumulation point is a local minimizer of the penalty subproblem. Finally, we test the performance of our PD methods by applying them to sparse logistic regression, sparse inverse covariance selection, and compressed sensing problems. The computational results demonstrate that our methods generally outperform existing methods in terms of solution quality and/or speed. Comment: 31 pages, 3 figures and 9 tables. arXiv admin note: substantial text overlap with arXiv:1008.537
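
    The following sketch illustrates the penalty decomposition idea on a compressed sensing instance, minimize $\|Ax - b\|^2$ subject to $\|x\|_0 \le k$: the constraint is moved into a quadratic penalty $\rho\|x - y\|^2$, and each penalty subproblem is minimized by BCD, alternating a least-squares x-step with a hard-thresholding y-step. Problem sizes and the $\rho$ schedule are illustrative choices, not the paper's.

```python
# Penalty decomposition sketch for min ||Ax-b||^2 s.t. ||x||_0 <= k.
# Outer loop increases the penalty rho; inner BCD alternates over (x, y).
import numpy as np

rng = np.random.default_rng(2)
m, n, k = 60, 200, 8
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
b = A @ x_true                                 # noiseless synthetic measurements

x = np.zeros(n); y = np.zeros(n); rho = 1.0
for _ in range(30):                            # outer loop: tighten the penalty
    for _ in range(50):                        # inner BCD on the subproblem
        # x-block: minimize ||Ax-b||^2 + rho ||x-y||^2 (regularized least squares)
        x = np.linalg.solve(A.T @ A + rho * np.eye(n), A.T @ b + rho * y)
        # y-block: project x onto {||y||_0 <= k} by keeping its k largest entries
        y = np.zeros(n)
        idx = np.argsort(np.abs(x))[-k:]
        y[idx] = x[idx]
    rho *= 1.5
```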

    Bayesian large-scale structure inference and cosmic web analysis

    Full text link
    Surveys of the cosmic large-scale structure carry opportunities for building and testing cosmological theories about the origin and evolution of the Universe. This endeavor requires appropriate data assimilation tools, for establishing the contact between survey catalogs and models of structure formation. In this thesis, we present an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the cosmic web: the BORG algorithm infers the primordial density fluctuations and produces physical reconstructions of the dark matter distribution that underlies observed galaxies, by assimilating the survey data into a cosmological structure formation model. The method, based on Bayesian probability theory, provides accurate means of uncertainty quantification. We demonstrate the application of BORG to the Sloan Digital Sky Survey data and describe the primordial and late-time large-scale structure in the observed volume. We show how the approach has led to the first quantitative inference of the cosmological initial conditions and of the formation history of the observed structures. We then use these results for several cosmographic projects aiming at analyzing and classifying the large-scale structure. In particular, we build an enhanced catalog of cosmic voids probed at the level of the dark matter distribution, deeper than with the galaxies. We present detailed probabilistic maps of the dynamic cosmic web, and offer a general solution to the problem of classifying structures in the presence of uncertainty. The results described in this thesis constitute accurate chrono-cosmography of the inhomogeneous cosmic structure. Comment: 237 pages, 63 figures, 14 tables. PhD thesis, Institut d'Astrophysique de Paris, September 2015 (advisor: B. Wandelt). Contains the papers arXiv:1305.4642, arXiv:1409.6308, arXiv:1410.0355, arXiv:1502.02690, arXiv:1503.00730, arXiv:1507.08664 and draws from arXiv:1403.1260. Full version including high-resolution figures available from the author's website.
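
    BORG explores its posterior with Hamiltonian Monte Carlo sampling coupled to a structure formation model; as a heavily simplified, assumed stand-in, the sketch below runs HMC on the posterior of a density vector $\delta$ given data $d = G\delta + \text{noise}$, with a Gaussian prior and a diagonal "growth" operator $G$. Everything here (operator, noise level, step sizes) is an illustrative toy, not the BORG implementation.

```python
# Toy HMC sketch: sample delta | data under a Gaussian prior and a linear
# Gaussian likelihood. G, sigma, eps, and L are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n = 50
G = np.diag(1.0 + 0.5 * rng.random(n))        # stand-in linear "growth" operator
delta_true = rng.normal(size=n)
data = G @ delta_true + 0.3 * rng.normal(size=n)
sigma2 = 0.3 ** 2

def U(delta):                                  # negative log-posterior (up to const.)
    r = G @ delta - data
    return 0.5 * delta @ delta + 0.5 * r @ r / sigma2

def gradU(delta):
    return delta + G.T @ (G @ delta - data) / sigma2

delta = np.zeros(n)
samples = []
eps, L = 0.05, 20                              # leapfrog step size and path length
for _ in range(1000):
    p = rng.normal(size=n)                     # resample auxiliary momenta
    delta0, H0 = delta.copy(), U(delta) + 0.5 * p @ p
    for _ in range(L):                         # leapfrog integration
        p = p - 0.5 * eps * gradU(delta)
        delta = delta + eps * p
        p = p - 0.5 * eps * gradU(delta)
    if rng.random() >= np.exp(min(0.0, H0 - U(delta) - 0.5 * p @ p)):
        delta = delta0                         # Metropolis reject: keep old state
    samples.append(delta.copy())
```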

    Gaining Insight into Determinants of Physical Activity using Bayesian Network Learning

    Get PDF
    Contains full text: 228326pre.pdf (preprint version, Open Access) and 228326pub.pdf (publisher's version, Open Access). BNAIC/BeneLearn 202