6 research outputs found

    Generalization of polynomial chaos for estimation of angular random variables

    “The state of a dynamical system will rarely be known perfectly, requiring the variable elements in the state to become random variables. More accurate estimation of the uncertainty in the random variable results in a better understanding of how the random variable will behave at future points in time. Many methods exist for representing a random variable within a system including a polynomial chaos expansion (PCE), which expresses a random variable as a linear combination of basis polynomials. Polynomial chaos expansions have been studied at length for the joint estimation of states that are purely translational (i.e. described in Cartesian space); however, many dynamical systems also include non-translational states, such as angles. Many methods of quantifying the uncertainty in a random variable are not capable of representing angular random variables on the unit circle and instead rely on projections onto a tangent line. Any element of any space V can be quantified with a PCE if V is spanned by the expansion’s basis polynomials. This implies that, as long as basis polynomials span the unit circle, an angular random variable (either real or complex) can be quantified using a PCE. A generalization of the PCE is developed allowing for the representation of complex valued random variables, which includes complex representations of angles. Additionally, it is proposed that real valued polynomials that are orthogonal with respect to measures on the real valued unit circle can be used as basis polynomials in a chaos expansion, which reduces the additional numerical burden imposed by complex valued polynomials. Both complex and real unit circle PCEs are shown to accurately estimate angular random variables in independent and correlated multivariate dynamical systems”--Abstract, page iii
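    The abstract above describes a PCE as a linear combination of orthogonal basis polynomials in a standard random variable. As a purely illustrative sketch (a classical one-dimensional Hermite expansion, not the unit-circle generalization developed in the thesis), the snippet below builds a PCE of a lognormal variable and checks its mean and variance; the lognormal test case, truncation order, and quadrature size are assumptions made only for this example.

```python
# Minimal 1-D polynomial chaos sketch (illustrative only): approximate the
# lognormal random variable X = exp(xi), xi ~ N(0, 1), by a truncated series
#     X  ≈  sum_{i=0}^{p} c_i He_i(xi),
# where He_i are the probabilists' Hermite polynomials, orthogonal with
# respect to the standard normal density, and c_i = E[X He_i(xi)] / i!.
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as H

p = 6                                  # truncation order (assumed for illustration)
nodes, weights = H.hermegauss(40)      # Gauss-HermiteE quadrature rule
weights = weights / np.sqrt(2 * np.pi) # rescale so the weights integrate the N(0,1) density

x = np.exp(nodes)                      # the random variable evaluated at the quadrature nodes

# Projection onto each basis polynomial: c_i = E[X He_i(xi)] / E[He_i(xi)^2], E[He_i^2] = i!.
c = np.array([np.sum(weights * x * H.hermeval(nodes, [0] * i + [1])) / factorial(i)
              for i in range(p + 1)])

# Mean and variance recovered from the expansion vs. the exact lognormal values.
mean_pce = c[0]
var_pce = sum(factorial(i) * c[i] ** 2 for i in range(1, p + 1))
print(mean_pce, np.exp(0.5))           # exact mean:     e^{1/2}
print(var_pce, (np.e - 1) * np.e)      # exact variance: (e - 1) e
```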

    Mathematics & Statistics 2017 APR Self-Study & Documents

    UNM Mathematics & Statistics APR self-study report, review team report, response report, and initial action plan for Spring 2017, fulfilling the requirements of the Higher Learning Commission.

    Handbook of Mathematical Geosciences

    This Open Access handbook, published for the IAMG's 50th anniversary, presents a compilation of invited, path-breaking research contributions by award-winning geoscientists who have been instrumental in shaping the IAMG. It contains 45 chapters categorized broadly into five parts: (i) theory, (ii) general applications, (iii) exploration and resource estimation, (iv) reviews, and (v) reminiscences, covering related topics such as mathematical geosciences, mathematical morphology, geostatistics, fractals and multifractals, spatial statistics, multipoint geostatistics, compositional data analysis, informatics, geocomputation, numerical methods, and chaos theory in the geosciences.

    Better predictions when models are wrong or underspecified

    Many statistical methods rely on models of reality in order to learn from data and to make predictions about future data. By necessity, these models usually do not match reality exactly, but are either wrong (none of the hypotheses in the model provides an accurate description of reality) or underspecified (the hypotheses in the model describe only part of the data). In this thesis, we discuss three scenarios involving models that are wrong or underspecified. In each case, we find that standard statistical methods may fail, sometimes dramatically, and present different methods that continue to perform well even if the models are wrong or underspecified. The first two of these scenarios involve regression problems and investigate AIC (Akaike's Information Criterion) and Bayesian statistics. The third scenario has the famous Monty Hall problem as a special case, and considers the question of how we can update our belief about an unknown outcome given new evidence when the precise relation between outcome and evidence is unknown.
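    For context on the third scenario, the standard Monty Hall setup (the host always opens a door hiding a goat) can be checked with a short simulation. The sketch below is illustrative only and is not taken from the thesis, which concerns what happens when that precise relation between outcome and evidence is not known.

```python
# Illustrative Monte Carlo check of the standard Monty Hall problem:
# switching after the host opens a goat door wins with probability ~2/3,
# while staying with the original pick wins with probability ~1/3.
import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        doors = [0, 1, 2]
        car = random.choice(doors)
        pick = random.choice(doors)
        # The host opens a door that is neither the pick nor the car.
        opened = random.choice([d for d in doors if d != pick and d != car])
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in doors if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", play(switch=False))   # ~ 1/3
print("switch:", play(switch=True))    # ~ 2/3
```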
