    Probabilistic time series forecasts with autoregressive transformation models

    Probabilistic forecasting of time series is important in many applications and research fields. To draw valid conclusions from a probabilistic forecast, the model class used to approximate the true forecasting distribution must be expressive enough. Yet characteristics of the model itself, such as its uncertainty or its feature-outcome relationship, are no less important. This paper proposes Autoregressive Transformation Models (ATMs), a model class inspired by various research directions that unites expressive distributional forecasts, based on a semi-parametric distribution assumption, with an interpretable model specification. We demonstrate the properties of ATMs both theoretically and through empirical evaluation on several simulated and real-world forecasting datasets.
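
    As a rough sketch of the transformation-model idea behind ATMs (illustrative notation, not necessarily the paper's), the conditional forecasting distribution is obtained by mapping the outcome through a transformation that is monotone in y, and that may depend on lagged outcomes and covariates, into a fixed reference distribution:

        % Illustrative conditional transformation model with autoregressive lags;
        % F_Z is a fixed reference CDF (e.g., standard logistic) and h is strictly
        % increasing in y. The notation is an assumption, not taken from the paper.
        \[
          F_{Y_t \mid \mathcal{F}_{t-1}}(y)
            = F_Z\bigl( h(y \mid y_{t-1}, \dots, y_{t-p}, \mathbf{x}_t) \bigr),
          \qquad \partial h / \partial y > 0 .
        \]

    A semi-parametric choice for h in y (e.g., a flexible polynomial basis) keeps the distributional shape adaptable, while the dependence on lags and features remains an explicit, interpretable part of the model specification.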

    What drives inflation and how? Evidence from additive mixed models selected by cAIC

    We analyze the forces that explain inflation using a panel of 122 countries from 1997 to 2015 with 37 regressors. Ninety-eight models motivated by economic theory are compared to a boosting algorithm; non-linearities and structural breaks are taken into account. We show that typical estimation methods are likely to lead to fallacious policy conclusions, which motivates the new approach we propose in this paper. The boosting algorithm outperforms the theory-based models. Furthermore, we extend the current software implementation of the conditional Akaike Information Criterion for additive mixed models to support observation weights. We present a novel two-step selection process, suitable for a wide range of applications, that enables an empirical comparison of theory- and data-driven models under varying data availability.
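
    The following is a simplified, hypothetical illustration of such a comparison between a theory-motivated specification and a data-driven learner on the same panel; it is not the paper's cAIC-based two-step procedure, and the column and variable names are placeholders:

        # Hedged sketch: score a theory-based linear specification against a
        # data-driven gradient-boosting model with cross-validated MSE as a
        # common yardstick. Not the paper's method, only an illustration.
        import pandas as pd
        from sklearn.ensemble import GradientBoostingRegressor
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score

        def compare_models(df: pd.DataFrame, theory_vars, all_vars, target="inflation"):
            y = df[target]
            scores = {}
            # Theory-driven model: small, pre-selected regressor set.
            scores["theory_ols"] = cross_val_score(
                LinearRegression(), df[theory_vars], y,
                cv=5, scoring="neg_mean_squared_error").mean()
            # Data-driven model: boosting over the full regressor set.
            scores["boosting"] = cross_val_score(
                GradientBoostingRegressor(random_state=0), df[all_vars], y,
                cv=5, scoring="neg_mean_squared_error").mean()
            return scores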

    Complete devil's staircase and crystal--superfluid transitions in a dipolar XXZ spin chain: A trapped ion quantum simulation

    Systems with long-range interactions show a variety of intriguing properties: they typically accommodate many metastable states, they can give rise to the spontaneous formation of supersolids, and they can lead to counterintuitive thermodynamic behavior. However, the increased complexity that comes with long-range interactions strongly hinders theoretical studies. This makes a quantum simulator for long-range models highly desirable. Here, we show that a chain of trapped ions can be used to quantum-simulate a one-dimensional model of hard-core bosons with dipolar off-site interaction and tunneling, equivalent to a dipolar XXZ spin-1/2 chain. We explore the rich phase diagram of this model in detail, employing perturbative mean-field theory, exact diagonalization, and quasi-exact numerical techniques (density-matrix renormalization group and infinite time-evolving block decimation). We find that the complete devil's staircase -- an infinite sequence of crystal states existing at vanishing tunneling -- spreads into a succession of lobes similar to the Mott lobes found in Bose--Hubbard models. Investigating the melting of these crystal states at increased tunneling, we do not find (contrary to similar two-dimensional models) clear indications of supersolid behavior in the region around the melting transition. However, we find that inside the insulating lobes there are quasi-long-range (algebraic) correlations, as opposed to models with nearest-neighbor tunneling, which show exponential decay of correlations.
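
    For a small chain, the dipolar XXZ model can be written down and diagonalized directly. The sketch below is an illustrative exact-diagonalization example, not the paper's code: it assumes all couplings decay exactly as 1/r^3 (in trapped-ion experiments the decay exponent is only approximately dipolar and is tunable) and uses an open chain of 8 sites.

        # Minimal exact-diagonalization sketch of a dipolar XXZ spin-1/2 chain:
        # H = sum_{i<j} J/|i-j|^3 [ S^x_i S^x_j + S^y_i S^y_j + Delta S^z_i S^z_j ].
        import numpy as np

        def dipolar_xxz_hamiltonian(n_sites, J=1.0, delta=1.0):
            sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
            sy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
            sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)
            ident = np.eye(2, dtype=complex)

            def site_op(op, i):
                # Kronecker product placing `op` on site i of the chain.
                mats = [ident] * n_sites
                mats[i] = op
                out = mats[0]
                for m in mats[1:]:
                    out = np.kron(out, m)
                return out

            dim = 2 ** n_sites
            H = np.zeros((dim, dim), dtype=complex)
            for i in range(n_sites):
                for j in range(i + 1, n_sites):
                    coupling = J / abs(i - j) ** 3  # dipolar 1/r^3 decay
                    H += coupling * (site_op(sx, i) @ site_op(sx, j)
                                     + site_op(sy, i) @ site_op(sy, j)
                                     + delta * site_op(sz, i) @ site_op(sz, j))
            return H

        if __name__ == "__main__":
            H = dipolar_xxz_hamiltonian(n_sites=8, J=1.0, delta=2.0)
            energies, _ = np.linalg.eigh(H)
            print("ground-state energy per site:", energies[0].real / 8)

    Exact diagonalization of this kind only reaches short chains; the quasi-exact methods cited in the abstract (DMRG, iTEBD) are what make the large-system phase diagram accessible.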

    deepregression: A Flexible Neural Network Framework for Semi-Structured Deep Distributional Regression

    In this paper we describe the implementation of semi-structured deep distributional regression, a flexible framework for learning conditional distributions based on the combination of additive regression models and deep networks. Our implementation encompasses (1) a modular neural network building system based on the deep learning library TensorFlow for fusing various statistical and deep learning approaches, (2) an orthogonalization cell that allows for an interpretable combination of different subnetworks, and (3) the pre-processing steps necessary to set up such models. The software package allows models to be defined in a user-friendly manner via a formula interface inspired by classical statistical modeling frameworks such as mgcv. The package's modular design and functionality provide a unique resource both for the scalable estimation of complex statistical models and for combining approaches from deep learning and statistics. This allows for state-of-the-art predictive performance while retaining the indispensable interpretability of classical statistical models.
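
    The following is a conceptual Python/Keras sketch of the semi-structured idea only: an interpretable structured predictor and a deep network jointly parameterize an outcome distribution. It is not the deepregression package's formula interface, it omits the orthogonalization cell, and the input names are assumptions made for illustration.

        # Conceptual sketch of semi-structured distributional regression:
        # the Normal mean is the sum of a linear (structured) part and an MLP
        # (deep) part; the scale is driven by the structured inputs only.
        import tensorflow as tf
        import tensorflow_probability as tfp

        tfd = tfp.distributions

        def build_semi_structured_model(p_structured, p_deep):
            x_str = tf.keras.Input(shape=(p_structured,), name="x_structured")
            x_deep = tf.keras.Input(shape=(p_deep,), name="x_deep")

            # Structured part: interpretable linear predictor for the mean.
            eta_str = tf.keras.layers.Dense(1, name="structured_mean")(x_str)

            # Deep part: a small MLP capturing residual non-linear signal.
            h = tf.keras.layers.Dense(32, activation="relu")(x_deep)
            h = tf.keras.layers.Dense(16, activation="relu")(h)
            eta_deep = tf.keras.layers.Dense(1, name="deep_mean")(h)

            mean = tf.keras.layers.Add(name="mean")([eta_str, eta_deep])
            log_scale = tf.keras.layers.Dense(1, name="log_scale")(x_str)
            params = tf.keras.layers.Concatenate()([mean, log_scale])

            # Output a full distribution and train by negative log-likelihood.
            dist = tfp.layers.DistributionLambda(
                lambda t: tfd.Normal(loc=t[..., :1],
                                     scale=tf.math.softplus(t[..., 1:])))(params)

            model = tf.keras.Model(inputs=[x_str, x_deep], outputs=dist)
            model.compile(optimizer="adam", loss=lambda y, rv: -rv.log_prob(y))
            return model

    Training then proceeds as usual, e.g. model.fit([X_structured, X_deep], y); the structured coefficients stay directly readable while the deep part absorbs non-linear effects.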