
    The Parameter Houlihan: a solution to high-throughput identifiability indeterminacy for brutally ill-posed problems

    One way to inject knowledge into clinically impactful forecasting is to use data assimilation, a nonlinear regression that projects data onto a mechanistic physiologic model rather than onto a generic set of functions such as neural networks. Such regressions have the advantage of remaining useful with particularly sparse, non-stationary clinical data. However, physiological models are often nonlinear and can have many parameters, leading to potential problems with parameter identifiability, i.e., the ability to find a unique set of parameters that minimizes forecasting error. Identifiability problems can be reduced or eliminated by estimating fewer parameters, but doing so also reduces the flexibility of the model and hence increases forecasting error. We propose a method, the parameter Houlihan, that combines traditional machine learning techniques with data assimilation to select the set of model parameters that minimizes forecasting error while limiting identifiability problems. The method worked well: the data assimilation-based glucose forecasts and estimates for our cohort using the Houlihan-selected parameter sets generally minimized forecasting errors compared with other parameter selection methods, such as by-hand parameter selection. Nevertheless, the forecast with the lowest forecast error does not always accurately represent physiology, and further development of the algorithm provides a path toward improving physiologic fidelity as well. Our hope is that this methodology represents a first step toward combining machine learning with data assimilation and provides a lower-threshold entry point for using data assimilation with clinical data by helping to select the right parameters to estimate.
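
    The parameter-selection idea lends itself to a small illustration. The sketch below is not the Houlihan algorithm itself; it only shows the underlying trade-off by estimating each candidate subset of parameters of a toy mechanistic model (holding the others at nominal values) and scoring the resulting held-out forecast error. The model, parameter names, and data are all invented for illustration.

```python
# Minimal sketch of parameter-subset selection for data assimilation.
# NOT the paper's Houlihan algorithm; it only illustrates scoring candidate
# parameter subsets by held-out forecast error. The toy "mechanistic" model,
# parameter names, and nominal values are assumptions.
import itertools
import numpy as np
from scipy.optimize import least_squares

def model(t, params):
    # Toy decay model: baseline + amplitude * exp(-rate * t)
    baseline, amplitude, rate = params
    return baseline + amplitude * np.exp(-rate * t)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 40)
true = np.array([5.0, 3.0, 0.7])
y = model(t, true) + 0.2 * rng.standard_normal(t.size)
t_fit, y_fit, t_fc, y_fc = t[:30], y[:30], t[30:], y[30:]   # fit vs. forecast split

defaults = np.array([4.0, 2.0, 0.5])   # nominal values for parameters we do NOT estimate
names = ["baseline", "amplitude", "rate"]

def forecast_error(estimated_idx):
    """Estimate only the chosen parameter subset; score on held-out data."""
    def residual(theta):
        p = defaults.copy()
        p[list(estimated_idx)] = theta
        return model(t_fit, p) - y_fit
    fit = least_squares(residual, defaults[list(estimated_idx)])
    p = defaults.copy()
    p[list(estimated_idx)] = fit.x
    return np.sqrt(np.mean((model(t_fc, p) - y_fc) ** 2))

subsets = [s for k in (1, 2, 3) for s in itertools.combinations(range(3), k)]
scores = {tuple(names[i] for i in s): forecast_error(s) for s in subsets}
print("forecast RMSE by estimated subset:", scores)
print("selected subset:", min(scores, key=scores.get))
```

    In this toy setting, estimating too few parameters underfits and inflates forecast error, while the full set may be poorly constrained by noisy data; the selection step simply keeps the subset with the lowest held-out error.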

    "Bayesian Estimation and Particle Filter for Max-Stable Processes"

    Extreme values are often correlated over time, for example in financial time series, and these values carry various risks. Max-stable processes, such as maxima of moving maxima (M3) processes, have recently been considered in the literature to describe such time-dependent dynamics, but they have been difficult to estimate. This paper first proposes a feasible and efficient Bayesian estimation method for nonlinear and non-Gaussian state space models based on these processes and describes a Markov chain Monte Carlo algorithm in which the sampling efficiency is improved by a normal mixture sampler. Furthermore, a unique particle filter that adapts to extreme observations is proposed and shown to be highly accurate in comparison with other well-known filters. Our proposed algorithms were applied to daily minima of high-frequency stock return data, and a model comparison was conducted using marginal likelihoods to investigate the time-dependent dynamics of extreme stock returns for financial risk management.
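
    As a point of reference for the filtering step described above, the following is a minimal, generic bootstrap particle filter for a nonlinear, non-Gaussian state space model. It does not implement the paper's filter adapted to extreme (max-stable) observations; the AR(1) latent dynamics, Gumbel observation noise, and all numbers are illustrative assumptions.

```python
# Minimal sketch of a bootstrap particle filter for a nonlinear, non-Gaussian
# state space model. Generic illustration only, not the paper's adaptive filter.
import numpy as np

rng = np.random.default_rng(1)
T, N = 200, 2000                       # time steps, particles
phi, sigma_x = 0.95, 0.3               # latent AR(1) dynamics

# Simulate data: heavy-tailed (Gumbel) observation noise around the latent state.
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + sigma_x * rng.standard_normal()
y = x + rng.gumbel(loc=0.0, scale=0.5, size=T)

def log_obs_density(y_t, particles, scale=0.5):
    z = (y_t - particles) / scale
    return -np.log(scale) - z - np.exp(-z)     # Gumbel log-density

particles = rng.standard_normal(N)
filtered_mean = np.zeros(T)
for t in range(T):
    # Propagate particles through the state transition.
    particles = phi * particles + sigma_x * rng.standard_normal(N)
    # Weight by the observation density and normalize (shift logs for stability).
    logw = log_obs_density(y[t], particles)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    filtered_mean[t] = np.sum(w * particles)
    # Multinomial resampling.
    particles = particles[rng.choice(N, size=N, p=w)]

print("RMSE of filtered mean vs. latent state:",
      np.sqrt(np.mean((filtered_mean - x) ** 2)))
```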

    Using rational filters to uncover the first ringdown overtone in GW150914

    There have been debates in the literature about the existence of the first overtone in the ringdown of GW150914. We develop a novel Bayesian framework to reanalyze the data of this event by incorporating a new technique, the "rational filter," which can clean particular modes from the ringdown signal. We examine the existence of the first overtone in GW150914 from several perspectives. First, we confirm that the estimates of the remnant black hole mass and spin are more consistent with those obtained from the full IMR signal when the first overtone is included at an early stage of the ringdown (right after the inferred signal peak); this improvement fades away at later times. Second, we formulate a new way to compare the ringdown models with and without the first overtone by calculating the Bayes factor at different times during the ringdown. We obtain a Bayes factor of 600 at the time when the signal amplitude reaches its peak. The Bayes factor decreases sharply when moving away from the peak time and eventually oscillates around a small value once the overtone signal is expected to have decayed. Third, we clean the fundamental mode from the ringdown of GW150914 and estimate the amplitudes of the modes from the filtered data with MCMC. The inferred amplitude of the fundamental mode is ~0, whereas the amplitude of the first overtone remains almost unchanged, implying that the filtered data are consistent with a first-overtone-only template. Similarly, if we remove the first overtone from the GW150914 data, the filtered data are consistent with a fundamental-mode-only template. Finally, after removing the fundamental mode, we use MCMC to infer the remnant black hole mass and spin from the first overtone alone. We find the posteriors are still informative and consistent with those inferred from the fundamental mode.
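
    To make the mode-cleaning step concrete, the sketch below applies a frequency-domain, all-pass rational filter that nulls a single damped sinusoid in a toy two-mode "ringdown". The specific filter form and sign conventions are assumptions modelled on the description above, and the frequencies, damping times, and amplitudes are invented; this is not the authors' pipeline.

```python
# Minimal sketch of a rational (all-pass) filter that nulls one damped sinusoid
# in the frequency domain. Filter form, conventions, and toy numbers are
# assumptions for illustration only.
import numpy as np

fs = 4096.0
t = np.arange(0.0, 0.5, 1.0 / fs)

def mode(t, f0, tau, amp, phi):
    return amp * np.exp(-t / tau) * np.cos(2.0 * np.pi * f0 * t + phi)

# Toy "ringdown": a long-lived fundamental plus a short-lived overtone.
signal = mode(t, 250.0, 0.004, 1.0, 0.3) + mode(t, 240.0, 0.0012, 2.0, 1.1)

def rational_filter(f, f0, tau):
    """All-pass filter with zeros at the mode's complex frequencies (numpy FFT sign convention)."""
    fc = f0 + 1j / (2.0 * np.pi * tau)
    return (f - fc) * (f + np.conj(fc)) / ((f - np.conj(fc)) * (f + fc))

# Zero-pad in front so the filter's acausal response does not wrap onto late times.
data = np.concatenate([np.zeros(t.size), signal])
freqs = np.fft.rfftfreq(data.size, d=1.0 / fs)
filtered = np.fft.irfft(np.fft.rfft(data) * rational_filter(freqs, 250.0, 0.004),
                        n=data.size)[t.size:]

# Late in the ringdown the overtone has decayed, so the tail is dominated by the
# fundamental; after filtering, that tail should be strongly suppressed.
tail = t > 0.01
print("tail rms before filtering:", np.std(signal[tail]))
print("tail rms after filtering: ", np.std(filtered[tail]))
```

    Because the filter has unit magnitude on the real frequency axis, the remaining mode is preserved up to a phase and time shift, which is consistent with the abstract's observation that removing one mode leaves the other's inferred amplitude almost unchanged.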

    Flexible modeling of dependence in volatility processes

    This paper proposes a novel stochastic volatility model that can capture long-range dependence, drawing on the existing literature on autoregressive stochastic volatility models, aggregation of autoregressive processes, and Bayesian nonparametric modelling. The volatility process is assumed to be the aggregate of autoregressive processes whose coefficients follow a distribution modelled with a flexible Bayesian approach. The model provides insight into the dynamic properties of the volatility. An efficient estimation algorithm is defined which uses recently proposed adaptive Monte Carlo methods. The proposed model is applied to the daily returns of stocks.
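
    The aggregation idea can be illustrated with a short simulation: the latent log-volatility below is the average of many AR(1) components whose persistence coefficients are drawn from a distribution concentrated near one, which produces the slowly decaying autocorrelation of squared returns that characterises long-range dependence. The Beta mixing distribution, component count, and scales are illustrative assumptions; the paper instead models the coefficient distribution with a flexible Bayesian nonparametric prior.

```python
# Minimal sketch of "volatility as an aggregate of AR(1) processes".
# The Beta mixing distribution, component count, and noise scales are assumptions.
import numpy as np

rng = np.random.default_rng(2)
T, K = 5000, 200                             # sample length, number of AR(1) components
phis = np.sqrt(rng.beta(8.0, 1.0, size=K))   # persistence coefficients pushed toward 1

# Each row is one AR(1) component of the latent log-volatility.
h_components = np.zeros((K, T))
for t in range(1, T):
    h_components[:, t] = phis * h_components[:, t - 1] + 0.3 * rng.standard_normal(K)

h = h_components.mean(axis=0)                # aggregated log-volatility
returns = np.exp(h / 2.0) * rng.standard_normal(T)

def acf(x, lag):
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Slowly decaying autocorrelation of squared returns is the long-memory signature.
for lag in (1, 10, 50, 200):
    print(f"ACF of squared returns at lag {lag}: {acf(returns**2, lag):.3f}")
```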