
    Parametric, nonparametric and parametric modelling of a chaotic circuit time series

    The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In the validation of a proposed model one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and the model output are caused by an inappropriate model, by bad estimates of the parameters in a correct type of model, or both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and, if they are rejected, to suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit, where we obtain an extremely accurate reconstruction of the observed attractor. Comment: 19 pages, 8 figures
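    To make the parametric half of this strategy concrete, the sketch below fits an ordinary differential equation to a measured series in the multiple-shooting style: the record is split into segments, each segment gets its own initial state, and continuity between segments is enforced alongside the data misfit. The toy oscillator model, segment count, and penalty weight are illustrative assumptions, not the authors' circuit model or implementation.

```python
# Minimal multiple-shooting sketch for ODE parameter estimation (illustrative).
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def rhs(t, y, a, b):
    # toy two-dimensional model: damped nonlinear oscillator standing in
    # for the circuit's differential equation
    x, v = y
    return [v, -a * x - b * x**3 - 0.1 * v]

def simulate(y0, t, a, b):
    sol = solve_ivp(rhs, (t[0], t[-1]), y0, t_eval=t, args=(a, b), rtol=1e-8)
    return sol.y

def residuals(z, t_segs, x_segs, w_cont=10.0):
    # z packs the shared parameters (a, b) and one initial state per segment
    a, b = z[:2]
    y0s = z[2:].reshape(-1, 2)
    res, ends = [], []
    for t_seg, x_obs, y0 in zip(t_segs, x_segs, y0s):
        y = simulate(y0, t_seg, a, b)
        res.append(y[0] - x_obs)          # misfit to the observed component
        ends.append(y[:, -1])
    for k in range(len(y0s) - 1):         # continuity across segment joints
        res.append(w_cont * (ends[k] - y0s[k + 1]))
    return np.concatenate(res)

# synthetic "measurement" with true parameters a = 1.0, b = 0.5
t = np.linspace(0.0, 20.0, 401)
x_true = simulate([1.0, 0.0], t, 1.0, 0.5)[0]
x_meas = x_true + 0.01 * np.random.default_rng(0).standard_normal(t.size)

n_seg = 8
idx = np.linspace(0, t.size - 1, n_seg + 1).astype(int)
t_segs = [t[idx[k]:idx[k + 1] + 1] for k in range(n_seg)]
x_segs = [x_meas[idx[k]:idx[k + 1] + 1] for k in range(n_seg)]

z0 = np.concatenate([[0.5, 0.1]] + [[x_segs[k][0], 0.0] for k in range(n_seg)])
fit = least_squares(residuals, z0, args=(t_segs, x_segs))
print("estimated a, b:", fit.x[:2])
```

    Because each segment is integrated only over a short window, the objective stays well behaved even when the underlying dynamics are chaotic, which is the practical point of the multiple-shooting formulation.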

    Robust Online Hamiltonian Learning

    In this work we combine two distinct machine learning methodologies, sequential Monte Carlo and Bayesian experimental design, and apply them to the problem of inferring the dynamical parameters of a quantum system. We design the algorithm with practicality in mind by including parameters that control trade-offs between the requirements on computational and experimental resources. The algorithm can be implemented online (during experimental data collection), avoiding the need for storage and post-processing. Most importantly, our algorithm is capable of learning Hamiltonian parameters even when the parameters change from experiment-to-experiment, and also when additional noise processes are present and unknown. The algorithm also numerically estimates the Cramer-Rao lower bound, certifying its own performance.Comment: 24 pages, 12 figures; to appear in New Journal of Physic
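    A bare-bones version of the sequential Monte Carlo ingredient is sketched below for a single Hamiltonian parameter learned from simulated Rabi-type experiments. The two-outcome likelihood, the 1/sigma rule for choosing the next evolution time, and the resampling jitter are illustrative assumptions and stand in for the paper's full adaptive algorithm.

```python
# Sequential Monte Carlo sketch for learning one Hamiltonian parameter omega.
import numpy as np

rng = np.random.default_rng(1)
omega_true = 0.7                               # "unknown" parameter to learn
particles = rng.uniform(0.0, 1.0, 2000)        # prior samples for omega
weights = np.full(particles.size, 1.0 / particles.size)

def likelihood(outcome, omega, t):
    # two-outcome measurement: P(0) = cos^2(omega * t / 2)
    p0 = np.cos(omega * t / 2.0) ** 2
    return p0 if outcome == 0 else 1.0 - p0

for step in range(100):
    # adaptive design heuristic: evolve longer as the posterior narrows
    mean = np.average(particles, weights=weights)
    sigma = np.sqrt(np.average((particles - mean) ** 2, weights=weights))
    t = 1.0 / max(sigma, 1e-3)

    # simulate one experiment on the true system
    outcome = 0 if rng.random() < np.cos(omega_true * t / 2.0) ** 2 else 1

    # Bayes update of the particle weights
    weights *= likelihood(outcome, particles, t)
    weights /= weights.sum()

    # resample when the effective sample size collapses
    if 1.0 / np.sum(weights ** 2) < particles.size / 2:
        pick = rng.choice(particles.size, particles.size, p=weights)
        particles = particles[pick] + 0.01 * sigma * rng.standard_normal(particles.size)
        weights = np.full(particles.size, 1.0 / particles.size)

print("posterior mean omega:", np.average(particles, weights=weights))
```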

    Determining global parameters of the oscillations of solar-like stars

    Helioseismology has enabled us to better understand the solar interior, while also allowing us to better constrain solar models. This is now also a tremendous epoch for asteroseismology, as space missions dedicated to studying stellar oscillations (MOST and CoRoT) have been launched in recent years. CoRoT has already provided valuable results for many types of stars, while Kepler, which was launched in March 2009, will soon provide us with a huge amount of seismic data. This is an opportunity to better constrain stellar models and to finally understand stellar structure and evolution. The goal of this research work is to estimate the global parameters of any solar-like oscillating target in an automatic manner. We want to determine the global parameters of the acoustic modes (large separation, range of excited pressure modes, maximum amplitude, and its corresponding frequency), retrieve the surface rotation period of the star, and use these results to estimate the global parameters of the star (radius and mass). To prepare the analysis of hundreds of solar-like oscillating stars, we have developed a robust and automatic pipeline. The pipeline consists of data analysis techniques, such as the Fast Fourier Transform, wavelets, and autocorrelation, as well as the application of minimisation algorithms for stellar modelling. We apply our pipeline to simulated lightcurves from the asteroFLAG team and the Aarhus-asteroFLAG simulator, and obtain results that are consistent with the input data to the simulations. Our strategy gives correct results for stars with magnitudes below 11, with only a few 10% of bad determinations among the reliable results. We then apply the pipeline to the Sun and three CoRoT targets. In particular we determine the parameters of the Sun, HD49933, HD181906, and HD181420. Comment: 15 pages, 17 figures, accepted for publication in A&A
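    As a toy illustration of one step such a pipeline performs, the snippet below estimates the large frequency separation from the autocorrelation of a power spectrum containing a comb of p modes. The synthetic spectrum, bin width, and lag search window are made up for the example; the actual pipeline combining FFT, wavelets, and autocorrelation is considerably richer.

```python
# Estimating the large separation from the autocorrelation of a power spectrum.
import numpy as np

dnu_true = 135.0                        # muHz, solar-like large separation
nu = np.arange(1000.0, 5000.0, 1.0)     # frequency grid, 1 muHz bins
spectrum = np.ones_like(nu)             # flat noise background
for n in range(-8, 9):                  # comb of modes around nu_max ~ 3000 muHz
    spectrum += 50.0 * np.exp(-0.5 * ((nu - (3000.0 + n * dnu_true)) / 2.0) ** 2)

# autocorrelation of the mean-subtracted power spectrum
x = spectrum - spectrum.mean()
acf = np.correlate(x, x, mode="full")[x.size - 1:]
acf /= acf[0]

# strongest peak in a plausible lag range gives the large separation
lags = np.arange(acf.size) * 1.0        # lag in muHz (bin width = 1 muHz)
window = (lags > 50.0) & (lags < 200.0)
dnu_est = lags[window][np.argmax(acf[window])]
print("estimated large separation:", dnu_est, "muHz")
```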

    On the smoothness of nonlinear system identification

    We shed new light on the smoothness of optimization problems arising in prediction error parameter estimation of linear and nonlinear systems. We show that, for regions of the parameter space where the model is not contractive, the Lipschitz constant and β-smoothness of the objective function might blow up exponentially with the simulation length, making it hard to numerically find minima within those regions or even to escape from them. In addition to providing theoretical understanding of this problem, this paper also proposes the use of multiple shooting as a viable solution. The proposed method minimizes the error between a prediction model and the observed values. Rather than running the prediction model over the entire dataset, multiple shooting splits the data into smaller subsets and runs the prediction model over each subset, making the simulation length a design parameter and making it possible to solve problems that would be infeasible using a standard approach. The equivalence to the original problem is obtained by including constraints in the optimization. The new method is illustrated by estimating the parameters of nonlinear systems with chaotic or unstable behavior, as well as neural networks. We also present a comparative analysis of the proposed method with multi-step-ahead prediction error minimization.
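    The sketch below applies the same idea to a discrete-time model: a chaotic logistic map whose parameter r is estimated segment by segment, with a penalised continuity condition linking the end of one segment to the start of the next. The segment length, penalty weight, and the use of a soft penalty instead of hard equality constraints are illustrative choices, not the paper's exact formulation.

```python
# Multiple shooting for a discrete-time nonlinear model (logistic map), sketch.
import numpy as np
from scipy.optimize import least_squares

def simulate(x0, r, n):
    x = np.empty(n)
    x[0] = x0
    for k in range(1, n):
        x[k] = r * x[k - 1] * (1.0 - x[k - 1])
    return x

rng = np.random.default_rng(0)
y = simulate(0.3, 3.9, 600) + 1e-3 * rng.standard_normal(600)   # chaotic data

seg = 30                                       # simulation length per segment
segments = y.reshape(-1, seg)                  # 20 segments of 30 samples

def residuals(z, w_cont=10.0):
    r, x0s = z[0], z[1:]
    res = []
    for s, x0 in zip(segments, x0s):
        res.append(simulate(x0, r, seg) - s)   # prediction error on segment
    # soft continuity: mapping a segment's last simulated sample one more
    # step should reproduce the next segment's initial state
    last = np.array([simulate(x0, r, seg)[-1] for x0 in x0s[:-1]])
    res.append(w_cont * (r * last * (1.0 - last) - x0s[1:]))
    return np.concatenate(res)

z0 = np.concatenate([[3.5], segments[:, 0]])   # guess r = 3.5, x0 from data
fit = least_squares(residuals, z0,
                    bounds=([3.0] + [0.0] * 20, [4.0] + [1.0] * 20))
print("estimated r:", fit.x[0])
```

    With seg = 600 (single shooting) the same objective becomes extremely ill-conditioned because simulation errors grow exponentially along the chaotic trajectory; shortening the shooting intervals is what keeps the problem tractable.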

    Response-adaptive dose-finding under model uncertainty

    Dose-finding studies are frequently conducted to evaluate the effect of different doses or concentration levels of a compound on a response of interest. Applications include the investigation of a new medicinal drug, a herbicide or fertilizer, a molecular entity, an environmental toxin, or an industrial chemical. In pharmaceutical drug development, dose-finding studies are of critical importance because of regulatory requirements that marketed doses are safe and provide clinically relevant efficacy. Motivated by a dose-finding study in moderate persistent asthma, we propose response-adaptive designs addressing two major challenges in dose-finding studies: uncertainty about the dose-response models and large variability in parameter estimates. To allocate new cohorts of patients in an ongoing study, we use optimal designs that are robust under model uncertainty. In addition, we use a Bayesian shrinkage approach to stabilize the parameter estimates over the successive interim analyses used in the adaptations. This approach allows us to calculate updated parameter estimates and model probabilities that can then be used to calculate the optimal design for subsequent cohorts. The resulting designs are hence robust with respect to model misspecification and additionally can efficiently adapt to the information accrued in an ongoing study. We focus on adaptive designs for estimating the minimum effective dose, although alternative optimality criteria or mixtures thereof could be used, enabling the design to address multiple objectives. Comment: Published in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org), http://dx.doi.org/10.1214/10-AOAS445
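    One ingredient of such an adaptive scheme, updating model weights at an interim analysis and using them to choose the next cohort's dose, is sketched below for two candidate dose-response shapes (linear and Emax). The data, the BIC-based weighting, and the "largest disagreement" allocation rule are illustrative simplifications; the paper itself uses robust optimal designs for the minimum effective dose together with Bayesian shrinkage estimation.

```python
# Interim model weighting for two candidate dose-response models (sketch).
import numpy as np
from scipy.optimize import curve_fit

doses = np.array([0.0, 0.0, 12.5, 25.0, 50.0, 100.0, 100.0])   # doses so far
resp = np.array([0.2, 0.1, 0.45, 0.60, 0.80, 0.95, 1.05])      # responses

def linear(d, e0, slope):
    return e0 + slope * d

def emax(d, e0, emax_, ed50):
    return e0 + emax_ * d / (ed50 + d)

def fit_and_weight(model, p0):
    popt, _ = curve_fit(model, doses, resp, p0=p0, maxfev=10000)
    rss = np.sum((resp - model(doses, *popt)) ** 2)
    n, k = doses.size, len(p0)
    bic = n * np.log(rss / n) + k * np.log(n)
    return popt, np.exp(-0.5 * bic)          # relative evidence ~ exp(-BIC/2)

p_lin, w_lin = fit_and_weight(linear, [0.0, 0.01])
p_emx, w_emx = fit_and_weight(emax, [0.0, 1.0, 25.0])
w = np.array([w_lin, w_emx])
w /= w.sum()
print("interim model weights (linear, Emax):", w)

# toy allocation rule: put the next cohort where the fitted models disagree
# most, i.e. where new observations best discriminate between them
grid = np.array([12.5, 25.0, 50.0, 100.0])
disagreement = np.abs(linear(grid, *p_lin) - emax(grid, *p_emx))
print("next cohort dose (toy rule):", grid[np.argmax(disagreement)])
```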

    Automation Process for Morphometric Analysis of Volumetric CT Data from Pulmonary Vasculature in Rats

    With advances in medical imaging scanners, it has become commonplace to generate large multidimensional datasets. These datasets require tools for rapid, thorough analysis. To address this need, we have developed an automated algorithm for morphometric analysis incorporating the A Visualization Workshop computational and image processing libraries for three-dimensional segmentation, vascular tree generation, and structural hierarchical ordering, with a two-stage numeric optimization procedure for estimating vessel diameters. We combine this new technique with our mathematical models of pulmonary vascular morphology to quantify structural and functional attributes of lung arterial trees. Our physiological studies require repeated measurements of vascular structure to determine differences in vessel biomechanical properties between animal models of pulmonary disease. Automation provides many advantages, including significantly improved speed and minimized operator interaction and biasing. The results are validated by comparison with previously published rat pulmonary arterial micro-CT data analysis techniques, in which vessels were manually mapped and measured using intensive operator intervention.
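    To give a feel for a two-stage diameter estimate of the kind mentioned above, the snippet below measures one vessel cross-section from a synthetic intensity profile: a coarse full-width-at-half-maximum guess is refined by fitting a blur-convolved cylinder model. The profile model, CT numbers, and noise level are made up for the example and do not reproduce the authors' procedure.

```python
# Two-stage vessel radius estimate from a 1-D cross-sectional profile (sketch).
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

x = np.linspace(-1.0, 1.0, 201)               # position across vessel, mm

def profile(x, center, radius, contrast, blur, background):
    # cylinder of given radius convolved with a Gaussian scanner blur
    return background + 0.5 * contrast * (
        erf((radius - (x - center)) / (np.sqrt(2) * blur)) +
        erf((radius + (x - center)) / (np.sqrt(2) * blur)))

rng = np.random.default_rng(0)
truth = dict(center=0.05, radius=0.30, contrast=400.0, blur=0.08, background=-50.0)
signal = profile(x, **truth) + 5.0 * rng.standard_normal(x.size)

# stage 1: coarse radius from the full width at half maximum
half = signal.min() + 0.5 * (signal.max() - signal.min())
above = x[signal > half]
radius0 = 0.5 * (above.max() - above.min())

# stage 2: refine all profile parameters by nonlinear least squares
p0 = [above.mean(), radius0, signal.max() - signal.min(), 0.1, signal.min()]
popt, _ = curve_fit(profile, x, signal, p0=p0)
print("coarse radius:", radius0, "refined radius:", popt[1])
```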