
    Minimum Distance Estimation of Milky Way Model Parameters and Related Inference

    We propose a method, based on the Hellinger distance, to estimate the location of the Sun in the disk of the Milky Way, and we construct confidence sets on our estimate of the unknown location using a bootstrap-based method. Assuming the Galactic disk to be two-dimensional, the sought solar location reduces to the radial distance separating the Sun from the Galactic center and the angular separation of the Galactic-center-to-Sun line from a pre-fixed line on the disk. On astronomical scales, the unknown solar location is equivalent to the location of us earthlings, who observe the velocities of a sample of stars in the neighborhood of the Sun. This unknown location is estimated by pairwise comparisons of the estimated density of the observed set of stellar velocities with densities estimated from synthetic stellar velocity data sets generated at chosen locations in the Milky Way disk according to four base astrophysical models. The "match" between each pair of estimated densities is quantified by the affinity measure based on the familiar Hellinger distance. We perform a novel cross-validation procedure to establish a desirable "consistency" property of the proposed method. Comment: 25 pages, 10 figures. This version incorporates the suggestions made by the referees. To appear in SIAM/ASA Journal on Uncertainty Quantification
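
    A minimal sketch of the affinity computation (the estimator details are not in the abstract; histogram density estimation, the shared binning, and all sample values below are illustrative assumptions):

        import numpy as np

        def hellinger_affinity(sample_a, sample_b, bins=50):
            """Affinity A = sum_i sqrt(p_i * q_i) between two histogram
            density estimates on a common binning; the Hellinger distance
            is H = sqrt(1 - A), so maximizing A minimizes H."""
            edges = np.linspace(min(sample_a.min(), sample_b.min()),
                                max(sample_a.max(), sample_b.max()), bins + 1)
            p, _ = np.histogram(sample_a, bins=edges)
            q, _ = np.histogram(sample_b, bins=edges)
            p = p / p.sum()   # normalize counts to bin probabilities
            q = q / q.sum()
            return np.sqrt(p * q).sum()

        # Hypothetical usage: the synthetic sample would be generated by one
        # of the four base astrophysical models at a candidate solar location;
        # the candidate maximizing the affinity is the location estimate.
        rng = np.random.default_rng(0)
        observed = rng.normal(220.0, 30.0, size=500)    # stand-in velocities
        synthetic = rng.normal(215.0, 32.0, size=500)
        print(hellinger_affinity(observed, synthetic))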

    Bayesian Identification of Elastic Constants in Multi-Directional Laminate from Moiré Interferometry Displacement Fields

    The ply elastic constants needed for classical lamination theory analysis of multi-directional laminates may differ from those obtained from unidirectional laminates because of three-dimensional effects. In addition, the unidirectional laminates may not be available for testing. In such cases, full-field displacement measurements offer the potential of identifying several material properties simultaneously. For that, it is desirable to create complex displacement fields that are strongly influenced by all the elastic constants. In this work, we explore the potential of using a laminated plate with an open hole under traction loading to achieve that and identify all four ply elastic constants (E1, E2, ν12, G12) at once. However, the accuracy of the identified properties may not be as good as that of properties measured from individual tests, due to the complexity of the experiment, the relative insensitivity of the measured quantities to some of the properties, and the various possible sources of uncertainty. It is thus important to quantify the uncertainty (or confidence) with which these properties are identified. Here, Bayesian identification is used for this purpose, because it can readily model all the uncertainties in the analysis and measurements, and because it provides the full coupled probability distribution of the identified material properties. In addition, it offers the potential to combine properties identified from substantially different experiments. The full-field measurement is obtained by moiré interferometry. For computational efficiency, the Bayesian approach was applied to a proper orthogonal decomposition (POD) of the displacement fields. The analysis showed that the four orthotropic elastic constants are determined with quite different confidence levels as well as with significant correlation. Comparison with manufacturing specifications showed a substantial difference in one constant, and this conclusion agreed with an earlier measurement of that constant by a traditional four-point bending test. It is possible that the POD approach did not take full advantage of the copious data provided by the full-field measurements, and for that reason the data are provided for others to use (as online material attached to the article)
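
    The abstract leaves the POD step unspecified; a minimal sketch of POD via the thin singular value decomposition, with random stand-in arrays in place of simulated and measured moiré displacement fields (the shapes, mode count, and variable names are assumptions):

        import numpy as np

        def pod_basis(snapshots, n_modes):
            """Proper orthogonal decomposition via thin SVD.
            snapshots: (n_points, n_samples) matrix whose columns are
            flattened displacement fields, one per candidate set of
            elastic constants."""
            mean = snapshots.mean(axis=1, keepdims=True)
            U, _, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
            return mean, U[:, :n_modes]

        def pod_coefficients(field, mean, basis):
            """Project a flattened displacement field onto the POD basis;
            Bayesian identification can then work on these few
            coefficients instead of the full field."""
            return basis.T @ (field.reshape(-1, 1) - mean)

        # Hypothetical usage with random stand-in data:
        rng = np.random.default_rng(1)
        sims = rng.normal(size=(1000, 200))    # 200 simulated fields
        mean, basis = pod_basis(sims, n_modes=5)
        measured = rng.normal(size=1000)       # moiré measurement stand-in
        print(pod_coefficients(measured, mean, basis).ravel())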

    Teaching old sensors new tricks: archetypes of intelligence

    In this paper a generic intelligent sensor software architecture is described which builds upon the basic requirements of related industry standards (IEEE 1451 and SEVA BS-7986). It incorporates specific functionalities such as real-time fault detection, drift compensation, adaptation to environmental changes, and autonomous reconfiguration. The modular structure of the intelligent sensor architecture provides enhanced flexibility with regard to the choice of specific algorithmic realizations. In this context, the particular aspects of fault detection and drift estimation are discussed. A mixed indicative/corrective fault detection approach is proposed, and it is demonstrated that reversible/irreversible state-dependent drift can be estimated using generic algorithms such as the extended Kalman filter (EKF) or on-line density estimators. Finally, a parsimonious density estimator is presented and validated through simulated and real data for use in an operating-regime-dependent fault detection framework
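
    The abstract does not give the drift model; as an illustrative sketch, a one-dimensional random-walk drift state tracked by a Kalman filter (the linear special case of the EKF named above), with both noise variances assumed:

        import numpy as np

        def drift_filter(residuals, q=1e-4, r=1e-2):
            """Estimate sensor drift d_k from residuals z_k = d_k + v_k
            (raw reading minus a trusted reference), where the drift
            follows a random walk d_k = d_{k-1} + w_k.  q and r are
            assumed process and measurement noise variances."""
            d, P = 0.0, 1.0
            estimates = []
            for z in residuals:
                P = P + q               # predict: random-walk drift
                K = P / (P + r)         # Kalman gain
                d = d + K * (z - d)     # update with the innovation
                P = (1.0 - K) * P
                estimates.append(d)
            return np.array(estimates)

        # Hypothetical usage: a sensor drifting by 0.001 units per step.
        rng = np.random.default_rng(2)
        z = 0.001 * np.arange(1000) + rng.normal(0.0, 0.1, size=1000)
        print(drift_filter(z)[-1])      # tracks the ~1.0 drift at step 999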

    Short and long-term wind turbine power output prediction

    In the wind energy industry, it is of great importance to develop models that accurately forecast the power output of a wind turbine, as such predictions are used for wind farm location assessment, power pricing and bidding, monitoring, and preventive maintenance. As a first step, and following the guidelines of the existing literature, we use the supervisory control and data acquisition (SCADA) data to model the wind turbine power curve (WTPC). We explore various parametric and non-parametric approaches for the modeling of the WTPC, such as parametric logistic functions, and non-parametric piecewise linear, polynomial, or cubic spline interpolation functions. We demonstrate that all aforementioned classes of models are rich enough (with respect to their relative complexity) to accurately model the WTPC, as their mean squared error (MSE) is close to the MSE lower bound calculated from the historical data. We further enhance the accuracy of our proposed model by incorporating additional environmental factors that affect the power output, such as ambient temperature and wind direction. However, when it comes to forecasting, all aforementioned models share an intrinsic limitation: they fail to capture the inherent auto-correlation of the data. To overcome this, we show that adding a properly scaled ARMA modeling layer increases short-term prediction performance while preserving the long-term prediction capability of the model
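
    A sketch of the two-layer idea described above; the logistic form, the ARMA(1,1) order, and the synthetic SCADA-like data are assumptions, not the paper's choices:

        import numpy as np
        from scipy.optimize import curve_fit
        from statsmodels.tsa.arima.model import ARIMA

        def logistic_wtpc(v, p_max, k, v0):
            """Parametric logistic wind turbine power curve."""
            return p_max / (1.0 + np.exp(-k * (v - v0)))

        # Synthetic SCADA-like data (wind speed in m/s, power in kW); real
        # residuals would be time-ordered, which is what gives the ARMA
        # layer its value.
        rng = np.random.default_rng(3)
        v = rng.uniform(3.0, 20.0, size=2000)
        power = logistic_wtpc(v, 2000.0, 0.9, 9.0) + rng.normal(0, 50, 2000)

        # Layer 1: fit the static power curve.
        params, _ = curve_fit(logistic_wtpc, v, power, p0=[2500.0, 1.0, 10.0])

        # Layer 2: model the serially correlated residuals with ARMA(1,1).
        resid = power - logistic_wtpc(v, *params)
        arma = ARIMA(resid, order=(1, 0, 1)).fit()

        # Short-term forecast = curve prediction + residual forecast.
        print(logistic_wtpc(11.0, *params) + arma.forecast(steps=1)[0])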

    On the dialog between experimentalist and modeler in catchment hydrology

    The dialog between experimentalist and modeler in catchment hydrology has been minimal to date. The experimentalist often has a highly detailed yet highly qualitative understanding of dominant runoff processes; thus there is often much more information content on the catchment than we use for calibration of a model. While modelers often appreciate the need for 'hard data' in the model calibration process, little thought has been given to how modelers might access this 'soft' or process knowledge. We present a new method where soft data (i.e., qualitative knowledge from the experimentalist that cannot be used directly as exact numbers) are made useful through fuzzy measures of model-simulation and parameter-value acceptability. We developed a three-box lumped conceptual model for the Maimai catchment in New Zealand, a particularly well-studied process-hydrological research catchment. The boxes represent the key hydrological reservoirs that are known to have distinct groundwater dynamics, isotopic composition, and solute chemistry. The model was calibrated against hard data (runoff and groundwater levels) as well as a number of criteria derived from the soft data (e.g., percent new water and reservoir volume). We achieved very good fits for the three-box model when optimizing the parameter values against runoff alone (Reff = 0.93). However, parameter sets obtained in this way generally showed a poor goodness-of-fit for other criteria, such as the simulated new-water contributions to peak runoff. Inclusion of soft-data criteria in the model calibration process resulted in lower Reff values (around 0.84 when including all criteria) but led to better overall performance, as judged by the experimentalist's view of catchment runoff dynamics. The model performance with respect to soft data (for instance, the new-water ratio) increased significantly, and parameter uncertainty was reduced by 60% on average with the introduction of the soft-data multi-criteria calibration. We argue that accepting lower model efficiencies for runoff is 'worth it' if one can develop a more 'real' model of catchment behavior. The use of soft data is an approach to formalize this exchange between experimentalist and modeler and to more fully utilize the information content from experimental catchments
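
    A minimal sketch of how a fuzzy acceptability measure might combine the hard runoff criterion (Reff) with one soft criterion; the trapezoid bounds, the example new-water range, and the min-combination rule are illustrative assumptions, not the paper's exact scheme:

        import numpy as np

        def fuzzy_membership(x, a, b, c, d):
            """Trapezoidal fuzzy measure: acceptability rises from 0 at a
            to 1 on [b, c], then falls back to 0 at d."""
            if x <= a or x >= d:
                return 0.0
            if b <= x <= c:
                return 1.0
            return (x - a) / (b - a) if x < b else (d - x) / (d - c)

        def nash_sutcliffe(obs, sim):
            """Reff: model efficiency with respect to observed runoff."""
            obs, sim = np.asarray(obs), np.asarray(sim)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def acceptability(obs_q, sim_q, new_water_frac):
            """Combine one hard and one soft criterion; taking the minimum
            means a parameter set must satisfy both."""
            hard = max(nash_sutcliffe(obs_q, sim_q), 0.0)
            soft = fuzzy_membership(new_water_frac, 0.10, 0.20, 0.35, 0.50)
            return min(hard, soft)

        # Hypothetical usage with stand-in daily runoff series:
        rng = np.random.default_rng(4)
        obs = rng.gamma(2.0, 1.0, size=365)
        sim = obs + rng.normal(0.0, 0.3, size=365)
        print(acceptability(obs, sim, new_water_frac=0.28))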