
    A probabilistic metric for the validation of computational models

    A new validation metric is proposed that combines a threshold based on the uncertainty in the measurement data with a normalized relative error, and that is robust in the presence of large variations in the data. The outcome of the metric is the probability that a model's predictions are representative of the real world under the specific conditions and confidence level pertaining to the experiment from which the measurements were acquired. Relative error metrics are traditionally designed for use with a series of data values, but orthogonal decomposition has been employed to reduce the dimensionality of data matrices to feature vectors so that the metric can be applied to fields of data. Three previously published case studies are used to demonstrate the efficacy of this quantitative approach to the validation process in the discipline of structural analysis, for which historical data were available; however, the concept could be applied to a wide range of disciplines and sectors where modelling and simulation play a pivotal role.
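    The abstract describes the metric only at a high level, so the sketch below is one plausible reading of it rather than the published formulation: feature-vector components from an orthogonal decomposition are compared through a normalized relative error, and the threshold each component must satisfy is derived from the expanded measurement uncertainty at a chosen coverage factor. The function name, the single uncertainty value u_measured, and the pass-fraction reading of the probability are all assumptions made for illustration.

```python
import numpy as np

def probabilistic_validation_metric(predicted, measured, u_measured, coverage=2.0):
    """Fraction of feature-vector components whose normalized relative
    error falls within a threshold set by the expanded measurement
    uncertainty, read here as the probability that the predictions are
    representative of the measurements. Illustrative sketch only."""
    predicted = np.asarray(predicted, dtype=float)
    measured = np.asarray(measured, dtype=float)

    # Normalize errors by the dominant measured component so that
    # large variations in the data do not distort the comparison.
    scale = max(np.abs(measured).max(), 1e-12)
    rel_error = np.abs(predicted - measured) / scale

    # Expanded uncertainty: a coverage factor k = 2 corresponds to
    # roughly 95% confidence for normally distributed noise.
    threshold = coverage * u_measured / scale

    return float(np.mean(rel_error <= threshold))

# Feature vectors obtained by decomposing measured and predicted fields
s = np.array([10.0, -4.2, 1.3, 0.4])   # measured
p = np.array([10.3, -4.0, 1.1, 1.1])   # predicted
print(probabilistic_validation_metric(p, s, u_measured=0.3))  # 0.75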

    Comparative study of orthogonal decomposition of surface deformation in composite automotive panel

    Model validation is a major step in achieving computational models with good predictive capabilities. It is normal practice to validate simulation models by comparing their numerical results to experimental data. A critical issue when performing a validation procedure with information-rich data fields is the identification of effective techniques for data compression, which allow statistical measures to be applied to the comparison of predictions and measurements. Recently, image decomposition techniques have been applied successfully in a laboratory environment to condense data and extract features from surface deformation maps obtained with the aid of optical measurement techniques and finite element analysis. In this work, the integration of orthogonal decomposition with validation metrics is explored and a new metric is introduced. For the purpose of illustration, a case study of a composite car bonnet liner subject to impact loading has been used. Displacement fields from the entire surface of the bonnet liner were captured at equal time increments for 0.1 s following the impact and then decomposed, while a parallel process was applied to predictions from a finite element model. The validation metric was calculated from the resultant feature vectors and used to evaluate the quality of the predictions. It is anticipated that the outcomes of this investigation will support the development of a robust validation methodology for industrial applications.
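    As an illustration of the decomposition step, the sketch below condenses a displacement map into a short feature vector using a two-dimensional Chebyshev fit, one of the image decomposition techniques commonly used for this purpose; the paper's exact basis, polynomial degree, and normalisation are not given in the abstract, so those choices here are assumptions.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def chebyshev_feature_vector(field, degree=4):
    """Condense a 2D displacement map into a feature vector of 2D
    Chebyshev coefficients (illustrative; basis and degree assumed)."""
    ny, nx = field.shape
    # Map pixel coordinates onto the Chebyshev domain [-1, 1]
    x = np.linspace(-1.0, 1.0, nx)
    y = np.linspace(-1.0, 1.0, ny)
    X, Y = np.meshgrid(x, y)

    # 2D Chebyshev design matrix; a least-squares fit yields the
    # coefficients, which serve as the compact feature vector.
    V = C.chebvander2d(X.ravel(), Y.ravel(), [degree, degree])
    coeffs, *_ = np.linalg.lstsq(V, field.ravel(), rcond=None)
    return coeffs  # length (degree + 1) ** 2

# Decompose measured and predicted fields with the same basis, then
# compare the resulting feature vectors rather than the raw maps.
x = np.linspace(-1.0, 1.0, 64)
X, Y = np.meshgrid(x, x)
measured = np.sin(2.0 * X) * np.cos(Y)   # stand-in displacement map
predicted = 1.02 * measured              # model with a 2% amplitude error
s = chebyshev_feature_vector(measured)
p = chebyshev_feature_vector(predicted)
print(np.max(np.abs(s - p)))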

    Comparing full-field data from structural components with complicated geometries

    A new decomposition algorithm based on QR factorization is introduced for processing and comparing the irregularly shaped stress and deformation datasets found in structural analysis. The algorithm improves the comparison of two-dimensional data fields from the surface of components where data are missing from the field of view, whether because the measurement system's view is obstructed or because the component geometry leaves areas with no data. The technique enables the comparison of these irregularly shaped datasets without the interpolation or warping of the data required by some other decomposition techniques, for example Chebyshev or Zernike decomposition. This ensures that comparisons are made only between the available data in each dataset, so similarity metrics are not biased by missing data. The decomposition and comparison technique has been applied to an impact experiment, a modal analysis, and a fatigue study, with the stress and displacement data obtained from finite element analysis, digital image correlation and thermoelastic stress analysis. The results demonstrate that the technique can process data from a range of sources and suggest that it has the potential for use in a wide variety of applications.
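    The abstract does not spell out how the QR factorization accommodates missing regions, so the following is a minimal sketch under one common reading: a polynomial design matrix is evaluated only at the points where data exist, QR factorization orthonormalises that matrix over exactly the irregular domain, and both datasets are projected onto the resulting shared basis. The function names and the monomial basis are illustrative, not the paper's.

```python
import numpy as np

def qr_basis(points, degree=3):
    """Orthonormal basis over an irregular point set, built by QR
    factorisation of a polynomial design matrix evaluated only where
    data exist (illustrative reconstruction; basis choice assumed)."""
    x, y = points[:, 0], points[:, 1]
    # Monomials up to total degree `degree`, sampled at the valid points
    cols = [x**i * y**j
            for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    A = np.column_stack(cols)
    Q, _ = np.linalg.qr(A)   # columns are orthonormal on this domain only
    return Q

# Irregular domain: remove points inside a disc to mimic an obstruction
rng = np.random.default_rng(1)
pts = rng.uniform(-1.0, 1.0, size=(500, 2))
pts = pts[np.hypot(pts[:, 0], pts[:, 1]) > 0.3]

measured = np.sin(2.0 * pts[:, 0]) * pts[:, 1]   # stand-in stress field
predicted = 1.05 * measured                      # model with a 5% error

# Project both fields onto the shared basis; the feature vectors only
# ever see locations where data are present, so no interpolation or
# warping of the missing region is needed.
Q = qr_basis(pts)
fv_measured, fv_predicted = Q.T @ measured, Q.T @ predicted
print(np.max(np.abs(fv_measured - fv_predicted)))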

    Steps Towards Industrial Validation Experiments

    Imaging systems for measuring surface displacement and strain fields, such as stereoscopic Digital Image Correlation (DIC), are increasingly used in industry to validate model simulations. Recently, CEN published a guideline for validation that is based on image decomposition to compare predicted and measured data fields. The CEN guideline was evaluated in an inter-laboratory study that demonstrated its usefulness in laboratory environments. This paper addresses the incorporation of the CEN methodology into an industrial environment and reports progress of the H2020 Clean Sky 2 project MOTIVATE. First, while DIC is a well-established technique, the estimation of its measurement uncertainty in an industrial environment is still under discussion, as the current approach of relying on the calibration uncertainty is insufficient. Second, in view of the push towards virtual testing, it is important to harvest existing data in the course of V&V activities before requesting a dedicated validation experiment, specifically at higher levels of the test pyramid. Finally, it is of utmost importance to ensure compatibility and comparability of the simulation and measurement data, so as to optimize the test matrix for maximum reliability and credibility of the simulations and to quantify the model quality.