
    Interval Fuzzy Model for Robust Aircraft IMU Sensors Fault Detection

    This paper proposes a data-based approach for robust fault detection (FD) of the inertial measurement unit (IMU) sensors of an aircraft. Fuzzy interval models (FIMs) are introduced to cope with the significant modeling uncertainties caused by poorly modeled aerodynamics. The proposed FIMs are used to compute robust prediction intervals for the measurements provided by the IMU sensors. Specifically, a nonlinear neural network (NN) model provides the central prediction of the sensor response, while the uncertainty around the central estimate is captured by the FIM. The uncertainty has also been modeled using a conventional linear interval model (IM) approach, which allows a quantitative evaluation of the benefits provided by the FIM approach. The identification of the IMs and FIMs is formalized as a linear matrix inequality (LMI) optimization problem, with the (mean) amplitude of the prediction interval as the cost function and the parameters defining the interval amplitudes of the IMs and FIMs as the optimization variables. Based on the identified models, FD validation tests have been successfully conducted on actual flight data of a P92 Tecnam aircraft by artificially injecting additive fault signals into the fault-free IMU readings.
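    As a rough illustration of the interval-based FD logic described in the abstract (not the paper's actual FIM/LMI identification), the Python sketch below flags a fault whenever an IMU reading leaves a prediction interval built from a central estimate and an interval half-width; detect_faults, the synthetic signal, and the constant half-width are hypothetical placeholders.

```python
import numpy as np

def detect_faults(y_meas, y_central, half_width):
    """Flag samples whose measurement leaves the prediction interval.

    y_meas     : measured IMU channel (1-D array)
    y_central  : central prediction, e.g. from a neural network (1-D array)
    half_width : interval half-amplitude from the (fuzzy) interval model,
                 either a scalar or a per-sample array
    """
    lower = y_central - half_width
    upper = y_central + half_width
    return (y_meas < lower) | (y_meas > upper)

# Hypothetical usage: constant-width interval on a synthetic signal.
t = np.linspace(0.0, 10.0, 500)
y_central = np.sin(t)                          # stand-in for the NN central prediction
y_meas = y_central + 0.02 * np.random.randn(t.size)
y_meas[300:320] += 0.5                         # artificially injected additive fault
faults = detect_faults(y_meas, y_central, half_width=0.1)
print("fault samples detected:", int(faults.sum()))
```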

    Predicting Skin Permeability by means of Computational Approaches: Reliability and Caveats in Pharmaceutical Studies

    The skin is the main barrier between the internal body environment and the external one. The characteristics and properties of this barrier can modify and affect drug delivery and chemical toxicity parameters. It is therefore not surprising that the permeability of many different compounds has been measured through several in vitro and in vivo techniques. Moreover, many different in silico approaches have been used to identify the correlation between the structure of the permeants and their permeability, to reproduce skin behavior, and to predict the ability of specific chemicals to permeate this barrier. A significant number of issues, such as interlaboratory variability, experimental conditions, data set building rationales, and skin site of origin and hydration, still prevent us from obtaining a definitive predictive skin permeability model. This review aims to show the main advances and the principal approaches in computational methods used to predict this property, to highlight the main issues that have arisen, and to outline the challenges to be addressed in future research.
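    The in silico structure-permeability correlations mentioned above are often simple regressions on molecular descriptors. The sketch below fits a linear QSPR relation of the form log Kp ≈ a*logP + b*MW + c entirely on synthetic placeholder data; the descriptor values, coefficients, and noise level are illustrative assumptions, not results from the review.

```python
import numpy as np

# Synthetic placeholder descriptors: octanol-water logP and molecular weight.
rng = np.random.default_rng(0)
logP = rng.uniform(-1.0, 4.0, 50)
mw = rng.uniform(100.0, 500.0, 50)
# Placeholder "measured" log Kp values; real studies would use in vitro data.
log_kp = -2.7 + 0.7 * logP - 0.006 * mw + rng.normal(0.0, 0.3, 50)

# Ordinary least squares fit of log Kp = a*logP + b*MW + c.
X = np.column_stack([logP, mw, np.ones_like(logP)])
coef, *_ = np.linalg.lstsq(X, log_kp, rcond=None)
a, b, c = coef
print(f"log Kp ~ {a:.2f}*logP + {b:.4f}*MW + {c:.2f}")
```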

    An empirical learning-based validation procedure for simulation workflow

    A simulation workflow is a top-level model for the design and control of a simulation process. It connects multiple simulation components with timing and interaction restrictions to form a complete simulation system. Before the construction and evaluation of the component models, validating the upper-layer simulation workflow is of the utmost importance in a simulation system. However, methods specifically for validating simulation workflows are very limited, and many of the existing validation techniques are domain-dependent, relying on cumbersome questionnaire design and expert scoring. This paper therefore presents an empirical learning-based validation procedure that implements a semi-automated evaluation of simulation workflows. First, representative features of general simulation workflows and their relations with validation indices are proposed. The calculation of workflow credibility based on the Analytic Hierarchy Process (AHP) is then introduced. To make full use of historical data and implement more efficient validation, four learning algorithms, including the back-propagation neural network (BPNN), extreme learning machine (ELM), evolving neo-fuzzy neuron (eNFN), and fast incremental Gaussian mixture model (FIGMN), are introduced to construct the empirical relation between workflow credibility and its features. A case study on a landing-process simulation workflow is established to test the feasibility of the proposed procedure. The experimental results also provide a useful overview of state-of-the-art learning algorithms for the credibility evaluation of simulation models.
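    As a small illustration of the AHP step mentioned above (not the paper's full procedure or its learning models), the sketch below derives index weights from a pairwise-comparison matrix via the principal eigenvector and combines them with per-index scores into a single credibility value; the comparison matrix and scores are hypothetical.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix
    (principal right eigenvector, normalized to sum to 1)."""
    vals, vecs = np.linalg.eig(pairwise)
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    w = np.abs(principal)
    return w / w.sum()

# Hypothetical 3-index comparison (e.g. timing, interaction, structure).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
weights = ahp_weights(A)

# Hypothetical per-index scores in [0, 1]; credibility is the weighted sum.
scores = np.array([0.9, 0.7, 0.8])
credibility = float(weights @ scores)
print("weights:", np.round(weights, 3), "credibility:", round(credibility, 3))
```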

    Full correction of scattering effects by using the radiative transfer theory for improved quantitative analysis of absorbing species in suspensions

    Sample-to-sample photon path length variations that arise due to multiple scattering can be removed by decoupling absorption and scattering effects through the radiative transfer theory, given a suitable set of measurements. For samples where particles both scatter and absorb light, the extracted bulk absorption spectrum is not completely free from nonlinear particle effects, since it is related to the absorption cross-section of particles, which changes nonlinearly with particle size and shape. For the quantitative analysis of absorbing-only (i.e., nonscattering) species present in a matrix that contains a particulate species that absorbs and scatters light, a method to eliminate particle effects completely is proposed here. It uses the particle size information contained in the bulk scattering coefficient, extracted via Mie theory, to carry out an additional correction step that removes particle effects from the bulk absorption spectra. This should result in spectra that are equivalent to spectra collected with only the liquid species in the mixture. Such an approach has the potential to significantly reduce the number of calibration samples as well as improve calibration performance. The proposed method was tested with both simulated and experimental data from a four-component model system.
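    The sketch below shows one way the two-step correction described above could look, under the strong simplifying assumption that the particulate contribution to the bulk absorption can be approximated by a proxy proportional to the bulk scattering coefficient (in place of a full Mie-theory inversion for particle size); the spectra, the k_particle factor, and correct_particle_absorption are illustrative assumptions, not the paper's method.

```python
import numpy as np

def correct_particle_absorption(mu_a, mu_s, k_particle):
    """Remove an estimated particulate contribution from bulk absorption.

    mu_a       : bulk absorption spectrum extracted via radiative transfer (1/mm)
    mu_s       : bulk scattering spectrum at the same wavelengths (1/mm)
    k_particle : assumed proportionality between the scattering-derived
                 particle proxy and the particle absorption contribution
    """
    particle_absorption = k_particle * mu_s   # crude proxy for the particle term
    return mu_a - particle_absorption         # approximates liquid-only absorption

# Hypothetical usage on a synthetic two-component spectrum.
wavelengths = np.linspace(1100.0, 1700.0, 200)                  # nm
liquid_only = 0.05 * np.exp(-((wavelengths - 1450.0) / 60.0) ** 2)
mu_s = 0.8 + 0.0002 * (1700.0 - wavelengths)                    # smooth scattering baseline
mu_a = liquid_only + 0.01 * mu_s                                # particles absorb as well
corrected = correct_particle_absorption(mu_a, mu_s, k_particle=0.01)
print("max residual vs liquid-only:", float(np.max(np.abs(corrected - liquid_only))))
```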