
    Analysing imperfect temporal information in GIS using the Triangular Model

    Rough sets and fuzzy sets are two frequently used approaches for modelling and reasoning about imperfect time intervals. In this paper, we focus on imperfect time intervals that can be modelled by rough sets and use an innovative graphic model, the Triangular Model (TM), to represent them. This work shows that the TM is potentially advantageous for visualizing and querying imperfect time intervals, and that its analytical power is better exploited when it is implemented in a computer application with a graphical user interface and interactive functions. Moreover, a probabilistic framework is proposed to handle uncertainty in temporal queries. We use a case study to illustrate how the unique insights gained through the TM can support exploratory spatio-temporal analysis in a geographical information system.
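
    To make the TM mapping concrete, here is a minimal sketch in Python, assuming the common convention that an interval [a, b] is plotted at the point (midpoint, duration); the rough-interval query is an illustrative three-valued check, not the paper's implementation.

```python
# Minimal sketch of the Triangular Model (TM) mapping, under the common
# convention that a crisp interval [a, b] is plotted at (midpoint, duration).
# Function names are illustrative, not taken from the paper.

def tm_point(a: float, b: float) -> tuple[float, float]:
    """Map a crisp time interval [a, b] to its TM point."""
    return ((a + b) / 2.0, b - a)

# A rough time interval: the start lies somewhere in [s_lo, s_hi] and the
# end somewhere in [e_lo, e_hi] (lower/upper approximations).
def rough_contains(s_lo, s_hi, e_lo, e_hi, t):
    """Three-valued answer to 'does the interval contain instant t?'."""
    if s_hi <= t <= e_lo:        # every compatible interval contains t
        return "yes"
    if t < s_lo or t > e_hi:     # no compatible interval contains t
        return "no"
    return "maybe"               # only some compatible intervals contain t

print(tm_point(2, 6))                  # (4.0, 4): position 4, duration 4
print(rough_contains(1, 3, 8, 10, 5))  # 'yes'
```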

    After-sales services optimisation through dynamic opportunistic maintenance: a wind energy case study

    After-sales maintenance services can be a very profitable source of income for original equipment manufacturers (OEMs), given asset users' increasing interest in performance-based contracts. However, in the product value-adding process, OEMs have traditionally focused more on improving their production processes than on complementing their products with after-sales services, and consequently struggle to offer them efficiently. Furthermore, owing both to the high uncertainty of asset behaviour and to the inherent challenges of managing the maintenance process (e.g. which maintenance strategy to follow or which resources to deploy), it is difficult to build a business around the provision of after-sales services. To support business and maintenance decision makers at this point, this paper proposes a framework for optimising the income from after-sales maintenance services by: 1) implementing advanced multi-objective opportunistic maintenance strategies that systematically consider the assets' operational context in order to perform preventive maintenance under the most favourable conditions, 2) considering the specific needs of OEMs and users, and 3) assessing both the internal and external uncertainties that might condition the success of the after-sales services. A case study developed for the wind energy sector demonstrates the suitability of the presented framework for optimising after-sales services.

    Funding: EU Framework Programme Horizon 2020, MSCA-RISE-2014: Marie SkƂodowska-Curie Research and Innovation Staff Exchange (RISE) (grant agreement number 645733, Sustain-Owner-H2020-MSCA-RISE-2014) and the EmaitekPlus 2016-2017 Program of the Basque Government.
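
    To make the opportunistic policy concrete, the sketch below shows a deliberately simple age-threshold rule: when a stoppage opportunity arises (a failure elsewhere in the turbine, or a forecast low-wind window), any component that has consumed most of its preventive-maintenance interval is serviced as well. The threshold, names and numbers are illustrative assumptions, not the paper's optimisation model.

```python
# Minimal sketch of an opportunistic preventive-maintenance (PM) rule.
# All names, intervals and the 70% threshold are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Component:
    name: str
    age_h: float          # operating hours since last maintenance
    pm_interval_h: float  # planned preventive-maintenance interval

def opportunistic_pm(components, threshold=0.7):
    """Components worth servicing during an opportunity window."""
    return [c for c in components if c.age_h / c.pm_interval_h >= threshold]

fleet = [
    Component("gearbox",   age_h=6500, pm_interval_h=8000),  # 81% consumed
    Component("generator", age_h=3000, pm_interval_h=8000),  # 38% consumed
    Component("pitch",     age_h=5900, pm_interval_h=8000),  # 74% consumed
]
print([c.name for c in opportunistic_pm(fleet)])  # ['gearbox', 'pitch']
```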

    A two-step fusion process for multi-criteria decision applied to natural hazards in mountains

    Mountain river torrents and snow avalanches cause human and material damage with dramatic consequences. Knowledge about these natural phenomena is often lacking, and expertise is required for decision and risk management purposes, using multi-disciplinary quantitative or qualitative approaches. Expertise is considered a decision process based on imperfect information coming from more or less reliable and conflicting sources. A methodology is described that combines the Analytic Hierarchy Process (AHP), a multi-criteria decision-aid method, with information fusion based on belief function theory. Fuzzy set and possibility theories allow quantitative and qualitative criteria to be transformed into a common frame of discernment for decision making in the Dempster-Shafer Theory (DST) and Dezert-Smarandache Theory (DSmT) contexts. The main issues are the elicitation of basic belief assignments, conflict identification and management, the choice of fusion rules, and the validation of results, as well as the specific need to distinguish importance from reliability and uncertainty in the fusion process.
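
    As a reference point for the fusion step, here is a minimal sketch of Dempster's rule of combination over a small hazard-level frame of discernment. The frame and mass assignments are invented for illustration; they are not the paper's case-study values, and the DSmT variants mentioned above use different combination rules.

```python
# Minimal sketch of Dempster's rule of combination (DST). Focal elements
# are frozensets over a small frame; the masses below are illustrative only.

from itertools import product

FRAME = frozenset({"low", "medium", "high"})

def dempster_combine(m1: dict, m2: dict) -> dict:
    """Combine two basic belief assignments with Dempster's rule."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb            # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully contradict")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sources, e.g. two criteria mapped into the same frame.
m1 = {frozenset({"high"}): 0.6, frozenset({"medium", "high"}): 0.4}
m2 = {frozenset({"medium"}): 0.5, FRAME: 0.5}
print(dempster_combine(m1, m2))
# {'high'}: ~0.43, {'medium'}: ~0.29, {'medium','high'}: ~0.29
```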

    Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study

    BACKGROUND: Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with the quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. RESULTS: The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques are compared: a traditional method and a mixed-model Bayesian approach. The latter accounts for the statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed-model approach assimilates information from replicate QMM assays, improving reliability and inter-assay homogeneity and providing an accurate appraisal of quantitative and diagnostic performance. CONCLUSIONS: Bayesian mixed-model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
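
    For orientation, the sketch below shows the kind of 'traditional' standard-curve calibration the abstract contrasts with the Bayesian mixed model: fit a straight line of assay signal against log10 standard density, then invert it for an unknown sample. All numbers are invented; fitting one pooled line like this is exactly what ignores inter-assay variability.

```python
# Minimal sketch of traditional standard-curve calibration. Densities and
# signals are invented; 'signal' could be e.g. time to positivity.

import numpy as np

log10_density = np.array([1.0, 2.0, 3.0, 4.0, 5.0])      # known standards
signal = np.array([42.1, 36.8, 31.2, 25.9, 20.3])        # assay readings

slope, intercept = np.polyfit(log10_density, signal, 1)  # pooled line

def estimate_density(observed_signal: float) -> float:
    """Invert the calibration line to estimate pathogen density."""
    return 10 ** ((observed_signal - intercept) / slope)

print(f"{estimate_density(28.0):.0f} gametocytes/uL")    # ~4000
```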

    Implementing imperfect information in fuzzy databases

    Information in real-world applications is often vague, imprecise and uncertain. Ignoring the inherently imperfect nature of the real world inevitably distorts human perception of it and may discard substantial information that could be very useful in data-intensive applications. In the database context, several fuzzy database models have been proposed, introducing fuzziness at different levels; common to all these proposals is the support of fuzziness at the attribute level. This paper first proposes a rich set of data types devoted to modelling the different kinds of imperfect information, and then proposes a formal approach to implementing these data types. The proposed approach was implemented within a relational object database model, but it is generic enough to be incorporated into other database models.
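
    To illustrate the kind of data type involved, here is a minimal sketch of an attribute value represented as a possibility distribution, one of the classic ways to model imprecise values at the attribute level. The class and method names are hypothetical, not the paper's actual type system.

```python
# Minimal sketch of a possibilistic attribute value. Names are hypothetical.

class PossibilisticValue:
    """Attribute value known only through a possibility distribution."""

    def __init__(self, distribution: dict):
        if any(not 0 < p <= 1 for p in distribution.values()):
            raise ValueError("possibility degrees must lie in (0, 1]")
        self.dist = distribution

    def possibility(self, candidates) -> float:
        """Possibility that the true value lies in `candidates`."""
        return max((p for v, p in self.dist.items() if v in candidates),
                   default=0.0)

    def necessity(self, candidates) -> float:
        """Necessity = 1 - possibility of the complement."""
        return 1.0 - max((p for v, p in self.dist.items()
                          if v not in candidates), default=0.0)

# 'Age is about 30': fully possibly 30, somewhat possibly 29 or 31.
age = PossibilisticValue({29: 0.6, 30: 1.0, 31: 0.6})
print(age.possibility({30, 31}))    # 1.0
print(age.necessity({30}))          # 0.4 = 1 - max(0.6, 0.6)
```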

    Impact of imperfect test sensitivity on determining risk factors: the case of bovine tuberculosis

    Background: Imperfect diagnostic testing reduces the power to detect significant predictors in classical cross-sectional studies. Assuming that the misclassification in diagnosis is random, this can be dealt with by increasing the sample size of a study. However, the effects of imperfect tests in longitudinal data analyses are not as straightforward to anticipate, especially if the outcome of the test influences behaviour. The aim of this paper is to investigate the impact of imperfect test sensitivity on the determination of predictor variables in a longitudinal study. Methodology/Principal Findings: To deal with imperfect test sensitivity affecting the response variable, we transformed the observed response variable into a set of possible temporal patterns of true disease status, whose prior probability was a function of the test sensitivity. We fitted a Bayesian discrete-time survival model using an MCMC algorithm that treats the true response patterns as unknown parameters in the model. We applied our approach to epidemiological data on bovine tuberculosis outbreaks in England and investigated the effect of reduced test sensitivity on the determination of risk factors for the disease. We found that reduced test sensitivity led to changes in the set of risk factors associated with the probability of an outbreak that were chosen in the ‘best’ model, and to an increase in the uncertainty surrounding the parameter estimates for a model with a fixed set of risk factors associated with the response variable. Conclusions/Significance: We propose a novel algorithm to fit discrete survival models for longitudinal data where values of the response variable are uncertain. When analysing longitudinal data, uncertainty surrounding the response variable will affect the significance of the predictors and should therefore be accounted for, either at the design stage by increasing the sample size or after the analysis by conducting appropriate sensitivity analyses.
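
    To make the pattern construction concrete, the sketch below enumerates the true-status patterns compatible with an observed test sequence, assuming perfect specificity and sensitivity se, and weights each by P(observed | true). It is an unconstrained toy version; epidemiological constraints on how status can change over time, and the survival model itself, are omitted.

```python
# Minimal sketch: enumerate true disease-status patterns compatible with an
# observed test sequence, assuming perfect specificity and sensitivity `se`.

from itertools import product

def candidate_true_patterns(observed, se=0.8):
    """Yield (true_pattern, P(observed | true)) for compatible patterns."""
    for true in product([0, 1], repeat=len(observed)):
        w = 1.0
        for o, t in zip(observed, true):
            if t == 1:
                w *= se if o == 1 else (1 - se)   # a positive may be missed
            elif o == 1:
                w = 0.0                           # perfect specificity
                break
        if w > 0:
            yield true, w

for pattern, weight in candidate_true_patterns((1, 0, 1)):
    print(pattern, round(weight, 3))
# (1, 0, 1) 0.64    (1, 1, 1) 0.128  -- the middle negative may be a miss
```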

    Reduced order modeling of fluid flows: Machine learning, Kolmogorov barrier, closure modeling, and partitioning

    In this paper, we put forth a long short-term memory (LSTM) nudging framework for enhancing reduced order models (ROMs) of fluid flows using noisy measurements. We build on the fact that in a realistic application there are uncertainties in initial conditions, boundary conditions, model parameters, and/or field measurements. Moreover, conventional nonlinear ROMs based on Galerkin projection (GROMs) suffer from imperfection and solution instabilities due to modal truncation, especially for advection-dominated flows with slow decay in the Kolmogorov width. In the presented LSTM-Nudge approach, we fuse forecasts from a combination of the imperfect GROM and uncertain state estimates with sparse Eulerian sensor measurements to provide more reliable predictions in a dynamical data assimilation framework. We illustrate the idea with the viscous Burgers problem, a benchmark test bed with quadratic nonlinearity and Laplacian dissipation. We investigate the effects of measurement noise and state-estimate uncertainty on the performance of LSTM-Nudge, and demonstrate that it can handle different levels of temporal and spatial measurement sparsity. This first step in our assessment of the proposed model shows that LSTM nudging could represent a viable real-time predictive tool in emerging digital twin systems.
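
    As a reference for the mechanism underneath the framework, here is a minimal sketch of a forecast-plus-nudge update with a fixed scalar gain: advance the imperfect model, then relax the forecast toward sparse noisy observations. In the paper the correction is produced by an LSTM; the fixed gain here is a stand-in, and the toy decay dynamics are invented.

```python
# Minimal sketch of nudging-based data assimilation with a fixed gain.
# In the paper's LSTM-Nudge framework the correction is learned; here a
# scalar gain and invented decay dynamics stand in for illustration.

import numpy as np

def nudged_step(x, model_step, H, y_obs, gain=0.1):
    """One forecast-plus-nudge update of the ROM state x (shape (n,)).

    H is the observation operator (m, n) with m << n (sparse sensors)."""
    x_f = model_step(x)                     # imperfect forecast
    innovation = y_obs - H @ x_f            # mismatch at the sensors
    return x_f + gain * (H.T @ innovation)  # relax toward the data

rng = np.random.default_rng(0)
n, m = 8, 3
H = np.eye(n)[:m]                           # observe the first 3 modes
decay = lambda x, r: x * np.exp(-r)
x_true, x_rom = np.ones(n), np.ones(n)
for _ in range(20):
    x_true = decay(x_true, 0.05)                    # 'true' dynamics
    y = H @ x_true + 0.01 * rng.standard_normal(m)  # noisy sensor data
    x_rom = nudged_step(x_rom, lambda x: decay(x, 0.10), H, y, gain=0.3)
print(np.abs(x_rom[:m] - x_true[:m]).max())         # observed modes track
```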
    • 

    corecore