
    Uncertainty Analysis of the Adequacy Assessment Model of a Distributed Generation System

    Due to the inherent aleatory uncertainties in renewable generators, reliability/adequacy assessments of distributed generation (DG) systems have focused on the probabilistic modeling of random behaviors, which requires sufficiently informative data. However, another type of uncertainty (epistemic uncertainty) must also be accounted for in the modeling, due to incomplete knowledge of the phenomena and imprecise evaluation of the related characteristic parameters. When informative data are scarce, this type of uncertainty calls for alternative methods of representation, propagation, analysis and interpretation. In this study, we make a first attempt to identify, model, and jointly propagate aleatory and epistemic uncertainties in the context of DG system modeling for adequacy assessment. Probability and possibility distributions are used to model the aleatory and epistemic uncertainties, respectively. Evidence theory is used to incorporate the two uncertainties under a single framework. Based on the plausibility and belief functions of evidence theory, a hybrid propagation approach is introduced. A demonstration is given on a DG system adapted from the IEEE 34-node distribution test feeder. Compared to the pure probabilistic approach, the hybrid propagation is shown to explicitly carry the imprecision in the knowledge of the DG parameters through to the final assessed adequacy values. It also effectively captures the growth of uncertainties at higher DG penetration levels.
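
    The hybrid propagation idea can be illustrated with a small sketch (the toy adequacy model, the Weibull wind input and the triangular availability distribution below are invented for illustration, not the paper's system): aleatory variables are sampled by Monte Carlo, the epistemic variable is cut at a range of alpha levels, and the model is evaluated at the interval endpoints to obtain lower and upper adequacy bounds per alpha level, in the spirit of belief/plausibility envelopes.

```python
import numpy as np

def tri_cut(a, m, b, alpha):
    """Alpha-cut [lo, hi] of a triangular possibility distribution (a, m, b)."""
    return a + alpha * (m - a), b - alpha * (b - m)

# Toy adequacy indicator, monotone in the epistemic availability parameter
# (hypothetical model, for illustration only).
def supplied(wind, avail):
    return min(avail * 0.5 * wind**3 / 100.0, 1.0)

rng = np.random.default_rng(0)
alphas = np.linspace(0.0, 1.0, 11)
lower = np.zeros_like(alphas)
upper = np.zeros_like(alphas)
n_mc = 2000
for _ in range(n_mc):
    wind = rng.weibull(2.0) * 8.0                 # aleatory input, Monte Carlo sampled
    for i, a in enumerate(alphas):
        lo, hi = tri_cut(0.85, 0.90, 0.95, a)     # epistemic input, alpha-cut interval
        lower[i] += supplied(wind, lo)            # endpoint evaluation suffices: model is monotone
        upper[i] += supplied(wind, hi)
print(lower / n_mc)   # average lower adequacy bound per alpha level
print(upper / n_mc)   # average upper adequacy bound per alpha level
```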

    Characterizing and Extending Answer Set Semantics using Possibility Theory

    Answer Set Programming (ASP) is a popular framework for modeling combinatorial problems. However, ASP cannot easily be used for reasoning about uncertain information. Possibilistic ASP (PASP) is an extension of ASP that combines possibilistic logic and ASP. In PASP, a weight is associated with each rule, interpreted as the certainty with which the conclusion can be established when the body is known to hold. As such, it allows us to model and reason about uncertain information in an intuitive way. In this paper we present new semantics for PASP, in which rules are interpreted as constraints on possibility distributions. Special models of these constraints are then identified as possibilistic answer sets. In addition, since ASP is a special case of PASP in which all rules are entirely certain, we obtain a new characterization of ASP in terms of constraints on possibility distributions. This allows us to uncover a new form of disjunction, called weak disjunction, that has not previously been considered in the literature. In addition to introducing and motivating the semantics of weak disjunction, we also pinpoint its computational complexity. In particular, while the complexity of most reasoning tasks coincides with standard disjunctive ASP, we find that brave reasoning for programs with weak disjunctions is easier.
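
    The possibilistic reading of weighted rules can be illustrated with a tiny forward-chaining sketch (this is not the constraint-based semantics proposed in the paper, and the facts and rules are invented): the certainty derived for a rule's head is the minimum of the rule weight and the certainty of its body, and repeated derivations keep the best certainty obtained.

```python
# Minimal possibilistic forward chaining over definite rules (illustrative only):
# each rule is (head, [body atoms], weight); the certainty of a derived head is
# min(rule weight, min certainty of the body), keeping the best value found.
def possibilistic_closure(facts, rules):
    cert = dict(facts)                       # atom -> certainty degree in [0, 1]
    changed = True
    while changed:
        changed = False
        for head, body, w in rules:
            if all(b in cert for b in body):
                c = min([w] + [cert[b] for b in body])
                if c > cert.get(head, 0.0):
                    cert[head] = c
                    changed = True
    return cert

facts = {"bird": 1.0, "injured": 0.6}        # hypothetical certainties
rules = [("flies", ["bird"], 0.8),           # birds fly, fairly certain
         ("grounded", ["injured"], 0.9)]     # injured animals stay grounded
print(possibilistic_closure(facts, rules))
# {'bird': 1.0, 'injured': 0.6, 'flies': 0.8, 'grounded': 0.6}
```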

    New Graphical Model for Computing Optimistic Decisions in Possibility Theory Framework

    This paper first proposes a new graphical model for decision making under uncertainty based on min-based possibilistic networks. A decision problem under uncertainty is described by means of two distinct min-based possibilistic networks: the first expresses the agent's knowledge, while the second encodes the agent's preferences, representing a qualitative utility. We then propose an efficient algorithm for computing optimistic optimal decisions using our new model for representing possibilistic decision making under uncertainty. We show that the computation of optimal decisions comes down to computing a normalization degree of the junction tree associated with the graph resulting from the fusion of the agent's beliefs and preferences. This paper also proposes an alternative way of computing optimal optimistic decisions. The idea is to transform the two possibilistic networks into two equivalent possibilistic logic knowledge bases, one representing the agent's knowledge and the other the agent's preferences. We show that computing an optimal optimistic decision comes down to computing the inconsistency degree of the union of the two possibilistic bases augmented with a given decision.
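
    The quantity being optimized can be made concrete with a brute-force sketch (invented degrees; the paper computes the same value via junction trees or inconsistency degrees rather than enumeration): the optimistic qualitative utility of a decision d is the maximum over states of the minimum between the possibility of the state under d and its preference degree.

```python
# Brute-force optimistic qualitative utility: u*(d) = max over states of
# min(possibility of the state given decision d, preference of the state).
# All degrees below are invented for illustration.
states = ["dry", "wet"]
decisions = ["umbrella", "no_umbrella"]

pi = {("dry", "umbrella"): 0.4, ("wet", "umbrella"): 1.0,     # knowledge
      ("dry", "no_umbrella"): 0.4, ("wet", "no_umbrella"): 1.0}
mu = {("dry", "umbrella"): 0.7, ("wet", "umbrella"): 1.0,     # preferences
      ("dry", "no_umbrella"): 1.0, ("wet", "no_umbrella"): 0.2}

def optimistic_utility(d):
    return max(min(pi[(s, d)], mu[(s, d)]) for s in states)

for d in decisions:
    print(d, optimistic_utility(d))
print("optimal optimistic decision:", max(decisions, key=optimistic_utility))
```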

    Naive possibilistic classifiers for imprecise or uncertain numerical data

    In real-world problems, input data may be pervaded with uncertainty. In this paper, we investigate the behavior of naive possibilistic classifiers, as a counterpart to naive Bayesian ones, for dealing with classification tasks in the presence of uncertainty. For this purpose, we extend possibilistic classifiers, which have recently been adapted to numerical data, in order to cope with uncertainty in data representation. Here the possibility distributions used are supposed to encode the family of Gaussian probability distributions that are compatible with the considered dataset. We consider two types of uncertainty: (i) the uncertainty associated with the class in the training set, which is modeled by a possibility distribution over class labels, and (ii) the imprecision pervading attribute values in the testing set, represented in the form of intervals for continuous data. Moreover, the approach takes into account the uncertainty about the estimation of the Gaussian distribution parameters due to the limited amount of data available. We first adapt the possibilistic classification model, previously proposed for the certain case, in order to accommodate uncertainty about class labels. Then, we propose an algorithm based on the extension principle to deal with imprecise attribute values. The reported experiments show the value of possibilistic classifiers for handling uncertainty in data. In particular, the probability-to-possibility transform-based classifier shows robust behavior when dealing with imperfect data.
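
    The extension-principle treatment of an interval-valued test attribute can be sketched as follows (a triangular distribution stands in for the Gaussian-derived possibility distributions of the paper, and all numbers are invented): the possibility degree a class assigns to the interval is the supremum of its distribution over that interval, attained at the mode when the mode falls inside the interval and at the nearer endpoint otherwise.

```python
def tri_pi(x, a, m, b):
    """Triangular possibility distribution with support [a, b] and mode m."""
    if x <= a or x >= b:
        return 0.0
    return (x - a) / (m - a) if x <= m else (b - x) / (b - m)

def pi_of_interval(lo, hi, a, m, b):
    """Extension principle for an interval-valued attribute: sup of the
    distribution over [lo, hi]; for a unimodal distribution the sup is at the
    mode if it falls inside the interval, else at the nearer endpoint."""
    if lo <= m <= hi:
        return 1.0
    x = lo if abs(lo - m) < abs(hi - m) else hi
    return tri_pi(x, a, m, b)

# Hypothetical per-class distributions for one attribute, and an imprecise
# test value given as an interval.
classes = {"A": (2.0, 5.0, 8.0), "B": (6.0, 9.0, 12.0)}
interval = (5.5, 6.5)
print({c: pi_of_interval(*interval, *p) for c, p in classes.items()})
# Class A is more plausible: its mode is nearer to the observed interval.
```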

    General fuzzy min-max neural network for clustering and classification

    This paper describes a general fuzzy min-max (GFMM) neural network which is a generalization and extension of the fuzzy min-max clustering and classification algorithms of Simpson (1992, 1993). The GFMM method combines supervised and unsupervised learning in a single training algorithm. The fusion of clustering and classification results in an algorithm that can be used for pure clustering, pure classification, or hybrid clustering and classification. It exhibits the property of finding decision boundaries between classes while clustering patterns that cannot be said to belong to any of the existing classes. As in the original algorithms, hyperbox fuzzy sets are used to represent clusters and classes. Learning is usually completed in a few passes and consists of placing and adjusting hyperboxes in the pattern space; this is an expansion-contraction process. The classification results can be crisp or fuzzy. New data can be included without the need for retraining. While retaining all the interesting features of the original algorithms, a number of modifications to their definition have been made in order to accommodate fuzzy input patterns in the form of lower and upper bounds, combine supervised and unsupervised learning, and improve the effectiveness of operations. A detailed account of the GFMM neural network, its comparison with Simpson's fuzzy min-max neural networks, a set of examples, and an application to leakage detection and identification in water distribution systems are given.
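
    The hyperbox mechanics can be sketched in a simplified form (the membership function and the size limit below are illustrative stand-ins, not the paper's exact definitions): a hyperbox is a pair of min and max corner vectors, membership decays with the distance of a pattern outside the box, and the box is expanded to absorb a new pattern only if every resulting edge stays within a size limit theta.

```python
import numpy as np

THETA = 0.3   # maximum allowed hyperbox edge per dimension (hypothetical value)
GAMMA = 4.0   # sensitivity of the membership decay outside the box

def membership(x, v, w, gamma=GAMMA):
    """Simplified hyperbox membership: 1 inside the box [v, w], decaying
    linearly with the per-dimension distance outside it (not the exact
    GFMM membership function)."""
    outside = np.maximum(0.0, v - x) + np.maximum(0.0, x - w)
    return float(np.min(np.maximum(0.0, 1.0 - gamma * outside)))

def try_expand(x, v, w, theta=THETA):
    """Expansion step: enlarge the box to include x only if every resulting
    edge stays below theta; otherwise signal that a new box is needed."""
    v_new, w_new = np.minimum(v, x), np.maximum(w, x)
    if np.all(w_new - v_new <= theta):
        return v_new, w_new, True
    return v, w, False

v, w = np.array([0.2, 0.2]), np.array([0.3, 0.4])   # an existing hyperbox
x = np.array([0.35, 0.25])                          # a new (normalized) pattern
print("membership before expansion:", membership(x, v, w))
v, w, expanded = try_expand(x, v, w)
print("expanded:", expanded, "new box:", v, w)
```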

    Possibilistic networks for uncertainty knowledge processing in student diagnosis

    In this paper, a possibilistic network implementation for modeling the uncertain knowledge of the diagnostic process is proposed as a means to achieve student diagnosis in an intelligent tutoring system. The approach is applied in the object-oriented programming domain for diagnosing students' learning errors and misconceptions. In this expertise domain, dependencies exist between the data and are encoded in the structure of the network. Qualitative information about these data is also available, and it is represented and interpreted with the qualitative approach of possibility theory. The aim of the student diagnosis system is to provide adapted support for the student and to sustain the student in a personalized learning process and in the explanation of errors.
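
    A tiny sketch of the kind of inference such a network supports (a min-based chain rule over two nodes, with invented misconceptions, errors and degrees rather than the paper's actual model): the joint possibility of a misconception and an observed error is the minimum of the prior possibility of the misconception and the conditional possibility of the error given it, and diagnosis ranks misconceptions by that degree.

```python
# Min-based possibilistic chain rule over a two-node network
# misconception -> observed error (all degrees invented for illustration).
prior = {"confuses_ref_and_copy": 1.0, "ignores_inheritance": 0.6, "none": 1.0}
cond = {  # pi(error | misconception)
    ("aliasing_bug", "confuses_ref_and_copy"): 1.0,
    ("aliasing_bug", "ignores_inheritance"): 0.3,
    ("aliasing_bug", "none"): 0.2,
}

def diagnose(observed_error):
    # pi(misconception, error) = min(prior, conditional); rank misconceptions
    joint = {m: min(prior[m], cond[(observed_error, m)]) for m in prior}
    return sorted(joint.items(), key=lambda kv: -kv[1])

print(diagnose("aliasing_bug"))
# [('confuses_ref_and_copy', 1.0), ('ignores_inheritance', 0.3), ('none', 0.2)]
```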

    Extending uncertainty formalisms to linear constraints and other complex formalisms

    Linear constraints occur naturally in many reasoning problems, and the information that they represent is often uncertain. There is a difficulty in applying AI uncertainty formalisms to this situation, as their representation of the underlying logic, either as a mutually exclusive and exhaustive set of possibilities, or with a propositional or a predicate logic, is inappropriate (or at least unhelpful). To overcome this difficulty, we express reasoning with linear constraints as a logic, and develop the formalisms based on this different underlying logic. We focus in particular on a possibilistic logic representation of uncertain linear constraints, a lattice-valued possibilistic logic, an assumption-based reasoning formalism and a Dempster-Shafer representation, proving some fundamental results for these extended systems. Our results on extending uncertainty formalisms also apply to a very general class of underlying monotonic logics.
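
    One basic operation in a possibilistic treatment of linear constraints is the inconsistency degree of a weighted constraint base: the highest weight alpha such that the constraints of weight at least alpha are jointly infeasible, and 0 if the whole base is feasible. A brute-force sketch under that definition (the constraints and weights are invented, and the paper's formal machinery is considerably richer) can test each weight level with an off-the-shelf LP feasibility check.

```python
import numpy as np
from scipy.optimize import linprog

# A possibilistic base of linear constraints a.x <= b, each with a weight
# (all values invented for illustration).
constraints = [
    (np.array([1.0, 0.0]), 1.0, 1.0),    # x <= 1   with weight 1.0
    (np.array([-1.0, 0.0]), -3.0, 0.7),  # x >= 3   with weight 0.7
    (np.array([0.0, 1.0]), 5.0, 0.4),    # y <= 5   with weight 0.4
]

def feasible(cut):
    """LP feasibility of a set of constraints a.x <= b (zero objective)."""
    A = np.array([a for a, _, _ in cut])
    b = np.array([rhs for _, rhs, _ in cut])
    n = A.shape[1]
    res = linprog(np.zeros(n), A_ub=A, b_ub=b, bounds=[(None, None)] * n)
    return res.status == 0

def inconsistency_degree(base):
    """Highest weight alpha whose cut {constraints with weight >= alpha}
    is infeasible; 0 if every cut (hence the whole base) is feasible."""
    for alpha in sorted({w for _, _, w in base}, reverse=True):
        cut = [c for c in base if c[2] >= alpha]
        if not feasible(cut):
            return alpha
    return 0.0

print(inconsistency_degree(constraints))   # 0.7: x <= 1 and x >= 3 conflict
```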

    Possibilistic classifiers for numerical data

    Naive Bayesian classifiers, which rely on independence hypotheses together with a normality assumption to estimate densities for numerical data, are known for their simplicity and their effectiveness. However, estimating densities, even under the normality assumption, may be problematic when data are scarce. In such a situation, possibility distributions may provide a more faithful representation of these data. Naive Possibilistic Classifiers (NPC), based on possibility theory, have recently been proposed as a counterpart of Bayesian classifiers to deal with classification tasks. There are only a few works that treat possibilistic classification, and most existing NPCs deal only with categorical attributes. This work focuses on the estimation of possibility distributions for continuous data. In this paper we investigate two kinds of possibilistic classifiers. The first is derived from classical or flexible Bayesian classifiers by applying a probability-possibility transformation to Gaussian distributions, which introduces some further tolerance in the description of classes. The second is based on a direct interpretation of data in possibilistic formats that exploits an idea of proximity between data values in different ways, which provides a less constrained representation of them. We show that possibilistic classifiers have a better capability than Bayesian classifiers to detect new instances for which the classification is ambiguous, where probabilities may be poorly estimated and illusorily precise. Moreover, we propose, in this case, a hybrid possibilistic classification approach based on a nearest-neighbour heuristic to improve the accuracy of the proposed possibilistic classifiers when the available information is insufficient to choose between classes. Possibilistic classifiers are compared with classical or flexible Bayesian classifiers on a collection of benchmark databases. The reported experiments show the value of possibilistic classifiers. In particular, flexible possibilistic classifiers perform well for data agreeing with the normality assumption, while proximity-based possibilistic classifiers outperform the others in the remaining cases. The hybrid possibilistic classification exhibits a good ability to improve accuracy.
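
    A minimal classifier in the spirit of the first, transform-based family can be sketched as follows (the transform shown is one standard probability-possibility transformation for a Gaussian, the training data are invented, and the attribute degrees are combined by minimum, which is one common option): each attribute value yields a per-class possibility degree, the degrees are combined, and an instance is flagged as ambiguous when the runner-up class comes close to the winner, which is where the nearest-neighbour fallback mentioned above would take over.

```python
import math
import numpy as np

def gauss_to_pi(x, mu, sigma):
    """One standard probability-to-possibility transform of N(mu, sigma^2):
    pi(x) = 2 * (1 - Phi(|x - mu| / sigma)), so that pi(mu) = 1."""
    z = abs(x - mu) / sigma
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def fit(X, y):
    """Per-class, per-attribute mean and standard deviation (naive assumption)."""
    return {c: (X[y == c].mean(axis=0), X[y == c].std(axis=0, ddof=1))
            for c in np.unique(y)}

def classify(params, x, margin=0.2):
    """Min-combination of attribute possibility degrees; returns the best class,
    all class scores, and an ambiguity flag when the runner-up is within margin."""
    scores = {c: min(gauss_to_pi(xi, m, s) for xi, m, s in zip(x, mu, sd))
              for c, (mu, sd) in params.items()}
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    ambiguous = len(ranked) > 1 and ranked[0][1] - ranked[1][1] < margin
    return ranked[0][0], scores, ambiguous

# Tiny invented training set: two numerical attributes, two classes.
X = np.array([[1.0, 2.0], [1.2, 2.1], [0.9, 1.9],
              [3.0, 4.0], [3.2, 4.2], [2.9, 3.8]])
y = np.array(["a", "a", "a", "b", "b", "b"])
print(classify(fit(X, y), np.array([1.1, 2.0])))
```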