6,048 research outputs found

    Median evidential c-means algorithm and its application to community detection

    Median clustering is of great value for partitioning relational data. In this paper, a new prototype-based clustering method called Median Evidential C-Means (MECM) is proposed; it extends median c-means and median fuzzy c-means within the theoretical framework of belief functions. The median variant relaxes the requirement that the objects be embedded in a metric space but constrains the prototypes to belong to the original data set. Thanks to these properties, MECM can be applied to graph clustering problems. A community detection scheme for social networks based on MECM is investigated, and the resulting credal partitions of graphs, which are more refined than crisp and fuzzy ones, provide a better understanding of graph structure. An initial prototype-selection scheme based on evidential semi-centrality is presented to avoid premature local convergence, and an evidential modularity function is defined to choose the optimal number of communities. Finally, experiments on synthetic and real data sets illustrate the performance of MECM and show how it differs from other methods.
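    As a rough illustration of the median constraint at the heart of this family of methods, the sketch below implements plain (hard) median c-means on a dissimilarity matrix: prototypes are always objects of the data set, so only pairwise dissimilarities are needed. This is a minimal baseline under assumed standard update steps, not MECM itself; the evidential machinery (mass functions over sets of clusters yielding a credal partition) is omitted, and all names are illustrative.

```python
import numpy as np

def median_c_means(D, k, n_iter=100, seed=0):
    """Plain median c-means on an n x n dissimilarity matrix D.

    Prototypes are constrained to be objects of the data set, so only
    pairwise dissimilarities are needed -- the property that lets the
    median variant handle relational and graph data.
    """
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    protos = rng.choice(n, size=k, replace=False)
    labels = np.zeros(n, dtype=int)
    for _ in range(n_iter):
        # assignment step: each object joins its nearest prototype
        labels = np.argmin(D[:, protos], axis=1)
        # median step: the new prototype of a cluster is the member
        # minimising the sum of dissimilarities to all other members
        new_protos = protos.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if members.size:
                within = D[np.ix_(members, members)].sum(axis=0)
                new_protos[j] = members[np.argmin(within)]
        if np.array_equal(new_protos, protos):
            break
        protos = new_protos
    return protos, labels
```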

    Evidential relational clustering using medoids

    In real clustering applications, proximity data, in which only pairwise similarities or dissimilarities are known, is more general than object data, in which each pattern is described explicitly by a list of attributes. Medoid-based clustering algorithms, which assume the prototypes of classes are objects, are of great value for partitioning relational data sets. In this paper, a new prototype-based clustering method named Evidential C-Medoids (ECMdd) is proposed; it extends Fuzzy C-Medoids (FCMdd) within the theoretical framework of belief functions. In ECMdd, medoids are used as the prototypes of the detected classes, including both specific classes and imprecise classes. Specific classes are for data that are distinctly far from the prototypes of other classes, while imprecise classes accept objects that may be close to the prototypes of more than one class. This soft decision mechanism makes the clustering results more cautious and reduces misclassification rates. Experiments on synthetic and real data sets illustrate the performance of ECMdd. The results show that ECMdd captures the uncertainty in the internal data structure well. Moreover, it is more robust to initialization than FCMdd. (Presented at the 18th International Conference on Information Fusion, July 2015, Washington, DC, USA.)
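    Since FCMdd is the starting point that ECMdd extends, a compact sketch of FCMdd on a dissimilarity matrix may help fix ideas: memberships follow the usual fuzzy c-means form, and each medoid is the object minimising the fuzzy within-cluster cost. This shows only the classical algorithm with its standard update rules; the evidential extension, which additionally allocates mass to imprecise (set-valued) classes, is not shown.

```python
import numpy as np

def fuzzy_c_medoids(D, k, m=2.0, n_iter=100, seed=0):
    """Sketch of Fuzzy C-Medoids (FCMdd) on an n x n dissimilarity matrix D.

    ECMdd keeps this medoid mechanism but, in the belief-function
    framework, also assigns mass to sets of classes (imprecise classes).
    """
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    medoids = rng.choice(n, size=k, replace=False)
    eps = 1e-12
    U = np.full((n, k), 1.0 / k)
    for _ in range(n_iter):
        # membership update: standard fuzzy-c-means form on dissimilarities
        d = D[:, medoids] + eps
        inv = d ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)        # n x k memberships
        # medoid update: candidate object minimising the fuzzy cluster cost
        # (a fuller implementation would also forbid duplicate medoids)
        cost = (U ** m).T @ D                           # k x n
        new_medoids = cost.argmin(axis=1)
        if np.array_equal(np.sort(new_medoids), np.sort(medoids)):
            break
        medoids = new_medoids
    return medoids, U
```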

    Maximin and maximal solutions for linear programming problems with possibilistic uncertainty

    We consider linear programming problems with uncertain constraint coefficients described by intervals or, more generally, possibility distributions. The uncertainty is given a behavioral interpretation using coherent lower previsions from the theory of imprecise probabilities. We give a meaning to the linear programming problems by reformulating them as decision problems under such imprecise-probabilistic uncertainty. We provide expressions for and illustrations of the maximin and maximal solutions of these decision problems and present computational approaches for dealing with them.
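    To make the maximin idea concrete, here is a toy sketch for the simplest case the abstract mentions, interval-valued constraint coefficients. Assuming nonnegative decision variables and uncertainty only in the constraint matrix, the worst case of each constraint is attained at the upper interval bounds, so the maximin solution reduces to an ordinary LP. All numbers here are made up, and the paper's more general possibilistic treatment via coherent lower previsions is not captured by this reduction.

```python
import numpy as np
from scipy.optimize import linprog

# Toy maximin LP: maximise c @ x subject to A @ x <= b, x >= 0, where
# each coefficient A[i, j] is only known to lie in [A_lo[i, j], A_hi[i, j]].
# With x >= 0, every constraint is tightest at the upper bounds, so the
# maximin (worst-case optimal) solution solves the LP built from A_hi.
c = np.array([3.0, 2.0])
A_lo = np.array([[1.0, 1.0], [0.5, 2.0]])   # lower interval bounds (unused
A_hi = np.array([[2.0, 1.5], [1.0, 3.0]])   # in the worst case) and upper
b = np.array([10.0, 12.0])

# scipy's linprog minimises, so negate the objective
res = linprog(-c, A_ub=A_hi, b_ub=b, bounds=[(0, None)] * 2)
print("maximin solution:", res.x, "worst-case objective:", -res.fun)
```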

    The Combination of Paradoxical, Uncertain, and Imprecise Sources of Information based on DSmT and Neutro-Fuzzy Inference

    The management and combination of uncertain, imprecise, fuzzy, and even paradoxical or highly conflicting sources of information has always been, and remains today, of primary importance for the development of reliable modern information systems involving artificial reasoning. In this chapter, we present a survey of our recent theory of plausible and paradoxical reasoning, known in the literature as Dezert-Smarandache Theory (DSmT), developed for dealing with imprecise, uncertain, and paradoxical sources of information. We focus our presentation on the foundations of DSmT and on two important new rules of combination, rather than on browsing the specific applications of DSmT available in the literature. Several simple examples are given throughout the presentation to show the efficiency and generality of this new approach. The last part of this chapter presents neutrosophic logic and neutro-fuzzy inference and their connection with DSmT. Fuzzy logic and neutrosophic logic are useful tools for decision making after fusing the information with the DSm hybrid rule of combination of masses.
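    For readers unfamiliar with combination rules on mass functions, the sketch below implements Dempster's classical rule, the baseline that DSmT departs from: here the conflicting mass (empty intersections) is normalised away, whereas the DSm rules instead keep mass on intersections to represent paradoxical evidence. This is therefore not the DSm hybrid rule itself, just the standard reference point, with purely illustrative masses.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Masses are dicts mapping frozenset focal elements to weights summing
    to 1.  Mass on empty intersections (conflict) is renormalised away;
    DSmT, by contrast, can retain mass on intersections such as A /\\ B.
    """
    combined, conflict = {}, 0.0
    for (A, wa), (B, wb) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            combined[C] = combined.get(C, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {A: w / (1.0 - conflict) for A, w in combined.items()}

# Two partially conflicting sources over the frame {a, b}
m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m2 = {frozenset({"b"}): 0.7, frozenset({"a", "b"}): 0.3}
print(dempster_combine(m1, m2))
```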

    How much of commonsense and legal reasoning is formalizable? A review of conceptual obstacles

    Fifty years of effort in artificial intelligence (AI) and the formalization of legal reasoning have produced both successes and failures. Considerable success in organizing and displaying evidence and its interrelationships has been accompanied by failure to achieve the original ambition of AI as applied to law: fully automated legal decision-making. The obstacles to formalizing legal reasoning have proved to be the same ones that make the formalization of commonsense reasoning so difficult, and are most evident where legal reasoning has to meld with the vast web of ordinary human knowledge of the world. Underlying many of the problems is the mismatch between the discreteness of symbol manipulation and the continuous nature of imprecise natural language, of degrees of similarity and analogy, and of probabilities.

    Imprecise probabilistic evaluation of sewer flooding in urban drainage systems using random set theory

    Uncertainty analysis is widely applied in water system modeling to quantify prediction uncertainty from models and data. Conventional methods typically handle the various kinds of uncertainty using a single characterizing approach, be it probability theory or fuzzy set theory. However, using a single approach may not be appropriate, particularly when the uncertainties are of different types. For example, in sewer flood estimation problems, random rainfall variables are used as model inputs while imprecise or subjective information is used to define model parameters. This paper presents a general framework for sewer flood estimation that enables simultaneous consideration of two types of uncertainty: randomness in rainfall data, represented using imprecise probabilities, and imprecision in model parameters, represented by fuzzy numbers. These two types of uncertainty are combined using random set theory and then propagated through a hydrodynamic urban drainage model. Two propagation methods, a discretization-based and a Monte Carlo based method, are presented and compared, with the latter shown to be much more computationally efficient and hence recommended for high-dimensional problems. The model output (flood depth) is generated in the form of lower and upper cumulative probabilities, which are the best estimates given the various stochastic and epistemic uncertainties considered and which bracket the unknown true cumulative probability. The distance between the two cumulative probabilities represents the extent of imprecise, incomplete, or conflicting information and can be reduced only when more knowledge is available. The proposed methodology gives a more complete, and thus more accurate, representation of uncertainty in data and models and can effectively handle different uncertainty characterizations in a single, integrated framework for sewer flood estimation.
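    A minimal sketch of the Monte Carlo based propagation described above, under heavy assumptions: a one-line stand-in for the hydrodynamic model, an illustrative lognormal rainfall distribution, and a triangular fuzzy runoff parameter sampled by random alpha-cuts. Each sample then yields an interval of flood depths, and counting interval endpoints against a threshold gives the lower and upper cumulative probabilities (a p-box); the gap between them reflects the epistemic imprecision, as in the paper.

```python
import numpy as np

def toy_flood_model(rain, k):
    # stand-in for a hydrodynamic model: flood depth as a simple
    # monotone function of rainfall and an imprecise runoff parameter k
    return k * np.sqrt(rain)

rng = np.random.default_rng(42)
n = 10_000

# aleatory input: rainfall depth, lognormal purely for illustration
rain = rng.lognormal(mean=3.0, sigma=0.5, size=n)

# epistemic parameter: triangular fuzzy number (0.8, 1.0, 1.3) sampled
# via random alpha-cuts, giving an interval [k_lo, k_hi] per sample
alpha = rng.uniform(size=n)
k_lo = 0.8 + alpha * (1.0 - 0.8)
k_hi = 1.3 - alpha * (1.3 - 1.0)

# interval-valued model output per random-set sample (model is monotone in k)
y_lo = toy_flood_model(rain, k_lo)
y_hi = toy_flood_model(rain, k_hi)

# lower/upper cumulative probabilities of {depth <= t}: the whole interval
# must lie below t for the lower bound, some of it for the upper bound
t = 25.0
lower_cdf = np.mean(y_hi <= t)
upper_cdf = np.mean(y_lo <= t)
print(f"P(depth <= {t}) in [{lower_cdf:.3f}, {upper_cdf:.3f}]")
```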

    Uncertainty Analysis of the Adequacy Assessment Model of a Distributed Generation System

    Due to the inherent aleatory uncertainties in renewable generators, reliability/adequacy assessments of distributed generation (DG) systems have focused mainly on the probabilistic modeling of random behaviors, given sufficiently informative data. However, another type of uncertainty (epistemic uncertainty) must also be accounted for in the modeling, due to incomplete knowledge of the phenomena and imprecise evaluation of the related characteristic parameters. When informative data are scarce, this type of uncertainty calls for alternative methods of representation, propagation, analysis, and interpretation. In this study, we make a first attempt to identify, model, and jointly propagate aleatory and epistemic uncertainties in the context of DG system modeling for adequacy assessment. Probability and possibility distributions are used to model the aleatory and epistemic uncertainties, respectively. Evidence theory is used to incorporate the two kinds of uncertainty within a single framework. Based on the plausibility and belief functions of evidence theory, a hybrid propagation approach is introduced. A demonstration is given on a DG system adapted from the IEEE 34-node distribution test feeder. Compared with a purely probabilistic approach, the hybrid propagation is shown to express explicitly the imprecision in the knowledge of the DG parameters in the final adequacy values assessed. It also effectively captures the growth of uncertainty with higher DG penetration levels.
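    The hybrid propagation can be sketched in the same spirit, again with entirely illustrative distributions and parameter values rather than the paper's IEEE 34-node case: aleatory variables (wind speed, demand) are sampled probabilistically, an epistemic power-curve coefficient is modeled as a triangular possibility distribution sampled by alpha-cuts, and the belief and plausibility of the adequacy event are read off the resulting intervals. Higher imprecision in the coefficient widens the belief-plausibility gap, which is the effect the paper reports for higher DG penetration.

```python
import numpy as np

rng = np.random.default_rng(7)
n_mc = 5_000

# aleatory uncertainty: wind speed and demand, both purely illustrative
wind = rng.weibull(2.0, size=n_mc) * 8.0     # m/s
demand = rng.normal(10.0, 1.5, size=n_mc)    # MW

# epistemic uncertainty: power-curve coefficient as a triangular
# possibility distribution (0.10, 0.12, 0.15); one alpha-cut per run
alpha = rng.uniform(size=n_mc)
c_lo = 0.10 + alpha * (0.12 - 0.10)
c_hi = 0.15 - alpha * (0.15 - 0.12)

# generation interval induced by the epistemic interval (monotone model)
gen_lo = c_lo * wind ** 3
gen_hi = c_hi * wind ** 3

# adequacy event: generation covers demand
belief = np.mean(gen_lo >= demand)   # adequate for every epistemic value
plaus = np.mean(gen_hi >= demand)    # adequate for some epistemic value
print(f"adequacy: belief={belief:.3f}, plausibility={plaus:.3f}")
```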