
    Logic of agreement: Foundations, semantic system and proof theory

    In this paper a multi-valued propositional logic, the logic of agreement, is presented in terms of its model theory and inference system. This formal system is the natural consequence of a new way to approach concepts such as commonsense knowledge, uncertainty and approximate reasoning: the point of view of agreement. In particular, a possible extension of the classical theory of sets is discussed, based on the idea that, instead of trying to conceptualize sets as "fuzzy" or "vague" entities, it is more adequate to define membership as the result of a partial agreement among a group of individual agents. Furthermore, it is shown that the concept of agreement provides a framework for developing a formal and sound explanation of concepts (e.g. fuzzy sets) that lack formal semantics. According to the definition of agreement, an individual agent either agrees or does not agree that an object possesses a certain property. A clear distinction is then established between an individual agent, for whom deciding whether an element belongs to a set is a yes-or-no matter, and a commonsensical agent, who interprets the knowledge shared by a certain group of people. Finally, the logic of agreement is presented and discussed. Since the existence of several individual agents is assumed, the semantic system is based on the perspective that each individual agent defines her or his own conceptualization of reality. The semantics of the logic of agreement can therefore be seen as similar to a possible-worlds semantics, with one world for each individual agent. The proof theory is an extension of a natural deduction system, using supported formulas and incorporating only inference rules. The soundness and completeness of the logic of agreement are also presented.
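The core idea, membership as partial agreement among crisp individual agents, can be sketched as follows. This is a minimal illustration, not the paper's formal system; the agents, their thresholds, and the `has_property` predicate are invented for the example.

```python
# Each individual agent casts a crisp yes/no vote on whether an object has a
# property; the commonsensical degree of membership is the fraction of agents
# who agree.

def agreement_degree(agents, has_property):
    """Fraction of agents who agree that the object has the property."""
    votes = [1 if has_property(a) else 0 for a in agents]
    return sum(votes) / len(votes)

# Example: five agents judging whether a height of 178 cm counts as "tall",
# each with a personal crisp threshold in cm.
agents = [170, 175, 180, 185, 190]
is_tall = lambda threshold: 178 >= threshold
print(agreement_degree(agents, is_tall))  # 0.4: two of five agents agree
```

Each agent's decision is purely bivalent; the graded value arises only at the level of the group, which is exactly the distinction the abstract draws between individual and commonsensical agents.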

    A discriminative analysis of approaches to ranking fuzzy numbers in fuzzy decision making

    This paper presents a discriminative analysis of approaches to ranking fuzzy numbers in fuzzy decision making, based on a comprehensive review of existing approaches. The consistency and effectiveness of these approaches are examined in terms of two newly developed objective measures, leading to a better understanding of their relative performance in ranking fuzzy numbers. Representative fuzzy numbers are selected for a comparative study of several typical ranking approaches. Several interesting findings are identified that may be of practical significance to fuzzy decision making in real situations.
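As a concrete illustration of what such ranking approaches do, here is one common method (not necessarily among those surveyed in the paper): ranking triangular fuzzy numbers (a, b, c) by the x-coordinate of their centroid, (a + b + c) / 3. The labels and values are made up for the example.

```python
# Rank triangular fuzzy numbers by centroid; larger centroid ranks higher.

def centroid(tfn):
    a, b, c = tfn
    return (a + b + c) / 3

fuzzy_numbers = {"A": (1, 3, 5), "B": (2, 3, 5), "C": (2, 4, 6)}
ranking = sorted(fuzzy_numbers, key=lambda k: centroid(fuzzy_numbers[k]),
                 reverse=True)
print(ranking)  # ['C', 'B', 'A']: centroids 4.0, 3.33..., 3.0
```

Different ranking approaches can disagree on the same set of fuzzy numbers (e.g. centroid-based methods cannot separate two numbers with equal centroids but different spreads), which is precisely why consistency measures like those developed in the paper are needed.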

    Signature Verification and Forgery Detection System

    This paper presents an innovative approach to signature verification and forgery detection based on fuzzy modeling. The signature images are binarized, resized to a fixed-size window, and then thinned. The thinned image is partitioned into eight sub-images, called boxes, using the horizontal density approximation approach. Each sub-image is then further resized and partitioned into twelve sub-images using the uniform partitioning approach. The features considered are the normalized vector angle and distance from each box. Each feature extracted from the sample signatures gives rise to fuzzy sets. Since the choice of a proper fuzzification function is crucial for verification, we have devised a new fuzzification function with structural parameters that is able to adapt to the variations in the fuzzy sets. This function is employed to develop a complete forgery detection and verification system.
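The two box features named above (normalized vector angle and distance) could be computed along these lines. The abstract does not fix the details, so the reference point (bottom-left corner of the box) and the aggregation (means over the 'on' pixels) are assumptions made for the sketch.

```python
import math

def box_features(pixels, ref=(0.0, 0.0)):
    """Mean normalized distance and mean vector angle of the 'on' pixels
    in one box, measured from a reference point (illustrative choice)."""
    if not pixels:
        return 0.0, 0.0
    dists = [math.hypot(x - ref[0], y - ref[1]) for x, y in pixels]
    angles = [math.atan2(y - ref[1], x - ref[0]) for x, y in pixels]
    max_d = max(dists) or 1.0  # normalize distances into [0, 1]
    mean_dist = sum(d / max_d for d in dists) / len(dists)
    mean_angle = sum(angles) / len(angles)
    return mean_dist, mean_angle

# Example: three thinned-signature pixels in one box.
print(box_features([(1, 1), (2, 2), (3, 1)]))
```

Computing one (distance, angle) pair per box over 8 x 12 sub-images yields a fixed-length feature vector per signature, which is then fuzzified for matching.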

    Simulation-based evaluation of defuzzification-based approaches to fuzzy multi-attribute decision making

    This paper presents a simulation-based study evaluating the performance of 12 defuzzification-based approaches for solving the general fuzzy multiattribute decision-making (MADM) problem, which requires a cardinal ranking of decision alternatives. These approaches are generated from six defuzzification methods in conjunction with the simple additive weighting (SAW) method and the technique for order preference by similarity to the ideal solution. The consistency and effectiveness of these approaches are examined in terms of four new objective performance measures based on five evaluation indexes. The simulation results show that the approaches capable of using all the available information on fuzzy numbers effectively in the defuzzification process produce more consistent ranking outcomes. In particular, the SAW method with degree-of-dominance defuzzification proves to be the overall best-performing approach, followed by the SAW method with area-center defuzzification. These findings are of practical significance in real-world settings where defuzzification-based approaches must be selected for solving general fuzzy MADM problems under specific decision contexts.
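A defuzzification-based SAW step can be sketched as follows. The paper's degree-of-dominance and area-center defuzzifiers are more elaborate; a simple centroid defuzzifier for triangular fuzzy ratings stands in for them here, and the ratings and weights are invented for the example.

```python
# Defuzzify each fuzzy rating, then apply simple additive weighting (SAW).

def defuzzify(tfn):
    a, b, c = tfn
    return (a + b + c) / 3  # centroid of a triangular fuzzy number

def saw_scores(ratings, weights):
    """ratings[i][j] is alternative i's fuzzy rating on attribute j."""
    return [sum(w * defuzzify(r) for w, r in zip(weights, row))
            for row in ratings]

ratings = [
    [(6, 7, 8), (4, 5, 6)],   # alternative A1
    [(5, 6, 7), (7, 8, 9)],   # alternative A2
]
weights = [0.5, 0.5]
print(saw_scores(ratings, weights))  # [6.0, 7.0]: A2 ranks first
```

Because the defuzzifier collapses each fuzzy rating to a single number before weighting, the choice of defuzzification method directly determines the final ranking, which is what the simulation study quantifies.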

    A Procedure for Extending Input Selection Algorithms to Low Quality Data in Modelling Problems with Application to the Automatic Grading of Uploaded Assignments

    When selecting relevant inputs in modeling problems with low-quality data, the ranking of the most informative inputs is also uncertain. In this paper, this issue is addressed through a new procedure that allows different crisp feature selection algorithms to be extended to vague data. The partial knowledge about the rank of each feature is modelled by means of a possibility distribution, and a ranking is then applied to sort these distributions. It is shown that this technique makes the most of the available information in some vague datasets. The approach is demonstrated in a real-world application: in the context of massive online computer science courses, methods are sought for automatically providing students with a grade through code metrics. Feature selection methods are used to find the metrics involved in the most meaningful predictions. In this study, 800 source code files, collected and revised by the authors in classroom Computer Science lectures taught between 2013 and 2014, are analyzed with the proposed technique, and the most relevant metrics for the automatic grading task are discussed.

    This work was supported by the Spanish Ministerio de Economía y Competitividad under Project TIN2011-24302, including funding from the European Regional Development Fund.
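The core mechanism, sorting features whose ranks are only possibilistically known, could look like this. This is a hedged sketch of the idea rather than the paper's exact procedure: the summary statistic (a possibility-weighted mean rank) and the metric names are assumptions made for the example.

```python
# Each feature's uncertain rank is a possibility distribution over rank
# positions; features are ordered by a crisp summary of that distribution.

def weighted_mean_rank(dist):
    """dist maps rank position -> possibility degree in [0, 1]."""
    total = sum(dist.values())
    return sum(pos * poss for pos, poss in dist.items()) / total

features = {
    "lines_of_code": {1: 1.0, 2: 0.6},          # almost surely the top metric
    "cyclomatic":    {1: 0.3, 2: 1.0, 3: 0.8},  # most possibly second
    "comment_ratio": {3: 1.0, 4: 0.9},
}
ordered = sorted(features, key=lambda f: weighted_mean_rank(features[f]))
print(ordered)
```

A crisp feature selector produces a single rank per feature; extending it to vague data means each run over the imprecise dataset contributes possibility mass to several positions, and the final ordering is taken over the resulting distributions.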

    Are We Always Translating Signs Whether We Know It or Not?

    This paper is an extended review and discussion of Professor Dinda Gorlee's recent book Wittgenstein in Translation: Exploring Semiotic Signatures. Professor Gorlee's volume focuses on Ludwig Wittgenstein's fragments, many of which ended up in enticingly interconnected books primarily edited by others, and on Charles S. Peirce's stops and starts, which ended in mounds of unpublished papers, only a fraction of which have found their way between book covers. Both authors challenge whoever might venture to translate them, especially since they have so much to say on vague and uncertain interpretations, which is to say, translations.

    Noncomparabilities & Non Standard Logics

    Many normative theories set forth in the welfare economics, distributive justice and cognate literatures posit noncomparabilities or incommensurabilities between magnitudes of various kinds. In some cases these gaps are predicated on metaphysical claims, in others upon epistemic claims, and in still others upon political-moral claims. I show that in all such cases they are best given formal expression in nonstandard logics that reject bivalence, excluded middle, or both. I do so by reference to an illustrative case study: a contradiction known to beset John Rawls's selection and characterization of primary goods as the proper distribuendum in any distributively just society. The contradiction is avoided only by reformulating Rawls's claims in a nonstandard form, which happens also to cohere quite attractively with Rawls's intuitive argumentation on behalf of his claims.
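For readers unfamiliar with logics that reject bivalence or excluded middle, a standard example (illustrative only; the paper does not commit to this particular system) is Lukasiewicz three-valued logic, where 0, 0.5 and 1 stand for false, indeterminate and true.

```python
# Lukasiewicz three-valued connectives: negation is 1 - p, disjunction is max.

def neg(p):
    return 1 - p

def lor(p, q):
    return max(p, q)

# Excluded middle (p or not-p) is not valid: it takes value 0.5 at p = 0.5,
# so there is a truth value strictly between false and true (no bivalence).
for p in (0, 0.5, 1):
    print(p, lor(p, neg(p)))
```

An incomparability claim ("neither A is at least as good as B, nor B at least as good as A, nor are they equally good") can then be assigned the intermediate value rather than forcing a true/false verdict, which is the formal move the paper argues for.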

    Fuzzy modeling for multicriteria decision making under uncertainty

    Master's thesis (Master of Engineering)