    Rough sets, their extensions and applications

    Rough set theory provides a useful mathematical foundation for developing automated computational systems that can help understand and make use of imperfect knowledge. Despite its recency, the theory and its extensions have been widely applied to many problems, including decision analysis, data mining, intelligent control and pattern recognition. This paper presents an outline of the basic concepts of rough sets and their major extensions, covering variable precision, tolerance and fuzzy rough sets. It also shows the diversity of successful applications these theories have enabled, ranging from finance and business, through biology and medicine, to the physical sciences, art, and meteorology.
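
    The key machinery alluded to here is the pair of lower and upper approximations of a target concept with respect to the indiscernibility (equivalence) relation induced by a set of attributes. A minimal sketch in Python, using a hypothetical toy decision table rather than anything from the paper:

        # Minimal sketch of Pawlak rough-set approximations (illustrative only).
        from collections import defaultdict

        def indiscernibility_classes(objects, attrs):
            """Group objects whose values agree on every attribute in attrs."""
            classes = defaultdict(set)
            for name, values in objects.items():
                classes[tuple(values[a] for a in attrs)].add(name)
            return classes.values()

        def approximations(objects, attrs, target):
            """Lower approx: classes contained in target; upper: classes meeting it."""
            lower, upper = set(), set()
            for cls in indiscernibility_classes(objects, attrs):
                if cls <= target:
                    lower |= cls
                if cls & target:
                    upper |= cls
            return lower, upper

        # Hypothetical decision table (attribute names and values are made up).
        objects = {
            'o1': {'colour': 'red',  'size': 'big'},
            'o2': {'colour': 'red',  'size': 'big'},
            'o3': {'colour': 'blue', 'size': 'small'},
        }
        lower, upper = approximations(objects, ['colour', 'size'], {'o1', 'o3'})
        print(lower, upper)   # boundary region = upper - lower

    Extensions such as variable precision or fuzzy rough sets relax the crisp containment test above, for example by tolerating a small error rate or by admitting graded membership.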

    LearnFCA: A Fuzzy FCA and Probability Based Approach for Learning and Classification

    Formal concept analysis (FCA) is a mathematical theory based on lattice and order theory, used for data analysis and knowledge representation. Over the past several years, many of its extensions have been proposed and applied in several domains, including data mining, machine learning, knowledge management, the semantic web, software development, chemistry, biology, medicine, data analytics, and ontology engineering. This thesis reviews the state of the art of Formal Concept Analysis (FCA) and the various extensions that have been developed and well studied in the past several years. We discuss their historical roots and reproduce the original definitions and derivations with illustrative examples. Further, we provide a literature review of its applications and the various approaches adopted by researchers in the areas of data analysis and knowledge management, with emphasis on learning and classification problems. We propose LearnFCA, a novel approach based on FuzzyFCA and probability theory for learning and classification problems. LearnFCA uses an enhanced version of FuzzyLattice, which has been developed to store class labels and probability vectors and can be used to classify instances with encoded and unlabelled features. We evaluate LearnFCA on encodings from three datasets (MNIST, Omniglot and cancer images) with interesting results and varying degrees of success. Adviser: Dr Jitender Deogu
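
    For orientation, FCA derives concepts from a binary object-attribute context via two derivation operators: an object set maps to the attributes its members share, and an attribute set maps to the objects possessing them all; a formal concept is a pair closed under both. A minimal sketch in Python over a hypothetical toy context (not data from the thesis):

        # Minimal sketch of FCA derivation operators (illustrative toy context).
        context = {                 # object -> attributes it has
            'g1': {'a', 'b'},
            'g2': {'a', 'c'},
            'g3': {'a', 'b', 'c'},
        }
        all_attrs = set().union(*context.values())

        def intent(objs):
            """Attributes shared by every object in objs."""
            return set.intersection(*(context[g] for g in objs)) if objs else set(all_attrs)

        def extent(attrs):
            """Objects that possess every attribute in attrs."""
            return {g for g, has in context.items() if attrs <= has}

        A = extent({'b'})                 # {'g1', 'g3'}
        B = intent(A)                     # {'a', 'b'}
        print(A, B, extent(B) == A)       # closed pair, i.e. a formal concept

    A fuzzy variant, such as the FuzzyFCA that LearnFCA builds on, typically replaces the crisp membership test with graded degrees, so intents become fuzzy attribute sets rather than plain ones.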

    Faculty of Sciences

    A comprehensive study of fuzzy rough sets and their application in data reduction

    A semantical and computational approach to covering-based rough sets

    From Quantum Metalanguage to the Logic of Qubits

    The main aim of this thesis is to look for a logical deductive calculus (we adopt sequent calculus, originally introduced in Gentzen, 1935) that can describe quantum information and its properties. More precisely, we intend to describe in logical terms the formation of the qubit (the unit of quantum information), which is a particular linear superposition of the two classical bits 0 and 1. To do so, we had to introduce the new connective "quantum superposition" in the logic of one qubit, Lq, as the classical conjunction cannot describe this quantum link. Comment: 138 pages, PhD thesis in Mathematics
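
    For reference, the state the new connective is meant to capture is the standard qubit superposition; in the usual textbook notation (not the thesis's own symbols):

        \[
          |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
          \qquad \alpha,\beta \in \mathbb{C},
          \qquad |\alpha|^2 + |\beta|^2 = 1 .
        \]

    The classical bits correspond to the degenerate cases (α, β) = (1, 0) and (0, 1); a classical conjunction of 0 and 1 cannot express the continuum of amplitudes in between, which is why a dedicated connective is introduced.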

    Performance modelling and the representation of large scale distributed system functions

    This thesis presents a resource-based approach to model generation for performance characterization and correctness checking of large-scale telecommunications networks. A notion called the timed automaton is proposed and then developed to encapsulate the behaviours of networking equipment, system control policies and non-deterministic user behaviours. The states of pooled network resources and the behaviours of resource consumers are represented as continually varying geometric patterns; these patterns form part of the data operated upon by the timed automata. Such a representation technique allows great flexibility in the level of abstraction that can be chosen when modelling telecommunications systems. Nonetheless, the notion of system functions is proposed to serve as a constraining framework for specifying bounded behaviours and features of telecommunications systems. Operational concepts are developed for the timed automata; these concepts are based on limit-preserving relations. Relations over system states represent the evolution of system properties observable at various locations within the network under study. The declarative nature of such permutative state relations provides a direct framework for generating highly expressive models suitable for carrying out optimization experiments. The usefulness of the developed procedure is demonstrated by tackling a large-scale case study, in particular the problem of congestion avoidance in networks; it is shown that there can be global coupling among local behaviours within a telecommunications network. The uncovering of such a phenomenon through a function-oriented simulation is a contribution to the area of network modelling. The direct and faithful way of deriving performance metrics for loss in networks from resource utilization patterns is also a new contribution to the work area.
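
    As a rough illustration of the modelling ingredients named above (not the thesis's own formalism), a timed automaton over a pooled resource can be sketched as states, a clock, and guarded transitions whose firing consumes or releases capacity; all names below are hypothetical:

        # Hedged sketch of a timed-automaton-style model of a pooled network resource.
        from dataclasses import dataclass, field

        @dataclass
        class Transition:
            source: str
            target: str
            guard: object           # guard(clock, resources) -> bool
            action: object          # action(resources) -> None

        @dataclass
        class TimedAutomaton:
            state: str
            clock: float = 0.0
            resources: dict = field(default_factory=dict)
            transitions: list = field(default_factory=list)

            def step(self, dt):
                """Advance the clock, then fire the first enabled transition, if any."""
                self.clock += dt
                for t in self.transitions:
                    if t.source == self.state and t.guard(self.clock, self.resources):
                        t.action(self.resources)
                        self.state, self.clock = t.target, 0.0
                        break

        # Toy admission control: a call is admitted only while pooled capacity remains.
        ta = TimedAutomaton(
            state='idle',
            resources={'capacity': 2},
            transitions=[
                Transition('idle', 'busy',
                           guard=lambda c, r: r['capacity'] > 0,
                           action=lambda r: r.update(capacity=r['capacity'] - 1)),
                Transition('busy', 'idle',
                           guard=lambda c, r: c >= 1.0,      # minimum hold time
                           action=lambda r: r.update(capacity=r['capacity'] + 1)),
            ],
        )
        ta.step(0.1)    # idle -> busy, capacity drops to 1
        ta.step(1.5)    # busy -> idle once the hold time has elapsed

    Composing many such automata over a shared resource dictionary is one simple way to exhibit the kind of global coupling among local behaviours that the abstract reports.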