705 research outputs found

    Formal Concept Analysis Applications in Bioinformatics

    Bioinformatics is an important field that seeks to solve biological problems with the help of computation. One specific area within bioinformatics is genomics, the study of genes and their functions. Genomics can provide valuable insight into how genes interact with their environment. One way to measure this interaction is through gene expression data, which indicates whether (and to what degree) a certain gene is activated in a given situation. Analyzing this data can be critical for predicting diseases or other biological reactions. One method used for such analysis is Formal Concept Analysis (FCA), a computing technique based on partial orders that allows the user to examine the structural properties of binary data according to which subsets of the data set depend on each other. This thesis surveys, in breadth and depth, the current literature on the use of FCA in bioinformatics, with particular focus on gene expression data. This includes descriptions of data management techniques specific to FCA, such as lattice reduction, discretization, and variations of FCA that account for different data types. Advantages and shortcomings of using FCA for genomic investigations, as well as the feasibility of using FCA for this application, are addressed. Finally, several areas for future doctoral research are proposed. Adviser: Jitender S. Deogu
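    The closure mechanics FCA applies to binary data can be illustrated with a short, self-contained sketch: each formal concept pairs a set of objects with the full set of attributes they share. The gene and condition names below are invented placeholders, not data from the thesis.

```python
from itertools import combinations

# Toy binary context: genes (objects) x conditions (attributes).
context = {
    "gene1": {"condA", "condB"},
    "gene2": {"condB", "condC"},
    "gene3": {"condA", "condB", "condC"},
}
attributes = set().union(*context.values())

def extent(attrs):
    """Objects having every attribute in attrs."""
    return {g for g, a in context.items() if attrs <= a}

def intent(objs):
    """Attributes shared by every object in objs."""
    return set.intersection(*(context[g] for g in objs)) if objs else set(attributes)

# A formal concept is a pair (objects, attrs) closed under both maps;
# brute-force enumeration over attribute subsets finds them all.
concepts = set()
for r in range(len(attributes) + 1):
    for attrs in combinations(sorted(attributes), r):
        objs = extent(set(attrs))
        concepts.add((frozenset(objs), frozenset(intent(objs))))

for objs, attrs in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(objs), sorted(attrs))
```

Real gene expression contexts are far larger, which is why the lattice reduction and discretization techniques surveyed in the thesis matter.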

    LEARNFCA: A FUZZY FCA AND PROBABILITY BASED APPROACH FOR LEARNING AND CLASSIFICATION

    Formal concept analysis (FCA) is a mathematical theory based on lattice and order theory used for data analysis and knowledge representation. Over the past several years, many of its extensions have been proposed and applied in several domains, including data mining, machine learning, knowledge management, the semantic web, software development, chemistry, biology, medicine, data analytics, and ontology engineering. This thesis reviews the state of the art of FCA theory and its various extensions that have been developed and well studied in the past several years. We discuss their historical roots and reproduce the original definitions and derivations with illustrative examples. Further, we provide a literature review of its applications and the various approaches adopted by researchers in the areas of data analysis and knowledge management, with emphasis on data-learning and classification problems. We propose LearnFCA, a novel approach based on FuzzyFCA and probability theory for learning and classification problems. LearnFCA uses an enhanced version of FuzzyLattice, which has been developed to store class labels and probability vectors and can be used to classify instances with encoded and unlabelled features. We evaluate LearnFCA on encodings from three datasets - MNIST, Omniglot, and cancer images - with interesting results and varying degrees of success. Adviser: Jitender Deogu
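    The fuzzy derivation operators that a fuzzy FCA approach builds on can be sketched in a few lines, assuming the Gödel structure on [0, 1] as the underlying residuated lattice; LearnFCA's actual fuzzy-lattice operations, label storage, and encodings are not reproduced here.

```python
# Toy L-fuzzy context: rows are samples, columns are attribute degrees in
# [0, 1]. The Gödel residuum below is one common choice of implication.
I = {
    "x1": {"f1": 1.0, "f2": 0.6},
    "x2": {"f1": 0.8, "f2": 0.9},
}
attrs = ["f1", "f2"]

def residuum(a, b):
    """Gödel implication on [0, 1]."""
    return 1.0 if a <= b else b

def up(fuzzy_objects):
    """Fuzzy intent: degree to which each attribute is shared by the set."""
    return {m: min(residuum(fuzzy_objects[g], I[g][m]) for g in I)
            for m in attrs}

def down(fuzzy_attrs):
    """Fuzzy extent: degree to which each object has all the attributes."""
    return {g: min(residuum(fuzzy_attrs[m], I[g][m]) for m in fuzzy_attrs)
            for g in I}

A = down({"f1": 1.0, "f2": 0.7})   # fuzzy set of objects
B = up(A)                          # its fuzzy intent (one closure step)
print(A, B)
```

Composing `down` and `up` yields the closure operators whose fixed points are the fuzzy concepts that a fuzzy lattice collects.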

    Blended intelligence of FCA with FLC for knowledge representation from clustered data in medical analysis

    Formal concept analysis is a data analysis mechanism with growing appeal across various fields such as data mining, robotics, medicine, and big data. FCA is also helpful for generating new ontology-based learning techniques. In the medical field, some growing children face the problem of representing knowledge from previously gathered data that is unordered and insufficiently clustered, which keeps them from making the right decision at the right time when answering uncertainty-based questionnaires. In decision theory, many mathematical models, such as probability allocation, crisp sets, and fuzzy set theory, were designed to deal with knowledge-representation difficulties and their characteristics. This paper proposes a new blended approach of FCA with a fuzzy logic controller (FLC), described through two major objectives. First, FCA analyzes the data based on the relationships between the set of objects and the set of prior attributes; the data is framed as formal statements of human thinking, converted into a meaningful, intelligible explanation. Suitable rules are generated to explore the relationships among the attributes, and formal concept analysis is applied to these rules to extract better knowledge and the most important factors affecting decision making. Second, the FLC derives the fuzzification, rule-construction, and defuzzification methods used to represent accurate knowledge for uncertainty-based questionnaires. Here, FCA is extended with objective-based itemset notions, treated as targets, whose expanded cardinalities and weights are associated through fuzzy inference decision rules. This approach is most helpful for medical experts in assessing the extent of a patient's memory deficiency, and for people who face difficulties in exploring their knowledge.
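    The fuzzification, rule evaluation, and defuzzification pipeline described above can be sketched with a minimal Mamdani-style controller; the membership functions, the two rules, and the 0-100 "deficiency" scale below are invented placeholders, not the paper's actual medical rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b (shoulders allowed)."""
    if x == b:
        return 1.0
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def infer(score):
    # Fuzzification of a 0-100 test score.
    low  = tri(score, 0, 0, 50)      # "low score"
    high = tri(score, 50, 100, 100)  # "high score"
    # Rules: low score -> high deficiency; high score -> low deficiency.
    # Defuzzification: centroid over a discretized 0-100 output universe.
    num = den = 0.0
    for y in range(101):
        mu = max(min(low,  tri(y, 50, 100, 100)),   # clipped "high deficiency"
                 min(high, tri(y, 0, 0, 50)))       # clipped "low deficiency"
        num += y * mu
        den += mu
    return num / den if den else 0.0

print(infer(30.0))  # low score -> high estimated deficiency
print(infer(80.0))  # high score -> low estimated deficiency
```

Replacing these placeholder rules with FCA-derived implications is, roughly, the blending step the paper proposes.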

    Relational Approach to the L-Fuzzy Concept Analysis

    Modern industrial production systems benefit from the classification and processing of objects and their attributes. In general, the object classification procedure can be affected by vagueness, a common problem in object analysis that arises at various stages of classification, including ambiguity in input data, overlapping boundaries between classes or regions, and uncertainty in defining or extracting the properties and relationships of objects. One way to manage this ambiguity is to use a framework of L-fuzzy relations and represent such uncertainties within it. The main concern of this thesis is obtaining output that carries as little unreliability and uncertainty as possible relative to the original data. My general approach can therefore be summarized as follows. We develop an L-fuzzy Concept Analysis as a generalization of regular Concept Analysis. We start by providing the input data, stored in a table (database). The next step is the creation of contexts and concepts from the given original data using certain structures. In the next stage, rules or patterns (attribute implications) are generated from the data, including all rules as well as a minimal base of rules; all of them use L-fuzziness to account for uncertainty. This requires L-fuzzy relations, which are implemented as L-valued matrices. In the end, everything is packed into a convenient application implemented in the Java programming language. Our approach is carried out in an algebraic framework that covers both regular and L-fuzzy FCA simultaneously. The tables we start with are already L-valued (not crisp) in our implementation; in other words, we work with the L-fuzzy data directly, starting from vague data shown as L-valued tables that relate objects to their attributes. The purpose of this thesis is to generate attribute implications from many-valued contexts by a relational theory, i.e., a range of degrees is used to indicate the relationship between objects and their properties: the smallest degree corresponds to the classical no, and the greatest degree corresponds to the classical yes, in the table.
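    The graded reading of an attribute implication over an L-valued table can be sketched as follows, assuming the Gödel residuum on [0, 1] as the implication operation; the object and attribute names are invented, and the thesis's relational, matrix-based construction is more general.

```python
# Each table cell is the degree to which an object has an attribute.
table = {
    "o1": {"p": 1.0, "q": 0.8, "r": 0.5},
    "o2": {"p": 0.6, "q": 0.6, "r": 1.0},
}

def godel_res(a, b):
    """Gödel residuum: full truth when a <= b, otherwise b."""
    return 1.0 if a <= b else b

def holds(A, B):
    """Degree to which every object satisfying A also satisfies B."""
    return min(
        godel_res(min(row[a] for a in A), min(row[b] for b in B))
        for row in table.values()
    )

print(holds({"p"}, {"q"}))  # fails slightly on o1, so the degree is below 1
print(holds({"q"}, {"p"}))  # holds fully in this table
```

The smallest degree plays the role of the classical "no" and the greatest that of "yes", exactly as in the crisp special case.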

    A systematic literature review of soft set theory

    Soft set theory, initially introduced through the seminal article ‘‘Soft set theory—First results’’ in 1999, has gained considerable attention in the field of mathematical modeling and decision-making. Despite its growing prominence, a comprehensive survey of soft set theory, encompassing its foundational concepts, developments, and applications, is notably absent in the existing literature. We aim to bridge this gap. This survey delves into the basic elements of the theory, including the notion of a soft set, the operations on soft sets, and their semantic interpretations. It describes various generalizations and modifications of soft set theory, such as N-soft sets, fuzzy soft sets, and bipolar soft sets, highlighting their specific characteristics. Furthermore, this work outlines the fundamentals of various extensions of mathematical structures from the perspective of soft set theory. Particularly, we present basic results of soft topology and other algebraic structures such as soft algebras and sigma-algebras. This article examines a selection of notable applications of soft set theory in different fields, including medicine and economics, underscoring its versatile nature. The survey concludes with a discussion on the challenges and future directions in soft set theory, emphasizing the need for further research to enhance its theoretical foundations and broaden its practical applications. Overall, this survey of soft set theory serves as a valuable resource for practitioners, researchers, and students interested in understanding and utilizing this flexible mathematical framework for tackling uncertainty in decision-making processes.
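    The notion of a soft set and two standard operations on soft sets can be sketched in a few lines; the universe and parameters below are invented for illustration, and the operation names follow definitions commonly used in the soft-set literature.

```python
# A soft set over a universe U is a map from parameters to subsets of U.
U = {"h1", "h2", "h3", "h4"}
F = {"cheap": {"h1", "h2"}, "wooden": {"h2", "h3"}}   # soft set (F, A)
G = {"cheap": {"h1", "h4"}, "modern": {"h3"}}         # soft set (G, B)

def extended_union(F, G):
    """Union over all parameters appearing in either soft set."""
    return {p: F.get(p, set()) | G.get(p, set()) for p in F.keys() | G.keys()}

def restricted_intersection(F, G):
    """Intersection over the shared parameters only."""
    return {p: F[p] & G[p] for p in F.keys() & G.keys()}

print(extended_union(F, G))
print(restricted_intersection(F, G))
```

The parameter-indexed structure is what separates a soft set from a plain fuzzy set: approximation is organized by parameters rather than by membership degrees.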

    Concept learning consistency under three‑way decision paradigm

    Concept Mining is one of the main challenges both in Cognitive Computing and in Machine Learning. The ongoing improvement of solutions to address this issue raises the need to analyze whether the consistency of the learning process is preserved. This paper addresses a particular problem, namely, how the concept mining capability changes under the reconsideration of the hypothesis class. The issue will be raised from the point of view of the so-called Three-Way Decision (3WD) paradigm. The paradigm provides a sound framework to reconsider decision-making processes, including those assisted by Machine Learning. Thus, the paper aims to analyze the influence of 3WD techniques in the Concept Learning Process itself. For this purpose, we introduce new versions of the Vapnik-Chervonenkis dimension. Likewise, to illustrate how the formal approach can be instantiated in a particular model, the case of concept learning in (Fuzzy) Formal Concept Analysis is considered. This work is supported by the State Investigation Agency (Agencia Estatal de Investigación), project PID2019-109152GB-100/AEI/10.13039/501100011033. We acknowledge the reviewers for their suggestions and guidance on additional references that have enriched our paper. Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature.
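    For readers unfamiliar with the classical Vapnik-Chervonenkis dimension that the paper's new versions generalize, a brute-force computation over a finite domain can be sketched as follows; the 3WD-based variants themselves are not reproduced here.

```python
from itertools import combinations

def vc_dimension(domain, hypotheses):
    """Largest r such that some r-point subset of domain is shattered."""
    best = 0
    for r in range(1, len(domain) + 1):
        for pts in combinations(domain, r):
            # Collect every labeling the class realizes on these points.
            labelings = {tuple(x in h for x in pts) for h in hypotheses}
            if len(labelings) == 2 ** r:   # pts is shattered
                best = r
    return best

# Threshold classifiers h_t = {x : x >= t} on {1, 2, 3, 4}.
domain = [1, 2, 3, 4]
thresholds = [frozenset(x for x in domain if x >= t) for t in range(1, 6)]
print(vc_dimension(domain, thresholds))  # thresholds shatter only singletons
```

Reconsidering the hypothesis class, as the 3WD paradigm suggests, changes exactly the set `hypotheses` above, and hence potentially the dimension.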

    Rough set decision algorithms for modeling with uncertainty

    The use of decision rules allows information to be extracted and conclusions to be inferred from relational databases in a reliable way, thanks to indicators such as support and certainty. Moreover, decision algorithms collect groups of decision rules that satisfy desirable properties for describing the relational system. However, when a decision table is considered within a fuzzy environment, all notions related to decision algorithms must be extended to this framework. This paper presents such a generalization, highlighting new definitions of relevance indicators for describing decision rules and decision algorithms.
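    The support and certainty indicators mentioned above can be illustrated on a tiny crisp decision table; the table entries are invented, and the fuzzy generalization the paper develops is not reproduced here.

```python
# A crisp decision table: condition attribute "fever", decision attribute "flu".
table = [
    {"fever": "yes", "flu": "yes"},
    {"fever": "yes", "flu": "no"},
    {"fever": "no",  "flu": "no"},
    {"fever": "yes", "flu": "yes"},
]

def support(cond, dec):
    """Number of rows matching both the condition and the decision."""
    return sum(1 for row in table
               if all(row[a] == v for a, v in cond.items())
               and all(row[a] == v for a, v in dec.items()))

def certainty(cond, dec):
    """Fraction of condition-matching rows that also match the decision."""
    matching = [row for row in table
                if all(row[a] == v for a, v in cond.items())]
    return support(cond, dec) / len(matching) if matching else 0.0

print(support({"fever": "yes"}, {"flu": "yes"}))    # 2 supporting rows
print(certainty({"fever": "yes"}, {"flu": "yes"}))  # 2 of 3 matching rows
```

In the fuzzy setting, the crisp row counts above become aggregated membership degrees, which is what forces the redefinition of these indicators.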