
    On the complexity of enumerating pseudo-intents

    We investigate whether the pseudo-intents of a given formal context can be enumerated efficiently. We show that they cannot be enumerated in a specified lexicographic order with polynomial delay unless P = NP. Furthermore, we show that if the restriction on the order of enumeration is removed, the problem becomes at least as hard as enumerating the minimal transversals of a given hypergraph. We introduce the notion of minimal pseudo-intents and show that recognizing minimal pseudo-intents is polynomial. Surprisingly, despite their simpler nature, minimal pseudo-intents cannot be enumerated in output-polynomial time unless P = NP.
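    To make the object of study concrete, the following is a minimal sketch (not from the paper) of the derivation operator of a formal context and a naive, exponential-time collection of pseudo-intents directly from their recursive definition. The toy context and the helper names closure and pseudo_intents are illustrative assumptions; the paper's actual enumeration algorithms are not reproduced here.

```python
# Naive illustration of pseudo-intents in a formal context (G, M, I).
# The context is given as a dict mapping each object to its attribute set.
# Helper names and the toy context are invented for illustration only.

from itertools import combinations

def closure(attrs, context):
    """Return attrs'': attributes shared by all objects that have every attribute in attrs."""
    objects = [g for g, row in context.items() if attrs <= row]
    if not objects:
        return set().union(*context.values()) if context else set()
    return set.intersection(*(context[g] for g in objects))

def pseudo_intents(context):
    """Collect all P with P != P'' such that Q'' <= P for every pseudo-intent Q strictly below P."""
    attributes = set().union(*context.values()) if context else set()
    found = []
    # Enumerate candidates by increasing size so all smaller pseudo-intents are known first.
    for k in range(len(attributes) + 1):
        for cand in combinations(sorted(attributes), k):
            p = set(cand)
            if p == closure(p, context):
                continue  # p is closed (an intent), hence not a pseudo-intent
            if all(closure(q, context) <= p for q in found if q < p):
                found.append(p)
    return found

if __name__ == "__main__":
    # Toy context over attributes a, b, c; its only pseudo-intent is {'a'},
    # which corresponds to the implication a -> b.
    ctx = {"g1": {"a", "b"}, "g2": {"b", "c"}, "g3": {"c"}}
    print(pseudo_intents(ctx))
```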

    Efficient Axiomatization of OWL 2 EL Ontologies from Data by means of Formal Concept Analysis: (Extended Version)

    We present an FCA-based axiomatization method that produces a complete EL TBox (the terminological part of an OWL 2 EL ontology) from a graph dataset in at most exponential time. We describe technical details that allow for an efficient implementation, as well as variations that dispense with the computation of extremely large axioms, thereby rendering the approach applicable in practice, albeit with some loss of completeness. Moreover, we evaluate the prototype on real-world datasets. This is an extended version of an article accepted at AAAI 2024.

    On the isomorphism problem of concept algebras

    Weakly dicomplemented lattices are bounded lattices equipped with two unary operations that encode a negation on concepts. They were introduced to capture the equational theory of concept algebras (Wille 2000; Kwuida 2004) and generalize Boolean algebras. Concept algebras are concept lattices, thus complete lattices, equipped with a weak negation and a weak opposition. A special case of the representation problem for weakly dicomplemented lattices, posed in Kwuida (2004), is whether complete weakly dicomplemented lattices are isomorphic to concept algebras. In this contribution we give a negative answer to this question (Theorem 4). We also provide a new proof of a well-known result due to M. H. Stone (Trans Am Math Soc 40:37-111, 1936), saying that each Boolean algebra is a field of sets (Corollary 4). Before these, we prove that the boundedness condition in the initial definition of weakly dicomplemented lattices (Definition 1) is superfluous (Theorem 1, see also Kwuida (2009)).
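    For readers unfamiliar with the terminology, the block below recalls how, in the literature cited above, the concept lattice of a context (G, M, I) is usually turned into a concept algebra by a weak negation and a weak opposition; the notation is reconstructed from memory of Wille (2000) and Kwuida (2004) and should be checked against those sources.

```latex
% Weak negation (\triangle) and weak opposition (\triangledown) of a formal
% concept (A, B) of a context (G, M, I), as commonly defined for concept algebras:
(A, B)^{\triangle} = \bigl( (G \setminus A)'' ,\; (G \setminus A)' \bigr)
\qquad
(A, B)^{\triangledown} = \bigl( (M \setminus B)' ,\; (M \setminus B)'' \bigr)
```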

    Most specific consequences in the description logic EL

    The notion of a most specific consequence with respect to some terminological box is introduced, conditions for its existence in the description logic EL and its variants are provided, and means for its computation are developed. Algebraic properties of most specific consequences are explored. Furthermore, several applications that make use of this new notion are proposed; in particular, it is shown how given terminological knowledge can be incorporated into existing approaches for the axiomatization of observations. For instance, a procedure for the incremental learning of concept inclusions from sequences of interpretations is developed.

    Attribute Exploration of Gene Regulatory Processes

    This thesis aims at the logical analysis of discrete processes, in particular those generated by gene regulatory networks. States, transitions and operators from temporal logics are expressed in the language of Formal Concept Analysis. The attribute exploration algorithm enables an expert or a computer program to validate a minimal and complete set of implications, e.g. by comparing predictions derived from the literature with observed data. Here, these rules represent temporal dependencies within gene regulatory networks, including coexpression of genes, reachability of states, invariants, and possible causal relationships. This new approach is embedded into the theory of universal coalgebras, in particular automata, Kripke structures and labelled transition systems. A comparison with the temporal expressivity of Description Logics is made. The main theoretical results concern the integration of background knowledge into the successive exploration of the defined data structures (formal contexts). Applying the method, a Boolean network from the literature modelling sporulation of Bacillus subtilis is examined. Finally, we developed an asynchronous Boolean network for extracellular matrix formation and destruction in the context of rheumatoid arthritis.
    Comment: 111 pages, 9 figures, file size 2.1 MB, PhD thesis, University of Jena, Germany, Faculty of Mathematics and Computer Science, 2011. Online available at http://www.db-thueringen.de/servlets/DocumentServlet?id=1960
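    As background for the modelling part, here is a minimal sketch of how states evolve in an asynchronous Boolean network, where exactly one gene is updated per step. The update rules and gene names below are invented toy examples, not the sporulation or matrix-formation models from the thesis.

```python
# Minimal illustration of asynchronous Boolean network dynamics.
# The genes x, y, z and their update rules are toy assumptions.

from itertools import product

# Each gene's next value is a Boolean function of the current state (dict gene -> 0/1).
update_rules = {
    "x": lambda s: s["y"] and not s["z"],
    "y": lambda s: s["x"] or s["z"],
    "z": lambda s: not s["x"],
}

def async_successors(state):
    """All states reachable in one asynchronous step: update exactly one gene."""
    succs = set()
    for gene, rule in update_rules.items():
        new_value = int(bool(rule(state)))
        if new_value != state[gene]:
            succ = dict(state)
            succ[gene] = new_value
            succs.add(tuple(sorted(succ.items())))
    return succs

if __name__ == "__main__":
    genes = sorted(update_rules)
    # Print the full asynchronous transition relation over all 2^n states.
    for values in product((0, 1), repeat=len(genes)):
        state = dict(zip(genes, values))
        for succ in sorted(async_successors(state)):
            print(dict(sorted(state.items())), "->", dict(succ))
```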

    Constructing and Extending Description Logic Ontologies using Methods of Formal Concept Analysis

    Description Logic (abbrv. DL) belongs to the field of knowledge representation and reasoning. DL researchers have developed a large family of logic-based languages, so-called description logics (abbrv. DLs). These logics allow their users to explicitly represent knowledge as ontologies, which are finite sets of (human- and machine-readable) axioms, and provide them with automated inference services to derive implicit knowledge. The landscape of decidability and computational complexity of common reasoning tasks for various description logics has been explored in large parts: there is always a trade-off between expressivity and reasoning costs. It is therefore not surprising that DLs are nowadays applied in a large variety of domains: agriculture, astronomy, biology, defense, education, energy management, geography, geoscience, medicine, oceanography, and oil and gas. Furthermore, the most notable success of DLs is that they constitute the logical underpinning of the Web Ontology Language (abbrv. OWL) in the Semantic Web. Formal Concept Analysis (abbrv. FCA) is a subfield of lattice theory that makes it possible to analyze datasets that can be represented as formal contexts. Put simply, such a formal context binds a set of objects to a set of attributes by specifying which objects have which attributes. There are two major techniques that can be applied in various ways for purposes of conceptual clustering, data mining, machine learning, knowledge management, knowledge visualization, etc. On the one hand, it is possible to describe the hierarchical structure of such a dataset in the form of a formal concept lattice. On the other hand, the theory of implications (dependencies between attributes) valid in a given formal context can be axiomatized in a sound and complete manner by the so-called canonical base, which furthermore contains a minimal number of implications w.r.t. the properties of soundness and completeness. In spite of the different notions used in FCA and in DLs, there has been a very fruitful interaction between these two research areas. My thesis continues this line of research and, more specifically, I will describe how methods from FCA can be used to support the automatic construction and extension of DL ontologies from data.
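    To illustrate the FCA notions the abstract relies on, the following sketch shows the two derivation operators of a small formal context and a naive enumeration of its formal concepts (the pairs (A, B) with A' = B and B' = A, which ordered by extent inclusion form the concept lattice). The example context, object names and helper names are illustrative assumptions, not taken from the thesis.

```python
# Derivation operators and formal concepts of a toy formal context.
# The context maps each object to the set of attributes it has.

from itertools import combinations

context = {
    "dog":     {"mammal", "four_legged"},
    "human":   {"mammal", "biped"},
    "ostrich": {"bird", "biped"},
}

def common_attributes(objects):
    """A': attributes shared by every object in A (all attributes if A is empty)."""
    all_attrs = set().union(*context.values())
    return set.intersection(*(context[g] for g in objects)) if objects else all_attrs

def common_objects(attrs):
    """B': objects that have every attribute in B."""
    return {g for g, row in context.items() if attrs <= row}

def formal_concepts():
    """Naively list all pairs (A, B) with A' = B and B' = A."""
    concepts = []
    objects = sorted(context)
    for k in range(len(objects) + 1):
        for sub in combinations(objects, k):
            extent = set(sub)
            intent = common_attributes(extent)
            if common_objects(intent) == extent:
                concepts.append((extent, intent))
    return concepts

if __name__ == "__main__":
    for extent, intent in formal_concepts():
        print(sorted(extent), sorted(intent))
```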

    Proceedings of the International Workshop "What can FCA do for Artificial Intelligence?" (FCA4AI 2014)

    This is the third edition of the FCA4AI workshop, whose first edition was organized at the ECAI 2012 Conference (Montpellier, August 2012) and whose second edition was organized at the IJCAI 2013 Conference (Beijing, August 2013, see http://www.fca4ai.hse.ru/). Formal Concept Analysis (FCA) is a mathematically well-founded theory aimed at data analysis and classification that can be used for many purposes, especially for Artificial Intelligence (AI) needs. The objective of the workshop is to investigate two main issues: how can FCA support various AI activities (knowledge discovery, knowledge representation and reasoning, learning, data mining, NLP, information retrieval), and how can FCA be extended in order to help AI researchers solve new and complex problems in their domain.

    A quality-aware spatial data warehouse for querying hydroecological data

    Addressing data quality issues in information systems remains a challenging task. Many approaches only tackle this issue at the extract, transform and load steps. Here we define a comprehensive method to gain greater insight into data quality characteristics within a data warehouse. Our novel architecture was implemented for a hydroecological case study where massive French watercourse sampling data are collected. The method models and makes effective use of spatial, thematic and temporal accuracy, consistency and completeness for multidimensional data in order to offer analysts a "data quality" oriented framework. The results obtained in experiments carried out on the Saône River dataset demonstrate the relevance of our approach.