
    Composite ontology change operators and their customizable evolution strategies

    Change operators are the building blocks of ontology evolution. Elementary, composite and complex change operators have been suggested. While lower-level change operators are useful for a fine-granular representation of ontology changes, representing the intent of a change requires higher-level change operators. Here, we focus on higher-level composite change operators that perform an aggregated task. We introduce composite-level evolution strategies. The central role of the evolution strategies is to preserve the intent of the composite change with respect to the user's requirements and to reduce the operational cost of the change. Composite-level evolution strategies also help to avoid illegal changes or illegal axioms that may introduce inconsistencies when a composite change is applied. As examples, we discuss a few composite changes along with their defined evolution strategies, which allow users to control and customize the ontology evolution process.
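    To make the idea concrete, the following minimal sketch (hypothetical names and operators, not the authors' implementation) shows how a composite "delete class" change could bundle elementary changes, while a user-selected evolution strategy decides how orphaned subclasses are handled and so preserves the intent of the change.

```python
# Minimal sketch (hypothetical API) of a composite change operator that
# aggregates elementary changes and applies a user-selected evolution strategy.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ElementaryChange:
    kind: str            # e.g. "remove_class", "reattach_subclasses"
    target: str          # entity the change applies to
    payload: dict = field(default_factory=dict)


@dataclass
class CompositeChange:
    """A higher-level change (e.g. 'delete class') expressed as an intent
    plus the elementary changes that realise it."""
    intent: str
    steps: List[ElementaryChange]


def delete_class(name: str, strategy: str = "reattach_to_parent") -> CompositeChange:
    """Build a composite 'delete class' change; the chosen evolution strategy
    decides what happens to orphaned subclasses."""
    steps = [ElementaryChange("remove_class", name)]
    if strategy == "reattach_to_parent":
        steps.insert(0, ElementaryChange("reattach_subclasses", name,
                                         {"to": "parent_of_" + name}))
    elif strategy == "delete_subtree":
        steps.insert(0, ElementaryChange("remove_subclasses", name))
    else:
        raise ValueError(f"unknown evolution strategy: {strategy}")
    return CompositeChange(intent=f"delete class {name}", steps=steps)


# Usage: the same composite intent, customised by two different strategies.
print(delete_class("Vehicle", strategy="reattach_to_parent").steps)
print(delete_class("Vehicle", strategy="delete_subtree").steps)
```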

    Graph-based discovery of ontology change patterns

    Ontologies can support a variety of purposes, ranging from capturing conceptual knowledge to the organisation of digital content and information. However, information systems are always subject to change, and ontology change management can pose challenges. We investigate ontology change representation and the discovery of change patterns. Ontology changes are formalised as graph-based change logs. We use attributed graphs, which are typed over a generic graph with node and edge attribution. We analyse ontology change logs, represented as graphs, and identify frequent change sequences. Such sequences serve as a reference for discovering reusable, often domain-specific and usage-driven change patterns. We describe the pattern discovery algorithms and measure their performance using experimental results.
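    As an illustration of the mining step only, the sketch below uses a toy log format and a simple sliding-window count (not the authors' graph-based algorithm) to find change sequences that recur often enough to be treated as candidate change patterns.

```python
# Minimal sketch (hypothetical log format) of discovering frequent change
# sequences in an ontology change log by counting sliding-window n-grams.
from collections import Counter

change_log = [
    "add_class", "add_subclass_axiom", "add_label",    # pattern occurrence 1
    "add_class", "add_subclass_axiom", "add_label",    # pattern occurrence 2
    "remove_class", "add_class", "add_subclass_axiom",
]

def frequent_sequences(log, length=3, min_support=2):
    """Return change sequences of the given length that occur at least
    min_support times; these are candidates for reusable change patterns."""
    windows = Counter(tuple(log[i:i + length]) for i in range(len(log) - length + 1))
    return {seq: n for seq, n in windows.items() if n >= min_support}

print(frequent_sequences(change_log))
# {('add_class', 'add_subclass_axiom', 'add_label'): 2}
```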

    Layered change log model: bridging between ontology change representation and pattern mining

    To date, no ontology change management system records ontology changes at different levels of granularity. Once changes are performed using elementary-level change operations, they are recorded in the database at the elementary level accordingly. Such a change representation procedure is not sufficient to capture the intuition behind an applied change and thus cannot capture its semantic impact. In this paper, we discuss recording applied ontology changes in the form of a layered change log. We support the implementation of a layered change operator framework through layered change logs. We utilize the lower-level ontology change log in two ways: recording applied ontology changes (operational) and mining higher-level change patterns (analytical). The higher-level change logs capture the objective of the ontology changes at a higher level of granularity and support a comprehensive understanding of the ontology evolution. The knowledge-based change log facilitates the detection of similarities within different time series, the mining of change patterns and the reuse of knowledge. The layered change logs are formalised using a graph-based approach.
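    A minimal sketch of the layered idea, using hypothetical entry formats rather than the paper's graph-based formalisation: elementary changes are recorded as applied (operational), and a run of them is lifted to a single higher-level entry that records the intent behind the change (analytical).

```python
# Minimal sketch (hypothetical structure) of a layered change log: elementary
# changes are recorded as applied, then aggregated into a higher-level entry
# that captures the intent behind them.
elementary_log = [
    {"op": "add_class", "target": "ElectricCar"},
    {"op": "add_subclass_axiom", "target": ("ElectricCar", "Car")},
    {"op": "add_label", "target": "ElectricCar"},
]

def lift_to_composite(entries):
    """Aggregate a run of elementary entries into one composite-level entry,
    so the higher-level log records *why* the ontology changed."""
    ops = [e["op"] for e in entries]
    if ops == ["add_class", "add_subclass_axiom", "add_label"]:
        return {"op": "add_labelled_subclass", "parts": entries}
    return None  # no known higher-level interpretation

composite_log = [c for c in [lift_to_composite(elementary_log)] if c]
print(composite_log[0]["op"])   # add_labelled_subclass
```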

    Interactive Knowledge Construction in the Collaborative Building of an Encyclopedia

    One of the major challenges of Applied Artificial Intelligence is to provide environments where high-level human activities, such as learning, constructing theories or performing experiments, are enhanced by Artificial Intelligence technologies. This paper starts with the description of an ambitious project: EnCOrE. The specific real-world EnCOrE scenario, representative of a much wider class of potential application contexts, is dedicated to the building of an Encyclopedia of Organic Chemistry in the context of virtual communities of experts and students. Its description is followed by a brief survey of some major AI questions and propositions related to the problems raised by the EnCOrE project. The third part of the paper starts with definitions of a set of "primitives" for rational actions, and then integrates them into a unified conceptual framework for the interactive construction of knowledge. Finally, we sketch out protocols aimed at guiding both the collaborative construction process and the collaborative learning process in the EnCOrE project. The current major result is the emerging conceptual model supporting interaction between human agents and AI tools integrated in Grid services within a socio-constructivist approach, consisting of cycles of deduction, induction and abduction upon facts (the shared reality) and concepts (their subjective interpretation) submitted to negotiation, and finally converging to a socially validated consensus.

    A business analysis methodology

    Synopsis: Business analysis is defined as the process in which business needs are identified and solutions proposed. This process is regarded as one of the most important parts of systems development because no other part is more difficult to rectify later. However, current business analysis methodologies are inadequate because they operate at too high a level and only address portions of the complete business analysis process. In particular, the lack of clear objectives, relevance and outcomes for the phases makes business analysis methodologies inadequate. Moreover, activities, techniques and tools that are not mapped to those phases are also problematic. The aim of this research was to develop a business analysis methodology for business analysts in the South African financial services environment. The intention was to identify the phases, as well as the objectives, relevance and outcomes of each phase. Furthermore, this research intended to identify appropriate activities, techniques and tools to address the objectives of each phase of a methodology. This was done by presenting a literature review of previous research relating to business analysis methodologies. For information gathering, 45 participants (comprising business analysts, project managers, IS managers and CIOs) contributed to this research, 22 of whom were interviewed individually while 23 participated in focus group interviews. The data from each of these methods was analysed independently and did not influence or feed into any of the other methods. Once the individual interviews and focus group interviews had been transcribed, content analysis and analysis within and between interviews (Merriam, 1998; Strauss, 1987) were used to analyse the information gathered. The phases of a business analysis methodology identified by the research are the feasibility phase, the business case phase, the analysis and design phase, and the post-implementation evaluation phase. Objectives, relevance and outcomes of these phases were also identified. In addition, activities, techniques and tools were mapped to each of these phases.

    Transformation of graphical models to support knowledge transfer

    Human experts are able to flexibly adjust their decision-making behaviour to the situation at hand. This capability pays off in particular when decisions must be made under limited resources such as time restrictions. In such situations it is advantageous to adapt the representation of the underlying knowledge and to use decision models at different levels of abstraction. Furthermore, human experts are able to include not only uncertain information but also vague perceptions in decision making. Classical decision-theoretic models are based on the concept of rationality, whereby the decision behaviour is prescribed by the principle of maximum expected utility: for each observation, an optimal decision function prescribes the action that maximizes expected utility. Modern graph-based methods such as Bayesian networks and influence diagrams make decision-theoretic methods attractive from a modelling perspective. Their main disadvantage is complexity: finding an optimal decision can be very expensive, and inference in decision networks is NP-hard. This dissertation aims at combining the advantages of decision-theoretic models with rule-based systems by transforming a decision-theoretic model into a fuzzy rule base as the target language. Fuzzy rule bases can be evaluated efficiently, can approximate non-linear functional dependencies, and keep the resulting action model interpretable.

    The translation of a decision model into a fuzzy rule base is supported by a new transformation process, which generates rule-based representations from decision models that provide an efficient implementation architecture and represent knowledge in an explicit, intelligible way. First, an agent can apply the new parameterized structure learning algorithm introduced in this work to identify the structure of a Bayesian network. Preference learning and the specification of probability information then allow decision and utility nodes to be modelled, yielding a consolidated decision-theoretic model. A transformation algorithm compiles this model into a rule base, with an approximation measure computing the expected utility loss as a quality criterion. The practicality of the concept is demonstrated on a condition monitoring example for a rotation spindle.
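    The following toy sketch (hypothetical utilities and membership functions, not the dissertation's transformation algorithm) illustrates the end of that pipeline: a decision-theoretic policy is approximated by two fuzzy rules, and the expected utility loss of the approximation is estimated by sampling the observation space.

```python
# Minimal sketch (toy model, hypothetical names) of compiling a decision policy
# into a fuzzy rule base and scoring the expected utility loss of the approximation.
import numpy as np

# Toy decision model: utility of action a given an observed temperature x.
def utility(x, a):
    return {"cool": -abs(x - 30), "heat": -abs(x - 10)}[a]

def optimal_action(x):                       # exact decision-theoretic policy
    return max(["cool", "heat"], key=lambda a: utility(x, a))

# Fuzzy rule base: IF x is LOW THEN heat; IF x is HIGH THEN cool.
def mu_low(x):  return max(0.0, min(1.0, (22 - x) / 10))   # triangular-style membership
def mu_high(x): return max(0.0, min(1.0, (x - 12) / 10))

def fuzzy_action(x):
    return "heat" if mu_low(x) >= mu_high(x) else "cool"

# Expected utility loss of the rule base over a uniform observation distribution.
xs = np.random.uniform(0, 40, 10_000)
loss = np.mean([utility(x, optimal_action(x)) - utility(x, fuzzy_action(x)) for x in xs])
print(f"expected utility loss of the fuzzy approximation: {loss:.4f}")  # roughly 0.22 here
```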

    Scalable Query Processing on Spatial Networks

    Spatial networks (e.g., road networks) are general graphs with spatial information (e.g., latitude/longitude) associated with the vertices and/or the edges of the graph. Techniques are presented for query processing on spatial networks that are based on the observed coherence between the spatial positions of the vertices and the shortest paths between them. This facilitates aggregation of the vertices into coherent regions that share vertices on the shortest paths between them. Using this observation, a framework, termed SILC, is introduced that precomputes and compactly encodes the N^2 shortest paths and network distances between every pair of vertices on a spatial network containing N vertices. The compactness of the shortest paths from a source vertex V is achieved by partitioning the destination vertices into subsets based on the identity of the first edge on the path to them from V. The spatial coherence of these subsets is captured by a quadtree representation whose dimension-reducing property enables the storage requirement of each subset to be reduced so that it is proportional to the perimeter of the spatially coherent regions rather than to the number of vertices in the spatial network. In particular, experiments on a number of large road networks, as well as a theoretical analysis, have shown that the total storage for the shortest paths is reduced from O(N^3) to O(N^1.5). In addition to SILC, another framework, termed PCP, is proposed that also takes advantage of the spatial coherence of the source vertices and makes use of the well-separated pair decomposition to further reduce the storage, under suitably defined conditions, to O(N). Using these frameworks, scalable algorithms are presented that implement a wide variety of operations, such as nearest neighbor finding and distance joins, on large datasets of locations residing on a spatial network. These frameworks essentially decouple the process of computing shortest paths from that of spatial query processing, and also decouple the domain of the participating objects from the domain of the vertices of the spatial network. This means that as long as the spatial network is unchanged, the algorithms and the underlying representation of the shortest paths in the spatial network can be used with different sets of objects.
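    The sketch below illustrates the partitioning idea on a toy graph (hypothetical adjacency list, not the SILC implementation): a shortest-path search from a source vertex groups every destination by the first edge taken towards it, and SILC then encodes each such group compactly with a quadtree.

```python
# Minimal sketch (toy graph) of the SILC-style partitioning step: run a
# shortest-path search from a source vertex and group every destination by the
# first edge taken towards it; each group forms one spatially coherent region.
import heapq
from collections import defaultdict

graph = {                                   # adjacency list with edge weights
    "s": {"a": 1, "b": 4},
    "a": {"c": 2, "d": 5},
    "b": {"d": 1},
    "c": {}, "d": {},
}

def first_hop_partition(graph, source):
    """Dijkstra from `source`, then group destinations by the first vertex on
    their shortest path; SILC stores each group compactly (e.g. in a quadtree)."""
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist.get(v, float("inf")):
            continue
        for w, weight in graph[v].items():
            nd = d + weight
            if nd < dist.get(w, float("inf")):
                dist[w], prev[w] = nd, v
                heapq.heappush(heap, (nd, w))
    regions = defaultdict(list)
    for dest in prev:                       # walk back to find the first hop
        hop = dest
        while prev[hop] != source:
            hop = prev[hop]
        regions[hop].append(dest)
    return dict(regions)

print(first_hop_partition(graph, "s"))
# {'a': ['a', 'c'], 'b': ['b', 'd']}  -- destinations sharing a first edge
```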

    Business-process oriented knowledge management: concepts, methods, and tools
