6 research outputs found

    Agnostic content ontology design patterns for a multi-domain ontology

    This research project aims to address the semantic heterogeneity problem. Semantic heterogeneity behaves much like a cancer: it needlessly consumes resources from its host, the enterprise, and may even affect lives. Several authors report that it can cost a significant portion of an enterprise's IT budget, and it also hinders pharmaceutical and medical research by consuming valuable research funds. The RA-EKI architecture model comprises a multi-domain ontology, a cross-industry agnostic construct composed of rich axioms intended notably for data integration. A multi-domain ontology built from axiomatized agnostic data model patterns would drive a cognitive data integration application system usable in any industry sector. This project's objective is to elicit agnostic data model patterns, here treated as content ontology design patterns. The first research question concerns the existence of agnostic patterns and their capacity to solve the semantic heterogeneity problem. Given the theory-building role of this project, a qualitative research approach is the appropriate way to conduct the research. Unlike theory-testing quantitative methods, which rely on well-established validation techniques to determine the reliability of a study's outcome, theory-building qualitative methods lack standardized techniques for ascertaining the reliability of a study. The second research question therefore asks whether a dual-method theory-building approach can demonstrate trustworthiness. The first method, a qualitative Systematic Literature Review (SLR), induces the sought knowledge from 69 publications retained through a practical screen. The second method, a phenomenological research protocol, elicits the agnostic concepts from semi-structured interviews with 22 senior practitioners averaging 21 years of experience in conceptualization. The SLR retains a set of 89 agnostic concepts from publications ranging from 2009 through 2017; the phenomenological study in turn retains 83 agnostic concepts. During the synthesis stage of both studies, data saturation was calculated for each retained concept as the point at which that concept was selected for a second time. This quantification of data saturation contributes to the transferability criterion of trustworthiness. It can be argued that this effort to establish trustworthiness, i.e. credibility, dependability, confirmability and transferability, is extensive and that this research track is promising. Data saturation has not yet been reached for either study. The assessment performed while establishing the trustworthiness of this project's dual-method qualitative research approach yields notable findings: two sets of agnostic data model patterns, obtained from research protocols using radically different data sources (publications vs. experienced practitioners), show striking similarities. Further work is required to repeat exactly the same protocols for each method, to expand the year range of the SLR, and to recruit new co-researchers for the phenomenological protocol. This work will continue until the protocols no longer elicit new theory material. At that point, new protocols for both methods will be designed and executed with the intent of measuring theoretical saturation.
For both methods, this entails formulating new research questions that may, for example, focus on agnostic themes such as finance, infrastructure, relationships and classifications. For this exploratory project, the road ahead involves the design of new questionnaires for semi-structured interviews, and the project will need to engage new knowledge elicitation techniques such as focus groups. The project will also conduct other qualitative research methods, such as action research, to elicit new knowledge and know-how from the actual development and operation of an ontology-based cognitive application. Finally, a mixed-methods qualitative-quantitative approach would prepare the transition toward theory-testing methods using hypothetico-deductive techniques.
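
The abstract's rule for quantifying data saturation, namely that a concept counts toward saturation once it has been selected a second time, can be illustrated with a small sketch. The function, the concept names and the interview identifiers below are hypothetical and are not taken from the project's data:

```python
from collections import defaultdict

def saturation_points(selections):
    """Return, for each concept, the 1-based position of the data source
    (publication or interview) at which the concept was selected for the
    second time -- the point treated here as its saturation point.

    `selections` is an ordered list of (source_id, concepts) pairs."""
    counts = defaultdict(int)
    saturated_at = {}
    for position, (_source, concepts) in enumerate(selections, start=1):
        for concept in dict.fromkeys(concepts):  # dedupe within a source, keep order
            counts[concept] += 1
            if counts[concept] == 2 and concept not in saturated_at:
                saturated_at[concept] = position
    return saturated_at

# Hypothetical example: three interview transcripts yielding agnostic concepts.
interviews = [
    ("P01", ["Party", "Agreement", "Location"]),
    ("P02", ["Party", "Asset"]),
    ("P03", ["Agreement", "Asset", "Event"]),
]
print(saturation_points(interviews))
# -> {'Party': 2, 'Agreement': 3, 'Asset': 3}
```

Recording the position of the source at which each concept recurs yields per-concept saturation points of the kind the abstract describes as feeding the transferability assessment.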

    Specification of interoperability aspects in methodological approaches to IS development.

    This thesis addresses the problem of specifying interoperability aspects in methodological approaches to information systems development. Under the conditions of general globalization and modern cooperation, there is a tendency towards loose coupling of organizational systems instead of tight integration. How heterogeneous, loosely coupled systems can achieve efficient inter-organizational connections is one of the current research topics in the information systems interoperability domain. Based on an analysis of the relevant literature, and bearing in mind the recommendations of the Advanced Technologies for Interoperability of Heterogeneous Enterprise Networks and their Application (ATHENA) reference model for conceptual integration, three aspects were selected for the specification of interoperability: the process, service and information aspects. The thesis defines a new, specific approach to the specification of interoperability aspects in methodological approaches to information systems development. The proposed approach is based on the principles of the general system-theoretic model of the software life cycle, which has three fundamental phases: identification, realization and implementation. For each phase, general steps are precisely defined and recommendations for their application are given. In the identification phase, it is proposed that the specification of inter-organizational business processes include the process view in addition to the functional one.
The first step identifies the interoperability requirements, which comprise the specification of the essential interoperable business functions and of the business partners participating in the collaboration. Business Process Model and Notation (BPMN) was selected for representing the collaborative business process. The second step defines general conversation and collaboration diagrams, while the third step proposes a detailed specification of their public and private representations. The last step of the identification phase is the creation of a new, or the selection of an existing, reference ontology, which provides the basis for an unambiguous interpretation of the meaning of the messages exchanged within the collaboration.
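
As a rough illustration of the artifacts produced by the identification phase described above, the sketch below models its four steps as a simple data structure; all class and field names are hypothetical, and the BPMN file names and ontology IRI are placeholders rather than material from the thesis:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the identification-phase artifacts; names are illustrative only.

@dataclass
class InteroperabilityRequirements:              # step 1
    business_functions: list[str]                # essential interoperable business functions
    partners: list[str]                          # business partners in the collaboration

@dataclass
class IdentificationPhase:
    requirements: InteroperabilityRequirements
    conversation_diagram: str                    # step 2: BPMN conversation diagram
    collaboration_diagram: str                   # step 2: BPMN collaboration diagram
    public_processes: dict[str, str] = field(default_factory=dict)   # step 3: partner -> public process model
    private_processes: dict[str, str] = field(default_factory=dict)  # step 3: partner -> private process model
    reference_ontology: str = ""                 # step 4: IRI of the created or selected reference ontology

phase = IdentificationPhase(
    requirements=InteroperabilityRequirements(
        business_functions=["order management"],
        partners=["Buyer", "Supplier"],
    ),
    conversation_diagram="ordering.conversation.bpmn",
    collaboration_diagram="ordering.collaboration.bpmn",
    public_processes={"Buyer": "buyer.public.bpmn", "Supplier": "supplier.public.bpmn"},
    reference_ontology="http://example.org/ontology/ordering#",
)
print(phase.requirements.partners)  # ['Buyer', 'Supplier']
```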

    Unification of detailed sub-models in a flexible Enterprise Architecture for cross-cutting analysis: deriving the need for action for an investigation context described by key figures

    Models have proven their worth for documentation purposes, but in practice they often exist in isolation from one another. Given the growing complexity within enterprises, however, viewing them separately is no longer sufficient: the interplay of the enterprise's constituent parts must be taken into account in decisions. An Enterprise Architecture (EA) is well suited to establishing such a cross-cutting view, yet it mainly contains aggregated content that is created manually, which makes the EA just another data silo. The lack of detailed information in the EA also limits the possibilities of a holistic analysis. This thesis therefore develops an overall concept for linking detailed content and enabling cross-cutting analyses. In particular, data values (e.g. costs) are also included. A level of indirection loosely links the sub-models, while a simple EA vocabulary serves as a neutral terminological layer. Using Semantic Web technologies, this yields an integrated data base positioned as a layer above the data sources. A cross-cutting analysis combining all content can then be performed. To make the approach concrete, the thesis focuses on deriving the need for action. With Importance-Performance Analysis, a technique from the quality management of services is borrowed and transferred to EA analysis. The calculation is based on flexibly definable key figures, whose definitions use the EA vocabulary. As a result, overall ratings are reported for all objects of investigation, indicating the need for action and its urgency. The analysis, too, is based on Semantic Web technologies. As proof of feasibility, the approach was implemented in a prototype. In addition, a practice-oriented use case of a digitalization initiative at an insurance company is outlined.
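
The abstract does not give the thesis's actual rating formula, so the sketch below only illustrates the classic Importance-Performance Analysis quadrant logic transferred to EA objects, with a simple importance-weighted performance gap as an illustrative urgency score; the object names, thresholds and 0..1 normalization are assumptions:

```python
from dataclasses import dataclass

@dataclass
class AnalysisObject:
    name: str
    importance: float   # key-figure-derived importance, assumed normalized to 0..1
    performance: float  # key-figure-derived performance, assumed normalized to 0..1

def ipa_rating(obj, importance_threshold=0.5, performance_threshold=0.5):
    """Classic Importance-Performance Analysis quadrant plus a simple
    importance-weighted gap as an illustrative urgency score; this is a
    generic sketch, not the thesis's actual calculation."""
    if obj.importance >= importance_threshold and obj.performance < performance_threshold:
        quadrant = "concentrate here"        # high need for action
    elif obj.importance >= importance_threshold:
        quadrant = "keep up the good work"
    elif obj.performance < performance_threshold:
        quadrant = "low priority"
    else:
        quadrant = "possible overkill"
    urgency = obj.importance * (1.0 - obj.performance)
    return quadrant, round(urgency, 2)

# Hypothetical objects of investigation from an insurance digitalization scenario.
objects = [
    AnalysisObject("claims application", importance=0.9, performance=0.3),
    AnalysisObject("intranet portal", importance=0.2, performance=0.8),
]
for obj in objects:
    print(obj.name, ipa_rating(obj))
# claims application ('concentrate here', 0.63)
# intranet portal ('possible overkill', 0.04)
```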