Description logic-based knowledge merging for concrete- and fuzzy-domain ontologies
Enterprises, especially virtual enterprises, are becoming increasingly knowledge intensive and are adopting efficient knowledge management systems to boost their competitiveness. The major challenge of knowledge management for virtual enterprises is to acquire, extract, and integrate new knowledge with the existing knowledge base. Ontologies have proven to be among the best tools for representing knowledge with classes, roles, and other characteristics. Because constructing new ontologies each time new knowledge is acquired is tedious and costly, it is imperative to accommodate the new knowledge in the current ontologies while preserving logical consistency. This article introduces a mechanism and a process for integrating new knowledge into the current system (ontology). Separate methods are adopted for fuzzy- and concrete-domain ontologies. The process starts by finding the semantic and structural similarities between concepts using WordNet and description logic. Description logic-based reasoning is then used to determine the position of, and relationships between, the incoming and existing knowledge. The experimental results show the efficacy of the proposed method.
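As an illustrative aside (not the paper's actual implementation), the WordNet-based similarity step could look something like the following Python sketch, using NLTK's WordNet interface and Wu-Palmer similarity; the function name and the merge heuristic in the comments are assumptions for illustration.

```python
# Illustrative sketch (not the paper's code) of the semantic-similarity
# step: score two ontology concept names with WordNet via NLTK.
# Requires: pip install nltk; then nltk.download("wordnet")
from nltk.corpus import wordnet as wn

def concept_similarity(name_a: str, name_b: str) -> float:
    """Best Wu-Palmer similarity over all noun senses of both names."""
    scores = [a.wup_similarity(b) or 0.0
              for a in wn.synsets(name_a, pos=wn.NOUN)
              for b in wn.synsets(name_b, pos=wn.NOUN)]
    return max(scores, default=0.0)

# A high score suggests merging the incoming concept with (or placing it
# near) the existing one; a low score suggests attaching it elsewhere.
print(concept_similarity("vehicle", "car"))      # relatively high
print(concept_similarity("vehicle", "invoice"))  # relatively low
```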
How To Build Enterprise Data Models To Achieve Compliance To Standards Or Regulatory Requirements (and share data).
Sharing data between organizations is challenging because it is difficult to ensure that those consuming the data interpret it accurately. The promise of the next-generation WWW, the Semantic Web, is that the semantics of shared data will be represented in ontologies and available for automatic and accurate machine processing. Thus, there is inter-organizational business value in developing applications that have ontology-based enterprise models at their core. In an ontology-based enterprise model, business rules and definitions are represented as formal axioms, which are applied to enterprise facts to automatically infer facts not explicitly represented. If the proposition to be inferred is a requirement from, say, ISO 9000 or Sarbanes-Oxley, inference constitutes a model-based proof of compliance. In this paper, we detail the development and application of the TOVE ISO 9000 Micro-Theory, a model of ISO 9000 developed using ontologies for quality management (measurement, traceability, and quality management system ontologies). In so doing, we demonstrate that when enterprise models are developed using ontologies, they can be leveraged to support business analytics problems, in particular compliance evaluation, and are sharable.
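To make the inference-as-compliance idea concrete, here is a minimal Python sketch with a hypothetical fact base and a single hypothetical axiom (not the actual TOVE ISO 9000 axioms): forward chaining derives new facts from the axiom, and compliance holds if the required proposition is among the derived facts.

```python
# Minimal sketch of model-based compliance checking. Facts and the rule
# below are hypothetical, loosely inspired by the measurement and
# traceability ontologies mentioned in the abstract.
from itertools import product

facts = {("calibrated", "gauge7"),
         ("measured_with", "part42", "gauge7")}

# Axiom: measured_with(?p, ?g) & calibrated(?g) -> traceable(?p)
premises = [("measured_with", "?p", "?g"), ("calibrated", "?g")]
conclusion = ("traceable", "?p")

def unify(pattern, fact, binding):
    """Extend binding so pattern matches fact, or return None."""
    if len(pattern) != len(fact) or pattern[0] != fact[0]:
        return None
    b = dict(binding)
    for term, value in zip(pattern[1:], fact[1:]):
        if term.startswith("?"):
            if b.get(term, value) != value:
                return None
            b[term] = value
        elif term != value:
            return None
    return b

def forward_chain(facts):
    """Apply the rule until no new facts can be derived."""
    changed = True
    while changed:
        changed = False
        for combo in product(facts, repeat=len(premises)):
            binding = {}
            for pat, fact in zip(premises, combo):
                binding = unify(pat, fact, binding)
                if binding is None:
                    break
            if binding is not None:
                derived = (conclusion[0],) + tuple(
                    binding.get(t, t) for t in conclusion[1:])
                if derived not in facts:
                    facts.add(derived)
                    changed = True
    return facts

# Inference as proof of compliance: the traceability requirement holds
# for part42 because it is derivable from the facts and the axiom.
print(("traceable", "part42") in forward_chain(set(facts)))  # True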
A Dynamic Ontology Mapping Architecture for a Grid Database System
Most large-scale heterogeneous distributed computing systems, such as Grids, rely on Service-Oriented Architectures (SOA) to interact with systems on different platforms and in different computing languages. However, the semantic heterogeneity of data remains a problem: data from different systems must be interpreted in semantically consistent ways. Ontologies are the most common and well-accepted methodology for handling this problem at multiple levels of granularity across different systems. Nevertheless, using ontologies to share common concepts in a dynamic environment such as a Grid is still a challenge: a static mapping between ontologies is difficult to maintain, because the corresponding semantic mapping changes must occur consistently. We therefore adopt the concept of Tuple Space and propose a flexible approach for managing ontologies in a Grid, enabling systems and users to interoperate semantically and dynamically by sharing and managing concepts and semantic ontology mappings.
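A minimal, single-process stand-in for the Tuple Space idea (the paper targets a distributed Grid; the class, method names, and tuple layout here are assumptions) might look like the following sketch: mapping tuples are written to a shared space, matched by pattern, and retracted when stale, so alignments can change dynamically rather than being hard-coded.

```python
import threading

class TupleSpace:
    """Minimal in-memory tuple space: write, pattern-match read, take."""

    def __init__(self):
        self._tuples = set()
        self._lock = threading.Lock()

    def write(self, t):
        """Publish a tuple, e.g. a new or updated ontology mapping."""
        with self._lock:
            self._tuples.add(t)

    def read(self, pattern):
        """Return tuples matching the pattern; None acts as a wildcard."""
        with self._lock:
            return [t for t in self._tuples
                    if len(t) == len(pattern)
                    and all(p is None or p == v
                            for p, v in zip(pattern, t))]

    def take(self, pattern):
        """Remove and return matching tuples (retract stale mappings)."""
        matches = self.read(pattern)
        with self._lock:
            self._tuples.difference_update(matches)
        return matches

space = TupleSpace()
# Assumed layout: ("mapping", source concept, target concept, relation)
space.write(("mapping", "OntoA:Car", "OntoB:Automobile", "equivalent"))
print(space.read(("mapping", "OntoA:Car", None, None)))
```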
1st International Semantic Web Working Symposium (SWWS-1)
No abstract available.
Developing a data quality scorecard that measures data quality in a data warehouse
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. The main purpose of this thesis is to develop a data quality scorecard (DQS) that aligns the data quality needs of the data warehouse (DW) stakeholder group with selected data quality dimensions. To comprehend the research domain, a general and systematic literature review (SLR) was carried out, after which the research scope was established. Using Design Science Research (DSR) as the methodology to structure the research, three iterations were carried out to achieve the research aim highlighted in this thesis. In the first iteration, the artefact was built from the results of the general and systematic literature review, and a data quality scorecard (DQS) was conceptualised. The results of the SLR and the recommendations for designing an effective scorecard provided the input for the development of the DQS. Using a System Usability Scale (SUS) to validate the usability of the DQS, the results of the first iteration suggest that the DW stakeholders found the DQS useful. The second iteration further evaluated the DQS through a run-through in the FMCG domain followed by semi-structured interviews. The thematic analysis of the semi-structured interviews demonstrated that the stakeholder participants found the DQS to be transparent, a useful additional reporting tool, well integrated, easy to use, and consistent, and that it increases confidence in the data. However, the timeliness data dimension was found to be redundant, necessitating a modification to the DQS. The third iteration followed similar steps to the second but applied the modified DQS in the oil and gas domain. The results from the third iteration suggest that the DQS is a useful tool that is easy to use on a daily basis. The research contributes to theory by demonstrating a novel approach to DQS design: the design of the DQS aligns with the data quality concern areas of the DW stakeholders and with the data quality dimensions. Further, this research lays a good foundation for future work by establishing a DQS model that can be used as a base for further development.
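For reference, the System Usability Scale used to validate the DQS is scored with a standard formula: odd-numbered items contribute (score - 1), even-numbered items contribute (5 - score), and the total is multiplied by 2.5 to give a 0-100 score. A short Python sketch with hypothetical responses:

```python
# Standard SUS scoring; the ratings below are hypothetical, not data
# from the thesis. Each of the 10 items is rated 1-5.
def sus_score(responses):
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    # Index 0 is item 1 (odd-numbered), so even indices use (r - 1).
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# One hypothetical stakeholder's ratings for items 1..10:
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```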