
    Towards automated knowledge-based mapping between individual conceptualisations to empower personalisation of Geospatial Semantic Web

    The geospatial domain is characterised by vagueness, especially in the semantic disambiguation of its concepts, which makes defining a universally accepted geo-ontology an onerous task. This is compounded by the lack of appropriate methods and techniques with which individual semantic conceptualisations can be captured and compared to each other. With multiple user conceptualisations, efforts towards a reliable Geospatial Semantic Web therefore require personalisation that incorporates user diversity. The work presented in this paper is part of our ongoing research on applying commonsense reasoning to elicit and maintain models that represent users' conceptualisations. Such user models make it possible to take the users' perspective of the real world into account and will empower personalisation algorithms for the Semantic Web. Intelligent information processing over the Semantic Web can be achieved if different conceptualisations can be integrated in a semantic environment and mismatches between them can be outlined. In this paper, a formal approach for detecting mismatches between a user's and an expert's conceptual model is outlined. The formalisation is used as the basis for developing algorithms that compare models defined in OWL. The algorithms are illustrated in a geographical domain using concepts from the SPACE ontology, developed by NASA as part of the SWEET suite of ontologies for the Semantic Web, and are evaluated by comparing test cases of possible user misconceptions.
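    The abstract does not give the comparison algorithm itself; as a rough illustration of the kind of mismatch detection it describes, the sketch below compares the subclass hierarchies asserted in a user's and an expert's OWL model with rdflib. The file names and the restriction to rdfs:subClassOf are assumptions for illustration, not the paper's method.

```python
# Illustrative sketch only (not the paper's algorithm): flag subsumption
# mismatches between a user's and an expert's OWL model using rdflib.
# The file names below are assumed placeholders.
from rdflib import Graph
from rdflib.namespace import OWL, RDFS

def subclass_pairs(graph):
    """Collect the (subclass, superclass) pairs asserted in a model."""
    return {(s, o) for s, _, o in graph.triples((None, RDFS.subClassOf, None))
            if s != o and o != OWL.Thing}

expert = Graph().parse("expert_space_ontology.owl", format="xml")  # e.g. SWEET SPACE concepts
user = Graph().parse("user_conceptualisation.owl", format="xml")

expert_pairs = subclass_pairs(expert)
user_pairs = subclass_pairs(user)

# Pairs the user asserts but the expert does not: candidate misconceptions.
for sub, sup in sorted(user_pairs - expert_pairs):
    print(f"Possible misconception: user places {sub} under {sup}")

# Pairs the expert asserts but the user omits: possible gaps in the user model.
for sub, sup in sorted(expert_pairs - user_pairs):
    print(f"Possible gap: user model omits {sub} under {sup}")
```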

    A conceptual architecture for semantic web services development and deployment

    Several extensions of the Web Services Framework (WSF) have been proposed. The combination with Semantic Web technologies introduces a notion of semantics that can enhance scalability through automation, and the composition of services into processes is an equally important issue. Ontology technology, the core of the Semantic Web, can be the central building block of such an extension endeavour. As the development of service-based software systems within the WSF gains increasing importance, we present a conceptual architecture for ontology-based Web service development and deployment, and show how ontologies can integrate models, languages, infrastructure and activities within this architecture to support the reuse and composition of semantic Web services.
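    As a loose illustration of the automation that ontology-typed service descriptions enable, the sketch below chains services whose output concept matches the next service's input concept. The service registry, concept identifiers and the exact-match rule are assumptions for illustration; a real matcher in such an architecture would also exploit subsumption reasoning.

```python
# Rough sketch of ontology-driven service composition (assumed names, not the
# paper's architecture): services advertise input/output concepts, and a simple
# planner chains services whose output concept matches the next one's input.
from dataclasses import dataclass

@dataclass(frozen=True)
class Service:
    name: str
    input_concept: str   # ontology concept required as input
    output_concept: str  # ontology concept produced as output

REGISTRY = [
    Service("Geocoder", "ex:PostalAddress", "ex:GeoCoordinate"),
    Service("WeatherLookup", "ex:GeoCoordinate", "ex:WeatherReport"),
]

def compose(start_concept, goal_concept, registry):
    """Greedy chaining by exact concept match; bounded by the registry size."""
    chain, current = [], start_concept
    for _ in range(len(registry)):
        if current == goal_concept:
            return chain
        step = next((s for s in registry if s.input_concept == current), None)
        if step is None:
            break
        chain.append(step.name)
        current = step.output_concept
    return chain if current == goal_concept else None

print(compose("ex:PostalAddress", "ex:WeatherReport", REGISTRY))  # ['Geocoder', 'WeatherLookup']
```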

    A framework for integrating syntax, semantics and pragmatics for computer-aided professional practice: With application of costing in construction industry

    Producing a bill of quantities is a knowledge-based, dynamic and collaborative process that evolves with variances and current evidence. However, within the context of information system practice in BIM, knowledge of cost estimation has not been represented, nor has it been integrated into BIM-based processes. This paper establishes an innovative means of taking data from the BIM model linked to a project and using it to create the items of a bill of quantities, so that cost estimation can be undertaken for the project. Our framework is founded on the view that three components are necessary to gain a full understanding of the domain being computerised: the information type to be assessed for compatibility (syntax), the definition of the pricing domain (semantics), and the precise implementation environment for the standards being considered (pragmatics). To achieve this, a prototype is created that automatically generates cost items for the bill of quantities by means of a semantic web ontology and a forward-chaining algorithm. Within this paper, ‘cost items’ signify the elements included in a bill of quantities, including their description, quantity and price. To validate the developed process, the authors applied it to the production of cost items, and the items created were compared with those produced by specialists. This innovative framework therefore introduces a new means of applying semantic web ontology and forward chaining to construction professional practice, resulting in automatic cost estimation. These key outcomes demonstrate that decoupling professional practice into the three components of syntax, semantics and pragmatics can provide tangible benefits to domain users.
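    As a minimal illustration of the forward-chaining step described above (not the paper's prototype), the sketch below derives bill-of-quantities cost items from a handful of BIM element facts. The element types, quantities and unit rates are invented example values.

```python
# Illustrative forward chaining from BIM element facts to cost items.
# All facts, rules and rates below are assumed example values.
bim_facts = [
    {"type": "Wall", "id": "W1", "area_m2": 42.0},
    {"type": "Slab", "id": "S1", "area_m2": 120.0},
]

# Each rule pairs a condition on a fact with an action that derives a cost item.
rules = [
    (lambda f: f["type"] == "Wall",
     lambda f: {"item": f"Blockwork to wall {f['id']}",
                "qty": f["area_m2"], "unit": "m2", "rate": 55.0}),
    (lambda f: f["type"] == "Slab",
     lambda f: {"item": f"Concrete slab {f['id']}",
                "qty": f["area_m2"], "unit": "m2", "rate": 80.0}),
]

def forward_chain(facts, rules):
    """Fire every rule whose condition matches a fact; collect the derived cost items."""
    derived = []
    for fact in facts:
        for condition, action in rules:
            if condition(fact):
                derived.append(action(fact))
    return derived

for ci in forward_chain(bim_facts, rules):
    print(f"{ci['item']}: {ci['qty']} {ci['unit']} @ {ci['rate']} = {ci['qty'] * ci['rate']:.2f}")
```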

    Conceptual Modelling and The Quality of Ontologies: Endurantism Vs. Perdurantism

    Ontologies are key enablers for sharing precise and machine-understandable semantics among different applications and parties. Yet, for ontologies to meet these expectations, their quality must be of a good standard. The quality of an ontology is strongly dependent on the design method employed. This paper addresses the design problems related to the modelling of ontologies, with specific focus on issues related to the quality of the conceptualisations produced. The paper aims to demonstrate the impact of the adopted modelling paradigm on the quality of ontological models and, consequently, the potential impact that such a decision can have on the development of software applications. To this end, an ontology conceptualised with the Object-Role Modelling (ORM) approach (a representative of endurantism) is re-engineered into one modelled on the basis of the Object Paradigm (OP) (a representative of perdurantism). The two ontologies are then compared analytically against a set of specified criteria. The comparison highlights that using the OP for ontology conceptualisation can produce more expressive, reusable, objective and temporal ontologies than those conceptualised with the ORM approach.
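    To make the endurantism/perdurantism contrast concrete, the sketch below shows a snapshot-style relation (no temporal extent recorded) next to a time-indexed one that can answer queries about change over time. The class and attribute names are illustrative and are not drawn from the compared ontologies.

```python
# Illustrative contrast between a snapshot-style (endurantist-flavoured) relation
# and a time-indexed (perdurantist-flavoured) one. Names are assumed examples.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class EmploymentSnapshot:
    """Holds 'now' only: the model cannot say when the relation held."""
    person: str
    employer: str

@dataclass
class EmploymentPhase:
    """Reifies the relation as a temporal part with its own extent."""
    person: str
    employer: str
    start: date
    end: Optional[date]  # None = still ongoing

history = [
    EmploymentPhase("Alice", "Acme Ltd", date(2018, 1, 1), date(2021, 6, 30)),
    EmploymentPhase("Alice", "Globex", date(2021, 7, 1), None),
]

def employer_on(person, when, phases):
    """A query the snapshot model cannot express: who employed `person` on `when`?"""
    for p in phases:
        if p.person == person and p.start <= when and (p.end is None or when <= p.end):
            return p.employer
    return None

print(employer_on("Alice", date(2020, 5, 1), history))  # -> Acme Ltd
```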

    A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows

    This thesis considers an application of a temporal theory to describe and model the patient journey in the hospital accident and emergency (A&E) department. The aim is to introduce a generic but dynamic method that can be applied to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare processes. Current process modelling techniques used in healthcare, such as flowcharts, Unified Modelling Language activity diagrams (UML AD) and Business Process Modelling Notation (BPMN), are intuitive but imprecise: they cannot capture the complexity of the activities or the full extent of the temporal constraints to a degree where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain. Additionally, current modelling standards offer no formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations such as the finish-start barrier. It is imperative to be able to specify temporal constraints between the starts and/or ends of processes, e.g., that the beginning of a process A precedes the start (or end) of a process B; however, these approaches fail to provide a mechanism for handling such temporal situations. A formal representation can assist in effective knowledge representation and quality enhancement concerning a process; it also helps to uncover the complexities of a system and to model it consistently, which is not possible with the existing modelling techniques. The above issues are addressed in this thesis by proposing a framework that provides a knowledge base for modelling patient flows accurately, based on point interval temporal logic (PITL), which treats points and intervals as primitives. These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, exhaustive temporal constraints derived from the components of the proposed axiomatic system serve as a knowledge base. The proposed methodological framework adopts a model-theoretic approach in which a theory is developed and considered as a model, while the corresponding instance is considered as its application. This approach assists in identifying the core components of the system and their precise operation, representing a real-life domain suited to the process modelling issues specified in this thesis. Thus, I have evaluated the modelling standards for their most-used terminologies and constructs to identify their key components; this also assists in generalising the critical terms of the process modelling standards based on their ontology. The proposed set of generalised terms serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The resulting catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, a resolution theorem proof is used to show the structural features of the theory (model) and to establish that it is sound and complete. After establishing that the theory is sound and complete, the next step is to provide the instantiation of the theory, achieved by mapping the core components of the theory to their corresponding instances. Additionally, a formal graphical tool termed the point graph (PG) is used to visualise the cases of the proposed axiomatic system. The PG facilitates modelling and scheduling of patient flows, and enables existing models to be analysed for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL. Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*) based on the semantics presented by the axiomatic system. A real-life case from the King’s College Hospital accident and emergency (A&E) department’s trauma patient pathway is considered to validate the framework. It is divided into three patient flows depicting the journey of a patient with significant trauma: arriving at A&E, undergoing a procedure and subsequently being discharged. The department’s staff relied upon UML AD and BPMN to model these patient flows, and an evaluation of their representation is presented to show the shortfalls of the modelling standards in modelling patient flows. The last step is to model these patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
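    As a rough, simplified illustration of the point-and-interval primitives and the kind of start/end constraints discussed above (an assumed encoding, not the thesis's PITL axiomatisation), the sketch below checks a start-precedes-start constraint against the finish-start barrier assumed by CPM/PERT-style scheduling.

```python
# Simplified sketch: points and intervals as primitives, with two of the
# temporal constraints discussed above. The activities and times are invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class Point:
    t: float  # time stamp of an instantaneous event

@dataclass(frozen=True)
class Interval:
    name: str
    start: Point
    end: Point

def starts_before_start(a: Interval, b: Interval) -> bool:
    """Constraint: the beginning of process a precedes the start of process b."""
    return a.start.t < b.start.t

def ends_before_start(a: Interval, b: Interval) -> bool:
    """The finish-start barrier assumed by CPM/PERT-style scheduling."""
    return a.end.t <= b.start.t

triage = Interval("triage", Point(0.0), Point(10.0))
treatment = Interval("treatment", Point(5.0), Point(45.0))  # overlaps triage

print(starts_before_start(triage, treatment))  # True: expressible as a PITL-style constraint
print(ends_before_start(triage, treatment))    # False: ruled out by a finish-start barrier
```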

    The Knowledge Life Cycle for e-learning

    In this paper, we examine the semantic aspects of e-learning from both pedagogical and technological points of view. We suggest that if semantics are to fulfil their potential in the learning domain, then a paradigm shift in perspective is necessary: from information-based content delivery to knowledge-based collaborative learning services. We propose a semantics-driven Knowledge Life Cycle that characterises the key phases in managing semantics and knowledge, show how it can be applied to the learning domain, and demonstrate the value of semantics via an example of knowledge reuse in learning assessment management.
