Conceptual modelling and the quality of ontologies: A comparison between object-role modelling and the object paradigm
Ontologies are key enablers for sharing precise and machine-understandable semantics among different applications and parties. Yet, for ontologies to meet these expectations, their quality must be of a good standard. The quality of an ontology depends strongly on the design method employed. This paper addresses the design problems related to the modelling of ontologies, concentrating on issues affecting the quality of the conceptualisations produced. It aims to demonstrate the impact of the adopted modelling paradigm on the quality of ontological models and, consequently, the potential impact that this decision can have on the development of software applications. To this end, an ontology conceptualised with the Object Role Modelling (ORM) approach is re-engineered into one modelled on the basis of the Object Paradigm (OP). The two ontologies are then analytically compared against specified criteria. The comparison highlights that using the OP for ontology conceptualisation can yield more expressive, reusable, objective and temporal ontologies than the ORM approach.
Conceptual Modelling and The Quality of Ontologies: Endurantism Vs. Perdurantism
Ontologies are key enablers for sharing precise and machine-understandable semantics among different applications and parties. Yet, for ontologies to meet these expectations, their quality must be of a good standard. The quality of an ontology depends strongly on the design method employed. This paper addresses the design problems related to the modelling of ontologies, concentrating on issues affecting the quality of the conceptualisations produced. It aims to demonstrate the impact of the adopted modelling paradigm on the quality of ontological models and, consequently, the potential impact that this decision can have on the development of software applications. To this end, an ontology conceptualised with the Object-Role Modelling (ORM) approach (a representative of endurantism) is re-engineered into one modelled on the basis of the Object Paradigm (OP) (a representative of perdurantism). The two ontologies are then analytically compared against specified criteria. The comparison highlights that using the OP for ontology conceptualisation can yield more expressive, reusable, objective and temporal ontologies than the ORM approach.
BPMN: A Meta Model for the Happy Path
Recently, the OMG has been working on developing a new standard for a business process modeling notation (BPMN). This standardisation effort results in documents that contain the newest approved version of a standard, or a standard proposal that can still be amended. It is our vision that such a standard document, which also serves as a specification for developers of BPMN modeling tools, could benefit from a fact-oriented model in which the same domain knowledge is represented conceptually as a list of concept definitions (including naming conventions), a set of information structure diagrams, and the constraints or business rules that govern the instances of those diagrams. In this paper we show precisely how such a fact-oriented conceptual view on a standard document can be created, and how a fact-oriented approach can improve the completeness of a specification.
Keywords: management information
Referent tracking for corporate memories
For corporate memory and enterprise ontology systems to be maximally useful,
they must be freed from certain barriers placed around them by traditional
knowledge management paradigms. This means, above all, that they must mirror
more faithfully those portions of reality which are salient to the workings of the
enterprise, including the changes that occur with the passage of time. The purpose
of this chapter is to demonstrate how theories based on philosophical realism can
contribute to this objective. We discuss how realism-based ontologies (capturing
what is generic) combined with referent tracking (capturing what is specific) can
play a key role in building the robust and useful corporate memories of the future.
Data in Business Process Models. A Preliminary Empirical Study
Traditional activity-centric process modeling languages treat data as simple black boxes acting as input or output for activities. Many alternative and emerging process modeling paradigms, such as case handling and artifact-centric process modeling, give data a more central role. This is achieved by introducing lifecycles and states for data objects, which is beneficial when modeling data- or knowledge-intensive processes. We assume that traditional activity-centric process modeling languages lack the capabilities to adequately capture the complexity of such processes. To verify this assumption, we conducted an online interview among BPM experts. The results not only allow us to identify various profiles of persons modeling business processes, but also the problems that exist in contemporary modeling languages with respect to the modeling of business data. Overall, this preliminary empirical study confirms the necessity of data-awareness in process modeling notations in general.
A mathematical resurgence of risk management: an extreme modeling of expert opinions
The Operational Risk Advanced Measurement Approach requires financial institutions to use scenarios to model operational risks and to evaluate the pertaining capital charges. Considering that a banking group is composed of numerous entities (branches and subsidiaries), each of which is represented by an Operational Risk Manager (ORM), we propose a novel scenario approach based on ORM expertise to collect information and create new data sets focusing on large losses, and the use of Extreme Value Theory (EVT) to evaluate the corresponding capital allocation. In this paper, we highlight the importance of combining the a priori knowledge of the experts with an a posteriori backtesting based on collected incidents.
Keywords: Basel II; operational risks; EVT; AMA; expert; Value-at-Risk; expected shortfall
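The EVT-based capital evaluation mentioned in this abstract can be illustrated with a minimal peaks-over-threshold sketch: fit a Generalized Pareto Distribution to losses above a threshold, then derive Value-at-Risk and expected shortfall from the fitted tail. This is not the authors' model; the synthetic loss data, the 95% threshold, and the 99.9% confidence level are assumptions chosen for illustration.

```python
# Illustrative peaks-over-threshold (POT) sketch, not the paper's model.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
losses = rng.lognormal(mean=10, sigma=1.2, size=5000)  # synthetic loss data

u = np.quantile(losses, 0.95)          # threshold separating "large" losses
excesses = losses[losses > u] - u      # peaks over the threshold

# Fit a Generalized Pareto Distribution to the excesses (location fixed at 0)
xi, _, beta = genpareto.fit(excesses, floc=0)

# Semi-parametric VaR at level p (99.9% is an assumed confidence level)
p = 0.999
n, n_u = len(losses), len(excesses)
var_p = u + (beta / xi) * ((n / n_u * (1 - p)) ** (-xi) - 1)

# Expected shortfall implied by the GPD tail (valid for xi < 1)
es_p = var_p / (1 - xi) + (beta - xi * u) / (1 - xi)

print(f"VaR at {p:.1%}: {var_p:,.0f}, expected shortfall: {es_p:,.0f}")
```

The VaR and expected shortfall formulas are the standard POT estimators; in practice the excesses would come from the expert-elicited scenario data described in the abstract rather than a simulated sample.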
An Aggregated Information Technology Checklist for Operational Risk Management
This study addresses the issue of Information Technology (IT) governance frameworks and standards that respond to different levels of operational risk, especially risks caused by the information systems and technology infrastructure. A requirement analysis regarding Basel II is conducted, a gap analysis between the Information Control Models (ICMs) is performed, and an aggregated IT checklist for Operational Risk Management (ORM) is proposed by mapping the control objectives in the ICMs to the operational risk categories described in Basel II as loss event types. The validity and reliability of the study are based on a focus group assessment of the mappings.
Keywords: Basel II, Operational Risk Management, Information Control Model, Information Technology Governance
A Configurable Matchmaking Framework for Electronic Marketplaces
E-marketplaces constitute a major enabler of B2B and B2C e-commerce activities. This paper proposes a framework for one of the central activities of e-marketplaces: the matchmaking of trading intentions lodged by market participants. The framework identifies a core set of concepts and functions that are common to all types of marketplaces and can serve as the basis for describing the distinct styles of matchmaking employed within various market mechanisms. A prototype implementation of the framework based on Web services technology is presented, illustrating its ability to be dynamically configured to meet specific market needs and its potential to serve as a foundation for more fully fledged e-marketplace frameworks.