Ontology-driven conceptual modeling: A systematic literature mapping and review
Ontology-driven conceptual modeling (ODCM) is still a relatively new research domain in the field of information systems, and there is still much discussion on how research in ODCM should be performed and what its focus should be. This article therefore critically surveys the existing literature in order to assess the kind of research that has been performed over the years, analyze the nature of the research contributions, and establish the current state of the art by positioning, evaluating and interpreting relevant research related to ODCM. To identify gaps and research opportunities, our literature study combines a systematic mapping study and a systematic review study. The mapping study structures and classifies the area under investigation in order to give a general overview of the research performed in the field. A review study, on the other hand, is a more thorough and rigorous inquiry and provides recommendations based on the strength of the evidence found. Our results indicate several research gaps that should be addressed, and we identify several research opportunities as possible areas for future research.
Ontology modelling methodology for temporal and interdependent applications
The increasing adoption of Semantic Web technology by several classes of applications in recent years has made ontology engineering a crucial part of application development. Nowadays, the abundant availability of interdependent information from multiple resources, representing various fields such as health, transport, and banking, further evidences the growing need for utilising ontologies in the development of Web applications. While there have been several advances in the adoption of ontologies for application development, less emphasis has been placed on modelling methodologies for representing modern-day applications that are characterised by the temporal nature of the data they process, which is captured from multiple sources. Taking into account the benefits of a methodology in system development, we propose a novel methodology for modelling ontologies representing Context-Aware Temporal and Interdependent Systems (CATIS). CATIS is an ontology development methodology for modelling temporal interdependent applications, designed to achieve the desired results when modelling sophisticated applications with temporal and interdependent attributes to suit today's application requirements.
Modelling the integration of BP and IT using business process simulation
The Information Technology (IT) and Business Process (BP) communities argue that the use of IT to support business processes can bring a number of benefits to the organisation. Most of these benefits, however, can only be seen after the implementation of such technology. Moreover, there are many cases where the benefits brought by the implementation of IT do not fulfil the organisation's expectations. One reason this may happen is that research in the BP and IS domains shows little indication of which modelling methods, techniques or tools can help organisations to foresee the benefits of integrating IT with BP. This paper describes the insights gained during a UK funded research project, namely ASSESS-IT, which used simulation techniques to address this problem. Considering IT as a two-layered system, namely Information Systems (IS) and Computer Networks (CN), ASSESS-IT aimed to depict the benefits that new IT may bring to the BP. This paper uses the outcomes derived from ASSESS-IT to suggest that, in some cases, the relationship between BP and IT could be better understood by looking at the relationship between BP and IS alone. It then proposes an alternative simulation framework, namely ISBPS, that provides the means to develop simulation models that portray quantifiable metrics of the integration of BP and IS, offering in this way an alternative mechanism that can help BP and IS analysts to foresee the benefits that the insertion of a given IS design may bring to organisational processes.
Accurate macroscale modelling of spatial dynamics in multiple dimensions
Developments in dynamical systems theory provide new support for the
macroscale modelling of PDEs and other microscale systems such as Lattice
Boltzmann, Monte Carlo or Molecular Dynamics simulators. By systematically
resolving subgrid microscale dynamics the dynamical systems approach constructs
accurate closures of macroscale discretisations of the microscale system. Here
we specifically explore reaction-diffusion problems in two spatial dimensions
as a prototype of generic systems in multiple dimensions. Our approach unifies
into one the modelling of systems by a type of finite elements, and the
`equation free' macroscale modelling of microscale simulators efficiently
executing only on small patches of the spatial domain. Centre manifold theory
ensures that a closed model exists on the macroscale grid, is emergent, and is
systematically approximated. Dividing space either into overlapping finite
elements or into spatially separated small patches, the specially crafted
inter-element/patch coupling also ensures that the constructed discretisations
are consistent with the microscale system/PDE to as high an order as desired.
Computer algebra handles the considerable algebraic details as seen in the
specific application to the Ginzburg--Landau PDE. However, higher order models
in multiple dimensions require a mixed numerical and algebraic approach that is
also developed. The modelling here may be straightforwardly adapted to a wide
class of reaction-diffusion PDEs and lattice equations in multiple space
dimensions. When applied to patches of microscopic simulations, our coupling
conditions promise efficient macroscale simulation.
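The patch idea above can be illustrated concretely. The following is a minimal one-dimensional sketch, not the authors' scheme: micro-dynamics (plain diffusion, standing in for the reaction-diffusion systems discussed) run only on small patches, and patch-edge values are set by interpolating neighbouring patch-centre values, which is the inter-patch coupling the abstract refers to. All grid sizes and parameters are illustrative assumptions.

```python
import numpy as np

n_patch, n_micro = 10, 9           # 10 patches, 9 micro points each
H = 2 * np.pi / n_patch            # macro spacing of patch centres
dx = 0.1 * H                       # micro grid spacing (small patches)
dt = 0.4 * dx**2                   # stable explicit-Euler step for diffusion
mid = n_micro // 2
r = (mid * dx) / H                 # patch half-width as a fraction of H

X = H * np.arange(n_patch)                      # patch-centre locations
x = X[:, None] + dx * (np.arange(n_micro) - mid)  # micro grids in each patch
U = np.sin(x)                                   # initial condition

def couple(U):
    """Set patch-edge values by quadratic interpolation of patch centres."""
    C = U[:, mid]
    Cl, Cr = np.roll(C, 1), np.roll(C, -1)      # periodic neighbour centres
    d1, d2 = (Cr - Cl) / 2, Cr - 2 * C + Cl
    U[:, -1] = C + r * d1 + r**2 * d2 / 2
    U[:,  0] = C - r * d1 + r**2 * d2 / 2

for _ in range(2000):
    couple(U)                                   # inter-patch coupling
    # micro-dynamics u_t = u_xx on patch interiors only
    U[:, 1:-1] += dt * (U[:, 2:] - 2 * U[:, 1:-1] + U[:, :-2]) / dx**2

# for u_t = u_xx with u0 = sin x, the macroscale field decays like exp(-t)
T = 2000 * dt
print(np.max(np.abs(U[:, mid] - np.exp(-T) * np.sin(X))))
```

Although the micro-solver touches only 10% of the domain, the interpolated edge conditions make the patch-centre values track the macroscale diffusion of the full PDE, which is the efficiency claim of the abstract.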
Tools for modelling support and construction of optimization applications
We argue the case for an open systems approach towards modelling and application support. We discuss how the 'usability' and 'skills' analysis naturally leads to a viable strategy for integrating application construction with modelling tools and optimizers. The role of the implementation environment is also seen to be critical in that it is retained as a building block within the resulting system
Business process simulation: An alternative modelling technique for the information system development process
This paper discusses the idea that even though information systems development
(ISD) approaches have long advocated the use of integrated organisational views, the
modelling techniques used have not been adapted accordingly and remain focused on
the automated information system (IS) solution. Existing research provides evidence
that business process simulation (BPS) can be used at different points in the ISD
process to provide better integrated organisational views that aid the design of
appropriate IS solutions. Despite this fact, research in this area is not extensive,
suggesting that the potential of using BPS for the ISD process is not yet well
understood. The paper uses the findings from three different case studies to illustrate
the ways BPS has been used at different points in the ISD process. It compares the
results against IS modelling techniques, highlighting the advantages and
disadvantages that BPS has over the latter. The research necessary to develop
appropriate BPS tools and give guidance on their use in the ISD process is discussed
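To make the kind of quantifiable view discussed above concrete, here is a minimal discrete-event sketch in the spirit of BPS, not taken from the case studies: one clerk processes arriving claims, and the simulation compares average cycle time for the as-is process against the same process with an IS that shortens service. All rates are hypothetical illustration values.

```python
import random

def simulate(service_mean, n_jobs=5000, arrival_mean=1.0, seed=42):
    """Single-server queue: return the average cycle (wait + service) time."""
    rng = random.Random(seed)
    t = 0.0            # simulation clock driving arrivals
    free_at = 0.0      # time at which the clerk next becomes free
    total_cycle = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(1.0 / arrival_mean)   # next claim arrives
        start = max(t, free_at)                    # wait if clerk is busy
        free_at = start + rng.expovariate(1.0 / service_mean)
        total_cycle += free_at - t                 # this claim's cycle time
    return total_cycle / n_jobs

as_is   = simulate(service_mean=0.8)   # manual process
with_is = simulate(service_mean=0.5)   # same process supported by an IS
print(as_is, with_is)
```

The point such a model makes before any implementation is that a modest cut in service time collapses queueing delay disproportionately, which is exactly the kind of organisational consequence the paper argues conventional IS modelling techniques do not expose.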
Modelling and simulation framework for reactive transport of organic contaminants in bed-sediments using a pure Java object-oriented paradigm
Numerical modelling and simulation of organic contaminant reactive transport in the environment is being increasingly
relied upon for a wide range of tasks associated with risk-based decision-making, such as prediction of contaminant
profiles, optimisation of remediation methods, and monitoring of changes resulting from an implemented remediation
scheme. The lack of integration of multiple mechanistic models to a single modelling framework, however, has
prevented the field of reactive transport modelling in bed-sediments from developing a cohesive understanding of
contaminant fate and behaviour in the aquatic sediment environment. This paper will investigate the problems involved
in the model integration process, discuss modelling and software development approaches, and present preliminary
results from use of CORETRANS, a predictive modelling framework that simulates 1-dimensional organic contaminant
reaction and transport in bed-sediments
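The class of model the abstract describes can be sketched in a few lines. This is a generic 1-D diffusion-plus-first-order-degradation column, u_t = D u_zz - k u, not CORETRANS itself; the coefficients and boundary conditions are hypothetical illustration values.

```python
import numpy as np

D, k = 1e-9, 1e-7        # m^2/s pore-water diffusion, 1/s first-order decay
dz, nz = 0.002, 50       # 2 mm grid over roughly a 10 cm sediment column
dt = 0.25 * dz**2 / D    # stable explicit time step
u = np.zeros(nz)         # contaminant concentration profile (relative units)
u[0] = 1.0               # fixed concentration at the sediment-water interface

for _ in range(5000):
    # explicit update: diffusion into the bed plus first-order degradation
    u[1:-1] += dt * (D * (u[2:] - 2 * u[1:-1] + u[:-2]) / dz**2 - k * u[1:-1])
    u[0], u[-1] = 1.0, u[-2]   # Dirichlet top, zero-flux bottom

print(u[:10])  # concentration profile near the interface
```

A framework like the one described wraps each mechanism (transport, reaction, sorption) as an object so that such terms can be composed and swapped, which is precisely the integration problem the paper investigates.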
A look at cloud architecture interoperability through standards
Enabling cloud infrastructures to evolve into a transparent platform while preserving integrity raises interoperability issues. How components are connected needs to be addressed. Interoperability requires standard data models and communication encoding technologies compatible with the existing Internet infrastructure. To reduce vendor lock-in, cloud computing must implement universal strategies regarding standards, interoperability and portability. Open standards are of critical importance and need to be embedded into interoperability solutions. Interoperability is determined at the data level as well as the service level. The corresponding modelling standards and integration solutions are analysed.
Synthetic Observational Health Data with GANs: from slow adoption to a boom in medical research and ultimately digital twins?
After being collected for patient care, Observational Health Data (OHD) can
further benefit patient well-being by sustaining the development of health
informatics and medical research. Vast potential is unexploited because of the
fiercely private nature of patient-related data and regulations to protect it.
Generative Adversarial Networks (GANs) have recently emerged as a
groundbreaking way to learn generative models that produce realistic synthetic
data. They have revolutionized practices in multiple domains such as
self-driving cars, fraud detection, digital twin simulations in industrial
sectors, and medical imaging.
The digital twin concept could readily apply to modelling and quantifying
disease progression. In addition, GANs possess many capabilities relevant to
common problems in healthcare: lack of data, class imbalance, rare diseases,
and preserving privacy. Unlocking open access to privacy-preserving OHD could
be transformative for scientific research. In the midst of COVID-19, the
healthcare system is facing unprecedented challenges, many of which are
data-related for the reasons stated above.
Considering these facts, publications concerning GANs applied to OHD seemed to
be severely lacking. To uncover the reasons for this slow adoption, we broadly
reviewed the published literature on the subject. Our findings show that the
properties of OHD were initially challenging for the existing GAN algorithms
(unlike medical imaging, for which state-of-the-art models were directly
transferable) and the evaluation of synthetic data lacked clear metrics.
We find more publications on the subject than expected, starting slowly in
2017, and since then at an increasing rate. The difficulties of OHD remain, and
we discuss issues relating to evaluation, consistency, benchmarking, data
modelling, and reproducibility.
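The evaluation problem flagged above can be made concrete. Training a GAN is beyond a short sketch, so below a Gaussian sampler fitted to the "real" table stands in for a trained generator; the point is the fidelity check itself: comparing per-column means and the correlation structure of real versus synthetic records. The columns (age, systolic blood pressure) and all numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# "real" records: age and systolic blood pressure, positively correlated
n = 20_000
age = rng.normal(55, 12, n)
sbp = 100 + 0.5 * age + rng.normal(0, 8, n)
real = np.column_stack([age, sbp])

# stand-in generator: sample from the joint Gaussian fitted to the real data
mu, cov = real.mean(axis=0), np.cov(real, rowvar=False)
synthetic = rng.multivariate_normal(mu, cov, size=n)

# simple fidelity metrics: marginal means and pairwise correlation
mean_gap = np.abs(real.mean(axis=0) - synthetic.mean(axis=0))
corr_gap = abs(np.corrcoef(real.T)[0, 1] - np.corrcoef(synthetic.T)[0, 1])
print(mean_gap, corr_gap)
```

Real OHD is mixed-type, sparse, and longitudinal, so such marginal and pairwise checks are necessary but far from sufficient, which is the gap in evaluation metrics the review discusses.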