21,430 research outputs found
Digital ecosystem ontology
Digital ecosystem is a neoteric term with two major definitions, from Soluta.Net and from the Digital Ecosystem and Business Intelligence Institute respectively. In this paper, to resolve the ambiguity in the definitions of digital ecosystem and to help researchers better understand what it is, we use ontology to propose a conceptual model that comprehensively illustrates the concepts in a digital ecosystem. By introducing a new ontology notation system, we deliver the digital ecosystem ontology, which defines the components and explains the relationships between them. Finally, we realize the ontology in Protégé-OWL and outline our future work in the field.
Value-driven partner search for Energy from Waste projects
Energy from Waste (EfW) projects require complex value chains to operate effectively. To identify business partners, plant operators need to network with organisations whose strategic objectives are aligned with their own, and supplier organisations need to work out where they fit in the value chain. Our aim is to support people in identifying potential business partners, based on their organisation’s interpretation of value. Value for an organisation should reflect its strategy and may be interpreted using key priorities and KPIs (key performance indicators). KPIs may comprise any or all of knowledge, operational, economic, social and convenience indicators. This paper presents an ontology for modelling and prioritising connections within the business environment, which in the process provides a means for defining values and mapping them to corresponding KPIs. The ontology is used to guide the design of a visual representation of the environment to aid partner search.
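The KPI-driven notion of value described in this abstract can be sketched as a simple ranking: an organisation weights the five KPI categories according to its strategy, and candidate partners are scored against those weights. The category names come from the abstract; the weights, scores and organisation names below are hypothetical illustrations, not data from the paper.

```python
# Sketch: rank candidate partners by weighted KPI alignment.
# KPI categories follow the abstract: knowledge, operational,
# economic, social and convenience indicators.

KPI_CATEGORIES = ["knowledge", "operational", "economic", "social", "convenience"]

def partner_score(weights: dict, partner_kpis: dict) -> float:
    """Weighted sum of a partner's KPI scores under one organisation's priorities."""
    return sum(weights.get(c, 0.0) * partner_kpis.get(c, 0.0) for c in KPI_CATEGORIES)

def rank_partners(weights: dict, candidates: dict) -> list:
    """Return candidate partner names, best-aligned first."""
    return sorted(candidates,
                  key=lambda name: partner_score(weights, candidates[name]),
                  reverse=True)

# Hypothetical EfW plant operator prioritising operational and economic value:
operator_weights = {"operational": 0.4, "economic": 0.3, "knowledge": 0.2, "social": 0.1}
candidates = {
    "WasteLogisticsCo": {"operational": 0.9, "economic": 0.6},
    "ResearchLab":      {"knowledge": 0.9, "social": 0.5},
}
print(rank_partners(operator_weights, candidates))  # WasteLogisticsCo ranks first
```

A real implementation would draw the weights and scores from instances of the paper's ontology rather than hard-coded dictionaries.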
Use of anti-terrorism digital ecosystem in the fight against terrorism
In this paper, we propose an Anti-terrorist Digital Ecosystem (ATDES) that enables efficient terrorist identification and protection against terrorist attacks. An Anti-terrorist Digital Environment (ATDE) is designed as being populated by interconnected Anti-terrorist Digital Components (ATDC). ATDC are combined to support collaboration, cooperation and sharing of available information between various regions, countries and even continents. An ATDC may be any useful idea that can be digitalized, transported within the ecosystem and processed by humans or by computers. The key ATDC include ID databases that contain personal records, screening components that read personal records and match them with the available information from the ID databases, and machine-readable personal records. The available information is put into one big virtual database and enables matching of personal records. If the available information is to be shared between various ID information resources, standardization of data needs to take place. Ontologies can be used for this purpose. Instantiation of the ontology concepts results in ID ontologies that act as personal records. Because ontology files are machine readable, it is possible to match personal records against the available ID records from the networked ID databases and to act on the results. The significance of this research lies in unifying the advances of ontology technology and the ecosystem paradigm to create a more secure environment in which to fight terrorism.
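The matching step described above can be sketched with plain dictionaries standing in for instantiated ID ontologies: once the ontology has standardised the field names, a screened record can be compared against every networked database as if they formed one virtual database. The database names, field names and record values below are hypothetical.

```python
# Sketch: match a screened personal record against networked ID databases.
# Each record stands in for an instantiated ID ontology; the shared field
# names (standardised by the ontology) are what make records comparable.

def records_match(a: dict, b: dict, keys=("name", "birth_date", "passport_no")) -> bool:
    """Two records match if every standardised field present in both agrees."""
    shared = [k for k in keys if k in a and k in b]
    return bool(shared) and all(a[k] == b[k] for k in shared)

def find_matches(query: dict, databases: dict) -> list:
    """Scan all networked databases (the 'one big virtual database') for matches."""
    hits = []
    for db_name, records in databases.items():
        for rec in records:
            if records_match(query, rec):
                hits.append((db_name, rec))
    return hits

databases = {
    "country_A": [{"name": "J. Doe", "birth_date": "1980-01-01", "passport_no": "X123"}],
    "country_B": [{"name": "J. Doe", "birth_date": "1980-01-01"}],
}
screened = {"name": "J. Doe", "birth_date": "1980-01-01", "passport_no": "X123"}
print(find_matches(screened, databases))  # hits in both databases
```

In the proposed system this comparison would operate on machine-readable ontology files rather than dictionaries, but the logic of matching on standardised fields is the same.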
Interoperability in the OpenDreamKit Project: The Math-in-the-Middle Approach
OpenDreamKit --- "Open Digital Research Environment Toolkit for the
Advancement of Mathematics" --- is an H2020 EU Research Infrastructure project
that aims at supporting, over the period 2015--2019, the ecosystem of
open-source mathematical software systems. To that end, OpenDreamKit will deliver
a flexible toolkit enabling research groups to set up Virtual Research
Environments, customised to meet the varied needs of research projects in pure
mathematics and applications.
An important step in the OpenDreamKit endeavor is to foster interoperability
among a variety of systems, ranging from computer algebra systems over
mathematical databases to front-ends. This is the mission of the integration
work package (WP6). We report on experiments and future plans with the
\emph{Math-in-the-Middle} approach. This information architecture consists of a
central mathematical ontology that documents the domain and fixes a joint
vocabulary, combined with specifications of the functionalities of the various
systems. Interaction between systems can then be enriched by pivoting off this
information architecture.
Comment: 15 pages, 7 figures
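The pivoting described in this abstract can be sketched as a composition of alignments: each system maps its own names onto concepts in the central ontology, and translation between two systems goes through that joint vocabulary. The system names, symbol names and concept URIs below are hypothetical illustrations, not the actual OpenDreamKit alignments.

```python
# Sketch: pivoting between systems via a central ontology.
# Each system publishes an alignment from its own names to the joint
# vocabulary; system-to-system translation composes two alignments.

# alignment: system-specific name -> central ontology concept
gap_to_mitm  = {"SymmetricGroup": "mitm:symmetric_group"}
sage_to_mitm = {"SymmetricGroup": "mitm:symmetric_group",
                "DihedralGroup":  "mitm:dihedral_group"}

def translate(name, source, target):
    """Translate a system-specific name into another system's name
    by pivoting through the shared central-ontology concept."""
    concept = source.get(name)
    if concept is None:
        return None
    # invert the target alignment: central concept -> target system's name
    inverse = {v: k for k, v in target.items()}
    return inverse.get(concept)

print(translate("SymmetricGroup", gap_to_mitm, sage_to_mitm))  # 'SymmetricGroup'
```

The point of the design is that each of n systems maintains one alignment to the central ontology instead of n-1 pairwise translations.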
Development of digital twin ecosystem and ontology in medicine
Summary: Providing citizens with high-quality, safe medical services, supporting medical research and continuing medical education, and informing both clinical and managerial decisions all require tools for the comprehensive digitization of healthcare. A wide range of modern technologies has emerged to achieve these goals; one such technology is the digital twin.
Modern medicine, formed in the environment of Health 4.0, includes not only the treatment of patients but also the management of healthcare, the prevention of disease and the processes of health restoration. With the increasing popularity of information and communication technologies, people’s demand for health services is shifting from offline service to new online models. Currently, the field of online medicine is not developed enough to serve the elderly, the chronically ill and people with infectious diseases. Leveraging the advantages of digital twins to address these problems can yield positive results.
The article describes the nature, capabilities and applications of digital twin technology. Principles for forming a medical digital twin ecosystem are developed to ensure citizens’ access to medical services and to support both medical and managerial decisions. The architecture and structural components of the digital twin ecosystem, providing the connection between physical medical objects (patient, hospital, doctor, etc.) and their virtual images, are shown. An ontological model for the staged construction and functionalization of the general digital twin (DT) of healthcare is proposed and its hierarchical architecture is established.
OntoMath 2.0 Ontology: Updates of the Formal Model
This paper is devoted to the problems of ontology-based mathematical
knowledge management and representation. The main attention is paid to the
development of a formal model for the representation of mathematical statements
in the Linked Open Data cloud. The proposed model is intended for applications
that extract mathematical facts from natural-language mathematical texts and
represent these facts as Linked Open Data. The model is used in the development
of a new version of the OntoMath ontology of professional mathematics.
OntoMath underlies a semantic publishing platform that takes as input a
collection of mathematical papers in LaTeX format and builds their
ontology-based Linked Open Data representation. The semantic publishing
platform, in turn, is a central component of the OntoMath digital ecosystem, an
ecosystem of ontologies, text analytics tools, and applications for
mathematical knowledge management, including semantic search for mathematical
formulas and a recommender system for mathematical papers. According to the new
model, the ontology is organized into three layers: a foundational ontology
layer, a domain ontology layer and a linguistic layer. The domain ontology
layer contains language-independent mathematical concepts. The linguistic layer
provides linguistic grounding for these concepts, and the foundational ontology
layer provides them with meta-ontological annotations. The concepts are
organized in two main hierarchies: the hierarchy of objects and the hierarchy
of reified relationships.
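The three-layer organisation described in this abstract can be sketched as a single concept carrying a language-independent identity (domain layer), lexicalizations in several languages (linguistic layer) and a meta-ontological annotation (foundational layer). The layer roles follow the abstract; the concrete URIs, annotation value and lexicalizations below are hypothetical.

```python
# Sketch: one concept seen through the three OntoMath-style layers.
from dataclasses import dataclass, field

@dataclass
class MathConcept:
    uri: str               # domain layer: language-independent identity
    meta_annotation: str   # foundational layer: meta-ontological category
    lexicalizations: dict = field(default_factory=dict)  # linguistic layer

triangle = MathConcept(
    uri="ontomath:Triangle",
    meta_annotation="dul:Object",  # hypothetical foundational-ontology category
    lexicalizations={"en": "triangle", "ru": "треугольник"},
)
print(triangle.lexicalizations["en"])  # 'triangle'
```

Keeping the linguistic grounding separate from the domain identity is what lets the same concept serve texts in different languages.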
Distributed human computation framework for linked data co-reference resolution
Distributed Human Computation (DHC) is a technique used to solve computational problems by incorporating the collaborative effort of a large number of humans. It is also a solution to AI-complete problems such as natural language processing. The Semantic Web, with its roots in AI, is envisioned to be a decentralised world-wide information space for sharing machine-readable data with minimal integration costs. Many research problems in the Semantic Web are considered AI-complete. An example is co-reference resolution, which involves determining whether different URIs refer to the same entity; this is considered a significant hurdle in the realisation of large-scale Semantic Web applications. In this paper, we propose a framework for building a DHC system on top of the Linked Data Cloud to solve various computational problems. To demonstrate the concept, we focus on co-reference resolution in the Semantic Web when integrating distributed datasets. The traditional way to solve this problem is to design machine-learning algorithms, but these are often computationally expensive, error-prone and do not scale. We designed a DHC system named iamResearcher, which solves the scientific-publication author identity co-reference problem when integrating distributed bibliographic datasets. In our system, we aggregated 6 million bibliographic records from various publication repositories. Users can sign up to the system to audit and align their own publications, thus solving the co-reference problem in a distributed manner. The aggregated results are published to the Linked Data Cloud.
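The aggregation step described above can be sketched as clustering: each human-verified alignment asserts that two URIs denote the same author (an owl:sameAs-style link), and union-find merges the pairwise judgements into co-reference clusters. The URIs below are hypothetical, and this is only a sketch of the clustering logic, not the iamResearcher implementation.

```python
# Sketch: clustering co-referent URIs from pairwise human judgements.

def cluster_coreferences(pairs):
    """Group URIs into co-reference clusters via union-find."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b in pairs:
        parent[find(a)] = find(b)          # merge the two clusters

    clusters = {}
    for uri in parent:
        clusters.setdefault(find(uri), set()).add(uri)
    return list(clusters.values())

pairs = [
    ("dblp:j-doe", "acm:jane-doe"),
    ("acm:jane-doe", "orcid:0000-0001"),   # transitively the same author
    ("dblp:j-smith", "acm:john-smith"),
]
print(cluster_coreferences(pairs))
```

Transitivity comes for free: two URIs never directly compared by any user still land in one cluster if a chain of verified pairs connects them.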