Cognitive Models and Computational Approaches for improving Situation Awareness Systems
2016 - 2017
The world of the Internet of Things is pervaded by complex environments, with smart services available anytime and everywhere. In such a context, a serious open issue is the capability of information systems to support adaptive and collaborative decision processes while perceiving and elaborating huge amounts of data. This requires the design and realization of novel socio-technical systems based on the "human-in-the-loop" paradigm. The presence of both humans and software in such systems demands adequate levels of Situation Awareness (SA). Achieving and maintaining proper levels of SA is a daunting task due to the intrinsic technical characteristics of systems and the limitations of human cognitive mechanisms. In the scientific literature, the issues that hinder the SA formation process are defined as SA demons.
The objective of this research is to contribute to the resolution of the SA demons through the identification of information-processing paradigms that provide original support to SA and the definition of new theoretical and practical approaches based on cognitive models and computational techniques.
The research work starts with an in-depth analysis and some
preliminary verifications of methods, techniques, and systems of
SA. A major outcome of this analysis is that there is only a limited
use of the Granular Computing paradigm (GrC) in the SA
field, despite the fact that SA and GrC share many concepts and
principles. The research work continues with the definition of contributions
and original results for the resolution of significant SA
demons, exploiting some of the approaches identified in the analysis
phase (i.e., ontologies, data mining, and GrC). The first contribution addresses the issues related to users' poor perception of data. We propose a semantic approach to quality-aware sensor data management that uses a data imputation technique based on association rule mining. The second contribution proposes
an original ontological approach to situation management,
namely the Adaptive Goal-driven Situation Management. The approach
uses the ontological modeling of goals and situations and
a mechanism that suggests the most relevant goals to the users at
a given moment. Lastly, the adoption of the GrC paradigm allows
the definition of a novel model for representing and reasoning
on situations based on a set theoretical framework. This model
has been instantiated using the rough sets theory. The proposed
approaches and models have been implemented in prototypical systems.
Their capabilities in improving SA in real applications have
been evaluated with typical methodologies used for SA systems. [edited by Author]
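The rough-set instantiation mentioned above can be illustrated with a minimal sketch (the function names and the toy sensor data are illustrative assumptions, not taken from the thesis): observations are partitioned into indiscernibility classes over selected attributes, and a target situation is then bracketed by its lower approximation (objects certainly in the situation) and upper approximation (objects possibly in it).

```python
from collections import defaultdict

def indiscernibility_classes(objects, attributes):
    """Partition objects by equality on the chosen attributes."""
    classes = defaultdict(set)
    for name, feats in objects.items():
        key = tuple(feats[a] for a in attributes)
        classes[key].add(name)
    return list(classes.values())

def approximations(objects, attributes, target):
    """Rough-set lower/upper approximation of a target set of objects."""
    lower, upper = set(), set()
    for cls in indiscernibility_classes(objects, attributes):
        if cls <= target:   # class entirely inside the target situation
            lower |= cls
        if cls & target:    # class overlaps the target situation
            upper |= cls
    return lower, upper

# Hypothetical sensor observations and a "critical situation" set.
obs = {
    "o1": {"temp": "high", "noise": "loud"},
    "o2": {"temp": "high", "noise": "loud"},
    "o3": {"temp": "high", "noise": "quiet"},
    "o4": {"temp": "low",  "noise": "quiet"},
}
critical = {"o1", "o3"}
low, up = approximations(obs, ["temp", "noise"], critical)
print(low)  # certainly critical
print(up)   # possibly critical
```

Here o1 and o2 are indiscernible, so neither can be certainly classified as critical even though o1 belongs to the target set; the boundary between the two approximations is exactly the uncertainty the model reasons about.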
Granular computing approach for intelligent classifier design
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University London.
Granular computing provides a theoretical framework for dealing with information as granules at different levels of granularity (different levels of specificity/abstraction). It aims to provide an abstract, explainable description of the data by forming granules that represent the features or the underlying structure of corresponding subsets of the data. In this thesis, a granular computing approach to the design of intelligent classification systems is proposed. The proposed approach is employed for different classification systems to investigate its efficiency. Fuzzy inference systems, neural networks, neuro-fuzzy systems and classifier ensembles are considered to evaluate the efficiency of the proposed approach. Each of the considered systems is designed using the proposed approach, and its classification performance is evaluated and compared to that of the standard system. The proposed approach is based on constructing information granules from data at multiple levels of granularity. The granulation process is performed using a modified fuzzy c-means algorithm that takes the classification problem into account. Clustering is followed by a coarsening process that merges small clusters into larger ones to form a lower granularity level. The resulting granules are used to build each of the considered binary classifiers in different settings and approaches.
Granules produced by the proposed granulation method are used to build a fuzzy classifier for each granulation level or set of levels. The performance of the classifiers is evaluated using real-life data sets and measured by two classification performance measures: accuracy and area under the receiver operating characteristic curve. Experimental results show that fuzzy systems constructed using the proposed method achieved better classification performance. In addition, the proposed approach is used for the design of neural network classifiers. The resulting granules from one or more granulation levels are used to train the classifiers at different levels of specificity/abstraction. Using this approach, the classification problem is broken down into the modelling of the classification rules represented by the information granules, resulting in a more interpretable system. Experimental results show that neural network classifiers trained using the proposed approach have better classification performance for most of the data sets. In a similar manner, the proposed approach is used for the training of neuro-fuzzy systems, resulting in a similar improvement in classification performance. Lastly, neural networks built using the proposed approach are used to construct a classifier ensemble. Information granules are used to generate and train the base classifiers. The final ensemble output is produced by a weighted-sum combiner. Based on the experimental results, the proposed approach has improved the classification performance of the base classifiers for most of the data sets. Furthermore, a genetic algorithm is used to determine the combiner weights automatically.
Higher Committee for Education Development in Iraq (HCED)
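The granulation step at the core of the thesis can be illustrated with a minimal sketch of plain fuzzy c-means (this is the standard algorithm, without the class-aware modification or the coarsening step the thesis introduces; the toy data are hypothetical):

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means: returns cluster centres and the membership
    matrix U (n_samples x c), each row summing to 1."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m                                   # fuzzified memberships
        centres = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted means
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)                      # avoid division by zero
        inv = d ** (-2.0 / (m - 1.0))                 # standard FCM update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centres, U

# Two well-separated hypothetical groups of 2-D points.
X = np.vstack([np.random.default_rng(1).normal(0, 0.1, (20, 2)),
               np.random.default_rng(2).normal(5, 0.1, (20, 2))])
centres, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)
```

Each resulting granule is a centre together with the soft membership profile of the points around it; a coarsening pass would then merge the smallest of these granules into neighbouring ones to obtain a lower-granularity level.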
A multi-objective optimization approach for the synthesis of granular computing-based classification systems in the graph domain
The synthesis of a pattern recognition system usually aims at the optimization of a given performance index. However, in many real-world scenarios there are other desired facets to take into account. In this regard, multi-objective optimization is the main tool for optimizing different (and possibly conflicting) objective functions in order to seek potential trade-offs among them. In this paper, we propose a three-objective optimization problem for the synthesis of a granular computing-based pattern recognition system in the graph domain. The core pattern recognition engine searches for suitable information granules (i.e., recurrent and/or meaningful subgraphs from the training data), on top of which the graph embedding procedure towards the Euclidean space is performed. In the latter space, any classification system can be employed. The optimization problem aims at jointly optimizing the performance of the classifier, the number of information granules and the structural complexity of the classification model. Furthermore, we address the problem of selecting a suitable number of solutions from the resulting Pareto fronts in order to compose an ensemble of classifiers to be tested on previously unseen data. To perform this selection, we employ a multi-criteria decision-making routine, analysing different case studies that differ in how much weight each objective function carries in the ranking process. Results on five open-access datasets of fully labelled graphs show that exploiting the ensemble is effective (especially when the structural complexity of the model plays a minor role in the decision-making process) when compared against the baseline solution that solely aims at maximizing performance.
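The granulation-plus-embedding pipeline described above can be sketched in a highly simplified form (representing graphs as edge sets and using edge-set containment in place of full subgraph isomorphism are illustrative assumptions): each graph is embedded as a symbolic histogram with one component per mined information granule.

```python
def contains(graph_edges, granule_edges):
    """Crude containment test: every edge of the granule appears in the
    graph (a stand-in for proper subgraph-isomorphism checking)."""
    return granule_edges <= graph_edges

def embed(graph_edges, granules):
    """Symbolic-histogram embedding: one component per granule."""
    return [1 if contains(graph_edges, g) else 0 for g in granules]

# Hypothetical graphs given as edge sets, and mined granules (subgraphs).
g1 = {("a", "b"), ("b", "c"), ("c", "a")}
g2 = {("a", "b"), ("b", "d")}
granules = [{("a", "b")}, {("b", "c"), ("c", "a")}, {("b", "d")}]

print(embed(g1, granules))  # [1, 1, 0]
print(embed(g2, granules))  # [1, 0, 1]
```

Once graphs live in this Euclidean representation, any off-the-shelf classifier can be trained on the vectors, and the number of granules directly controls both the embedding dimension and part of the model's structural complexity, which is what the three-objective formulation trades off.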
New Fundamental Technologies in Data Mining
The progress of data mining technology and its broad public popularity establish a need for a comprehensive text on the subject. The series of books entitled "Data Mining" addresses this need by presenting in-depth descriptions of novel mining algorithms and many useful applications. Beyond helping readers understand each section deeply, the two books offer useful hints and strategies for solving the problems posed in the following chapters. The contributing authors have highlighted many future research directions that will foster multi-disciplinary collaborations and hence lead to significant developments in the field of data mining.
Transformation of graphical models to support knowledge transfer
Human experts are able to flexibly adjust their decision behaviour to the situation at hand. This capability pays off especially when decisions must be made under limited resources such as time restrictions. In such situations, it is particularly advantageous to adapt the representation of the underlying knowledge and to use decision models at different levels of abstraction. Furthermore, human experts are able to include not only uncertain information but also vague perceptions in decision making.
Classical decision-theoretic models are based on the concept of rationality, whereby the decision behaviour is prescribed by the principle of maximum expected utility: for each observation, an optimal decision function prescribes the action that maximizes expected utility. Modern graph-based methods such as Bayesian networks or influence diagrams make decision-theoretic methods attractive from a modelling perspective. Their main disadvantage is complexity: finding an optimal decision can become very expensive, and inference in decision networks is known to be NP-hard. This dissertation aims at combining the advantages of decision-theoretic models with rule-based systems by transforming a decision-theoretic model into a fuzzy rule-based system. Fuzzy rule bases can be evaluated efficiently, can approximate non-linear functional dependencies, and guarantee the interpretability of the resulting behaviour model. The translation of a decision model into a fuzzy rule base is supported by a new transformation process.
At first, an agent can apply a new parameterized structure learning algorithm, introduced in this work, to identify the structure of a Bayesian network. The subsequent use of preference learning approaches and the specification of probability information then enable the modelling of decision and utility nodes, yielding a consolidated decision-theoretic model. A transformation algorithm then compiles a rule base from this model, using an approximation measure that computes the expected utility loss as a quality criterion. The practical suitability of the concept is demonstrated with an example on the condition monitoring of a rotation spindle.
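The target representation of the transformation, an efficiently evaluable fuzzy rule base, can be sketched as follows (the membership functions, rule consequents, and spindle-inspired variable names are illustrative assumptions, not the dissertation's actual compiled rules):

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical rule base for a spindle-monitoring decision:
# antecedent memberships over a vibration level -> action score.
rules = [
    (lambda v: tri(v, -1, 0, 4),  0.0),   # vibration low    -> keep running
    (lambda v: tri(v, 2, 5, 8),   0.5),   # vibration medium -> inspect soon
    (lambda v: tri(v, 6, 10, 15), 1.0),   # vibration high   -> stop spindle
]

def evaluate(v):
    """Zero-order Takagi-Sugeno inference: the output is the
    firing-strength-weighted average of the rule consequents."""
    weights = [mu(v) for mu, _ in rules]
    total = sum(weights)
    if total == 0:
        return None  # no rule fires for this input
    return sum(w * out for w, (_, out) in zip(weights, rules)) / total
```

Evaluating such a rule base is a handful of arithmetic operations per rule, which is the efficiency argument for compiling an NP-hard-to-query decision network into this form; the approximation measure mentioned above would then quantify how much expected utility such a compiled rule base loses against the original model.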
Intelligent Systems
This book is dedicated to intelligent systems of broad-spectrum application, such as personal and social biosafety or the use of intelligent sensory micro-nanosystems such as "e-nose", "e-tongue" and "e-eye". The systems covered in this book also benefit effective information acquisition, knowledge management and improved knowledge transfer in any medium, as well as the modelling of information content using meta- and hyper-heuristics and semantic reasoning. Intelligent systems can also be applied in education, for generating intelligent distributed eLearning architectures, as well as in a large number of technical fields, such as industrial design, manufacturing and utilization, e.g., in precision agriculture, cartography, electric power distribution systems, intelligent building management systems, drilling operations, etc. Furthermore, they ease decision making using fuzzy logic models, the computational recognition of comprehension uncertainty, the joint synthesis of goals and means of intelligent behaviour in biosystems, and diagnostic and human support in the healthcare environment.
Active provenance for data intensive research
The role of provenance information in data-intensive research is a significant topic of
discussion among technical experts and scientists. Typical use cases addressing traceability,
versioning and reproducibility of the research findings are extended with more
interactive scenarios in support, for instance, of computational steering and results
management. In this thesis we investigate the impact that lineage records can have on
the early phases of the analysis, for instance performed through near-real-time systems
and Virtual Research Environments (VREs) tailored to the requirements of a specific
community. By positioning provenance at the centre of the computational research
cycle, we highlight the importance of having mechanisms at the data scientists' side
that, by integrating with the abstractions offered by the processing technologies, such
as scientific workflows and data-intensive tools, facilitate the experts' contribution to
the lineage at runtime. Ultimately, by encouraging tuning and use of provenance for
rapid feedback, the thesis aims at improving the synergy between different user groups
to increase productivity and understanding of their processes.
We present a model of provenance, called S-PROV, that uses and further extends
PROV and ProvONE. The relationships and properties characterising the workflow's
abstractions and their concrete executions are re-elaborated to include aspects related
to delegation, distribution and steering of stateful streaming operators. The model is
supported by the Active framework for tuneable and actionable lineage ensuring the
user's engagement by fostering rapid exploitation. Here, concepts such as provenance
types, configuration and explicit state management allow users to capture complex
provenance scenarios and activate selective controls based on domain and user-defined
metadata. We outline how the traces are recorded in a new comprehensive system,
called S-ProvFlow, enabling different classes of consumers to explore the provenance
data with services and tools for monitoring, in-depth validation and comprehensive
visual-analytics. The work of this thesis will be discussed in the context of an existing
computational framework and the experience matured in implementing provenance-aware
tools for seismology and climate VREs. It will continue to evolve through
newly funded projects, thereby providing generic and user-centred solutions for data-intensive
research.
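The idea of tuneable, selective lineage capture at the data scientist's side can be sketched as follows (the decorator, trace schema, and rule format are illustrative assumptions and not the actual S-PROV / S-ProvFlow API): a user-defined rule over an operator's metadata decides, at runtime, whether each invocation is recorded.

```python
import functools
import time
import uuid

TRACE = []  # in-memory stand-in for a provenance store such as S-ProvFlow

def provenance(capture_if=lambda meta: True):
    """Record an operator invocation only when the user-defined rule on
    its metadata says so (selective, metadata-driven capture)."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(data, meta):
            out = fn(data, meta)
            if capture_if(meta):
                TRACE.append({
                    "id": str(uuid.uuid4()),
                    "operator": fn.__name__,
                    "time": time.time(),
                    "meta": meta,
                    "n_in": len(data),
                    "n_out": len(out),
                })
            return out
        return inner
    return wrap

@provenance(capture_if=lambda meta: meta.get("domain") == "seismology")
def band_pass(data, meta):
    """Hypothetical streaming operator: keep samples inside a band."""
    lo, hi = meta["band"]
    return [x for x in data if lo <= x <= hi]

band_pass([1, 5, 9, 20], {"domain": "seismology", "band": (2, 10)})
band_pass([1, 5, 9, 20], {"domain": "climate", "band": (2, 10)})
print(len(TRACE))  # only the seismology call was recorded
```

The key design point this sketch mirrors is that the capture rule lives with the domain expert rather than inside the workflow engine, so the volume and granularity of recorded lineage can be tuned per community without changing the operators themselves.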
Engineering Systems Integration
Dreamers may envision our future, but it is the pragmatists who build it. Solve the right problem in the right way, mankind moves forward. Solve the right problem in the wrong way or the wrong problem in the right way, however clever or ingenious the solution, neither credits mankind. Instead, this misfire demonstrates a failure to appreciate a crucial step in pragmatic problem solving: systems integration. The first book to address the underlying premises of systems integration and how to exposit them in a practical and productive manner, Engineering Systems Integration: Theory, Metrics, and Methods looks at the fundamental nature of integration, exposes the subtle premises to achieve integration, and posits a substantial theoretical framework that is both simple and clear. Offering systems managers and systems engineers the framework from which to consider their decisions in light of systems integration metrics, the book isolates two basic questions, 1) Is there a way to express the interplay of human actions and the result of system interactions of a product with its environment?, and 2) Are there methods that combine to improve the integration of systems? The author applies the four axioms of General Systems Theory (holism, decomposition, isomorphism, and models) and explores the domains of history and interpretation to devise a theory of systems integration, develop practical guidance applying the three frameworks, and formulate the mathematical constructs needed for systems integration. The practicalities of integrating parts when we build or analyze systems mandate an analysis and evaluation of existing integrative frameworks of causality and knowledge. Integration is not just a word that describes a best practice, an art, or a single discipline. The act of integrating is an approach, operative in all disciplines, in all we see, in all we do
- …