
    High-level Information Fusion for Constrained SMC Methods and Applications

    Information Fusion is a field that studies processes for utilizing data from various input sources, together with techniques for exploiting those data to produce estimates and knowledge about objects and situations. Human computation, in contrast, is a new and evolving research area that uses human intelligence to solve computational problems beyond the scope of existing artificial intelligence algorithms. In previous systems, the role of humans was mostly restricted to analysing a finished fusion product; in current systems, humans are an integral element of a distributed framework in which many tasks can be accomplished by either humans or machines. Moreover, some information can be provided only by humans, not machines, because the observational capabilities and opportunities of traditional electronic (hard) sensors are limited.

    A source-reliability-adaptive distributed non-linear estimation method applicable to a number of distributed state estimation problems is proposed. The proposed method requires only local data exchange among neighbouring sensor nodes, and therefore provides enhanced reliability, scalability, and ease of deployment. In particular, by taking into account the estimation reliability of each sensor node at any point in time, it yields a more robust distributed estimate. To perform Multi-Model Particle Filtering (MMPF) in an adaptive distributed manner, a Gaussian approximation of the particle cloud obtained at each sensor node, together with a weighted Consensus Propagation (CP)-based distributed data aggregation scheme, is used to dynamically re-weight the particle clouds. The filter is a soft-data-constrained variant of the multi-model particle filter, capable of processing both soft human-generated data and conventional hard sensory data. If persistent noise occurs in the estimates of a sensor node, whether due to a faulty sensing device or to misleading soft data, that node's contribution to the weighted consensus process is immediately reduced in order to limit its effect on the estimates of the neighbouring nodes and of the entire network. The robustness of the proposed source-reliability-adaptive distributed estimation method is demonstrated through simulation results for agile target tracking scenarios, where agility refers to cases in which the observed dynamics of targets deviate from the given probabilistic characterization. A sketch of the reliability-weighted consensus step follows.
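    The following is a minimal sketch of the reliability-weighted consensus idea described above: each node summarizes its particle cloud as a Gaussian, exchanges that summary with its neighbours, and nodes whose estimates persistently deviate from the local consensus are down-weighted. All function names, the exponential reliability-decay rule, and the parameter beta are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def gaussian_approx(particles, weights):
    """Summarize a weighted particle cloud by its mean and covariance."""
    mean = np.average(particles, axis=0, weights=weights)
    diff = particles - mean
    cov = (weights[:, None] * diff).T @ diff
    return mean, cov

def consensus_step(means, reliabilities, adjacency):
    """One round of weighted consensus: each node averages the Gaussian
    means of its neighbourhood, weighted by current reliability scores."""
    fused = np.empty_like(means)
    for i in range(len(means)):
        nbrs = np.flatnonzero(adjacency[i])  # neighbourhood includes node i
        fused[i] = np.average(means[nbrs], axis=0, weights=reliabilities[nbrs])
    return fused

def update_reliability(reliabilities, means, fused, beta=5.0):
    """Down-weight nodes whose local estimate persistently deviates from
    the neighbourhood consensus (faulty sensor or misleading soft data)."""
    err = np.linalg.norm(means - fused, axis=1)
    r = reliabilities * np.exp(-beta * err / (err.mean() + 1e-9))
    return r / r.max()  # keep scores in (0, 1] to avoid underflow

# Toy run: three fully connected nodes, 2-D state; node 2 has a faulty sensor.
rng = np.random.default_rng(0)
means = np.array([gaussian_approx(rng.normal(loc=m, scale=0.1, size=(500, 2)),
                                  np.full(500, 1 / 500))[0]
                  for m in ([0.0, 0.0], [0.0, 0.0], [5.0, 5.0])])
rel, A = np.ones(3), np.ones((3, 3))
for _ in range(10):
    fused = consensus_step(means, rel, A)
    rel = update_reliability(rel, means, fused)
print(rel)  # the faulty node's influence shrinks towards zero
```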
    Furthermore, the same concept is applied to a soft-data-constrained multiple-model Probability Hypothesis Density (PHD) filter that can track multiple agile targets with non-linear dynamics, which is a challenging problem. In this case, a Sequential Monte Carlo-Probability Hypothesis Density (SMC-PHD) filter deploys a Random Set (RS) theoretic formulation, along with a Sequential Monte Carlo approximation, a variant of Bayes filtering. In general, the performance of Bayesian filtering-based methods can be enhanced by incorporating extra information as explicit constraints in the filtering process. Following the same principle, the new approach uses a constrained variant of the SMC-PHD filter, in which a fuzzy logic approach transforms the inherently vague human-generated data into a set of constraints. These constraints are then enforced on the filtering process by applying them as coefficients to the particles' weights. Because the human-generated Soft Data (SD) reports on target-agility level, the proposed constrained-filtering approach can handle multiple agile target tracking scenarios.
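    As an illustration of the constraint mechanism just described, the sketch below turns a vague soft-data report on target agility into a fuzzy membership over a kinematic feature (here, turn rate) and applies the membership values as coefficients on the particle weights. The membership break points, the set names, and the choice of turn rate as the constrained feature are assumptions for illustration only.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Standard trapezoidal fuzzy membership function."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

# Hypothetical fuzzy sets over target turn rate (rad/s) for the agility
# levels reported in soft data; the break points are illustrative.
AGILITY_SETS = {
    "low":    lambda w: trapezoid(w, -0.01, 0.00, 0.05, 0.10),
    "medium": lambda w: trapezoid(w,  0.05, 0.10, 0.20, 0.30),
    "high":   lambda w: trapezoid(w,  0.20, 0.30, 1.00, 1.50),
}

def constrain_weights(weights, turn_rates, report):
    """Scale each particle's weight by its membership in the fuzzy set named
    by the human report, then renormalise (the constraint as a coefficient)."""
    mu = AGILITY_SETS[report](np.abs(turn_rates))
    w = weights * mu
    s = w.sum()
    return w / s if s > 0 else weights  # ignore a fully inconsistent report

# Example: a report of "high" agility suppresses slow-turning particles.
rng = np.random.default_rng(0)
w = constrain_weights(np.full(1000, 1e-3), rng.uniform(0.0, 1.0, 1000), "high")
```

    Because the memberships lie in [0, 1], the coefficients can only suppress particles that contradict the report; a completely uninformative report (membership 1 everywhere) leaves the filter unchanged.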

    Pro-active visualization of cyber security on a national level: a South African case study

    The need for increased national cyber security situational awareness is evident from the growing number of published national cyber security strategies. Governments are progressively seen as responsible for cyber security, but are at the same time increasingly constrained by legal, privacy, and resource considerations. Infrastructure and services that form part of the national cyber domain are often not under government control, necessitating information sharing between governments and commercial partners. While sharing security information is necessary, implementing it effectively typically takes considerable time.

    In an effort to decrease the time and effort required for cyber security situational awareness, this study considered commercially available data sources relating to a national cyber domain. Open source information is typically used, with great success, by attackers gathering information; an understanding of the data provided by these sources can likewise give decision makers the opportunity to set priorities more effectively. Using an adapted Joint Directors of Laboratories (JDL) fusion model, an experimental system was implemented that visualized the potential contribution of open source intelligence to cyber situational awareness.

    The datasets used to validate the model contained information obtained from eight different data sources over a two-year period, with a focus on the South African .co.za subdomain. Over a million infrastructure devices were examined in this study, along with information pertaining to a potential 88 million vulnerabilities on these devices. During the examination of the data sources, a severe lack of information regarding the human aspect of cyber security was identified, which led to the creation of a novel Personally Identifiable Information (PII) detection sensor. The resulting two million records pertaining to PII in the South African domain were incorporated into the data fusion experiment for processing, and the results are discussed in three case studies. They aim to highlight how data fusion and effective visualization can move national cyber security from a primarily reactive undertaking to a more pro-active model.
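    As a rough illustration of what a PII detection sensor of this kind might look for, the sketch below scans text for e-mail addresses and 13-digit South African identity numbers, using the identity number's standard Luhn check digit to discard random digit strings. The patterns and helper names are assumptions; the study's actual sensor is not described at this level of detail.

```python
import re

EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+(?:\.[\w-]+)+\b")
SA_ID = re.compile(r"\b\d{13}\b")  # South African ID numbers are 13 digits

def luhn_ok(digits: str) -> bool:
    """South African ID numbers end in a Luhn check digit; use it to cut
    false positives from arbitrary 13-digit strings."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_pii(text: str) -> dict:
    """Return candidate PII records found in a block of text."""
    return {
        "emails": EMAIL.findall(text),
        "sa_ids": [m for m in SA_ID.findall(text) if luhn_ok(m)],
    }

# find_pii("contact jan@example.co.za, ID 8001015009087")
# -> {'emails': ['jan@example.co.za'], 'sa_ids': ['8001015009087']}
```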