
    The University Defence Research Collaboration In Signal Processing

    This chapter describes the development of algorithms for automatic detection of anomalies from multi-dimensional, undersampled and incomplete datasets. The challenge in this work is to identify and classify behaviours as normal or abnormal, safe or threatening, from an irregular and often heterogeneous sensor network. Many defence and civilian applications can be modelled as complex networks of interconnected nodes with unknown or uncertain spatio-temporal relations. The behaviour of such heterogeneous networks can exhibit dynamic properties, reflecting evolution both in network structure (new nodes appearing and existing nodes disappearing) and in inter-node relations. The UDRC work has addressed not only the detection of anomalies but also the identification of their nature and their statistical characteristics. Normal patterns and changes in behaviour have been incorporated to provide an acceptable balance between true positive rate, false positive rate, performance and computational cost. Data quality measures have been used to ensure the models of normality are not corrupted by unreliable and ambiguous data. Exploiting the context of each node's activity in complex networks offers an even more efficient anomaly detection mechanism. This has allowed the development of efficient approaches which not only detect anomalies but also go on to classify their behaviour.
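
    The abstract above does not specify the UDRC algorithms in detail; as a loose illustration of the idea of scoring node activity against a model of normality, the sketch below flags nodes whose current behaviour deviates strongly from robust statistics of their past activity. All names, thresholds and data are illustrative assumptions.

    # A minimal sketch (not the UDRC algorithms) of scoring node activity against a
    # learned model of normality; node counts, thresholds and data are illustrative.
    import numpy as np

    def fit_normal_model(activity_history):
        """Estimate robust per-node location/scale from past activity (rows = time)."""
        median = np.median(activity_history, axis=0)
        mad = np.median(np.abs(activity_history - median), axis=0) + 1e-9
        return median, mad

    def score_anomalies(current_activity, median, mad, threshold=3.5):
        """Return robust z-scores and a boolean anomaly flag per node."""
        z = 0.6745 * (current_activity - median) / mad
        return z, np.abs(z) > threshold

    rng = np.random.default_rng(0)
    history = rng.normal(loc=10.0, scale=2.0, size=(200, 5))   # 200 time steps, 5 nodes
    current = np.array([9.5, 10.4, 25.0, 10.1, 9.8])           # node 2 behaves abnormally

    median, mad = fit_normal_model(history)
    z, flags = score_anomalies(current, median, mad)
    print(flags)   # e.g. [False False  True False False]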

    Sensor Signal and Information Processing III


    The University Defence Research Collaboration In Signal Processing: 2013-2018

    Signal processing is an enabling technology crucial to all areas of defence and security. It is called for whenever humans and autonomous systems are required to interpret data (i.e. the signal) output from sensors. This leads to the production of the intelligence on which military outcomes depend. Signal processing should be timely, accurate and suited to the decisions to be made. When performed well it is critical, battle-winning and probably the most important weapon you've never heard of. With the plethora of sensors and data sources emerging in the future network-enabled battlespace, sensing is becoming ubiquitous. This makes signal processing more complicated but also brings great opportunities. The second phase of the University Defence Research Collaboration in Signal Processing was set up to meet these complex problems head-on while taking advantage of the opportunities. Its unique structure combines two multi-disciplinary academic consortia, in which many researchers can approach different aspects of a problem, with baked-in industrial collaboration enabling early commercial exploitation. This phase of the UDRC will have been running for five years by the time it completes in March 2018, with remarkable results. This book aims to present those accomplishments and advances in a style accessible to stakeholders, collaborators and exploiters.

    Parallel Prediction Method of Knowledge Proficiency Based on Bloom’s Cognitive Theory

    Knowledge proficiency refers to the extent to which students master knowledge and reflects their cognitive status. To accurately assess knowledge proficiency, various pedagogical theories have emerged. Bloom's cognitive theory, proposed in 1956 as one of the classic theories, follows the cognitive progression from foundational to advanced levels, categorizing cognition into multiple tiers including "knowing", "understanding" and "applying", thereby constructing a hierarchical cognitive structure. This theory is predominantly employed to frame the design of teaching objectives and guide the implementation of teaching activities. Additionally, because of the large number of students in real-world online education systems, the time required to calculate knowledge proficiency is prohibitively high. To keep the method applicable in large-scale systems, a parallel prediction model for assessing knowledge proficiency is therefore needed. The research in this paper is grounded in Bloom's cognitive theory, and a Bloom Cognitive Diagnosis Parallel Model (BloomCDM) for calculating knowledge proficiency is designed on this basis. The model is founded on the concept of matrix decomposition. In the theoretical modeling phase, hierarchical and inter-hierarchical assumptions are first established, leading to the abstraction of the mathematical model. Subsequently, subject features are mapped onto the three-tier cognitive space of "knowing", "understanding" and "applying" to derive the posterior distribution of the target parameters. Once the objective function of the model is determined, both student and topic characteristic parameters are computed to obtain students' knowledge proficiency. During the modeling process, to formalize the mathematical expressions of "understanding" and "applying", the notions of "knowledge group" and "higher-order knowledge group" are introduced, along with a parallel method for identifying the structure of higher-order knowledge groups. Finally, the experiments in this paper validate that the model can accurately diagnose students' knowledge proficiency, affirming that integrating Bloom's cognitive hierarchy into knowledge proficiency assessment is scientifically sound and meaningful.
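
    BloomCDM itself is not specified in the abstract beyond being founded on matrix decomposition; the sketch below shows only the generic matrix-factorisation idea of recovering latent per-skill proficiencies from a student-by-item score matrix. The dimensions, learning rate, regularisation and data are illustrative assumptions, not the paper's model.

    # A minimal matrix-factorisation sketch of predicting proficiency from a
    # student-by-item score matrix; not the BloomCDM model itself.
    import numpy as np

    def factorise(R, n_skills=3, lr=0.05, reg=0.05, epochs=1000, seed=0):
        """Factorise R (students x items) into U (students x skills) and V (items x skills)."""
        rng = np.random.default_rng(seed)
        U = 0.1 * rng.standard_normal((R.shape[0], n_skills))
        V = 0.1 * rng.standard_normal((R.shape[1], n_skills))
        mask = ~np.isnan(R)                        # only observed responses contribute
        R0 = np.nan_to_num(R)
        for _ in range(epochs):
            E = mask * (R0 - U @ V.T)              # residual on observed entries
            U += lr * (E @ V - reg * U)            # gradient step on student factors
            V += lr * (E.T @ U - reg * V)          # gradient step on item factors
        return U, V

    R = np.array([[1.0, 0.8, np.nan],
                  [0.2, np.nan, 0.3],
                  [0.9, 0.7, 0.6]])                # fraction correct per student/item
    U, V = factorise(R)
    proficiency = U @ V.T                          # predicted proficiency on every item
    print(np.round(proficiency, 2))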

    Design of large polyphase filters in the Quadratic Residue Number System


    Methods for enhanced learning using wearable technologies. A study of the maritime sector

    Maritime safety is a critical concern because navigation errors or unsafe acts can have serious consequences for the crew, passengers, environment, and assets. Traditional training methods face challenges in the rapidly evolving maritime industry, and innovative training methods are being explored. This study explores the use of wearable sensors with biosignal data collection to improve training performance in the maritime sector. Three experiments were conducted progressively to investigate the relationship between navigators' experience levels and biosignal data, the effects of different training methods on cognitive workload, trainees' stress levels and decision-making skills, and the classification of scenario complexity using the biosignal data obtained from the trainees. The collected data comprised questionnaire data on stress levels, workload, and user satisfaction with auxiliary training equipment; performance evaluation data on navigational abilities, decision-making skills, and ship-handling abilities; and biosignal data, including electrodermal activity (EDA), body temperature, blood volume pulse (BVP), inter-beat interval (IBI), and heart rate (HR). Several statistical methods and machine-learning algorithms were used in the data analysis. The present dissertation contributes to the advancement of the field of maritime education and training by exploring methods for enhancing learning in complex situations. The use of biosignal data provides insights into the interplay between stress levels and training outcomes in the maritime industry. The proposed conceptual training model underscores the relationship between trainees' stress and safety factors and offers a framework for the development and evaluation of advanced biosignal-data-based training systems.
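
    As a rough illustration of the kind of machine-learning analysis mentioned above (classifying scenario complexity from biosignal data), the sketch below trains a random-forest classifier on synthetic heart-rate, EDA and heart-rate-variability features. The features, labelling rule and data are placeholders, not the dissertation's dataset or method.

    # A minimal sketch of classifying scenario complexity from simple biosignal
    # features; everything here is synthetic and illustrative.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n = 300
    features = np.column_stack([
        rng.normal(75, 10, n),     # mean HR (bpm)
        rng.normal(0.4, 0.15, n),  # mean EDA (microsiemens)
        rng.normal(50, 15, n),     # HRV proxy from inter-beat intervals (ms)
    ])
    # Illustrative rule: higher HR and EDA loosely indicate a more complex scenario.
    labels = (features[:, 0] + 100 * features[:, 1] > 118).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(features, labels, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))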

    Temperature aware power optimization for multicore floating-point units


    Advances and Applications of Dezert-Smarandache Theory (DSmT) for Information Fusion (Collected Works), Vol. 4

    The fourth volume on Advances and Applications of Dezert-Smarandache Theory (DSmT) for information fusion collects theoretical and applied contributions of researchers working in different fields of application and in mathematics. The contributions (see the list of articles published in this book, at the end of the volume) have been published or presented in international conferences, seminars, workshops and journals after the dissemination of the third volume (2009, http://fs.unm.edu/DSmT-book3.pdf). The first part of this book presents the theoretical advancement of DSmT, dealing with belief functions, conditioning and deconditioning, the Analytic Hierarchy Process, decision making, multi-criteria analysis, evidence theory, combination rules, evidence distance, conflicting belief, sources of evidence with different importance and reliabilities, importance of sources, the pignistic probability transformation, qualitative reasoning under uncertainty, imprecise belief structures, the 2-tuple linguistic label, the Electre Tri method, hierarchical proportional redistribution, basic belief assignments, subjective probability measures, Smarandache codification, neutrosophic logic, outranking methods, Dempster-Shafer Theory, the Bayes fusion rule, frequentist probability, mean square error, controlling factors, optimal assignment solutions, data association, the Transferable Belief Model, and others. More applications of DSmT have emerged in the years since the appearance of the third DSmT book in 2009. Accordingly, the second part of this volume is about applications of DSmT in connection with Electronic Support Measures, belief functions, sensor networks, ground moving target and multiple target tracking, Vehicle-Borne Improvised Explosive Devices, the Belief Interacting Multiple Model filter, seismic and acoustic sensors, Support Vector Machines, alarm classification, the ability of the human visual system, the Uncertainty Representation and Reasoning Evaluation Framework, threat assessment, handwritten signature verification, automatic aircraft recognition, Dynamic Data-Driven Application Systems, adjustment of secure communication trust analysis, and so on. Finally, the third part presents a list of references related to DSmT, published or presented over the years since its inception in 2004, in chronological order.
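
    DSmT builds on and generalises classical belief-function combination; as background only, the sketch below implements the classic Dempster rule of combination for two basic belief assignments over a small frame. DSmT's own rules (e.g. PCR5) handle conflict differently, so this is not DSmT itself, and the masses used are illustrative.

    # A minimal Dempster-rule sketch for combining two basic belief assignments
    # (BBAs) over a two-hypothesis frame; masses are illustrative.
    from itertools import product

    def dempster_combine(m1, m2):
        """Combine two BBAs given as {frozenset: mass} over the same frame."""
        combined, conflict = {}, 0.0
        for (a, ma), (b, mb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb                      # mass assigned to the empty set
        norm = 1.0 - conflict
        return {k: v / norm for k, v in combined.items()}, conflict

    A, B = frozenset({"A"}), frozenset({"B"})
    theta = A | B                                        # the whole frame (ignorance)
    m1 = {A: 0.6, B: 0.1, theta: 0.3}                    # evidence from source 1
    m2 = {A: 0.5, B: 0.3, theta: 0.2}                    # evidence from source 2
    fused, conflict = dempster_combine(m1, m2)
    print(fused, "conflict mass:", conflict)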

    Exploring the Landscape of Ubiquitous In-home Health Monitoring: A Comprehensive Survey

    Ubiquitous in-home health monitoring systems have become popular in recent years due to the rise of digital health technologies and the growing demand for remote health monitoring. These systems enable individuals to increase their independence by allowing them to monitor their health from home and giving them more control over their well-being. In this study, we perform a comprehensive survey on this topic by reviewing a large body of literature in the area. We investigate these systems from various aspects, namely sensing technologies, communication technologies, intelligent and computing systems, and application areas. Specifically, we provide an overview of in-home health monitoring systems and identify their main components. We then present each component and discuss its role within in-home health monitoring systems. In addition, we provide an overview of the practical use of ubiquitous technologies in the home for health monitoring. Finally, we identify the main challenges and limitations based on the existing literature and provide eight recommendations for potential future research directions toward the development of in-home health monitoring systems. We conclude that, despite extensive research on the various components involved, the development of effective in-home health monitoring systems still requires further investigation.

    Overview of Bayesian sequential Monte Carlo methods for group and extended object tracking

    This work presents the current state of the art in techniques for tracking a number of objects moving in a coordinated and interacting fashion. Groups are structured objects characterized by particular motion patterns. A group can comprise a small number of interacting objects (e.g. pedestrians, sport players, a convoy of cars) or hundreds or thousands of components, such as crowds of people. Group object tracking is closely linked with extended object tracking, but at the same time has particular features which differentiate it from extended objects. Extended objects, such as those in maritime surveillance, are characterized by their kinematic states and their size or volume. Both group and extended objects give rise to a varying number of measurements and require trajectory maintenance. An emphasis is given here to sequential Monte Carlo (SMC) methods and their variants. Methods for small groups and for large groups are presented, including Markov chain Monte Carlo (MCMC) methods, the random matrices approach and Random Finite Set Statistics methods. Efficient real-time implementations are discussed which are able to deal with the high dimensionality and provide high accuracy. Future trends and avenues are traced.
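
    As a minimal illustration of the sequential Monte Carlo machinery the survey emphasises, the sketch below runs a bootstrap particle filter on a single one-dimensional state with a random-walk motion model. Group and extended object trackers build on this with far richer state, measurement and data-association models; all noise levels and data here are illustrative.

    # A minimal bootstrap particle filter sketch (predict, weight, resample) for a
    # single 1-D state; not a group or extended object tracker.
    import numpy as np

    rng = np.random.default_rng(2)

    def particle_filter(measurements, n_particles=500, proc_std=0.5, meas_std=1.0):
        particles = rng.normal(0.0, 5.0, n_particles)                         # initial prior
        estimates = []
        for z in measurements:
            particles = particles + rng.normal(0.0, proc_std, n_particles)    # predict
            weights = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)        # weight by likelihood
            weights /= weights.sum()
            idx = rng.choice(n_particles, n_particles, p=weights)             # resample
            particles = particles[idx]
            estimates.append(particles.mean())
        return np.array(estimates)

    true_positions = np.cumsum(rng.normal(0.2, 0.3, 50))        # simulated trajectory
    measurements = true_positions + rng.normal(0.0, 1.0, 50)    # noisy observations
    print(particle_filter(measurements)[-5:])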