
    Neuron Sensitivity Guided Test Case Selection for Deep Learning Testing

    Full text link
    Deep Neural Networks (DNNs) have been widely deployed in software to address various tasks (e.g., autonomous driving, medical diagnosis). However, they can also produce incorrect behaviors that result in financial losses and even threaten human safety. To reveal and repair incorrect behaviors in DNNs, developers often collect rich unlabeled datasets from the natural world and label them to test the DNN models. However, properly labeling a large number of unlabeled samples is highly expensive and time-consuming. To address this problem, we propose NSS, Neuron Sensitivity guided test case Selection, which reduces labeling time by selecting valuable test cases from unlabeled datasets. NSS leverages the internal neuron information induced by test cases to select those most likely to cause the model to behave incorrectly. We evaluate NSS on four widely used datasets and four well-designed DNN models against SOTA baseline methods. The results show that NSS performs well in assessing test cases' probability of fault triggering and their model-improvement capability. Specifically, compared with baseline approaches, NSS obtains a higher fault detection rate (e.g., when selecting 5% of test cases from the unlabeled dataset in the MNIST & LeNet1 experiment, NSS obtains an 81.8% fault detection rate, 20% higher than the baselines).
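
    The abstract does not define the sensitivity metric itself; the sketch below illustrates one plausible reading, scoring each unlabeled input by how strongly a chosen hidden layer's activations shift under a small random perturbation, then keeping the top 5% for labeling. The names model, model.fc1, and unlabeled_batch are hypothetical placeholders, not the paper's API.

        import torch

        def neuron_sensitivity_scores(model, layer, inputs, eps=0.01):
            """Score inputs by activation shift under a small perturbation
            (an assumed stand-in for the paper's neuron-sensitivity metric)."""
            acts = {}
            handle = layer.register_forward_hook(
                lambda mod, inp, out: acts.__setitem__("out", out.detach()))
            with torch.no_grad():
                model(inputs)
                clean = acts["out"]
                model(inputs + eps * torch.randn_like(inputs))
                noisy = acts["out"]
            handle.remove()
            # Larger activation shift -> more sensitive -> more likely fault-triggering.
            return (noisy - clean).flatten(1).norm(dim=1)

        # scores = neuron_sensitivity_scores(model, model.fc1, unlabeled_batch)
        # budget = int(0.05 * len(unlabeled_batch))   # the 5% selection budget
        # picked = scores.topk(budget).indices        # cases to send for labeling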

    Detection and Localization of Leaks in Water Networks

    Get PDF
    Today, 844 million humans around the world have no access to safe drinking water, and every 90 seconds a child dies from a water-related illness. Major cities lose 15%–50% of their water, and in some cases losses may reach up to 70%, mostly due to leaks. It is therefore paramount to preserve water as an invaluable resource through water networks, particularly in large cities where leak repair may cause disruption. Municipalities usually tackle leak problems using various detection systems and technologies, often long after leaks occur; such efforts are not enough to detect leaks at early stages. The main objectives of the present research are therefore to develop and validate a leak detection system and to optimize leak repair prioritization. The development of the leak detection models goes through several phases: (1) technology and device selection, (2) experimental work, (3) signal analysis, (4) selection of parameters, (5) machine learning model development, and (6) validation of the developed models. To detect leaks, vibration signals are collected through a variety of controlled experiments on PVC and ductile iron pipelines using wireless accelerometers, i.e., micro-electro-mechanical sensors (MEMS). The signals are analyzed to pinpoint leaks in water pipelines. Similarly, acoustic signals are collected from a pilot project in the city of Montreal using noise loggers as a second detection technology, and these signals are likewise analyzed to detect and pinpoint leaks. The leak detection system has shown promising results with both technologies: the MEMS model pinpoints leaks to within 12 centimeters of the exact location, while the noise-logger model locates an actual leak within a 25-cm radius. The leak repair prioritization model uses two optimization techniques: (1) a well-known genetic algorithm and (2) the Lazy Serpent Algorithm, newly developed in the present research, which proved capable of surpassing the genetic algorithm by finding a more optimal schedule in much less computation time. This research shows that automated real-time leak detection is feasible and can help governments save water resources and funds, and it demonstrates the viability of accelerometers as a standalone leak detection technology, opening the door to further research and experimentation. The leak detection system helps municipalities and water resource agencies detect leaks in real time as they occur, the pinpointing models facilitate repair by precisely determining where the work should be conducted, and the Lazy Serpent Algorithm helps municipalities distribute their resources to maximize the desired benefits.
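
    A minimal sketch of the signal-to-classifier pipeline the thesis describes: summarize each accelerometer trace with a few spectral features, then train a supervised model on labeled leak/no-leak experiments. The 1 kHz sampling rate, the feature set, and the random forest are illustrative assumptions, not the thesis's actual parameters.

        import numpy as np
        from scipy.signal import welch
        from sklearn.ensemble import RandomForestClassifier

        def vibration_features(trace, fs=1000):
            """Reduce one MEMS accelerometer trace to simple spectral features."""
            freqs, psd = welch(trace, fs=fs, nperseg=256)
            return np.array([
                psd.sum(),                     # total vibration energy
                freqs[np.argmax(psd)],         # dominant frequency
                np.sqrt(np.mean(trace ** 2)),  # RMS amplitude
            ])

        # X = np.vstack([vibration_features(t) for t in labeled_traces])
        # clf = RandomForestClassifier().fit(X, y)   # y: leak / no-leak labels
        # p_leak = clf.predict_proba(vibration_features(new_trace).reshape(1, -1))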

    Health-PRIOR: An Intelligent Ensemble Architecture to Identify Risk Cases in Healthcare

    Get PDF
    Smart city environments, when applied to healthcare, improve the quality of people's lives, enabling, for instance, disease prediction and treatment monitoring. In medical settings, case prioritization is of great importance, with beneficial outcomes both for patient health and for physicians' daily work. Recommender systems are an alternative for automatically integrating the data generated in such environments with predictive models and recommending actions, content, or services. The data produced by smart devices are accurate and reliable for predictive and decision-making contexts. The main purpose of this study is to assist patients and doctors in the early detection of disease or the prediction of postoperative worsening through constant monitoring. To achieve this objective, this study proposes an architecture for recommender systems applied to healthcare that can prioritize emergency cases. The architecture brings an ensemble approach to prediction, adopting multiple Machine Learning algorithms. The methodology followed three steps: first, a systematic literature mapping; second, the construction and development of the architecture; and third, an evaluation through two case studies. The results demonstrated the feasibility of the proposal. The predictions are promising and adherent to the application context for accurate datasets with little noise and few missing values.
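
    A minimal sketch of the ensemble idea, assuming scikit-learn: several learners vote on whether a monitored case is an emergency, and incoming cases are ranked by the ensemble's predicted probability. The three algorithms here are placeholders; the architecture does not prescribe these specific models.

        from sklearn.ensemble import RandomForestClassifier, VotingClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.tree import DecisionTreeClassifier

        ensemble = VotingClassifier(
            estimators=[
                ("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier()),
                ("dt", DecisionTreeClassifier()),
            ],
            voting="soft",  # average the predicted probabilities
        )
        # ensemble.fit(X_train, y_train)   # X: smart-device readings, y: risk labels
        # Rank incoming cases so the most probable emergencies surface first.
        # priority = ensemble.predict_proba(X_stream)[:, 1].argsort()[::-1]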

    Cyber Security and Critical Infrastructures 2nd Volume

    Get PDF
    The second volume of the book contains the manuscripts accepted for publication in the MDPI Special Topic "Cyber Security and Critical Infrastructure" after a rigorous peer-review process. Authors from academia, government, and industry contributed their innovative solutions, consistent with the interdisciplinary nature of cybersecurity. The book contains 16 articles: an editorial that explains the current challenges, innovative solutions, and real-world experiences involving critical infrastructure, and 15 original papers that present state-of-the-art solutions to attacks on critical systems.

    Magnetoencephalography as a tool in psychiatric research: current status and perspective

    Get PDF
    The application of neuroimaging to provide mechanistic insights into circuit dysfunctions in major psychiatric conditions and the development of biomarkers are core challenges in current psychiatric research. In this review, we propose that recent technological and analytic advances in Magnetoencephalography (MEG), a technique which allows the measurement of neuronal events directly and non-invasively with millisecond resolution, provide novel opportunities to address these fundamental questions. Because of its potential in delineating normal and abnormal brain dynamics, MEG is a crucial tool for advancing our understanding of the pathophysiological mechanisms of major neuropsychiatric conditions, such as schizophrenia, autism spectrum disorders, and the dementias. We summarize the mechanisms underlying the generation of MEG signals and the tools available to reconstruct generators and underlying networks using advanced source-reconstruction techniques. We then survey recent studies that have utilized MEG to examine aberrant rhythmic activity in neuropsychiatric disorders. This is followed by links with preclinical research, which has highlighted possible neurobiological mechanisms, such as disturbances in excitation/inhibition parameters, that could account for measured changes in neural oscillations. In the final section of the paper, we discuss challenges as well as novel methodological developments that could pave the way for a widespread application of MEG in translational research, with the aim of developing biomarkers for early detection and diagnosis.
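
    As a concrete anchor for the oscillation analyses surveyed here, the sketch below computes band-limited spectral power for a single sensor time series. This is a generic illustration rather than a method from the review, and the sampling rate and gamma-band edges are assumptions.

        import numpy as np
        from scipy.signal import welch

        def band_power(signal, fs, lo, hi):
            """Mean spectral power of one channel within a frequency band."""
            freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
            mask = (freqs >= lo) & (freqs <= hi)
            return psd[mask].mean()

        # e.g., gamma-band power, one rhythmic measure reported in this literature
        # gamma = band_power(channel_data, fs=1000.0, lo=30.0, hi=80.0)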

    Quadri-dimensional approach for data analytics in mobile networks

    Get PDF
    The telecommunication market is growing at a very fast pace with the evolution of new technologies that support high-speed throughput and a wide range of services and applications in mobile networks. This has led communication service providers (CSPs) to shift their focus from monitoring network elements towards monitoring services and subscribers' satisfaction by introducing service quality management (SQM) and customer experience management (CEM). Both require fast responses to find and solve network problems, to ensure efficient and proactive maintenance, and to improve the quality of service (QoS) and quality of experience (QoE) of subscribers. While both SQM and CEM demand information from multiple interfaces, managing multiple data sources adds an extra layer of complexity to data collection. Although several studies on data analytics in mobile networks have been conducted, most did not consider analytics based on the four dimensions involved in the mobile network environment, namely the subscriber, the handset, the service, and the network element, with multi-interface correlation. The main objective of this research was to develop mobile network analytics models applied to the 3G packet-switched domain by analysing data from the radio network (Iub interface) and the core network (Gn interface) to provide a fast root cause analysis (RCA) approach covering all four dimensions. This was achieved using recent computer engineering advancements, namely Big Data platforms and data mining techniques through machine learning algorithms.
    Electrical and Mining Engineering, M. Tech. (Electrical Engineering)
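
    A minimal sketch of the four-dimensional correlation step, assuming pandas and hypothetical flat exports of the two interfaces (real Iub/Gn records come from network probes, and the column names are invented for illustration):

        import pandas as pd

        radio = pd.read_csv("iub_records.csv")  # radio-network side (Iub)
        core = pd.read_csv("gn_records.csv")    # packet-switched core side (Gn)

        # Correlate both interfaces on the four analysis dimensions so a degraded
        # KPI can be traced to a subscriber, handset, service or network element.
        joined = radio.merge(core, on=["subscriber_id", "handset_type",
                                       "service", "network_element"])
        worst = (joined.groupby("network_element")["failure_flag"]
                       .mean()
                       .sort_values(ascending=False))
        # print(worst.head())  # elements ranked by failure rate, for fast RCA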

    Mapping the Structure and Evolution of Software Testing Research Over the Past Three Decades

    Full text link
    Background: The field of software testing is growing and rapidly evolving. Aims: Based on keywords assigned to publications, we seek to identify predominant research topics and understand how they are connected and have evolved. Method: We apply co-word analysis to map the topology of testing research as a network in which author-assigned keywords are connected by edges indicating co-occurrence in publications. Keywords are clustered based on edge density and frequency of connection. We examine the most popular keywords, summarize clusters into high-level research topics, examine how topics connect, and examine how the field is changing. Results: Testing research can be divided into 16 high-level topics and 18 subtopics. Creation guidance, automated test generation, evolution and maintenance, and test oracles have particularly strong connections to other topics, highlighting their multidisciplinary nature. Emerging keywords relate to web and mobile apps, machine learning, energy consumption, automated program repair, and test generation, while emerging connections link web apps, test oracles, and machine learning with many topics. Random and requirements-based testing show potential decline. Conclusions: Our observations, advice, and map data offer a deeper understanding of the field and inspiration regarding challenges and connections to explore.
    Comment: To appear, Journal of Systems and Software.
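
    The mapping step is easy to make concrete. A minimal sketch, assuming networkx: keywords become nodes, co-occurrence in a publication adds edge weight, and a community-detection pass stands in for the study's own density-based clustering criteria.

        import itertools
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        def coword_graph(keyword_lists):
            """Connect author-assigned keywords that co-occur in a publication."""
            g = nx.Graph()
            for kws in keyword_lists:
                for a, b in itertools.combinations(sorted(set(kws)), 2):
                    w = g[a][b]["weight"] + 1 if g.has_edge(a, b) else 1
                    g.add_edge(a, b, weight=w)
            return g

        papers = [["test oracles", "machine learning", "web apps"],
                  ["test generation", "machine learning"],
                  ["test oracles", "web apps"]]
        g = coword_graph(papers)
        topics = greedy_modularity_communities(g, weight="weight")  # keyword clusters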

    Machine Learning and Integrative Analysis of Biomedical Big Data.

    Get PDF
    Recent developments in high-throughput technologies have accelerated the accumulation of massive amounts of omics data from multiple sources: genome, epigenome, transcriptome, proteome, metabolome, etc. Traditionally, data from each source (e.g., genome) are analyzed in isolation using statistical and machine learning (ML) methods. Integrative analysis of multi-omics and clinical data is key to new biomedical discoveries and advancements in precision medicine. However, data integration poses new computational challenges and exacerbates those associated with single-omics studies. Specialized computational approaches are required to effectively and efficiently perform integrative analysis of biomedical data acquired from diverse modalities. In this review, we discuss state-of-the-art ML-based approaches for tackling five specific computational challenges associated with integrative analysis: the curse of dimensionality, data heterogeneity, missing data, class imbalance, and scalability issues.
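
    A minimal sketch of one early-integration strategy under these constraints, assuming scikit-learn: concatenate per-modality feature matrices, then let a single pipeline address missing data, dimensionality, and class imbalance (three of the five challenges). The modality names and component counts are illustrative.

        import numpy as np
        from sklearn.pipeline import Pipeline
        from sklearn.impute import SimpleImputer
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LogisticRegression

        # X_genome, X_proteome: feature matrices over the same samples (assumed).
        # X = np.hstack([X_genome, X_proteome])   # early integration by concatenation
        pipe = Pipeline([
            ("impute", SimpleImputer(strategy="mean")),           # missing data
            ("reduce", PCA(n_components=50)),                     # dimensionality
            ("clf", LogisticRegression(class_weight="balanced",   # class imbalance
                                       max_iter=1000)),
        ])
        # pipe.fit(X, y)   # y: clinical outcome labels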