
    GRU-based denoising autoencoder for detection and clustering of unknown single and concurrent faults during system integration testing of automotive software systems

    Recently, remarkable successes have been achieved in the quality assurance of automotive software systems (ASSs) through real-time hardware-in-the-loop (HIL) simulation. The HIL platform enables safe, flexible, and reliable realistic simulation during the system development process. However, notwithstanding this test-automation capability, HIL test executions generate large amounts of recorded data. Expert-knowledge-based approaches to analyzing these recordings in order to detect and identify faults are costly in time, effort, and difficulty. Therefore, this study proposes a novel deep-learning-based methodology so that faults in automotive sensor signals can be detected and identified efficiently, automatically, and without human intervention. Concretely, a hybrid GRU-based denoising autoencoder (GRU-based DAE) model combined with the k-means algorithm is developed for the fault-detection and clustering problem in sequential data. Based on real-time historical data, not only individual faults but also unknown simultaneous faults under noisy conditions can thereby be accurately detected and clustered. The applicability and advantages of the proposed method for the HIL testing process are demonstrated in two automotive case studies: a high-fidelity gasoline engine and vehicle dynamics system, along with an entire vehicle model, are used to verify the performance of the proposed model. The results show the superiority of the proposed architecture over other autoencoder variants in terms of reconstruction error under several noise levels, and the validation results indicate that the proposed model achieves higher detection and clustering accuracy for unknown faults than stand-alone techniques.
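    A minimal sketch of the hybrid idea, assuming PyTorch and scikit-learn: a GRU encoder-decoder is trained on a denoising objective, and its latent codes are then clustered with k-means. All layer sizes, the noise level, and the toy data are illustrative assumptions, not the paper's configuration.

    ```python
    # Hedged sketch: GRU-based denoising autoencoder + k-means on latent codes.
    import torch
    import torch.nn as nn
    from sklearn.cluster import KMeans

    class GRUDenoisingAE(nn.Module):
        def __init__(self, n_features, hidden_size, latent_size):
            super().__init__()
            self.encoder = nn.GRU(n_features, hidden_size, batch_first=True)
            self.to_latent = nn.Linear(hidden_size, latent_size)
            self.from_latent = nn.Linear(latent_size, hidden_size)
            self.decoder = nn.GRU(hidden_size, hidden_size, batch_first=True)
            self.output = nn.Linear(hidden_size, n_features)

        def forward(self, x_noisy):
            _, h = self.encoder(x_noisy)            # h: (1, batch, hidden)
            z = self.to_latent(h.squeeze(0))        # one latent code per sequence
            # Repeat the latent state across time steps to seed the decoder.
            seed = self.from_latent(z).unsqueeze(1).repeat(1, x_noisy.size(1), 1)
            dec, _ = self.decoder(seed)
            return self.output(dec), z

    model = GRUDenoisingAE(n_features=4, hidden_size=32, latent_size=8)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x_clean = torch.randn(64, 50, 4)                # (batch, time, sensors); toy data
    for _ in range(5):                              # brief illustrative training loop
        x_noisy = x_clean + 0.1 * torch.randn_like(x_clean)
        recon, _ = model(x_noisy)
        loss = nn.functional.mse_loss(recon, x_clean)  # denoising objective
        opt.zero_grad(); loss.backward(); opt.step()

    # Cluster latent codes to group unknown (possibly concurrent) fault types.
    with torch.no_grad():
        _, z = model(x_clean)
    labels = KMeans(n_clusters=3, n_init=10).fit_predict(z.numpy())
    ```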

    A Functional Architecture Approach to Neural Systems

    The technology for designing systems that perform extremely complex combinations of real-time functionality has developed over a long period. It is based on a hardware architecture with a physical separation between memory and processing, and a software architecture that divides functionality into a disciplined hierarchy of software components exchanging unambiguous information. This technology has difficulty with the design of systems that perform parallel processing, and extreme difficulty with systems that can heuristically change their own functionality. These limitations derive from the approach to information exchange between functional components. A design approach in which functional components can exchange ambiguous information leads to systems with the recommendation architecture, which are less subject to these limitations. Biological brains have been constrained by natural pressures to adopt functional architectures with this different information-exchange approach. Neural networks have not made a complete shift to the use of ambiguous information, and do not adequately manage context for ambiguous information exchange between modules; as a result, such networks cannot be scaled to complex functionality. Simulations of systems with the recommendation architecture demonstrate the capability to organize heuristically to perform complex functionality.
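    Purely as a toy illustration (not from the paper), the contrast between unambiguous command passing and recommendation-style exchange might look like this: modules emit weighted, possibly conflicting suggestions, and behaviour is selected by aggregating them rather than by a single authoritative call. All module names and weights below are invented.

    ```python
    # Toy sketch of recommendation-style information exchange between modules.
    from collections import defaultdict

    def visual_module(percept):
        # Ambiguous output: several candidate interpretations with weights.
        if "dark" in percept:
            return [("obstacle_ahead", 0.7), ("shadow", 0.3)]
        return [("clear", 0.9)]

    def memory_module(percept):
        # A second module contributes its own (possibly conflicting) weights.
        return [("obstacle_ahead", 0.4), ("clear", 0.2)]

    def select_behaviour(recommendations):
        # Aggregate recommendation weights across modules; act on the strongest.
        totals = defaultdict(float)
        for label, weight in recommendations:
            totals[label] += weight
        return max(totals, key=totals.get)

    percept = "dark shape"
    recs = visual_module(percept) + memory_module(percept)
    print(select_behaviour(recs))  # -> "obstacle_ahead"
    ```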

    Causal Inference in Disease Spread across a Heterogeneous Social System

    Diffusion processes in complex systems are governed by external triggers and internal dynamics. Timely and cost-effective control of infectious disease spread critically relies on uncovering the underlying diffusion mechanisms, which is challenging because of the invisible causality between events and their time-evolving intensity. We infer causal relationships between infections and quantify the reflexivity of a meta-population, i.e., the level of feedback on event occurrences exerted by its internal dynamics (the likelihood of a regional outbreak being triggered by previous cases). These are enabled by our newly proposed model, the Latent Influence Point Process (LIPP), which models disease spread by incorporating macro-level internal dynamics of meta-populations based on human mobility. We analyse 15 years of dengue cases in Queensland, Australia. Our causal inference shows that outbreaks are more likely driven by statewide global diffusion over time, leading to complex disease-spread behavior. In terms of reflexivity, precursory growth and symmetric decline in populous regions are attributed to slow but persistent feedback on preceding outbreaks via inter-group dynamics, while abrupt growth but sharp decline in peripheral areas is driven by rapid but inconstant feedback via intra-group dynamics. Our proposed model reveals probabilistic causal relationships between discrete events based on intra- and inter-group dynamics, and covers both direct and indirect diffusion processes (contact-based and vector-borne disease transmission).
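    The LIPP itself is not reproduced here, but the reflexivity notion it quantifies builds on self-exciting point processes. A minimal Hawkes-style intensity with an exponential kernel, using illustrative parameters, looks like this; the branching ratio alpha plays the role of reflexivity (alpha near 1 means events are mostly driven by feedback on preceding events).

    ```python
    # Generic self-exciting (Hawkes-style) intensity: baseline plus feedback.
    # LIPP adds latent intra-/inter-group influence terms on top of this core.
    import numpy as np

    def hawkes_intensity(t, event_times, mu=0.2, alpha=0.5, beta=1.0):
        """lambda(t) = mu + alpha * sum_i beta * exp(-beta * (t - t_i)).

        mu    : exogenous (external trigger) rate
        alpha : reflexivity -- expected offspring events per event
        beta  : decay rate of each event's influence
        """
        past = event_times[event_times < t]
        return mu + alpha * np.sum(beta * np.exp(-beta * (t - past)))

    events = np.array([1.0, 1.5, 4.2])   # toy regional case-report times
    print(hawkes_intensity(5.0, events))
    ```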

    Malfunction diagnosis in industrial process systems using data mining for knowledge discovery

    Detecting abnormal behavior in the process industries is attracting increasing interest, as process systems are regularly subject to strict regulations and highly competitive operating conditions. A synergetic approach to exploring the behavior of industrial processes is proposed, targeting the discovery of patterns and the implementation of fault (malfunction) diagnosis. The patterns are based on highly correlated time series. The concept rests on the fact that, if independent time series are combined according to rules, scenarios of functional and non-functional situations can be extracted so as to monitor hazardous procedures occurring in workplaces. The selected methods are combined and applied to historically stored experimental data from a chemical pilot plant located at CERTH/CPERI. The implementation of the clustering and classification methods showed promising results, determining potential abnormal situations with high accuracy (97%).
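    A minimal sketch of this cluster-then-classify pattern, using scikit-learn on made-up per-window features; the data, the models, and the two-cluster assumption are illustrative, not the plant's actual setup.

    ```python
    # Hedged sketch: discover behaviour patterns by clustering, then train a
    # classifier on the discovered groups so new windows can be scored online.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    # 200 windows x 6 sensors: per-window summary features of correlated series.
    features = rng.normal(size=(200, 6))
    features[150:] += 2.5                       # shifted block simulating malfunction

    # Step 1: unsupervised discovery of behaviour patterns (normal vs. abnormal).
    modes = KMeans(n_clusters=2, n_init=10).fit_predict(features)

    # Step 2: use the discovered groups as labels for an online classifier.
    clf = RandomForestClassifier(random_state=0).fit(features, modes)
    new_window = rng.normal(size=(1, 6)) + 2.5  # resembles the abnormal block
    print("predicted mode:", clf.predict(new_window)[0])
    ```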

    REISCH: incorporating lightweight and reliable algorithms into healthcare applications of WSNs

    Healthcare institutions require advanced technology to collect patients' data accurately and continuously. Traditional technologies still suffer from two problems: performance and security efficiency. Existing research has serious drawbacks when using public-key mechanisms such as digital signature algorithms. In this paper, we propose the Reliable and Efficient Integrity Scheme for Data Collection in HWSNs (REISCH) to alleviate these problems by using secure and lightweight signature algorithms. The results of the performance analysis indicate that our scheme provides high efficiency in data integrity between sensors and the server (saving more than 24% of alive sensors compared to traditional algorithms). Additionally, we use Automated Validation of Internet Security Protocols and Applications (AVISPA) to validate the security procedures in our scheme. The security analysis confirms that REISCH is safe against several well-known attacks.
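    The abstract does not name the exact lightweight signature algorithm, so the sketch below uses Ed25519 via Python's cryptography package purely as a stand-in for the sign-on-sensor / verify-on-server integrity flow; the reading format is invented.

    ```python
    # Hedged sketch of a lightweight-signature integrity check for sensor data.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # Sensor side: key generated at deployment; each reading is signed.
    sensor_key = Ed25519PrivateKey.generate()
    reading = b"patient=42;hr=71;spo2=98;t=1699999999"   # invented reading format
    signature = sensor_key.sign(reading)

    # Server side: verify integrity with the sensor's registered public key.
    public_key = sensor_key.public_key()
    try:
        public_key.verify(signature, reading)
        print("reading accepted")
    except InvalidSignature:
        print("reading rejected: integrity check failed")
    ```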