3,005 research outputs found

    Deep Learning with Dynamically Weighted Loss Function for Sensor-Based Prognostics and Health Management

    Deep learning has been applied to prognostics and health management in the automotive and aerospace domains with promising results. The literature in this area reveals that most contributions are largely focused on model architecture, whereas contributions improving other aspects of deep learning, such as custom loss functions for prognostics and health management, remain scarce. There is therefore an opportunity to improve the effectiveness of deep learning for system prognostics and diagnostics without modifying the models' architecture. To address this gap, two dynamically weighted loss functions, a newly proposed weighting mechanism and a focal loss function, are investigated for prognostics and diagnostics tasks. A dynamically weighted loss function modifies the learning process by augmenting the loss with a weight corresponding to the learning error of each data instance. The objective is to force deep learning models to focus on the instances where larger learning errors occur in order to improve their performance. The two loss functions are evaluated using four popular deep learning architectures, namely a deep feedforward neural network, a one-dimensional convolutional neural network, a bidirectional gated recurrent unit and a bidirectional long short-term memory, on the commercial modular aero-propulsion system simulation (C-MAPSS) data from NASA and the air pressure system failure data for Scania trucks. Experimental results show that the dynamically weighted loss functions achieve significant improvements in remaining useful life prediction and fault detection rate over predictions from non-weighted loss functions.
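
    A minimal sketch of the idea in Python, assuming a simple error-based weighting rule of the form 1 + alpha * |error| (illustrative only, not the paper's exact formulation) alongside the standard binary focal loss for comparison; function names such as `dynamically_weighted_mse` are hypothetical:

```python
import numpy as np

def dynamically_weighted_mse(y_true, y_pred, alpha=1.0):
    """Squared-error loss where each instance is re-weighted by its own
    current error, so samples the model predicts poorly contribute more.
    The weighting rule (1 + alpha * |error|) is an assumed example, not
    the formulation proposed in the paper."""
    err = y_pred - y_true
    weights = 1.0 + alpha * np.abs(err)   # larger error -> larger weight
    return np.mean(weights * err ** 2)

def focal_loss(y_true, p_pred, gamma=2.0, eps=1e-7):
    """Binary focal loss: the (1 - p_t)^gamma factor down-weights
    well-classified instances, focusing learning on hard ones."""
    p_pred = np.clip(p_pred, eps, 1.0 - eps)
    p_t = np.where(y_true == 1, p_pred, 1.0 - p_pred)
    return np.mean(-((1.0 - p_t) ** gamma) * np.log(p_t))

# toy usage: regression targets (e.g. remaining useful life) and class labels
y, y_hat = np.array([0.2, 0.8, 1.5]), np.array([0.3, 0.5, 2.5])
print(dynamically_weighted_mse(y, y_hat))
labels, probs = np.array([1, 0, 1]), np.array([0.9, 0.2, 0.4])
print(focal_loss(labels, probs))
```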

    Towards a Taxonomic Benchmarking Framework for Predictive Maintenance: The Case of NASA’s Turbofan Degradation

    The availability of datasets for analytical solution development is a common bottleneck in data-driven predictive maintenance. Novel solutions are therefore mostly based on synthetic benchmarking examples, such as NASA's C-MAPSS datasets, on which researchers from disciplines such as artificial intelligence and statistics apply and test their methodological approaches. The majority of studies, however, evaluate only the overall solution against a final prediction score; we argue that a more fine-grained view is required, one that distinguishes between detailed method components to measure their individual impact along the prognostic development process. To address this issue, we first conduct a literature review, identifying more than one hundred studies that use the C-MAPSS datasets. Subsequently, we apply a taxonomy approach to derive dimensions and characteristics that decompose complex analytical solutions into more manageable components. The result is a first draft of a systematic benchmarking framework that provides a more comparable basis for future development and evaluation.

    AxonDeepSeg: automatic axon and myelin segmentation from microscopy data using convolutional neural networks

    Segmentation of axon and myelin from microscopy images of the nervous system provides useful quantitative information about the tissue microstructure, such as axon density and myelin thickness. This could be used, for instance, to document cell morphometry across species or to validate novel non-invasive quantitative magnetic resonance imaging techniques. Most currently available segmentation algorithms are based on standard image processing and usually require multiple processing steps and/or parameter tuning by the user to adapt to different modalities. Moreover, only a few methods are publicly available. We introduce AxonDeepSeg, open-source software that performs axon and myelin segmentation of microscopy images using deep learning. AxonDeepSeg features: (i) a convolutional neural network architecture; (ii) an easy training procedure to generate new models based on manually labelled data; and (iii) two ready-to-use models trained on scanning electron microscopy (SEM) and transmission electron microscopy (TEM) data. Results show high pixel-wise accuracy across various species: 85% on rat SEM, 81% on human SEM, 95% on mouse TEM and 84% on macaque TEM. Segmentation of a full rat spinal cord slice is computed, and morphological metrics are extracted and compared against the literature. AxonDeepSeg is freely available at https://github.com/neuropoly/axondeepseg. Comment: 14 pages, 7 figures.
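
    A small sketch of the pixel-wise accuracy metric reported above, assuming integer label masks (label scheme assumed here, e.g. 0 = background, 1 = myelin, 2 = axon); this is not AxonDeepSeg's own API, just an illustration of the evaluation:

```python
import numpy as np

def pixelwise_accuracy(pred_mask, true_mask):
    """Fraction of pixels whose predicted class label matches the
    ground-truth label. Both masks are integer arrays of equal shape."""
    assert pred_mask.shape == true_mask.shape
    return float(np.mean(pred_mask == true_mask))

# toy 4x4 example: 14 of 16 pixels agree -> accuracy 0.875
truth = np.array([[0, 0, 1, 1],
                  [0, 2, 2, 1],
                  [0, 2, 2, 1],
                  [0, 0, 1, 1]])
pred = truth.copy()
pred[0, 0] = 1   # two deliberately mislabelled pixels
pred[3, 3] = 2
print(pixelwise_accuracy(pred, truth))  # 0.875
```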

    Designing a NISQ reservoir with maximal memory capacity for volatility forecasting

    Forecasting the CBOE volatility index (VIX) is a highly non-linear and memory-intensive task. In this paper, we use quantum reservoir computing to forecast the VIX from S&P 500 (SPX) time series. Our reservoir is a hybrid quantum-classical system executed on IBM's 53-qubit Rochester chip. We encode the SPX values in the rotation angles and linearly combine the average spin of the six-qubit register to predict the value of the VIX at the next time step. Our results demonstrate a potential application of noisy intermediate-scale quantum (NISQ) devices to complex, real-world problems.
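
    A classically simulated toy sketch of the encode-then-linear-readout pipeline described above, written in Python with NumPy. It assumes a specific input-to-angle scaling (pi * tanh) and ignores entanglement and hardware noise, so it only illustrates the structure of the method, not the hybrid system run on the Rochester chip; all function names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def reservoir_features(x_window, n_qubits=6):
    """Toy stand-in for the quantum reservoir: each scaled input in the
    window sets a rotation angle on one qubit, and the feature is that
    qubit's expected spin <Z> = cos(angle) after an R_y(angle) rotation
    of |0>. The scaling pi * tanh(x) is an assumed encoding."""
    angles = np.pi * np.tanh(x_window[:n_qubits])
    return np.cos(angles)

def fit_linear_readout(spx_windows, vix_next):
    """Least-squares linear combination of the reservoir features
    (plus a bias term) that predicts the VIX at the next time step."""
    feats = np.array([reservoir_features(w) for w in spx_windows])
    X = np.hstack([feats, np.ones((len(feats), 1))])
    coef, *_ = np.linalg.lstsq(X, vix_next, rcond=None)
    return coef

# synthetic usage with random "SPX" windows and "VIX" targets
windows = rng.normal(size=(50, 6))
targets = rng.normal(size=50)
w = fit_linear_readout(windows, targets)
preds = np.cos(np.pi * np.tanh(windows)) @ w[:-1] + w[-1]
```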