3,637 research outputs found

    A Comprehensive Survey of Deep Learning in Remote Sensing: Theories, Tools and Challenges for the Community

    In recent years, deep learning (DL), a re-branding of neural networks (NNs), has risen to the top in numerous areas, such as computer vision (CV), speech recognition, and natural language processing. While remote sensing (RS) possesses a number of unique challenges, primarily related to sensors and applications, it inevitably draws from many of the same theories as CV, e.g., statistics, fusion, and machine learning, to name a few. This means that the RS community should be aware of, if not at the leading edge of, advancements like DL. Herein, we provide the most comprehensive survey of state-of-the-art RS DL research. We also review recent developments in the DL field that can be used in DL for RS. Namely, we focus on theories, tools and challenges for the RS community. Specifically, we focus on unsolved challenges and opportunities as they relate to (i) inadequate data sets, (ii) human-understandable solutions for modelling physical phenomena, (iii) Big Data, (iv) non-traditional heterogeneous data sources, (v) DL architectures and learning algorithms for spectral, spatial and temporal data, (vi) transfer learning, (vii) an improved theoretical understanding of DL systems, (viii) high barriers to entry, and (ix) training and optimizing DL models. Comment: 64 pages, 411 references. To appear in Journal of Applied Remote Sensing.

    Continuous-variable quantum neural networks

    We introduce a general method for building neural networks on quantum computers. The quantum neural network is a variational quantum circuit built in the continuous-variable (CV) architecture, which encodes quantum information in continuous degrees of freedom such as the amplitudes of the electromagnetic field. This circuit contains a layered structure of continuously parameterized gates which is universal for CV quantum computation. Affine transformations and nonlinear activation functions, two key elements in neural networks, are enacted in the quantum network using Gaussian and non-Gaussian gates, respectively. The non-Gaussian gates provide both the nonlinearity and the universality of the model. Due to the structure of the CV model, the CV quantum neural network can encode highly nonlinear transformations while remaining completely unitary. We show how a classical network can be embedded into the quantum formalism and propose quantum versions of various specialized models such as convolutional, recurrent, and residual networks. Finally, we present numerous modeling experiments built with the Strawberry Fields software library. These experiments, including a classifier for fraud detection, a network which generates Tetris images, and a hybrid classical-quantum autoencoder, demonstrate the capability and adaptability of CV quantum neural networks.
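    As a rough illustration of the layer structure this abstract describes, the sketch below builds a single CV layer in Strawberry Fields: an interferometer, squeezing, a second interferometer, and displacements implement the affine (Gaussian) part, and a Kerr gate supplies the non-Gaussian nonlinearity. The gate parameters, mode count, and Fock cutoff are arbitrary placeholders, not values from the paper.

```python
# Minimal sketch of one continuous-variable (CV) neural-network layer,
# assuming the Strawberry Fields library; all parameters are arbitrary.
import strawberryfields as sf
from strawberryfields.ops import BSgate, Rgate, Sgate, Dgate, Kgate

prog = sf.Program(2)  # two optical modes ~ two "neurons"

with prog.context as q:
    # Affine part (Gaussian gates): interferometer -> squeezing -> interferometer -> displacement
    BSgate(0.4, 0.0) | (q[0], q[1])   # first interferometer (beamsplitter)
    Rgate(0.3)       | q[0]
    Sgate(0.1)       | q[0]           # squeezing ~ singular values of a weight matrix
    Sgate(0.2)       | q[1]
    BSgate(0.5, 0.1) | (q[0], q[1])   # second interferometer
    Rgate(0.2)       | q[1]
    Dgate(0.1)       | q[0]           # displacement ~ bias term
    Dgate(0.1)       | q[1]
    # Nonlinear activation (non-Gaussian gate)
    Kgate(0.05)      | q[0]           # Kerr gate provides the nonlinearity
    Kgate(0.05)      | q[1]

# Simulate in a truncated Fock basis and inspect one mode
eng = sf.Engine("fock", backend_options={"cutoff_dim": 6})
state = eng.run(prog).state
print("Mean photon number, mode 0:", state.mean_photon(0))
```

    Stacking several such parameterized layers yields the kind of layered variational circuit the abstract describes; how the parameters are trained is not shown here.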

    Applying Deep Learning to Fast Radio Burst Classification

    Upcoming Fast Radio Burst (FRB) surveys will search $\sim 10^3$ beams on sky with very high duty cycle, generating large numbers of single-pulse candidates. The abundance of false positives presents an intractable problem if candidates are to be inspected by eye, making it a good application for artificial intelligence (AI). We apply deep learning to single-pulse classification and develop a hierarchical framework for ranking events by their probability of being true astrophysical transients. We construct a tree-like deep neural network (DNN) that takes multiple or individual data products as input (e.g. dynamic spectra and multi-beam detection information) and trains on them simultaneously. We have built training and test sets using false-positive triggers from real telescopes, along with simulated FRBs and single pulses from pulsars. Training of the DNN was done independently for two radio telescopes: the CHIME Pathfinder, and Apertif on Westerbork. High accuracy and recall can be achieved with a labelled training set of a few thousand events. Even with high triggering rates, classification can be done very quickly on Graphical Processing Units (GPUs). That speed is essential for selective voltage dumps or issuing real-time VOEvents. Next, we investigate whether dedispersion back-ends could be completely replaced by a real-time DNN classifier. It is shown that a single forward propagation through a moderate convolutional network could be faster than brute-force dedispersion, but the low signal-to-noise per pixel makes such a classifier sub-optimal for this problem. Real-time automated classification may prove useful for bright, unexpected signals, both now and in the era of radio astronomy when data volumes and the searchable parameter spaces further outgrow our ability to manually inspect the data, such as for SKA and ngVLA.
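    To make the multi-input idea concrete, here is a hedged sketch (not the authors' actual architecture) of a Keras model that takes a dynamic spectrum and a multi-beam detection vector as separate inputs and merges them before classifying a candidate. Input shapes, layer sizes, and names are illustrative assumptions.

```python
# Illustrative multi-input single-pulse classifier, assuming TensorFlow 2.x / Keras.
# Shapes and layer sizes are placeholders, not the architecture from the paper.
import tensorflow as tf
from tensorflow.keras import layers, Model

# Branch 1: dynamic spectrum (frequency x time image)
spec_in = tf.keras.Input(shape=(64, 256, 1), name="dynamic_spectrum")
x = layers.Conv2D(16, 3, activation="relu")(spec_in)
x = layers.MaxPooling2D(2)(x)
x = layers.Conv2D(32, 3, activation="relu")(x)
x = layers.GlobalAveragePooling2D()(x)

# Branch 2: multi-beam detection information (e.g. S/N per beam)
beam_in = tf.keras.Input(shape=(40,), name="multibeam_snr")
y = layers.Dense(32, activation="relu")(beam_in)

# Merge branches and output a probability of a true astrophysical transient
merged = layers.concatenate([x, y])
z = layers.Dense(64, activation="relu")(merged)
out = layers.Dense(1, activation="sigmoid", name="p_astro")(z)

model = Model(inputs=[spec_in, beam_in], outputs=out)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

    Because the branches are separate Keras inputs, either data product can also be used on its own, loosely mirroring the abstract's "multiple or individual data products" framing.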

    Unmasking Clever Hans Predictors and Assessing What Machines Really Learn

    Current learning machines have successfully solved hard application problems, reaching high accuracy and displaying seemingly "intelligent" behavior. Here we apply recent techniques for explaining decisions of state-of-the-art learning machines and analyze various tasks from computer vision and arcade games. This showcases a spectrum of problem-solving behaviors ranging from naive and short-sighted, to well-informed and strategic. We observe that standard performance evaluation metrics can be oblivious to distinguishing these diverse problem-solving behaviors. Furthermore, we propose our semi-automated Spectral Relevance Analysis that provides a practically effective way of characterizing and validating the behavior of nonlinear learning machines. This helps to assess whether a learned model indeed delivers reliably for the problem that it was conceived for. Finally, our work intends to add a voice of caution to the ongoing excitement about machine intelligence and pledges to evaluate and judge some of these recent successes in a more nuanced manner. Comment: Accepted for publication in Nature Communications.
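    The Spectral Relevance Analysis mentioned above groups per-sample explanation heatmaps to surface distinct prediction strategies. As a loose sketch of that grouping step only (the heatmaps would come from an explanation method such as layer-wise relevance propagation, which is not computed here), one could spectrally cluster flattened heatmaps with scikit-learn; everything below, including the placeholder data, is an assumption for illustration.

```python
# Loose sketch: spectral clustering of per-sample relevance heatmaps,
# assuming scikit-learn; `heatmaps` is a placeholder array standing in for
# explanations (n_samples, height, width) produced elsewhere, e.g. by LRP.
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)
heatmaps = rng.random((200, 28, 28))        # stand-in for real relevance maps

X = heatmaps.reshape(len(heatmaps), -1)     # flatten each heatmap to a vector
X /= np.linalg.norm(X, axis=1, keepdims=True) + 1e-12  # normalize per sample

clusters = SpectralClustering(
    n_clusters=4, affinity="nearest_neighbors", random_state=0
).fit_predict(X)

# Inspect cluster sizes; a suspiciously coherent cluster of heatmaps focused on
# an irrelevant image region would be a candidate "Clever Hans" strategy.
print(np.bincount(clusters))
```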