
    Convolutional Neural Networks Applied to Neutrino Events in a Liquid Argon Time Projection Chamber

    We present several studies of convolutional neural networks applied to data coming from the MicroBooNE detector, a liquid argon time projection chamber (LArTPC). The algorithms studied include the classification of single particle images, the localization of single particle and neutrino interactions in an image, and the detection of a simulated neutrino event overlaid with cosmic ray backgrounds taken from real detector data. These studies demonstrate the potential of convolutional neural networks for particle identification or event detection on simulated neutrino interactions. We also address technical issues that arise when applying this technique to data from a large LArTPC at or near ground level.
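
    As an illustration of the classification task described above, the following is a minimal sketch, in PyTorch, of a small CNN that assigns a single-particle LArTPC image to one of a handful of particle classes. The architecture, input size, and class count are illustrative assumptions, not the network used by MicroBooNE.

import torch
import torch.nn as nn

class ParticleCNN(nn.Module):
    """Toy single-particle classifier for LArTPC-style images (assumed design)."""
    def __init__(self, n_classes: int = 5):          # e.g. e, mu, gamma, pi, p (assumed classes)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                            # x: (B, 1, H, W) wire-vs-time image
        return self.classifier(self.features(x))

scores = ParticleCNN()(torch.randn(2, 1, 256, 256))  # (2, 5) class scores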

    Combination of satellite imagery and wind data in deep learning approach to detect oil spills

    The ocean is vulnerable to oil-related activities such as oil production and transport that can harm the environment, and environmental damage from oil spills can be large if not dealt with. Radar satellite images are useful for detecting oil spills because they cover both day and night and penetrate clouds. However, detecting oil spills in ocean areas from satellite images is not a trivial task, due to the abundance of look-alikes from natural sources such as river inputs or geological seepage. In the manual detection process, auxiliary data such as the wind speed in the monitored area are used to separate oil spills from naturally occurring slicks. One solution is to apply artificial intelligence techniques such as convolutional neural networks, which are a natural candidate for an automatic oil detection process. However, convolutional neural networks have trouble distinguishing spilled oil from look-alikes. This project explores the possibility of detecting oil spills from satellite images and distinguishing spilled oil from naturally occurring slicks by using wind speed data for the area. The convolutional neural network takes in both the satellite image and auxiliary wind speed data for the monitored area. Two convolutional neural networks are designed and set up, one including auxiliary wind speed data and one without. Both CNNs see the same satellite images and oil spills, so that a direct comparison can be made between them. This work also serves as a proof of concept for an automated oil spill detection process that uses wind data in addition to satellite images. To measure any difference in validation loss, precision, or recall from using wind data, both networks are tuned to the same recall, so that the number of false negatives is as low as possible for both. The comparison shows that the network that includes wind data has 15% lower validation loss and slightly higher precision than the network without wind data. However, this result is achieved using wind data derived from the satellite image itself, which meteorological wind data is not. A comparison using meteorological wind data instead of wind data derived from the satellite image is left as future work worth exploring.
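
    A minimal sketch of the kind of two-branch network the thesis describes, assuming a PyTorch implementation: one CNN branch encodes the satellite image, an auxiliary branch encodes the wind speed of the monitored area, and the two are concatenated before the classification head. Layer sizes, input shapes, and the wind-speed range are illustrative assumptions, not the thesis setup.

import torch
import torch.nn as nn

class OilSpillCNN(nn.Module):
    """Toy oil spill vs. look-alike classifier with optional wind-speed input (assumed design)."""
    def __init__(self, use_wind: bool = True):
        super().__init__()
        self.use_wind = use_wind
        self.backbone = nn.Sequential(                 # image branch
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.wind = nn.Sequential(nn.Linear(1, 8), nn.ReLU())  # auxiliary wind branch
        self.head = nn.Linear(32 + (8 if use_wind else 0), 2)  # oil spill vs. look-alike

    def forward(self, image, wind_speed=None):
        feats = self.backbone(image)                   # (B, 32) image features
        if self.use_wind:
            feats = torch.cat([feats, self.wind(wind_speed)], dim=1)
        return self.head(feats)

# Direct comparison on the same inputs, mirroring the with/without-wind setup:
x = torch.randn(4, 1, 128, 128)                        # SAR patches (assumed size)
w = torch.rand(4, 1) * 20.0                            # wind speed in m/s (assumed range)
logits_with_wind = OilSpillCNN(True)(x, w)
logits_without_wind = OilSpillCNN(False)(x)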

    Vanishing Point Detection with Direct and Transposed Fast Hough Transform inside the neural network

    In this paper, we propose a new neural network architecture for vanishing point detection in images. The key element is the use of the direct and transposed Fast Hough Transforms separated by convolutional layer blocks with standard activation functions. This allows the network to produce its answer in the coordinates of the input image, so that the coordinates of the vanishing point can be obtained simply by selecting the maximum of the output. We also prove that the transposed Fast Hough Transform can be computed using the direct one. The use of integral operators enables the neural network to rely on global rectilinear features in the image, so it is well suited to detecting vanishing points. To demonstrate the effectiveness of the proposed architecture, we use a set of images from a DVR and show its superiority over existing methods. Note, in addition, that the proposed architecture essentially repeats the process of direct and back projection used, for example, in computed tomography.
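
    A hedged sketch of the pipeline described above, in PyTorch: convolutional blocks in image space, a Hough transform into line-parameter space, further convolutions there, a transposed Hough transform (back-projection) into image coordinates, and an argmax to read off the vanishing point. A plain dense discrete Hough operator stands in for the paper's Fast Hough Transform, and all sizes are illustrative assumptions.

import math
import torch
import torch.nn as nn

def hough_matrix(size: int, n_theta: int, n_rho: int) -> torch.Tensor:
    """Dense (n_theta*n_rho, size*size) 0/1 matrix; each row sums the pixels near one line."""
    ys, xs = torch.meshgrid(torch.arange(size), torch.arange(size), indexing="ij")
    xs, ys = xs.flatten().float(), ys.flatten().float()
    rho_max = math.sqrt(2.0) * size
    rows = []
    for t in range(n_theta):
        theta = math.pi * t / n_theta
        rho = xs * math.cos(theta) + ys * math.sin(theta)        # signed distance of each pixel
        for r in range(n_rho):
            target = rho_max * (2.0 * r / n_rho - 1.0)
            rows.append(((rho - target).abs() < rho_max / n_rho).float())
    return torch.stack(rows)

class HoughVPNet(nn.Module):
    """Toy vanishing-point network: conv -> Hough -> conv -> transposed Hough -> conv -> argmax."""
    def __init__(self, size: int = 32, n_theta: int = 32, n_rho: int = 32):
        super().__init__()
        self.size, self.n_theta, self.n_rho = size, n_theta, n_rho
        self.register_buffer("H", hough_matrix(size, n_theta, n_rho))
        def conv_block():
            return nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(8, 1, 3, padding=1))
        self.pre, self.mid, self.post = conv_block(), conv_block(), conv_block()

    def forward(self, x):                                        # x: (B, 1, size, size)
        b = x.shape[0]
        f = self.pre(x).flatten(1)                               # convolutions in image space
        h = (f @ self.H.t()).view(b, 1, self.n_theta, self.n_rho)  # direct Hough transform
        h = self.mid(h).flatten(1)                               # convolutions in parameter space
        y = (h @ self.H).view(b, 1, self.size, self.size)        # transposed Hough (back-projection)
        heat = self.post(y)                                      # heatmap in input-image coordinates
        idx = heat.flatten(1).argmax(dim=1)                      # vanishing point = maximum response
        return heat, torch.stack((idx % self.size, idx // self.size), dim=1)  # (x, y) in pixels

heatmap, vp = HoughVPNet()(torch.rand(1, 1, 32, 32))             # vp: (1, 2) pixel coordinates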