
    A Deep Cascade of Convolutional Neural Networks for MR Image Reconstruction

    The acquisition of Magnetic Resonance Imaging (MRI) is inherently slow. Inspired by recent advances in deep learning, we propose a framework for reconstructing MR images from undersampled data using a deep cascade of convolutional neural networks to accelerate the data acquisition process. We show that for Cartesian undersampling of 2D cardiac MR images, the proposed method outperforms state-of-the-art compressed sensing approaches, such as dictionary learning-based MRI (DLMRI) reconstruction, in terms of reconstruction error, perceptual quality and reconstruction speed for both 3-fold and 6-fold undersampling. Compared to DLMRI, the proposed method produces roughly half the reconstruction error, preserving anatomical structures more faithfully. Using our method, each image can be reconstructed in 23 ms, which is fast enough to enable real-time applications.
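    The cascade described above alternates a learned CNN refinement with a data-consistency step that re-inserts the acquired k-space samples. The sketch below illustrates that structure; it is not the authors' released code, and the channel counts, depths and number of cascades are assumptions.

```python
# Minimal sketch of a deep cascade of CNNs with data-consistency layers.
# All hyperparameters (features, depth, n_cascades) are illustrative assumptions.
import torch
import torch.nn as nn

class CNNBlock(nn.Module):
    def __init__(self, channels=2, features=64, depth=5):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(features, channels, 3, padding=1)]
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return x + self.net(x)  # residual refinement of the current reconstruction

def data_consistency(x, k0, mask):
    """Re-insert acquired k-space samples (where mask == 1) into the prediction."""
    # x: (B, 2, H, W) real/imaginary image, k0: acquired k-space, mask: sampling mask
    x_c = torch.complex(x[:, 0], x[:, 1])
    k = torch.fft.fft2(x_c)
    k = mask * k0 + (1 - mask) * k
    x_c = torch.fft.ifft2(k)
    return torch.stack([x_c.real, x_c.imag], dim=1)

class DeepCascade(nn.Module):
    def __init__(self, n_cascades=5):
        super().__init__()
        self.blocks = nn.ModuleList(CNNBlock() for _ in range(n_cascades))

    def forward(self, x, k0, mask):
        # Alternate CNN denoising and data consistency, as in a cascaded reconstruction.
        for block in self.blocks:
            x = block(x)
            x = data_consistency(x, k0, mask)
        return x
```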

    Transformer Networks for Trajectory Forecasting

    Most recent successes in forecasting people's motion are based on LSTM models, and most recent progress has been achieved by modelling the social interaction among people and the interaction of people with the scene. We question the use of LSTM models and propose the novel use of Transformer Networks for trajectory forecasting. This is a fundamental switch from the sequential step-by-step processing of LSTMs to the attention-only memory mechanisms of Transformers. In particular, we consider both the original Transformer Network (TF) and the larger Bidirectional Transformer (BERT), state-of-the-art on all natural language processing tasks. Our proposed Transformers predict the trajectories of the individual people in the scene. These are "simple" models because each person is modelled separately, without any complex human-human or scene interaction terms. In particular, the TF model without bells and whistles yields the best score on the largest and most challenging trajectory forecasting benchmark, TrajNet. Additionally, its extension, which predicts multiple plausible future trajectories, performs on par with more engineered techniques on the 5 datasets of ETH + UCY. Finally, we show that Transformers can deal with missing observations, as may be the case with real sensor data. Code is available at https://github.com/FGiuliari/Trajectory-Transformer.
    Comment: 18 pages, 3 figures
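    The sketch below illustrates the general idea of applying a causal Transformer encoder to an observed trajectory and decoding future positions autoregressively; it is not the authors' code, and the embedding size, head count, learned positional encoding and greedy rollout are assumptions.

```python
# Minimal sketch: a Transformer encoder over past (x, y) offsets predicting the next offset.
import torch
import torch.nn as nn

class TrajectoryTransformer(nn.Module):
    def __init__(self, d_model=128, nhead=8, num_layers=6, max_len=64):
        super().__init__()
        self.embed = nn.Linear(2, d_model)                          # (x, y) offset -> embedding
        self.pos = nn.Parameter(torch.zeros(max_len, 1, d_model))   # learned positional encoding (assumption)
        layer = nn.TransformerEncoderLayer(d_model, nhead)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 2)                           # next (x, y) offset

    def forward(self, obs):                                         # obs: (T, B, 2)
        T = obs.size(0)
        causal = torch.triu(torch.full((T, T), float("-inf"), device=obs.device), diagonal=1)
        h = self.embed(obs) + self.pos[:T]
        h = self.encoder(h, mask=causal)                            # attention restricted to the past
        return self.head(h[-1])                                     # prediction for the next step

@torch.no_grad()
def rollout(model, obs, n_future=12):
    """Greedily predict n_future offsets, feeding each prediction back as input."""
    seq = obs.clone()
    preds = []
    for _ in range(n_future):
        nxt = model(seq).unsqueeze(0)                               # (1, B, 2)
        preds.append(nxt)
        seq = torch.cat([seq, nxt], dim=0)
    return torch.cat(preds, dim=0)                                  # (n_future, B, 2)
```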

    When Kernel Methods meet Feature Learning: Log-Covariance Network for Action Recognition from Skeletal Data

    Human action recognition from skeletal data is a hot research topic, important in many open-domain applications of computer vision thanks to recently introduced 3D sensors. In the literature, naive methods simply transfer off-the-shelf techniques from video to the skeletal representation. However, the current state of the art is contended between two different paradigms: kernel-based methods and feature learning with (recurrent) neural networks. Both approaches show strong performance, yet they exhibit heavy, but complementary, drawbacks. Motivated by this fact, our work aims at combining the best of the two paradigms by proposing an approach in which a shallow network is fed with a covariance representation. Our intuition is that, as long as the dynamics are effectively modelled, the classification network need not be deep nor recurrent in order to score favorably. We validate this hypothesis in a broad experimental analysis over 6 publicly available datasets.
    Comment: 2017 IEEE Computer Vision and Pattern Recognition (CVPR) Workshop
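    The sketch below illustrates the pipeline suggested by the abstract: compute a covariance representation of the per-frame skeleton features, take its matrix logarithm, and feed the result to a shallow classifier. Feature dimensions, the regularisation term and the classifier width are assumptions, not details from the paper.

```python
# Minimal sketch: log-covariance features of a skeleton sequence fed to a shallow network.
import torch
import torch.nn as nn

def log_covariance(seq, eps=1e-4):
    """seq: (T, D) per-frame skeleton features -> (D, D) matrix log of their covariance."""
    seq = seq - seq.mean(dim=0, keepdim=True)
    cov = seq.T @ seq / (seq.shape[0] - 1) + eps * torch.eye(seq.shape[1])  # SPD regularisation
    evals, evecs = torch.linalg.eigh(cov)
    return (evecs * torch.log(evals)) @ evecs.T      # matrix log via the eigendecomposition

class ShallowCovNet(nn.Module):
    def __init__(self, dim, n_classes, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim * dim, hidden), nn.ReLU(inplace=True),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, logcov):                       # logcov: (B, D, D)
        return self.net(logcov.flatten(1))

# Usage on one sequence of 100 frames with 75 features (25 joints x 3 coordinates, assumed):
seq = torch.randn(100, 75)
feat = log_covariance(seq).unsqueeze(0)
logits = ShallowCovNet(dim=75, n_classes=60)(feat)
```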

    Fast Radio Burst 121102 Pulse Detection and Periodicity: A Machine Learning Approach

    We report the detection of 72 new pulses from the repeating fast radio burst FRB 121102 in Breakthrough Listen C-band (4-8 GHz) observations at the Green Bank Telescope. The new pulses were found with a convolutional neural network in data taken on August 26, 2017, where 21 bursts had previously been detected. Our technique combines neural network detection with dedispersion verification. For the current application we demonstrate its advantage over a traditional brute-force dedispersion algorithm in terms of higher sensitivity, lower false positive rates, and faster computational speed. Together with the 21 previously reported pulses, this observation marks the highest number of FRB 121102 pulses from a single observation, totaling 93 pulses in five hours, including 45 pulses within the first 30 minutes. The number of data points reveals trends in pulse fluence, pulse detection rate, and pulse frequency structure. We introduce a new periodicity search technique, based on the Rayleigh test, to analyze the times of arrival, with which we exclude with 99% confidence periodicity in times of arrival with periods larger than 5.1 times the model-dependent time-stamp uncertainty. In particular, we rule out constant periods >10 ms in the barycentric arrival times, though intrinsic periodicity in the time of emission remains plausible.
    Comment: 32 pages, 10 figures
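    The Rayleigh test measures how strongly arrival phases cluster when the times of arrival are folded at a trial period. The sketch below applies that statistic over a grid of trial periods; the period grid, the single-trial p-value approximation and the simulated arrival times are illustrative assumptions, not the paper's actual search setup.

```python
# Minimal sketch of a Rayleigh-test periodicity search over pulse times of arrival.
import numpy as np

def rayleigh_power(toas, period):
    """Rayleigh statistic Z = R^2 / n for arrival times folded at a trial period."""
    phases = 2.0 * np.pi * (toas / period)
    c, s = np.cos(phases).sum(), np.sin(phases).sum()
    return (c**2 + s**2) / len(toas)

def search_periods(toas, periods):
    """Return the Rayleigh power and an approximate single-trial p-value per trial period."""
    z = np.array([rayleigh_power(toas, p) for p in periods])
    return z, np.exp(-z)              # p ~ exp(-Z) for a single trial and large n

# Example: 93 simulated arrival times (seconds) over a 5-hour observation,
# searched over trial periods from 10 ms to 10 s.
toas = np.sort(np.random.uniform(0, 5 * 3600, 93))
periods = np.logspace(np.log10(0.01), np.log10(10.0), 2000)
z, pvals = search_periods(toas, periods)
best_period = periods[np.argmax(z)]
```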