    Progressive Multi-Scale Residual Network for Single Image Super-Resolution

    Multi-scale convolutional neural networks (CNNs) have achieved significant success in single image super-resolution (SISR) by considering comprehensive information from different receptive fields. However, recent multi-scale networks usually build this hierarchical exploration with filters of different sizes, which incurs high computational cost, and they seldom focus on the inherent correlations among different scales. This paper recasts multi-scale exploration in a sequential manner and proposes a progressive multi-scale residual network (PMRN) for the SISR problem. Specifically, we devise a progressive multi-scale residual block (PMRB) that substitutes combinations of small filters for larger ones and gradually explores the hierarchical information. Furthermore, a channel- and pixel-wise attention mechanism (CPA) is designed to find the inherent correlations among image features through weighting and bias factors, concentrating on high-frequency information. Experimental results show that the proposed PMRN recovers structural textures more effectively, with superior PSNR/SSIM results compared to other small networks. The extended model PMRN+ with self-ensemble achieves competitive or better results than large networks with far fewer parameters and lower computational complexity. Comment: This work has been submitted to the IEEE for possible publication. Copyright may be transferred without notice, after which this version may no longer be accessible.
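
    The block below is a minimal PyTorch sketch of the two ideas named in this abstract: emulating larger filters with a cascade of small 3x3 convolutions, and re-weighting features both channel-wise and pixel-wise. The layer widths, the two-stage cascade, and the attention layout are illustrative assumptions, not the authors' exact PMRB/CPA design.

```python
# Sketch only: the cascade of 3x3 convolutions stands in for larger filters,
# and the attention module re-weights features per channel and per pixel.
import torch
import torch.nn as nn


class ChannelPixelAttention(nn.Module):
    """Re-weights features per channel (squeeze-and-excite style) and per pixel."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        self.channel = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )
        self.pixel = nn.Sequential(nn.Conv2d(channels, 1, 1), nn.Sigmoid())

    def forward(self, x):
        return x * self.channel(x) * self.pixel(x)


class ProgressiveMultiScaleResidualBlock(nn.Module):
    """Explores scales sequentially: the first 3x3 conv covers a ~3x3 receptive
    field, and feeding it through a second 3x3 conv covers a ~5x5 field, so two
    small filters stand in for one large one."""

    def __init__(self, channels=64):
        super().__init__()
        self.conv_small = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv_large = nn.Conv2d(channels, channels, 3, padding=1)
        self.fuse = nn.Conv2d(2 * channels, channels, 1)
        self.act = nn.ReLU(inplace=True)
        self.attention = ChannelPixelAttention(channels)

    def forward(self, x):
        s = self.act(self.conv_small(x))             # ~3x3 receptive field
        l = self.act(self.conv_large(s))             # ~5x5 receptive field
        fused = self.fuse(torch.cat([s, l], dim=1))  # fuse the two scales
        return x + self.attention(fused)             # residual connection


# Shape check: a 64-channel feature map passes through with size unchanged.
block = ProgressiveMultiScaleResidualBlock(64)
print(block(torch.randn(1, 64, 32, 32)).shape)       # torch.Size([1, 64, 32, 32])
```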

    A Fully Progressive Approach to Single-Image Super-Resolution

    Recent deep learning approaches to single image super-resolution have achieved impressive results in terms of traditional error measures and perceptual quality. However, in each case it remains challenging to achieve high-quality results for large upsampling factors. To this end, we propose a method (ProSR) that is progressive both in architecture and in training: the network upsamples an image in intermediate steps, while the learning process is organized from easy to hard, as in curriculum learning. To obtain more photorealistic results, we design a generative adversarial network (GAN), named ProGanSR, that follows the same progressive multi-scale design principle. This not only allows the method to scale well to high upsampling factors (e.g., 8x) but also constitutes a principled multi-scale approach that increases the reconstruction quality for all upsampling factors simultaneously. In particular, ProSR ranks 2nd in terms of SSIM and 4th in terms of PSNR in the NTIRE2018 SISR challenge [34]. Compared to the top-ranking team, our model scores marginally lower but runs 5 times faster.
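
    As a rough illustration of the progressive design described above, the sketch below upsamples in repeated x2 stages and uses a toy curriculum schedule that unlocks the harder stages as training proceeds. The module layout, pixel-shuffle upsampling, and epoch thresholds are placeholder assumptions, not the published ProSR/ProGanSR architecture.

```python
# Sketch only: each stage doubles the resolution and emits an intermediate
# image, and a curriculum schedule decides how many stages are trained.
import torch
import torch.nn as nn


class UpsampleStage(nn.Module):
    """One x2 stage: feature refinement followed by pixel-shuffle upsampling."""

    def __init__(self, channels=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 4 * channels, 3, padding=1),
            nn.PixelShuffle(2),                  # doubles the spatial resolution
        )
        self.to_rgb = nn.Conv2d(channels, 3, 3, padding=1)

    def forward(self, feats):
        feats = self.body(feats)
        return feats, self.to_rgb(feats)         # features + intermediate image


class ProgressiveSR(nn.Module):
    def __init__(self, num_stages=3, channels=64):
        super().__init__()
        self.head = nn.Conv2d(3, channels, 3, padding=1)
        self.stages = nn.ModuleList(UpsampleStage(channels) for _ in range(num_stages))

    def forward(self, lr, active_stages):
        feats, outputs = self.head(lr), []
        for stage in self.stages[:active_stages]:
            feats, img = stage(feats)
            outputs.append(img)                  # x2, x4, x8 predictions
        return outputs


def stages_for_epoch(epoch):
    """Toy easy-to-hard curriculum: unlock one extra x2 stage as training matures."""
    return 1 if epoch < 20 else 2 if epoch < 40 else 3


# All three stages active late in training gives an x8 output.
model = ProgressiveSR()
outputs = model(torch.randn(1, 3, 16, 16), active_stages=stages_for_epoch(epoch=50))
print([o.shape[-1] for o in outputs])            # [32, 64, 128]
```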

    Deep Laplacian Pyramid Networks for Fast and Accurate Super-Resolution

    Convolutional neural networks have recently demonstrated high-quality reconstruction for single-image super-resolution. In this paper, we propose the Laplacian Pyramid Super-Resolution Network (LapSRN) to progressively reconstruct the sub-band residuals of high-resolution images. At each pyramid level, our model takes coarse-resolution feature maps as input, predicts the high-frequency residuals, and uses transposed convolutions for upsampling to the finer level. Our method does not require bicubic interpolation as a pre-processing step and thus dramatically reduces the computational complexity. We train the proposed LapSRN with deep supervision using a robust Charbonnier loss function and achieve high-quality reconstruction. Furthermore, our network generates multi-scale predictions in one feed-forward pass through the progressive reconstruction, thereby facilitating resource-aware applications. Extensive quantitative and qualitative evaluations on benchmark datasets show that the proposed algorithm performs favorably against state-of-the-art methods in terms of speed and accuracy. Comment: This work was accepted to CVPR 2017. The code and datasets are available at http://vllab.ucmerced.edu/wlai24/LapSRN
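
    The snippet below is a hedged sketch of one pyramid level as described: predict a high-frequency residual, upsample both the feature maps and the image with transposed convolutions, and supervise every level with a Charbonnier loss. The filter counts, depths, and training loop are assumptions, not the exact LapSRN configuration.

```python
# Sketch only: one x2 pyramid level plus deep supervision with a Charbonnier loss.
import torch
import torch.nn as nn


def charbonnier_loss(pred, target, eps=1e-3):
    """Robust L1-like loss applied at every pyramid level (deep supervision)."""
    return torch.sqrt((pred - target) ** 2 + eps ** 2).mean()


class PyramidLevel(nn.Module):
    """One x2 level: predict the sub-band residual, upsample features and image."""

    def __init__(self, channels=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
        )
        # Transposed convolutions upscale both the feature maps and the image.
        self.up_feats = nn.ConvTranspose2d(channels, channels, 4, stride=2, padding=1)
        self.up_image = nn.ConvTranspose2d(3, 3, 4, stride=2, padding=1)
        self.residual = nn.Conv2d(channels, 3, 3, padding=1)

    def forward(self, feats, image):
        feats = self.up_feats(self.features(feats))
        sr = self.up_image(image) + self.residual(feats)  # add high-frequency residual
        return feats, sr


# Deep supervision across two x2 levels (x4 total); ground-truth tensors are dummies.
head = nn.Conv2d(3, 64, 3, padding=1)
levels = nn.ModuleList([PyramidLevel(), PyramidLevel()])
lr = torch.randn(1, 3, 32, 32)
targets = [torch.randn(1, 3, 64, 64), torch.randn(1, 3, 128, 128)]
feats, image, loss = head(lr), lr, 0.0
for level, gt in zip(levels, targets):
    feats, image = level(feats, image)
    loss = loss + charbonnier_loss(image, gt)
print(loss.item())
```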