    Single-Image Deraining via Recurrent Residual Multiscale Networks.

    Existing deraining approaches represent rain streaks with different rain layers and then separate the layers from the background image. However, because of the complexity of real-world rain, such as the various densities, shapes, and directions of rain streaks, it is very difficult to decompose a rain image into clean background and rain layers. In this article, we develop a novel single-image deraining method based on a residual multiscale pyramid to mitigate the difficulty of rain image decomposition. To be specific, we progressively remove rain streaks in a coarse-to-fine fashion, where heavy rain is first removed at coarse-resolution levels and then light rain is eliminated at fine-resolution levels. Furthermore, based on the observation that the residuals between a restored image and its corresponding rain image give critical clues about rain streaks, we regard the residuals as an attention map to remove rain in the consecutive finer-level image. To achieve a powerful yet compact deraining framework, we construct our network with recurrent layers and remove rain with the same network at different pyramid levels. In addition, we design a multiscale kernel selection network (MSKSN) that enables our single network to remove rain streaks at different levels. In this manner, we reduce the model parameters by 81% without decreasing deraining performance compared with our prior work. Extensive experimental results on widely used benchmarks show that our approach achieves superior deraining performance compared with the state of the art.
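
    The coarse-to-fine, residual-as-attention scheme described in this abstract can be illustrated with a short sketch. The following PyTorch code is a minimal approximation, not the authors' implementation: DerainBlock, PyramidDerainer, the channel widths, and the three-level pyramid are all illustrative assumptions, and the MSKSN module is omitted.

```python
# Minimal sketch (not the authors' code) of coarse-to-fine deraining on an
# image pyramid, where the residual from the coarser level is reused as an
# attention map at the next, finer level. All names and sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DerainBlock(nn.Module):
    """One shared deraining network, reused (recurrently) at every pyramid level."""
    def __init__(self, channels=32):
        super().__init__()
        # Input: rainy image (3 channels) concatenated with a 1-channel attention map.
        self.body = nn.Sequential(
            nn.Conv2d(3 + 1, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3, 3, padding=1),
        )

    def forward(self, rainy, attention):
        # Predict the rain residual and subtract it from the rainy input.
        residual = self.body(torch.cat([rainy, attention], dim=1))
        return rainy - residual, residual

class PyramidDerainer(nn.Module):
    def __init__(self, levels=3):
        super().__init__()
        self.levels = levels
        self.block = DerainBlock()  # the same weights are shared across levels

    def forward(self, rainy):
        # Build a pyramid of the rainy image, coarsest level first.
        pyramid = [rainy]
        for _ in range(self.levels - 1):
            pyramid.append(F.avg_pool2d(pyramid[-1], 2))
        pyramid = pyramid[::-1]

        attention = torch.zeros_like(pyramid[0][:, :1])  # no guidance at the coarsest level
        restored = None
        for level, img in enumerate(pyramid):
            restored, residual = self.block(img, attention)
            if level < self.levels - 1:
                # The residual between the rainy input and the restored image at this
                # level is upsampled and used as attention for the next, finer level.
                attention = F.interpolate(residual.mean(dim=1, keepdim=True),
                                          scale_factor=2, mode='bilinear',
                                          align_corners=False)
        return restored

if __name__ == "__main__":
    model = PyramidDerainer()
    out = model(torch.rand(1, 3, 64, 64))
    print(out.shape)  # torch.Size([1, 3, 64, 64])
```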

    RCDNet: An Interpretable Rain Convolutional Dictionary Network for Single Image Deraining

    As a common weather phenomenon, rain streaks adversely degrade image quality. Hence, removing rain from an image has become an important issue in the field. To handle this ill-posed single-image deraining task, in this paper we build a novel deep architecture, called the rain convolutional dictionary network (RCDNet), which embeds the intrinsic priors of rain streaks and has clear interpretability. Specifically, we first establish an RCD model for representing rain streaks and use the proximal gradient descent technique to design an iterative algorithm containing only simple operators for solving the model. By unfolding it, we then build the RCDNet, in which every network module has a clear physical meaning and corresponds to an operation in the algorithm. This interpretability greatly facilitates visualization and analysis of what happens inside the network and why it works well at inference time. Moreover, taking into account the domain gap in real scenarios, we further design a novel dynamic RCDNet, where the rain kernels are dynamically inferred from the input rainy images and then help shrink the space for rain layer estimation with only a few rain maps, ensuring fine generalization when the rain types of the training and testing data are inconsistent. By training such an interpretable network end to end, all involved rain kernels and proximal operators can be automatically extracted, faithfully characterizing the features of both the rain and clean background layers and thus naturally leading to better deraining performance. Comprehensive experiments substantiate the superiority of our method, especially its good generality to diverse testing scenarios and the interpretability of all its modules. Code is available at https://github.com/hongwang01/DRCDNet.
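
    The RCD model and its unfolded proximal-gradient updates can likewise be sketched. The code below is a minimal PyTorch illustration under stated assumptions, not the released RCDNet: the kernel count, kernel size, the small learned prox network, and the simplified background update B = O - C*M (the paper's network also refines B with a learned proximal operator) are all hypothetical choices.

```python
# Minimal sketch of the rain convolutional dictionary (RCD) idea: the rainy
# image O is modeled as a clean background B plus rain kernels C convolved with
# sparse rain maps M, and M is updated by an unfolded proximal-gradient step
# whose proximal operator is a small learned network. Sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_KERNELS, KERNEL_SIZE = 8, 9

class RCDStage(nn.Module):
    def __init__(self):
        super().__init__()
        # Rain kernels C: one small convolution kernel per rain-map channel.
        self.rain_kernels = nn.Parameter(
            torch.randn(3, NUM_KERNELS, KERNEL_SIZE, KERNEL_SIZE) * 0.01)
        self.step = nn.Parameter(torch.tensor(0.1))  # gradient step size
        # Learned proximal operator acting on the rain maps M.
        self.prox = nn.Sequential(
            nn.Conv2d(NUM_KERNELS, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, NUM_KERNELS, 3, padding=1),
        )

    def rain_layer(self, M):
        # R = sum_k C_k * M_k, implemented as a convolution over the rain-map channels.
        return F.conv2d(M, self.rain_kernels, padding=KERNEL_SIZE // 2)

    def forward(self, O, B, M):
        # Data-fit gradient w.r.t. M: C^T * (B + C*M - O).
        res = B + self.rain_layer(M) - O
        grad = F.conv_transpose2d(res, self.rain_kernels, padding=KERNEL_SIZE // 2)
        # Proximal-gradient update of the rain maps, then refresh the background.
        M = self.prox(M - self.step * grad)
        B = O - self.rain_layer(M)
        return B, M

if __name__ == "__main__":
    O = torch.rand(1, 3, 64, 64)             # rainy image
    B = O.clone()                            # initial background guess
    M = torch.zeros(1, NUM_KERNELS, 64, 64)  # initial rain maps
    stage = RCDStage()
    for _ in range(3):                       # a few unfolded stages (weights shared here)
        B, M = stage(O, B, M)
    print(B.shape, M.shape)
```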