
    RainDiffusion:When Unsupervised Learning Meets Diffusion Models for Real-world Image Deraining

    What will happen when unsupervised learning meets diffusion models for real-world image deraining? To answer this question, we propose RainDiffusion, the first unsupervised image deraining paradigm based on diffusion models. Going beyond the traditional unsupervised wisdom of image deraining, RainDiffusion introduces stable training on unpaired real-world data instead of weakly adversarial training. RainDiffusion consists of two cooperative branches: a Non-diffusive Translation Branch (NTB) and a Diffusive Translation Branch (DTB). The NTB exploits a cycle-consistent architecture to bypass the difficulty of unpaired training in standard diffusion models by generating initial clean/rainy image pairs. The DTB leverages two conditional diffusion modules to progressively refine the desired output using the initial image pairs and a diffusive generative prior, obtaining better generalization for both deraining and rain generation. RainDiffusion is a non-adversarial training paradigm, setting a new standard for real-world image deraining. Extensive experiments confirm the superiority of RainDiffusion over un/semi-supervised methods and show its competitive advantages over fully-supervised ones. Comment: 9 pages
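The two-branch design described above can be sketched in miniature: the NTB supplies an initial clean estimate, and the DTB refines it over reverse diffusion steps conditioned on that estimate. The denoiser and translator below are hypothetical toy stand-ins, not the authors' networks; only the NTB-then-DTB structure is taken from the abstract.

```python
import numpy as np

def ntb_initial_pair(rainy, rain_mask):
    """Non-diffusive Translation Branch: produce an initial clean estimate.
    A real NTB is a cycle-consistent translator; here it is faked by
    subtracting a known rain mask (placeholder only)."""
    return rainy - rain_mask

def dtb_refine(x_init, denoise_fn, timesteps=10, noise_scale=0.1, seed=0):
    """Diffusive Translation Branch: start from the NTB estimate plus noise
    and progressively refine with a conditional denoiser."""
    rng = np.random.default_rng(seed)
    x = x_init + noise_scale * rng.standard_normal(x_init.shape)
    for t in reversed(range(timesteps)):
        # each reverse step is conditioned on the NTB's initial estimate
        x = denoise_fn(x, x_init, t)
    return x

# Hypothetical denoiser: pull the sample halfway toward the conditioning image.
denoise = lambda x, cond, t: x + 0.5 * (cond - x)

rainy = np.ones((8, 8))
mask = 0.3 * np.ones((8, 8))
clean0 = ntb_initial_pair(rainy, mask)
refined = dtb_refine(clean0, denoise)
print(np.allclose(refined, clean0, atol=1e-2))  # True: refinement converges
```

With a toy contraction denoiser the noise decays geometrically over the ten steps, illustrating why conditioning every reverse step on the initial pair stabilizes the refinement.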

    Fast Predictive Image Registration

    We present a method to predict image deformations based on patch-wise image appearance. Specifically, we design a patch-based deep encoder-decoder network which learns the pixel/voxel-wise mapping between image appearance and registration parameters. Our approach can predict general deformation parameterizations; however, we focus on the large deformation diffeomorphic metric mapping (LDDMM) registration model. By predicting the LDDMM momentum-parameterization we retain the desirable theoretical properties of LDDMM, while reducing computation time by orders of magnitude: combined with patch pruning, we achieve a 1500x/66x speed-up compared to GPU-based optimization for 2D/3D image registration. Our approach has better prediction accuracy than predicting deformation or velocity fields and results in diffeomorphic transformations. Additionally, we create a Bayesian probabilistic version of our network, which allows evaluation of deformation field uncertainty through Monte Carlo sampling using dropout at test time. We show that deformation uncertainty highlights areas of ambiguous deformations. We test our method on the OASIS brain image dataset in 2D and 3D.
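The uncertainty mechanism the abstract mentions, Monte Carlo dropout at test time, can be sketched in a few lines: keep dropout active during prediction, run many stochastic forward passes, and report the per-output mean and standard deviation. The "network" below is a hypothetical linear map, not the paper's encoder-decoder; only the MC-dropout procedure is from the abstract.

```python
import numpy as np

def predict_with_dropout(x, w, p_drop=0.5, rng=None):
    """One stochastic forward pass: randomly drop weights, then predict.
    Uses inverted-dropout scaling so the expected output is unchanged."""
    rng = rng or np.random.default_rng()
    mask = rng.random(w.shape) >= p_drop
    return x @ (w * mask) / (1.0 - p_drop)

def mc_dropout(x, w, n_samples=200, seed=0):
    """Mean prediction and per-output uncertainty over stochastic passes."""
    rng = np.random.default_rng(seed)
    preds = np.stack([predict_with_dropout(x, w, rng=rng)
                      for _ in range(n_samples)])
    return preds.mean(axis=0), preds.std(axis=0)

x = np.array([1.0, 2.0, 3.0])   # toy input "patch"
w = np.ones((3, 2))             # hypothetical weights -> 2 momentum outputs
mean, std = mc_dropout(x, w)
print(mean.shape, std.shape)    # (2,) (2,)
```

In the paper's setting `std` would be a per-voxel uncertainty map over the predicted momentum field, high where the deformation is ambiguous.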

    Fast and Robust Femur Segmentation from Computed Tomography Images for Patient-Specific Hip Fracture Risk Screening

    Osteoporosis is a common bone disease that increases the risk of bone fracture. Hip-fracture risk screening methods based on finite element analysis depend on segmented computed tomography (CT) images; however, current femur segmentation methods require manual delineation of large data sets. Here we propose a deep neural network for fully automated, accurate, and fast segmentation of the proximal femur from CT. Evaluation on a set of 1147 proximal femurs with ground-truth segmentations demonstrates that our method is apt for hip-fracture risk screening, bringing us one step closer to a clinically viable option for screening at-risk patients for hip-fracture susceptibility. Comment: This article has been accepted for publication in Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, published by Taylor & Francis.
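Evaluating a predicted femur mask against a ground-truth delineation is typically done with the Dice similarity coefficient; the paper's exact metric is an assumption here, but the sketch below shows the standard computation on binary masks.

```python
import numpy as np

def dice(pred, truth):
    """Dice similarity coefficient between two binary segmentation masks:
    2|A ∩ B| / (|A| + |B|), in [0, 1], with 1 meaning perfect overlap."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * inter / denom if denom else 1.0

# Toy 2D example (a real evaluation would use the 3D CT masks).
pred = np.zeros((4, 4), dtype=bool);  pred[1:3, 1:3] = True   # 4 voxels
truth = np.zeros((4, 4), dtype=bool); truth[1:4, 1:4] = True  # 9 voxels
print(round(dice(pred, truth), 3))  # 0.615  (= 2*4 / (4+9))
```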

    Quantum-inspired computational imaging

    Computational imaging combines measurement and computational methods with the aim of forming images even when the measurement conditions are weak, few in number, or highly indirect. The recent surge in quantum-inspired imaging sensors, together with a new wave of algorithms allowing on-chip, scalable and robust data processing, has spurred activity with notable results in the domain of low-light-flux imaging and sensing. We provide an overview of the major challenges encountered in low-illumination (e.g., ultrafast) imaging and how these problems have recently been addressed for imaging applications in extreme conditions. These methods provide examples of the future imaging solutions to be developed, for which the best results are expected to arise from an efficient codesign of the sensors and data analysis tools.

    Funding: Y.A. acknowledges support from the UK Royal Academy of Engineering under the Research Fellowship Scheme (RF201617/16/31). S.McL. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grant EP/J015180/1). V.G. acknowledges support from the U.S. Defense Advanced Research Projects Agency (DARPA) InPho program through U.S. Army Research Office award W911NF-10-1-0404, the U.S. DARPA REVEAL program through contract HR0011-16-C-0030, and the U.S. National Science Foundation through grants 1161413 and 1422034. A.H. acknowledges support from U.S. Army Research Office award W911NF-15-1-0479, U.S. Department of the Air Force grant FA8650-15-D-1845, and U.S. Department of Energy National Nuclear Security Administration grant DE-NA0002534. D.F. acknowledges financial support from the UK Engineering and Physical Sciences Research Council (grants EP/M006514/1 and EP/M01326X/1).

    Classification Modeling for Malaysian Blooming Flower Images Using Neural Networks

    Image processing is a rapidly growing research area of computer science and remains a challenging problem within the computer vision field. For the classification of flower images, the problem is mainly due to the strong similarities in terms of colour and texture. The appearance of the image itself, such as variation of light due to different lighting conditions, shadow effects on the object's surface, size, shape, rotation and position, background clutter, and state of blooming or budding, may affect the utilized classification techniques. This study aims to develop a classification model for Malaysian blooming flowers using a neural network with the back-propagation algorithm. The flower image is extracted through a Region of Interest (ROI), in which texture and colour are emphasized. In this research, a total of 960 images were extracted from 16 types of flowers. Each ROI was represented by three colour attributes (Hue, Saturation, and Value) and four texture attributes (Contrast, Correlation, Energy, and Homogeneity). In the training and testing phases, experiments were carried out to observe the classification performance of neural networks with duplication of patterns that are difficult to learn (referred to as DOUBLE), as this could explain why some flower images are difficult for classifiers to learn. Results show that the overall performance of the neural network with DOUBLE is 96.3%, while with the actual data set it is 68.3%, and the accuracy obtained from Logistic Regression with the actual data set is 60.5%. The Decision Tree classification results indicate that the highest performance obtained by Chi-Squared Automatic Interaction Detection (CHAID) and Exhaustive CHAID (EX-CHAID) is merely 42% with DOUBLE. The findings indicate that the neural network with the DOUBLE data set produces the highest performance compared to Logistic Regression and Decision Tree. Therefore, the neural network shows potential for building a Malaysian blooming flower model.
Future studies can focus on increasing the sample size and the number of ROIs, which may lead to higher accuracy. Nevertheless, the developed flower model can be used as part of a Malaysian blooming flower recognition system in the future, where colour and texture are needed in the flower identification process.
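The seven features per ROI described above (mean Hue/Saturation/Value plus four grey-level co-occurrence matrix texture properties) can be sketched as follows. The GLCM implementation here is a minimal one-pixel-horizontal-offset version written for illustration; the authors' exact feature extraction settings are not given in the abstract and are assumed.

```python
import numpy as np

def glcm_features(gray, levels=8):
    """Contrast, correlation, energy, homogeneity from a symmetric,
    normalised co-occurrence matrix (offset = one pixel to the right)."""
    q = (gray * (levels - 1)).astype(int)          # quantise to `levels` bins
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
        glcm[j, i] += 1                            # make it symmetric
    p = glcm / glcm.sum()
    idx = np.arange(levels)
    ii, jj = np.meshgrid(idx, idx, indexing="ij")
    mu = (idx * p.sum(axis=1)).sum()               # marginal mean (symmetric)
    sigma = np.sqrt(((idx - mu) ** 2 * p.sum(axis=1)).sum())
    contrast = ((ii - jj) ** 2 * p).sum()
    correlation = (((ii - mu) * (jj - mu) * p).sum() / sigma**2
                   if sigma > 0 else 1.0)
    energy = np.sqrt((p ** 2).sum())               # sqrt of angular 2nd moment
    homogeneity = (p / (1.0 + np.abs(ii - jj))).sum()
    return contrast, correlation, energy, homogeneity

rng = np.random.default_rng(0)
hsv_roi = rng.random((16, 16, 3))                  # hypothetical HSV ROI in [0, 1]
colour = hsv_roi.mean(axis=(0, 1))                 # mean Hue, Saturation, Value
texture = glcm_features(hsv_roi[..., 2])           # texture from the V channel
features = np.concatenate([colour, texture])
print(features.shape)  # (7,)
```

The resulting 7-element vector per ROI is what a back-propagation neural network (or the Logistic Regression and Decision Tree baselines) would be trained on.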