
    Image denoising with multi-layer perceptrons, part 1: comparison with existing algorithms and with bounds

    Image denoising can be described as the problem of mapping from a noisy image to a noise-free image. The best currently available denoising methods approximate this mapping with cleverly engineered algorithms. In this work we attempt to learn this mapping directly with plain multi-layer perceptrons (MLPs) applied to image patches. We show that by training on large image databases we are able to outperform the current state-of-the-art image denoising methods. In addition, our method achieves results that are superior to one type of theoretical bound and goes a long way toward closing the gap with a second type of theoretical bound. Our approach is easily adapted to less extensively studied types of noise, such as mixed Poisson-Gaussian noise, JPEG artifacts, salt-and-pepper noise and noise resembling stripes, for which we achieve excellent results as well. We show that combining a block-matching procedure with MLPs can further improve the results on certain images. In a second paper, we detail the training trade-offs and the inner mechanisms of our MLPs.
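    As a rough illustration of the patch-based approach described in this abstract, the sketch below trains a small fully connected network to map noisy patches to clean ones on synthetic data; the patch size, layer widths, noise level and training loop are illustrative assumptions, not the configuration reported in the paper.

```python
# Minimal sketch of learning a patch-to-patch denoising map with an MLP.
# Patch size, noise level and architecture are illustrative assumptions,
# not the settings used in the paper.
import torch
import torch.nn as nn

patch = 13 * 13          # flattened patch size (assumed)
sigma = 25.0 / 255.0     # Gaussian noise level (assumed)

# Synthetic stand-in for patches drawn from a large image database.
clean = torch.rand(10000, patch)
noisy = clean + sigma * torch.randn_like(clean)

mlp = nn.Sequential(     # plain fully connected network: noisy patch -> clean patch
    nn.Linear(patch, 512), nn.Tanh(),
    nn.Linear(512, 512), nn.Tanh(),
    nn.Linear(512, patch),
)
opt = torch.optim.Adam(mlp.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):  # minimal training loop on random mini-batches
    idx = torch.randint(0, clean.shape[0], (128,))
    opt.zero_grad()
    loss = loss_fn(mlp(noisy[idx]), clean[idx])
    loss.backward()
    opt.step()

# Denoising a full image would then slide this predictor over overlapping
# patches and average the per-pixel predictions.
```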

    Computing Functions of Random Variables via Reproducing Kernel Hilbert Space Representations

    We describe a method to perform functional operations on probability distributions of random variables. The method uses reproducing kernel Hilbert space representations of probability distributions, and it is applicable to all operations which can be applied to points drawn from the respective distributions. We refer to our approach as kernel probabilistic programming. We illustrate it on synthetic data, and show how it can be used for nonparametric structural equation models, with an application to causal inference.
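    A minimal sketch of the underlying idea, assuming an RBF kernel, a one-dimensional variable and a simple squaring function: the distribution of X is represented by the empirical kernel mean embedding of its samples, and the distribution of f(X) by the embedding of the samples pushed through f. This is only the core building block, not the authors' full kernel probabilistic programming framework.

```python
# Sketch: representing P(X) and P(f(X)) by empirical kernel mean embeddings.
# Kernel choice, bandwidth and the function f are illustrative assumptions.
import numpy as np

def rbf(a, b, bw=1.0):
    # Gram matrix of a Gaussian RBF kernel k(x, y) = exp(-(x - y)^2 / (2 bw^2)).
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-d2 / (2 * bw ** 2))

def mmd2(x, y, bw=1.0):
    # Squared maximum mean discrepancy between the embeddings of x and y.
    return rbf(x, x, bw).mean() - 2 * rbf(x, y, bw).mean() + rbf(y, y, bw).mean()

rng = np.random.default_rng(0)
x = rng.normal(size=2000)            # samples representing P(X)
f = lambda t: t ** 2                 # the functional operation applied to X

# The embedding of P(f(X)) is obtained by pushing the expansion points through f.
pushforward = f(x)

# Check: it should be close (in MMD) to fresh samples of f(X).
fresh = f(rng.normal(size=2000))
print("MMD^2(pushforward, fresh samples) =", mmd2(pushforward, fresh))
```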

    Technical report on Separation methods for nonlinear mixtures


    Extreme events in gross primary production: a characterization across continents

    Climate extremes can affect the functioning of terrestrial ecosystems, for instance via a reduction of the photosynthetic capacity or alterations of respiratory processes. Yet the dominant regional and seasonal effects of hydrometeorological extremes are still not well documented, and they are the focus of this paper. Specifically, we quantify and characterize the role of large spatiotemporal extreme events in gross primary production (GPP) as triggers of continental anomalies. We also investigate seasonal dynamics of extreme impacts on continental GPP anomalies. We find that the 50 largest positive extremes (i.e., statistically unusual increases in carbon uptake rates) and negative extremes (i.e., statistically unusual decreases in carbon uptake rates) on each continent can explain most of the continental variation in GPP, which is in line with previous results obtained at the global scale. We show that negative extremes are larger than positive ones and demonstrate that this asymmetry is particularly strong in South America and Europe. Our analysis indicates that the overall impacts and the spatial extents of GPP extremes are power-law distributed with exponents that vary little across continents. Moreover, we show that on all continents and for all data sets the spatial extents play a more important role for the overall impact of GPP extremes than the durations or maximal GPP. An analysis of possible causes across continents indicates that most negative extremes in GPP can be attributed clearly to water scarcity, whereas extreme temperatures play a secondary role. However, for Europe, South America and Oceania we also identify fire as an important driver. Our findings are consistent with remote sensing products. An independent validation against a literature survey on specific extreme events supports our results to a large extent.
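    The power-law characterization mentioned above can be illustrated with a standard maximum-likelihood fit of the exponent; the synthetic event sizes and the threshold below are placeholders, not the GPP extreme-event data themselves.

```python
# Sketch: maximum-likelihood estimate of a power-law exponent for event sizes,
# alpha_hat = 1 + n / sum(ln(x_i / x_min)) for a density p(x) ~ x^(-alpha), x >= x_min.
# The synthetic "extreme-event sizes" below are placeholders, not GPP data.
import numpy as np

rng = np.random.default_rng(0)
alpha_true, x_min = 2.0, 1.0

# Draw synthetic event sizes from a Pareto-type power law via inverse transform.
u = rng.uniform(size=5000)
sizes = x_min * (1 - u) ** (-1.0 / (alpha_true - 1.0))

alpha_hat = 1.0 + sizes.size / np.log(sizes / x_min).sum()
print(f"estimated exponent: {alpha_hat:.2f} (true: {alpha_true})")
```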

    The effect of immediate breast reconstruction on the timing of adjuvant chemotherapy: a systematic review

    Adjuvant chemotherapy is often needed to achieve adequate breast cancer control. The increasing popularity of immediate breast reconstruction (IBR) raises concerns that this procedure may delay the time to adjuvant chemotherapy (TTC), which may negatively impact oncological outcome. The current systematic review aims to investigate this effect. During October 2014, a systematic search for clinical studies was performed in six databases with keywords related to breast reconstruction and chemotherapy. Eligible studies met the following inclusion criteria: (1) research population consisted of women receiving therapeutic mastectomy, (2) comparison of IBR with mastectomy-only groups, (3) TTC was clearly presented and mentioned as an outcome measure, and (4) original studies only (e.g., cohort study, randomized controlled trial, case–control). Fourteen studies were included, representing 5270 patients who had received adjuvant chemotherapy, of whom 1942 had undergone IBR and 3328 mastectomy only. One study found a significantly shorter mean TTC of 12.6 days after IBR, four studies found a significant delay after IBR averaging 6.6–16.8 days, seven studies found no significant difference in TTC between IBR and mastectomy only, and two studies did not perform statistical analyses for comparison. In studies that measured TTC from surgery, mean TTC varied from 29 to 61 days for IBR and from 21 to 60 days for mastectomy only. This systematic review of the current literature showed that IBR does not necessarily delay the start of adjuvant chemotherapy to a clinically relevant extent, suggesting that in general IBR is a valid option for non-metastatic breast cancer patients.

    Comparison of cardiac volumetry using real-time MRI during free-breathing with standard cine MRI during breath-hold in children

    Background: Cardiac real-time magnetic resonance imaging (RT-MRI) provides high-quality images even during free breathing. Difficulties in post-processing impede its use in clinical routine. Objective: To demonstrate the feasibility of quantitative analysis of cardiac free-breathing RT-MRI and to compare image quality and volumetry during free-breathing RT-MRI in pediatric patients to standard breath-hold cine MRI. Materials and methods: Pediatric patients (n = 22) received cardiac RT-MRI volumetry during free breathing (1.5 T; short axis; 30 frames per second) in addition to standard breath-hold cine imaging in end-expiration. Real-time images were binned retrospectively based on electrocardiography and respiratory bellows. Image quality and volumetry were compared using the European Cardiovascular Magnetic Resonance registry score, structure visibility rating, linear regression and Bland–Altman analyses. Results: Additional time for binning of real-time images was 2 min. For both techniques, image quality was rated good to excellent. RT-MRI was significantly more robust against artifacts (P < 0.01). Linear regression revealed good correlations for the ventricular volumes. Bland–Altman plots showed good limits of agreement (LoA) for end-diastolic volume (left ventricle [LV]: LoA -0.1 ± 2.7 ml/m2, right ventricle [RV]: LoA -1.9 ± 3.4 ml/m2), end-systolic volume (LV: LoA 0.4 ± 1.9 ml/m2, RV: LoA 0.6 ± 2.0 ml/m2), stroke volume (LV: LoA -0.5 ± 2.3 ml/m2, RV: LoA -2.6 ± 3.3 ml/m2) and ejection fraction (LV: LoA -0.5 ± 1.6%, RV: LoA -2.1 ± 2.8%). Conclusion: Compared to standard cine MRI with breath-hold, RT-MRI during free breathing with retrospective respiratory binning offers good image quality and reduced image artifacts, enabling fast quantitative evaluation of ventricular volumes in clinical practice under physiological conditions.
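    The retrospective binning step can be sketched as assigning each real-time frame a cardiac phase from the ECG R-peak times and a respiratory bin from the bellows amplitude; the signals, frame rate and bin counts below are synthetic placeholders, not the reconstruction pipeline used in the study.

```python
# Sketch of retrospective cardio-respiratory binning of real-time frames.
# Frame times, R-peak times and bellows signal are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
frame_t = np.arange(0, 20, 1 / 30)                 # frame timestamps, 30 frames/s
r_peaks = np.arange(0, 21, 0.8)                    # ECG R-peak times (~75 bpm)
bellows = np.sin(2 * np.pi * frame_t / 4) + 0.1 * rng.normal(size=frame_t.size)

# Cardiac phase: fraction of the current R-R interval elapsed at each frame.
prev = np.searchsorted(r_peaks, frame_t, side="right") - 1
prev = np.clip(prev, 0, r_peaks.size - 2)
phase = (frame_t - r_peaks[prev]) / (r_peaks[prev + 1] - r_peaks[prev])

n_cardiac, n_resp = 20, 4
cardiac_bin = np.minimum((phase * n_cardiac).astype(int), n_cardiac - 1)
resp_bin = np.digitize(bellows, np.quantile(bellows, [0.25, 0.5, 0.75]))

# Each (cardiac, respiratory) bin collects the frames that would be used to
# reconstruct one phase image; here we simply count the frames per bin.
counts = np.zeros((n_cardiac, n_resp), dtype=int)
np.add.at(counts, (cardiac_bin, resp_bin), 1)
print(counts.sum(), "frames distributed over", counts.size, "bins")
```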

    Image analysis for cosmology: results from the GREAT10 Galaxy Challenge

    In this paper, we present results from the weak-lensing shape measurement GRavitational lEnsing Accuracy Testing 2010 (GREAT10) Galaxy Challenge. This marks an order-of-magnitude step change in the level of scrutiny employed in weak-lensing shape measurement analysis. We provide descriptions of each method tested and include 10 evaluation metrics over 24 simulation branches. GREAT10 was the first shape measurement challenge to include variable fields; both the shear field and the point spread function (PSF) vary across the images in a realistic manner. The variable fields enable a variety of metrics that are inaccessible to constant-shear simulations, including a direct measure of the impact of shape measurement inaccuracies, and of PSF size and ellipticity, on the shear power spectrum. To assess the impact of shape measurement bias for cosmic shear, we present a general pseudo-Cℓ formalism that propagates spatially varying systematics in cosmic shear through to power spectrum estimates. We also show how one-point estimators of bias can be extracted from variable shear simulations. The GREAT10 Galaxy Challenge received 95 submissions and saw a factor of 3 improvement in the accuracy achieved by shape measurement methods. The best methods achieve sub-per cent average biases. We find a strong dependence of accuracy on signal-to-noise ratio, and indications of a weak dependence on galaxy type and size. Some requirements for the most ambitious cosmic shear experiments are met above a signal-to-noise ratio of 20. These results have the caveat that the simulated PSF was a ground-based PSF. Our results are a snapshot of the accuracy of current shape measurement methods and a benchmark upon which improvements can be built. This provides a foundation for a better understanding of the strengths and limitations of shape measurement methods.
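    The propagation of shape measurement bias into the power spectrum mentioned above is commonly expressed with the linear multiplicative/additive bias parametrization; the expressions below follow that standard convention and are a sketch of the idea, not the paper's full spatially varying pseudo-Cℓ formalism.

```latex
% Standard linear bias parametrization of a measured shear field (a sketch,
% not the paper's full spatially varying pseudo-C_ell treatment):
\[
  \hat{\gamma}(\theta) = \bigl[1 + m(\theta)\bigr]\,\gamma(\theta) + c(\theta),
\]
% and, for a spatially constant $m$ with an additive term $c$ uncorrelated
% with the true shear, the measured shear power spectrum becomes
\[
  \hat{C}_\ell^{\gamma\gamma} \simeq (1 + m)^2\, C_\ell^{\gamma\gamma} + C_\ell^{cc}.
\]
```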