19 research outputs found

    Pointwise adaptive estimation for quantile regression

    A nonparametric procedure for quantile regression, or more generally nonparametric M-estimation, is proposed which is completely data-driven and adapts locally to the regularity of the regression function. This is achieved by considering at each point M-estimators over different local neighbourhoods and by a local model selection procedure based on sequential testing. Non-asymptotic risk bounds are obtained, which yield rate-optimality for large sample asymptotics under weak conditions. Simulations for different univariate median regression models show good finite sample properties, also in comparison to traditional methods. The approach is the basis for denoising CT scans in cancer research. Keywords: M-estimation, median regression, robust estimation, local model selection, unsupervised learning, local bandwidth selection, median filter, Lepski procedure, minimax rate, image denoising
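    The core idea of such Lepski-type pointwise adaptation can be sketched as follows. This is an illustrative toy version, not the authors' procedure: at each point a local median is computed over growing windows, and a larger window is accepted only while its estimate stays consistent with the smaller-window fits (the widths, the noise-scale estimator and the threshold constant z are all assumptions).

    ```python
    import numpy as np

    def lepski_local_median(y, widths=(2, 4, 8, 16), z=2.0):
        """Toy Lepski-type local model selection for median regression:
        grow the window at each point until the new local median deviates
        significantly from the previously accepted estimate."""
        n = len(y)
        out = np.empty(n)
        # robust noise-scale estimate from first differences (an assumption)
        sigma = np.median(np.abs(np.diff(y))) / 0.954
        for i in range(n):
            lo, hi = max(0, i - widths[0]), min(n, i + widths[0] + 1)
            accepted = np.median(y[lo:hi])
            for h in widths[1:]:
                lo, hi = max(0, i - h), min(n, i + h + 1)
                cand = np.median(y[lo:hi])
                # stop growing if the wider window disagrees with the fit so far
                if abs(cand - accepted) > z * sigma / np.sqrt(hi - lo):
                    break
                accepted = cand
            out[i] = accepted
        return out
    ```

    On a piecewise-constant signal, small windows survive near jumps while smooth regions get wide windows, which is the behaviour the sequential tests are designed to produce.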

    Laplace deconvolution on the basis of time domain data and its application to Dynamic Contrast Enhanced imaging

    In the present paper we consider the problem of Laplace deconvolution with noisy discrete non-equally spaced observations on a finite time interval. We propose a new method for Laplace deconvolution which is based on expansions of the convolution kernel, the unknown function and the observed signal over a Laguerre functions basis (which acts as a surrogate eigenfunction basis of the Laplace convolution operator) in a regression setting. The expansion results in a small system of linear equations with the matrix of the system being triangular and Toeplitz. Due to this triangular structure, there is a common number m of terms in the function expansions to control, which is realized via a complexity penalty. The advantage of this methodology is that it leads to very fast computations, produces no boundary effects due to extension at zero and cut-off at T, and provides an estimator with risk within a logarithmic factor of the oracle risk. We emphasize that, in the present paper, we consider the true observational model with possibly non-equispaced observations available on a finite interval of length T, which appears in many different contexts, and we account for the bias associated with this model (which is not present when T → ∞). The study is motivated by perfusion imaging using a short injection of contrast agent, a procedure applied for medical assessment of micro-circulation within tissues such as cancerous tumors. The presence of a tuning parameter a allows the choice of the most advantageous time units, so that both the kernel and the unknown right-hand side of the equation are well represented for the deconvolution. The methodology is illustrated by an extensive simulation study and a real data example, which confirm that the proposed technique is fast, efficient, accurate, usable from a practical point of view and very competitive. Comment: 36 pages, 9 figures. arXiv admin note: substantial text overlap with arXiv:1207.223
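    The computational payoff of the triangular Toeplitz structure can be made concrete with a minimal sketch (assuming the Laguerre coefficients of the kernel and the signal are already in hand; this is not the paper's full estimator, which also selects m via a complexity penalty): the system T f = q is solved by forward substitution in O(m^2).

    ```python
    import numpy as np

    def laguerre_deconvolve(kernel_coefs, signal_coefs):
        """Solve the lower-triangular Toeplitz system arising when kernel,
        unknown function and signal are expanded over the Laguerre basis.
        Row k of the matrix is (g[k], g[k-1], ..., g[0]), so forward
        substitution recovers the function coefficients f directly."""
        m = len(signal_coefs)
        g = np.asarray(kernel_coefs[:m], dtype=float)
        q = np.asarray(signal_coefs, dtype=float)
        f = np.empty(m)
        for k in range(m):
            # subtract the already-known part of the convolution sum
            f[k] = (q[k] - g[k:0:-1] @ f[:k]) / g[0]
        return f
    ```

    Because no full matrix is ever inverted, the cost stays negligible even when the estimator is recomputed for every candidate truncation level m.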

    Fast pseudo-CT synthesis from MRI T1-weighted images using a patch-based approach

    MRI-based bone segmentation is a challenging task because bone tissue and air both present low signal intensity on MR images, making it difficult to accurately delimit the bone boundaries. However, estimating bone from MRI images may allow a reduction in patient irradiation by removing the need for a patient-specific CT acquisition in several applications. In this work, we propose a fast GPU-based pseudo-CT generation from a patient-specific MRI T1-weighted image using a group-wise patch-based approach and a limited MRI and CT atlas dictionary. For every voxel in the input MR image, we compute the similarity of the patch containing that voxel with the patches of all MR images in the database which lie in a certain anatomical neighborhood. The pseudo-CT is obtained as a local weighted linear combination of the CT values of the corresponding patches. The algorithm was implemented on a GPU. The use of patch-based techniques allows a fast and accurate estimation of the pseudo-CT from MR T1-weighted images, with accuracy similar to that of the patient-specific CT. The experimental normalized cross correlation (NCC) reaches 0.9324±0.0048 for an atlas with 10 datasets. The high NCC values indicate how our method can accurately approximate the patient-specific CT. The GPU implementation led to a substantial decrease in computational time, making the approach suitable for real applications.
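    The patch-based weighting step can be sketched in a toy 2D, single-atlas form (an illustrative assumption; the paper uses a dictionary of several 3D atlases and a GPU implementation, and the patch size, search radius and bandwidth h here are made-up parameters):

    ```python
    import numpy as np

    def pseudo_ct(mr, atlas_mr, atlas_ct, patch=1, h=0.1):
        """Toy patch-based pseudo-CT synthesis: each output voxel is the
        similarity-weighted mean of atlas CT values whose co-registered MR
        patches resemble the input patch within a small search neighbourhood."""
        H, W = mr.shape
        mr_p = np.pad(mr, patch, mode='edge')
        amr_p = np.pad(atlas_mr, patch, mode='edge')
        out = np.zeros((H, W))
        for i in range(H):
            for j in range(W):
                ref = mr_p[i:i + 2 * patch + 1, j:j + 2 * patch + 1]
                w_sum, v_sum = 0.0, 0.0
                # +/-1 voxel anatomical search neighbourhood (an assumption)
                for di in (-1, 0, 1):
                    for dj in (-1, 0, 1):
                        ii = min(max(i + di, 0), H - 1)
                        jj = min(max(j + dj, 0), W - 1)
                        cand = amr_p[ii:ii + 2 * patch + 1, jj:jj + 2 * patch + 1]
                        # Gaussian weight on the mean squared patch distance
                        w = np.exp(-np.mean((ref - cand) ** 2) / h)
                        w_sum += w
                        v_sum += w * atlas_ct[ii, jj]
                out[i, j] = v_sum / w_sum
        return out
    ```

    The per-voxel weight computations are independent, which is exactly why the method maps well onto a GPU.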

    Reconstruction of one-dimensional chaotic maps from sequences of probability density functions

    In many practical situations, it is impossible to measure the individual trajectories generated by an unknown chaotic system, but we can observe the evolution of probability density functions generated by such a system. The paper proposes for the first time a matrix-based approach to solve the generalized inverse Frobenius–Perron problem, that is, to reconstruct an unknown one-dimensional chaotic transformation, based on a temporal sequence of probability density functions generated by the transformation. Numerical examples are used to demonstrate the applicability of the proposed approach and to evaluate its robustness with respect to constantly applied stochastic perturbations.
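    The matrix-based viewpoint can be illustrated with a minimal Ulam-type sketch (an assumption about the discretisation, not the authors' exact construction): with densities discretised on n bins, each step satisfies p_{t+1} = M p_t for one fixed transfer matrix M, which a least-squares fit over the observed sequence recovers.

    ```python
    import numpy as np

    def estimate_transfer_matrix(densities):
        """Fit the single matrix M with p_{t+1} = M p_t for all consecutive
        pairs in a sequence of discretised densities, via least squares."""
        P_in = np.column_stack(densities[:-1])    # n x (T-1) inputs
        P_out = np.column_stack(densities[1:])    # n x (T-1) outputs
        # solve min ||P_in^T X - P_out^T||_F, so that M = X^T
        X, *_ = np.linalg.lstsq(P_in.T, P_out.T, rcond=None)
        return X.T
    ```

    Reconstructing the chaotic map itself from the estimated matrix is the harder inverse step the paper addresses; this sketch only shows the forward, matrix-identification half of the problem.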

    Combining Regular and Irregular Histograms by Penalized Likelihood

    A new fully automatic procedure for the construction of histograms is proposed. It consists of constructing both a regular and an irregular histogram and then choosing between the two. To choose the number of bins in the irregular histogram, two different penalties motivated by recent work in model selection are proposed. A description of the algorithm and a proper tuning of the penalties are given. Finally, different versions of the procedure are compared to other existing proposals for a wide range of densities and sample sizes. In the simulations, the squared Hellinger risk of the new procedure is always at most twice as large as the risk of the best of the other methods. The procedure is implemented in an R package. Key words: irregular histogram, density estimation, penalized likelihood, dynamic programming
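    The regular-histogram half of such a procedure can be sketched compactly. The penalty D - 1 + log(D)^2.5 used below follows the Birgé–Rozenholc proposal common in this literature; whether it matches either of the paper's two penalties is an assumption.

    ```python
    import numpy as np

    def choose_regular_histogram(x, max_bins=40):
        """Select the number of equal-width bins D by maximising the
        histogram log-likelihood minus a model-selection penalty."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        lo, hi = x.min(), x.max()
        best_D, best_score = 1, -np.inf
        for D in range(1, max_bins + 1):
            counts, _ = np.histogram(x, bins=D, range=(lo, hi))
            nz = counts[counts > 0]
            # log-likelihood of the histogram density estimate
            loglik = np.sum(nz * np.log(nz * D / (n * (hi - lo))))
            score = loglik - (D - 1 + np.log(D) ** 2.5)
            if score > best_score:
                best_D, best_score = D, score
        return best_D
    ```

    The irregular case replaces this one-dimensional scan with a dynamic-programming search over bin boundaries, which is where the "dynamic programming" keyword comes in.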

    Growing time homogeneous neighborhoods for denoising and clustering dynamic contrast enhanced-ct sequences

    Following intravenous contrast injection, Dynamic Contrast Enhanced Computed Tomography (DCE-CT) gives access to tissue perfusion parameters. Unfortunately, safety concerns strongly limit the X-ray dose in DCE-CT, which produces noisy images hardly usable for direct evaluation of tissue enhancement at a spatial resolution that preserves spatial heterogeneity within tumors. Based on statistical multiple hypothesis testing, a new denoising algorithm for DCE-imaging sequences is proposed. Its main interest lies in preserving the enhancement structures typical of microvascular behaviors, which are important for diagnosis. This is achieved by mixing a spatially local approach for aggregation of voxels with a time-global statistical test procedure to separate the tissue dynamics. Applied to DCE-CT sequences, this new algorithm shows its capacity not only to preserve organ shapes but also to distinguish and denoise tissue enhancements even for small vessels or tumor structures. In a second step, using the denoised sequence, the same tests are used to build unsupervised and automatic tissue clustering. This clustering makes it possible to differentiate tissues, down to the pixel level, without any prior knowledge of their number.
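    The spatially-local, time-global aggregation idea can be sketched as follows (an illustrative simplification, not the authors' algorithm: only 4-neighbours are tested, the noise level sigma is assumed known, and the chi-square-type acceptance threshold is a made-up choice):

    ```python
    import numpy as np

    def denoise_dce(seq, sigma, z=3.0):
        """Toy time-homogeneous neighbourhood growing for a DCE sequence of
        shape (T, H, W): a neighbour's enhancement curve is aggregated only
        if a global-in-time test accepts that the two curves coincide."""
        T, H, W = seq.shape
        out = np.empty_like(seq, dtype=float)
        for i in range(H):
            for j in range(W):
                curves = [seq[:, i, j]]
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < H and 0 <= jj < W:
                        d = seq[:, ii, jj] - seq[:, i, j]
                        # chi-square-type statistic: ~1 when the curves coincide
                        stat = np.mean(d ** 2) / (2 * sigma ** 2)
                        if stat < 1 + z * np.sqrt(2.0 / T):
                            curves.append(seq[:, ii, jj])
                out[:, i, j] = np.mean(curves, axis=0)
        return out
    ```

    Because the test compares whole time curves rather than single frames, voxels with genuinely different enhancement dynamics are never averaged together, which is what preserves small vessels and tumor boundaries.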