
    Extreme Value Distribution Based Gene Selection Criteria for Discriminant Microarray Data Analysis Using Logistic Regression

    One important issue commonly encountered in the analysis of microarray data is deciding which and how many genes should be selected for further study. For discriminant microarray data analyses based on statistical models, such as logistic regression models, gene selection can be accomplished by comparing the maximum likelihood of the model given the real data, $\hat{L}(D|M)$, with the expected maximum likelihood of the model given an ensemble of surrogate data with randomly permuted labels, $\hat{L}(D_0|M)$. Typically, the computational burden of obtaining $\hat{L}(D_0|M)$ is immense, often exceeding the available computing resources by orders of magnitude. Here, we propose an approach that circumvents such heavy computations by mapping the simulation problem to an extreme-value problem. We present the derivation of an asymptotic distribution of the extreme value, as well as its mean, median, and variance. Using this distribution, we propose two gene selection criteria and apply them to two microarray datasets and three classification tasks for illustration. Comment: to be published in Journal of Computational Biology (2004)
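
    A minimal sketch (not the paper's method) of the brute-force permutation baseline that the extreme-value approximation is meant to replace: fit a single-gene logistic regression on the real labels to get $\hat{L}(D|M)$, then repeat on many randomly permuted label vectors to approximate the null distribution of $\hat{L}(D_0|M)$. The variable names, the toy data, and the number of permutations are illustrative assumptions.

    ```python
    # Hypothetical sketch of the label-permutation baseline; toy data only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def max_log_likelihood(x, y):
        """Maximum log-likelihood of a one-gene logistic regression model."""
        model = LogisticRegression(C=1e6, max_iter=1000)   # effectively unpenalized
        model.fit(x.reshape(-1, 1), y)
        p = model.predict_proba(x.reshape(-1, 1))[:, 1]
        eps = 1e-12
        return np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

    rng = np.random.default_rng(0)
    expression = rng.normal(size=100)          # toy expression values for one gene
    labels = rng.integers(0, 2, size=100)      # toy binary class labels

    ll_real = max_log_likelihood(expression, labels)

    # Brute-force null distribution via label permutation -- the costly step
    # whose extreme-value behaviour the paper characterizes analytically.
    ll_null = np.array([max_log_likelihood(expression, rng.permutation(labels))
                        for _ in range(200)])

    print("L_hat(D|M)    =", ll_real)
    print("null mean/max =", ll_null.mean(), ll_null.max())
    ```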

    Accuracy of Patient-Specific Organ Dose Estimates Obtained Using an Automated Image Segmentation Algorithm

    The overall goal of this work is to develop a rapid, accurate, and automated software tool that estimates patient-specific organ doses from computed tomography (CT) scans by combining simulation-generated dose maps with automated segmentation algorithms. This work quantified the accuracy of organ dose estimates obtained with an automated segmentation algorithm. We hypothesized that the autosegmentation algorithm is sufficiently accurate to provide organ dose estimates, since small errors in delineating organ boundaries should have minimal effect on the mean organ dose. A leave-one-out validation study of the automated algorithm was performed with 20 head-neck CT scans expertly segmented into nine regions. Mean organ doses of the automatically and expertly segmented regions were computed from Monte Carlo-generated dose maps and compared. The automated segmentation algorithm estimated the mean organ dose to within 10% of the expert segmentation for regions other than the spinal canal, with the median error for each organ region below 2%. In the spinal canal region, the median error was -7%, with a maximum absolute error of 28% for the single-atlas approach and 11% for the multiatlas approach. The results demonstrate that the automated segmentation algorithm can provide accurate organ dose estimates despite some segmentation errors.
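
    A minimal sketch of the comparison described above (not the authors' tool): given a Monte Carlo dose map and two binary masks for the same organ region, one automated and one expert, compute the mean organ doses and the relative error. The array shapes, dose values, and mask placements are toy assumptions.

    ```python
    # Toy comparison of mean organ dose from automated vs. expert segmentation.
    import numpy as np

    def mean_organ_dose(dose_map: np.ndarray, organ_mask: np.ndarray) -> float:
        """Mean dose over the voxels flagged by a binary organ mask."""
        return float(dose_map[organ_mask].mean())

    rng = np.random.default_rng(1)
    dose_map = rng.gamma(shape=2.0, scale=5.0, size=(64, 64, 64))  # toy dose map (mGy)

    expert_mask = np.zeros(dose_map.shape, dtype=bool)
    expert_mask[20:40, 20:40, 20:40] = True
    auto_mask = np.zeros(dose_map.shape, dtype=bool)
    auto_mask[21:41, 19:39, 20:40] = True      # slightly shifted organ boundary

    d_expert = mean_organ_dose(dose_map, expert_mask)
    d_auto = mean_organ_dose(dose_map, auto_mask)
    print(f"expert: {d_expert:.2f} mGy, auto: {d_auto:.2f} mGy, "
          f"error: {100 * (d_auto - d_expert) / d_expert:+.1f}%")
    ```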

    An Optimal Algorithm for Sliding Window Order Statistics

    Assume there is a data stream of elements and a window of size m. Sliding window algorithms compute various statistic functions over the last m elements of the data stream seen so far. The time complexity of a sliding window algorithm is measured as the time required to output an updated value of the statistic each time a new element is read. For example, it is well known that computing the sliding window maximum/minimum has time complexity O(1), while computing the sliding window median has time complexity O(log m). In this paper we close the gap between these two cases by (1) presenting an algorithm for computing the sliding window k-th smallest element in O(log k) time and (2) proving that this time complexity is optimal.
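
    For reference, here is a sketch of the classic monotonic-deque algorithm for the sliding window maximum, which achieves the O(1) amortized time per element mentioned above; the paper's O(log k) algorithm for the k-th smallest element is not reproduced here. Function and variable names are illustrative.

    ```python
    # Sliding window maximum in O(1) amortized time via a monotonic deque.
    from collections import deque
    from typing import Iterable, Iterator

    def sliding_window_max(stream: Iterable[int], m: int) -> Iterator[int]:
        """Yield the maximum of the last m elements after each new element,
        starting once the first full window has been read."""
        window = deque()                            # (index, value), values strictly decreasing
        for i, x in enumerate(stream):
            while window and window[-1][1] <= x:    # drop elements dominated by x
                window.pop()
            window.append((i, x))
            if window[0][0] <= i - m:               # evict the element that left the window
                window.popleft()
            if i >= m - 1:
                yield window[0][1]

    print(list(sliding_window_max([1, 3, 2, 5, 4, 1, 0, 6], 3)))
    # [3, 5, 5, 5, 4, 6]
    ```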

    An Efficient Approach of Removing the High Density Salt and Pepper Noise

    Images are often corrupted by impulse noise, also known as salt and pepper noise. Salt and pepper noise corrupts an image by forcing the affected pixels to either the maximum or the minimum gray level. Among existing approaches, the standard Median Filter (MF) has been established as a reliable method for removing salt and pepper noise without harming edge details. However, its major limitation is that it is effective only at low noise densities; when the noise level exceeds 50%, the edge details of the original image are no longer preserved. The Adaptive Median Filter (AMF) likewise performs well at low noise densities. In our proposed method, we first apply the Stationary Wavelet Transform (SWT) to the noisy image, which separates it into four subbands: LL, LH, HL, and HH. For the LL band, each 3x3 window is processed by reading its pixels and computing the minimum, maximum, and median values inside the window. Our algorithm then identifies the noisy and noise-free pixels inside the window and replaces the noisy pixels. The higher-frequency subbands are smoothed by soft thresholding. Finally, all coefficients are reconstructed with the inverse stationary wavelet transform. The performance of the proposed algorithm is tested at various levels of noise corruption and compared with standard filters, namely the Standard Median Filter (SMF) and the Weighted Median Filter (WMF). Our proposed method removes low- to medium-density impulse noise with detail preservation up to a noise density of 70%, and it gives better Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE) values.
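
    A simplified, hedged sketch of the 3x3 window decision rule described above, applied here directly to a plain image rather than to the SWT LL band: a pixel equal to the local minimum or maximum is treated as a salt/pepper impulse and replaced by the window median, while other pixels are left untouched. This illustrates the rule only, not the authors' full SWT-based pipeline; all names and the toy data are assumptions.

    ```python
    # Simplified impulse detection and median replacement in 3x3 windows.
    import numpy as np

    def denoise_3x3(image: np.ndarray) -> np.ndarray:
        padded = np.pad(image, 1, mode="edge")
        out = image.copy()
        h, w = image.shape
        for r in range(h):
            for c in range(w):
                win = padded[r:r + 3, c:c + 3]
                lo, hi, med = win.min(), win.max(), np.median(win)
                if image[r, c] == lo or image[r, c] == hi:   # suspected impulse
                    out[r, c] = int(med)
                # otherwise keep the (assumed noise-free) pixel unchanged
        return out

    rng = np.random.default_rng(2)
    clean = rng.integers(60, 200, size=(8, 8)).astype(np.uint8)
    noisy = clean.copy()
    mask = rng.random(clean.shape) < 0.2                       # ~20% impulse noise
    noisy[mask] = rng.choice(np.array([0, 255], dtype=np.uint8), size=int(mask.sum()))
    print(np.abs(denoise_3x3(noisy).astype(int) - clean.astype(int)).mean())
    ```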

    Generation of High Spatial Resolution Terrestrial Surface from Low Spatial Resolution Elevation Contour Maps via Hierarchical Computation of Median Elevation Regions

    We propose a simple yet effective morphological approach to convert a sparse Digital Elevation Model (DEM) into a dense one, a conversion analogous to generating a high-resolution DEM from a low-resolution DEM. The approach achieves this by generating median contours, through the following sequential steps: I) decompose the existing sparse contour map into the maximum possible number of Threshold Elevation Regions (TERs); II) hierarchically compute all possible non-negative, non-weighted Median Elevation Regions (MERs) between successive TERs decomposed from the sparse contour map; III) compute the gradients of all TERs and MERs obtained in the previous steps to yield the predicted intermediate elevation contours at a higher spatial resolution. We first present this approach on synthetic data to show how the contour prediction works, and then experiment with the available contour map of Washington, NH to demonstrate its usefulness. The approach takes into account the geometric information of the existing contours and interpolates elevation contours in new spatial regions of the topographic surface until no further elevation contours need to be generated. This novel approach is also very low-cost and robust, as it uses only elevation contours. Comment: 11 pages, 6 figures, 1 table, 1 algorithm
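
    As a hedged illustration only, and not the paper's exact MER construction: one simple way to obtain an intermediate ("median") region between two nested threshold elevation regions is morphological shape interpolation with signed distance fields, taking the region where the summed signed distances are non-negative. The synthetic circular regions and all names below are assumptions.

    ```python
    # Intermediate region between two nested threshold regions via signed distances.
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def signed_distance(region: np.ndarray) -> np.ndarray:
        """Positive inside the region, negative outside (in pixels)."""
        return distance_transform_edt(region) - distance_transform_edt(~region)

    def intermediate_region(outer: np.ndarray, inner: np.ndarray) -> np.ndarray:
        """Region roughly halfway (in the signed-distance sense) between nested regions."""
        return (signed_distance(outer) + signed_distance(inner)) >= 0

    # Two nested synthetic threshold elevation regions (e.g., >=100 m and >=200 m).
    yy, xx = np.mgrid[0:128, 0:128]
    outer = (xx - 64) ** 2 + (yy - 64) ** 2 <= 50 ** 2
    inner = (xx - 64) ** 2 + (yy - 64) ** 2 <= 20 ** 2
    mid = intermediate_region(outer, inner)      # roughly the radius-35 region
    print(outer.sum(), mid.sum(), inner.sum())
    ```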