
    Segmentation algorithms of biomedical images: development and quantitative evaluation

    No full text
    The article presents a comparative analysis of biomedical image segmentation methods. The work discusses segmentation methods based on preliminary labeling and on spatial moments. The experimental results show that the developed methods achieve higher accuracy, as measured by signal-to-noise ratio, than currently known methods. Moreover, the authors have developed a quantitative evaluation algorithm for segmentation algorithms based on a metrical approach. The proposed research was developed within the state budget project "Hybrid Intelligent Information Technology Diagnosing of Precancerous Breast Cancer Based on Image Analysis" (state registration number 1016U002500)
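
    As a concrete illustration of metric-based evaluation of a segmentation result (the paper's own metrical approach is not detailed in the abstract), here is a minimal sketch using the common Dice and Jaccard overlap scores on hypothetical binary masks:

        # Minimal sketch: metric-based segmentation evaluation with overlap scores.
        # Masks below are hypothetical stand-ins for algorithm output / ground truth.
        import numpy as np

        def dice(seg, truth):
            """Dice coefficient between two binary masks."""
            seg, truth = seg.astype(bool), truth.astype(bool)
            inter = np.logical_and(seg, truth).sum()
            return 2.0 * inter / (seg.sum() + truth.sum())

        def jaccard(seg, truth):
            """Jaccard (intersection-over-union) score between two binary masks."""
            seg, truth = seg.astype(bool), truth.astype(bool)
            inter = np.logical_and(seg, truth).sum()
            return inter / np.logical_or(seg, truth).sum()

        truth = np.zeros((64, 64), dtype=bool)
        truth[16:48, 16:48] = True        # hypothetical ground-truth region
        seg = np.roll(truth, 3, axis=1)   # hypothetical segmentation, shifted 3 px
        print(f"Dice = {dice(seg, truth):.3f}, Jaccard = {jaccard(seg, truth):.3f}")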

    Face recognition based on curvelets, invariant moments features and SVM

    Get PDF
    Recent studies have highlighted face recognition methods. In this paper, a new algorithm is proposed for face recognition that combines the Fast Discrete Curvelet Transform (FDCvT) and Invariant Moments with a Support Vector Machine (SVM), which improves the rate of face recognition in various situations. The approach rests on two points: first, the curvelet transform is a multi-resolution method that can efficiently represent image edge discontinuities; second, invariant-moment analysis is a statistical method that provides translation, rotation, and scale invariance in the image. Furthermore, an SVM is employed to classify the face image based on the extracted features. This process is applied to the ORL and Yale databases to evaluate the performance of the suggested method. Experimental results show that the proposed method can compose efficient and reasonable face recognition features and obtain useful recognition accuracy, detecting both frontal and side-face states of persons to decrease the fault rate in production
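
    A minimal sketch of the invariant-moments-plus-SVM stage, assuming OpenCV and scikit-learn are available; the curvelet (FDCvT) step needs a dedicated fast discrete curvelet transform implementation and is omitted, and the face crops and labels below are random stand-ins for ORL/Yale data:

        # Sketch: Hu invariant moments as features, classified with an SVM.
        import cv2
        import numpy as np
        from sklearn.svm import SVC

        def hu_features(gray):
            """Seven Hu invariant moments, log-scaled for numerical stability."""
            hu = cv2.HuMoments(cv2.moments(gray)).flatten()
            return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

        # Hypothetical training data: grayscale face crops and person labels.
        rng = np.random.default_rng(0)
        faces = [rng.integers(0, 256, (112, 92)).astype(np.uint8) for _ in range(20)]
        labels = np.repeat(np.arange(4), 5)

        X = np.array([hu_features(f) for f in faces])
        clf = SVC(kernel="rbf").fit(X, labels)
        print(clf.predict(X[:5]))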

    A Monte Carlo Template based analysis for Air-Cherenkov Arrays

    Get PDF
    We present a high-performance event reconstruction algorithm: an Image Pixel-wise fit for Atmospheric Cherenkov Telescopes (ImPACT). The reconstruction algorithm is based on likelihood fitting of camera pixel amplitudes to an expected image template; a maximum likelihood fit is performed to find the best-fit shower parameters. A related reconstruction algorithm has already been shown to provide significant improvements over traditional reconstruction for both the CAT and H.E.S.S. experiments. We demonstrate a significant improvement to the template generation step of the procedure by using a full Monte Carlo air shower simulation in combination with a ray-tracing optics simulation to model the expected camera images more accurately. This reconstruction step is combined with an MVA-based background rejection. Examples are shown of the performance of the ImPACT analysis on both simulated and measured (from a strong VHE source) gamma-ray data from the H.E.S.S. array, demonstrating an improvement in sensitivity of more than a factor of two in observation time over traditional image moment-fitting methods, with comparable performance to previous likelihood fitting analyses. ImPACT is a particularly promising approach for future large arrays such as the Cherenkov Telescope Array (CTA) due to its improved high-energy performance and suitability for arrays of mixed telescope types. Comment: 13 pages, 10 figures
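
    A toy sketch of the pixel-wise template likelihood idea: a parameterized template predicts the expected amplitude in each camera pixel, and a Poisson negative log-likelihood is minimized over the shower parameters. The elongated-Gaussian "template" below is a hypothetical stand-in for the Monte Carlo image templates generated in the paper:

        # Sketch: fit camera pixel amplitudes to an expected image template
        # by maximum likelihood (Poisson noise model).
        import numpy as np
        from scipy.optimize import minimize

        yy, xx = np.mgrid[0:40, 0:40]

        def template(params):
            """Toy template: elongated Gaussian with position and amplitude."""
            x0, y0, amp = params
            return amp * np.exp(-((xx - x0) / 6.0) ** 2 - ((yy - y0) / 2.5) ** 2)

        rng = np.random.default_rng(1)
        image = rng.poisson(template((22.0, 17.0, 50.0))).astype(float)

        def neg_log_like(params):
            mu = template(params) + 1e-9            # expected pixel amplitudes
            return np.sum(mu - image * np.log(mu))  # Poisson NLL, up to a constant

        fit = minimize(neg_log_like, x0=(20.0, 20.0, 40.0), method="Nelder-Mead")
        print("best-fit parameters (x0, y0, amp):", fit.x)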

    CASE STUDY ON RE-ADJUSTMENTS DEPENDING ON PRICE MODIFICATION

    Get PDF
    Inflationary moments, characterized by significant price rises, have proved that accounting systems based on historical costs provide a distorted image of reality: the elements of the balance sheet are under-valuated, and the stock-related expenses and amortization in the profit and loss account are also under-valuated. Under these circumstances, the result is over-valuated, and its distribution leads to allotments from the company’s capital. In this paper we draw up a case study with regard to the methods used for adjusting for price modification, clearly outlining, through a comparative analysis, the main differences between the accounting system based on historical cost and inflation accounting
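
    A hypothetical worked example of the restatement idea: re-expressing a historical cost through a general price index and separating the real from the illusory part of the result (all figures invented):

        # Sketch: historical cost vs. inflation-adjusted (restated) result.
        historical_cost = 10_000.0    # acquisition cost, currency units
        index_at_purchase = 100.0     # general price index at acquisition
        index_at_reporting = 130.0    # general price index at balance-sheet date
        sale_price = 14_000.0

        restated_cost = historical_cost * index_at_reporting / index_at_purchase
        nominal_profit = sale_price - historical_cost   # historical-cost result
        real_profit = sale_price - restated_cost        # inflation-adjusted result
        illusory = nominal_profit - real_profit         # distributed from capital

        print(f"restated cost {restated_cost:,.0f}; nominal profit {nominal_profit:,.0f}; "
              f"real profit {real_profit:,.0f}; illusory part {illusory:,.0f}")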

    Direct Object Recognition Using No Higher Than Second or Third Order Statistics of the Image

    Get PDF
    Novel algorithms for object recognition are described that directly recover the transformations relating the image to its model. Unlike methods fitting the typical conventional framework, these new methods do not require exhaustive search for each feature correspondence in order to solve for the transformation, yet they allow simultaneous object identification and recovery of the transformation. Given hypothesized potentially corresponding regions in the model and data (2D views) --- which are from planar surfaces of the 3D objects --- these methods allow direct computation of the parameters of the transformation by which the data may be generated from the model. We propose two algorithms: one based on invariants derived from no higher than second and third order moments of the image, the other via a combination of the affine geometric properties and the differential attributes of the image. Empirical results on natural images demonstrate the effectiveness of the proposed algorithms. A sensitivity analysis of the algorithm is presented. We demonstrate in particular that the differential method is quite stable against perturbations --- although not without some error --- when compared with conventional methods. We also demonstrate mathematically that even a single point correspondence suffices, theoretically at least, to recover affine parameters via the differential method
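
    A brief sketch of the moment relations behind the direct method: under an affine map x' = Ax + t the region centroid transforms the same way, and the second-moment (covariance) matrix obeys C' = A C A^T; second moments alone fix A only up to a rotation, which is why third-order moments also enter. The points below are hypothetical samples of a planar region:

        # Sketch: verify the first- and second-moment relations under an affine map.
        import numpy as np

        rng = np.random.default_rng(0)
        model = rng.uniform(-1, 1, size=(500, 2))   # samples of a model region

        A = np.array([[1.2, 0.4], [-0.3, 0.9]])     # hypothetical affine part
        t = np.array([5.0, -2.0])                   # hypothetical translation
        data = model @ A.T + t                      # transformed (image) region

        # Centroid relation: mean' = A mean + t, so t is recovered once A is known.
        print("t recovered:", data.mean(axis=0) - A @ model.mean(axis=0))

        # Covariance relation: C' = A C A^T (residual should be ~0).
        Cm = np.cov(model, rowvar=False)
        Cd = np.cov(data, rowvar=False)
        print("C' - A C A^T:\n", Cd - A @ Cm @ A.T)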

    Shapelets "multiple multipole" shear measurement methods

    Get PDF
    The measurement of weak gravitational lensing is currently limited to a precision of ~10% by instabilities in galaxy shape measurement techniques and uncertainties in their calibration. The potential of large, on-going and future cosmic shear surveys will only be realised with the development of more accurate image analysis methods. We present a description of several possible shear measurement methods using the linear "shapelets" decomposition. Shapelets provides a complete reconstruction of any galaxy image, including higher-order shape moments that can be used to generalise the KSB method to arbitrary order. Many independent shear estimators can then be formed for each object, using linear combinations of shapelet coefficients. These estimators can be treated separately, to improve their overall calibration; or combined in more sophisticated ways, to eliminate various instabilities and a calibration bias. We apply several methods to simulated astronomical images containing a known input shear, and demonstrate the dramatic improvement in shear recovery using shapelets. A complete IDL software package to perform image analysis and manipulation in shapelet space can be downloaded from the shapelets web site at http://www.astro.caltech.edu/~rjm/shapelets/ . Comment: 6 pages, 2 figures. To be published in "Impact of Gravitational Lensing on Cosmology", IAU Symposium 225, eds. Mellier & Meylan
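
    A minimal sketch of a shapelet decomposition, assuming SciPy (the paper's own package is in IDL): 2-D basis functions are products of 1-D Gauss-Hermite functions, and coefficients come from projecting the image onto each basis function; the scale beta, maximum order, and toy image are hypothetical choices:

        # Sketch: project an image onto a Gauss-Hermite ("shapelet") basis.
        import numpy as np
        from math import factorial, pi, sqrt
        from scipy.special import eval_hermite

        def shapelet_1d(n, x, beta):
            """1-D shapelet basis function of order n and scale beta."""
            norm = 1.0 / sqrt(2**n * sqrt(pi) * factorial(n) * beta)
            return norm * eval_hermite(n, x / beta) * np.exp(-x**2 / (2 * beta**2))

        x = np.linspace(-8, 8, 128)
        dx = x[1] - x[0]
        X, Y = np.meshgrid(x, x)
        image = np.exp(-((X - 1) ** 2 + Y**2) / 4.0)   # toy "galaxy" image

        beta, nmax = 2.0, 4
        coeffs = np.zeros((nmax + 1, nmax + 1))
        for n1 in range(nmax + 1):
            for n2 in range(nmax + 1):
                basis = np.outer(shapelet_1d(n2, x, beta), shapelet_1d(n1, x, beta))
                coeffs[n1, n2] = np.sum(image * basis) * dx * dx  # projection
        print(np.round(coeffs, 3))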

    TEXTURAL ANALYSIS AND STATISTICAL INVESTIGATION OF PATTERNS IN SYNTHETIC APERTURE SONAR IMAGES

    Get PDF
    Textural analysis and statistical investigation of patterns in synthetic aperture sonar (SAS) images is useful for oceanographic purposes such as biological habitat mapping or bottom type identification for offshore construction. Seafloor classification also has many tactical benefits for the U.S. Navy in terms of mine identification and undersea warfare. Common methods of texture analysis rely on statistical moments of image intensity, or more generally, the probability density function of the scene. One of the most common techniques uses Haralick’s Grey Level Co-occurrence Matrix (GLCM) to calculate image features used in the applications listed above. Although widely used, seafloor classification and segmentation are difficult using Haralick features, which are typically calculated at a single scale. Improvements based on the understanding that patterns are multiscale were compared with this baseline, with the goal of improving seafloor classification. SAS data were provided by the Norwegian Defence Research Establishment for this work and were labeled into six distinct seafloor classes, with 757 total examples. We analyze the feature importance determined by neighborhood component analysis as a function of scale and direction to determine which spatial scale and azimuthal direction are most informative for good classification performance. Office of Naval Research, Arlington, VA 22217. Lieutenant, United States Navy. Approved for public release. Distribution is unlimited
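
    A minimal sketch of the single-scale GLCM baseline, assuming scikit-image: Haralick-style features are read off a grey-level co-occurrence matrix at one pixel offset in four azimuthal directions; a multiscale variant would repeat this over several distances. The input tile is a random stand-in for a SAS patch:

        # Sketch: Haralick-style GLCM texture features at a single scale.
        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(0)
        patch = rng.integers(0, 32, size=(64, 64), dtype=np.uint8)  # stand-in tile

        # Co-occurrence at distance 1 pixel, four azimuthal directions.
        glcm = graycomatrix(patch, distances=[1],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=32, symmetric=True, normed=True)

        for prop in ("contrast", "homogeneity", "energy", "correlation"):
            print(prop, np.round(graycoprops(glcm, prop).ravel(), 4))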

    What is the relationship between photospheric flow fields and solar flares?

    Full text link
    We estimated photospheric velocities by separately applying the Fourier Local Correlation Tracking (FLCT) and Differential Affine Velocity Estimator (DAVE) methods to 2708 co-registered pairs of SOHO/MDI magnetograms, with nominal 96-minute cadence and ~2" pixels, from 46 active regions (ARs) from 1996-1998 over the time interval t45 when each AR was within 45^o of disk center. For each magnetogram pair, we computed the average estimated radial magnetic field, B; and each tracking method produced an independently estimated flow field, u. We then quantitatively characterized these magnetic and flow fields by computing several extensive and intensive properties of each; extensive properties scale with AR size, while intensive properties do not depend directly on AR size. Intensive flow properties included moments of speeds, horizontal divergences, and radial curls; extensive flow properties included sums of these properties over each AR, and a crude proxy for the ideal Poynting flux, the total |u| B^2. Several magnetic quantities were also computed, including: total unsigned flux; a measure of the amount of unsigned flux near strong-field polarity inversion lines, R; and the total B^2. Next, using correlation and discriminant analysis, we investigated the associations between these properties and flares from the GOES flare catalog, when averaged over both t45 and shorter time windows of 6 and 24 hours. We found R and total |u| B^2 to be most strongly associated with flares; no intensive flow properties were strongly associated with flares. Comment: 57 pages, 13 figures; revised content; added URL to manuscript with higher-quality image
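
    A small sketch of the extensive/intensive split on hypothetical flow and field arrays, including the crude Poynting-flux proxy total |u| B^2 (array shapes, units, and value distributions are invented):

        # Sketch: intensive vs. extensive properties of estimated flow/field maps.
        import numpy as np

        rng = np.random.default_rng(2)
        ux, uy = rng.normal(0.0, 0.3, size=(2, 128, 128))  # flow components, km/s
        B = rng.normal(0.0, 150.0, size=(128, 128))        # radial field, G

        speed = np.hypot(ux, uy)

        # Intensive: independent of active-region size (moments of the distribution).
        mean_speed = speed.mean()

        # Extensive: sums over the region, e.g. unsigned flux and the proxy |u| B^2.
        total_unsigned_flux = np.abs(B).sum()
        poynting_proxy = np.sum(speed * B**2)

        print(f"mean |u| = {mean_speed:.3f}, total |u| B^2 = {poynting_proxy:.3e}")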

    Analysis of Observer Performance in Detecting Signals with Location Uncertainty for Regularized Tomographic Image Reconstruction

    Full text link
    Our goal is to optimize regularized image reconstruction methods for emission tomography with respect to the task of detecting small lesions in the reconstructed images. To reflect medical practice realistically, we consider the location of the lesion to be unknown. This location uncertainty significantly complicates the mathematical analysis of model observer performance. We consider model observers whose decisions are based on finding the maximum value of a local test statistic over all possible locations. Khurd and Gindi (SPIE 2004) and Qi and Huesman (SPIE 2004) described analytical approximations of the moments of the local test statistics and used Monte Carlo simulations to evaluate the localization performance of such "maximum observers". We propose here an alternative approach, where tail probability approximations developed by Adler (AAP 2000) facilitate analytical evaluation of the detection performance of these observers. We illustrate how these approximations can be used to evaluate the probability of detection (for low probability of false alarm operating points) for the maximum channelized Hotelling observer. Using our analyses, one can rank and optimize image reconstruction methods without requiring time-consuming Monte Carlo simulations. Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/85960/1/Fessler205.pd
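
    A toy sketch of the maximum-observer false-alarm calculation: correlated null statistics are simulated at many candidate lesion locations, and the Monte Carlo tail probability of their maximum is compared with a simple union bound (a crude stand-in for the sharper random-field tail approximations of Adler used in the paper):

        # Sketch: P(false alarm) for a "maximum observer" over many locations.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        n_loc, n_trials, thresh = 100, 20_000, 3.0

        # Correlated unit-variance null statistics (shared-noise toy model).
        shared = rng.normal(size=(n_trials, 1))
        local = rng.normal(size=(n_trials, n_loc))
        stats = 0.5 * shared + np.sqrt(0.75) * local

        p_fa_mc = np.mean(stats.max(axis=1) > thresh)   # Monte Carlo estimate
        p_fa_union = min(1.0, n_loc * norm.sf(thresh))  # union (Bonferroni) bound
        print(f"Monte Carlo {p_fa_mc:.4f} vs union bound {p_fa_union:.4f}")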