A Novel Euler's Elastica based Segmentation Approach for Noisy Images via using the Progressive Hedging Algorithm
Euler's elastica-based unsupervised segmentation models are strongly capable of completing the missing boundaries of objects in a clean image, but they do not work well on noisy images. This paper aims to establish a Euler's elastica-based approach that properly handles random noise to improve segmentation performance on noisy images. We solve the corresponding optimization problem using the progressive hedging algorithm (PHA) with a step length suggested by the alternating direction method of multipliers (ADMM). Technically, all the simplified convex versions of the subproblems derived from the main PHA framework can be obtained using the curvature-weighted approach and the convex relaxation method. An alternating optimization strategy is then applied, with the benefit of powerful accelerating techniques including the fast Fourier transform (FFT) and generalized soft-threshold formulas. Extensive experiments on both synthetic and real images validate the significant gains of the proposed segmentation models and demonstrate the advantages of the developed algorithm.
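As a concrete illustration of one building block named above, the generalized soft-threshold formula reduces, in its simplest scalar form, to the classic shrinkage operator that appears in ADMM-style splitting schemes. The sketch below is a minimal version of that operator under that assumption, not the paper's full generalized formula.

```python
import numpy as np

def soft_threshold(x, tau):
    """Element-wise soft-threshold (shrinkage) operator:
    the proximal map of tau * ||.||_1, i.e. sign(x) * max(|x| - tau, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

# Entries with magnitude below tau are zeroed; the rest shrink toward 0.
shrunk = soft_threshold(np.array([-2.0, 0.3, 1.5]), 1.0)
```

In ADMM-type solvers this operator is applied to the auxiliary split variable at each iteration, which is why it pairs naturally with FFT-based subproblem solves.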
B-spline snakes in two stages
With snake algorithms, convergence is slow because of the large number of control points that must be selected, as well as the difficulty of setting the weighting factors that make up the curve's internal energies. Even with B-spline snakes, the spline cannot be fitted completely into the corners of the object. In this paper, a novel two-stage method based on B-spline snakes is proposed; it is superior to previous B-spline snakes in both accuracy and convergence speed. The first stage reduces the number of control points by minimizing a potential function V(x,y), allowing the spline to quickly approach the minimum-energy state. The second stage refines the B-spline snake based on the node points of the polynomials without knots; in other words, an elastic spline is controlled by node points while the knots are fixed. Simulation and validation results are presented. Compared to traditional B-spline snakes, the proposed method achieves better performance.
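The curve model underlying B-spline snakes can be illustrated with the standard basis-matrix form of a uniform cubic B-spline segment, in which each point on the curve is a fixed blend of four consecutive control points. The sketch below is a generic textbook evaluation of that form, not the two-stage method itself.

```python
import numpy as np

def cubic_bspline_point(ctrl, t):
    """Evaluate one uniform cubic B-spline segment at parameter t in [0, 1].
    ctrl is a (4, 2) array of four consecutive (x, y) control points;
    M is the standard uniform cubic B-spline basis matrix."""
    M = np.array([[-1.0,  3.0, -3.0, 1.0],
                  [ 3.0, -6.0,  3.0, 0.0],
                  [-3.0,  0.0,  3.0, 0.0],
                  [ 1.0,  4.0,  1.0, 0.0]]) / 6.0
    T = np.array([t**3, t**2, t, 1.0])
    return T @ M @ ctrl

# At t = 0 the point is (P0 + 4*P1 + P2) / 6, so the curve does not
# interpolate its control points -- one reason plain B-spline snakes
# struggle to reach sharp corners, as noted in the abstract above.
```

Fewer control points mean fewer such segments to update per iteration, which is the lever the first stage of the proposed method pulls.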
A Frame Work for Parallel String Matching- A Computational Approach with Omega Model
Nowadays, the parallel string matching problem attracts many researchers because of its importance in information retrieval systems. Although it is easily stated and many simple algorithms perform very well in practice, numerous works have been published on the subject and research remains very active. In this paper we propose an omega parallel computing model for parallel string matching. Experimental results show that, on a multi-processor system, the omega-model implementation of the proposed parallel string matching algorithm can reduce string matching time by more than 40%.
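A common way to realize parallel string matching on a multi-processor system is to split the text into chunks, one per worker, extending each chunk by len(pattern) - 1 characters so matches spanning a chunk boundary are not lost. The sketch below illustrates that generic idea only; the chunking scheme and worker count are illustrative assumptions, not the omega model itself.

```python
from concurrent.futures import ThreadPoolExecutor

def find_in_chunk(args):
    """Find all occurrences of pattern whose start lies in [start, end),
    searching an extended window so boundary-spanning matches are caught."""
    text, pattern, start, end = args
    hi = min(end + len(pattern) - 1, len(text))
    hits, i = [], text.find(pattern, start, hi)
    while i != -1:
        hits.append(i)
        i = text.find(pattern, i + 1, hi)
    return hits

def parallel_find(text, pattern, workers=4):
    """Partition the text across workers and merge absolute match positions."""
    step = max(1, (len(text) + workers - 1) // workers)
    chunks = [(text, pattern, s, min(s + step, len(text)))
              for s in range(0, len(text), step)]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return sorted(h for hits in ex.map(find_in_chunk, chunks) for h in hits)
```

Because each match's start index falls in exactly one chunk's [start, end) range, the merged list contains no duplicates.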
On the segmentation and classification of hand radiographs
This research is part of a wider project to build predictive models of bone age from hand radiograph images. We examine ways of finding the outline of a hand from an X-ray as the first stage in segmenting the image into constituent bones. We assess a variety of algorithms including contouring, which has not previously been used in this context. We introduce a novel ensemble algorithm for combining outlines using two voting schemes, a likelihood ratio test and dynamic time warping (DTW). Our goal is to minimize the human intervention required, hence we investigate alternative ways of training a classifier to determine whether an outline is in fact correct or not. We evaluate outlining and classification on a set of 1370 images. We conclude that ensembling with DTW improves the performance of all outlining algorithms, that the contouring algorithm used with the DTW ensemble performs best of those assessed, and that the most effective classifier of hand outlines assessed is a random forest applied to outlines transformed into principal components.
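Dynamic time warping, used above to compare and combine candidate outlines, can be sketched with the standard O(nm) dynamic program over two sequences. This is the textbook DTW distance with an absolute-difference local cost, not the paper's ensemble voting scheme itself.

```python
import math

def dtw_distance(a, b):
    """DTW distance between two 1-D sequences: the minimum cumulative
    |x - y| cost over all monotone alignments, via dynamic programming."""
    n, m = len(a), len(b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of: insertion, deletion, or match.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Unlike a point-by-point Euclidean comparison, DTW tolerates local stretching and compression, which is why it suits outlines extracted by different algorithms at slightly different resolutions.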
Synthesis of Positron Emission Tomography (PET) Images via Multi-channel Generative Adversarial Networks (GANs)
Positron emission tomography (PET) image synthesis plays an important role: it can be used to augment the training data for computer-aided diagnosis systems. However, existing image synthesis methods have problems synthesizing low-resolution PET images. To address these limitations, we propose a multi-channel generative adversarial network (M-GAN) based PET image synthesis method. Unlike existing methods that rely on low-level features, the proposed M-GAN can represent features at a high semantic level based on the adversarial learning concept. In addition, M-GAN can take input from the annotation (label) to synthesize the high-uptake regions, e.g., tumors, and from the computed tomography (CT) images to constrain the appearance consistency, outputting the synthetic PET images directly. Our results on 50 lung cancer PET-CT studies indicate that our method is much closer to the real PET images than existing methods.

Comment: 9 pages, 2 figures
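The multi-channel input described above (a label map and a CT image fed jointly to the generator) can be sketched as a simple channel-stacking step. The shapes and min-max normalization below are illustrative assumptions for the sketch, not the paper's exact preprocessing pipeline.

```python
import numpy as np

def make_generator_input(label_map, ct_image):
    """Stack the annotation (label) channel and the CT channel into a
    single (2, H, W) multi-channel input for a generator network.
    The CT intensities are min-max scaled to [0, 1] (an assumption)."""
    ct = (ct_image - ct_image.min()) / (np.ptp(ct_image) + 1e-8)
    return np.stack([label_map.astype(np.float32),
                     ct.astype(np.float32)], axis=0)
```

Feeding both channels lets the label drive where high-uptake regions appear while the CT channel constrains the overall anatomical appearance, matching the two roles the abstract assigns to the inputs.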