
An Efficient Method for Deformable Segmentation of 3D US Prostate Images

By Yiqiang Zhan and Dinggang Shen

Abstract

We previously proposed a deformable model for automatic and accurate segmentation of the prostate boundary from 3D ultrasound (US) images by matching both prostate shapes and tissue textures in US images [6]. Textures were characterized by a Gabor filter bank and further classified by support vector machines (SVMs) in order to discriminate the prostate boundary in the US images. However, the tissue texture characterization and classification step is very slow, which impedes clinical application of the proposed approach. To overcome this limitation, we first implement the method in a three-level multi-resolution framework, and then replace SVM-based tissue classification and boundary identification with a Zernike moment-based edge detector at the low and middle resolutions, to capture boundary information quickly. At the high resolution, SVM-based tissue classification and boundary identification is retained for more accurate segmentation. However, the SVM remains slow for tissue classification, since it usually needs a large number of support vectors to construct a complicated separating hypersurface, owing to the high overlap between the texture features of prostate and non-prostate tissues in US images. To increase the efficiency of the SVM, a new training method is designed that effectively reduces the number of support vectors. Experimental results show that the proposed method is 10 times faster than the previous one, without losing any segmentation accuracy.
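
The sketch below illustrates the kind of Gabor-bank texture characterization plus SVM tissue classification the abstract describes, not the authors' actual implementation: it works on a single 2D slice rather than a 3D volume, uses scikit-image and scikit-learn, and the filter frequencies, orientation count, SVM hyperparameters, and randomly generated "labeled" data are all illustrative placeholders.

import numpy as np
from skimage.filters import gabor
from sklearn.svm import SVC

def gabor_texture_features(image, frequencies=(0.1, 0.2, 0.3), n_orientations=4):
    """Filter a 2D image with a small Gabor bank and return one
    magnitude-response feature vector per pixel."""
    responses = []
    for freq in frequencies:
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            real, imag = gabor(image, frequency=freq, theta=theta)
            responses.append(np.hypot(real, imag))
    # Shape: (n_pixels, n_filters)
    return np.stack(responses, axis=-1).reshape(-1, len(responses))

# Dummy data standing in for a labeled ultrasound slice: label 1 marks
# prostate tissue, 0 marks non-prostate tissue (placeholders only; real
# training data would come from manually delineated US images).
rng = np.random.default_rng(0)
image = rng.random((64, 64))
labels = rng.integers(0, 2, size=64 * 64)

features = gabor_texture_features(image)

# A kernel SVM separates the two texture classes; in the paper this
# classification drives boundary identification at the finest resolution.
clf = SVC(kernel='rbf', C=10.0, gamma='scale')
clf.fit(features, labels)
tissue_map = clf.predict(features).reshape(image.shape)
print("support vectors used:", clf.n_support_.sum())

The printed support-vector count is where the efficiency issue discussed in the abstract shows up: heavily overlapping texture classes force the SVM to keep many support vectors, which is what the proposed training method aims to reduce.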

Year: 2013
OAI identifier: oai:CiteSeerX.psu:10.1.1.323.6531
Provided by: CiteSeerX

