183 research outputs found
Colon centreline calculation for CT colonography using optimised 3D topological thinning
CT colonography is an emerging technique for colorectal cancer screening. This technique facilitates non-invasive imaging of the colon interior by generating virtual reality models of the colon lumen. Manual navigation through these models is a slow and tedious process. It is possible to automate navigation by calculating the centreline of the colon lumen. There are numerous well-documented approaches for centreline calculation. Many of these techniques have been developed as alternatives to 3D topological thinning, which has been discounted by others due to its computationally intensive nature. This paper describes a fully automated, optimised version of 3D topological thinning that has been specifically developed for calculating the centreline of the human colon.
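The paper's method is a 3D algorithm operating on voxel data; purely as an illustration of how iterative topological thinning peels a shape down to its centreline, here is a minimal 2D sketch using the classic Zhang-Suen scheme. The function name and test image are hypothetical, and the real colon-lumen version works with 26-connected 3D neighbourhoods rather than the 8-neighbourhoods shown here.

```python
def zhang_suen_thin(image):
    """Iteratively peel border pixels whose removal preserves connectivity,
    until the shape is one pixel thick (2D illustrative sketch)."""
    img = [row[:] for row in image]  # work on a copy
    h, w = len(img), len(img[0])

    def neighbours(y, x):
        # P2..P9, clockwise starting from the pixel above
        return [img[y-1][x], img[y-1][x+1], img[y][x+1], img[y+1][x+1],
                img[y+1][x], img[y+1][x-1], img[y][x-1], img[y-1][x-1]]

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_delete = []
            for y in range(1, h - 1):
                for x in range(1, w - 1):
                    if not img[y][x]:
                        continue
                    p = neighbours(y, x)
                    b = sum(p)  # number of object neighbours
                    # A(P1): 0 -> 1 transitions around the circular sequence
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1 for i in range(8))
                    if step == 0:
                        cond = p[0]*p[2]*p[4] == 0 and p[2]*p[4]*p[6] == 0
                    else:
                        cond = p[0]*p[2]*p[6] == 0 and p[0]*p[4]*p[6] == 0
                    if 2 <= b <= 6 and a == 1 and cond:
                        to_delete.append((y, x))
            for y, x in to_delete:
                img[y][x] = 0
                changed = True
    return img
```

Applied to a thick bar of foreground pixels, the routine erodes it symmetrically from both sides until only a thin medial line remains, which is the 2D analogue of extracting the colon centreline.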
A novel technique for reducing false positive detections in CAD-CTC
Computed tomography colonography (CTC) is an emerging alternative to conventional colonoscopy for colorectal cancer screening. A series of computer-assisted diagnosis (CAD) techniques have been developed for use in CTC. Although high levels of accuracy for polyp detection have been reported, the problem of excessive false positive detections still warrants attention. We present a CAD-CTC technique that has been developed specifically to reduce the number of false positive detections without compromising polyp detection accuracy. The technique incorporates a novel intermediate stage that restructures initial polyp candidates so that they conform more closely to the shape of actual polyps. The restructuring process causes false positives to expand to include more false positive characteristics, whereas actual polyps retain their original polyp-like characteristics. An evaluation of the documented technique demonstrated that it can be successfully applied to the majority of polyp candidates, and that its use can reduce the number of false positive detections by up to 57.8%.
A visual programming environment for machine vision engineers
This paper details a free image analysis and software development environment for machine vision application development. The environment provides high-level access to over 300 image manipulation, processing and analysis algorithms through a well-defined and easy-to-use graphical interface. Users can extend the core library via the developer's plug-in interface, which features automatic source code generation, compilation with full error feedback and dynamic algorithm updates. We also discuss key issues associated with the environment and outline the advantages of adopting such a system for machine vision application development.
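The plug-in mechanism itself is specific to the environment described; purely as an illustration of the general pattern it relies on (registering user algorithms against a common interface so they can be swapped or updated dynamically), a sketch might look like the following. All class and function names here are invented for the example and are not from the paper.

```python
class VisionPlugin:
    """Minimal interface a user-written algorithm must implement
    (hypothetical sketch, not the environment's actual API)."""
    name = "unnamed"

    def process(self, image):
        raise NotImplementedError

REGISTRY = {}

def register(plugin_cls):
    # dynamic algorithm update: re-registering a name replaces the old version
    REGISTRY[plugin_cls.name] = plugin_cls()
    return plugin_cls

@register
class Invert(VisionPlugin):
    """Example user extension: invert an 8-bit greyscale image."""
    name = "invert"

    def process(self, image):
        return [[255 - px for px in row] for row in image]

def run(name, image):
    """Dispatch an image through whichever plug-in is currently registered."""
    return REGISTRY[name].process(image)
```

Keeping the registry keyed by name is what makes live replacement possible: a recompiled plug-in simply re-registers under the same name and subsequent calls pick up the new code.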
Identification of body fat tissues in MRI data
In recent years non-invasive medical diagnostic techniques have been used widely in medical investigations. Among the various imaging modalities available, Magnetic Resonance Imaging (MRI) is very attractive as it produces multi-slice images where the contrast between various types of body tissue, such as muscle, ligaments and fat, is well defined. The aim of this paper is to describe the implementation of an unsupervised image analysis algorithm able to identify the body fat tissues from a sequence of MR images encoded in DICOM format. The developed algorithm consists of three main steps. The first step pre-processes the MR images in order to reduce the level of noise. The second step extracts the image areas representing fat tissues by using an unsupervised clustering algorithm. Finally, image refinements are applied to reclassify the pixels adjacent to the initial fat estimate and to eliminate outliers. The experimental data indicate that the proposed implementation returns accurate results and, furthermore, is robust to noise and to greyscale inhomogeneity.
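The abstract does not name the clustering algorithm used. Assuming something like a simple k-means on pixel intensities (fat appears bright on T1-weighted images), the second step could be sketched as follows; the cluster count, initialisation and the "brightest cluster is fat" rule are illustrative assumptions, not the paper's parameters.

```python
def kmeans_1d(values, k=3, iters=20):
    """Cluster scalar intensities into k groups (naive 1D k-means)."""
    lo, hi = min(values), max(values)
    # spread the initial centroids evenly across the intensity range
    centroids = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

def fat_mask(pixels, centroids):
    """Label a pixel as fat if it lies closest to the brightest centroid
    (illustrative rule: fat is bright on T1-weighted MRI)."""
    k = len(centroids)
    return [min(range(k), key=lambda i: abs(p - centroids[i])) == k - 1
            for p in pixels]
```

In the paper's pipeline the resulting mask would then feed the third step, where neighbouring pixels are reclassified and outliers removed.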
Rapid automated measurement of body fat distribution from whole-body MRI
The accurate determination of a person's total body fat is an important issue in medical analysis because obesity is a significant contributing factor to a variety of serious health problems. The medical literature identifies a wide range of diseases that are closely linked to obesity. Current methods of fat assessment are largely inaccurate, and most cannot show regional fat distribution, which is important in defining disease risk. We introduce a method that combines computer-aided techniques with whole-body MRI and enables accurate quantification and visualization of total body fat burden and regional fat distribution. This technique may be important in identifying and treating at-risk populations.
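As a toy illustration of how such a quantification might aggregate per-slice measurements into total and regional figures, consider the sketch below. The binary fat masks and the voxel volume are assumed inputs for the example, not details taken from the paper.

```python
def fat_volumes(fat_masks, voxel_volume_ml):
    """Total and per-slice fat volume from binary fat masks,
    one mask per axial MRI slice (hypothetical helper).

    fat_masks: list of 2D 0/1 grids, one per slice
    voxel_volume_ml: volume represented by a single voxel, in ml
    """
    per_slice = [sum(sum(row) for row in mask) * voxel_volume_ml
                 for mask in fat_masks]
    return sum(per_slice), per_slice
```

The per-slice breakdown is what makes a regional view possible: grouping contiguous slices by anatomical region (abdominal, thoracic, and so on) yields the regional distribution the abstract refers to.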
Comparing Formulations of Generalized Quantum Mechanics for Reparametrization-Invariant Systems
A class of decoherence schemes is described for implementing the principles of generalized quantum theory in reparametrization-invariant `hyperbolic' models such as minisuperspace quantum cosmology. The connection with sum-over-histories constructions is exhibited and the physical equivalence or inequivalence of different such schemes is analyzed. The discussion focuses on comparing constructions based on the Klein-Gordon product with those based on the induced (a.k.a. Rieffel, Refined Algebraic, Group Averaging, or Spectral Analysis) inner product. It is shown that the Klein-Gordon and induced products can be simply related for the models of interest. This fact is then used to establish isomorphisms between certain decoherence schemes based on these products.
Fast colon centreline calculation using optimised 3D topological thinning
Topological thinning can be used to accurately identify the central path through a computer model of the colon generated using computed tomography colonography. The central path can subsequently be used to simplify the task of navigation within the colon model. Unfortunately, standard topological thinning is an extremely inefficient process. We present an optimised version of topological thinning that significantly improves the performance of centreline calculation without compromising the accuracy of the result. This is achieved by using lookup tables to reduce the computational burden associated with the thinning process.
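The lookup-table idea can be illustrated in 2D: the deletability test in a thinning pass depends only on the local neighbourhood pattern, so it can be precomputed once for every possible pattern and reduced to a single table lookup at thinning time. Below is a sketch using the 8-neighbourhood conditions of Zhang-Suen-style 2D thinning (256 patterns); the paper's algorithm works with far larger 3D neighbourhoods, and the function name is illustrative.

```python
def build_luts():
    """Precompute deletability for all 256 8-neighbourhood patterns,
    one table per thinning sub-iteration (2D sketch of the idea).

    Bit i of the mask is neighbour P(i+2), clockwise from the pixel above.
    """
    lut = [[False] * 256, [False] * 256]
    for mask in range(256):
        p = [(mask >> i) & 1 for i in range(8)]
        b = sum(p)  # number of object neighbours
        # number of 0 -> 1 transitions around the circular sequence
        a = sum(p[i] == 0 and p[(i + 1) % 8] == 1 for i in range(8))
        if not (2 <= b <= 6 and a == 1):
            continue  # deleting this pixel would break or shorten the shape
        if p[0] * p[2] * p[4] == 0 and p[2] * p[4] * p[6] == 0:
            lut[0][mask] = True  # deletable in the first sub-iteration
        if p[0] * p[2] * p[6] == 0 and p[0] * p[4] * p[6] == 0:
            lut[1][mask] = True  # deletable in the second sub-iteration
    return lut
```

With the tables built, each thinning pass packs a pixel's neighbours into a bitmask and indexes the table, replacing the per-pixel counting and transition tests with one memory access; this is the same trade (precomputation for per-voxel speed) that the optimised 3D method exploits.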
- …