27 research outputs found

    A new approach to automated retinal vessel segmentation using multiscale analysis

    Author name used in this publication: David Zhang. Refereed conference paper, 2006-2007 > Academic research: refereed > Refereed conference paper. Version of Record, Published.


    DeepNav: Joint View Learning for Direct Optimal Path Perception in Cochlear Surgical Platform Navigation

    Although much research has been conducted in the field of automated cochlear implant navigation, the problem remains challenging. Deep learning techniques have recently achieved impressive results in a variety of computer vision problems, raising expectations that they might be applied in other domains, such as identifying the optimal navigation zone (OPZ) in the cochlea. In this paper, a 2.5D joint-view convolutional neural network (2.5D CNN) is proposed and evaluated for the identification of the OPZ in the cochlear segments. The proposed network consists of two complementary sagittal and bird-view (top-view) networks for 3D OPZ recognition, each using a ResNet-8 architecture of 5 convolutional layers with rectified linear unit (ReLU) activations, followed by average pooling with a pool size equal to the size of the final feature maps. The last fully connected layer of each network has 4 outputs, corresponding to the indicators considered: the distances to the adjacent left and right walls, the collision probability, and the heading angle. The 2.5D CNN was trained using a parametric data-generation model and then evaluated using anatomically constructed cochlea models from micro-CT images of different cases. Prediction of the indicators demonstrates the effectiveness of the 2.5D CNN; for example, the heading angle has less than 1° error with a computation delay of less than 1 millisecond.
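The joint-view fusion described above can be sketched in numpy. This is a minimal illustration of the output head only: global average pooling of each branch's final feature map, concatenation, and a fully connected mapping to the 4 indicators. The feature sizes, weights, and function names are hypothetical, and the convolutional layers themselves are omitted.

```python
import numpy as np

def global_avg_pool(feature_map):
    """Average pooling with pool size equal to the spatial extent of
    the feature map: (C, H, W) -> (C,)."""
    return feature_map.mean(axis=(1, 2))

def joint_view_head(sagittal_feat, birdview_feat, W, b):
    """Fuse the pooled features of the two view branches and map them
    linearly to the 4 indicators (left-wall distance, right-wall
    distance, collision probability, heading angle)."""
    fused = np.concatenate([global_avg_pool(sagittal_feat),
                            global_avg_pool(birdview_feat)])
    return W @ fused + b

# Toy example: 16-channel 8x8 final feature maps from each branch.
rng = np.random.default_rng(0)
sag = rng.standard_normal((16, 8, 8))
bird = rng.standard_normal((16, 8, 8))
W = rng.standard_normal((4, 32))   # 4 indicators, 32 fused features
b = np.zeros(4)
indicators = joint_view_head(sag, bird, W, b)
print(indicators.shape)  # (4,)
```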

    Segmentation, registration, and selective watermarking of retinal images

    In this dissertation, I investigated some fundamental issues related to medical image segmentation, registration, and watermarking. I used color retinal fundus images to perform my study because of the rich representation of different objects (blood vessels, microaneurysms, hemorrhages, exudates, etc.) that are pathologically important and have close resemblance in shapes and colors. To attack this complex subject, I developed a divide-and-conquer strategy to address related issues step by step and to optimize the parameters of the different algorithm steps. Most, if not all, objects in our discussion are related. The algorithms for detection, registration, and protection of different objects need to consider how to differentiate the foreground from the background and be able to correctly characterize the features of the image objects and their geometric properties. To address these problems, I characterized the shapes of blood vessels in retinal images and proposed algorithms to extract their features. A tracing algorithm was developed for the detection of blood vessels along the vascular network. Due to noise interference and varying image quality, robust segmentation techniques were used for accurate characterization and verification of object shapes. Based on the segmentation results, a registration algorithm was developed which uses the bifurcation and cross-over points of blood vessels to establish the correspondence between the images and derive the transformation that aligns them. A Region-of-Interest (ROI) based watermarking scheme was proposed for image authentication. It uses linear segments extracted from the image as reference locations for embedding and detecting the watermark. Global and locally-randomized synchronization schemes were proposed for bit-sequence synchronization of the watermark. The scheme is robust against common image processing and geometric distortions (rotation and scaling), and it can detect alterations such as moving or removing image content.
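The registration step described above, deriving an aligning transformation from matched bifurcation points, can be sketched with the standard closed-form least-squares similarity fit (Umeyama's method). This is a generic illustration, not the dissertation's exact algorithm; the point coordinates below are invented stand-ins for bifurcation-point correspondences.

```python
import numpy as np

def estimate_similarity(src, dst):
    """Least-squares similarity transform (scale s, rotation R,
    translation t) such that dst_i ~ s * R @ src_i + t
    (Umeyama's closed-form solution)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    sc, dc = src - mu_s, dst - mu_d
    cov = dc.T @ sc / len(src)          # cross-covariance of the point sets
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(2)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[1, 1] = -1.0                  # guard against reflections
    R = U @ S @ Vt
    var_s = (sc ** 2).sum() / len(src)
    s = np.trace(np.diag(D) @ S) / var_s
    t = mu_d - s * R @ mu_s
    return s, R, t

# Hypothetical bifurcation-point correspondences between two retinal images.
src = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [7.0, 5.0], [3.0, 8.0]])
theta = np.deg2rad(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = 1.2 * src @ R_true.T + np.array([5.0, -3.0])
s, R, t = estimate_similarity(src, dst)
print(round(s, 3))  # 1.2
```

With noise-free correspondences the known scale, rotation, and translation are recovered exactly; with noisy detections the same formula gives the least-squares fit.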

    Blood vessel detection in retinal images and its application in diabetic retinopathy screening

    In this dissertation, I investigated computing algorithms for automated retinal blood vessel detection. Changes in blood vessel structures are important indicators of many diseases, such as diabetes and hypertension. Blood vessel maps are also very useful for tracking disease progression and for biometric authentication. In this dissertation, I proposed two algorithms to detect blood vessel maps in the retina. The first algorithm is based on the integration of a Gaussian tracing scheme and a Gabor-variance filter. This algorithm traces the large blood vessels in retinal images enhanced with adaptive histogram equalization; small vessels are traced on images further enhanced by a Gabor-variance filter. The second algorithm, called the radial contrast transform (RCT) algorithm, converts the intensity information in the spatial domain to a high-dimensional radial contrast domain. Different feature descriptors are designed to improve the speed, sensitivity, and expandability of the vessel detection system. Performance comparisons of the two algorithms with those in the literature show favorable and robust results. Furthermore, a new performance measure based on the central line of blood vessels is proposed for more reliable assessment of detection schemes for small vessels, because the significant variations at the edges of small vessels need not be considered. The proposed algorithms were successfully tested in the field for early diabetic retinopathy (DR) screening. A highly modular code library that takes advantage of the parallel processing power of multi-core computer architectures was tested in a clinical trial. Performance results showed that our scheme can achieve similar or even better performance than human expert readers for the detection of microaneurysms on difficult images.
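The Gabor-based enhancement mentioned above relies on oriented filters that respond strongly to elongated structures such as vessels. A minimal sketch of constructing one real Gabor kernel in numpy is shown below; the parameter values are hypothetical, and the variance part of the dissertation's Gabor-variance filter is omitted.

```python
import numpy as np

def gabor_kernel(size, sigma, wavelength, theta):
    """Real 2D Gabor kernel: a Gaussian envelope modulated by a cosine
    oriented at angle theta. Elongated structures (e.g. vessels)
    aligned with theta produce a strong filter response."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    k = envelope * carrier
    return k - k.mean()        # zero mean, so flat regions respond ~0

k = gabor_kernel(size=15, sigma=3.0, wavelength=8.0, theta=0.0)
print(k.shape)  # (15, 15)
```

In practice a bank of such kernels at several orientations is convolved with the image and the maximum (or variance) of the responses is taken per pixel.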

    Dendrite Tracking in Microscopic Images using Minimum Spanning Trees and Localized E-M

    We describe in this document our preliminary results regarding the tracking of dendrites spreading from a neuron in confocal microscope images. When using a small number of image layers, we obtain good results by combining an EM-based local estimate of the probability that an image pixel belongs to a neuron filament with the global tree properties of the complete set of dendrites. The optimal tree is obtained with a modified minimum-spanning-tree procedure. We will argue that this approach extends naturally to the complete data volume and should give even better results.
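The tree-construction step can be illustrated with plain Kruskal's algorithm over candidate links between filament points. This is a minimal sketch, not the paper's modified procedure; the graph and the filament probabilities below are invented for illustration, with edge weights taken as negative log-probabilities so that high-probability links are preferred.

```python
import numpy as np

def kruskal_mst(n_nodes, edges):
    """Minimum spanning tree via Kruskal's algorithm with union-find.
    edges: list of (weight, u, v); returns the list of tree edges."""
    parent = list(range(n_nodes))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    tree = []
    for w, u, v in sorted(edges):           # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                        # edge creates no cycle
            parent[ru] = rv
            tree.append((u, v))
    return tree

# Toy graph: hypothetical EM filament probabilities for candidate links.
probs = {(0, 1): 0.9, (1, 2): 0.8, (0, 2): 0.1, (2, 3): 0.7}
edges = [(-np.log(p), u, v) for (u, v), p in probs.items()]
mst = kruskal_mst(4, edges)
print(sorted(mst))  # [(0, 1), (1, 2), (2, 3)]
```

The weak (0, 2) link is rejected because it would close a cycle, which mirrors how a global tree constraint can prune spurious local detections.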

    Quality and Utility: On the Use of Time-Value Functions for Integrating Quality- and Time-Flexible Aspects in a Dynamic Real-Time Scheduling Environment

    Scheduling methodologies for real-time applications have been of keen interest to diverse research communities for several decades. Depending on the application area, algorithms have been developed that are tailored to specific requirements with respect to both the individual components of which an application is made up and the computational platform on which it is to be executed. Many real-time scheduling algorithms base their decisions solely or partly on timing constraints expressed by deadlines, which must be met even under worst-case conditions. The increasing complexity of computing hardware means that worst-case execution-time analysis becomes increasingly pessimistic. Scheduling hard real-time computations according to their worst-case execution times (which is common practice) will thus result, on average, in an increasing amount of spare capacity. The main goal of flexible real-time scheduling is to exploit this otherwise wasted capacity. Flexible scheduling schemes have been proposed to increase the ability of a real-time system to adapt to changing requirements and nondeterminism in application behaviour. These models can be categorised into those whose source of flexibility is the quality of computations and those which are flexible regarding their timing constraints. This work describes a novel model which allows both flexible timing constraints and quality profiles to be specified for an application. Furthermore, it demonstrates the applicability of this specification method to real-world examples and suggests a set of feasible scheduling algorithms for the proposed problem class.
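As a rough illustration of the time-value idea (not the thesis's actual model), a time-value function assigns to each possible completion time the value the result still has, and a scheduler can use it to exploit spare capacity. The function shape, job parameters, and greedy policy below are all simplifying assumptions.

```python
def time_value(finish, deadline, max_value, decay):
    """One simple time-value function shape: full value up to the
    deadline, then a linear decay toward zero."""
    if finish <= deadline:
        return max_value
    return max(0.0, max_value - decay * (finish - deadline))

def greedy_schedule(jobs):
    """Repeatedly run whichever remaining job would accrue the most
    value if started now. jobs: name -> (wcet, deadline, max_value, decay)."""
    t, order, total = 0.0, [], 0.0
    remaining = dict(jobs)
    while remaining:
        best = max(remaining,
                   key=lambda j: time_value(t + remaining[j][0],
                                            *remaining[j][1:]))
        wcet, deadline, max_value, decay = remaining.pop(best)
        t += wcet
        total += time_value(t, deadline, max_value, decay)
        order.append(best)
    return order, total

# Two hypothetical jobs: A loses value quickly past t=2, B is tolerant.
jobs = {"A": (2.0, 2.0, 10.0, 5.0),
        "B": (1.0, 4.0, 6.0, 1.0)}
order, total = greedy_schedule(jobs)
print(order, total)  # ['A', 'B'] 16.0
```

Running A first preserves its steeply decaying value while B, whose value decays slowly, still finishes within its deadline.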

    Rapid 3D Tracing of the Mouse Brain Neurovasculature with Local Maximum Intensity Projection and Moving Windows

    Neurovascular models have played an important role in understanding neuronal function and medical conditions. In the past few decades, only small volumes of neurovascular data have been available. However, huge data sets are becoming available with high-throughput instruments such as the Knife-Edge Scanning Microscope (KESM), so fast and robust tracing methods are necessary for such large data sets. Most tracing methods, however, are not effective in handling complex structures such as branches; some methods can solve this issue, but they are not computationally efficient (i.e., slow). Motivated by the issues of speed and robustness, I introduce an effective and efficient fiber tracing algorithm for 2D and 3D data. In 2D tracing, I have implemented a Moving Window (MW) method, which leads to a mathematical simplification and noise robustness in determining the trace direction and provides enhanced handling of branch points. During tracing, a Cubic Tangential Trace Spline (CTTS) is used as an accurate and fast nonlinear interpolation approach. For 3D tracing, I have designed a method based on local maximum intensity projection (MIP). MIP allows any existing 2D tracing algorithm to be used for 3D tracing and can significantly reduce the search space. However, most neurovascular data are too complex to apply MIP directly on a large scale; therefore, we use MIP within a limited cube to get unambiguous projections and repeat the MIP-based approach over the entire data set. For processing large amounts of data, the tracing algorithms must be automated. Since the automated algorithms may not be 100 percent correct, validation is needed. I validated my approach by comparing the traced results to human-labeled ground truth, showing that the results of my approach are very similar to the ground truth. However, this validation is limited to small-scale real-world data due to the limitations of manual labeling. Therefore, for large-scale data, I validated my approach using a model-based generator. The results suggest that my approach can also be used for large-scale real-world data. The main contributions of this research are as follows. My 2D tracing algorithm is fast enough to analyze large volumes of biological data, with processing time linear in fiber length, and is good at handling branches. The new local-MIP approach for 3D tracing provides significant performance improvement and allows the reuse of any existing 2D tracing method. The model-based generator enables tracing algorithms to be validated on large-scale real-world data. My approach is widely applicable for rapid and accurate tracing of large amounts of biomedical data.
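The local-MIP idea, projecting only a limited cube so that any 2D tracer can run on the result, can be sketched in numpy. The volume, the bright "fiber", and the window size below are toy assumptions for illustration.

```python
import numpy as np

def local_mip(volume, center, half):
    """Maximum intensity projection (MIP) of a cubic subvolume: the cube
    of half-width `half` around `center` is projected along z by taking
    the per-pixel maximum, reducing the 3D search space to a 2D image."""
    z, y, x = center
    cube = volume[max(z - half, 0):z + half + 1,
                  max(y - half, 0):y + half + 1,
                  max(x - half, 0):x + half + 1]
    return cube.max(axis=0)

# Toy volume containing a bright diagonal "fiber" through the y=4 plane.
vol = np.zeros((9, 9, 9), dtype=np.float32)
for i in range(9):
    vol[i, 4, i] = 1.0

mip = local_mip(vol, center=(4, 4, 4), half=2)
print(mip.shape)  # (5, 5)
```

Within the cube the fiber projects to a single bright line in the 2D image, which a 2D tracer can follow; repeating this over shifted cubes covers the whole data set.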