7 research outputs found

    Optic nerve head three-dimensional shape analysis

    We present a method for 3-D shape analysis of the optic nerve head (ONH) from retinal optical coherence tomography (OCT). The ability to noninvasively acquire high-resolution 3-D volumes of the ONH in vivo using spectral-domain OCT drives the need for tools that quantify the shape of this structure and extract information for clinical applications. The presented method automatically generates a 3-D ONH model and then computes several 3-D parameters describing the ONH. The method takes a high-resolution OCT volume scan as input. From this scan, the model-defining inner limiting membrane (ILM) as the inner surface and the retinal pigment epithelium as the outer surface are segmented, and the Bruch's membrane opening (BMO) is detected as the model origin. Based on the ONH model generated by triangulated 3-D surface reconstruction, different parameters (areas, volumes, annular surface ring, minimum distances) of different ONH regions can then be computed. Additionally, the bending energy (roughness) of the ILM surface in the BMO region and the 3-D BMO minimum rim width (BMO-MRW) surface area are computed. We show that our method is reliable and robust across a large variety of ONH topologies (specific to this structure) and present a first clinical application.
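Two of the parameter families named above, minimum distances and surface areas on a triangulated model, can be sketched in a few lines. This is an illustrative sketch only, not the paper's implementation: the array layouts and the brute-force nearest-point search for the minimum rim width are assumptions.

```python
import numpy as np

def bmo_mrw(bmo_points, ilm_points):
    """For each BMO point, the minimum 3-D distance to the ILM surface
    (a brute-force stand-in for a minimum-rim-width measure).

    bmo_points: (N, 3) array of Bruch's membrane opening points.
    ilm_points: (M, 3) array of points sampled on the inner limiting membrane.
    Returns an (N,) array of minimum distances, one per BMO point.
    """
    diff = bmo_points[:, None, :] - ilm_points[None, :, :]   # (N, M, 3)
    dist = np.linalg.norm(diff, axis=2)                      # (N, M)
    return dist.min(axis=1)

def triangle_surface_area(vertices, faces):
    """Total area of a triangulated surface patch.

    vertices: (V, 3) array of 3-D vertex coordinates.
    faces:    (F, 3) integer array indexing into vertices.
    """
    a = vertices[faces[:, 1]] - vertices[faces[:, 0]]
    b = vertices[faces[:, 2]] - vertices[faces[:, 0]]
    # Half the norm of the cross product is the area of each triangle.
    return 0.5 * np.linalg.norm(np.cross(a, b), axis=1).sum()
```

Real pipelines would search distances against the triangulated surface rather than a point sample, but the geometry of the computation is the same.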

    New methods for post-processing and analysis of retinal optical coherence tomography images in neurological diseases

    Many neurological diseases cause changes in the retina that can be visualized using optical coherence tomography (OCT). This produces large amounts of image data whose evaluation is time-consuming and requires medically trained personnel. This dissertation aims to develop new methods for preprocessing and analyzing retinal OCT data in order to improve outcome parameters for clinical studies and diagnostic markers for neurological diseases. To this end, methods were developed for the regions of two retinal landmarks: the optic nerve head (ONH) and the macula. For the ONH, an automatic segmentation method based on active contours was developed that allows accurate segmentation of the inner limiting membrane even for complex topographies. For the macular region, an intraretinal layer segmentation pipeline was developed, spanning image-data selection, automatic segmentation, manual post-correction, and the output of different layer thicknesses in tabular form. For both regions, several programs were developed that share a common basis for processing OCT data. One of these programs offers a graphical user interface for manual processing of image data. Parts of the reference data were created manually with this software; in addition, the inner limiting membrane of the ONH was segmented automatically, and intraretinal segmentations were conveniently post-corrected. This enabled automatic evaluation of morphological parameters of the ONH, some of which showed significant differences between patients with neurological diseases and healthy controls. Furthermore, the layer segmentation pipeline was used to build a normative database and in a study of the relationship between retinal damage and the critical flicker frequency. Part of the software was released as free and open-source software (FOSS), and the normative data set was released for use in other studies. Both are already being used in further studies and will simplify future studies as well as support the development of new methods.
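The last step of the macular pipeline, turning two segmented surfaces into layer thicknesses in tabular form, can be sketched as below. The array layout, the column names, and the axial resolution are illustrative assumptions, not the dissertation's actual data model.

```python
import numpy as np

def layer_thickness_table(upper, lower, axial_res_um, layer="RNFL"):
    """Per-B-scan thickness statistics of one retinal layer, as table rows.

    upper, lower: 2-D arrays (B-scans x A-scans) of surface heights in pixels,
    as a layer segmentation might produce them (lower lies deeper than upper).
    axial_res_um: axial resolution of the OCT device in micrometers per pixel.
    """
    thickness_um = (lower - upper) * axial_res_um
    rows = []
    for b in range(thickness_um.shape[0]):
        rows.append({
            "layer": layer,
            "bscan": b,
            "mean_um": float(thickness_um[b].mean()),
            "min_um": float(thickness_um[b].min()),
            "max_um": float(thickness_um[b].max()),
        })
    return rows
```

Rows in this shape can be written out directly with the standard `csv` module, which is all that "tabular output" requires.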

    Automatic segmentation of DNA fiber images for the quantification of replicative stress

    La rĂ©plication de l’ADN est un processus complexe gĂ©rĂ© par une multitude d’interactions molĂ©culaires permettant une transmission prĂ©cise de l’information gĂ©nĂ©tique de la cellule mĂšre vers les cellules filles. Parmi les facteurs pouvant porter atteinte Ă  la fidĂ©litĂ© de ce processus, on trouve le Stress RĂ©plicatif. Il s’agit de l’ensemble des phĂ©nomĂšnes entraĂźnant le ralentissement voire l’arrĂȘt anormal des fourches de rĂ©plication. S’il n’est pas maĂźtrisĂ©, le stress rĂ©plicatif peut causer des ruptures du double brin d’ADN ce qui peut avoir des consĂ©quences graves sur la stabilitĂ© du gĂ©nome, la survie de la cellule et conduire au dĂ©veloppement de cancers, de maladies neurodĂ©gĂ©nĂ©ratives ou d’autres troubles du dĂ©veloppement. Il existe plusieurs techniques d’imagerie de l’ADN par fluorescence permettant l’évaluation de la progression des fourches de rĂ©plication au niveau molĂ©culaire. Ces techniques reposent sur l’incorporation d’analogues de nuclĂ©otides tels que chloro- (CldU), iodo- (IdU), ou bromo-deoxyuridine (BrdU) dans le double brin en cours de rĂ©plication. L’expĂ©rience la plus classique repose sur l’incorporation successive de deux types d’analogues de nuclĂ©otides (IdU et CldU) dans le milieu cellulaire. Une fois ces nuclĂ©otides exogĂšnes intĂ©grĂ©s dans le double brin de l’ADN rĂ©pliquĂ©, on lyse les cellules et on rĂ©partit l’ADN sur une lame de microscope. Les brins contenant les nuclĂ©otides exogĂšnes peuvent ĂȘtre imagĂ©s par immunofluorescence. L’image obtenue est constituĂ©e de deux couleurs correspondant Ă  chacun des deux types d’analogues de nuclĂ©otides. La mesure des longueurs de chaque section fluorescente permet la quantification de la vitesse de progression des fourches de rĂ©plication et donc l’évaluation des effets du stress rĂ©plicatif. La mesure de la longueur des fibres fluorescentes d’ADN est gĂ©nĂ©ralement rĂ©alisĂ©e manuellement. 
Cette opĂ©ration, en plus d’ĂȘtre longue et fastidieuse, peut ĂȘtre sujette Ă  des variations inter- et intra- opĂ©rateurs provenant principalement de dĂ©fĂ©rences dans le choix des fibres. La dĂ©tection des fibres d’ADN est difficile car ces derniĂšres sont souvent fragmentĂ©es en plusieurs morceaux espacĂ©s et peuvent s’enchevĂȘtrer en agrĂ©gats. De plus, les fibres sont parfois difficile Ă  distinguer du bruit en arriĂšre-plan causĂ© par les liaisons non-spĂ©cifiques des anticorps fluorescents. MalgrĂ© la profusion des algorithmes de segmentation de structures curvilignes (vaisseaux sanguins, rĂ©seaux neuronaux, routes, fissures sur bĂ©ton...), trĂšs peu de travaux sont dĂ©diĂ©s au traitement des images de fibres d’ADN. Nous avons mis au point un algorithme intitulĂ© ADFA (Automated DNA Fiber Analysis) permettant la segmentation automatique des fibres d’ADN ainsi que la mesure de leur longueur respective. Cet algorithme se divise en trois parties : (i) Une extraction des objets de l’image par analyse des contours. Notre mĂ©thode de segmentation des contours se basera sur des techniques classiques d’analyse du gradient de l’image (Marr-Hildreth et de Canny). (ii) Un prolongement des objets adjacents afin de fusionner les fibres fragmentĂ©es. Nous avons dĂ©veloppĂ© une mĂ©thode de suivi (tracking) basĂ©e sur l’orientation et la continuitĂ© des objets adjacents. (iii) Une dĂ©termination du type d’analogue de nuclĂ©otide par comparaison des couleurs. Pour ce faire, nous analyserons les deux canaux (vert et rouge) de l’image le long de chaque fibre. Notre algorithme a Ă©tĂ© testĂ© sur un grand nombre d’images de qualitĂ© variable et acquises Ă  partir de diffĂ©rents contextes de stress rĂ©plicatif. La comparaison entre ADFA et plusieurs opĂ©rateurs humains montre une forte adĂ©quation entre les deux approches Ă  la fois Ă  l’échelle de chaque fibre et Ă  l’échelle plus globale de l’image. 
La comparaison d’échantillons soumis ou non soumis Ă  un stress rĂ©plicatif a aussi permis de valider les performances de notre algorithme. Enfin, nous avons Ă©tudiĂ© l’impact du temps d’incubation du second analogue de nuclĂ©otide sur les rĂ©sultats de l’algorithme. Notre algorithme est particuliĂšrement efficace sur des images contenant des fibres d’ADN relativement courtes et peu fractionnĂ©es. En revanche, notre mĂ©thode de suivi montre des limites lorsqu’il s’agit de fusionner correctement de longues fibres fortement fragmentĂ©es et superposĂ©es Ă  d’autres brins. Afin d’optimiser les performances d’ADFA, nous recommandons des temps d’incubation courts (20 Ă  30 minutes) pour chaque analogue de nuclĂ©otide dans le but d’obtenir des fibres courtes. Nous recommandons aussi de favoriser la dilution des brins sur la lame de microscope afin d’éviter la formation d’agrĂ©gats de fibres difficiles Ă  distinguer. ADFA est disponible en libre accĂšs et a pour vocation de servir de rĂ©fĂ©rence pour la mesure des brins d’ADN afin de pallier les problĂšmes de variabilitĂ©s inter-opĂ©rateurs.----------ABSTRACTDNA replication is tightly regulated by a great number of molecular interactions that ensure accurate transmission of genetic information to daughter cells. Replicative Stress refers to all the processes undermining the fidelity of DNA replication by slowing down or stalling DNA replication forks. Indeed, stalled replication forks may “collapse” into highly-genotoxic double strand breaks (DSB) which engender chromosomal rearrangements and genomic instability. Thus, replicative stress can constitute a critical determinant in both cancer development and treatment. Replicative stress is also implicated in the molecular pathogenesis of aging and neurodegenerative disease, as well as developmental disorders. Several fluorescence imaging techniques enable the evaluation of replication forks progression at the level of individual DNA molecules. 
Those techniques rely on the incorporation of exogene nucleotide analogs in nascent DNA at replication forks in living cells. In a typical experiment, sequential incorporation of two nucleotide analogs, e.g., IdU and CldU, is performed. Following cell lysis and spreading of DNA on microscopy slides, DNA molecules are then imaged by immunofluorescence. The obtained image is made up of two colors corresponding to each one of the two nucleotide analogs. Measurement of the respective lengths of these labeled stretches of DNA permits quantification of replication fork progression. Evaluation of DNA fiber length is generally performed manually. This procedure is laborious and subject to inter- and intra-user variability stemming in part from unintended bias in the choice of fibers to be measured. DNA fiber extraction is difficult because strands are often fragmented in lots of subparts and can be tangled in clusters. Moreover, the extraction of fibers can be difficult when the background is noised by non specific staining. Despite the large number of segmentation algorithms dedicated to curvilinear structures (blood vessels, neural networks, roads, concrete tracks...), few studies address the treatment of DNA fiber images. We developed an algorithm called ADFA (Automated DNA Fiber Analysis) which automatically segments DNA fibers and measures their respective length. Our approach can be divided into three parts: 1. Object extraction by a robust contour detection. Our contour segmentation method relies on two classical gradient analyses (Marr and Hildreth, 1980; Canny, 1986) 2. Fusion of adjacent fragmented fibers by analysing their continuity. We developped a tracking approach based on the orientation and the continuity of adjacent fibers. 3. Detection of the nucleotide analog label (IdU or CldU). To do so, we analyse the color profile on both channels (green and red) along each fiber. 
ADFA was tested on a database of different images of varying quality, signal to noise ratio, or fiber length which were acquired from two different microscopes. The comparison between ADFA and manual segmentations shows a high correlation both at the scale of the fiber and at the scale of the image. Moreover, we validate our algorithm by comparing samples submitted to replicative stress and controls. Finally, we studied the impact of the incubation time of the second nucleotide analog pulse. The performances of our algorithm are optimised for images containing relatively short and not fragmented DNA fibers. Our tracking methods may be limited when connecting highly split fibers superimposed to other strands. Therefore, we recommend to reduce the incubation time of each nucleotide analog to about 20-30 minutes in order to obtain short fibers. We also recommend to foster the dilution of fibers on the slide to reduce clustering of fluorescent DNA molecules. ADFA is freely available as an open-source software. It might be used as a reference tool to solve inter-intra user variability
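Step (iii) of the pipeline, assigning an analog label along a fiber by comparing the two channels, can be sketched in miniature. This is not ADFA's code: the ordered pixel list, the per-pixel red/green comparison, and the run-length summary are illustrative assumptions.

```python
import numpy as np

def label_fiber_segments(fiber_coords, red, green):
    """Classify each pixel along an ordered fiber path as the red-labeled
    analog (here called IdU) or the green-labeled one (CldU) by comparing
    channel intensities, then collapse consecutive identical labels into
    (label, run_length) pairs. The run lengths are what fork-progression
    estimates are computed from.

    fiber_coords: ordered list of (row, col) pixel coordinates along a fiber.
    red, green:   2-D intensity arrays for the two fluorescence channels.
    """
    labels = ["IdU" if red[r, c] >= green[r, c] else "CldU"
              for (r, c) in fiber_coords]
    runs = []
    for lab in labels:
        if runs and runs[-1][0] == lab:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([lab, 1])     # start a new run
    return [(lab, n) for lab, n in runs]
```

A real implementation would convert run lengths in pixels to kilobases via the microscope calibration and smooth the color profile before thresholding.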

    Modeling and spatial representation of inelastic photonic phenomena in biophotonics

    Ce prĂ©sent mĂ©moire s’intĂ©resse Ă  la modĂ©lisation mathĂ©matique pour aborder la spatialitĂ© de signaux de spectroscopie Raman et de fluorescence dans des problĂ©matiques d’assistance au diagnostic et d’aide Ă  l’instrumentation. Dans un premier temps, ce mĂ©moire expose une technique de simulation adaptĂ©e Ă  un large spectre d’interaction photon-matiĂšre basĂ©e sur la rĂ©solution par tracĂ© de chemin MontĂ©-Carlo pour des domaines discrets. L’algorithme dĂ©veloppĂ©, le parcours cachĂ© des photons, supporte notamment les phĂ©nomĂšnes linĂ©aires, soit l’absorption et l’émission spontanĂ©e, les diffusions Ă©lastiques et inĂ©lastiques (Raman), les rĂ©flexions, les rĂ©fractions et la fluorescence. Le modĂšle a Ă©tĂ© conçu dans l’objectif d’ĂȘtre adaptĂ© Ă  la complexitĂ© des milieux biologiques, soit la complexitĂ© des interactions et des gĂ©omĂ©tries. La reprĂ©sentation discrĂšte de l’espace est rĂ©alisĂ©e par Marching Cube et l’ensemble des phĂ©nomĂšnes est simulĂ© simultanĂ©ment, pour plusieurs longueurs d’onde discrĂštes, afin de supporter les interactions entre les phĂ©nomĂšnes (diaphonie) et de produire une solution physiquement exacte. La solution a Ă©tĂ© implĂ©mentĂ©e dans un format de calcul gĂ©nĂ©rique sur un processeur graphique par adaptation du pipeline 3D. L’algorithme prĂ©sentĂ© aborde aussi des mĂ©thodes pour limiter l’utilisation de la mĂ©moire afin de prĂ©senter une solution non prohibitive aux phĂ©nomĂšnes Raman et de fluorescence Ă  plusieurs longueurs d’onde. De plus, la solution proposĂ©e intĂšgre une camĂ©ra, une visualisation de la fluence et une visualisation 3D des photons afin d’ĂȘtre adaptĂ©e au domaine de la biophysique. Finalement, les algorithmes dĂ©veloppĂ©s sont validĂ©s par la prĂ©diction de rĂ©sultats dĂ©terminĂ©s selon une base thĂ©orique et expĂ©rimentale. Le simulateur propose une mĂ©thode thĂ©orique pour calibrer les instruments de mesure optiques et pour Ă©valuer la portĂ©e d'un signal. 
Dans un second temps, ce mĂ©moire propose des mĂ©thodes de rĂ©duction de dimensionnalitĂ© pour optimiser la reconnaissance automatisĂ©e de volumes de donnĂ©es rattachĂ©s Ă  des modalitĂ©s optiques dans un contexte biomĂ©dical. Deux modalitĂ©s optiques sont plus spĂ©cialement visĂ©es, soit la microscopie Raman et la tomographie en cohĂ©rence optique. Dans le premier cas, un outil effectuant des analyses chimiomĂ©triques a Ă©tĂ© mis au point pour reproduire les images de coloration histologique avec la microscopie traditionnelle. L’algorithme a Ă©tĂ© proposĂ© pour des Ă©chantillons fixĂ©s sur des lames d’aluminium.----------Abstract This master’s thesis focuses on mathematical modelling to address the spatiality of Raman spectroscopy and fluorescence signals to assist instrumentation and diagnostics. Firstly, this thesis presents a simulation technique adapted to a broad spectrum of photon-matter interaction based on the Monte Carlo path tracing resolution for discrete domains. The developed algorithm, the hidden path of photons, notably supports linear phenomena, namely absorption and spontaneous emission, elastic and inelastic scattering (Raman), reflections, refractions and fluorescence. The model was designed with the objective of being adapted to the complexity of biological environments, of interactions and of geometries. The discrete representation of space is performed by Marching Cube and the set of phenomena is simulated simultaneously, for several discrete wavelengths, in order to support the interactions between the phenomena (crosstalk) and to produce a physically exact solution. The solution has been implemented in a general-purpose processing on graphics processing units format by adaptation of the 3D pipeline. The presented algorithm also addresses methods to limit the use of memory in order to present a non-prohibitive solution to Raman diffusion and fluorescence at several wavelengths. 
In addition, the proposed solution integrates a camera, a visualization of fluence and a 3D visualization of photons to be adapted to the field of biophysics. Finally, the algorithms developed are validated by the prediction of known results on a theoretical and empirical basis. The simulator represents a theoretical method for calibrating optical measuring instruments and determining the spatial range of a signal. Secondly, this thesis proposes dimensionality reduction methods to optimize the automated recognition of data volumes related to optical modalities in biomedical contexts. Two optical modalities are more specifically targeted, namely Raman microscopy and optical coherence tomography. In the first case, a tool performing chemometrics analysis was developed to reproduce histologic staining images with traditional microscopy. The algorithm has been proposed for samples fixed on aluminium microscope slides. By evaluating the contribution of the measured signal on an empty slide, algorithm seeks to evaluate the drop in concentration of the compounds of interest, making analogy to the gradual transparency in histology, thus offering a more faithful representation
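The core loop of any such simulator, photons taking exponentially distributed steps and depositing weight at each interaction, can be illustrated with a deliberately minimal random walk in a homogeneous slab. This sketch is not the hidden-path algorithm: the isotropic phase function, the slab geometry, and the absorbed-weight depth profile (a 1-D stand-in for a fluence map) are all simplifying assumptions.

```python
import numpy as np

def mc_absorbed_depth_profile(n_photons=1000, mu_a=0.1, mu_s=10.0,
                              depth_bins=50, max_depth=5.0, seed=0):
    """Monte Carlo photon random walk in a homogeneous slab.

    mu_a, mu_s: absorption and scattering coefficients (1/length).
    Returns the absorbed photon weight accumulated in each depth bin.
    """
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s
    profile = np.zeros(depth_bins)
    for _ in range(n_photons):
        pos = np.zeros(3)                       # photon enters at the surface
        direction = np.array([0.0, 0.0, 1.0])   # heading into the medium
        weight = 1.0
        while weight > 1e-3:
            # Free path length is exponentially distributed with rate mu_t.
            step = -np.log(rng.random()) / mu_t
            pos = pos + step * direction
            if pos[2] < 0.0 or pos[2] >= max_depth:
                break                           # photon left the slab
            # Deposit the absorbed fraction of the weight at this depth.
            bin_idx = int(pos[2] / max_depth * depth_bins)
            absorbed = weight * mu_a / mu_t
            profile[bin_idx] += absorbed
            weight -= absorbed
            # Isotropic scattering: uniform random direction on the sphere.
            v = rng.normal(size=3)
            direction = v / np.linalg.norm(v)
    return profile
```

A production simulator adds anisotropic phase functions, refraction at interfaces, wavelength shifts for Raman and fluorescence events, and a GPU-parallel formulation, but the weight-deposition structure is the same.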

    Numerical and statistical methods for trajectory analysis in a Riemannian geometry framework

    This PhD thesis proposes new Riemannian geometry tools for the analysis of longitudinal observations of neurodegenerative subjects. First, we propose a numerical scheme to compute parallel transport along geodesics and prove its convergence; the scheme is efficient as long as the co-metric (the inverse of the metric) can be computed efficiently. Then, we tackle the issue of Riemannian manifold learning. We provide some minimal theoretical sanity checks showing that the procedure of Riemannian metric estimation can be relevant. Finally, we propose to learn a Riemannian manifold so as to model subjects' progressions as geodesics on this manifold, which allows fast inference, extrapolation, and classification of the subjects.
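For orientation, parallel transport along a curve solves the ODE dv^k/dt = -Γ^k_{ij} Îł̇^i v^j, where the Christoffel symbols Γ^k_{ij} are built from the metric and its inverse (the co-metric). A naive Euler sketch of that ODE, with metric derivatives taken by finite differences, is shown below; this is a generic illustration of the object being computed, not the thesis's (more efficient) scheme.

```python
import numpy as np

def christoffel(metric, x, eps=1e-5):
    """Christoffel symbols Gamma[k, i, j] at point x.

    metric(x) returns the d x d metric matrix g(x); its inverse is the
    co-metric. Derivatives of g are taken by central finite differences.
    """
    d = len(x)
    g_inv = np.linalg.inv(metric(x))
    dg = np.zeros((d, d, d))                  # dg[l, i, j] = d g_ij / d x_l
    for l in range(d):
        e = np.zeros(d); e[l] = eps
        dg[l] = (metric(x + e) - metric(x - e)) / (2 * eps)
    gamma = np.zeros((d, d, d))
    for k in range(d):
        for i in range(d):
            for j in range(d):
                gamma[k, i, j] = 0.5 * sum(
                    g_inv[k, l] * (dg[i, j, l] + dg[j, i, l] - dg[l, i, j])
                    for l in range(d))
    return gamma

def parallel_transport(metric, curve, velocity, v0, n_steps=100):
    """Euler integration of dv/dt = -Gamma(curve(t))(velocity(t), v),
    transporting v0 along the curve for t in [0, 1]."""
    v = np.array(v0, dtype=float)
    dt = 1.0 / n_steps
    for s in range(n_steps):
        t = s * dt
        gamma = christoffel(metric, curve(t))
        v = v - dt * np.einsum("kij,i,j->k", gamma, velocity(t), v)
    return v
```

In the Euclidean case the Christoffel symbols vanish and the vector is transported unchanged, which makes a convenient sanity check; curved metrics require smaller steps or a higher-order integrator.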

    Atlas-based shape analysis and classification of retinal optical coherence tomography images using the functional shape (fshape) framework

    No full text
    We propose a novel approach for quantitative shape variability analysis in retinal optical coherence tomography images using the functional shape (fshape) framework. The fshape framework uses surface geometry together with functional measures, such as retinal layer thickness defined on the layer surface, for registration across anatomical shapes. This is used to generate a population mean template of the geometry-function measures from the individuals. Shape variability across multiple retinas can then be measured by the geometric deformation and functional residual between the template and each observation. To demonstrate the clinical relevance and application of the framework, we generated atlases of the inner layer surface and layer thickness of the retinal nerve fiber layer (RNFL) of glaucomatous and normal subjects, visualizing the detailed spatial pattern of RNFL loss in glaucoma. Additionally, a regularized linear discriminant analysis classifier was used to automatically classify glaucoma, glaucoma-suspect, and control cases based on RNFL fshape metrics.
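The final classification step uses regularized LDA, which is a standard remedy when there are more fshape metrics than eyes: the pooled covariance is shrunk toward a scaled identity before inversion. The sketch below is a generic textbook version with equal class priors, not the authors' classifier, and the feature/label shapes are assumptions.

```python
import numpy as np

def fit_regularized_lda(X, y, alpha=0.2):
    """Regularized LDA: pooled within-class covariance shrunk as
    Sigma_reg = (1 - alpha) * Sigma + alpha * (tr(Sigma) / d) * I,
    which keeps the inverse well-conditioned for small samples.

    X: (n, d) feature matrix (e.g., per-eye fshape metrics); y: n labels.
    """
    y = np.asarray(y)
    classes = sorted(set(y))
    d = X.shape[1]
    means = {c: X[y == c].mean(axis=0) for c in classes}
    centered = np.vstack([X[y == c] - means[c] for c in classes])
    sigma = centered.T @ centered / (len(X) - len(classes))
    sigma = (1 - alpha) * sigma + alpha * (np.trace(sigma) / d) * np.eye(d)
    return classes, means, np.linalg.inv(sigma)

def predict_lda(model, X):
    """Assign each row of X to the class with the highest linear
    discriminant score (equal priors assumed)."""
    classes, means, prec = model
    scores = np.column_stack(
        [X @ prec @ means[c] - 0.5 * means[c] @ prec @ means[c]
         for c in classes])
    return [classes[i] for i in np.argmax(scores, axis=1)]
```

With `alpha = 0` this reduces to classical LDA; in practice the shrinkage level is chosen by cross-validation rather than fixed.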