Using genetic algorithms in computer vision : registering images to 3D surface model
This paper shows a successful application of genetic algorithms in computer vision. We aim at building photorealistic 3D models of real-world objects by adding textural information to the geometry. Here we focus on the 2D-3D registration problem: given a 3D geometric model of an object and optical images of the same object, we need to find the precise alignment of the 2D images to the 3D model. We generalise the photo-consistency approach of Clarkson et al., who assume calibrated cameras, so that only the pose of the object in the world needs to be estimated. Our method extends this approach to the case of uncalibrated cameras, where both intrinsic and extrinsic camera parameters are unknown. We formulate the problem as an optimisation and use a genetic algorithm to find a solution. We use semi-synthetic data to study the effects of different parameter settings on the registration. Additionally, experimental results on real data are presented to demonstrate the efficiency of the method.
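The search described above can be sketched with a toy genetic algorithm. The sketch below is not the authors' implementation: `photo_consistency` is a hypothetical stand-in for the real cost (which would project the 3D model into each image), and the population size, mutation scale, and six-parameter camera vector are invented for illustration.

```python
# Minimal genetic-algorithm sketch: optimise unknown camera parameters
# by selection, one-point crossover, and Gaussian mutation.
import numpy as np

rng = np.random.default_rng(0)

def photo_consistency(params):
    # Placeholder objective with a peak at the "true" camera parameters.
    true = np.array([1.0, -0.5, 0.25, 2.0, 0.0, 0.0])
    return -np.sum((params - true) ** 2)

def evolve(pop, n_gen=200, mut=0.1):
    for _ in range(n_gen):
        scores = np.array([photo_consistency(p) for p in pop])
        order = np.argsort(scores)[::-1]
        parents = pop[order[: len(pop) // 2]]           # keep the fittest half
        partners = parents[rng.permutation(len(parents))]
        kids = parents.copy()
        cuts = rng.integers(1, pop.shape[1], len(parents))
        for i, c in enumerate(cuts):                    # one-point crossover
            kids[i, c:] = partners[i, c:]
        kids += rng.normal(0.0, mut, kids.shape)        # Gaussian mutation
        pop = np.vstack([parents, kids])                # elitist replacement
    return max(pop, key=photo_consistency)

pop0 = rng.normal(0.0, 2.0, size=(40, 6))   # 6 unknown camera parameters
best = evolve(pop0)
```

Because the parents survive unmutated, the best fitness never decreases from one generation to the next, which keeps even this crude random search stable.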
Structural alphabets derived from attractors in conformational space
Background: The hierarchical and partially redundant nature of protein structures justifies the definition of frequently occurring conformations of short fragments as 'states'. Collections of selected representatives for these states define Structural Alphabets, describing the most typical local conformations within protein structures. These alphabets form a bridge between the string-oriented methods of sequence analysis and the coordinate-oriented methods of protein structure analysis. Results: A Structural Alphabet has been derived by clustering all four-residue fragments of a high-resolution subset of the protein data bank and extracting the high-density states as representative conformational states. Each fragment is uniquely defined by a set of three independent angles corresponding to its degrees of freedom, capturing in simple and intuitive terms the properties of the conformational space. The fragments of the Structural Alphabet are equivalent to the conformational attractors and therefore yield a most informative encoding of proteins. Proteins can be reconstructed within the experimental uncertainty in structure determination, and ensembles of structures can be encoded with accuracy and robustness. Conclusions: The density-based Structural Alphabet provides a novel tool to describe local conformations and is specifically suitable for application in studies of protein dynamics. © 2010 Pandini et al; licensee BioMed Central Ltd.
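The core step, grouping three-angle fragment descriptors into representative states, can be illustrated with synthetic data. This is only a sketch: the two "states" and their angle values are invented, and a plain k-means stands in for the density-based clustering the paper actually uses.

```python
# Toy clustering of four-residue fragments, each described by three angles,
# into representative conformational "letters".
import numpy as np

rng = np.random.default_rng(1)
# Two synthetic conformational states (angle triplets are illustrative).
helix = rng.normal([-60.0, -45.0, 50.0], 8.0, size=(200, 3))
strand = rng.normal([-120.0, 130.0, 170.0], 8.0, size=(200, 3))
angles = np.vstack([helix, strand])

def kmeans(X, k, iters=50):
    """Plain k-means: alternate nearest-centre assignment and mean update."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return centers, labels

letters, labels = kmeans(angles, k=2)
# Each fragment is now encoded by the index of its nearest representative,
# turning a coordinate description into a string over the alphabet.
```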
Advanced photoacoustic image reconstruction using the k-Wave toolbox
Reconstructing images from measured time domain signals is an essential step in tomography-mode photoacoustic imaging. However, in practice, there are many complicating factors that make it difficult to obtain high-resolution images. These include incomplete or undersampled data, filtering effects, acoustic and optical attenuation, and uncertainties in the material parameters. Here, the processing and image reconstruction steps routinely used by the Photoacoustic Imaging Group at University College London are discussed. These include correction for acoustic and optical attenuation, spatial resampling, material parameter selection, image reconstruction, and log compression. The effect of each of these steps is demonstrated using a representative in vivo dataset. All of the algorithms discussed form part of the open-source k-Wave toolbox (available from http://www.k-wave.org).
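Of the steps listed, log compression is the simplest to sketch in isolation. The version below is a generic illustration, not code from the k-Wave toolbox, and the 40 dB dynamic range is an arbitrary choice.

```python
# Sketch of log compression: map image amplitudes onto a logarithmic
# scale so weak deep-tissue features remain visible next to strong ones.
import numpy as np

def log_compress(img, dyn_range_db=40.0):
    """Normalise amplitudes, convert to dB, and clip to the dynamic range."""
    amp = np.abs(img) / np.abs(img).max()            # normalise to [0, 1]
    db = 20.0 * np.log10(np.maximum(amp, 1e-12))     # amplitude in dB
    return np.clip((db + dyn_range_db) / dyn_range_db, 0.0, 1.0)

img = np.array([1.0, 0.1, 0.01, 0.001])
out = log_compress(img)   # amplitudes spanning 60 dB squeezed into [0, 1]
```

With a 40 dB range, an amplitude of 0.1 (20 dB down) maps to 0.5, while anything 40 dB or more below the peak clips to 0.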
On the automatic detection of otolith features for fish species identification and their age estimation
This thesis deals with the automatic detection of features in signals, whether extracted from photographs or captured by electronic sensors, and its application to the detection of morphological structures in fish otoliths in order to identify species and estimate their age at death. From a biological perspective, otoliths, which are calcified structures located in the auditory system of all teleost fish, constitute one of the main elements employed in the study and management of marine ecology. The application of Fourier descriptors to otolith images, combined with component analysis, is typically the first and key step towards characterising their morphology and identifying fish species. However, this representation is limited by the poor interpretability it offers and by the way the coefficients are used, as they are generally selected manually for classification, both in number and in representativeness. The automatic detection of irregularities in signals, and their interpretation, was first addressed within the so-called Best-Basis paradigm. Saito's Local Discriminant Bases (LDB) algorithm uses the Discrete Wavelet Packet Transform (DWPT) as the main descriptive tool for locating irregularities in the time-frequency plane, and an energy-based discriminant measure to guide the automatic search for relevant features in this domain. Recent density-based proposals have tried to overcome the limitations of energy-based functions with relatively little success. However, measurement strategies more consistent with the true classification capability, able to generalise while reducing the dimensionality of the features, are yet to be developed. This work proposes a new framework for one-dimensional signals.
An important conclusion is that such generalisation requires a measure system of bounded values representing the density where no classes overlap. This severely constrains the selection of features and the vector size needed for proper class identification, which must be based not only on global discriminant values but also on complementary information about the distribution of samples in the domain. The new tools have been applied to the biological study of different hake species, yielding good classification results. A major contribution, however, lies in the interpretation of the selected features that the tool provides, including the structure of the irregularities, their time-frequency position, extension support, and degree of importance, which is highlighted automatically on the corresponding images or signals. For ageing applications, a new demodulation strategy has been developed to compensate for the effect of nonlinear growth on the intensity profile. Although the method is, in principle, able to adapt automatically to the growth of individual specimens, preliminary results with LDB-based techniques suggest studying the effect of lighting conditions on the otoliths in order to design more reliable techniques for reducing image contrast variation. In the meantime, a new theoretical framework for otolith-based fish age estimation has been presented. This theory suggests that if the true fish growth curve is known, the regular periodicity of age structures in the demodulated profile is related to the radial length along which the original intensity profile is extracted. Therefore, if this periodicity can be measured, the exact fish age can be inferred without feature extractors or classifiers.
This could have important implications for the use of computational resources and for current ageing approaches.
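The energy-based discriminant at the heart of the LDB idea can be sketched numerically: expand two classes of signals in a wavelet packet dictionary and score each subband by how differently the classes distribute their energy there. Everything below is illustrative: the toy signals are invented, a hand-rolled Haar packet transform (in natural ordering) stands in for a general DWPT, and the symmetric relative-entropy score is one common choice of discriminant measure.

```python
# Sketch of an energy-based discriminant over wavelet packet subbands.
import numpy as np

rng = np.random.default_rng(2)
n = 64
t = np.arange(n) / n
class_a = [np.sin(2 * np.pi * 4 * t) + 0.1 * rng.normal(size=n) for _ in range(20)]
class_b = [np.sin(2 * np.pi * 12 * t) + 0.1 * rng.normal(size=n) for _ in range(20)]

def haar_packet(x, level=2):
    """Full Haar wavelet packet decomposition to the given level."""
    bands = [np.asarray(x, float)]
    for _ in range(level):
        nxt = []
        for b in bands:
            nxt.append((b[0::2] + b[1::2]) / np.sqrt(2.0))  # low-pass half
            nxt.append((b[0::2] - b[1::2]) / np.sqrt(2.0))  # high-pass half
        bands = nxt
    return bands

def band_energies(x):
    e = np.array([np.sum(b ** 2) for b in haar_packet(x)])
    return e / e.sum()                      # normalised energy distribution

ea = np.mean([band_energies(x) for x in class_a], axis=0)
eb = np.mean([band_energies(x) for x in class_b], axis=0)
# Symmetric relative entropy per subband: large values mark the
# time-frequency regions that best separate the two classes.
score = ea * np.log(ea / eb) + eb * np.log(eb / ea)
best_band = int(np.argmax(score))
```

Selecting the top-scoring subbands yields the low-dimensional feature vector that a classifier would then consume; the thesis argues that bounded, density-aware measures generalise better than this raw energy score.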
Fast, Three-Dimensional Fluorescence Imaging of Living Cells
This thesis focuses on multi-plane fluorescence microscopy for fast live-cell imaging. To improve the performance of multi-plane microscopy, I developed new image analysis methods and used them to measure and analyze the movements of cardiomyocytes and Dictyostelium discoideum cells. The multi-plane setup is based on a conventional wide-field microscope with a custom multiple beam-splitter in the detection path. This prism creates separate images of eight distinct focal planes in the sample. Since the 3D volume is imaged without scanning, three-dimensional imaging at very high speed becomes possible. However, as in conventional wide-field microscopy, the "missing cone" of spatial frequencies along the optical axis in the optical transfer function (OTF) prevents optical sectioning in such a microscope. This is in stark contrast to truly three-dimensional imaging modalities such as confocal and light-sheet microscopy. To overcome the lack of optical sectioning, I developed a new deconvolution method. Deconvolution refers to methods that restore or sharpen an image based on physical assumptions and knowledge of the imaging process; such methods have been widely used to sharpen images from microscopes and telescopes. The recently developed SUPPOSe algorithm is a deconvolution algorithm that uses a set of numerous virtual point sources: it reconstructs an image by distributing these point sources in space and optimizing their positions so that the resulting image reproduces the measured data as well as possible. SUPPOSe had never been used for 3D images. Compared to other algorithms, this method performs particularly well when the number of pixels is increased by interpolation. In this work, I extended the method to work also with 3D image data. The 3D-SUPPOSe program is suitable for analyzing data from our multi-plane setup, which has only eight vertically aligned image planes.
Furthermore, for accurate reconstruction of 3D images, I studied a method for correcting the relative brightness of the individual image planes that constitute an image, and I also developed a method for measuring the movement of point emitters in 3D space.
Using these methods, I measured and analyzed the beating motion of cardiomyocytes and the chemotaxis of Dictyostelium discoideum. Cardiomyocytes are the cells of the heart muscle and consist of repetitive sarcomeres. These cells are characterized by fast, periodic movements, and so far their dynamics had been studied only with two-dimensional imaging. In this thesis, the beating motion was analyzed by tracing the spatial distribution of the so-called z-discs, one of the constituent components of cardiomyocytes. I found that the vertical distribution of α-actinin-2 in a single z-disc changes very rapidly, which may serve as a starting point for a better understanding of the motion of cardiomyocytes.
Dictyostelium discoideum is a well-established single-cell model organism that migrates along the gradient of a chemoattractant. Much research has been conducted to understand the mechanism of chemotaxis, and many efforts have been made to understand the role of actin in chemotactic motion. By suppressing the motor protein myosin, a cell line was created that prevents the formation of normal actin filaments. In these myosin-null cells, F-actin moves in a flow-like manner and induces cell movement. In this study, I imaged the actin dynamics and analyzed the flow using the newly created deconvolution and flow estimation methods. As a result of this analysis, the spatio-temporal correlation between pseudopod formation and dynamics and the actin flow was investigated.
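Flow estimation between successive frames typically builds on displacement estimation; one standard building block is phase correlation, sketched here for a 2D integer shift. This is a generic illustration, not the thesis's own flow method, and the frame data are synthetic.

```python
# Phase correlation: estimate the integer shift between two images from
# the phase of their cross-power spectrum.
import numpy as np

def phase_correlation(a, b):
    """Return the (dy, dx) circular shift taking image `b` onto image `a`."""
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    F /= np.abs(F) + 1e-12                      # keep only the phase
    corr = np.abs(np.fft.ifft2(F))              # delta peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    n, m = a.shape                              # wrap to signed shifts
    return ((dy + n // 2) % n - n // 2, (dx + m // 2) % m - m // 2)

rng = np.random.default_rng(6)
frame = rng.normal(size=(64, 64))
shifted = np.roll(frame, (3, -5), axis=(0, 1))
shift = phase_correlation(shifted, frame)       # recovers (3, -5)
```

Applying this per local window over a frame pair yields a coarse flow field; subpixel refinements interpolate around the correlation peak.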
Searching for structure in complex data: a modern statistical quest
Current research in statistics has taken interesting new directions, as data collected from scientific studies have become increasingly complex. At first glance, the number of experiments conducted by a scientist must be fairly large in order for a statistician to draw correct conclusions based on noisy measurements of a large number of factors. However, statisticians may often uncover simpler structure in the data, enabling accurate statistical inference based on relatively few experiments. In this snapshot, we will introduce the concept of high-dimensional statistical estimation via optimization, and illustrate this principle using an example from medical imaging. We will also present several open questions which are actively being studied by researchers in statistics.
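A standard numerical illustration of "structure through optimisation" is sparse recovery with the lasso: far more factors than experiments, yet accurate estimation because the true signal is sparse. The sketch below solves the lasso with proximal gradient descent (ISTA); all problem sizes and the penalty value are arbitrary choices for the demonstration.

```python
# Recovering a sparse coefficient vector from few noisy measurements.
import numpy as np

rng = np.random.default_rng(4)
n, p, k = 50, 200, 5                        # 50 experiments, 200 factors, 5 active
X = rng.normal(size=(n, p)) / np.sqrt(n)    # design matrix with unit-norm-ish columns
beta_true = np.zeros(p)
beta_true[rng.choice(p, k, replace=False)] = rng.normal(0.0, 3.0, k)
y = X @ beta_true + 0.01 * rng.normal(size=n)

def ista(X, y, lam=0.05, iters=500):
    """Proximal gradient descent for 0.5*||y - Xb||^2 + lam*||b||_1."""
    L = np.linalg.norm(X, 2) ** 2           # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        z = b - X.T @ (X @ b - y) / L       # gradient step
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return b

beta_hat = ista(X, y)
# Despite p >> n, the sparsity structure makes accurate recovery possible.
```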
Improved methods for single-particle cryogenic electron microscopy
Biological macromolecules such as enzymes are nanoscale machines. This is true in a concrete sense: if the atomic structure of a biological macromolecule can be obtained, the theories of mechanics and intermolecular forces can be applied to explain how the machine works in terms that engineers would understand, including motors, ratchets, gates and transducers. Nevertheless, biological macromolecules are complex, fragile and extremely small, so obtaining their structures is a challenging experimental endeavor. Single-particle cryogenic electron microscopy (cryo-EM) is a technique for determining the 3D structure of a biological macromolecule from a large set of 2D electron micrographs of individual structurally-identical particles. To obtain such images, a solution of the macromolecules must be prepared in the frozen-hydrated state, embedded in a thin electron-transparent glassy film of water. This specimen must then be imaged with a very short exposure to avoid radiation damage. A powerful computer must then be used to sort, align, and average the 2D particle images to back-calculate the 3D structure. At its best, cryo-EM can determine the structures of biological macromolecules to atomic resolution. In practice, this goal is usually not achieved. Cryo-EM has gotten significantly more powerful in the past few years due to improvements in equipment and methodology. Several of the most significant advances originated in the labs of David Agard and Yifan Cheng at UCSF. When I began my PhD with Yifan, the spirit in the lab was that cryo-EM could keep getting better and better: with enough engineering, determining the 3D structure of an arbitrary biological macromolecule would be as routine an experiment as gel electrophoresis or DNA sequencing. Inspired, I took on projects in the lab that I thought would move the field closer to that goal. 
In the first chapter of this thesis, I describe work I did supporting a project initiated by David Agard and his long-time scientific programmer Shawn Zheng. They developed and implemented an algorithm, MotionCor2, for correcting the complex, anisotropic movements that occur when a frozen-hydrated specimen interacts with the high-energy electron beam. My role was to benchmark MotionCor2 on a panel of real-world 3D reconstruction tasks. I was able to show that MotionCor2 restored the highest-resolution details in the images, ultimately yielding significantly better structures than simpler algorithms. For me, this project highlighted the importance of benchmarking an algorithm under routine real-world conditions with the right metrics. In chapter 1, I include the manuscript for the MotionCor2 study, formatted to highlight my contributions, which were moved to the supplement in the original publication in Nature Methods. One of the major remaining issues with cryo-EM is sample preparation: preparing the thin freestanding films of frozen-hydrated particles necessarily exposes those particles to air-water interfaces. Many fragile macromolecular complexes denature when exposed to such interfaces, preventing structure determination with cryo-EM. In chapters 2 and 3, I describe my efforts to develop a simple, robust approach to stabilizing fragile macromolecular complexes during the vitrification process. In chapter 2, I develop a method for coating EM grids with an electron-transparent and functionalizable graphene-oxide support film. I demonstrate that such GO grids are compatible with high-resolution structure determination. This work was published in the Journal of Structural Biology in 2018. In chapter 3, I extend this work by functionalizing GO grids with nucleic acids, enabling routine structure determination of uncrosslinked chromatin specimens.
In ongoing work, I used nucleic acid grids to solve high-resolution structures of a highly fragile specimen, the Snf2h-nucleosome complex, and analyzed the conformational heterogeneity of the nucleosome substrate. These results were made possible by the nucleic acid grid, as the other major approach for stabilizing chromatin specimens, chemical crosslinking, did not work for this specimen. Perhaps the most fundamental problem with single-particle cryo-EM is the radiation sensitivity of frozen-hydrated macromolecules. To image biological matter with electrons is to destroy it, so obtaining images of undamaged specimens requires very short, highly undersampled exposures. The resultant images are extremely noisy and low in contrast, with most particles barely visible above the background. In chapter 4, I describe a novel computational approach to generating contrast in cryo-EM. Using a recently described machine learning strategy for training a parameterized denoising algorithm, I developed a computer program, restore, that denoises cryo-EM images, greatly enhancing their contrast and interpretability. This program leverages recent advances in computer vision and deep learning which have not yet been widely used in cryo-EM image processing algorithms. To characterize the performance of the algorithm on real-world data, I extended conventional metrics for image resolution to measure how an arbitrary transformation affects images at different spatial frequencies. These novel metrics are general and may be useful for characterizing other nonlinear reconstruction algorithms in cryo-EM and medical imaging. Finally, I showed that denoised cryo-EM images maintain the high-resolution information required for accurate 3D reconstruction. Denoising can be applied to conventional cryo-EM images and can be reversed whenever necessary. I have made the software for the restore program publicly available and have submitted a manuscript for peer-reviewed publication.
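The conventional frequency-resolved metric that such extensions build on is the Fourier ring correlation (FRC), which measures the agreement between two images as a function of spatial frequency. The sketch below is a simplified, generic FRC on synthetic data, not the thesis's extended metric; the ring binning is the simplest possible choice.

```python
# Fourier ring correlation between two images: correlation of their
# Fourier coefficients, averaged over rings of constant spatial frequency.
import numpy as np

def frc(img1, img2, n_rings=16):
    f1 = np.fft.fftshift(np.fft.fft2(img1))
    f2 = np.fft.fftshift(np.fft.fft2(img2))
    n = img1.shape[0]
    yy, xx = np.indices(img1.shape) - n // 2
    r = np.sqrt(xx**2 + yy**2)                  # radius of each frequency pixel
    edges = np.linspace(0, n // 2, n_rings + 1)
    out = np.zeros(n_rings)
    for i in range(n_rings):
        m = (r >= edges[i]) & (r < edges[i + 1])
        num = np.abs(np.sum(f1[m] * np.conj(f2[m])))
        den = np.sqrt(np.sum(np.abs(f1[m]) ** 2) * np.sum(np.abs(f2[m]) ** 2))
        out[i] = num / den if den > 0 else 0.0
    return out

rng = np.random.default_rng(5)
img = rng.normal(size=(64, 64))
curve_same = frc(img, img)                        # identical images: FRC = 1
curve_noise = frc(img, rng.normal(size=(64, 64))) # unrelated images: FRC ~ 0
```

Comparing the FRC of an image pair before and after a transformation (such as denoising) shows, frequency by frequency, what information the transformation preserves, attenuates, or fabricates.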