20 research outputs found

    Microarray image processing : a novel neural network framework

    Due to the vast success of bioengineering techniques, a series of large-scale analysis tools has been developed to discover the functional organization of cells. Among them, cDNA microarray technology has emerged as a powerful tool that enables biologists to study thousands of genes simultaneously within an entire organism, and thus to gain a better understanding of the gene interaction and regulation mechanisms involved. Although microarray technology has been engineered to offer high tolerances, considerable signal irregularity remains across the surface of the microarray image. Imperfections in the image generation process introduce noise of many types, which contaminates the resulting image. These errors and noise propagate through, and can significantly affect, all subsequent processing and analysis. To realise the potential of the technology, it is therefore crucial to obtain high quality image data that genuinely reflect the underlying biology of the samples. One of the key steps in extracting information from a microarray image is segmentation: identifying which pixels within the image belong to which gene spot. This area of spotted microarray image analysis has received relatively little attention compared with the advances in subsequent analysis stages, yet the lack of advanced image analysis, including segmentation, means that sub-optimal data are fed into all downstream analysis methods. Although there has recently been much research on microarray image analysis and many methods have been proposed, some produce better results than others, and in general the most effective approaches require considerable processing time to handle an entire image. Furthermore, there has been little progress on developing sufficiently fast yet efficient and effective algorithms for segmenting microarray images using a highly sophisticated framework such as Cellular Neural Networks (CNNs). The aim of this thesis is therefore to investigate and develop novel methods for processing microarray images, with the goal of producing results that outperform currently available approaches in terms of PSNR, k-means and ICC measurements. EThOS - Electronic Theses Online Service. Aleppo University, Syria. United Kingdom.
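
    The abstract evaluates segmentation quality using PSNR and k-means-based measures. As an illustration only, and not the CNN framework developed in the thesis, the following is a minimal Python sketch of how a PSNR score and a two-cluster k-means separation of spot pixels from background might be computed on a small image patch; the function names, patch sizes and intensity values are assumptions.

        import numpy as np

        def psnr(reference, processed, peak=255.0):
            """Peak signal-to-noise ratio between two equally sized images."""
            mse = np.mean((reference.astype(float) - processed.astype(float)) ** 2)
            return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

        def kmeans_segment(patch, iterations=20):
            """Two-cluster k-means on pixel intensities: spot (foreground) vs background."""
            pixels = patch.astype(float).ravel()
            centers = np.array([pixels.min(), pixels.max()])  # initial cluster centres
            for _ in range(iterations):
                labels = np.abs(pixels[:, None] - centers[None, :]).argmin(axis=1)
                for k in (0, 1):
                    if np.any(labels == k):
                        centers[k] = pixels[labels == k].mean()
            foreground = centers.argmax()  # brighter cluster taken to be the spot
            return (labels == foreground).reshape(patch.shape)

        # Hypothetical usage on a synthetic 16x16 spot patch
        rng = np.random.default_rng(0)
        patch = rng.normal(40, 5, (16, 16))
        patch[5:11, 5:11] += 120  # bright spot in the centre
        mask = kmeans_segment(patch)
        print(mask.sum(), "pixels labelled as spot")
        print("PSNR of noisy vs clean patch:",
              round(psnr(patch, patch + rng.normal(0, 3, patch.shape)), 2))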

    A bioinformatics framework for management and analysis of high throughput CGH microarray projects

    High throughput experimental techniques have revolutionised biological research; they enable researchers to survey, in an unbiased fashion, entire biological systems, such as all the somatic mutations in a tumour, in a single experiment. Due to the often complex informatics demands of these techniques, robust computational solutions are required to ensure that high quality, reproducible results are generated. The challenge of this thesis was to develop such a computational solution for the management and analysis of high throughput microarray Comparative Genomic Hybridisation (aCGH) projects. This task also provided an opportunity to test the hypothesis that agile software development approaches are well suited to bioinformatics projects and that formalised development practices produce better quality software. This is an important question, as formalised software development practices have so far been underused in the field of bioinformatics. This thesis describes the development and application of a bioinformatics framework for the management and analysis of microarray CGH projects. The framework includes: a Laboratory Information Management System (LIMS) that manages and records all aspects of microarray CGH experimentation; a set of easy to use visualisation tools for aCGH experimental data; and a suite of object oriented Perl modules providing a flexible way to construct data pipelines quickly, using the statistical programming language R for quality control, normalisation and analysis. To test the framework, it was successfully applied to the aCGH profiling of 94 ovarian tumour samples. Subsequent analysis of these data identified 4 well supported genomic regions which appear to influence patient survival. The evaluation of the agile practices implemented in this thesis has demonstrated that they are well suited to the development of bioinformatics solutions, as they enable developers to react to the changes of this rapidly evolving field and to create successful software solutions such as the bioinformatics framework presented here.
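
    The framework described above builds its pipelines from Perl modules driving R. Purely as an illustration of the kind of normalisation step such an aCGH pipeline performs, here is a minimal Python sketch of median-centred log2 ratio computation for one hybridisation; the function name, probe filtering rule and example intensities are assumptions, not part of the thesis software.

        import numpy as np

        def normalise_log_ratios(test, reference):
            """Median-centred log2 ratios for one aCGH hybridisation.

            test, reference: background-corrected probe intensities for the
            tumour and normal samples respectively.
            """
            test = np.asarray(test, dtype=float)
            reference = np.asarray(reference, dtype=float)
            usable = (test > 0) & (reference > 0)            # drop failed probes
            ratios = np.full(test.shape, np.nan)
            ratios[usable] = np.log2(test[usable] / reference[usable])
            ratios[usable] -= np.median(ratios[usable])      # centre on zero (balanced genome)
            return ratios

        # Hypothetical example: three probes, the second showing a copy-number gain
        print(normalise_log_ratios([1200, 2500, 1150], [1180, 1220, 1200]))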

    Pertanika Journal of Science & Technology

    Development of a "genome-proxy" microarray for profiling marine microbial communities, and its application to a time series in Monterey Bay, California

    Thesis (Ph.D.)--Joint Program in Biological Oceanography (Massachusetts Institute of Technology, Dept. of Biology; and the Woods Hole Oceanographic Institution), 2008. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Includes bibliographical references (p. 155-181). This thesis describes the development and application of a new tool for profiling marine microbial communities. Chapter 1 places the tool in the context of the range of methods currently in use. Chapter 2 describes the development and validation of the "genome-proxy" microarray, which targeted marine microbial genomes and genome fragments using sets of 70-mer oligonucleotide probes. In a natural community background, array signal was highly linearly correlated with target cell abundance (R² of 1.0), with a dynamic range from 10² to 10⁶ cells/ml. Genotypes with ≥ ~80% average nucleotide identity to those targeted cross-hybridized to target probesets but produced distinct, diagnostic patterns of hybridization. Chapter 3 describes the development of an expanded array, targeting 268 microbial genotypes, and its use in profiling 57 samples from Monterey Bay. Comparison of array and pyrosequence data for three samples showed a strong linear correlation between target abundances estimated by the two methods (R² = 0.85-0.91). Array profiles clustered into shallow versus deep, and the majority of targets showed depth-specific distributions consistent with previous observations. Although no correlation was observed with oceanographic season, bloom signatures were evident. Array-based insights into population structure suggested the existence of ecotypes among uncultured clades. Chapter 4 summarizes the work and discusses future directions. by Virginia Rich. Ph.D.
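
    The abstract reports a linear relationship between array signal and target cell abundance over a 10² to 10⁶ cells/ml range, summarised by R² values. As a minimal sketch of how such a calibration and its R² might be computed (not the procedure used in the thesis; the spike-in series and intensities below are invented), in Python:

        import numpy as np

        def calibration_r_squared(cells_per_ml, array_signal):
            """Fit a line to log10(signal) vs log10(cell abundance) and report slope, intercept, R²."""
            x = np.log10(np.asarray(cells_per_ml, dtype=float))
            y = np.log10(np.asarray(array_signal, dtype=float))
            slope, intercept = np.polyfit(x, y, 1)
            predicted = slope * x + intercept
            ss_res = np.sum((y - predicted) ** 2)
            ss_tot = np.sum((y - y.mean()) ** 2)
            return slope, intercept, 1.0 - ss_res / ss_tot

        # Hypothetical spike-in series spanning the reported dynamic range
        cells = [1e2, 1e3, 1e4, 1e5, 1e6]
        signal = [12, 110, 1050, 9800, 101000]   # made-up intensities
        print(calibration_r_squared(cells, signal))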

    Validating DOE's Office of Science "capability" computing needs.

    Anales del XIII Congreso Argentino de Ciencias de la Computación (CACIC)

    Contents: computer architectures; embedded systems; service-oriented architectures (SOA); communication networks; heterogeneous networks; advanced networks; wireless networks; mobile networks; active networks; network and service administration and monitoring; Quality of Service (QoS, SLAs); information security, authentication and privacy; infrastructure for digital signatures and digital certificates; vulnerability analysis and detection; operating systems; P2P systems; middleware; grid infrastructure; integration services (Web Services or .Net). Red de Universidades con Carreras en Informática (RedUNCI)

    Real-Time Path Planning for Automating Optical Tweezers based Particle Transport Operations

    Optical tweezers (OT) have been developed to successfully trap, orient, and transport micro- and nano-scale components of many different sizes and shapes in a fluid medium. They can be viewed as robots made out of light. Components can be released from optical traps simply by switching off the laser beams. By exploiting the principle of time sharing or holograms, multiple optical traps can perform several operations in parallel. These characteristics make optical tweezers a very promising technology for creating directed micro- and nano-scale assemblies. In the infra-red regime, they are also useful in a large number of biological applications. This dissertation explores the problem of real-time path planning for autonomous OT-based transport operations. Such operations pose interesting challenges because the environment is uncertain and dynamic, owing to the random Brownian motion of the particles and noise in the imaging-based measurements. Silica microspheres with diameters between 1 and 20 µm are selected as model components. Offline simulations are performed to gather trapping probability data that serves as a measure of trap strength and reliability as a function of the relative position of the particle with respect to the trap focus and of the trap velocity. Simplified models are generated using Gaussian Radial Basis Functions to represent these data in a compact form. The metamodels can be queried at run time to obtain estimated probability values accurately and efficiently. The trapping probability models are then utilized in a stochastic dynamic programming framework to compute optimum trap locations and velocities that minimize the total expected transport time while incorporating collision avoidance and recovery steps. A discrete version of an approximate partially observable Markov decision process algorithm, called the QMDP_NLTDV algorithm, is developed. Real-time performance is ensured by pruning the search space, and convergence rates are enhanced by introducing a non-linear value function. The algorithm is validated both in a simulator and on a physical holographic tweezer set-up. Successful runs show that the automated planner is flexible, works well in reasonably crowded scenes, and is capable of transporting a specific particle to a given goal location while avoiding collisions, either by circumventing or by trapping other freely diffusing particles. This technique for transporting individual particles is then used within a decoupled and prioritized approach to move multiple particles simultaneously. An iterative version of a bipartite graph matching algorithm is also used to assign goal locations to target objects optimally. As in the case of single-particle transport, simulations and some physical experiments are performed to validate the multi-particle planning approach.
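
    The abstract describes compact Gaussian Radial Basis Function metamodels of offline trapping-probability data that are queried at run time. The Python sketch below shows the general idea of fitting and querying such an RBF interpolant; it is not the dissertation's actual model, and the (offset, velocity) grid, kernel width and toy probability values are assumptions.

        import numpy as np

        def fit_gaussian_rbf(centers, values, width):
            """Solve for Gaussian RBF weights so the metamodel interpolates the sampled values."""
            d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
            phi = np.exp(-(d / width) ** 2)
            phi += 1e-9 * np.eye(len(values))  # small ridge for numerical stability
            return np.linalg.solve(phi, values)

        def query_rbf(weights, centers, width, point):
            """Evaluate the metamodel at a new (offset, velocity) query point."""
            d = np.linalg.norm(centers - point, axis=-1)
            return float(np.exp(-(d / width) ** 2) @ weights)

        # Hypothetical offline data: trapping probability sampled on a grid of
        # (radial offset from trap focus in um, trap velocity in um/s); values are invented.
        offsets, velocities = np.meshgrid(np.linspace(0, 2, 5), np.linspace(0, 20, 5))
        centers = np.column_stack([offsets.ravel(), velocities.ravel()])
        prob = np.exp(-centers[:, 0]) * np.exp(-centers[:, 1] / 15.0)   # toy ground truth
        weights = fit_gaussian_rbf(centers, prob, width=1.0)
        print(query_rbf(weights, centers, width=1.0, point=np.array([0.5, 5.0])))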